Back in the 1980s, I came to understand "Feminism" as the belief that men are horrible, violent, abusive monsters, and that women should strive to be just like them.
There is nothing “Feminine” about “Feminism”.
It saddens me. As a man, one of the things I always appreciated about women was that they had the power to be a strong moderating and positive influence on male behavior.
Men and women in a good match are far happier and more capable together than they could be separately. Each sex brings components to the table, and those components (surprise) often complement each other perfectly.
Indoctrinating women with the idea that they don’t “need” men is just as counterproductive as indoctrinating men with the idea that they don’t “need” women.
For certain people and in certain situations this is undeniably true, but codifying it across society has been exceedingly corrosive and destructive.
And I think many of these kinds of things are products of that.