I have never understood why women take such great delight in referring to themselves as victims. That doesn't mean women don't get set upon by men - they certainly do.
What I'm referring to is this "culture of victimhood" one sees on college campuses nearly everywhere in America, where women seem to be trying to cow men into some state of emotional nihilism. Why do they take such delight in it?
BTW, these seem to be the very same women who positively revel in performances of "The Vagina Monologues."
As simple as this may sound, I think it's a power grab. I went to undergraduate and graduate school and met a lot of feminist students along the way. It took me a while to recognize it, but I came away with the impression that they really want the power they see men having. It's not about equality but about being completely and utterly in charge.