Women used to be the guardians of religion, morality and decency in society. What the heck happened?
Something about fish and bicycles.
The men were supposed to be.
Some would explain the problem thusly: Due to the vote, the pill, and no-fault divorce, American women have largely been freed from accountability for their conduct (aside from gross criminality), and they have degenerated accordingly. From that point of view, they are only virtuous to the extent that a functioning patriarchy forces virtue upon them. Correct or not, it's an argument I've seen advanced to explain our current crop of feral females.