The political climate for years has been to move women and minorities into college to “decrease” the unfair advantage that “white” men have. (I guess that means we are smarter than them.) This has led to an unprecedented number of men going into other areas and NOT into college.
Sounds like the women brought this onto themselves.
In my experience, college-educated women are far more likely than college-educated men to buy into every destructive social trend to come out of academia in the last 50 years. Women are far more likely than men, for example, to buy into the kind of politically correct "social crusading" nonsense that has permeated our culture in recent years. I don't know any real men who would put up with the kind of idiocy that passes for political and social discourse these days for more than ten minutes.