This trend has been obvious for many years.
If the public schools ever do reopen, look at the Back To School advertisements: most of them show girls looking at new stationery or new clothing.
Only rarely do any store posters depict boys or young men ready to buy their stationery or their school clothes.
Even rarer are ads showing young white men.
What does that silently imply to most kids?
It may suggest to them that going to school or even wanting to be in school is mostly something girls do.
It may suggest to them that white boys are not expected or even wanted in most schools. Why spread such poisonous, negative concepts?
That attitude will not help the boys at all.
The best thing that could happen to the USA in education is for our government-run, union-taught school system to die. Rebuild with a combination of home-schooling and private schools. Fire all union teachers, and give the existing school buildings to private groups, including, but not limited to, religious groups.
The answer might be for white boys to learn a trade that the left can’t live without.
The left, in its bloodlust to destroy white, essentially Christian, masculinity, is shooting itself in the foot and has no clue.
Ask the citizens of South Africa how it’s worked out for them getting rid of the white population.
They aren’t wanted in schools. At school they are taught that having white skin makes them fundamentally racist and that they need to be “eradicated”; their classmates are indoctrinated to despise white people and regard them as oppressors with no right to exist on American soil; they get hammered with stuff about “toxic masculinity” and “rape culture”. Starting in kindergarten those white boys have teachers scrutinizing them for any vaguely girly behavior so they can be shipped off to a Dr. Mengele gender clinic.