The schools teach that men can become women, and vice versa, simply by changing their minds. They teach American students to hate their country and to regard Western civilization as a cancer to be eradicated. They insist on the "nothingness burped and vomited up the cosmos" theory of creation, and that meaning, value, purpose, truth, reason and justice are objectively meaningless. They teach that we are naturally good and that unrestrained self-expression, self-actualization and self-gratification are entitlements: every man a god to himself. They teach that socialist thugocracies and Islamic death cults are preferable to liberal democracies, because failed cultures are the oppressed and the culture that built the modern world is the oppressor.
Why shouldn't schools teach that the world is about to end? It is as true as the rest of what they teach, and since they ultimately teach that truth itself is meaningless, the question is moot anyway.
They also teach that conflict can be resolved by retreating to a safe room and sucking one's thumb.
Does anybody else see the urgent need for education reform?