Americans have been brainwashed for the last 60 years into believing that a college degree is as vital and necessary as having two arms and two legs.
The goal was to herd successive generations away from family-owned small businesses and toward corporate and government jobs.
Little kids enter elementary school being told that their entire future hinges on a corporate hiring department scrutinizing their academic credentials.
The parents have no clue what kind of subversive crap their kids' heads are filled with. Why should they care if their kids are being jabbed with poison over and over again? As long as they get the degree and the corporate job, nothing else matters.
Statistically, the person with a college degree earns a higher wage than the person without one, and additional degrees translate into additional income.
Is this always true? Of course not. (And before the anecdotal stories come in about people with nothing more than a pre-K education making millions, I'll save everyone the time and say that I believe all of them. I'm only referring to statistical averages.)
But I think the cost of four years of college tuition has simply become prohibitive; it was financially expensive prior to Covid, and the requirement to get this shot makes it physically expensive as well.
Many universities became popular when young men looking to avoid service in Vietnam saw them as a viable alternative; eventually, a degree became an expected part of one's education. If there is a benefit to all of this, it will be that a university education falls back to its proper place and the trades become a more reliable means of making a living.