This might be the best lesson he's ever learned; he simply tells the college, "Absolutely not."
If the degree means any risk to his health, it's simply not worth it.
Americans have been brainwashed for the last 60 years into believing that a college degree is as vital and necessary as having two arms and two legs.
They wanted to herd successive generations away from family-owned small businesses and toward corporate and government jobs.
Little kids enter elementary school being told that their entire future hinges on a corporate hiring department scrutinizing their academic credentials.
The parents have no clue what kind of subversive crap their kids' heads are filled with. Why should they care if they're being jabbed with poison over and over again? As long as they get the degree and the corporate job, nothing else matters.