In the 1950s my cousin went through medical school with a program that paid if one signed up to work in underserved areas for a certain number of years. He chose to work on an Indian reservation, and then settled in Arizona for the rest of his life.
I like that idea. You get to practice your profession, and it sounds like a win-win.
I have a couple of things to say on this subject:
#1: I understand the med school program with a COMMITMENT to working for several years in an area that needed medical services. This actually had merit.
#2: I don't think he's talking about specific majors or a commitment to work. Free college for “women's studies”? A free degree in “1970s Television and Its Impact on the Rain Forest”? Or some other asinine waste of time? Having a degree, whether AA/AS, BA/BS, or even a Masters/PhD, doesn't equate to employment if that degree is in a useless major!
#3: Nothing is FREE - somebody (i.e., you and me) will be paying. Frankly, I'm still paying off my children's state college degrees. And, yes, for the record, their degrees were actually useful. Son: biology/chemistry, employed! Daughter: early childhood education, employed at a private school (no public unions for her!)