I sometimes wonder where this idea that everyone should go to college came from.
There are some career fields for which college and/or grad school is mandatory, such as law, medicine, and various hard sciences.
But how many jobs are out there for college grads who major in Women’s Studies, Sociology, Philosophy, or other liberal arts areas?
I'd suggest that, in comparison to many offerings, that is a noble field of study, if run largely in a traditional way.