Just to be clear, are you stating a universal belief about the role of college, or describing the decayed state of college degrees today?
Wasn't there a time when industry gave grants to colleges for research into needed technologies, with the dual purpose of developing next-generation technologies for business and building programs to train students in the funded field?
In other words, there was always a humanities component in the basic college curriculum to teach critical thinking and the arts, but the specific upper-division disciplines reflected the economic needs of the workforce.
Now, this distinction may have been clearer when America was still an industrial manufacturing powerhouse. Colleges today reflect the social changes of the liberal "peace dividend," which shifted us from industrial (read: wartime) economies to service economies, and now to the "social justice" economies (read: green, climate-change, privilege-transfer) we see today.
We are seeing the inevitable result: colleges today no longer serve the economic needs of the nation, because their graduates are unprepared to take the place of their retiring counterparts.
-PJ
To the left, this is no longer seen as symbiosis. They see it as parasitism.
An evil private company funding research at a university using state money and university resources?
Unless it's for a self-proclaimed "green science" like man-made global warming research, this can get a school in trouble.