Sadly, there's some of that even in the professional sphere.
I'm in Computer Science, for example, and I'm a big fan of the Ada programming language because it is designed to catch bugs as early as possible -- it is a well-known fact that the earlier you address a problem, the less it costs to fix, and compile time is the earliest a compiler can act (design time is an earlier phase, but not addressable on the software/tool side) -- yet many companies refuse to look into "non-standard"* languages. (As an example, Yahoo Stores used to be written in Lisp and was then rewritten in C++ and Perl, IIRC... despite Lisp already being a proven and highly adaptable technology.)
In some sense it's a case of "we can't hire people who know X" on the part of employers and "we can't teach X because it's not marketable" on the part of educators, which in turn degrades the whole profession precisely because ideas aren't tested against reality. -- A good example here is the recent "problem" of efficiently/effectively using multiple cores and parallelism; it was solved thirty years ago: Ada [for one] has *always* had tasking as a language-level capability. (In the "industry standard" C-style languages, by contrast, it's usually provided not as a language-level construct but as a library-level one -- see the C++ sketch below.)
* 'Standard' here being C-like (C, C++, Java, PHP, etc.).
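To make the language-level vs. library-level distinction concrete, here's a minimal sketch in C++ (one of the "standard" languages from the footnote) of parallelism done the library way with std::thread. This is only an illustration, not anyone's production code: the point is that the compiler sees nothing but ordinary calls into a library, so concurrency mistakes surface at run time rather than at compile time.

    // Library-level parallelism in C++: the compiler sees only ordinary
    // function calls into <thread>; concurrency is not a language construct.
    #include <iostream>
    #include <mutex>
    #include <thread>
    #include <vector>

    int main() {
        std::mutex cout_mutex;              // shared output must be guarded by hand
        std::vector<std::thread> workers;

        for (int id = 0; id < 4; ++id) {
            workers.emplace_back([id, &cout_mutex] {
                std::lock_guard<std::mutex> lock(cout_mutex);
                std::cout << "worker " << id << " running\n";
            });
        }

        for (auto& w : workers) {
            w.join();                       // forgetting this terminates at run time;
        }                                   // the compiler cannot flag it
        return 0;
    }

In Ada, by contrast, the analogous worker is declared with the task keyword, and activation and termination are part of the language semantics the compiler itself enforces -- that's what "language-level construct" means above.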
And yet America re-elected the Problem!
The first 90% of a software project burns up 90% of the money allotted for it.
The remaining 10% of it burns up the other 90% of the money.