Posted on 04/23/2005 8:30:02 PM PDT by anymouse
This is good news for current CS pros like myself. Makes my services that much more valuable.
Who can blame students for not wanting to go into CS when every day more and more jobs are going to India and China?
Sometimes I question the value of a strict CS degree.
And we should care, why?
(Speaking as a female with an M.S. in CS...)
It might have been more meaningful 20 years ago, but now a lot of those problems are solved.
You know most of the greats on the pure research side did not have CS degrees at all - Knuth, Kay, McCarthy. A good many of them ended up with Turing prizes.
Well, it is a discussion...
"Is CS really that rich of a "science" to spend 12 years getting a PhD? "
Yes, it is, and that is why such comments are so sad. We have left our profession to the MBAs. 99% of the software out there is junk and very poorly designed. Well, it isn't really designed at all. It happens by accident.
BTW, a PhD is about 7-8 years total, not 12. No Master's required.
A good student will need no more than 5 years to get a PhD. Those taking 7 to 12 years are in the humanities where the whole point is to avoid real work and to stay in college as long as possible.
"Is CS really that rich of a "science" to spend 12 years getting a PhD?"
As a PhD in CS, er ... yes. Theoretical CS (algorithms to applied math), to CAD/VLSI algorithms (designing solutions to automatically solve complex IC design tasks), to computer architecture and engineering (i.e. architecting the most complex creations man has made), to AI (i.e. understanding the concepts of thinking, memory, etc.), to software (which by itself covers a lot of ground, from optimization to computer languages and beyond).
CS is the most knowledge-intensive field one can think of, actually, since it touches on so many other fertile areas from cognitive sciences to math to electrical engineering.
" Is it really a science at all?"
Most of CS is really a form of engineering, which is why CS departments are generally in schools of engineering, and the better programs are "EECS" or "ECE" (electrical and computer engineering) depts (e.g. how U Mich. and Berkeley do it), not a CS dept in the liberal arts college.
But "Computer Engineering" is a narrower term, ie engineering of computers. Computer Science has been defined as the science of anything to do with computers.
" I have seen some pretty silly doctorial work; comparable work in Physics or EE would not really be allowed, at least in the big schools."
Dang, you read my thesis, did you? :-)
"You know most of the greats on the pure research side did not have CS degrees at all - Knuth, Kay, McCarthy. A good many of them ended up with Turing prizes."
Cheap shot. They were educated *before* CS really took off as a field (which was in the 1960s and 1970s).
Nevertheless, CS touches other fields, so there is opportunity for someone in other fields/depts to contribute to CS and vice versa. E.g., any good applied mathematician can work in theoretical computer science and/or algorithms, if they want to study those problems.
As one of my applied math profs used to say:
"Whats the difference between CS and applied math?"
"About $10,000 a year" (more for the CS profs)
"Yes, it is, and that is why such comments are so sad."
yes, it was a sad comment ... and not too well-informed either.
OTOH, what drives the volume of people getting BS degrees is whether they can get a good job ...
"We have left our profession to the MBAs. 99% of the software out there is junk and very poorly designed."
... and likely there is a relationship between your statement and the decline in CS interest. Somewhere along the line, software development itself became so commoditized that there is a decline in the value of CS (you can hire in India), and a vicious cycle where professionalism declines in the field.
But the error may be that the jobs of "software developer" and "Computer Scientist" are distinct things. I am lucky to be in the one area - EDA software and IC design - that actually utilizes many parts of what you learn to get a CS, or in my case EECS, degree.
That is the problem. Anyone with a liberal arts degree, an IQ above 125, and a willingness to learn can become a passable programmer in under a year. They won't have a clue about runtime complexity, may be weak on SW engineering, and have no inkling about P vs NP, but they can hack SQL or Visual Basic just fine.
((SW Job != SW profession)
&& (SW Job != Computer Science as a field))
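To make the runtime-complexity point concrete, here is a minimal sketch (a toy example of my own, not anything from the thread): two correct ways to check an array for duplicates. The quick-study programmer writes the first and never learns why it melts down at scale.

#include <stdbool.h>
#include <stddef.h>
#include <stdlib.h>

/* O(n^2): compare every pair. Fine for 100 items, hopeless for 10 million. */
bool has_duplicate_naive(const int *a, size_t n)
{
    for (size_t i = 0; i < n; i++)
        for (size_t j = i + 1; j < n; j++)
            if (a[i] == a[j])
                return true;
    return false;
}

/* O(n log n): sort first, then any duplicates sit next to each other. */
static int cmp_int(const void *p, const void *q)
{
    int x = *(const int *)p, y = *(const int *)q;
    return (x > y) - (x < y);
}

bool has_duplicate_sorted(int *a, size_t n)   /* note: sorts a in place */
{
    qsort(a, n, sizeof *a, cmp_int);
    for (size_t i = 1; i < n; i++)
        if (a[i - 1] == a[i])
            return true;
    return false;
}

Both give the same answer on every input; only the second finishes before lunch on a large one. That gap is exactly what "no clue about runtime complexity" costs you.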
It's a pity, because Moore's Law is rolling down the track like a runaway train: ten doublings over 20 years (2^10 ≈ 1,000) gives us computers 1,000 times faster than 20 years ago.
... and software has barely evolved !!! ...
... the best OS out there (LINUX) is basically the same danged OS that was banged out at Berkeley 20 years ago (UNIX BSD). In 1990, I figured UNIX would beat out the pathetic MS-DOS cr*p. Boy, was I wrong. By 1995, Win95 took the world by storm. The same year, I started playing with LINUX.
Ten years later, and *still* MS is not as good as UN*X.
Huge development teams only seem to create dinosaur code.
More people notice LINUX these days ... now that it has become less of a hacker's innovation and more of a 'commoditization of the OS'.
Maybe we can do better.
My thoughts exactly.
Looking at the graphs ... what happened around 1982-83 (besides the recession)? A very similar trend to now.
My #1 daughter is finishing up her CS degree at a top 10 CS school. Her comment was that they intentionally decimated the field with three killer gateway classes, during which twice as many women as men dropped the major. In a couple of the 400-level classes she has been the only woman enrolled. The good news is that she thinks nothing of it.
The bad news is she wants to go into testing vice development. That's good for the world in general (we need more software testers), but bad for me, since it dramatically lowers her chances of becoming the next Bill Gates, and I will actually have to work long enough to retire.
What's tragic is the amount of work done today by people who don't seem to have ever read Knuth.
BTW, I find it interesting that the original Pentium bug occurred in long division, with some immensely complicated circuitry that was needed to do two bits at a time. I came up with a simple method for doing many bits at a time (turns out someone else came up with it too); someone who read Knuth would have gotten a clue to the method, though Knuth (at least in 1969) didn't take it all the way.
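For anyone who hasn't stared at divider hardware: the baseline all such schemes improve on is classic restoring division, which retires one quotient bit per step. Below is a minimal sketch of that textbook baseline (my own illustration; it is *not* the poster's multi-bit method, nor the radix-4 SRT scheme the Pentium used):

#include <assert.h>
#include <stdint.h>
#include <stdio.h>

/* Classic restoring division: one quotient bit per iteration.
 * Generic textbook baseline only -- not the poster's method
 * and not the Pentium's SRT circuitry. */
uint32_t divide_restoring(uint32_t dividend, uint32_t divisor, uint32_t *rem)
{
    assert(divisor != 0);
    uint64_t r = 0;   /* partial remainder */
    uint32_t q = 0;   /* quotient, built MSB-first */
    for (int i = 31; i >= 0; i--) {
        r = (r << 1) | ((dividend >> i) & 1);  /* bring down the next bit */
        if (r >= divisor) {                    /* trial subtraction succeeds */
            r -= divisor;
            q |= 1u << i;                      /* so this quotient bit is 1 */
        }
    }
    *rem = (uint32_t)r;
    return q;
}

int main(void)
{
    uint32_t rem, quot = divide_restoring(123456789u, 1000u, &rem);
    printf("%u rem %u\n", quot, rem);   /* prints: 123456 rem 789 */
    return 0;
}

Retiring two bits per step, as the Pentium did, means selecting among several candidate quotient digits each cycle, and that digit-selection lookup table is where the famous FDIV bug lived.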
What's the difference between hardware and software?
As time goes by, hardware gets smaller, faster, and cheaper.
Software gets bigger, slower, and more expensive.
One of the things I like about my job is that I program for micros where size and speed actually matter. A typical machine I program for has less than a millionth the RAM of a typical desktop machine, and runs at about a thousandth of the speed.
Maybe they should get Larry Summers' take on this?
I wonder if we will see some of the Computer Science departments start shrinking their faculty? If the incoming number of students goes down precipitously, I would imagine that many Computer Science departments would feel some pressure to reduce the number of faculty, even if the grant money was still rolling in at full pace.
It matters in other areas too; it's just that CS colleges are cranking out graduates who are taught that it doesn't. Typical grads today produce some of the most bloated, inefficient, and resource-hungry crap code I've ever seen.
J2EE/Java = a plot by Sun to sell more hardware.