Posted on 12/15/2007 7:18:06 PM PST by ShadowAce
SAN JOSE, Calif. (AP) - Sixty years after transistors were invented and nearly five decades since they were first integrated into silicon chips, the tiny on-off switches dubbed the "nerve cells" of the information age are starting to show their age.
The devices - whose miniaturization over time set in motion the race for faster, smaller and cheaper electronics - have been shrunk so much that the day is approaching when it will be physically impossible to make them even tinier.
Once chip makers can't squeeze any more into the same-sized slice of silicon, the dramatic performance gains and cost reductions in computing over the years could suddenly slow. And the engine that's driven the digital revolution - and modern economy - could grind to a halt.
Even Gordon Moore, the Intel Corp. (INTC) co-founder who famously predicted in 1965 that the number of transistors on a chip should double every two years, sees that the end is fast approaching - an outcome the chip industry is scrambling to avoid.
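To see what that doubling rate implies, here is a minimal sketch in Python. The 1971 baseline of roughly 2,300 transistors (Intel's 4004, the first commercial microprocessor) is an assumed starting point; everything else follows from the two-year doubling the article describes.

```python
# Rough Moore's Law projection: transistor counts doubling every two years.
# Baseline is an assumption for illustration: Intel's 4004 (1971), ~2,300 transistors.

BASE_YEAR = 1971
BASE_COUNT = 2_300
DOUBLING_PERIOD_YEARS = 2

def projected_transistors(year: int) -> float:
    """Project the transistor count for a given year from the 1971 baseline."""
    doublings = (year - BASE_YEAR) / DOUBLING_PERIOD_YEARS
    return BASE_COUNT * 2 ** doublings

for year in (1971, 1981, 1991, 2001, 2007):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors")
```

The projection lands around 600 million transistors for 2007, roughly the ballpark of that year's high-end processors, which is why the "techno-economic" regularity described later in the article has held up so well.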
Preparing for the day they can't add more transistors, chip companies are pouring billions of dollars into plotting new ways to use the existing transistors, instructing them to behave in different and more powerful ways.
Intel, the world's largest semiconductor company, predicts that a number of "highly speculative" alternative technologies, such as quantum computing, optical switches and other methods, will be needed to continue Moore's Law beyond 2020.
"Things are changing much faster now, in this current period, than they did for many decades," said Intel Chief Technology Officer Justin Rattner. "The pace of change is accelerating because we're approaching a number of different physical limits at the same time. We're really working overtime to make sure we can continue to follow Moore's Law."
Transistors work something like light switches, flipping on and off inside a chip to generate the ones and zeros that store and process information inside a computer.
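To make the light-switch analogy concrete, here is a minimal sketch in Python that treats each transistor as an ideal on/off switch and wires four of them into a NAND gate, one of the basic building blocks of digital logic. The switch model is an illustration of the analogy only, not real device physics.

```python
# Illustrative only: model each transistor as an ideal on/off switch.
# Real MOSFETs have analog behavior; this captures just the logic view.

def nmos(gate: bool) -> bool:
    """NMOS switch: conducts (closed) when its gate is driven high."""
    return gate

def pmos(gate: bool) -> bool:
    """PMOS switch: conducts (closed) when its gate is driven low."""
    return not gate

def nand(a: bool, b: bool) -> bool:
    """CMOS NAND from four switches: two PMOS in parallel, two NMOS in series."""
    pull_up = pmos(a) or pmos(b)      # parallel PMOS network to the supply
    pull_down = nmos(a) and nmos(b)   # series NMOS network to ground
    assert pull_up != pull_down       # exactly one network conducts at a time
    return pull_up

for a in (False, True):
    for b in (False, True):
        print(f"NAND({a}, {b}) = {nand(a, b)}")
```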
The transistor was invented by scientists William Shockley, John Bardeen and Walter Brattain to amplify voices in telephones for a Bell Labs project, an effort for which they later shared the Nobel Prize in physics.
On Dec. 16, 1947, Bardeen and Brattain created the first transistor. The next month, on Jan. 23, 1948, Shockley, a member of the same research group, invented another type, which went on to become the preferred transistor because it was easier to manufacture.
Transistors' small size and low power consumption made them ideal candidates to replace the bulky vacuum tubes then used to amplify electrical signals and switch electrical currents. AT&T saw them as a replacement for clattering telephone switches.
Transistors eventually found their way into portable radios and other electronic devices, and are most prominently used today as the building blocks of integrated circuits, another Nobel Prize-winning invention that is the foundation of microprocessors, memory chips and other kinds of semiconductor devices.
Since the invention of the integrated circuit in the late 1950s - separately by Texas Instruments Inc. (TXN)'s Jack Kilby and future Intel co-founder Robert Noyce - the pace of innovation has been scorching.
"I think (the transistor) is going to be around for a long time," Moore said. "There have been ideas about how people are going to replace it, and it's always dangerous to predict something won't happen, but I don't see anything coming along that would really replace the transistor."
But there have been considerable stumbling blocks in recent years.
One problem has been preventing too much electricity from leaking through thinner-and-thinner components - leakage that wastes power and throws off heat. That has led chip companies to look for new materials and other ways to improve performance.

Earlier this year, Intel and IBM Corp. separately announced that they had discovered a way to boost transistor efficiency.

The solution involves replacing the silicon dioxide that served for more than 40 years as the transistor's insulator, but which has since been shaved too thin to block current reliably, with new materials: metal in the gate, the part that switches the transistor on and off, and a "high-k" dielectric in the insulating layer beneath it, changes that improve performance and cut wasted energy.
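The reasoning behind that material swap can be seen in the parallel-plate capacitor formula C = k * eps0 * area / thickness: a material with a higher dielectric constant k delivers the same gate capacitance at a greater physical thickness, and a thicker film leaks far less current. Here is a minimal sketch in Python, using illustrative numbers rather than any manufacturer's actual process parameters.

```python
# Parallel-plate view of the gate stack: C = k * eps0 * A / t.
# All values below are illustrative assumptions, not real process data.

EPS0 = 8.854e-12  # vacuum permittivity, farads per meter

def gate_capacitance(k: float, area_m2: float, thickness_m: float) -> float:
    """Capacitance of an ideal parallel-plate gate dielectric."""
    return k * EPS0 * area_m2 / thickness_m

AREA = (45e-9) ** 2            # assumed 45 nm x 45 nm gate
SIO2_K, SIO2_T = 3.9, 1.2e-9   # silicon dioxide, ~1.2 nm (a few atomic layers)
HIGHK_K = 20.0                 # assumed hafnium-based high-k permittivity

c_target = gate_capacitance(SIO2_K, AREA, SIO2_T)

# Thickness at which the high-k film matches the same capacitance:
highk_t = HIGHK_K * EPS0 * AREA / c_target
print(f"SiO2 at {SIO2_T*1e9:.1f} nm -> same C with high-k at {highk_t*1e9:.1f} nm")
```

Roughly five times the thickness for the same electrical behavior is the whole point: electrons tunnel through a 6 nm film far less readily than through a 1.2 nm one.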
Still more novel ways to prevent electricity leakage - and other problems - are being pursued. And nobody has won a bet against maintaining the pace of innovation in technology.
"The only thing that's been predicted more frequently than Moore's Law has been its demise - everybody's been wrong," said Sun Microsystems Inc. (JAVA) Chief Technology Officer Greg Papadopoulos. "It's a pretty robust set of observations and really it's about techno-economics ... It's a dangerous thing to bet against because of the economic investment cycle that's in there."
I’ve been expecting this. The next leap will be quantum computing, but it’s still quite a few years down the road before it is used in practical applications.
*sigh* Or not.
Master Jordan needs to take a breath.
Yeah, I hear we’re gonna run out of oil and food, too.
“The transistor was invented by scientists William Shockley, John Bardeen and Walter Brattain to amplify voices in telephones for a Bell Labs project, an effort for which they later shared the Nobel Prize in physics.”
Back when the Nobel Prize meant something. Today the libs have taken it over and use it as a political tool.
That is true. Don't write the transistor off so easily.
My thoughts exactly. They may not make them smaller, but they will make them more efficient and more powerful with multiple processors, better memory, elimination of the hard drive, etc.
500th thread!!!
Congrats!
How much lower than submicron can they go, or have they already?
I remember when core memory and magnetic tapes ruled - what an invention. Boy, have things changed since then. ;-)
I don’t know, the transistors have been getting pretty small - like less than 100 atoms thick. It’s those pesky laws of physics. Plus, it has to be affordable, too. There are reasons why people don’t own supercomputers.
I remember this being said about the Vacuum tube, the transistor and the first microchips.
And Microsoft keeps making software bigger and slower.
My guess is that optical processors (or electro-optical) will be next. TRW started toying with the idea in the early eventies and IBM is currently working in that direction.
(Besides, it seems like every year someone espouses this "end is near" stuff.)
eventies = seventies
Started with Tubes,,they had their day,,transistors will have theirs,,magnetics is next,,
10 years from now someone will refresh this thread and we’ll all have a good laugh
I remember the eventies! Those were the days.
You may have invented a new separation of decades. LOL