Posted on 03/24/2006 1:36:37 AM PST by Straight Vermonter
Big Blue researchers' feat suggests the material could be a candidate to replace silicon in chips.
IBM researchers have achieved a milestone by creating an integrated circuit out of a single carbon nanotube, a feat that makes the material a likely candidate to replace silicon as the main ingredient for making chips.
Big Blue plans to detail the accomplishment in the journal Science on Friday.
Long thought to be a good candidate for replacing silicon, the carbon nanotube has posed great challenges for scientists trying to coax transistors out of the material and assemble them into an integrated circuit (IC). ICs are chips that process and store information in a variety of electronic devices, from computers to cell phones.
Creating carbon nanotube transistors has been done before, but figuring out a reliable way to assemble them to form an IC has stumped many bright minds. Wiring together transistors developed from a single carbon nanotube is an even more difficult task.
But the IBM research team did it. With an 18-micron-long carbon nanotube, the scientists built a 10-transistor ring oscillator, a device typically constructed to test new manufacturing technologies or materials. Using one carbon nanotube instead of many to build an IC reduces the manufacturing steps, and therefore the cost.
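As a rough illustration of how a ring oscillator serves as a test device (these are assumed round numbers, not IBM's published measurements), a 10-transistor CMOS ring oscillator is typically five inverter stages in a loop, and its frequency follows directly from the per-stage delay:

```python
# Minimal sketch: relating ring-oscillator frequency to per-stage delay.
# A CMOS ring oscillator of N inverters (2 transistors each) oscillates at
# roughly f = 1 / (2 * N * t_d), where t_d is the delay of one stage.
# The stage delay below is an illustrative assumption, not IBM's measurement.

def ring_oscillator_frequency(num_stages: int, stage_delay_s: float) -> float:
    """Return the approximate oscillation frequency in hertz."""
    return 1.0 / (2 * num_stages * stage_delay_s)

# Five inverter stages (10 transistors total) with an assumed 2 ns per-stage
# delay land in the tens of megahertz -- the ballpark quoted for the prototype.
freq_hz = ring_oscillator_frequency(num_stages=5, stage_delay_s=2e-9)
print(f"Estimated frequency: {freq_hz / 1e6:.1f} MHz")  # ~50 MHz
```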
"We were working on it for one tough year," said Joerg Appenzeller, an IBM researcher who worked on the project, which also involved researchers from the University of Florida and Columbia University in New York.
The feat will advance the engineering and manufacturing of carbon nanotube chips for the commercial market. Electrical current moves more freely and faster through a carbon nanotube than through silicon, making carbon nanotube a more energy-efficient material for a speedier chip. It is also extremely small: a nanometer is a billionth of a meter, and a carbon nanotube is 50,000 times thinner than a human hair.
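A quick back-of-the-envelope check of that thinness figure, assuming a typical hair diameter of about 75 micrometers (the article does not give one):

```python
# Back-of-the-envelope check of the "50,000x thinner than a hair" claim.
# The hair diameter is an assumed typical value, not a figure from the article.
hair_diameter_m = 75e-6      # assumed typical human hair, ~75 micrometers
ratio = 50_000
nanotube_diameter_m = hair_diameter_m / ratio
print(f"Implied nanotube diameter: {nanotube_diameter_m * 1e9:.1f} nm")  # ~1.5 nm
```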
All these properties make the carbon nanotube an appealing candidate for improving performance by packing more, smaller transistors onto a chip without causing overheating.
But the material also is difficult to manipulate so that it develops uniformly during the chip-making process. More research will have to be done to figure out how to cheaply and efficiently make carbon nanotube chips that can outperform silicon chips.
"It's a way off," said Fred Zieber, an analyst at Pathfinder Research, about commercializing carbon nanotube chips. "It could be a few years or an eternity."
IBM's carbon nanotube IC is nearly a million times faster than previous ICs built with multiple carbon nanotubes. Even so, IBM's prototype clocks in at only 50 megahertz. The fastest chip on the market today is a 3.8-gigahertz Pentium 4 from Intel.
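Using only the clock rates quoted above, the gap works out as follows:

```python
# Rough comparison of the quoted clock rates (figures taken from the article).
prototype_hz = 50e6    # IBM carbon nanotube ring oscillator, 50 MHz
pentium4_hz = 3.8e9    # fastest Pentium 4 on the market, 3.8 GHz
print(f"The Pentium 4 clocks about {pentium4_hz / prototype_hz:.0f}x faster "
      "than the nanotube prototype")  # ~76x
```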
Mr. Appenzeller won't even give his estimate of when carbon nanotube chips will be available for the commercial market. But he and his colleagues aim to build one in the gigahertz range, possibly within two years. The long-term goal is to build a terahertz chip.
"It's like the first time we built a car," Mr. Appenzeller said. "Now we know the obstacles, and we have ideas on how to improve it."
Actually, we are reaching the limits of silicone technology as we use it today. But once we get carbon nanotube technology down pat, the next great technological leap will be silicone nanotube technology (far smaller than carbon nanotubes) that can carry, and is thinner than, light waves.
I do not know that much about the internals. In fact, we don't pay much attention to them because they do not accurately reflect what the performance will be for a given application. The Pentium D chip consists of two Pentium 4 Prescott dies in a single package; the processors work as two independent CPUs. The Pentium 4 with H/T technology is a single-core chip that enables multi-threaded software applications to execute two software threads in parallel.
Depending on your application, hyperthreading may provide no increase in performance. Applications that benefit from H/T include transcoding, compression, encryption, etc., which require a lot of floating-point operations.
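As a rough illustration of that point, here's a minimal Python sketch comparing a CPU-bound task run serially versus on two workers. It uses processes rather than threads to get real parallelism in Python, and the workload sizes are just placeholders, not a benchmark of any particular chip:

```python
# Minimal sketch: a CPU-bound task run serially vs. on two workers.
# Only software that splits work into parallel threads/processes (as transcoders,
# compressors, and encryptors do) can benefit from a second logical core.
import time
from concurrent.futures import ProcessPoolExecutor

def crunch(n: int) -> float:
    """A stand-in for a floating-point-heavy inner loop."""
    total = 0.0
    for i in range(1, n):
        total += (i * 1.0001) ** 0.5
    return total

if __name__ == "__main__":
    work = [5_000_000, 5_000_000]

    start = time.perf_counter()
    for n in work:                      # serial: one task after the other
        crunch(n)
    serial = time.perf_counter() - start

    start = time.perf_counter()
    with ProcessPoolExecutor(max_workers=2) as pool:
        list(pool.map(crunch, work))    # parallel: two tasks at once
    parallel = time.perf_counter() - start

    print(f"serial {serial:.2f}s, parallel {parallel:.2f}s")
```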
If you want faster desktop application computing, the P4D is likely your best choice. If you do a lot of multimedia and game playing and handle your own audio and video, the P4 H/T may give you better performance.
As to the heat, most of the machines we work with have 300-500 watt power supplies. I do scalability testing. My intent is to find the maximum capability of the server. In any given server, there is a weakest link. It could be non-paged memory, paged memory, disk I/O, etc. For my purpose, it doesn't matter what it is; it just matters that we know what its maximum capabilities are. The harder we push them, the hotter they get.
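As a sketch of that push-until-it-saturates approach (with a made-up CPU-bound workload standing in for a real server test), something like this ramps up concurrency and watches throughput flatten once the weakest resource is maxed out:

```python
# Minimal sketch of a scalability probe: increase concurrency, measure
# throughput, and note where it stops improving -- that plateau points at
# whichever resource (CPU here, but it could be memory or disk I/O) is the
# weakest link. The workload is a hypothetical placeholder, not a real benchmark.
import time
from concurrent.futures import ProcessPoolExecutor

def unit_of_work(_: int) -> int:
    return sum(i * i for i in range(200_000))

def throughput(workers: int, tasks: int = 32) -> float:
    """Tasks completed per second at a given level of concurrency."""
    start = time.perf_counter()
    with ProcessPoolExecutor(max_workers=workers) as pool:
        list(pool.map(unit_of_work, range(tasks)))
    return tasks / (time.perf_counter() - start)

if __name__ == "__main__":
    for workers in (1, 2, 4, 8, 16):
        print(f"{workers:2d} workers: {throughput(workers):6.1f} tasks/s")
```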
Hopefully that was of some help. If it didn't help answer your specific question, Intel does have very good doc on the chips. If you are really interested in the chips, look at the AMD Opteron.
Eventually, liquid cooling is the way we will have to go. Cooled Fluorinert comes to mind. Many home computers now cool both the central and graphics processors with water as opposed to fans. Much quieter. I am headed in that direction myself.
Absolutely. I wasn't really clear as to how my comment related to the article, but we need a revolution to advance technology; the evolution of silicon is at an end. That IBM is looking for it is very exciting. IBM did extensive research on quantum computing as well. Not sure what is happening with that.
The first computer I repaired had tubes w/ 256 bytes of ferrite core memory and the drive had a hydraulic positioning mechanism and stored 2.5MB. I certainly hope I get the opportunity to purchase a silicon nanotube laptop. The "roasting my chestnuts" was not a completely meaningless observation! :)
Bus speeds are one of the limiting factors. We already have:
CPU - Central Processing Unit (multiple cores)
GPU - Graphics Processing Unit (with multiple pipelines, etc.)
DSP - Sound co-processor (the X-Fi processor comes to mind, 51 million transistors)
PPU - Physics Processing Unit (PhysX)
All told, those processors alone (ignoring the rest of the computer) are starting to push a billion transistors.
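As a back-of-the-envelope check (only the 51-million X-Fi figure comes from the post above; the other counts are assumed round numbers for mid-2000s parts, not actual specifications):

```python
# Rough sum of transistor counts for the processors listed above.
# Only the X-Fi figure is from the post; the rest are illustrative assumptions.
transistor_counts = {
    "dual-core CPU (assumed)": 300_000_000,
    "high-end GPU (assumed)":  300_000_000,
    "X-Fi sound DSP":           51_000_000,
    "physics PPU (assumed)":   125_000_000,
}
total = sum(transistor_counts.values())
print(f"Approximate total: {total / 1e9:.2f} billion transistors")  # ~0.8 billion
```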
We both are. I had an S-100 computer with hand soldered daughter boards using an 8080 CPU and front panel switches back in the late 70s. :-)
Used an ASR-33 Teletype as my terminal. Had to build a converter unit that converted the 20 mA current loop to RS-232 to allow them to talk to each other.
Not quite yet. :-) 45 nm is on the way.
So... unusable by liberals, then?
That thing is so small that newborn babies won't even have a reaction to it when they are chipped with their Social Security number before leaving the hospital.
A friend of mine and his son experimented with water cooled computers. They had a heat sink that would freeze the components it was so cold. It was interesting to watch.
I remember paying close to $600 for my Hayes 1200b internal modem.
Just my luck! I just spent a bunch on sand futures!
Well, from what I've heard, these days it almost feels like the real thing!
I believe you and the previous poster meant silicon. The extra "e" makes it an entirely different substance.
I'm off to confiscate extra "e"s from the rest of the thread. Sees ya!
My next major upgrade is going to see water cooling for sure. There are some neat ways of implementing it these days.
But I'll bet it uses something like less than 1/1000th the power.
Power Management is hot these days.
Hmmm. The whole tinfoil concept is going to need some refinement with a cell phone inside your head.
With a computer based on these, you could get a BSOD in half the time, and not use as much power doing it!
Yup. You're obviously as old as I am, so I'd say you were old. But I had a similar first computer, so I know what you're saying. Dot-matrix printing...mmmmmm!!