Posted on 07/12/2014 10:46:29 PM PDT by BenLurkin
IBM boffins have been given a tidy $3bn cash pile to solve a problem that lurks not too far in our future.
That problem is the imminent breakdown in conventional chip operation and chip materials as we shrink transistor gates from today's 14nm process size to 10nm and 7nm.
At around 7nm, which most industry observers expect we will hit in the early 2020s, things start to get really unpleasant. More and more jostling electrons jump in and out of gates against processor designers' wishes, leading to a frustrating problem known as gate current leakage.
...
These investment areas include: quantum computing, neurosynaptic computing, silicon photonics, carbon nanotubes, gallium arsenide, low-power transistors and graphene.
It's an ambitious list of some well understood technologies and some less understood ones.
Silicon photonics, for example, is an area where companies like Intel and Corning are doing work to make silicon photonic cables, and IBM rivals like HP are preparing their own endeavors.
"Nearer term we are looking at integrating components of conventional Von Neumann systems like connecting processors and memory in a more integrated way," Guha said. "Silicon photonics is going to play a huge role over there [for] efficient and cheap optical interconnects. You might see implementations in a few years timeframe."
...
Between quantum computing and silicon photonics are the many medium-term technologies that IBM expects to develop over the next few years.
The company sees carbon nanotubes (CNT) as a good candidate for the replacement of silicon as they are "three to ten times better than silicon tech on a [process] node-to-node basis". Production of CNT at mass scale is ramping up as well, he said: "You have to make carbon nanotubes with purity levels that are six nines. Today we are at four nines..."
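The "nines" jargon is just purity expressed as a fraction: "four nines" is 99.99% pure, "six nines" is 99.9999%. A quick back-of-the-envelope sketch of what that gap means in absolute defect counts (the billion-tube total is an arbitrary illustration, not a figure from the article):

```python
def expected_defects(nines: int, total_tubes: int) -> int:
    """Rough count of impure tubes in a batch at 'N nines' purity.

    'N nines' purity means a fraction (1 - 10**-N) of tubes are good,
    so roughly total / 10**N tubes are defective.
    """
    return total_tubes // 10 ** nines

total = 1_000_000_000  # a hypothetical billion nanotubes on one chip

print(expected_defects(4, total))  # four nines (today): 100000 bad tubes
print(expected_defects(6, total))  # six nines (needed): 1000 bad tubes
```

So moving from four nines to six nines cuts the expected defect count by a factor of 100, which is why the purity gap matters for mass production.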
(Excerpt) Read more at theregister.co.uk ...
Related?
Lomiko’s Graphene 3D Lab Files Patent for Multiple Material Printer Filament
http://www.freerepublic.com/focus/f-news/3113868/posts
That’s all well and good — but are they going to formally verify the chip-designs for future processors?
It seems to me that having your processor mathematically proven correct would be a good thing.
There is no law that says computers need to have the same hardware. I guess it’d be okay for computers to have totally different technology through and through.
Nothing in my statement indicates I would want that; just that I want to be sure it works as intended.
I guess it'd be okay for computers to have totally different technology through and through.
It is.
as long as they can browse the web and send e-mail, they’ll have customers.
pingie thing...
Neuromancer is almost here?
She’s not a ten but she’s six nines...?
All the nines are after the decimal point. She’s not quite a one, but getting closer.
Next up: razorgirls and brainjackings.
How much of this will be done in China, considering IBM is largely a Chinese company now?
As long as it isn’t Lotus Notes.
Processing power and higher-density integration are important, but handling 'big data' is where the major challenge is now, and more so in the future.