Posted on 05/23/2010 11:50:29 AM PDT by Ernest_at_the_Beach
The writer was going to ask, but the gas turbine generator was so loud his question couldn't be heard.
/johnny
Hopefully it’ll come in a refreshed Mac Pro... I have two G5s, one a Quad, and they’re getting creaky and old!
Ed
It doesn't have to be practical; it just needs people willing to buy one.
It is surprising that Hollywood hasn’t figured out how to take The Forbin Project and Wargames, suck all the life and excitement out of them, and try to ruin both movies. Maybe with a Man Made Global Warming theme, or some crap like that.
“The system appears to look like just an ordinary personal computer and seems to operate flawlessly.”
BWAHAHAHAHAHAHAAA!!
Because ordinary PCs operate flawlessly.... the author must still use a mechanical typewriter.
**************************************EXCERPT********************************
*************************************snip****************************************
IBM already is engaged with early access foundry clients in 32nm technology and ARM is developing design libraries for the technology. An initial 32nm ARM library is available now and IBM has extended this collaboration to include 22nm SOI technology, enabling ARM to gain early access to this technology. This represents the two companies' commitment to align early on process technology, design rules, design library and cores for next-generation SOI technology.
"We are making this 32nm offering available to clients who are ready to benefit from the significant performance and power advantages of this seventh generation of IBM SOI technology," said Gary Patton, vice president for IBM's Semiconductor Research and Development Center. "The industry-leading, dense embedded memory, and our design library agreement with ARM, underscore our ability to provide clients with a market edge and a clear progression path to 32nm and 22nm SOI technology nodes."
As people on this thread have already implied, the trend is going to be to move more processing power into the cloud. A laptop is unlikely to be able to use a chip like this for any number of reasons, but a datacenter computer will undoubtedly be built that can. Cloud computing is inherently more parallel than desktop or laptop computing. As we move to massive numbers of cores on a chip, technology is going to push more of the computing tasks into the cloud. This will probably play out over 10 years or so.