China, NVIDIA Create World's Most Powerful Supercomputer
Daily Tech ^ | October 28, 2010 10:00 AM | Jason Mick (Blog)
Posted on 10/28/2010 9:12:31 AM PDT by Ernest_at_the_Beach
System has 2.507 petaflops of computing power, draws 4 MW
NVIDIA has plenty to worry about in the consumer segment, as it finds itself yet again a generation behind AMD's latest graphics cards. However, the company may simply be quietly giving up consumer market share to focus instead on commercial GPU computing sales.
The graphics processor maker revealed today at HPC 2010 China an incredible new supercomputer built using NVIDIA GPUs that support CUDA, a C-based technology that allows parallel computing code to run on the GPU. The new supercomputer is named Tianhe-1A. It was developed by the National University of Defense Technology (NUDT), is housed at the National Supercomputing Center in Tianjin, China, and is fully operational.
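For readers unfamiliar with CUDA, here is a minimal sketch of what "parallel computing code on the GPU" looks like in practice. This is an illustrative vector-add kernel, not code from NVIDIA's announcement; the names and sizes are made up for the example.

#include <stdio.h>
#include <stdlib.h>
#include <cuda_runtime.h>

/* Each GPU thread computes one element of c = a + b. */
__global__ void vecAdd(const float *a, const float *b, float *c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        c[i] = a[i] + b[i];
}

int main(void)
{
    const int n = 1 << 20;                 /* 1M elements, illustrative */
    size_t bytes = n * sizeof(float);
    float *ha = (float *)malloc(bytes);
    float *hb = (float *)malloc(bytes);
    float *hc = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    float *da, *db, *dc;
    cudaMalloc((void **)&da, bytes);
    cudaMalloc((void **)&db, bytes);
    cudaMalloc((void **)&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    /* Launch enough 256-thread blocks to cover all n elements. */
    vecAdd<<<(n + 255) / 256, 256>>>(da, db, dc, n);
    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);

    printf("hc[0] = %f\n", hc[0]);         /* expect 3.000000 */

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}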
With a total computing power of 2.507 petaflops, as determined by the LINPACK benchmark, which solves a dense system of linear equations, China's new supercomputer is the most powerful in the world.
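For a sense of what the benchmark actually measures: LINPACK times the solution of a dense n-by-n system Ax = b and credits roughly (2/3)n^3 + 2n^2 floating-point operations, so the score is operations divided by seconds. The toy solver below (naive Gaussian elimination with partial pivoting, plain C host code) is only a sketch of that computation; the HPL code actually run on machines like Tianhe-1A is blocked, distributed, and GPU-accelerated, and the 4x4 system here is purely illustrative.

#include <stdio.h>
#include <math.h>

#define N 4  /* illustrative size; real runs use n in the millions */

/* Solve A x = b in place; returns 0 on success, -1 if singular. */
static int solve(double A[N][N], double b[N], double x[N])
{
    for (int k = 0; k < N; ++k) {
        /* Partial pivoting: bring the largest remaining pivot up. */
        int p = k;
        for (int i = k + 1; i < N; ++i)
            if (fabs(A[i][k]) > fabs(A[p][k])) p = i;
        if (A[p][k] == 0.0) return -1;
        for (int j = 0; j < N; ++j) {
            double t = A[k][j]; A[k][j] = A[p][j]; A[p][j] = t;
        }
        double tb = b[k]; b[k] = b[p]; b[p] = tb;
        /* Eliminate entries below the pivot. */
        for (int i = k + 1; i < N; ++i) {
            double m = A[i][k] / A[k][k];
            for (int j = k; j < N; ++j) A[i][j] -= m * A[k][j];
            b[i] -= m * b[k];
        }
    }
    /* Back substitution. */
    for (int i = N - 1; i >= 0; --i) {
        double s = b[i];
        for (int j = i + 1; j < N; ++j) s -= A[i][j] * x[j];
        x[i] = s / A[i][i];
    }
    return 0;
}

int main(void)
{
    double A[N][N] = {{4,1,0,0},{1,4,1,0},{0,1,4,1},{0,0,1,4}};
    double b[N] = {5,6,6,5};   /* solution is x = (1,1,1,1) */
    double x[N];
    if (solve(A, b, x) == 0)
        for (int i = 0; i < N; ++i) printf("x[%d] = %f\n", i, x[i]);
    return 0;
}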
And NVIDIA's real bragging rights come when the power consumption is discussed. By using GPUs instead of purely CPUs to fuel its calculations, the installation's power footprint is cut from an estimated 12 MW to 4.04 MW, saving enough electricity to power 5,000 homes a year.
(Excerpt) Read more at dailytech.com ...
TOPICS: Business/Economy; Computers/Internet
KEYWORDS: hitech; supercomputing
The supercomputer uses 7,168 NVIDIA Tesla M2050 GPUs. (Source: NVIDIA)
To: rdb3; Calvinist_Dark_Lord; GodGunsandGuts; CyberCowboy777; Salo; Bobsat; JosephW; ...
3 posted on 10/28/2010 9:14:11 AM PDT by ShadowAce (Linux -- The Ultimate Windows Service Pack)
To: Ernest_at_the_Beach
Tesla M2050 / M2070 GPU Computing Module (EXCERPT)
Based on the next-generation CUDA architecture codenamed Fermi, the Tesla M2050 and M2070 Computing Modules enable seamless integration of GPU computing with host systems for high-performance computing and large data center, scale-out deployments. The 20-series Tesla GPUs are the first to deliver greater than 10X the double-precision horsepower of a quad-core x86 CPU and the first to deliver ECC memory. The Tesla M2050 and M2070 modules deliver all of the standard benefits of GPU computing while enabling maximum reliability and tight integration with system monitoring and management tools.
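Since ECC memory is the headline feature of the 20-series, here is a hedged sketch of how a developer could check it through the standard CUDA runtime API. The cudaGetDeviceProperties call and the ECCEnabled field are real cudaDeviceProp members, but the loop and printout are just an illustration, not NVIDIA's sample code.

#include <stdio.h>
#include <cuda_runtime.h>

int main(void)
{
    int count = 0;
    cudaGetDeviceCount(&count);
    for (int d = 0; d < count; ++d) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, d);
        printf("Device %d: %s (compute capability %d.%d)\n",
               d, prop.name, prop.major, prop.minor);
        printf("  SMs: %d, global memory: %lu MB\n",
               prop.multiProcessorCount,
               (unsigned long)(prop.totalGlobalMem >> 20));
        printf("  ECC enabled: %s\n", prop.ECCEnabled ? "yes" : "no");
    }
    return 0;
}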
To: Ernest_at_the_Beach
draws 4 MW
A friend of mine at IBM told me that the first time Los Alamos Labs turned on their Blue Gene supercomputer it browned out the whole town. They got a furious call from the power company.
5 posted on 10/28/2010 9:17:23 AM PDT by DManA
To: Ernest_at_the_Beach
By using GPUs instead of purely CPUs to fuel its calculations, the installation's power footprint is cut from an estimated 12 MW to 4.04 MW, saving enough electricity to power 5,000 homes a year.
They're not "saving" anything. They're using less than normal. I wish tech writers would learn to write--and think.
6 posted on 10/28/2010 9:18:29 AM PDT by ShadowAce (Linux -- The Ultimate Windows Service Pack)
To: DManA
A friend of mine at IBM told me that the first time Los Alamos Labs turned on their Blue Gene supercomputer it browned out the whole town.
LOL!! I did something similar. Fired up a cluster for the Navy, and four buildings went down. The power they had supplied for me wasn't adequate for the job. :)
7 posted on 10/28/2010 9:20:36 AM PDT by ShadowAce (Linux -- The Ultimate Windows Service Pack)
To: ShadowAce
They're not "saving" anything. They're using less than normal. I wish tech writers would learn to write--and think.
Bravo! Avoidance is not saving. If the article's author cannot write intelligently, he certainly cannot think intelligently!
8 posted on 10/28/2010 9:36:26 AM PDT by DakotaGator (Weep for the lost Republic! And keep your powder dry!!)
To: Ernest_at_the_Beach
Unlikely that NVIDIA would drop its consumer base. There's not enough business in the GPU computing sector to sustain it yet.
Are they even OpenCL compliant yet?
9 posted on 10/28/2010 9:40:39 AM PDT by rahbert
To: rahbert; ShadowAce; SunkenCiv; blam; Marine_Uncle; NormsRevenge
No idea.
OT... was just looking at this YouTube video:
CPU vs. GPU
An excellent demo for novices.
To: Ernest_at_the_Beach
but-but-but Ahrnold was on ABC news last night telling us that caleefornea had built a faster computer. he wouldn't lie to us, would he?
11 posted on 10/28/2010 10:56:56 AM PDT by camle (keep an open mind and someone will fill it full of something for you)
To: ShadowAce
They're not "saving" anything. They're using less than normal. I wish tech writers would learn to write--and think.
Yeah, I agree. What is this crap about NVIDIA being a generation behind ATI? The GTX 4xx series is the most advanced consumer GPU on the market right now. Sure, they were six months late getting them out there, but they did, and now they are ahead. Yeah, I am a bit biased, since I just spent over $1k on a pair of GTX 480s...
12 posted on 10/28/2010 12:13:33 PM PDT by America_Right (The best thing about the Obama Presidency: McCain isn't the President!)
To: camle
If you only knew how much... he will shade his statements.
To: Ernest_at_the_Beach
I saw the video of the painting exercise a few months back. I believe you may have posted it in a thread similar to this one, e.g. CPU vs. GPU.
But what the heck. It is a fun video to watch.
14 posted on 10/28/2010 4:09:04 PM PDT by Marine_Uncle (Honor must be earned....)