Posted on 05/19/2009 11:25:10 AM PDT by Ernest_at_the_Beach
Try to imagine where 3D gaming would be today if not for the graphics processing unit, or GPU. Without it, you wouldn't be trudging through the jungles of Crysis in all its visual splendor, nor would you be fending off endless hordes of fast-moving zombies at high resolutions. It takes a highly specialized chip designed for parallel processing to pull off the kinds of games you see today, games that wouldn't be possible on a CPU alone. Going forward, GPU makers will try to extend the reliance on videocards to also include physics processing, video encoding/decoding, and other tasks that were once handled by the CPU.
It's pretty amazing when you think about how far graphics technology has come. To help you do that, we're going to take a look back at every major GPU release since the infancy of 3D graphics. Join us as we travel back in time and relive releases like 3dfx's Voodoo3 and S3's ViRGE lineup. This is one nostalgic ride you don't want to miss!
A virgin in the 3D graphics arena, S3 thrust itself into this new territory in 1995 with its ViRGE graphics series. Playing on the hype surrounding virtual reality a decade and a half ago, ViRGE stood for Virtual Reality Graphics Engine, and it was one of the first 3D GPUs to take aim at the mainstream consumer. While nothing compared to today's offerings, early 64-bit ViRGE cards came with up to 4MB of onboard memory and core and memory clockspeeds of up to 66MHz. The chip also supported features such as bilinear and trilinear texture filtering, MIP mapping, alpha blending, video texture mapping, Z-buffering, and other 3D texture mapping goodies.
Ironically, those same cutting-edge features took a toll on the ViRGE silicon, resulting in underwhelming 3D performance. In some cases, performance was so bad that users could obtain better results with the CPU alone, causing the ViRGE to be unaffectionately dubbed the first 3D decelerator. Ouch.
Fun Fact: Just how far have graphics cards come in the past 15 years? Enough so that we've seen the S3 ViRGE selling for as little as $0.45 on the second-hand market.
fyi
“Just how far have graphics cards come in the past 15 years? Enough so that we’ve seen the S3 ViRGE selling for as little as $0.45 on the second-hand market.”
Hell. I’ve got one I’ll GIVE away!
I’m out of the loop. I don’t play any of those games. I fly Microsoft Flight Simulator X, and it’ll pretty much smoke any video card out there. I wish FSX was the standard by which benchmarks were compared.
Shoot ‘em games don’t intrigue me at all.
My first 3D video card was a Voodoo II, circa 1998 or 1999, which used the now-extinct Glide API. It was an “add-on” card and only did 3D graphics; you still had to have a separate 2D card for non-3D display. It only had 12MB of VRAM but was quite fast in its day. Worked like a charm with Need For Speed III.
HAHA, reminded me of your AGP card that you're STILL using!
Great stuff though. Talked about a lot of technology that my generation missed out on...
Then there was the 5080 engineering workstation....each tube required a two-drawer-file-cabinet-sized box at the display to hold all of the circuitry, connected to a unit at the CPU via coax....
I don't know if this aspect is covered in the article (I'll read it later), but the GPU architecture has been found to be so amenable to so many academic, defense, and industrial computational endeavors that at least two GPU manufacturers (NVIDIA and ATI) are now developing (minor revisions of their existing) chips specifically for the scientific computation market.
Just another example of supposedly "useless technology" eventually bearing gifts thanks to the twin engines of genuine human progress: science and the free market!
***********************************************
The IBM 2250 Display Unit was originally shipped with the IBM 1130 computer, introduced in 1965. The 2250 could also be attached to IBM 360-series mainframes, as ours was to the 360/91. Like most IBM terminals, attachment was via control unit (or in this case, direct channel) rather than communication port.
The 2250 was the "first commercially available graphics terminal" if you don't count the DEC PDP-1 display (1961). As late as 1971 there were only about 1000 interactive CRT graphic terminals installed in the USA, compared with 100,000 line printers, 50-100,000 Teletypes, and 70,000 "alphanumeric terminals" (such as the 2260) (CACM Vol.14 No.1, January 1971, p.60).
See the last page,...page 10... for the comment by the head of Nvidia about GPUs taking over from CPUs....aimed at Intel.
It's interesting how frequently the circle comes back around in the computer industry. CPUs used to need floating-point co-processors to do much math. Then, they were integrated into the CPU. Now, we have GPUs which are essentially super-duper floating-point co-processors all over again.
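The co-processor comparison is easy to make concrete. A classic GPU workload like SAXPY (y = a*x + y) performs the same floating-point operation independently at every index, with no element depending on any other, which is exactly why it maps so naturally onto thousands of GPU threads. A minimal sketch in plain Python (illustrative only; the per-index function stands in for what a GPU would launch as one thread per element):

```python
# SAXPY (y = a*x + y): the canonical data-parallel floating-point kernel.
# On a GPU, saxpy_element would run as one lightweight thread per index;
# here we simply map it over the indices serially to show the structure.

def saxpy_element(i, a, x, y):
    # Each index touches only its own elements -- no cross-index
    # communication, which makes the kernel embarrassingly parallel.
    return a * x[i] + y[i]

def saxpy(a, x, y):
    assert len(x) == len(y)
    return [saxpy_element(i, a, x, y) for i in range(len(x))]

if __name__ == "__main__":
    x = [1.0, 2.0, 3.0]
    y = [10.0, 20.0, 30.0]
    print(saxpy(2.0, x, y))  # [12.0, 24.0, 36.0]
```

The old x87 co-processor sped up one floating-point operation at a time; a GPU instead exploits the independence between iterations like these to run them all at once.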
While it can't hold a candle to ATI and nVidia's PCI-e offerings right now, I STILL don't have to rebuild my old system.
Picked up some new heat sinks for it today from New Egg. RAM and Proc. The old fans are getting noisy and I wanted to OC the DDR a bit to help it keep up.
Just one more year... ;-)
fyi
Are you going for a record?
My current GPU...
The HD 3850 AGP I'm replacing it with benchmarks pretty darn close to what its PCI-e cousin does. I just don't feel like shelling out another $700-800 for a new mobo/RAM for my gamer box.
Here is, I suspect, the same system (or nearly so), online:
Lenovo Desktop PC Powered by Intel Atom 230 Processor with Windows XP Home Edition
Price: $260.99
$50.00 Rebate
Detailed Description
********************************
You going in for an LC case? If I had the bucks I would
definitely liquid cool it. My work system is a twin-Xeon
Dell Precision 490 + NV Quadro FX 3500.
I could still use an upgrade though.
Disclaimer: Opinions posted on Free Republic are those of the individual posters and do not necessarily represent the opinion of Free Republic or its management. All materials posted herein are protected by copyright law and the exemption for fair use of copyrighted works.