When did computers start getting “powered” by GPUs?
Oh, about the time SETI@home got started.
Journalists take liberties with technical details.
I think it started as a way to offload physics in games from the CPU.
What’s the frame rate at 4K playing games on it? Pretty good, I’d bet.
But it would cost a lot more than $0.50 a game just on the electricity alone...
Over the last decade, GPU performance kept scaling while single-threaded CPU gains stalled as Moore’s Law ran up against its limits.
The problem with CPUs is that they have to process much more than one specific program. The operating system kernel, interrupts, and every other running process all compete for the same cores, so only a fraction of the CPU’s time ever goes to your actual workload.
Once GPUs moved from AGP to PCI Express, manufacturers started exposing the hardware for general-purpose programming (CUDA, OpenCL), and programmers could take advantage of the dedicated architecture to offload specific workloads while the CPU keeps the larger system online. Nvidia has taken the lead in GPU compute, and many experts predict GPUs will do the bulk of the number crunching going forward while the CPU sits in the background to manage the interconnects and keep the system online and stable.
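Here’s roughly what that offload pattern looks like in practice: a minimal CUDA sketch (kernel and variable names are just illustrative) where the host code on the CPU sets up the data and launches the work, and the GPU does the arithmetic across a million elements at once.

```cuda
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Each GPU thread handles exactly one element; the CPU just launches the work.
__global__ void scale_add(const float *a, const float *b, float *out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = 2.0f * a[i] + b[i];
}

int main() {
    const int n = 1 << 20;               // ~1M elements
    size_t bytes = n * sizeof(float);

    // Host-side (CPU) buffers.
    float *h_a = (float *)malloc(bytes);
    float *h_b = (float *)malloc(bytes);
    float *h_out = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { h_a[i] = (float)i; h_b[i] = 1.0f; }

    // Device-side (GPU) buffers in the card's dedicated memory.
    float *d_a, *d_b, *d_out;
    cudaMalloc(&d_a, bytes); cudaMalloc(&d_b, bytes); cudaMalloc(&d_out, bytes);
    cudaMemcpy(d_a, h_a, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(d_b, h_b, bytes, cudaMemcpyHostToDevice);

    // Kernel launch is asynchronous: the CPU is free to do other work
    // until the blocking copy below forces it to wait for the result.
    scale_add<<<(n + 255) / 256, 256>>>(d_a, d_b, d_out, n);

    cudaMemcpy(h_out, d_out, bytes, cudaMemcpyDeviceToHost);
    printf("out[42] = %f\n", h_out[42]);

    cudaFree(d_a); cudaFree(d_b); cudaFree(d_out);
    free(h_a); free(h_b); free(h_out);
    return 0;
}
```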
GPUs offer throughput the CPU can’t match: dedicated memory, their own scheduling, and thousands of parallel execution units. Many people run triple- and quad-GPU rigs to mine Bitcoin, since the brute-force hashing it requires (repeated SHA-256, not prime calculations) splits perfectly across all those cores, and the cards don’t even need SLI for it because each one grinds its own range of nonces. Bottom line: the less a processor has to share, the more work it gets done.
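To see why that workload suits a GPU so well, here’s a toy sketch of a parallel nonce search in CUDA. The hash here is a stand-in I made up (a simple multiplicative mix, nothing like real double-SHA-256); the point is only that every thread can test its own candidate with zero coordination between them.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Toy stand-in for the real hash, just to show the shape of the search.
__device__ unsigned int toy_hash(unsigned int nonce) {
    unsigned int h = nonce * 2654435761u;  // Knuth-style multiplicative mix
    h ^= h >> 16;
    return h;
}

// Every thread tests one nonce independently: no shared state, no waiting.
__global__ void search(unsigned int start, unsigned int target,
                       unsigned int *found) {
    unsigned int nonce = start + blockIdx.x * blockDim.x + threadIdx.x;
    if (toy_hash(nonce) < target)   // "difficulty": hash must fall below target
        atomicMin(found, nonce);    // report the lowest winning nonce
}

int main() {
    unsigned int *d_found, h_found = 0xFFFFFFFFu;  // sentinel: nothing found yet
    cudaMalloc(&d_found, sizeof(unsigned int));
    cudaMemcpy(d_found, &h_found, sizeof(unsigned int), cudaMemcpyHostToDevice);

    // ~1M nonces tested in a single launch across thousands of threads.
    search<<<4096, 256>>>(0, 0x0000FFFFu, d_found);

    cudaMemcpy(&h_found, d_found, sizeof(unsigned int), cudaMemcpyDeviceToHost);
    printf("winning nonce: %u\n", h_found);
    cudaFree(d_found);
    return 0;
}
```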