AMD said on Wednesday that it will lay off 4% of its global staff as the longtime computer chipmaker seeks to gain a stronger foothold in the growing artificial intelligence chip space.
https://www.techradar.com/news/what-is-an-ai-chip-everything-you-need-to-know
While GPUs typically outperform CPUs at AI processing, they're not perfect. The industry needs specialised processors to run AI applications, modelling, and inference efficiently. As a result, chip designers are now working to create processing units optimized for executing these algorithms. These go by many names, such as NPU, TPU, DPU, SPU, etc., but a catch-all term is the AI processing unit (AI PU).
The AI PU was created to execute machine learning algorithms, typically by operating on predictive models such as artificial neural networks. These chips are usually classified as either training or inference processors, as the two workloads are generally run independently.
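The training/inference split the article describes can be made concrete with a toy example. This is a minimal sketch of a single artificial neuron in plain Python (not any real chip's API): training is the compute-heavy loop of repeated forward passes and weight updates, while inference is a single cheap forward pass.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(w, b, x):
    # Inference: one weighted sum plus an activation function.
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

def train(samples, epochs=2000, lr=0.5):
    # Training: many repeated forward passes and gradient updates --
    # this is the phase that training-oriented AI PUs accelerate.
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, target in samples:
            y = forward(w, b, x)
            grad = y - target  # gradient of log-loss w.r.t. the pre-activation
            w = [wi - lr * grad * xi for wi, xi in zip(w, x)]
            b -= lr * grad
    return w, b

# Learn the logical AND function from four examples.
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train(data)
preds = [round(forward(w, b, x)) for x, _ in data]  # [0, 0, 0, 1]
```

Real models just scale this pattern up by many orders of magnitude, which is why the two workloads get dedicated silicon.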
They should model them after the human brain: a set of CPUs to handle just the basic stuff like breathing and heart rate, and an SPU (Specialized PU) for doing the thinking stuff.
“The AI PU was created to execute ...”
I think AI PUs are mostly fast adders, multipliers, etc. They're a bunch of transistors, most of which are constantly switching from 0 to 1 or vice versa, and every switch consumes energy.
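The "fast adders and multipliers" in that comment are multiply-accumulate (MAC) units, and a dot product, which is the core of the matrix multiplications inside neural networks, is just a chain of MACs. A rough sketch of the pattern in Python (an illustration of the arithmetic, not how any accelerator is programmed):

```python
def dot(a, b):
    # One multiply and one add per element pair: the multiply-accumulate
    # (MAC) pattern that AI accelerators run thousands of times in parallel.
    acc = 0.0
    for x, y in zip(a, b):
        acc += x * y  # multiply, then accumulate
    return acc

result = dot([1.0, 2.0, 3.0], [4.0, 5.0, 6.0])  # 32.0
```

An accelerator's advantage comes from doing all of these multiplies and adds at once in hardware rather than one at a time in a loop, which is also why so many of its transistors are switching on every cycle.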
Contrast this with a memory chip, which has billions of transistors but only a small fraction switching at any given time.
Thus, the AI PUs are energy gobblers. Put a bunch of them together, each running essentially the same not-so-imaginative software, and you have data centers that can gobble the output of a Three Mile Island reactor unit.
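The switching argument in the two comments above can be made quantitative with the standard CMOS dynamic-power formula, P ≈ α·C·V²·f (activity factor × switched capacitance × supply voltage squared × clock frequency). Below is a back-of-envelope sketch; the numbers are made up for illustration and are not any real chip's specs:

```python
def dynamic_power(alpha, c_farads, v_volts, f_hz):
    # Classic CMOS dynamic power: P = alpha * C * V^2 * f.
    # alpha is the fraction of capacitance switched each cycle -- the
    # commenters' point is that it is high for compute, low for memory.
    return alpha * c_farads * v_volts ** 2 * f_hz

# Hypothetical numbers: same capacitance, voltage, and clock, but a
# compute die toggles far more of its transistors than a memory die.
compute = dynamic_power(alpha=0.5,  c_farads=1e-9, v_volts=1.0, f_hz=2e9)  # 1.0 W
memory  = dynamic_power(alpha=0.02, c_farads=1e-9, v_volts=1.0, f_hz=2e9)  # 0.04 W
```

At equal voltage and frequency, the power gap is purely the activity factor, which is the "most of the transistors are always switching" observation in formula form.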