So I read the article and still can’t figure out what an “AI PC” does for me that a regular old PC doesn’t do. Near the bottom the author wrote “If you’re someone like Couvert, who does a lot with AI...”
That doesn’t help one bit. WHAT is Couvert doing, exactly?
In layman's terms, not much unless you do a lot of high-powered processing. And even then, we're talking about only certain tasks whose software is written to take advantage of that kind of instruction set.
Perhaps in 5 or so years, "normal" software apps will be built to use those instruction sets as well. For example, the laptop I'm typing this on is running a SQL Server database engine -- requiring more horsepower than most people need from their PCs (except maybe hardcore gamers). But the database software isn't designed to use the GPU-style chips of an "AI PC". So my database engine wouldn't run any faster on one of those chips... for now. Neither would any of my homemade software.
But a few years from now, perhaps I could recompile that same old software (a lot of it is C#) in a new version of the IDE I use, and with a few property settings it would build the end-user program to take advantage of those GPU-style chips, with no real changes to the code on my part. Then my apps (and other off-the-shelf apps) would run a lot faster on those PCs.
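To show the kind of code that could benefit, here's a minimal C# sketch using today's Parallel.For to spread a number-crunching loop across CPU cores. The workload and array names are made up, and Parallel.For is just a stand-in for whatever future compiler setting or library might route the same loop to a GPU/NPU instead:

```csharp
using System;
using System.Threading.Tasks;

class Demo
{
    static void Main()
    {
        // Made-up workload: apply the same math to a big array of numbers.
        double[] input = new double[10_000_000];
        double[] output = new double[input.Length];
        var rng = new Random(1);
        for (int i = 0; i < input.Length; i++) input[i] = rng.NextDouble();

        // Today: Parallel.For spreads the loop across CPU cores.
        // The hope is that a future compiler/IDE setting could push this
        // same kind of loop onto a GPU/NPU with no source changes.
        Parallel.For(0, input.Length, i =>
        {
            output[i] = Math.Sqrt(input[i]) * 3.14159 + Math.Log(input[i] + 1.0);
        });

        Console.WriteLine($"Sample result: {output[0]:F4}");
    }
}
```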
But even then, it wouldn't matter unless your app did a lot of heavy processing (as opposed to a lightweight app that spends its time waiting for data to download from the internet, like the TUBI app on my ROKU device, which will never speed up unless my internet connection does).
For me, it would matter when I lead a financial small group and someone asks, "What would have happened to investment portfolio A for someone who retired right before the dot-com bubble burst?" None of that data has to be downloaded -- I already have it on my laptop, so there's no lag from internet speed. The only delay in producing a report to answer that question is pure processing power churning through the market data for that period -- faster processing would be nice, making it take maybe half a second instead of 2 or 3 seconds. But virtually nobody does that kind of thing for personal use.
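For the curious, that "churn" isn't exotic -- at its core it's just compounding a long list of returns over the period in question. A rough C# sketch with made-up numbers (the real thing would load daily returns from files already sitting on my laptop):

```csharp
using System;
using System.Collections.Generic;

class PortfolioBacktest
{
    // Hypothetical: daily percentage returns for "portfolio A",
    // already loaded from local files (no internet needed).
    static double EndingBalance(double startingBalance, IEnumerable<double> dailyReturns)
    {
        double balance = startingBalance;
        foreach (double r in dailyReturns)
        {
            balance *= (1.0 + r);   // compound each day's return
        }
        return balance;
    }

    static void Main()
    {
        // Stand-in data: a flat 0.02% daily return over ~5 years of trading days.
        var returns = new List<double>();
        for (int day = 0; day < 1250; day++) returns.Add(0.0002);

        double result = EndingBalance(500_000, returns);
        Console.WriteLine($"Ending balance: {result:C0}");
    }
}
```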
The same goes for studying the past few years of my solar inverter data (it records how much solar power came in, how much of it powered my home, how much I had to pull from the grid, etc., every 5 minutes). Suppose I wanted to re-examine all of the rate plans my power utility offers to see which one would have cost me the least, based on what I actually pulled from the grid at each time of day, in each month, under each plan's time-of-day rates and how those change with the seasons. Again, none of that data has to wait on a download from the internet (the usual reason we wait when using PCs). But it's a task that might take a few seconds or even a minute to run because it's churning through a lot of data. And virtually no one else does that kind of thing with their laptop or desktop. So this GPU technology is really for AI and blockchain and gaming stuff.
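For what it's worth, that rate-plan comparison boils down to multiplying each 5-minute grid reading by whatever a given plan charges at that time of day, then summing up a year's worth. A rough C# sketch with made-up usage and made-up rates (the real rates and readings would come from my utility and my inverter logs):

```csharp
using System;
using System.Collections.Generic;

// One 5-minute reading from the inverter log: when it was taken and
// how many kWh had to be pulled from the grid in that interval.
record GridReading(DateTime Timestamp, double KwhFromGrid);

class RatePlanComparison
{
    // A rate plan is modeled as a function from timestamp to price per kWh,
    // so it can vary by hour of day and by month/season.
    static double AnnualCost(IEnumerable<GridReading> readings,
                             Func<DateTime, double> pricePerKwh)
    {
        double total = 0.0;
        foreach (var r in readings)
            total += r.KwhFromGrid * pricePerKwh(r.Timestamp);
        return total;
    }

    static void Main()
    {
        // Stand-in data: a year of 5-minute readings with a made-up usage pattern.
        var readings = new List<GridReading>();
        var t = new DateTime(2023, 1, 1);
        while (t < new DateTime(2024, 1, 1))
        {
            double kwh = (t.Hour >= 17 && t.Hour < 21) ? 0.25 : 0.08; // heavier evening pull
            readings.Add(new GridReading(t, kwh));
            t = t.AddMinutes(5);
        }

        // Two made-up plans: a flat rate vs. time-of-use with a pricey evening peak.
        double flat = AnnualCost(readings, _ => 0.14);
        double tou  = AnnualCost(readings, ts => (ts.Hour >= 16 && ts.Hour < 21) ? 0.32 : 0.10);

        Console.WriteLine($"Flat plan:        {flat:C0}");
        Console.WriteLine($"Time-of-use plan: {tou:C0}");
    }
}
```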