"Actually you can run AI on your local PC. Just buy a decent Nvidia graphics card and it runs well."
... and your PC consumes twice as much power as before.
Of course I can run some AI on my PC; I actually have. But the Large Language Models being used in industry first have to be trained, and that is where the real cost lies. Training demand is why companies like Microsoft are building data centers in far-away countries like New Zealand: there is not enough spare electricity in the US for it.
Overall, training a single large language model like GPT-3 (the model behind ChatGPT) can consume up to 10 gigawatt-hours (GWh) of electricity, roughly equivalent to the yearly electricity consumption of over 1,000 U.S. households.
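A quick back-of-the-envelope check of that household comparison. The 10,000 kWh/year figure for an average U.S. household is an assumption on my part (it is in the right ballpark of published averages), not a number from the source:

```python
# Sanity check: 10 GWh of training energy vs. annual household electricity use.
TRAINING_ENERGY_GWH = 10            # cited training cost for a GPT-3-class model
KWH_PER_GWH = 1_000_000             # 1 GWh = 1,000,000 kWh
HOUSEHOLD_KWH_PER_YEAR = 10_000     # assumed average U.S. household annual use

training_kwh = TRAINING_ENERGY_GWH * KWH_PER_GWH
households = training_kwh / HOUSEHOLD_KWH_PER_YEAR
print(f"{households:.0f} household-years of electricity")  # → 1000
```

With these round numbers the comparison comes out at about 1,000 households, matching the claim above.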
Source: "Q&A: UW researcher discusses just how much energy ChatGPT uses"