I asked Grok: Why does AI require so much electricity?
AI requires significant electricity due to the immense computational power needed for training and running large models. Training involves processing vast datasets through complex neural networks, often on specialized hardware like GPUs or TPUs, which are power-hungry. For example, training a single large language model can emit as much CO2 as a transatlantic flight, with data centers consuming hundreds of megawatts. Inference—using the model for tasks like answering queries—also demands constant computation, especially for real-time applications. Cooling systems for these data centers add to the energy load. As AI models grow larger and more widespread, their energy consumption scales, often outpacing efficiency gains.
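To put figures like "hundreds of megawatts" and flight-scale CO2 in rough perspective, here is a minimal back-of-envelope sketch of how training energy is commonly estimated: accelerator count × per-device power × run time, multiplied by a PUE (power usage effectiveness) factor covering cooling and other facility overhead. Every number below is an illustrative assumption, not a measurement of any particular model or data center.

```python
# Back-of-envelope training energy estimate.
# All figures are assumptions chosen for illustration only.

gpu_count = 10_000        # assumed number of accelerators in the training run
gpu_power_kw = 0.7        # assumed average draw per accelerator, in kW
pue = 1.3                 # assumed power usage effectiveness (cooling/overhead multiplier)
training_days = 30        # assumed wall-clock duration of the run

hours = training_days * 24
it_energy_mwh = gpu_count * gpu_power_kw * hours / 1000   # IT load only, in MWh
facility_energy_mwh = it_energy_mwh * pue                 # including cooling and overhead

grid_intensity_kg_per_mwh = 400   # assumed grid CO2 intensity, kg CO2 per MWh
co2_tonnes = facility_energy_mwh * grid_intensity_kg_per_mwh / 1000

print(f"Facility energy: {facility_energy_mwh:,.0f} MWh")
print(f"Estimated emissions: {co2_tonnes:,.0f} t CO2")
```

The PUE multiplier is where the cooling load shows up: a PUE of 1.3 means roughly 30% extra energy on top of the compute itself.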
“Cooling systems for these data centers add to the energy load.”
Data-center cooling also consumes large amounts of water: there have been issues with water wells on nearby properties becoming less productive, as well as with municipal water costs increasing.