One organic human brain = roughly 86 billion neurons = roughly 100-500 trillion connections. Grok 4 is estimated at about 3 trillion parameters (not that they're 1-to-1, but hey), cost about $490M to train, and took about 200k GPUs. So, playing very fast and loose with the numbers, one could say they're about 100x behind anything like a human-equivalent AI.
We do this work on about 20 W of power. Grok 4 took roughly 310 gigawatt-hours to train.
If this thing isn't going to collapse under its own weight, the designers need to be thinking about energy efficiency. They'll hit that wall before they get to where they want to be.
Source:
https://epoch.ai/data-insights/grok-4-training-resources
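A rough back-of-envelope in Python using the figures above (~86 billion neurons, a ~300 trillion connection midpoint, ~3 trillion parameters, 20 W, 310 GWh). All values are order-of-magnitude public estimates, not measured numbers, so treat the outputs as illustrative only:

```python
# Back-of-envelope: brain vs. Grok 4 scale and energy.
# All figures are rough public estimates, not measured values.

brain_synapses = 300e12         # ~100-500 trillion connections; midpoint used
grok4_params = 3e12             # ~3 trillion parameters (estimated)

scale_gap = brain_synapses / grok4_params
print(f"Connections-to-parameters gap: ~{scale_gap:.0f}x")   # ~100x

brain_power_w = 20              # brain runs on roughly 20 W
grok4_train_energy_wh = 310e9   # ~310 GWh reported training energy

# How long could a 20 W brain run on the energy used to train Grok 4?
brain_hours = grok4_train_energy_wh / brain_power_w
brain_years = brain_hours / (24 * 365)
print(f"Equivalent brain runtime: ~{brain_years / 1e6:.1f} million years")
```

On those assumptions the training energy alone would power a brain for well over a million years, which is the efficiency gap the post is pointing at.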
Exactly! The delta in power consumption alone is enough to tell you the architecture is wrong. Also, it ignores quantum effects.