AI Supercomputers by 2030 Could Cost $200 bn, Use 2 mn Chips, and Demand Power of 9 Nuclear Reactors
Power constraints, not compute or chips, will likely become the primary bottleneck in AI advancement

The computational power of AI supercomputers has been doubling every nine months, while their hardware costs and power demands have doubled annually, according to a new analysis by Epoch AI.
Drawing on a dataset of 500 AI supercomputers built between 2019 and 2025, the report warns that if these trends continue, the leading AI supercomputer in 2030 could require two million AI chips, cost $200 billion, and draw 9 GW of power—roughly the output of nine nuclear reactors.
For context, the current leading system, xAI’s Colossus, already uses 200,000 chips and draws as much power as roughly 250,000 homes.
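The 2030 figures follow from compounding the annual doubling the article describes. A minimal sketch of that extrapolation, assuming (these baselines are not from the article) that Colossus draws about 0.3 GW and that its hardware cost is on the order of $7 billion:

```python
# Hedged sketch: extrapolating the annual-doubling trends quoted above.
# Assumed baselines (NOT from the article): ~0.3 GW power draw and
# ~$7 bn hardware cost for today's leading system.

def project(value, doubling_period_years, years):
    """Compound growth: value doubles every doubling_period_years."""
    return value * 2 ** (years / doubling_period_years)

YEARS = 5  # 2025 -> 2030

power_gw = project(0.3, 1.0, YEARS)  # power demand doubles annually
cost_bn = project(7.0, 1.0, YEARS)   # hardware cost doubles annually

print(f"Projected power by 2030: ~{power_gw:.1f} GW")
print(f"Projected cost by 2030:  ~${cost_bn:.0f} bn")
```

Five annual doublings multiply each quantity by 32, which lands near the report's 9 GW and $200 billion figures under these assumed starting points.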
Epoch AI notes this growth is shifting AI infrastructure away from governments and academia toward private companies.
"The computing resources used to train notable AI models have increased at a rate of 4–5× per year since the beginning of the deep learning era in 2010. This exponential increase has been a major driver of improvements in AI capabilities across many domains, such as in large language models or image generation. Most of this increase in compute has been driven by larger, higher-performance AI supercomputers," Epoch AI said in the research paper.
If these trends hold, power constraints—rather than compute or chip supply—will likely become the primary bottleneck to AI advancement, pushing the industry toward training distributed across multiple sites.
The findings raise urgent questions about sustainability, national competitiveness, and the future of AI deployment.