When Sam Altman said the next wave of AI would require energy breakthroughs, most dismissed it as CEO hyperbole. He was understating it. The AI boom has triggered a demand for electricity that is straining power grids, reviving mothballed nuclear plants, and forcing serious conversations about whether the benefits of AI justify its energy cost. The numbers are staggering, and they demand honest examination.
The Scale of the Problem
Training a frontier AI model like GPT-4 consumed an estimated 50 gigawatt-hours of electricity, roughly the annual consumption of 5,000 US homes. Inference, running the models to answer user queries, may ultimately dwarf training's energy consumption: ChatGPT processes hundreds of millions of queries per day, and each query consumes roughly 10x the electricity of a traditional Google search. The International Energy Agency estimates that global data center electricity consumption will double by 2026, driven primarily by AI.
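To make these comparisons concrete, here is a back-of-envelope sketch in Python. The constants are assumptions consistent with the rough figures above, not measured values: roughly 10,500 kWh per year for an average US household, about 3 Wh per chatbot query versus 0.3 Wh per search (the ~10x ratio), and 300 million queries per day as a stand-in for "hundreds of millions."

```python
# Back-of-envelope check on the figures above. All constants are
# assumptions consistent with the estimates cited in the text.

TRAINING_GWH = 50             # estimated GPT-4 training consumption
HOME_KWH_PER_YEAR = 10_500    # assumed average US household usage

training_kwh = TRAINING_GWH * 1_000_000
print(f"training ~ {training_kwh / HOME_KWH_PER_YEAR:,.0f} US home-years")
# -> roughly 4,800 home-years, i.e. about 5,000 homes for one year

# Inference side: assume ~3 Wh per chatbot query vs ~0.3 Wh per search,
# at hundreds of millions of queries per day.
QUERIES_PER_DAY = 300_000_000  # assumed
WH_PER_QUERY = 3.0             # assumed

annual_gwh = QUERIES_PER_DAY * WH_PER_QUERY * 365 / 1e9
print(f"inference ~ {annual_gwh:,.0f} GWh/year at that rate")
# -> ~330 GWh/year, several times the one-off training cost
```

Even with generous error bars on every constant, the arithmetic shows why inference, not training, dominates total energy use in the long run.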
Grid Pressure
In Virginia, home to the world's highest concentration of data centers, the utility Dominion Energy projects electricity demand growth of 85% over the next 15 years, driven primarily by AI data center expansion. Microsoft, Google, and Amazon are all siting data centers in regions with abundant, cheap electricity: the Pacific Northwest (hydropower), Iowa (wind), and, increasingly, states with nuclear capacity. All three companies have committed to 100% renewable energy, but the scale and speed of AI growth are making those commitments increasingly difficult to fulfill.
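As a quick sanity check, the compound annual growth rate implied by that 85%-over-15-years projection can be computed directly (a minimal sketch; the constants simply restate the projection above):

```python
# Compound annual growth rate implied by Dominion's projection:
# 85% total electricity demand growth over 15 years.

TOTAL_GROWTH = 0.85
YEARS = 15

cagr = (1 + TOTAL_GROWTH) ** (1 / YEARS) - 1
print(f"implied demand growth: {cagr:.1%} per year")  # ~4.2% per year
```

Sustained growth of roughly 4% a year is a sharp break from the near-flat demand US utilities have planned around for most of the past two decades, which is why grid planners are alarmed.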
Nuclear Power's AI Moment
Nuclear power is experiencing a renaissance driven by AI demand. Microsoft signed a deal to restart Three Mile Island specifically to power its AI data centers. Google signed a power purchase agreement with Kairos Power for small modular reactor output. Amazon has announced multiple nuclear investments. The appeal is obvious: nuclear power provides reliable, around-the-clock electricity without carbon emissions. Unlike wind and solar, it does not require battery storage to handle intermittency.
Efficiency Improvements
The energy per AI computation has been falling dramatically even as total consumption rises. DeepSeek's efficiency breakthrough demonstrated that model quality once assumed to require massive compute can be approached with far less through better architecture and training methods. Distillation transfers the capabilities of large models to much smaller ones. Quantization reduces the numerical precision of model weights, cutting memory and energy consumption with minimal quality loss. These efficiency gains provide reason for cautious optimism.
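To illustrate the quantization point, here is a minimal, self-contained sketch of symmetric int8 weight quantization in NumPy. The function names and the random "layer" are purely illustrative; production systems use per-channel scales and calibration, but the 4x storage reduction and small reconstruction error shown here are the core of the energy argument.

```python
import numpy as np

def quantize_int8(weights: np.ndarray) -> tuple[np.ndarray, float]:
    """Symmetric per-tensor quantization of float32 weights to int8."""
    scale = np.abs(weights).max() / 127.0  # map the largest magnitude to 127
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover an approximation of the original float32 weights."""
    return q.astype(np.float32) * scale

# Illustrative example: a random weight matrix stands in for a model layer.
rng = np.random.default_rng(0)
w = rng.normal(0, 0.02, size=(1024, 1024)).astype(np.float32)

q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)

print(f"storage: {w.nbytes / 1e6:.1f} MB float32 -> {q.nbytes / 1e6:.1f} MB int8")
print(f"max reconstruction error: {np.abs(w - w_hat).max():.1e}")
```

Smaller weights mean less memory traffic per inference, and memory movement, not arithmetic, is where much of the energy goes in modern accelerators.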
The Bigger Picture
The AI energy crisis is real, but not necessarily disqualifying. If AI helps optimize the power grid, accelerate climate science, improve energy efficiency across the economy, and discover new clean energy materials, it may contribute more to solving the energy problem than it creates. The key is ensuring that AI's energy consumption is matched by genuine decarbonization of the grid, not just paper renewable commitments. This requires policy action, investment in clean energy infrastructure, and honest accounting of AI's true environmental impact.