The Trajectory

By 2030–2035, data centers could account for 20% of global electricity use, putting immense strain on power grids. But the crunch comes sooner: according to NTT Data's Sustainable AI for a Greener Tomorrow report, more than half of data center electricity could be devoted to AI by 2028. At that point, AI alone could consume as much electricity annually as 22% of all US households.

By 2026, the electricity consumption of data centers is expected to approach 1,050 terawatt-hours, which would rank them fifth among electricity consumers worldwide, between Japan and Russia.
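To see what the 22%-of-households comparison implies in absolute terms, here is a back-of-envelope sketch. The household count (~131 million) and average annual consumption (~10,500 kWh) are rough EIA-style estimates I'm assuming for illustration, not figures from the report:

```python
# Rough check: what does "22% of all US households" mean in TWh?
# Both constants below are assumptions, not from the NTT Data report.
US_HOUSEHOLDS = 131_000_000           # approximate number of US households
KWH_PER_HOUSEHOLD_YEAR = 10_500       # approximate average annual use

ai_twh = 0.22 * US_HOUSEHOLDS * KWH_PER_HOUSEHOLD_YEAR / 1e9  # kWh -> TWh
print(f"Implied AI consumption: ~{ai_twh:.0f} TWh per year")  # ~300 TWh
```

Roughly 300 TWh a year, a meaningful slice of the ~1,050 TWh projected for all data centers, and consistent with AI's share still climbing toward the report's 2028 figure.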

The Cooling Trap

Advanced cooling systems in AI data centers also consume large volumes of water, which can have serious environmental consequences in regions already facing water scarcity.

Innovation Response

Training the neuro-symbolic model used only 1% of the energy required to train a comparable VLA (vision-language-action) model, and the savings carried over to task execution, where the neuro-symbolic model used only 5% of the energy the VLA required.
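The 1% (training) and 5% (execution) figures can be combined into a single lifecycle estimate. A minimal sketch, assuming a hypothetical 30/70 split of the baseline model's lifetime energy between training and serving (the split is my illustration, not from the study):

```python
def lifecycle_energy_ratio(train_frac, infer_frac, train_energy, infer_energy):
    """Energy used by the efficient model as a fraction of the baseline,
    given the baseline's training energy and total inference energy."""
    baseline = train_energy + infer_energy
    efficient = train_frac * train_energy + infer_frac * infer_energy
    return efficient / baseline

# Hypothetical split: baseline spends 30% of lifetime energy on training,
# 70% on serving tasks (units are arbitrary; only the ratio matters).
ratio = lifecycle_energy_ratio(0.01, 0.05, 30, 70)
print(f"{ratio:.1%} of baseline energy")  # 3.8% of baseline energy
```

Whatever the split, the combined figure is bounded between 1% and 5% of the baseline, so the headline savings hold across workload mixes.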

But breakthroughs like this are rare. When you search on Google, the AI summary at the top of the page can consume up to 100 times more energy than generating the traditional list of results.

My Take: Energy is becoming the real cost ceiling for AI. Capital and compute are abundant; clean power isn't. Companies racing to deploy massive models are taking on hidden liabilities—energy costs, water permits, grid strain. The next wave of AI innovation won't be about model size, it will be about efficiency per watt. Expect a bifurcation: frontier labs will burn whatever energy they need; startups will compete on efficiency.

Sources