Earlier this week, economist Paul Krugman discussed data centers in his column. Like Toto in The Wizard of Oz, he pulls back the curtain so that we can see what’s powering the magic…
Many applications of information technology are, like the automats of yore, less miraculous than they seem. True, the user experience makes you feel as if you’ve transcended the material world. You click a button on Amazon’s web site and a day or two later the item you wanted magically appears on your porch. But behind that hands-free experience lie a million-strong workforce and a huge physical footprint of distribution centers and delivery vehicles.
And the disconnect between the trans-material feel of the consumer experience and the physical realities that deliver that experience is especially severe for the hot technology of the moment, AI. We’re constantly arguing about whether AI is a bubble, whether it can really live up to the hype. We don’t talk enough about AI’s massive use of physical resources, especially but not only electricity.
And we certainly don’t talk enough about (a) how U.S. electricity pricing effectively subsidizes AI and (b) the extent to which limitations on generating capacity may nonetheless severely limit the technology’s growth.
How much generating capacity are we talking about? The Department of Energy estimates that data centers already consumed 4.4 percent of U.S. electricity in 2023, and expects that to grow to as much as 12 percent by 2028:

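To get a feel for what those percentages mean in absolute terms, here is a rough back-of-the-envelope sketch. It assumes total U.S. electricity consumption of about 4,000 terawatt-hours per year, an approximate figure that does not appear in the column, and simply applies the DOE's 4.4 percent and 12 percent shares to it.

```python
# Back-of-the-envelope: what the DOE's data-center shares imply in absolute terms.
# ASSUMPTION: total U.S. electricity consumption of ~4,000 TWh/year (a rough figure,
# not from Krugman's column); the 4.4% and 12% shares are the ones he cites.

TOTAL_US_TWH = 4_000          # approximate annual U.S. electricity consumption

share_2023 = 0.044            # DOE estimate: data centers' share in 2023
share_2028 = 0.12             # DOE upper-end projection for 2028

twh_2023 = TOTAL_US_TWH * share_2023
twh_2028 = TOTAL_US_TWH * share_2028

print(f"Data centers, 2023: ~{twh_2023:.0f} TWh")
print(f"Data centers, 2028 (upper end): ~{twh_2028:.0f} TWh")
print(f"Implied additional demand: ~{twh_2028 - twh_2023:.0f} TWh per year")
```

Holding total consumption flat understates the 2028 figure somewhat, since overall demand is growing too, but the order of magnitude is the point: that is a great deal of new demand to absorb, which is why Krugman turns next to where the kilowatt-hours will come from.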
Krugman also recognizes the disconnect between our growing energy needs and the policies that help or hinder new generating capacity…
So suppose that AI really does consume vast quantities of electricity over the next few years. Where are all those kilowatt-hours supposed to come from?
America is, of course, adding generating capacity as you read this, and can accelerate that expansion if it chooses to. But there are two big obstacles to any attempt to keep up with the demand from AI.
The first is that in recent years growth in U.S. generating capacity has become increasingly dependent on growth in renewable energy. According to S&P Global, almost 90 percent of the generating capacity added in the first 8 months of 2024 came from solar and wind:

Krugman explains why this is a problem: the current Administration has eliminated many of the green energy subsidies and has been trying to block future solar and wind projects. Since, as the graph above indicates, nearly all new generating capacity now comes from solar and wind, those policies threaten to choke off the growth in generation that AI will demand.
Krugman offers a possible solution…
Indeed, requiring that the AI industry take responsibility for the costs it imposes makes a lot of sense. It would by no means end progress in AI. As the website Tech Policy notes, there are many AI applications in which smaller, more focused models can perform almost as well as the bloated, all-in-one models currently dominating the field, while consuming far less energy. Until now there has been no incentive to take energy consumption into account, but there’s every reason to believe that we could achieve huge efficiency gains at very low cost.
But will we do the sensible thing? It’s obvious that any attempt to make AI more energy-efficient would lead to howls from tech bros who believe that they embody humanity’s future — and these bros have bought themselves a lot of political power.
So I don’t know how this will play out. I do know that your future electricity bills depend on the answer.