Yves here. This post does a good, layperson-friendly job of describing how the tech-overlord-envisaged explosion in data centers is even more problematic than you might have realized. It also adds a further impediment to the buildout: transformer scarcity.
By Michael Kern, newswriter and editor at Safehaven.com and Oilprice.com. Originally published at OilPrice
- AI’s shift from CPUs to ultra–power-dense GPUs has created a structural surge in electricity demand that is outpacing efficiency gains and overwhelming the grid.
- The need for 24/7 baseload power is delaying coal retirements, boosting natural gas, and shifting data center growth to regions with cheap land, weak regulation, and vulnerable communities.
- Global transformer shortages, mineral constraints, and interconnection bottlenecks mean that compute—not oil—may become the next strategic resource nations hoard.
“The Cloud” might be the greatest branding trick in history. It sounds fluffy, ethereal, and notably light.
It implies that our digital lives…our emails, our crypto wallets, our endless scrolling…exist in some vaporous layer of the atmosphere, detached from earthly constraints.
But if you actually drive out to Loudoun County, Virginia, or stare at the arid plains of Altoona, Iowa, you realize the Cloud is actually just a very big, very loud, and very hot factory.
We’ve been telling ourselves a lovely story about the energy transition. We were retiring coal plants, building wind farms, and decoupling economic growth from carbon emissions. It was all going according to plan.
For years, the tech sector achieved relative decoupling…
Moore’s Law kept server efficiency gains ahead of the curve, allowing internet traffic to surge while power demand grew slowly.
The exponential curve of AI, however, has shattered this delicate balance. AI workloads are so compute-intensive that demand is now skyrocketing faster than efficiency gains can compensate.
This is a re-coupling with physics…and the defining narrative of the next decade isn’t about supply anymore.
Now it’s about a structural shift in demand that almost nobody priced in: The thermodynamics of Artificial Intelligence.
According to the International Energy Agency (IEA), global electricity demand from data centers is projected to more than double by 2030, to roughly the entire annual electricity consumption of a country like Japan.
The invisible hand is hitting a concrete wall.
The question is no longer if the grid can handle it, but what is making the demand curve look like a rocket launch. The answer isn’t better software or smarter algorithms; it’s the raw physics happening inside a rack that now demands the power of a city block.
The Thermodynamics of “Thinking”
To understand why the grid is struggling right now, you have to look at the silicon.
For a long time, we ran the internet on CPUs (Central Processing Units). These are the general managers of the chip world. Efficient, predictable.
But Generative AI doesn’t want a manager. It wants a battalion of mathematicians. It runs on GPUs (Graphics Processing Units), specifically monsters like Nvidia’s H100.
Here’s what that actually means for power draw:
- Traditional server rack: Draws about 5 to 10 kilowatts (kW).
- Modern AI rack (H100s/Blackwell): Draws 50 to 100 kW.
We have effectively moved from powering a toaster to powering a neighborhood, all inside the same metal box. Air cooling…fans blowing over hot metal…doesn’t work anymore. Air just isn’t physically dense enough to move that much heat away.
We are now plumbing data centers like chemical refineries, running liquid coolant loops directly to the silicon die.
This is the new reality of Direct-to-Chip (DTC) cooling. It is already happening in cutting-edge AI centers because it is the only way to manage the extreme heat density of chips like the H100.
Liquid cooling saves energy compared to air conditioning. While the racks themselves still draw that 50 to 100 kW, the overall cooling system…the pumps and chillers…consumes far less power than running massive air handlers for the whole room. This makes it an efficiency measure born of necessity.
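To put rough numbers on that trade-off, here is a minimal back-of-the-envelope sketch. The rack count and the PUE (Power Usage Effectiveness) figures are illustrative assumptions, not values from the article; only the 50 to 100 kW per rack range comes from above.

```python
# Illustrative sketch only: the rack count and PUE values are assumptions.
# PUE (Power Usage Effectiveness) = total facility power / IT equipment power.

AIR_COOLED_PUE = 1.5      # assumed overhead for a conventional air-cooled hall
LIQUID_COOLED_PUE = 1.15  # assumed overhead with direct-to-chip liquid cooling

def facility_power_mw(racks: int, kw_per_rack: float, pue: float) -> float:
    """Total facility draw in MW: IT load multiplied by the cooling/overhead factor."""
    it_load_mw = racks * kw_per_rack / 1000
    return it_load_mw * pue

# A hypothetical AI hall: 1,000 racks at 80 kW each (mid-range of the 50-100 kW figure)
for label, pue in [("air-cooled", AIR_COOLED_PUE), ("liquid-cooled", LIQUID_COOLED_PUE)]:
    mw = facility_power_mw(1_000, 80, pue)
    gwh_per_year = mw * 8760 / 1000  # running 24/7 for a full year
    print(f"{label}: ~{mw:.0f} MW total draw, ~{gwh_per_year:.0f} GWh per year")
```

On those assumed numbers, the same 80 MW of computing costs roughly 120 MW at the wall with air handlers but closer to 92 MW with liquid loops, which is why the plumbing is worth the trouble.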
The next step is Immersion Cooling, where entire server racks are submerged in a non-conductive fluid. This is also being deployed now, often in pilot programs and specialized facilities.
This shift from fans to specialized plumbing and chemically inert fluids is the physical realization of the industrialization of thought.
Just like the industrialization of textiles or steel, it requires massive inputs of raw power and exotic, specialty materials. This industrial intensity demands something traditional renewable sources…intermittent solar and wind…struggle to provide: reliability.
When an AI training run costs tens of millions of dollars, a 1% flicker is an existential threat.
The Dirty Secret of the “Green” AI Boom
Every major tech CEO is currently on a podcast tour talking about their “Net Zero” 2030 goals. And sure, they are buying a lot of paper credits.
But physics doesn’t care about carbon offsets. The reality is that AI needs baseload power. It needs to run 24/7/365 with “five nines” (99.999%) of reliability.
You know what provides that?
- Nuclear (which takes 10 years to build).
- Batteries (which we don’t have enough of).
- Fossil Fuels.
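For a sense of what “five nines” actually buys, here is a quick arithmetic sketch of the downtime budget each availability tier allows; the only input is the number of minutes in a year.

```python
# Downtime allowed per year at each availability tier (525,600 minutes in a year).
MINUTES_PER_YEAR = 365 * 24 * 60

for label, availability in [("three nines", 0.999),
                            ("four nines", 0.9999),
                            ("five nines", 0.99999)]:
    downtime_min = MINUTES_PER_YEAR * (1 - availability)
    print(f"{label} ({availability:.3%} uptime): "
          f"~{downtime_min:.0f} minutes of downtime per year")
```

Five nines works out to roughly five minutes of interruption a year, a bar that intermittent generation cannot clear on its own, which is why the list above looks the way it does.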
According to IEA data, coal still accounts for about 30% of global data center power. And in the U.S., natural gas is doing the heavy lifting, covering over 40% of demand.
The irony is palpable. We spent billions trying to kill coal, only to have the most futuristic technology on earth, AI, throw it a lifeline.
In places like Virginia or Kansas, utilities are delaying the retirement of coal plants. They simply cannot risk grid instability when a gigawatt-scale data center comes online.
The “future” is being powered by the “past.”
The need for this reliable baseload power, combined with the sheer gigawatt-scale hunger of these new facilities, is now fundamentally reshaping the American power landscape. Capital always flows to the path of least resistance—and right now, that path runs right through communities that have never seen a single dollar of tech prosperity.
The New Geography of Power (and Inequality)
This energy hunger is redrawing the map. We are seeing a “K-shaped” geography of infrastructure.
In the U.S., “Data Center Alley” in Northern Virginia supposedly handles 70% of the world’s internet traffic. But the grid there is tapped out. You can’t get a new hookup for years.
So, the capital is fleeing to places with looser regulations and cheaper land: Texas, Ohio, Arizona.
But this brings us to the friction point. These facilities are neighbors. And they are often bad neighbors. They are loud, they consume massive amounts of water for cooling, and they raise local utility rates.
There is also a significant Environmental Justice component here. Industrial infrastructure is rarely sited in wealthy neighborhoods.
According to the NAACP’s “Fumes Across the Fence-Line” report:
- African Americans are 75% more likely than white Americans to live in “fence-line” communities (areas adjacent to industrial facilities).
- A disproportionate number of fossil-fuel peaker plants, which fire up when data centers max out the grid, are located in low-income areas and communities of color.
This directly contributes to higher rates of asthma and respiratory issues.
While the “invisible prosperity” of AI stock gains flows to portfolios in San Francisco and New York, the “visible decay”…the pollution, the water usage, the hum of the cooling fans…is localized in communities that often see none of the upside.
Even if a community were willing to bear the cost, the industrial supply chain that once smoothly equipped the electrical grid is now choked.
The problem is no longer just where to put the data center, but how to physically connect the massive, power-hungry factory to the existing grid infrastructure. This process is crippled by a global bottleneck of essential, non-digital hardware.
The Great Transformer Shortage
Let’s say you have the money, the land, and the permits. You still have a problem. You can’t get the gear.
The lead time for a high-voltage power transformer used to be 12 months. Today? It’s 3 to 5 years.
We are trying to rebuild the electrical grid at the exact moment everyone else is trying to electrify cars and heat pumps. The supply chain is fractured.
We are also running out of the raw stuff: Copper. Lithium. Neodymium for the magnets in the cooling fans.
We are dependent on China for the processing of nearly all these critical minerals. As I explained in this “Data Center Guide,” we are realizing that the digital economy is actually a material economy.
If China restricts graphite or gallium exports (which they have started doing), the Cloud stops growing.
The “Trust Me, Bro” Efficiency Pitch
The counter-argument from Silicon Valley is the “Handprint” theory. The pitch goes like this: Yes, training the AI uses a lot of energy, but the AI will make the rest of the world so efficient that it pays for itself.
The IEA models suggest that AI could optimize logistics, manage smart grids, and reduce building energy usage by 10-20%.
And honestly? It’s a compelling argument. If AI can figure out how to drive a truck platoon 5% more efficiently, that saves more carbon than the data center emits.
But this is a long-term bet against a short-term, guaranteed withdrawal of power.
The core efficiency problem is two-fold:
- Training vs. Inference: Training a colossal model takes a massive, months-long burst of power. The resulting AI is then put to work performing inference…answering questions. While inference is far cheaper per interaction than training, its global volume is growing exponentially, turning tiny per-query energy costs into a massive, persistent drain (a rough back-of-the-envelope sketch follows this list).
- The Hardware Treadmill: A high-end CPU might last 5-7 years in a data center. The new AI GPUs are considered obsolete in as little as two years. This brutal, accelerated hardware cycle…the constant replacement of power-hungry H100s with even more power-hungry Blackwells…means that the embodied carbon and raw materials tied up in the silicon are never given a chance to pay back their energy debt over a reasonable lifespan.
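Here is the back-of-the-envelope sketch promised above. Every input, the per-query energy, the daily request volume, and the training-run total, is an assumption chosen purely for illustration, not a measured figure.

```python
# Back-of-the-envelope sketch; all three inputs below are illustrative assumptions.

WH_PER_QUERY = 0.3                # assumed energy per inference request (watt-hours)
QUERIES_PER_DAY = 1_000_000_000   # assumed global daily request volume
TRAINING_RUN_GWH = 50             # assumed one-off energy for a single large training run

daily_inference_mwh = WH_PER_QUERY * QUERIES_PER_DAY / 1e6   # Wh -> MWh
annual_inference_gwh = daily_inference_mwh * 365 / 1000      # MWh -> GWh

print(f"Inference: ~{daily_inference_mwh:,.0f} MWh/day, ~{annual_inference_gwh:,.0f} GWh/year")
print(f"Training run: {TRAINING_RUN_GWH} GWh, spent once")
print(f"At this volume, steady inference passes the training run's total energy in "
      f"~{TRAINING_RUN_GWH * 1000 / daily_inference_mwh:.0f} days")
```

The point is the shape, not the exact figures: a one-off training bill gets dwarfed within months by a per-query cost that never stops accruing.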
We are spending the carbon now in hopes of efficiency later. While the industry is working on “smarter” silicon, efficient ASICs for inference, that transition won’t arrive fast enough to save the grid from the current exponential surge.
What Comes Next?
We are moving from an era of Generation Constraints to Connection Constraints.
The most valuable asset in the world right now isn’t the H100 chip; it’s a signed interconnection agreement with a utility company. The “queue” to get on the grid is the new velvet rope.
This is going to force a few things:
- Off-Grid AI: Tech giants will stop waiting for the utility. They will build their own SMRs (Small Modular Nuclear Reactors) or massive solar farms with battery storage, effectively taking their ball and going home.
- Sovereign Compute: Nations will realize that “compute” is a strategic resource like oil. You will see countries hoarding power to feed their own AI models rather than exporting it.
- The Efficiency Wall: We will hit a point where the cost of power makes brute-force AI training uneconomical, forcing a shift to “smarter” chips (ASICs) and maybe, eventually, neuromorphic or photonic computing.
The invisible hand is dealing cards, but the laws of thermodynamics are calling the bluff. The virtual world requires real power, and for the first time in a long time, we are realizing that “unlimited data” was a temporary illusion.


The transformer thing could in part go back to Ukraine. I’ve not heard anything about Ukraine not getting priority on delivery and a quick Google search makes me think it’s still ongoing. And if anything the demand has only grown as they go through the supply faster.
As I’ve written my neighborhood was switched to all underground lines and had to wait a couple of years for the new transformers to complete the change.
So it’s not about Ukraine but may be about utilities doing elsewhere what was done here. According to this article the shortage is
https://www.renewableenergyworld.com/power-grid/can-the-us-catch-up-to-transformer-demand/
The US and Ukraine use totally different frequencies: the US 60 Hz, Ukraine 50 Hz.
Also, Ukraine has a very old USSR-era grid design with some unusual voltages that make getting equipment difficult, as they are non-standard.
The US is years behind on transformers; I highly doubt the major US companies are going to build totally new designs for Ukraine. I’d say China or India come to mind as suppliers.
I don’t know how much this matters as large transformers are pretty much all custom made.
We’d need somebody who has experience with this sector to comment.
Yes, comments or articles by knowledgeable people would be nice.
I have a hard time believing that the likes of ABB, General Electric, Mitsubishi, Siemens, or Hyundai are not producing that kind of equipment through industrial processes reliant on standardization and large series. After all, hundreds of large transformers must be installed every year in the USA alone — I doubt this can be achieved in a manufacture-type of organization.
Don’t have much experience (unless almost electrocuting myself as a kid while “studying” a small transformer counts), but most of the world follows IEC standards for transformers, while North America follows IEEE standards. The standards cover voltage, current, power, frequency and temperature regimes, how to test those, and so on.
As far as I understand, former Soviet countries and China tend to use 330 kV grid substation transformers, while most European countries use 380 kV transformers. There’s no technical reason any company couldn’t build transformers for Ukraine, but every year tens (if not hundreds) of thousands are needed to replace existing old ones in the grids that work and are not bombed to smithereens – it’s a capacity issue.
But I may be quite wrong here.
20 years ago (this week!) when I was defending my MS in disaster response engineering, the large transformer wait time was 1.5 to 2.5 years and the two countries that manufactured the things for use in the US were China and Poland. It’s been 2 decades so it’s clearly changed.
At the time, my focus was on the elimination of warehousing as a significant vulnerability. Utilities used to stock up on these things. Starting in 2000-ish, spare capital equipment such as large transformers, large valves, etc. started to be viewed as liabilities. Even if the (bulky) gear was seen as valuable, the warehouse property was being sold off to developers, which made selling the spare gear necessary simply for lack of space to store it.
According to a Wood Mackenzie study cited in a 2024 National Infrastructure Advisory Council report, lead times for large transformers ran from 80 to 120 weeks.
We wrote about this a long time ago. Ukraine is on the old USSR power standard as Russia is now. Western transformers will not work.
Any chance you could find this and repost it, Yves? I remember this piece vaguely, and it definitely increased the sophistication of my thinking on the Ukraine power grid.
This article is the equivalent of throwing mud at a wall to see what sticks. Adding in concerns about data centres and poor communities is the clearest sign of this (has the author ever actually walked around data centres and peaking plants, as opposed to, say, distribution warehouses or foundries or widget manufacturing plants?).
Yes, I get it, people hate data centres, and they hate the industry, even while they love their internet and mobile phones. People particularly hate AI, or at least the visible manifestation of AI. I share the distaste.
But in terms of the energy and pollution problems facing the planet today, data centres are a regional, minor, and eminently manageable problem (as the links in the article demonstrate clearly if you go through them in detail).
Some data analysis from the IEA:
Data centre use (in its broadest definition, including AI) accounts for around 1.5% of world electricity consumption, and is anticipated to double by 2030 (assuming, of course, AI growth doesn’t collapse). This is equivalent to Japan’s total electricity demand. In terms of global energy usage, it’s a rounding error and is likely to continue to be so. It is, however, a major issue in specific regional markets, particularly in the US and China.
In terms of overall growth in electricity demand, it’s not even close to being the biggest driver – it amounts to around 10% of predicted demand growth, less than industrial uses, air conditioning, and EVs (although of course EVs will, in the long term, lead to a reduction specifically in fossil fuel use).
There is already plenty of evidence that the boom has ended – in China there are vast amounts of unused data facilities, mostly in the Beijing area. And we all know about the issues with Nvidia, Meta, etc.
In reality, the growth in pollution – both CO2 and from other sources – is overwhelmingly driven by agriculture (our hunger for meat and dairy in particular), industry (in particular the vast and still growing plastic manufacturing sector), air conditioning, oversized houses, air travel, and our love of cars, whether ICE or EV. The data shows clearly that these are the major drivers of planetary breakdown – data centres are a very small subset of these.
Yes, but there is still a principle at stake, as politically powerful tech bros declare their latest product to be more important than slowing AGW. And the perpetual assault on that save-the-planet principle is the reason for the meat, the huge carbon-spewing vehicles, etc. It’s the same with those private jets, which may contribute little carbon overall but send the message that only the little people need to care about global warming, in deed if not in rhetoric.
The little people respond by buying a Suburban or a Ford king cab 150 pickup. Sociopathy is contagious.
The flip side of this is that the bad-faith actors in this (mostly PR firms hired to muddy the waters, along with their paid shills) love to play the ‘yeah, but you can’t object to this, you drove here/flew on your vacation/eat beef’ line to distort and distract from the arguments. The core message is always that ‘it’s someone else’s fault, and you have no power over them, so no need to worry your little head’.
The causes of climate change and the sources of emissions are extremely well known and the data is out there for everyone to see. You don’t need a science degree to understand the issues and the urgent things that need to be done. But what you do need is clear thinking and a relentless focus on the big picture and how the pieces are systemically linked. Articles like this do not help.
One rule for me, one rule for thee, gets the peasants in a pitchfork mood, and moreover, the lumpen proletariat are resistant to following the rule being hypocritically applied to them – they may choose to believe it’s all a lie to impoverish them. Next thing you know, you get a US administration that actively accelerates global warming (at home and abroad).
The solution to global warming is political, and thus, I would argue, the big picture is the big political picture.
Yes, but it has to be a political picture that ignores the us vs them team sports, and focuses on following the money.
From the link you provided:
So it would seem AI is a pretty big driver of electricity demand.
When one looks at an image of the H100 and Blackwell, they appear not that far off from the T-800 or Neural Net CPU in the Terminators.
Coincidence?
So maybe human brainpower will turn out to be the best bet. Only requires 2000 calories a day.
And honestly? It’s a compelling argument. If AI can figure out how to drive a truck platoon 5% more efficiently, that saves more carbon than the data center emits.
Just what I was thinking DWhite. How many smart people working together do we need to figure out how to drive a truck platoon 5% more efficiently?
Since 2000, the power generation of China and the United States has shown distinctly different growth trajectories: China’s power generation soared from about 1.37 trillion kilowatt-hours to 10.09 trillion kilowatt-hours in 2024, an increase of about 6.36 times with a compound annual growth rate of approximately 8.5%; the United States’ power generation rose from about 3.8 trillion kilowatt-hours to 4.3 trillion kilowatt-hours in 2024, an increase of about 0.13 times with a compound annual growth rate of only about 0.3%.
Why has the US, which once created railway miracles and highway miracles, been so incompetent in infrastructure construction in the 21st century?
It’s worse: if you consider population growth, the US is falling behind dramatically.
In 2008, an American athlete wore a mask while participating in the Beijing Olympic Games. This act sparked intense anger in China. Later, the United States Olympic Committee and the athlete himself issued a public apology.
Nearly 20 years on, Beijing’s air quality has improved, and China’s energy output has surpassed that of the United States by a wide margin. What standards of air and drinking water exactly do environmentalists in the Western world want?