U.S. Utilities Put in a No-Win Position by Phantom Data Centers

Yves here. The possibility of utility excess capacity due to AI promoters placing redundant orders for data center power with multiple utilities poses a real risk to consumers. That’s before getting to the odds that the AI bubble will pop, so that not only will duplicative orders go poof but even base demand forecasts could crash. If utilities underbuild, users will face brownouts and outages as a way to ration capacity. If they overbuild, customer bills will be excessive relative to actual need.

There is an additional possible knock-on effect. Remember Y2K? Many tech mavens contend the risk was real and the extensive action to get in front of it was warranted. One part of the response was buying new hardware, as in upgrading as part of the remediation process. This happened on such a scale as to lower equipment demand for the next few years, increasing the severity of the dot-bomb bust. Perhaps informed readers can pipe up as to whether a whipsaw in utility capital spending could have a similar impact on the broader economy.

By Tsvetana Paraskova, a writer for Oilprice.com with over a decade of experience writing for news outlets such as iNVEZZ and SeeNews. Originally published at OilPrice

  • US electric utilities are struggling to accurately forecast future power demand due to numerous speculative data center interconnection requests that may not materialize.
  • The practice of AI-focused tech groups filing power requests with multiple utilities for a single potential data center project creates “phantom” demand, making accurate capacity planning difficult.
  • Overestimating demand could lead to utilities overbuilding new capacity, potentially at the expense of American ratepayers who are already experiencing rising electricity prices.

America’s electric utilities are preparing for the surge in electricity demand coming with the data centers powering AI. Utilities have increased investments as they see unprecedented demand growth in the coming years after two decades of flat U.S. electricity consumption.

But they are grappling with increased levels of uncertainty, because not all of the interconnection requests they receive will materialize into actual data centers requiring electricity supply.

Phantom Data Centers

Hyperscalers and AI-focused tech groups are sounding out utilities in the areas they are considering for future data centers, filing interconnection requests for a single data center with several utilities in several areas.

The huge number of requests does not paint an accurate—or full—picture of the power needs of the technology giants because companies tend to inquire about data center power supply with at least three utilities in different areas.

Of these three requests for new power capacity, only one will become a project for which agreements will be signed. Analysts and utilities cannot reliably say how much new capacity is needed when a single data center project submits electricity supply requests to different utilities in different states.

After one site is picked, the data centers at the other proposed locations – and their interconnections – will never be built. These are the “phantom” data centers, which will never see the light of day but which are currently haunting the projections and plans of U.S. utilities.
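
To make the distortion concrete, here is a minimal back-of-the-envelope sketch. The project names, sizes, and the assumption that every project files with two or three utilities are purely illustrative, not data from any actual interconnection queue:

```python
# Illustrative only: how summing an interconnection queue overstates the load
# that will ever materialize when each project files with several utilities.

# Hypothetical queue entries in GW, grouped by the single project behind them.
queue_requests_gw = {
    "project_a": [1.0, 1.0, 1.0],  # same 1 GW campus pitched to three utilities
    "project_b": [0.5, 0.5],       # 0.5 GW campus pitched to two utilities
    "project_c": [2.0, 2.0, 2.0],  # 2 GW campus pitched to three utilities
}

apparent_demand = sum(sum(reqs) for reqs in queue_requests_gw.values())
likely_build = sum(reqs[0] for reqs in queue_requests_gw.values())  # one site wins per project

print(f"Queue total:  {apparent_demand:.1f} GW")   # 10.5 GW
print(f"Likely build: {likely_build:.1f} GW")      # 3.5 GW
print(f"Phantom load: {apparent_demand - likely_build:.1f} GW")  # 7.0 GW
```

Every utility in this toy example sees a “real” request, yet two-thirds of the aggregate queue is phantom load that no one will ever serve.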

So, electric utilities face a high degree of uncertainty over future revenues as the boom of AI data centers generates widely varying forecasts of peak demand in many areas across the country.

If utilities overestimate their future demand, they risk overbuilding new capacity that will not be matched by consumption. A possible overbuild would come at the expense of American ratepayers, who have already seen electricity prices rising at a faster pace than U.S. inflation over the past three years.

Puzzled Utilities  

The phantom data centers and the speculative projects are making projections difficult for utilities.

For example, Sempra’s Texas-based utility Oncor said its active large commercial and industrial (LC&I) interconnection queue as of June 30, 2025, was about 38% higher than at the same time last year. As of June 30, Oncor’s active LC&I interconnection queue had 552 requests, which includes approximately 186 gigawatts (GW) from data centers and over 19 GW of load from diverse industrial sectors.

American Electric Power Company, which serves over 5 million customers in 11 states, said it now has 24 GW of firm customer commitments for incremental load by the end of the decade, up from 21 GW previously, thanks to data center growth, reshoring, and manufacturing.

“Beyond the 24 gigawatts, customers are also actively seeking to connect approximately 190 gigawatts of additional load to our system. This is five times our current system size of 37 gigawatts,” AEP president and CEO William J. Fehrman said on the Q2 earnings call.

U.S. power utilities are investing a record amount of money into transmission and grid connection. But current forecasts of AI-driven power demand vary so much that there is a massive margin of error, analysts and utility officials told Reuters Events in June.

The U.S. market faces “a moment of peak uncertainty,” according to Rebecca Carroll, Senior Director of Market Analytics at energy advisor Trio.

The latest report from the U.S. Department of Energy (DOE) puts data center consumption at anywhere between 6.7% and 12% of total U.S. electricity by 2028.

“The report estimates that data center load growth has tripled over the past decade and is projected to double or triple by 2028,” DOE said.

However, there is a huge difference between a doubling and a tripling of data center load.
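
To put that spread in rough numbers, here is a back-of-the-envelope calculation; the 4,000 TWh figure for total U.S. consumption is a round-number assumption used only for illustration, not a figure from the DOE report:

```python
# Rough arithmetic on the DOE range quoted above (6.7% to 12% of total
# U.S. electricity by 2028). Total consumption is an assumed round number.
total_us_twh = 4000  # assumed annual U.S. electricity consumption, TWh

low_case = 0.067 * total_us_twh    # DOE low end
high_case = 0.120 * total_us_twh   # DOE high end

print(f"Low case:  {low_case:.0f} TWh")              # ~268 TWh
print(f"High case: {high_case:.0f} TWh")             # ~480 TWh
print(f"Spread:    {high_case - low_case:.0f} TWh")  # ~212 TWh of 'maybe' demand
```

Under those assumptions, the gap between the low and high cases is roughly the round-the-clock annual output of two dozen gigawatt-scale power plants, which is a lot of capacity to either build speculatively or leave unplanned.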

This has prompted utilities to require firm demand estimates from data centers for future connections and power purchase agreements (PPAs), to reduce the risk of getting demand and/or prices wrong.

AI Drives U.S. Power Demand Growth

“We know not all of that is going to come online, but even a fraction of that is significant,” AEP’s chief financial officer, Trevor Mihalik, said on the earnings call.

U.S. power utilities have announced billions of dollars in capital plans for the next few years and are getting a lot of requests from commercial users, most notably Big Tech, for new power capacity in many areas next to planned data centers.

Onshoring of manufacturing activity and AI-related data centers are driving an increase in U.S. electricity consumption, Goldman Sachs said in a report earlier this year.

U.S. electrical power demand is expected to rise by 2.4% each year through 2030, with AI-related demand accounting for about two-thirds of the incremental power demand in the country, the investment bank said.
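
Compounded over the rest of the decade, the quoted rate works out to roughly 15% cumulative growth. A quick sketch, noting that the six-year horizon is an assumption since the report's base year is not given here:

```python
# Compounding the Goldman Sachs figures quoted above. The six-year horizon
# (e.g. 2024 through 2030) is an assumption; no base year is given here.
annual_growth = 0.024          # 2.4% per year
years = 6
ai_share_of_increment = 2 / 3  # AI-related share of incremental demand

cumulative = (1 + annual_growth) ** years - 1
print(f"Cumulative demand growth: {cumulative:.1%}")                          # ~15.3%
print(f"AI-related portion:       {cumulative * ai_share_of_increment:.1%}")  # ~10.2%
```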

The world’s biggest economy will need all energy sources to ensure power demand is met. Natural gas is the biggest near-term winner of AI advancements, but renewables will also play a key role in powering the data centers of next-generation computing, analysts say.

Print Friendly, PDF & Email

20 comments

  1. Ignacio

    This looks like a basket case of market-driven (neoliberal) development without central planning, which probably won’t result in reduced costs and increased reliability but quite possibly the contrary. Following the link on “record expenditures”, in 2025 about half of these are going to increase utility capacity and the other half to transmission and distribution networks. As we have learnt here at NC, there are bottlenecks (transformers are apparently in short supply), and one wonders whether part of the noticeable increase in grid investment is due to price increases on top of the real physical infrastructure, which in turn is being delayed by such bottlenecks. Besides, such projections of power demand are fuelling investment in NG-based generation, affecting not only electricity but also the markets for heating demand and international energy markets.

    Then there is the discussion about whether the AI market is in a bubble phase or in line with “market fundamentals”, but the fastest-growing segment, “generative AI”, looks, IMO, pretty much like a bubble, with unreasonable expectations of real utility.

    In the past, investment sprees could be associated with overall societal improvements, but the free-market approach seems to be showing its limits, increasingly resulting in chaos and doubtful investments whose societal results might turn negative in the worst cases. I have just received Galbraith and Chen’s book on entropy economics, which among other things states that boundaries, plans and regulations (centralized, I guess) are essential. I suspect that utilities will turn out to be a basket case of the failure of “classic” economics.

    It will be interesting to follow the news in this issue.

    1. Revenant

      It’s OK, in the UK the phantom demand will be met by phantom supply!

      There is a similar but mirror-image problem in renewable generation in the UK. Nobody knows which schemes will actually get built, because developers have applied for connections for solar parks and wind farms which they may never be able to build – and new viable projects cannot get connections (there is a moratorium until the deliverable pipeline is understood) because of these blocking speculators….

  2. Plutoniumkun

    There is always an issue with demand for electricity being more dynamic and unpredictable than supply, which is why well functioning grids always have a big surplus of inbuilt energy and grid capacity. Much of the latter is – or should be – driven by existing regulations – outages don’t cost energy producers much, so in a ‘free market’ they will always underbuild capacity.

    When you look at IEA projections it is clear that AI/data demand is primarily a regional problem for infrastructure, not a national/global one (although arguably China’s enormous build-out of electricity capacity is based at least in part on assumptions about AI demand).

    The obvious problem for energy planners is that while energy use for data/AI is soaring, nobody quite knows where the peak is – plenty of observers think we are already at the bubble peak, but nobody can be sure of this. So there seems to be an emphasis on ‘quick fix’ approaches, such as putting very high peaking load capacity into local grids (i.e. gas/diesel peaking plants). These are designed to provide quick surges in supply in the event of demand peaks or a drop in supply.

    I don’t think a surge in electricity supply investment represents much of a wasteful boom. Upgrading local or national grids takes a long time – investment cycles are in decades rather than years. If the whole AI/data demand turns out to be a bubble, then it will leave behind some surplus electricity infrastructure, but it’s unlikely to represent a huge waste. Most existing grids in ‘mature’ economies need rapid upgrading anyway, and in developing countries any investment in energy, even poorly planned, is rarely a complete waste. The most wasteful element would be local peaking plants, but these don’t represent a huge financial investment on a global scale. It would be bad news for the fossil gas industry, though, if they are dependent on this demand for their projections.

    In terms of threat to the overall economy, I tend towards being quite sanguine about data centres/AI. There seems little doubt that there is an enormous bubble in the physical provision of the infrastructure, but for the most part I think spare computing capacity will always get used (even if it is just for stablecoins or blockchains or cat videos), and ‘spare’ electricity infrastructural investment at worst represents an increase in resilience – we are not going to see parades of rusting unused pylons littering the landscape. Our economies need electrification anyway – it is impossible to achieve CO2 targets without massive electrification of all sectors. So IMO, as wasteful booms go, I think this is closer to a railway boom than a tulip one. It will cost investors if and when it pops, but the long-term impact will be broadly neutral and maybe even positive*.

    *assuming of course, it doesn’t trigger a global financial apocalypse and subsequent nuclear war.

    1. upstater

      I’m not sure it is correct that infrastructure built to supply data centers that either never get built or become defunct is easily redeployed to serve different loads or improve grid resilience and reliability.

      Equipment used to supply data centers: large power transformers (especially), switchgear and breakers are sometimes bespoke designs and not generic off-the-shelf products. Also, data centers are served from the grid with multiple EHV feeds, but they are basically radial, concentrated loads that could only be repurposed as industrial sites. They are not creating redundancy within the grid itself unless long transmission lines and switchyards are built away from the sites. Much of the supply for large data centers comes from nearby generation and also doesn’t contribute to resilience. Remember, long lines and switchyards go into the rate base, and that cost is socialized. Hence there is growing pushback from some regulators.

      1. Plutoniumkun

        I can’t speak for the US experience, but in Europe the overwhelming majority of these centres are being built in areas with relatively strong existing infrastructure – the supporting investment is mostly into strengthening the grids for capacity and redundancy. It makes little sense to do otherwise unless there is a specific power source they are seeking to plug into. They are not built in an infrastructural or policy vacuum. The main ‘excess’ investment that I’ve seen is peaking plants as a quick and dirty alternative to circuit upgrades. I suspect most of those will hardly be used.

        The history of computing shows that when someone provides processing and storage capacity, it gets used, in the same way that when energy is generated, someone almost always finds a use for it. It may not be a good use for society, but it’s almost always used for something. First-generation data centres have already gone through a period of repurposing – Amazon have significantly changed the use of their in-house capacity for other commercial and, no doubt, nefarious purposes.

    2. Ignacio

      I, on the contrary, believe that if there is an AI bubble, and it explodes, a lot of the investment, consisting of highly specialized buildings inappropriate for other uses plus the associated transmission/distribution networks, will turn out to be bridges to nowhere, and in the middle of nowhere, not helping very much the infrastructure built for other purposes. I may be wrong, but from the pictures of AI centres I have seen, that is how it looks to me.

      Upstater’s comment is better pointed than mine. I hadn’t read it before posting.

  3. Reader Keith

    “Huang admitted that we are at the 50 percent phase. The difference between what he says and what we say is that he is acting like 50 percent growth can go on between now and 2030.”

    https://www.nextplatform.com/2025/08/27/nvidia-sets-the-datacenter-growth-bar-very-high-as-compute-sales-dip/

    If these trends continue…

    https://www.youtube.com/watch?v=e6LOWKVq5sQ

    Like the dot-com bust, there will be a glut not only in GPUs and “phantom datacenters” but also in very expensive networking gear designed for datacenter GPU-to-GPU traffic that isn’t necessarily useful for run-of-the-mill datacenter usage.

    1. vao

      There is probably an issue regarding all that investment in equipment:

      1) The investment in power production and transmission capacity should not be an issue. The electricity networks seriously need an overhaul anyway, and investments in that sector depreciate over decades.

      2) Similarly, the massive investments in data networks (optical fibre & co) during the .com bubble were not really a problem. Those investments depreciate over a fairly long period (in IT terms), so there was leeway to find uses for them.

      3) On the other hand, all those CPUs and GPUs needed for AI — as well as some other components such as hard disks — depreciate really fast, comparatively. If the AI hype comes to an abrupt end and those data centres are underused or not used at all for just a few years, this may well mean a huge irrecoverable waste in terms of IT capex.

  4. Cervantes

    > This has prompted utilities to demand clear demand estimates from data centers for future connections and power purchase agreements (PPAs), to reduce the risk of getting demand and/or prices wrong.

    This is a little uninformed, depending on the jurisdiction. Essentially, there are at least three very different regulatory constructs for large-load interconnection in the United States, and more depending on how you want to categorize “vertically integrated IOU” states, whether they’re in the Southeast, the West, or the SPP RTO. The only specific facts are from AEP, which operates across all three jurisdictional types and whose quotes are only at a very high level, and Oncor, which is in ERCOT and so only handles transmission and grid issues. The quotes don’t address generation specifically.

    Most of the dollar value at issue would be generation capacity. New power plants are expensive, especially when everybody wants to build them all at once. For a data center with a 1 GW or 500 MW peak load, the generation to support that data center is going to be in the $500M-$1B range for capex, while the substation and other grid stuff would normally be measured under $100M. If a new data center needed a transmission line, then it would cost more, but developers and utilities are specifically aiming to site these in a way that minimizes new transmission investment since that adds delay.

    Utilities know all this, so they have some standard practices to address a few of these things. For the grid investment side, they run the expected revenues through a formula called contribution in aid of construction or “CIAC” (pronounced “kayak”) and, if the revenues won’t be enough to pay back the upfront investment fast enough, the utility will require a payment.
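
    A minimal sketch of the payback test described above, purely to illustrate the mechanics; the discount rate, payback window, and dollar figures are assumptions, not any utility’s actual CIAC tariff formula:

    ```python
    # Illustrative CIAC-style shortfall test: if discounted expected revenues
    # over an allowed payback window don't cover the upfront grid investment,
    # the customer owes the difference up front. All parameters are assumptions.

    def ciac_payment(upfront_capex, annual_revenue, discount_rate=0.07, payback_years=10):
        pv_revenue = sum(
            annual_revenue / (1 + discount_rate) ** year
            for year in range(1, payback_years + 1)
        )
        return max(0.0, upfront_capex - pv_revenue)

    # Example: $80M of substation/feeder work against $9M/yr of expected revenue.
    print(f"Upfront CIAC payment due: ${ciac_payment(80e6, 9e6) / 1e6:.1f}M")  # ~$16.8M
    ```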

    “It’s the generation, stupid.” The need to add generation at the scale envisioned by new data centers is not adequately addressed by CIAC. Plus, even if revenues were enough to cover upfront grid investment or to justify generation investment, the payback period might be in the 10-15 year range, where the utilities see some risk as to the longevity of the counterparty. That’s one of the issues that creates fear and risk that investments will be made to serve a data center, and then others will pay for it when the data center doesn’t stick around.

    So, utilities are adding a number of practices for large loads, including minimum bills, minimum-length contracts, and so on. And they’re generally not spending any money on site development or investment until they get firm commitments from large loads. For example, AEP reached a plan with its Indiana jurisdiction regulators in February, which is curiously unmentioned in the oilprice.com article.

    This stuff has been the talk on the utility regulation conference circuit for a while now. It’s definitely not a “no win” mentality on team utility right now: their eyeballs are gyrating with dollar signs. To be clear, I’m not saying there are no risks or anything like that. It’s just that the oilprice.com people are piecing things together at a very high level from a couple financial reports and have no idea what’s actually going on because they don’t follow regulatory developments or the trade press.

  5. Cervantes

    > Perhaps informed readers can pipe up as to whether a whipsaw in utility capital spending could have a similar broader economy impact.

    The effect, if any, will be dwarfed by any whipsaw in data center and other large load development. The ratios of capex involved would be something like $1B of utility capex, both generation and grid investment, for $5-10B of data center capex.

  6. The Rev Kev

    Maybe those US utilities should get together to try to force those AI promoters to wake up to themselves. Tell them that energy will be priced at a flat rate, but the more you use, the higher the charges. You use sky-high amounts of energy, then expect sky-high bills. Surprise. It would be a good measure to get people to conserve energy too. As it stands, those AI promoters want to hang the costs of energy, water and infrastructure on anybody else but them. There were articles on NC years ago talking about how run down parts of the energy infrastructure in the US are, and AI coming along to ramp up demand is doing nobody a favour. My crystal ball shows black-outs and brown-outs for the US in the coming years – and big profits to AI investors.

    1. tegnost

      As it stands, those AI promoters want to hang the costs of energy, water and infrastructure on anybody else but them.

      Seems like the perfect opaque situation for profit farming by the sempras and pg+e’s, and the tech bros never think they should have to pay for their efforts to create their perfect world

    2. John Steinbach

      The difficulty with making the AI industry, rather than ratepayers, pay for its electricity requirements is the result of Depression-era regulations designed to promote near-universal access to electricity. Recognizing that electrifying rural America made no economic sense, the REA required all ratepayers (then mostly urban) to share the costs of extending electrification to the entire nation.

      Today the AI industry has turned this regulation on its head & is requiring ratepayers to cover the lion’s share of CAPEX costs associated with the AI boom.

      Regarding the article, here in Prince William County, the courts have thrown out approval of the Digital Gateway, planned to be the world’s largest DC complex. Local utilities including Dominion Power & NOVA Electric Cooperative are left in the dark about the future of the project.

  7. MicaT

    I’ve listened to many podcasts about this, and the idea that utilities are going to spend billions on infrastructure just because someone might build an AI data center isn’t real. Planning for many options, sure.

    Lots of variables, especially now: actual NG limits due to pipeline capacity, steel prices up for towers and pipelines, gas turbines years behind.
    There are ways out of many of these if foreign suppliers are allowed to be used. Biden stopped that; who knows what Trump will do.

    The other part is that the electric grid needs more power anyway. And as some people in the industry say, if you have enough surplus then prices go down, as suppliers are looking for buyers – versus now, where use is really high relative to capacity, so prices are high.
    See Texas with negative power prices at night.

    And while the AI/data build-out seems to be concentrated on the East Coast, it might be the Southwest, mountain states and Texas where most get built, because of solar and wind.
    Those being the fastest to build and least expensive to run.
    The water use is mitigated by using actual cooling rather than pass-through water usage. More power, no water.

    And finally, will they actually build them in the US, where energy is expensive and building etc. is slow and expensive, versus China?

  8. Adam1

    During the dot-com years there were two sectors money was shoveled into… the infamous negative-profit online world and the technology world that supplied the processing and connectivity. I worked on the network side during this period, first for a consulting firm and then for one of the larger CLECs.

    I can’t say I ever saw the original study, but I was told by colleagues that a study from the early 1990’s said that world demand for data bandwidth was tripling (or thereabouts) every quarter. While that may have actually been true at the time it was created, there were still business cases being written in the late 1990’s that were using that figure for their growth forecasts. Fiber was being laid at such a high rate back then that most of it was known as dark fiber – the cables had no known customer use at the time they were buried. I would suspect that you’ll find there were dozens of start-up fiber companies that all went bankrupt in the early 2000’s because of all the debt they had tied to dark fiber. I’m sure it took years after the fact to light up much of that cabling, and some of it may still be dark today.

    1. Vodkatom

      Worldcom! Your comment brings me back to those heady days when I first became aware that bubbles are real, and markets can be dumb. And there still seemed to be consequences for bad behavior. But I fear those days are done.

      1. Adam1

        WorldCom is known because it was so big, relatively. There were lots of smaller startups that gobbled up cash like there was no tomorrow, even though most everyone with an analytical brain could tell them that by 1998 or 2000 the math didn’t even make crazy-person sense – it was totally in another universe.

        I remember being in my late 20’s in CNY when a company that came from nowhere, Telergy Communications, was hiring people faster than it could open office space. I knew someone who took a job with them and had work space in what used to be a café area of the building Telergy was in. She and 100+ other people worked in that “café” area while the company (on paper) worked to secure more space.

        Telergy’s concept was smart (directionally)… why spend the money and time to lay new fiber on new rights of way? They partnered with all of the NYS utilities to add cabling along their existing rights of way. The problem was that, between Telergy and every other company burying cable at the time, it was not possible for ANYONE to make money.

        The business cases claimed demand FAR exceeded what the existing infrastructure could serve, yet even the MOST cost-effective NEW provider couldn’t make enough money to cover its new capital expenses!

  9. ilsm

    Can DHS/DoD demand fund ROI for huge AI and infrastructure spending?

    Facial and object recognition require clouds of data and analytics.

    Hundreds of billions a year!

    Taxpayer on hook?

