How Much Energy Does ChatGPT’s Newest Model Consume?

Yves here. This is seriously not good. I don’t understand how something as destructive as ChatGPT is being aggressively embraced. And it is not only imposing planetary-destruction costs but also more mundane increased power costs on consumers generally.

By Haley Zaremba, a writer and journalist based in Mexico City. Originally published at OilPrice

  • The energy consumption of the newest version of ChatGPT is significantly higher than previous models, with estimates suggesting it could be up to 20 times more energy-intensive than the first version.
  • There is a severe lack of transparency regarding the energy use and environmental impact of AI models, as there are no mandates forcing AI companies to disclose this information.
  • The increasing energy demands of AI are contributing to rising electricity costs for consumers and raising concerns about the broader environmental impact of the tech industry.

How much energy does the newest version of ChatGPT consume? No one knows for sure, but one thing is certain – it’s a whole lot. OpenAI, the company behind ChatGPT, hasn’t released any official figures for the large language model’s energy footprint, but academics are working to quantify the energy use per query – and it’s considerably higher than for previous models.

There are no mandates forcing AI companies to disclose their energy use or environmental impact, so most do not offer up those kinds of statistics publicly. As of May of this year, 84 percent of all large language model traffic was conducted on AI models with zero environmental disclosures.

“It blows my mind that you can buy a car and know how many miles per gallon it consumes, yet we use all these AI tools every day and we have absolutely no efficiency metrics, emissions factors, nothing,” says Sasha Luccioni, climate lead at an AI company called Hugging Face. “It’s not mandated, it’s not regulatory. Given where we are with the climate crisis, it should be top of the agenda for regulators everywhere,” she continued.

Sam Altman, the Chief Executive Officer of OpenAI, has thrown some figures into the public sphere – saying that ChatGPT consumes 0.34 watt-hours of energy and 0.000085 gallons of water per query – but he has left out key details, such as which model these numbers refer to, and has offered no backup or corroboration for his statements.

Experts from outside the OpenAI fold have estimated that ChatGPT-5 may use as much as 20 times more energy than the first version of ChatGPT, and at the very least uses several times more. “A more complex model like GPT-5 consumes more power both during training and during inference. It’s also targeted at long thinking … I can safely say that it’s going to consume a lot more power than GPT-4,” Rakesh Kumar, a professor at the University of Illinois, recently told The Guardian. Kumar’s current work focuses on AI’s energy consumption.

While a query to ChatGPT in 2023 would have consumed about 2 watt-hours, researchers at the University of Rhode Island’s AI lab found that ChatGPT-5 can use up to 40 watt-hours of electricity to generate a medium-length response (around 1,000 tokens). On average, they estimate that the model uses slightly over 18 watt-hours for such a response. This places ChatGPT-5 at a higher energy consumption rate than all but two of the AI models they track: OpenAI’s o3 reasoning model and DeepSeek’s R1.
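Putting the cited figures side by side makes the scale of the jump concrete. This is only back-of-envelope arithmetic on the numbers quoted above (2 Wh for a 2023-era query; ~18 Wh average and 40 Wh peak for a medium-length GPT-5 response):

```python
# Back-of-envelope comparison of the per-query energy figures cited above.
GPT_2023_WH = 2.0   # Wh per query, 2023-era ChatGPT
GPT5_AVG_WH = 18.0  # Wh, average medium-length (~1,000-token) GPT-5 response
GPT5_MAX_WH = 40.0  # Wh, upper estimate for such a response

avg_ratio = GPT5_AVG_WH / GPT_2023_WH  # 9x on average
max_ratio = GPT5_MAX_WH / GPT_2023_WH  # 20x at the high end

print(f"GPT-5 vs 2023 ChatGPT: {avg_ratio:.0f}x average, {max_ratio:.0f}x worst case")
```

Which is consistent with the “at the very least several times more, up to 20 times more” range the outside experts give.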

Calculating these estimated energy consumption rates was no easy feat, considering the severe lack of transparency in the sector, in spite of increasing scrutiny. “It’s more critical than ever to address AI’s true environmental cost,” University of Rhode Island professor Marwan Abdelatti told The Guardian. “We call on OpenAI and other developers to use this moment to commit to full transparency by publicly disclosing GPT-5’s environmental impact.”

While tech companies consume more and more energy each year to power their AI ambitions, common consumers are suffering the consequences. It’s consumers who are footing the bill for skyrocketing energy usage. The New York Times warns that “electricity rates for individuals and small businesses could rise sharply as Amazon, Google, Microsoft and other technology companies build data centers and expand into the energy business.” Moreover, Silicon Valley’s backtracking on climate pledges will directly impact global communities, whether or not they ever use AI.

“We are witnessing a massive transfer of wealth from residential utility customers to large corporations—data centers and large utilities and their corporate parents, which profit from building additional energy infrastructure,” Maryland People’s Counsel David Lapp recently told Business Insider.

“Utility regulation is failing to protect residential customers, contributing to an energy affordability crisis.”


38 comments

  1. Ignacio

    A lot of entropy increase for very little utility, Galbraith might say. And it gets worse with every new version.

        1. catchymango

          In light of the recent forest fires in Canada, this week I reread that excellent, eye-opening piece by lambert from a couple of years ago. I was struck by how the debates I was seeing up here, two years later, even in a big logging province like New Brunswick, still oscillate between people who blame crown lands and those who blame selfish individuals/firebugs.

          The fairly unpopular Irving oligarch family, which controls the province’s logging industry and has many private forest tracts, isn’t being subject to any restrictions or regulations, even while controversy has erupted over the government cracking down on recreational users.

          In the absence of any organized political force that’s publicly making the connection between energy bills, class warfare and AI, I feel like this too might largely fall under the radar.

          No major party in Canada—especially not the NDP—will attack the existence of an industry, sadly. They will just blame rising electricity bills on underinvestment by previous governments, or frame growing electricity demand from EVs as a necessary cost of fighting climate change.

  2. Matthew T Hoare

    Don’t worry, the “AI” bubble is just about to burst and it might even take the entire stock market with it.

    This is a good thing because financialisation needs to be stopped if we are to bring our resource usage down to within planetary limits.

    I’m looking forward to watching the stock market traders panic, it will be hilarious 😂

      1. John Wright

        Perhaps the little people are hurt the most, but that may be because of the “we must save the financial industry” response by the powers that be as losses are socialized.

        The exposure of the “little people” to the stock market is rather limited.

        See https://www.pewresearch.org/short-reads/2024/03/06/a-booming-us-stock-market-doesnt-benefit-all-racial-and-ethnic-groups-equally/

        “A recent study from the Federal Reserve Bank of New York found that 35% of White Americans’ individual financial wealth is invested in stocks and mutual funds, versus 8% and 14% for Black and Hispanic Americans’ financial wealth, respectively.”

        see https://www.fool.com/research/how-many-americans-own-stock/

        “The bottom 50% of U.S. adults hold only 1% of stocks, worth $490 billion.”

        If one does the math, there are about 129 million adults in the bottom half of the population:

        $490e9 / 129e6 ≈ $3,800 per adult in the lower half.

        If the stock market fell by 50%, they would lose about $1,900 each.
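The arithmetic in the comment checks out; as a quick sketch using the same Motley Fool figures quoted above:

```python
# Checking the per-adult exposure arithmetic for the bottom half of U.S. adults.
bottom_half_stock = 490e9   # dollars of stock held by the bottom 50%
bottom_half_adults = 129e6  # roughly half of U.S. adults

per_adult = bottom_half_stock / bottom_half_adults  # ~$3,800 per adult
loss_on_50pct_crash = per_adult * 0.5               # ~$1,900 per adult

print(f"${per_adult:,.0f} per adult; ~${loss_on_50pct_crash:,.0f} lost in a 50% crash")
```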

        The financial industry in the USA has convinced many that it is a critical industry that must be preserved in its current form whenever there is a financial crisis.

        If, as suggested below, the economic footprint of the USA financial industry is two to three times the size it SHOULD be for society’s overall benefit, a collapse and restructuring of the US financial industry could be a good thing.

        https://www.newyorker.com/magazine/2010/11/29/what-good-is-wall-street

        1. david

          I’m no expert in these things, but I did think during 2008 that they should have let the whole thing collapse, as all their actions did was make it worse for when it next goes.

          However, it isn’t workers’ exposure to the stock market that is where the pain comes from for them. If the stock market collapses, then many companies will fold or at least have mass layoffs. That is where workers will be harmed.

          1. John Wright

            A company that is not issuing new stock could have its stock go to zero and it would not affect operations.

            Already issued stock is not something that a company benefits from, similar to a car sold by a car dealer years ago.

            The company has already used the stock sale proceeds, perhaps many, many years ago.

            And new shareholders, who purchase already issued stock, are not providing any new funds to the company.

            A company can fold if the market for their products goes away, not because the market for their stock goes away.

            One could argue a profitable company, whose stock falls to zero, should “go private” and buy their stock back to avoid being taken over and, possibly, to also avoid paying dividends.

            A falling stock market may indicate general future product demand is low, but a vastly lower stock price need not have any effects on a company’s operation, other than possibly signaling there are existing company concerns or the demand for their products is falling.

      2. Terry Flynn

        Yep. Though I have decided that if I don’t laugh I’ll cry, so I got into a conversation with Grok about my latest dump, which is in moderation on the links thread!

        What have we been reduced to?

    1. Acacia

      It would indeed be pleasing to see the AI bubble pop, tho I am less sanguine about this actually happening near term.

  3. TiPi

    It’s widely reported that data centres, mostly around Dublin, now account for 21% of Ireland’s entire electricity consumption, and this is projected to more than double to 43% in barely a decade, creating major grid pressures for all other energy consumers.

    A house of chips rather than cards.

  4. Skeptical Scott

    Is this how humanity deals with the Climate Crisis?…by creating something unnecessary that requires more energy than ever before. It’s hitting the gas instead of the brakes as we head at 100mph into environmental catastrophe. It’s so surreal.

    1. The Rev Kev

      It really started with cryptocurrency about a decade ago, just when you had an emphasis on energy efficiency.

      1. Robert Hahl

        Around that time I thought of a sci-fi premise in which society was reduced to walled city-states, each dedicated only to producing electricity for bitcoin, while people outside lived without electricity. Looks more realistic now.

        1. david

          Part of me thinks we should leave these tech bros to their own little VR-based society and make the rest of the world fit for us. Make everyone happy. Though they crave the power too much.

        2. david

          I’ve just bought a new washing machine and it comes with wifi. I mean, why? What benefit is this for anyone? It is just another energy requirement.

          1. Keith Newman

            I just bought a new dishwasher (Bosch) and it operates with wifi. I’m able to turn it on from outside the country. Very useful. (sarc) Luckily the most basic functions don’t require the ridiculous app.

      2. david

        Electric cars are just as bad. Making them bigger and heavier and crammed full of more and more energy-guzzling devices. The exact opposite of what should have been done: encourage smaller and lighter electric cars and leave heavier-duty diesel cars for where there is a need for them.

        1. jefemt

          Traditional lead-acid batteries in Golf Carts, or even (clutches pearls) a human-powered bike or trike with baskets/ trailer) would serve more than 50% of most people’s transportation needs. Add the bonus of better overall health of body and mind.

        2. Gregorio

          I could never understand why so many EV manufacturers feel the need to put 750hp motors in them that will propel them from 0-60 in less than 3 seconds.

    2. Michael Fiorillo

      The growth model requires ever-increasing throughput (speed and volume), which is inherently entropic.

      Growth, Overshoot, Crash.

  5. paul

    I remember William Gates III explaining that A.I. will work out how to save more energy than it consumes.
    So no need to worry our little heads.

  6. Zephyrum

    The API costs for the GigaChat LLM in Russia are 12x higher than for OpenAI, and electricity there is on the order of 5x cheaper. The difference is that they are trying to make a profit today.

    At some point, when the US market-share grab becomes less subsidized by investors, I expect AI use and its power draw to drop as well. One could imagine prices increasing to cover the marginal operating costs of the data centers and associated R&D, which is probably at least 5x current pricing.

    1. Michael McK

      I agree with your general point, but 300 watts is a measure of power draw, not of total consumption over time, which would be measured in watt-hours. I have heard a brain’s power consumption is about 60 watts. Times 24 hours, that would equal about 1,440 watt-hours, or roughly 1.4 kWh (kilowatt-hours), per day.
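The watts-versus-watt-hours distinction the commenter draws can be sketched in one line of arithmetic (using the 60 W brain figure they cite):

```python
# Power (watts) is a rate; energy (watt-hours) is that rate sustained over time.
power_w = 60  # watts: instantaneous draw (the brain figure cited above)
hours = 24

energy_wh = power_w * hours    # 1440 Wh over a day
energy_kwh = energy_wh / 1000  # 1.44 kWh per day

print(f"{power_w} W for {hours} h = {energy_wh} Wh = {energy_kwh} kWh")
```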

  7. Skip Intro

    Ed Zitron has a very timely and detailed examination of how ChatGPT 5 works, and finds it massively more costly due to the adaptive router which decides where to assign various queries. Among other problems, this has broken the reuse of ‘static tokens’ which could previously be cached over the stages of a query.

    Every single thing that can happen when you ask ChatGPT to do something may trigger the “router” to change model, or request a new tool, and each time it does so requires a completely fresh static prompt, regardless of whether you select Auto, Thinking, Fast or any other option. This, in turn, requires it to expend more compute, with queries consuming more tokens compared to previous versions.

    As a result, ChatGPT-5 may be “smart,” but it sure doesn’t seem “efficient.” To play Devil’s Advocate, OpenAI likely added the routing model as a means of creating more sophisticated outputs for users, and, I imagine, with the intention of cost-saving. Then again, this may just be the thing that it had ready to ship — after all, GPT-5 was meant to be “the next great leap in AI,” and the pressure was on to get it out the door.

    By creating a system that depends on an external routing model — likely another LLM — OpenAI has removed the ability to cache the hidden instructions that dictate how the models generate answers in ChatGPT, creating massive infrastructural overhead.
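Zitron’s caching point can be illustrated abstractly. This is a toy sketch, not OpenAI’s actual architecture, and all names in it are hypothetical: if precomputed state for a static prompt is keyed on the serving model, then a router that reassigns a query to a different model necessarily misses the cache and must recompute from scratch.

```python
# Toy illustration (hypothetical, not OpenAI's implementation): a prefix cache
# keyed on (model, static_prompt). A router switching models invalidates the
# cached state, forcing an expensive recomputation of the static prompt.
cache = {}  # (model, static_prompt) -> precomputed state (stand-in: a string)

def serve(model: str, static_prompt: str) -> str:
    key = (model, static_prompt)
    if key in cache:
        return "cache hit"
    cache[key] = f"kv-state for {model}"  # stands in for costly prefix compute
    return "cache miss"

prompt = "system instructions..."
print(serve("model-fast", prompt))   # miss: first request builds the state
print(serve("model-fast", prompt))   # hit: same model, same static prompt
print(serve("model-think", prompt))  # miss: router switched models, state rebuilt
```

Every router-driven model switch in this sketch pays the full static-prompt cost again, which is the overhead Zitron describes.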

  8. Mikel

    “I don’t understand how something as destructive as ChatGPT is being aggressively embraced.”

    One way is to keep pumping “keep up with China” articles.
    The “keeping up with the USSR” playbook worked in developing the parasitic MIC.

    1. Ram

      Having used paid AI for coding for the last 10 days, it’s hard not to be impressed with what it does. From a clean slate to thousands of lines of code in a single day is easy. C-suites are loving it.

      But the wow factor goes away pretty fast. Once the system becomes complex enough, it’s faster to fix by hand. Keeping the model straight becomes more and more difficult. It’s breaking things as fast as it builds them.

