Yves here. This is seriously not good. I don’t understand how something as destructive as ChatGPT is being so aggressively embraced. And it is not only imposing planetary-destruction costs but also more mundane increases in power costs for consumers generally.
By Haley Zaremba, a writer and journalist based in Mexico City. Originally published at OilPrice
- The energy consumption of the newest version of ChatGPT is significantly higher than previous models, with estimates suggesting it could be up to 20 times more energy-intensive than the first version.
- There is a severe lack of transparency regarding the energy use and environmental impact of AI models, as there are no mandates forcing AI companies to disclose this information.
- The increasing energy demands of AI are contributing to rising electricity costs for consumers and raising concerns about the broader environmental impact of the tech industry.
How much energy does the newest version of ChatGPT consume? No one knows for sure, but one thing is certain – it’s a whole lot. OpenAI, the company behind ChatGPT, hasn’t released any official figures for the large language model’s energy footprint, but academics are working to quantify the energy use per query – and it’s considerably higher than for previous models.
There are no mandates forcing AI companies to disclose their energy use or environmental impact, so most do not offer up those kinds of statistics publicly. As of May of this year, 84 percent of all large language model traffic was conducted on AI models with zero environmental disclosures.
“It blows my mind that you can buy a car and know how many miles per gallon it consumes, yet we use all these AI tools every day and we have absolutely no efficiency metrics, emissions factors, nothing,” says Sasha Luccioni, climate lead at an AI company called Hugging Face. “It’s not mandated, it’s not regulatory. Given where we are with the climate crisis, it should be top of the agenda for regulators everywhere,” she continued.
Sam Altman, the Chief Executive Officer of OpenAI, has thrown some figures out into the public sphere – saying that ChatGPT consumes 0.34 watt-hours of energy and 0.000085 gallons of water per query – but has left out key details, like which model these numbers refer to, and has offered no backup or corroboration for his statements.
Experts from outside the OpenAI fold have estimated that ChatGPT-5 may use as much as 20 times more energy than the first version of ChatGPT, and at the very least uses several times more. “A more complex model like GPT-5 consumes more power both during training and during inference. It’s also targeted at long thinking … I can safely say that it’s going to consume a lot more power than GPT-4,” Rakesh Kumar, a professor at the University of Illinois, recently told The Guardian. Kumar’s current work focuses on AI’s energy consumption.
While a query to ChatGPT in 2023 would have consumed about 2 watt-hours, researchers at the University of Rhode Island’s AI lab found that ChatGPT-5 can use up to 40 watt-hours of electricity to generate a medium-length response (around 1,000 tokens). On average, they estimate that the model uses slightly over 18 watt-hours for such a response. This places ChatGPT-5 at a higher energy consumption rate than any other AI model they track, save for two: OpenAI’s o3 reasoning model and Deepseek’s R1.
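To put those per-query figures in perspective, here is a minimal back-of-envelope sketch in Python. The 2 Wh and 18 Wh figures come from the University of Rhode Island estimates cited above; the daily query volume is a purely hypothetical assumption chosen only for illustration, not a figure from the article.

```python
# Back-of-envelope comparison of the per-query energy figures cited above.
# The 2 Wh (2023-era ChatGPT) and 18 Wh (GPT-5 average, ~1,000-token response)
# numbers come from the article; QUERIES_PER_DAY is a hypothetical assumption.

WH_PER_QUERY_2023 = 2.0          # watt-hours per query, 2023-era ChatGPT (article figure)
WH_PER_QUERY_GPT5 = 18.0         # watt-hours per medium-length GPT-5 response (article estimate)
QUERIES_PER_DAY = 1_000_000_000  # hypothetical daily query volume (assumption, for illustration only)

def daily_energy_mwh(wh_per_query: float, queries: int) -> float:
    """Total daily energy in megawatt-hours for a given per-query figure."""
    return wh_per_query * queries / 1_000_000  # Wh -> MWh

old = daily_energy_mwh(WH_PER_QUERY_2023, QUERIES_PER_DAY)
new = daily_energy_mwh(WH_PER_QUERY_GPT5, QUERIES_PER_DAY)

print(f"2023-era model: {old:,.0f} MWh/day")  # 2,000 MWh/day
print(f"GPT-5 average:  {new:,.0f} MWh/day")  # 18,000 MWh/day
print(f"Ratio: {new / old:.0f}x")             # 9x at the average figure
```

At the average 18 Wh figure the jump is roughly 9-fold; at the 40 Wh upper bound it is 20-fold, which is where the “up to 20 times” estimate comes from.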
Calculating these estimated energy consumption rates was no easy feat, considering the severe lack of transparency in the sector, in spite of increasing scrutiny. “It’s more critical than ever to address AI’s true environmental cost,” University of Rhode Island professor Marwan Abdelatti told The Guardian. “We call on OpenAI and other developers to use this moment to commit to full transparency by publicly disclosing GPT-5’s environmental impact.”
While tech companies consume more and more energy each year to power their AI ambitions, ordinary consumers are suffering the consequences. It is consumers who are footing the bill for skyrocketing energy usage. The New York Times warns that “electricity rates for individuals and small businesses could rise sharply as Amazon, Google, Microsoft and other technology companies build data centers and expand into the energy business.” Moreover, Silicon Valley’s backtracking on climate pledges will directly impact global communities, whether or not they ever use AI.
“We are witnessing a massive transfer of wealth from residential utility customers to large corporations—data centers and large utilities and their corporate parents, which profit from building additional energy infrastructure,” Maryland People’s Counsel David Lapp recently told Business Insider.
“Utility regulation is failing to protect residential customers, contributing to an energy affordability crisis.”
A lot of entropy increase for very little utility, as Galbraith might say. And it gets worse with every new version.
AI pimps, AI pimps, whatcha gonna do?
Whatcha gonna do when the 2nd law of thermodynamics comes for you?
https://www.youtube.com/watch?v=LVTOYWkwbPI&list=RDLVTOYWkwbPI
What happens when people learn that, because of all those AI data centers, their electricity bills are now higher than their mortgage payments?
https://www.youtube.com/watch?v=uvqJ1mTkEuY
‘Altman, Altman
Whatcha gonna do?
Whatcha gonna do when they come for you?’
Don’t worry, the “AI” bubble is just about to burst and it might even take the entire stock market with it.
This is a good thing because financialisation needs to be stopped if we are to bring our resource usage down to within planetary limits.
I’m looking forward to watching the stock market traders panic, it will be hilarious 😂
It’s widely reported that data centres, mostly around Dublin, now account for 21% of Ireland’s entire electricity consumption, and this is projected to more than double to 43% in barely a decade, creating major grid pressures for all other energy consumers.
A house of chips rather than cards.
Is this how humanity deals with the Climate Crisis?…by creating something unnecessary that requires more energy than ever before. It’s hitting the gas instead of the brakes as we head 100mph into environmental catastrophe. It’s so surreal.
It really started with cryptocurrency about a decade ago, just when there was a new emphasis on energy efficiency.