The Growing Environmental Footprint Of Generative AI

Yves here. This article usefully provides data on potential generative AI energy use and on why cheery notions that its appetite will fall are likely to prove wrong.

By David Berreby, who writes about AI and robotics; his work has appeared in The New York Times, National Geographic, Slate, and other publications. He is the author of “Us and Them: The Science of Identity.” Originally published by Yale Environment 360; cross-posted from Undark as part of the Climate Desk collaboration.

Two months after its release in November 2022, OpenAI’s ChatGPT had 100 million active users, and suddenly tech corporations were racing to offer the public more “generative AI.” Pundits compared the new technology’s impact to the Internet, or electrification, or the Industrial Revolution — or the discovery of fire.

Time will sort hype from reality, but one consequence of the explosion of artificial intelligence is clear: this technology’s environmental footprint is large and growing.

AI use is directly responsible for carbon emissions from non-renewable electricity and for the consumption of millions of gallons of fresh water, and it indirectly boosts impacts from building and maintaining the power-hungry equipment on which AI runs. As tech companies seek to embed high-intensity AI into everything from resume-writing to kidney transplant medicine and from choosing dog food to climate modeling, they cite many ways AI could help reduce humanity’s environmental footprint. But legislators, regulators, activists, and international organizations now want to make sure the benefits aren’t outweighed by AI’s mounting hazards.

“The development of the next generation of AI tools cannot come at the expense of the health of our planet,” Massachusetts Senator Edward Markey said in a Feb. 1 statement in Washington, after he and other senators and representatives introduced a bill that would require the federal government to assess AI’s current environmental footprint and develop a standardized system for reporting future impacts. Similarly, the European Union’s “AI Act,” approved by member states last week, will require “high-risk AI systems” (which include the powerful “foundation models” that power ChatGPT and similar AIs) to report their energy consumption, resource use, and other impacts throughout their systems’ lifecycle. The EU law takes effect next year.

Meanwhile, the International Organization for Standardization, a global network that develops standards for manufacturers, regulators, and others, said it will issue criteria for “sustainable AI” later this year. Those will include standards for measuring energy efficiency, raw material use, transportation, and water consumption, as well as practices for reducing AI impacts throughout its life cycle, from the process of mining materials and making computer components to the electricity consumed by its calculations. The ISO wants to enable AI users to make informed decisions about their AI consumption.

Right now, it’s not possible to tell how your AI request for homework help or a picture of an astronaut riding a horse will affect carbon emissions or freshwater stocks. This is why 2024’s crop of “sustainable AI” proposals describe ways to get more information about AI impacts.

In the absence of standards and regulations, tech companies have been reporting whatever they choose, however they choose, about their AI impact, said Shaolei Ren, an associate professor of electrical and computer engineering at UC Riverside, who has been studying the water costs of computation for the past decade. Working from Microsoft’s figures on annual water use by its cooling systems, Ren estimates that a person who engages in a session of questions and answers with GPT-3 (roughly 10 to 50 responses) drives the consumption of a half-liter of fresh water. “It will vary by region, and with a bigger AI, it could be more.” But a great deal remains unrevealed about the millions of gallons of water used to cool computers running AI, he said.
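As a rough illustration, Ren’s figure can be turned into a per-response estimate. The session size and per-session volume below come from the article; the daily query count is a purely hypothetical number chosen for scale, not a measured value:

```python
# Back-of-envelope estimate of fresh water consumed by chatbot queries,
# using the article's figure of ~0.5 L per session of 10-50 responses.
# All inputs are illustrative assumptions, not measured values.

def water_per_response_liters(liters_per_session=0.5,
                              responses_per_session=(10, 50)):
    """Return (high, low) liters of cooling water per single response."""
    fewest, most = responses_per_session
    return liters_per_session / fewest, liters_per_session / most

def daily_water_liters(responses_per_day, liters_per_response):
    """Scale a per-response figure to a hypothetical daily query volume."""
    return responses_per_day * liters_per_response

high, low = water_per_response_liters()       # 0.05 L down to 0.01 L each
# A hypothetical 100 million responses per day, at the high end:
print(daily_water_liters(100_000_000, high))  # ~5 million liters/day
```

The spread alone (a factor of five, before regional variation) shows why researchers want standardized reporting rather than one-off disclosures.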

The same is true of carbon.

“Data scientists today do not have easy or reliable access to measurements of [greenhouse gas impacts from AI], which precludes development of actionable tactics,” a group of 10 prominent researchers on AI impacts wrote in a 2022 conference paper. Since they presented their article, AI applications and users have proliferated, but the public is still in the dark about those data, said Jesse Dodge, a research scientist at the Allen Institute for Artificial Intelligence in Seattle, who was one of the paper’s coauthors.

AI can run on many devices — the simple AI that autocorrects text messages will run on a smartphone. But the kind of AI people most want to use is too big for most personal devices, Dodge said. “The models that are able to write a poem for you, or draft an email, those are very large,” he said. “Size is vital for them to have those capabilities.”

Big AIs need to run immense numbers of calculations very quickly, usually on specialized graphics processing units, or GPUs — processors originally designed for the intense computation needed to render graphics on computer screens. Compared to other chips, GPUs are more energy-efficient for AI, and they’re most efficient when they’re run in large “cloud data centers” — specialized buildings full of computers equipped with those chips. The larger the data center, the more energy-efficient it can be. Improvements in AI’s energy efficiency in recent years are partly due to the construction of more “hyperscale data centers,” which contain many more computers and can quickly scale up. Where a typical cloud data center occupies about 100,000 square feet, a hyperscale center can be 1 million or even 2 million square feet.

Estimates of the number of cloud data centers worldwide range from around 9,000 to nearly 11,000. More are under construction. The International Energy Agency, or IEA, projects that data centers’ electricity consumption in 2026 will be double that of 2022 — 1,000 terawatt-hours, roughly equivalent to Japan’s current total annual consumption.
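For scale, the doubling claim can be checked with back-of-envelope arithmetic. The ~500 TWh 2022 baseline below is an illustrative round number implied by the article’s figures, not an official IEA statistic:

```python
# Rough consistency check on the projection quoted above: data-center
# electricity use doubling between 2022 and 2026. The 2022 baseline is
# an illustrative round number implied by the article's figures.

baseline_2022_twh = 500
projected_2026_twh = baseline_2022_twh * 2  # ~1,000 TWh, roughly Japan's
                                            # total annual consumption

# Doubling over four years implies a compound annual growth rate of ~19%:
annual_growth_rate = 2 ** (1 / 4) - 1
print(projected_2026_twh, round(annual_growth_rate, 3))
```

A sustained ~19 percent annual growth rate is what makes grid planners nervous: few other large electricity loads grow that fast.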

However, as an illustration of one problem with the way AI impacts are measured, that IEA estimate includes all data center activity, which extends beyond AI to many aspects of modern life. Running Amazon’s store interface, serving up Apple TV’s videos, storing millions of people’s emails on Gmail, and “mining” Bitcoin are also performed by data centers. (Other IEA reports exclude crypto operations, but still lump all other data-center activity together.)

Most tech firms that run data centers don’t reveal what percentage of their energy use processes AI. The exception is Google, which says “machine learning” — the basis for humanlike AI — accounts for somewhat less than 15 percent of its data centers’ energy use.

Another complication is the fact that AI, unlike Bitcoin mining or online shopping, can be used to reduce humanity’s impacts. AI can improve climate models, find more efficient ways to make digital tech, reduce waste in transport, and otherwise cut carbon and water use. One estimate, for example, found that AI-run smart homes could reduce households’ CO2 emissions by up to 40 percent. And a recent Google project found that an AI rapidly crunching atmospheric data can guide airline pilots to flight paths that will leave the fewest contrails.

Because contrails create more than a third of commercial aviation’s contribution to global warming, “if the whole aviation industry took advantage of this single A.I. breakthrough,” says Dave Patterson, a computer-science professor emeritus at UC Berkeley and a Google researcher, “this single discovery would save more CO₂e (CO₂ and other greenhouse gases) than the CO₂e from all A.I. in 2020.”

Patterson’s analysis predicts that AI’s carbon footprint will soon plateau and then begin to shrink, thanks to improvements in the efficiency with which AI software and hardware use energy. One reflection of that efficiency improvement: as AI usage has increased since 2019, its percentage of Google data-center energy use has held at less than 15 percent. And while global internet traffic has increased more than twentyfold since 2010, the share of the world’s electricity used by data centers and networks increased far less, according to the IEA.

However, data about improving efficiency doesn’t convince some skeptics, who cite a social phenomenon called “Jevons paradox”: Making a resource less costly sometimes increases its consumption in the long run. “It’s a rebound effect,” Ren said. “You make the freeway wider, people use less fuel because traffic moves faster, but then you get more cars coming in. You get more fuel consumption than before.” If home heating is 40 percent more efficient due to AI, one critic recently wrote, people could end up keeping their homes warmer for more hours of the day.
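The rebound arithmetic the skeptics describe is easy to make concrete. In this minimal sketch, all numbers are hypothetical: a 40 percent efficiency gain is more than offset by an assumed 80 percent growth in demand:

```python
# Minimal sketch of the rebound-effect ("Jevons paradox") arithmetic
# described above. All numbers are illustrative assumptions.

def net_consumption(baseline_units, resource_per_unit,
                    efficiency_gain, demand_rebound):
    """Total resource use after an efficiency gain and a demand rebound.

    efficiency_gain: fraction less resource needed per unit (e.g. 0.40)
    demand_rebound:  fractional growth in units demanded (e.g. 0.80)
    """
    new_per_unit = resource_per_unit * (1 - efficiency_gain)
    new_units = baseline_units * (1 + demand_rebound)
    return new_units * new_per_unit

baseline = net_consumption(100, 1.0, 0.0, 0.0)   # no change: 100 units
# Heating becomes 40% more efficient, but people keep their homes warmer
# for more hours, so demand grows 80%:
rebound = net_consumption(100, 1.0, 0.40, 0.80)  # ~108 units: more than before
print(baseline, rebound)
```

Whether total consumption falls or rises depends entirely on whether demand growth stays below the efficiency gain — which is exactly the empirical question the skeptics raise.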

“AI is an accelerant for everything,” Dodge said. “It makes whatever you’re developing go faster.” At the Allen Institute, AI has helped develop better programs to model the climate, track endangered species, and curb overfishing, he said. But globally AI could also support “a lot of applications that could accelerate climate change. This is where you get into ethical questions about what kind of AI you want.”

If global electricity use can feel a bit abstract, data centers’ water use is a more local and tangible issue — particularly in drought-afflicted areas. To cool delicate electronics in the clean interiors of the data centers, water has to be free of bacteria and impurities that could gunk up the works. In other words, data centers often compete “for the same water people drink, cook, and wash with,” said Ren.

In 2022, Ren said, Google’s data centers consumed about 5 billion gallons (nearly 20 billion liters) of fresh water for cooling. (“Consumptive use” does not include water that’s run through a building and then returned to its source.) According to a recent study by Ren, Google’s data centers used 20 percent more water in 2022 than they did in 2021, and Microsoft’s water use rose by 34 percent in the same period. (Google data centers host its Bard chatbot and other generative AIs; Microsoft servers host ChatGPT as well as its bigger siblings GPT-3 and GPT-4. All three are produced by OpenAI, in which Microsoft is a large investor.)

As more data centers are built or expanded, their neighbors have been troubled to find out how much water they take. For example, in The Dalles, Oregon, where Google runs three data centers and plans two more, the city government filed a lawsuit in 2022 to keep Google’s water use a secret from farmers, environmentalists, and Native American tribes who were concerned about its effects on agriculture and on the region’s animals and plants. The city withdrew its suit early last year. The records it then made public showed that Google’s three extant data centers use more than a quarter of the city’s water supply. And in Chile and Uruguay, protests have erupted over planned Google data centers that would tap into the same reservoirs that supply drinking water.

Most of all, researchers say, what’s needed is a change of culture within the rarefied world of AI development. Generative AI’s creators need to focus beyond the technical leaps and bounds of their newest creations and be less guarded about the details of the data, software, and hardware they use to create it.

Some day in the future, Dodge said, an AI might be able — or be legally obligated — to inform a user about the water and carbon impact of each distinct request she makes. “That would be a fantastic tool that would help the environment,” he said. For now, though, individual users don’t have much information or power to know their AI footprint, much less make decisions about it.

“There’s not much individuals can do, unfortunately,” Ren said. Right now, you can “try to use the service judiciously,” he said.

Correction, February 21, 2024: An earlier version of this article incorrectly quoted researcher Dave Patterson as referring to CO₂ emissions from global aviation. Patterson was actually referring to CO₂e (“carbon dioxide equivalent”) emissions, a measurement that includes both CO₂ and other greenhouse gases.



  1. Barnes

    “That would be a fantastic tool that would help the environment,”

    Replace “would” by “could” and this sentence has some truth to it.
    I bet the Jevons paradox will prevail, because the competitive logic of the whole enterprise does not allow for scaling down by choice. And I simply cannot fathom why this should now become a precedent. If we don’t decisively use efficiency gains to reduce absolute resource consumption (which would require societies to make ethical choices in the first place), it will be more of the same but faster.
    IMHO real AGI as in “the singularity” preached/feared by techno-utopians is in a race against dwindling resources, competing directly against what keeps modern civilisation running. The race is on and its finale is open.

    1. digi_owl

      The hope of AGI is that its exponential takeoff will not have an S curve, and that it will discover some kind of end run around thermodynamics.

      The fear is that it will just turn into a paperclip maximizer (effectively a variant of the grey goo scenario), ignoring that we already have those in the form of the corporation.

      And yeah, Jevons is the basic horror nobody, least of all economists, wants to talk about. The irony is that the best summary of it was delivered by Agent Smith of Matrix fame, only for the whole franchise to be turned into yet more idpol in order to bury the millennial message.

    2. Susan the other

      Recycling to the rescue? Is there something about the density of cold water that makes it so useful for cooling electronic centers? Could we use treated sewage instead? Seems appropriate. But too corrupted probably. Cloud data cooling center facilities might smell even funnier. I wonder if all the speed that AI produces is mostly counterproductive because real biological processes, say those of epigenetics, take a generation or two. So that AI might evolve at a speed to be resilient to subtle biological cues. Even obstruct them. Let AI write that meta poetry please. On a practical note, there truly is energy to be extracted from all those vast volumes of human waste. Not to mention plastics. And the absurd danger of AI becoming so circular that it doesn’t recognize its own polluted demise seems like it could become a real threat. Cosmic even.

  2. SocalJimObjects

    Meanwhile, Nvidia just posted stellar earnings, and they are raising estimates for the next quarter. This horror show will go on till who knows when.

  3. Keith

    Lumping in scientific HPC-style computing with “cloud” and “AI” is kind of unfair when talking about the cost/benefit analysis of power consumption versus societal benefit. All the benefits listed above do not come from cloud monopolies (AWS, Google, MSFT), nor are they going to come from generative AI (OpenAI, Meta, Google). IMHO scientific research computing is very frugal with both its resource consumption and utilization. Cloud and generative AI, well, not so much; lots of money is made on oversubscription and underutilization (with little to no societal benefit and lots of environmental damage).

    Not all compute is grossly inefficient and power hungry. However, GPUs are notorious power hogs, a single GPU can consume ~500W when crunching numbers. The proliferation of generative AI means stuffing hundreds of GPUs in a cluster of servers which is straining the power constraints of most data centers and is the least socially responsible computing “workload” in recent years (see

    1. digi_owl

      Because the GPU at its core is brute force. You set up a set of instructions and then ramrod the data through them continuously. Never mind that silicon is hitting its limits as IC components are measured in mere atoms.

  4. Mikerw0

    Yves — it strikes me your introductory comments are way too mild. Let’s call a spade a spade. This is a resource-devouring technology whose primary objective is to further the libertarian goal of downward pressure on labor. It is consuming resources that are increasingly vital to maintaining a functioning society and are needed for more important things, like food production.

    The notion that this is an information issue, so that ‘educated’ consumers can make decisions, is complete bunk. We know that is not how the world works. And, as we have seen with many other of these technologies, they get embedded into things, done secretively, and in a way that you would need highly complex AI (if it existed) to figure out — by design.

    There is zero chance, in my view, that we will get useful legislation to create clarity. It is the wrong objective.

    Does anyone really think that airlines, and regulators, will start re-routing planes over contrails, as in the example cited above? No chance in the actual world. As has been more than well articulated on NC, our leaders are just not serious about climate change and its implications. So, of course they will let things like AI compound the problem.

    How about we start stewarding power and water and stop de-facto subsidizing these things in the name of enriching a bunch of Silicon Valley types.

    1. gwb

      The Cloud, like modern finance, is an exponential system and is incompatible with finite planetary resources. The thing will keep growing and growing until it runs out of resources or collapses of its own dead weight. AI’s potential to save energy? Doesn’t matter – not when 500 hours of video are uploaded to YouTube every minute or 4.5 billion e-mails are sent every day. Every new data center will fill up soon after it’s online. The suggestion at the end of the article that we use AI “judiciously”? Ha. When did modern humans do anything “judiciously”…

  5. John Anthony La Pietra

    From my first career in transportation planning, I know of the Jevons paradox in the form of the Fundamental Law of Highway Congestion (first conjectured in 1962). See, for example:

    It basically says that building new roads, adding lanes, etc. in order to ease traffic jams tends to generate/enable more travel and perpetuate the congestion.

    Not a huge mystery to anyone with eyes to see and time to observe the roads one is stuck on. Even the callow youth I was in the late 1970s and through most of the 1980s could perceive the unknowing independent re-discovery I called the Field of Dreams Rule (“If you build it, they will come” — or go).

    I don’t really see the concept as a paradox; if you make something easier or cheaper or less unpleasant to do (or seem so), of course people are going to do it more. By the same logic, I can well believe the concern and skepticism discussed in this article.

    1. digi_owl

      A variant of that is also seen online, as increases in bandwidth (an abuse of the term, but here we are) have basically turned the internet into cable TV with a return pipe.

  6. i just dont like the gravy

    If only DFW didn’t kill himself and lived to see he was right about Infinite Jest in every way possible…

  7. New_Okie

    AI-run smart homes could reduce households’ CO2 emissions by up to 40 percent.

    I find this both recklessly optimistic and misleading.

    I clicked through the supplied link, which linked an article that linked to a paper “sponsored” by a bunch of tech companies like Microsoft and Verizon. And I didn’t see the 40% figure in the smart home section. They cite an example where smart thermostats saved 15%-30% on power use, so maybe the rest comes from…smart meters and time of use billing encouraging people to use more renewables? I’m not clear.

    So already the stat seems overegged.

    However there is a larger issue, which is that none of this requires AI to work. We don’t need AI for the smart grid to work–we had the smart grid before AI. And we don’t need AI to create a programmable thermostat that uses less power when no one is in the building. Simple timers and switches can achieve the same. Indeed, I did a search for “how much energy do smart homes save” and found other figures more in the 1-15% range. [ ] And homes that had a programmable thermostat before the smart thermostat was installed saw very little in the way of savings.

    Even if the smart thermostats were way, way, better than regular programmable thermostats, they did not require AI to work, just more complex programming. Perhaps AI will be used to enhance them further, but a fraction of a fraction of heat savings isn’t much to brag about.

    So there are some savings to be had by using technology to keep building temperature more flexible. But I am puzzled at why the author attributes these gains to AI.

    1. Jeremy

      This casual suggestion that putting an LLM in your thermostat will magically reduce emissions by 40% is the perfect encapsulation of AI discourse. No one wants to call snake oil snake oil.

  8. veritea

    The discussion also misses the nuance that the vast majority of energy consumption takes place when training the models. The actual use of the models is relatively low power and could be performed on a desktop computer (albeit with a slower response rate than the cloud resources).

    Right now we are in a constant race between companies to train larger models and models trained on multi-modal data. If this approach runs out of steam and model improvement with scale slows down (as I am near-certain will be the case), or even reverses (a distinct possibility as well), the demand for training energy will drop dramatically.

    It will only take one AI model company to release a desktop-optimized trained model to pretty much bring the edifice crashing down (like the recently released NVIDIA Chat with RTX, but an actually cutting edge model). The companies themselves are well aware of this and are desperate for regulation that will provide them with a moat that prevents this from happening. The only way this ends up profitable long-term is if the government can be prodded to regulate AI as “unsafe” unless it meets a large volume of complicated requirements that are only practical for very large companies to pull off. Without this, it only takes one company to destroy the business of everyone else.

  9. dirke

    I’m currently working on an article on the upcoming energy crisis. Here’s a short analysis of the impact of AI on energy, in addition to what’s stated above.
    One, the whole electrical grid of the United States is physically falling apart. Estimates are that at least two trillion dollars is needed just to fix it and handle expected EV needs. This is most likely low. How much more is it going to cost for AI? Remember a few years back, when China shut down Bitcoin mining? At the same time they greatly limited computer gaming due to energy shortages. China is currently starting to come down again on gaming over power consumption and other issues.
    Two, communication data (the internet) infrastructure, like the grid, is in bad shape. Look at today’s service outage. Communications require power. The net is currently being overtaxed. The telcos will soon be crying for money to fix it. Where I’m at they’re laying off switch engineers, canceling equipment upgrades, and closing NOCs (network operations centers). So how in hell can all the new AI traffic be handled? AI is an internet data vampire that will mostly suck the internet dry.
    I’ll let you all ponder this. If Yves is interested I’ll post it when I get it finished.

  10. Craig Dempsey

    Let me speak up for programmable thermostats. We recently sold our old house, in which we had a battery-powered programmable thermostat. It even worked well with time-of-day billing. We set it for the coldest we could usually tolerate in winter, and the warmest in summer. We timed it to use the end of each cheap-electricity cycle. We used HI (Human Intelligence) to temporarily override the settings when we felt the need. Maybe once or twice a year we changed the two AA batteries. Finally, when we sold the house, the buyers were astounded at how low our utilities were, especially with an old furnace and AC. Well, we did have one extra secret: when we bought the house in 1979 we immediately added about 50 rolls of extra insulation. That fiberglass saved a lot more money than any AI could.

  11. Domestic Extremist

    At some point it will emerge that human intelligence is the most environmentally friendly option. All we need do is relearn the three Rs and the fine art of critical thinking…

  12. Kevin Smith

    Almost every day I see reports of highly energy-efficient photonic [as opposed to electronic] devices under development which are expected to greatly reduce the amount of energy consumed per unit of computing work. In addition, I frequently see reports of efficiency improvements in AI algorithms. The power companies can of course see this also, and may hold back on increasing power production and transmission facilities, expecting that within a few years power demand for mass computation centers will start to decline.
