The Destructively High Energy Cost of AI

Yves here. Perhaps I have been looking in the wrong places, but I have seen precious little discussion of the high energy cost of artificial intelligence. One would think AI alarmists would be all over this issue, since Pigovian taxes (ones designed to make the private cost of AI as high as its social cost, including the cost of its fossil fuel use at current production levels) would presumably lower AI uptake. But nooes! The lack of publicity around this issue confirms the notion that AI concerniks are really out to get AI regulated simply to create moats to protect the investment made by tech titans.

This post usefully provides some preliminary data about AI energy consumption. Note the EU is trying to get AI systems to track their energy use.

By Felicity Bradstock, a freelance writer specialising in Energy and Finance. Originally published at OilPrice

  • Unregulated AI adoption may lead to a substantial increase in global energy consumption, similar to the impact seen with cryptocurrency mining.
  • Advanced technologies, particularly AI, are becoming essential tools in various industries, optimizing operations but potentially driving up energy usage.
  • The EU’s AI Act acknowledges the energy-intensive nature of AI systems, highlighting the need for global regulatory measures to address the surge in energy consumption from new technologies.

As the world welcomes innovative technologies, they could spur a sharp spike in energy consumption if not regulated appropriately. A wide range of industries, including the energy sector, is looking to artificial intelligence (AI) to modernize operations. However, many are not considering the potential energy costs of adopting AI and other innovative technologies. Just as we’ve seen with cryptocurrency mining, the use of new, advanced technologies is expected to significantly drive up energy usage across different industries, and a lack of management could lead to disaster.

There have been great advances in AI technology in recent years, leading many companies to adopt the technology and many individuals to gain a better understanding of it. It is quickly becoming an essential tool in everyday life, as it is used for a range of activities that we may not even consider. Checking into a flight, conducting a Google search, or using cruise control all rely on AI. For companies, the use of AI technology can optimize operations through smart decision-making and automation. It minimizes human error and typically drives up efficiency. It’s for this reason that so many companies are investing in the technology.

Each online interaction requires the use of remote servers – machines in data centers that use electricity to carry out operations. At present, data centers worldwide account for between 1 and 1.5 percent of the world’s electricity use, according to the International Energy Agency. While this figure may seem fairly low, the rapid rollout of new technologies, such as AI, is expected to drive up the sector’s energy usage significantly. There have been increasing discussions in the academic world around the high energy needs of AI, but this will have to quickly translate into national policy if we hope to manage future energy use in the sector.

An analysis published in October showed that the Nvidia Corporation, a multinational technology company, will be shipping around 1.5 million AI server units a year by 2027, which, when running, could equate to the annual use of 85.4 terawatt-hours of electricity. This is higher than the total electricity use of several small countries. Companies such as Nvidia now use advanced graphics processing units (GPUs) rather than simpler processors, called CPUs, to power operations, and GPUs require more energy to run. Brady Brim-Deforest, the CEO of Formula Monks, stated, “For the next decade, GPUs are going to be the core of AI infrastructure. And GPUs consume 10 to 15 times the amount of power per processing cycle than CPUs do. They’re very energy intensive.”
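
For scale, the implied per-server power draw can be sanity-checked with quick arithmetic, taking the article's two figures (1.5 million units, 85.4 TWh per year) and assuming round-the-clock operation:

```python
# Back-of-envelope check of the shipment estimate above. The 1.5 million
# servers and 85.4 TWh/year come from the article; continuous (24/7)
# operation is an assumption for the sake of the estimate.
servers = 1.5e6
annual_twh = 85.4
hours_per_year = 8760

total_kwh = annual_twh * 1e9  # 1 TWh = 1e9 kWh
kw_per_server = total_kwh / (servers * hours_per_year)
print(f"{kw_per_server:.1f} kW per server")  # ≈ 6.5 kW
```

Roughly 6.5 kW of continuous draw per server is several times a typical household's average load, which is consistent with the multi-GPU configurations these units carry.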

One of the technologies to take off in recent months is OpenAI’s ChatGPT, a chatbot that can conduct a humanlike conversation, respond to questions, and create written content. A recent paper from the University of Washington showed that hundreds of millions of queries on ChatGPT can cost around 1 gigawatt-hour a day, equivalent to the energy consumption of 33,000 U.S. households. A professor of electrical and computer engineering at Washington, Sajjad Moazeni, explained, “The energy consumption of something like a ChatGPT inquiry compared to some inquiry on your email, for example, is going to be probably 10 to 100 times more power hungry.”
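
The households figure is easy to sanity-check; assuming an average U.S. household uses roughly 30 kWh of electricity per day (a commonly cited average, not a number from the article), 1 GWh per day works out to about 33,000 households:

```python
# Sanity check of the "1 GWh/day ≈ 33,000 households" equivalence above.
# The 30 kWh/day household average is an assumption, not from the article.
gwh_per_day = 1.0
household_kwh_per_day = 30.0

households = gwh_per_day * 1e6 / household_kwh_per_day  # 1 GWh = 1e6 kWh
print(f"{households:,.0f} households")  # ≈ 33,333
```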

Industry experts expect the individual and industry use of AI to increase significantly in the coming years. We are at just one percent of where AI adoption is expected to be within the next two to three years. This means that governments must prepare their countries now to ensure the spike in energy use from advanced technologies is controlled early on.

While this seems like a negative outlook for AI and other innovative technologies, there are many plus sides to using advanced tech. Although energy use is high, these types of machines are typically much more efficient than humans, contributing to improved productivity and fewer human errors.

In Europe, the EU’s AI Act recognizes that AI systems will likely have a high energy consumption during their lifecycle. The legislation categorizes AI systems, setting out requirements for so-called “high-risk AI systems”. They must be designed and developed with logging capabilities that can record energy consumption, the measurement or calculation of resource use, and the environmental impact throughout the system’s lifecycle. At present, there are no regulations in place to reduce the energy consumption of AI technologies in the EU; rather, the European Parliament focuses on transparency and gaining a better understanding of the energy use of the advanced technology.

The adoption of AI technology, and other advanced tech, is happening rapidly and is expected to soar in the coming years. While this could dramatically boost operational efficiency across a range of industries, it is also expected to drive up energy consumption. The EU has taken a step in the right direction by encouraging greater transparency in AI, but governments worldwide must now consider introducing regulations to reduce unnecessary energy waste in advanced technologies.


31 comments

  1. Trees&Trunks

    Smart people doing stupid things. Or is it time for the word „smart“ to be redefined to mean „stupid“?
    Adequation, pejoration, antiphrasis. There are many tools out there

    How about founding governmental innovation/product control bodies that ask a few simple questions and demand proofs, to be thoroughly vetted by these bodies, before either approving the invention or throwing the inventor and „entrepreneur“ in jail for endangering life?
    Examples of questions:
    – will this invention reduce energy consumption per capita?
    – will this pharmaceutical cure the disease?
    – will this food increase the nutritional value?
    – will this reduce poverty?
    – will this tax billionaires to death?
    Etcetcetc

    Maybe we could have solved quite a few big problems?

    1. flora

      SMART is an acronym – Specific, Measurable, Assignable, Realistic and Time-related. Setting goals and objectives. Who, what committees are setting the goals? What are their real goals and objectives in SMART appliances? What is a SMART car or SMART household thermostat or SMART TV measuring and metering and reporting back to corporate that is to your benefit instead of to their benefit? Think about the SMART phone. aka an unregulated tracking device and communications spy, among its other features, imo (tracking and spying are not bugs). Data collection is the new gold rush./ ;)

      1. Lefty Godot

        With finance capitalism, you don’t just need more, you need an acceleration in more.

        Ever-increasing information mining of people’s private data.
        Ever-increasing amounts of advertising targeted to each individual.
        Ever-increasing levels of consumption of throwaway goods by everyone.
        Ever-increasing levels of personal indebtedness to afford that consumption.

        We know this can be sustained because we live in an unbounded system with infinite inputs, so we can keep consuming more natural resources and using more energy and pumping more toxic waste into our environment, like, forever, right?

        Isn’t this the worldview that finance capitalism requires that we all share?

  2. The Rev Kev

    Yet another example of magical thinking here going on. You look at the energy demands of AI & ChatGPT here, add in the energy demands for cryptocurrency which is needed to stop civilization collapsing apparently and then all the e-cars, e-bikes & e-scooters which will all need to plug in constantly to recharge themselves and I am not sure that there will be the energy to meet these demands. To do so you would need a massive overhaul of the present electrical grid but there does not seem to be the political will to do so. So I would guess that all those mega corporations will bid up the prices of electricity to secure what they need but which will have the effect of making energy costs much more expensive for the peons.

  3. Michaelmas

    OP: Each online interaction requires the use of remote servers – machines in data centers that use electricity to carry out operations.

    This right here. Why should the use of AI necessarily require the use of remote servers in corporate data centers belonging to Google and its ilk?

    After all —

    DeepMind AI accurately forecasts weather — on a desktop computer

    https://www.nature.com/articles/d41586-023-03552-y

    Sure, the development of LLMs requires the use of massive data sets and consequent electricity consumption.

    But the reality is that the a lot of the resulting systems have quite small kernels that don’t have to run in massive corporate datacenters and be accessed that way.

    The hype about extinction risk-level AI and so forth that the Effective Altruist crowd and certain other tech types have been promoting is in no small measure about elements of the tech industry realizing that they don’t really have a proprietary hold on this technology and therefore wanting to deploy government regulation to keep hold of it.

    e.g. If they have their way, users will only access many of these AI systems via massive corporate data centers for whose services they’ll have to pay monthly subscriptions.

    1. SocalJimObjects

      ” Why should the use of AI necessarily require the use of remote servers in corporate data centers belonging to Google and its ilk?”

      That’s the client-server model: thin, “stupid” clients coupled with remote servers. There are articles out there that will explain the pros and cons, but one advantage is that client computers need to install little more than a few software packages, such as a browser. Servers can also be updated with new code quickly, whereas if the software ran on client machines, each and every client would need to download the latest and greatest every few weeks or months. And then there are also compatibility issues, like one program running perfectly on Windows but not on Linux, or vice versa.

      In terms of Google Cloud, AWS, and Azure and other corporate data centers, these guys were initially super cheap, but nowadays cloud spending has to be one of the biggest expenses in a company’s income statement. Many companies were promised that prices would be competitive for a “long time”, and just like suckers, they all fell for it.

      1. Michaelmas

        SocalJimObjects: …Nowadays cloud spending has to be one of the biggest expenses in a company’s income statement. Many companies were promised that prices would be competitive for a “long time”, and just like suckers, they all fell for it

        Thank you.

        And this is the model Google Cloud, AWS, Azure, and the rest are trying to maintain

        1. t

          Yep. Personal computers are cheaper now, in some ways, but they have such short life spans that it doesn’t matter.

      2. Mikel

        A bit reminiscent of the old “liquor by the drink” laws in some states.
        Weird post-prohibition law where people had to take their own bottle of liquor to a bar, check it in, and buy drinks from the bottle they supplied.

    2. flora

      It’s all about control, centralized control. It’s creeping in everywhere, even in unis, even if it works less well – often much less well – than distributed management. Think the cartoon Dilbert, his pointy haired boss, and his boss’s dickhead boss. It doesn’t have to work well, it has to control. / ;)

    3. scott s.

      I run OpenAI’s “Whisper” on an Nvidia RTX 2060 with CUDA just fine, but could use more VRAM. But Amazon rolled out their “Trainium2” AI chip this week that blows everything else away.

  4. lyman alpha blob

    “…these types of machines are typically much more efficient than humans, contributing to improved productivity and fewer human errors.”

    Maybe I’m being nitpicky, but duh. If a machine is doing the work and not humans, then of course there will be fewer human errors. There will also be more machine errors. This sounds like something a stupid AI would write.

    1. Lefty Godot

      But if we efficiently “just add AI” to automate bad processes based on unintelligent management requirements, those processes will be more, uh, more…efficient. And don’t forget buzzword-compliant too! And that’s the goal, right? Doing bad stuff more efficiently with shinier tech.

    2. redleg

      It also means that fewer human input errors, as few as one, would mess things up on a global scale.
      This is where the cult of efficiency ends up.

    3. Adam1

      This reminds me of a joke from my youth: humans make mistakes, but to really screw things up requires a computer.

  5. vao

    Companies such as Nvidia now use advanced graphics processing units (GPUs) rather than simpler processors, called CPUs

    If measured in terms of transistor count, GPU and CPU have the same complexity. However, a CPU is a general-purpose processor with support for a wide range of tasks (including operating system-specific functions), while a GPU is a specialized processor geared towards parallel processing of large arrays of data (originally for graphics). A CPU is therefore more complex — and that is also why it is slower for the specific kind of tasks that are best performed by a GPU.

    these types of machines are typically much more efficient than humans, contributing to improved productivity and fewer human errors.

    By definition, if one replaces human beings with machines, then the outcome is fewer human errors. The crucial point is whether overall there are fewer errors, whatever their source, and if they are less critical.

    The meaning of productivity is also a minefield; the concept would need to be explicitly defined.

    There have been increasing discussions in the academic world around the high energy needs of AI

    The issue has multiple facets:

    a) Widespread usage of AI mechanically results in increased energy needs.

    b) For the same AI technique, a larger model (with more parameters) requires more energy to develop, fine-tune, and re-train than a smaller one.

    But there is another forgotten aspect that had been observed some years ago:

    c) We are already past the point of diminishing returns with current AI deep-learning technology, i.e. object detection accuracy increases only linearly as the number of training examples increases exponentially.

    While this [i.e. AI] could dramatically boost operational efficiency across a range of industries

    I doubt that this will happen as fast as is surmised by the author of the article, since there are issues with civil liability that have not yet been cleared up.

    1. StevRev

      There is an article in today’s Financial Times about how AI was used to discover 2 million new crystals (sorry I can’t link, its behind a paywall). Efficiency can be gained by directing research into more promising/less trial-and-error directions, saving lots of time and effort. I’ve seen similar gains in other areas…model training and prediction is less computer intensive than first principles computation, and directs the researcher into areas with higher probability of success.

      1. flora

        I agree there a many good uses for AI. There are also many bad uses for AI. It’s the bad uses that worry me.

    2. JustTheFacts

      I was going to make the same point.

      GPUs are much simpler than CPUs. One has 2048, 4096, 8192 or whatever copies of simple processors that can only add, multiply, etc. The key innovation is that one can run them all in parallel, and they are designed to process a lot of data from memory — which means they are architected to constantly switch tasks so that they do not block on requests to memory but instead work on any data that is available. Thus they are keeping track of many “threads” simultaneously in a way CPUs cannot.

      I also do not believe the statement “GPUs consume 10 to 15 times the amount of power per processing cycle than CPUs do. They’re very energy intensive.” They are very energy intensive, but we are making them do insane amounts of compute (1.7 trillion parameters for GPT-4 means orders of magnitude more multiplications and additions to generate a single token, which is part of a word). For instance, a recent study found: “at the same performance level, the GPU-accelerated system would consume 588 megawatt-hours less energy per month than a CPU-only system. Running the same workload on a four-way NVIDIA A100 cloud instance for a month, researchers could save more than $4 million compared to a CPU-only instance.”

      “A recent paper from the University of Washington showed that hundreds of millions of queries on ChatGPT can cost around 1 gigawatt-hour a day, equivalent to the energy consumption of 33,000 U.S. households.”

      So the cost of an email may be less. But to compare like to like, one should be comparing the cost of maintaining the human life for the time it takes the human to answer… presuming the human is immediately available. Presumably 33,000 U.S. households would be hard pressed to answer hundreds of millions of queries, given that, working an 8-hour day, they’d have to answer each query in less than 5 seconds (for one adult per household) or 10 seconds (for two adults per household). Here I’m conservatively assuming “hundreds of millions” means 200 million. It could be more.
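
      A quick script makes that arithmetic concrete, using the same assumptions as above (200 million queries, 33,000 households, an 8-hour workday):

```python
# Could 33,000 households answer 200 million queries in an 8-hour day?
# All three figures are the commenter's stated assumptions.
queries = 200e6
households = 33_000
seconds_in_workday = 8 * 3600  # 28,800 s

# Seconds available per query, for one or two working adults per household
sec_per_query_1 = households * seconds_in_workday / queries
sec_per_query_2 = 2 * households * seconds_in_workday / queries
print(f"{sec_per_query_1:.1f} s, {sec_per_query_2:.1f} s")  # ≈ 4.8 s and 9.5 s
```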

      Another aspect this article neglects is that even if inference is expensive, what really costs a lot is training these enormous models. So yes, power consumption is a concern, but unfortunately I don’t think this article is quite capturing the concern.

      Personally, I hope Helion succeeds, in which case we might have enough energy to start pulling excess CO2/CH4 out of the air and have energy left over for new use cases like this one.

  6. GramSci

    I found myself wondering why we need many large AI installations? One would think a single, super-omniscient AI could inform everyone.

    Then I realized the beauty of multiple AIs: If an off-the- shelf AI fails to deliver the answers you want, you can train up your own!

  7. SocalJimObjects

    Perhaps this will spur the companies to invest in more energy-efficient systems; hasn’t Microsoft talked about building mini nuclear reactors to train AI models? Perhaps these companies will be the ones pushing for green energy sources. Fat hope, I am sure, but hey, hopium is free.

  8. flora

    Setting goals and objectives. Who, what committee is setting the goals? What are their real goals and objectives in SMART applicances? What is a SMART car or SMART household thermostat or TV measuring and metering that is to your benefit and not their benefit)? / ;)

  9. EGrise

    If the ultimate use case is control, then energy doesn’t matter for AI any more than it does for prisons or NSA server farms. Got this from Arnaud Bertrand on the Twitters:

    According to the investigation, another reason for the large number of targets, and the extensive harm to civilian life in Gaza, is the widespread use of a system called “Habsora” (“The Gospel”), which is largely built on artificial intelligence and can “generate” targets almost automatically at a rate that far exceeds what was previously possible. This AI system, as described by a former intelligence officer, essentially facilitates a “mass assassination factory.”

    https://www.972mag.com/mass-assassination-factory-israel-calculated-bombing-gaza/

  10. Trajan

    It seems that those who want to use AI because it is ‘more efficient’ and reduces human error are forgetting the real purpose of all business activity. Although in law business leaders need to be subservient to shareholder demands – basically, to make money – this can only happen if, at the end of any chain of production, there are humans purchasing a product.

    If technology is used to take humans ‘out of the loop’, someone (or some group of people) reaps a benefit. But there is a cost elsewhere, as there is less need for other humans to be employed. The argument is put forward that those who lose those jobs can find gainful work elsewhere. Appeal is made to history to support this argument. But will this be true in the future?

    It is surely clear that the human race has reached the limits of safe use of the resources that are available in our finite world. It is also clear that the actions of one country can impact the lives of people living in other countries. But there is no accountability, no court with real power that can stop actions in one country from harming others.

    How this can be changed, I have no idea. But I suggest that there is a need for the issue to be acknowledged and discussed.

    1. Jams O'Donnell

      Don’t worry. The collapse of human civilisation* in the medium future will take care of all these problems.

      * Due to soil erosion, global warming, sea level rise, lack of adequate water supplies, chemical pollution, mass population movements, war breaking out in diverse areas, (possibly nuclear war, eventually), etc. etc.

  11. LY

    Is the intensive energy usage from training AI or running the resulting system? For some applications I’m aware of, it’s the training. The resulting output can run on a cellphone or other embedded system.

    For large language models, it seems to be both?

  12. Synoia

    So one needs to implement the AI where it can deliver copious amounts of hot water.

    Hospitals and Prisons come to mind.

  13. Dave

    Late stage neoliberalism when coupled to the unipolar moment is an era when economic and political elites in the west are in “let’s get rich baby” mode.

    The energy issues are widely known within the industry.

    The ways in which the industry tends to pursue AI projects are generally opaque and incompatible with the stuff that they dump into the media about the environment.

    In addition, the industry isn’t particularly transparent at publicising the carbon footprint of its AI projects, e.g.

    Artificial Intelligence Can’t Think Without Polluting
    https://slate.com/technology/2019/09/artificial-intelligence-climate-change-carbon-emissions-roy-schwartz.html

    The Environmental Impact of ChatGPT: A Call for Sustainable Practices In AI Development
    https://earth.org/environmental-impact-chatgpt/

    The rapid growth of the AI sector combined with limited transparency means that the total electricity use and carbon emissions attributed to AI are unknown, and major cloud providers are not providing the necessary information

    AI has a dirty secret: its carbon footprint
    https://www.thefuturelaboratory.com/blog/ai-has-a-dirty-secret-its-carbon-footprint

    For technology brands serious about sustainability, it’s time to consider the environmental impact of incessant digital development

    AI’s Dirty Secret: The Shocking Carbon Footprint Of The Technology That Powers Our Lives
    https://magazine.mindplex.ai/mp_news/ais-dirty-secret-the-shocking-carbon-footprint-of-the-technology-that-powers-our-lives/

    … experts are calling for greater transparency in AI model power consumption and emissions.

    “Let’s get rich baby” appears to require support from the mainstream mockingbirds such as “let’s tell lies”.

    AI built on ‘information’ delivered by gaslight?
    https://www.medicalnewstoday.com/articles/gaslighting

    Gaslighting is a form of psychological abuse where a person causes someone to question their sanity, memories, or perception of reality.
