Consumers Are Footing the Bill for AI’s Insatiable Appetite for Energy

Yves here. Perhaps most readers were aware, but I have to admit that even with all of the considerable alarm about the staggering energy demands of AI, I had missed a key point. I had thought the ginormous energy-eating was due to training-set demands: it was generally understood that widespread LLM implementations like ChatGPT still had marked gaps, so feeding more data into their maw was an ongoing activity, with accompanying large power needs. It turns out that using these AI models also carries an extremely high energy price tag.

In all seriousness, all of you need to stop using AI. Immediately. We’ll provide some examples below of the simply grotesque energy costs of using AI even for seemingly small tasks. You are directly and greatly contributing to the acceleration of climate change and other resource degradation, like increased use of water, by using AI at all. If you drive an EV out of environmental concern but elect to use AI, you are either ignorant or a hypocrite.

This article fails to point out that, as far as the US is concerned, we’ve chosen to intensify the damage. Silicon Valley and its bought-and-paid-for political allies have lashed themselves to the mast of these so-called generative AI models, when DeepSeek has already proven to be more efficient in terms of power needs. From Energy Digital, summarizing a study by Greenly which compared ChatGPT-4 with DeepSeek:

Building and running large language models (LLMs) such as ChatGPT-4 and DeepSeek requires substantial computing power

This involves not only high electricity use but also dependence on water for cooling and energy-intensive chip manufacturing.

AI hardware production involves mining rare earth minerals, a process that can result in soil erosion, water contamination and wider pollution.

ChatGPT-4, for example, operates with 1.8 trillion parameters – 20 times more than earlier versions…

In a scenario where an organisation relies on ChatGPT-4 to answer one million emails per month, Greenly calculates the yearly emissions at 7,138 tCO₂e – equivalent to 4,300 round-trip flights from Paris to New York.

Even small tasks carry energy costs.

According to research from Carnegie Mellon University and Hugging Face, a single text-based prompt consumes as much energy as charging a smartphone to 16%.

Under routine conditions, the same email use would still produce 514 tCO₂e per year.

Greenly found that text-to-image models like DALL-E produce up to 60 times more CO₂e than standard text generation.

Amid concerns about AI’s energy demands, DeepSeek offers a potential way forward.

The Chinese-developed model employs a Mixture-of-Experts (MoE) architecture, meaning it only activates relevant sub-models for each task rather than the entire model.

This drastically reduces the power required per operation.

Whereas ChatGPT-4 was trained using 25,000 NVIDIA GPUs and Meta’s Llama 3.1 used 16,000, DeepSeek used just 2,000 NVIDIA H800 chips.

These chips also draw less power than previous models.

As a result, DeepSeek consumed a tenth of the GPU hours compared to Meta’s model.

This not only brings down its carbon footprint but also lessens the load on servers and reduces water usage needed for cooling.
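The Mixture-of-Experts idea described above can be illustrated with a toy routing sketch. This is purely illustrative: the expert count and top-k value here are assumptions for the example, not DeepSeek's actual configuration.

```python
# Toy sketch of Mixture-of-Experts (MoE) routing: only the top-k scoring
# experts run for each token, so most parameters sit idle per operation.
# Expert count and k are illustrative, not DeepSeek's real settings.

def moe_active_fraction(n_experts: int, k: int) -> float:
    """Fraction of expert parameters activated per token with top-k routing."""
    return k / n_experts

def route_token(scores: list[float], k: int) -> list[int]:
    """Pick the indices of the k highest-scoring experts for one token."""
    return sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)[:k]

# With 8 equally sized experts and top-2 routing, only 25% of expert
# parameters do work on any given token:
print(moe_active_fraction(8, 2))  # 0.25
print(route_token([0.1, 0.7, 0.05, 0.9, 0.2, 0.3, 0.15, 0.4], 2))  # [3, 1]
```

The power saving follows directly: compute per token scales with the activated fraction, not the total parameter count.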

See this Q&A from The Grainger College of Engineering for more technical detail on the merits of DeepSeek.
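Greenly's email scenario quoted above can be cross-checked with simple arithmetic, treating the quoted totals as given:

```python
# Back-of-envelope check on Greenly's quoted figures: 1M emails/month
# answered by ChatGPT-4, 7,138 tCO2e per year, said to equal 4,300
# Paris-New York round-trip flights.
emails_per_year = 1_000_000 * 12
total_tco2e = 7_138

kg_per_email = total_tco2e * 1000 / emails_per_year   # ~0.59 kg CO2e per email
tco2e_per_flight = total_tco2e / 4_300                # ~1.66 t per round trip

print(f"{kg_per_email:.2f} kg CO2e per email")
print(f"{tco2e_per_flight:.2f} tCO2e per round-trip flight")
```

The implied ~1.66 tCO₂e per round trip is in the commonly cited range for one passenger flying Paris to New York and back, so the two equivalences are at least internally consistent.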

By Haley Zaremba, a writer and journalist based in Mexico City. Originally published at OilPrice

  • The rapid growth of data centers, particularly due to AI, is significantly increasing energy demand and jeopardizing clean energy initiatives by extending the life of fossil fuel plants and promoting new ones.
  • The issue is compounded by “phantom data centers,” which inflate projected energy demand and give utilities leverage to expand fossil fuel infrastructure.
  • This surge in energy demand and the resulting infrastructure projects are projected to lead to higher energy bills for consumers, especially in the Southeast United States.

As data centers place more and more demand on global power grids, policy and economic priorities are shifting from creating more clean energy to creating more energy, period. Projected clean energy additions are simply not enough to meet the runaway demand of the global tech sector, meaning that climate goals could be at risk.

The proliferation of artificial intelligence is causing massive increases in energy demand from data centers, and the areas that host them are struggling to keep up. A 2024 study from scientists at Cornell University found that generative AI systems like ChatGPT use up to 33 times more energy than computers running task-specific software. As a result, it is estimated that each AI-powered internet query consumes about ten times more energy than traditional internet searches. But these numbers are just our best guess – we don’t really know how much energy AI is sucking up, because the companies who are piloting AI platforms aren’t sharing those numbers.
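To make the "about ten times" multiplier concrete, here is a back-of-envelope sketch. The 0.3 Wh baseline for a conventional search and the query volume are outside assumptions for illustration, not figures from the article:

```python
# Rough per-query energy comparison implied by the "about ten times" claim.
# The 0.3 Wh baseline for a conventional search is a commonly cited
# estimate, and the query volume is an illustrative assumption.
search_wh = 0.3
ai_query_wh = search_wh * 10          # ~3 Wh per AI-powered query

queries_per_day = 1_000_000_000       # illustrative assumption
extra_mwh_per_day = (ai_query_wh - search_wh) * queries_per_day / 1_000_000

print(f"{extra_mwh_per_day:.0f} MWh/day of incremental demand")  # ~2,700
```

Even under these rough assumptions, a billion AI-assisted queries a day would add on the order of a mid-sized power plant's daily output.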

But we know that the overall picture is pretty grim. Last year, Google stated that the company’s carbon emissions had skyrocketed by a whopping 48 percent over the last five years. “AI-powered services involve considerably more computer power – and so electricity – than standard online activity, prompting a series of warnings about the technology’s environmental impact,” the BBC reported last summer. While Google hasn’t publicly revised its goal of becoming carbon neutral by 2030, the tech firm has admitted that “as we further integrate AI into our products, reducing emissions may be challenging.”

Already, the uptick in energy demand from data centers is causing new plans for gas- and coal-powered plants as well as extending the life of existing fossil fuel operations across the United States. Utility Drive reports that “at least 17 fossil fuel generators originally scheduled for closure [are] now delaying retirement” due to data center demand, and that “utilities in Virginia, Georgia, North Carolina and South Carolina have proposed building 20,000 MW of new gas power plants by 2040” for the same reasons.

The issue is particularly acute in the Southeast. Major utilities in Virginia, North Carolina, South Carolina and Georgia project that they will collectively add 32,600 MW of electrical load over the next 15 years. The Institute for Energy Economics and Financial Analysis reports that in Virginia, South Carolina and Georgia, “data centers are responsible for 65% to more than 85% of projected load growth.”
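A quick calculation shows what that attribution range implies in megawatts:

```python
# Rough data-center share of the projected 32,600 MW of new load in the
# Southeast, applying the 65%-85% attribution range quoted from IEEFA.
projected_mw = 32_600
low, high = 0.65, 0.85

dc_low_mw = projected_mw * low     # ~21,190 MW
dc_high_mw = projected_mw * high   # ~27,710 MW
print(f"{dc_low_mw:,.0f} to {dc_high_mw:,.0f} MW attributable to data centers")
```

In other words, somewhere between roughly 21 and 28 gigawatts of the projected new load in these four states would be for data centers alone.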

However, it could be the case that this projected demand growth is overblown, and that states will add extra gas power capacity – and therefore extra greenhouse gas emissions – unnecessarily. Because the competition for energy sources is so fierce between data centers, the project managers of new centers are likely to reach out to many different power providers at once with speculative connection requests, creating redundancies and a compounding issue of “phantom data centers.” This inflates demand and makes accurate projections extremely difficult.

A study published last year by Lawrence Berkeley National Lab calculated just how big the phantom data center issue might be, finding that projected energy demand could be as much as 255 terawatt-hours higher than real energy demand. That’s enough energy to power more than 24 million households.
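The household equivalence is easy to verify; the result is consistent with typical US residential consumption of roughly 10,500 kWh per year (that average is an outside assumption, not a figure from the study):

```python
# Check: 255 TWh of phantom demand spread over 24 million households.
phantom_twh = 255
households = 24_000_000

kwh_per_household = phantom_twh * 1e9 / households  # TWh -> kWh, then divide
print(round(kwh_per_household))  # 10625 kWh/year per household
```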

However, it’s not in utilities’ interest to simplify interconnection processes and ferret out phantom data centers. In fact, the panic over rising energy needs from data centers is giving them great leverage to expand their businesses and push through huge fossil-fuel powered energy projects. Plus, while building new plants and extending the lives of old plants is costly, those costs will be borne by the ratepayers.

Consumers across the U.S. – and especially in the data-center-laden Southeast – can expect their energy bills to rise in response. “We are witnessing a massive transfer of wealth from residential utility customers to large corporations—data centers and large utilities and their corporate parents, which profit from building additional energy infrastructure,” Maryland People’s Counsel David Lapp recently told Business Insider. “Utility regulation is failing to protect residential customers, contributing to an energy affordability crisis.”


47 comments

  1. Mikerw0

    Karn Evil 9, 3rd Impression

    No man yields who flies in my ship
    DANGER!
    Let the bridge computer speak
    STRANGER!
    LOAD YOUR PROGRAM. I AM YOURSELF
    No computer stands in my way
    Only blood can cancel my pain
    Guardians of a nuclear dawn
    Let the maps of war be drawn

    Rejoice! Glory is ours!
    Our young men have not died in vain
    Their graves need no flowers
    The tapes have recorded their names

    [Outro]
    I am all there is
    NEGATIVE! PRIMITIVE! LIMITED! I LET YOU LIVE
    But I gave you life!
    WHAT ELSE COULD YOU DO?
    To do what was right
    I’M PERFECT. ARE YOU?

    1. mrsyk

      Step inside! Hello! We’ve a most amazing show
      You’ll enjoy it all we know
      Step inside! Step inside!

      Be the barker at the carnival show

      1. Laughingsong

        I love all the ELP references!

        Welcome back my friends,
        To the show that never ends!
        We’re so glad you could attend,
        Come inside, come inside….

  2. Nealser

    I disagree. AI datacenters are a relatively clean industry powering the next technology leap forward. The focus should be on building more low carbon electricity capacity to meet the needs of industry.

    1. Laughingsong

      Point one: they are only as “clean” as their source of energy, and
      point two: the relative “cleanliness” is not the point, but rather their rapacious usage of both power and water.

      As for the technological “leap forward”, well maybe…. I do doubt it. The main thing I see when I deal with LLM implementations is the same thing I see with all software: it’s only as good as the management let it get before they forced it to be released due to pressure to make revenue. Don’t worry, the next point release will take care of those “hallucinations”.

      It’s not intelligence and it is still very far from it. It was trained on the cesspit called the internet and apparently has no capability of discerning between good and bad data; it’s completely agnostic, amoral.

      Finally I would say that yeah we’ve needed to address the power production and grid demands for a while now, even without AI. But for actual, you know, people? Crickets. For the tech bros? No problem! Hmmm.

        1. david

          Well of course. It makes perfect sense to put something that needs a lot of cooling in the middle of a desert with water shortages. You’d think if AI was so good it would have told them that is a bad idea.

    2. Reader Keith

      Clean datacenter power is still wasted when using an LLM versus a simple Google search. And LLM output is being forced down our throats and wasting energy that negatively impacts most regular folks.

      If Google searches weren’t so ensh!ttified (see Ed Zitron) maybe we wouldn’t turn to lousy LLM summaries?

      Maybe this grotesque waste of energy will spur clean energy innovation, which could be a positive thing, but in the meantime it’s a means to inflate multiple bubbles.

      1. Catchymango

        If you use Apple devices you should look up the app called Lucky by a developer named “And a Dinosaur”. It proxies your Google searches to remove trackers, and returns the old school “5 blue links” style results page, no junk. It’s like $10 bucks I think.

        IMO it’s worth trying it out if only to get rid of the awful Google junk like “trending searches”, and the constant prompts to log in. I’m currently using it on my phone, tablet and computer. The real benefit, however, is the ability to block domains. I’ve started blocking most retailer or event URLs and I feel like I’m rediscovering the internet.

        Whereas before searching Quinn Slobodian’s name might get you dozens of retailers, pages for RSVPing to talks that already came and went, and maybe even the websites of AI tools offering to summarize his books, now I often see PDFs of academic articles hosted on professors’ personal websites, blogs, news articles and forum posts, especially now that I am more inclined to scroll beyond the first 4-5 results. It’s not perfect, but as I continue to refine my filters I expect it will keep getting better.

        The dev also sells another add-on to this for 99 cents called Lucky Notes, if you want to further curate your search experience by searching your notes alongside the web. I’m still exploring how to effectively integrate that one. But just thought I’d share my experience as I’ve been pleasantly surprised by this low-tech alternative to bypassing google’s strangulation of search.

    3. jefemt

      “Relatively clean”. That is a slippery slope.
      My experience and observations reinforce the adage that there is NO free lunch when it comes to energy, and that most energy involves a heck of a lot of carbon, and/or significant environmental degradation.
      With all the cost accountants and folks trained in hard science, we somehow still know very little about full life-cycles cradle to grave, including ‘externality’ knock on effects, for most products and industrial activities.
      We don’t want to know… plausible deniability matters!
      I am very skeptical of the assertions that AI energy sources are clean.
      I doubt A.I. will prompt a “leap forward”. Off the brink? That I might concede.

      I imagine a well-crafted closed probe to A.I. about its energy use, and the impact on the earth’s biomes and ecosystem would affirm that A I is net negative, and that, indeed, No Lives Matter.

    4. Acacia

      Haha. Really?

      There are a number of firms with a global network of data centers, e.g., Amazon, but let’s just consider Google. They claim their data centers will be carbon neutral five years from now, but their 2024 Environmental Report stated the following:

      In 2023, our total GHG emissions increased 13% year-over-year, primarily driven by increased data centre energy consumption and supply chain emissions.

      Compared to 2019, that figure is 48% higher, and notice they are mentioning supply chain emissions, which, at the end of the day, may be beyond their control.

      Interestingly, they are trying to argue that “AI holds immense promise to drive climate action. In fact, AI has the potential to help mitigate 5–10% of GHG emissions by 2030.” (N.B. typical white paper weasel words “potential” and “promise”). Basically, the claim is that AI is going to make everything in the world so much more efficient, that we will consume less energy (no, really).

      Meanwhile, Google operates around 45 data centers worldwide — 12 in the US — and they are in the process of building 23 more data centers. They don’t say how many servers they operate, but estimates vary between 900,000 and 5 million:

      https://www.reddit.com/r/Netlist_/comments/qsc416/how_many_servers_does_google_have_add_here_is_my/

      The problem with the “everything is becoming more efficient” argument is that the “everything” just keeps growing. Even if Google meets their goal of a carbon-neutral fleet of data centers — and the reported increase of GHGs doesn’t inspire confidence they will make that happen — what about the rest of the supply chain? This is important because every server that Google operates has a limited lifetime. They are constantly upgrading, expanding, building new data centers, with newer, fancier hardware (btw, their in-house cluster manager app is named “Borg”, which some engineers no doubt thought was a witty name).

      Taking the whole supply chain into account, how much energy is consumed to manufacture 2.5 million servers? How much waste dumped into the environment?

      It’s really hard to imagine such a yuge computing infrastructure as “carbon neutral”.

      1. Yves Smith Post author

        And focusing on “carbon neutral” is misleading. What about all the water for cooling and for producing the rare earths needed for tons of misleadingly labeled clean energy products? Potable water is the resource that is going to run into global shortages first, formerly projected to hit ~2040, but with all this super hot weather and energy profligacy, I’d bump the estimate up.

    5. Mikel

      BS. There are problems in the world that people already have the technology and skills to fix.
      Some conmen come along and say some alleged “AI” is needed for these problems to be addressed.

    6. Adam

      Sad that the apparent next technology leap forward is just computers doing things that humans can already do, but making up bs 20-30% of the time.

    7. lyman alpha blob

      I’d say shoving bullshit generating “AI” down the throats of humanity is more of an industry want than a need. But some nerdboxes get to be squillionaires on the grift, so let ‘er rip!

  3. raspberry jam

    Sadly the majority of the AI datacenter build out in the US is for corporate use, not public end users. And even sadder is that Deepseek (and all Chinese models) are being banned either by the corporations themselves or through insane fearmongering by the US government around things like this:

    Exclusive: DeepSeek aids China’s military and evaded export controls, US official says Reuters

    Before this and the associated and expected bill banning their use by governments came out the corporations were screeching about Chinese IP theft being the reason for refusing to use the Chinese models within their private networks. This is just US corporations though; the European national champions, surprisingly, are quietly using the Chinese open source models in addition to the US and French models in their private air-gapped on prem bare metal data center setups that serve the models over their internal network to their internal corporate users.

  4. Candide

    A young couple in our wooded neighborhood community is building a home based on their parents’ experience living “off the grid.” Multiple responsible and economically efficient choices have given them a bank of inverters and a moderate array of solar electric panels, which now benefit from continuously decreasing costs and photon-to-battery efficiency increases.

    This example helps us all, and gives a “no thank you” to the assumptions and rapacious capture by corporate power.

  5. Rip Van Winkle

    Will Zeldin try to bring back the incandescent light bulb? I can’t see Zuch, Bezo, Theil and Musk going for that – they need all of the energy they can get for their data centers.

    Whatever happened to all the mercury from the fluorescents since they were pushed 30+ years ago?

      1. steppenwolf fetchit

        I suspect the mercury from thrownaway fluorescent bulbs is in the landfills they got thrown away in.

        I think the mercury in the tuna comes from coal smoke, especially the coal smoke from China as it industrialized and as it maintains its industrial level. Nancy Pelosi should eat fancy white albacore tuna three times a day, as her reward for her part in NAFTA, MFN for China, etc.

  6. The Rev Kev

    Notice that nobody talks anymore about reducing how much energy we use as a civilization and making appliances much more efficient. It really started with the advent of cryptocurrency which had old coal power plants being re-opened again but AI is sending this insatiable need for power through the roof. And because of those in power in government, the cost of all this crap is being unloaded onto ordinary people by those corporations who will keep all the profits for themselves. You could have whole regions being blacked out by these data centers from time to time but you can bet that the authorities will direct whatever electricity they can to those data centers to keep them going. It is only a matter of time until they get that final governmental protection by being declared national security infrastructure.

  7. Carolinian

    Here in SC Duke Energy has announced a rate increase for next year. If it really is about AI then the public should rise up in revolt–not that they will. Meanwhile I still get a monthly mailing from Duke praising my low electricity use, which illustrates the schizophrenia of an industry which encourages the general public to control their behavior while embracing the new energy hogs.

    And if this entirely impractical scheme is really about goosing the stock market then the entire country should revolt–and maybe they will.

    It’s also a betrayal of the personal computer revolution which sought to empower the many with tools formerly only available to the few. It gradually came to do so with hardware that uses very little power indeed (so as to encourage the use of battery operated devices). This crowd sourcing of bio-intelligence has far greater potential than oligarchic pipe dreams designed to keep the many irrelevant. The class war never ends, abroad or at home.

    If we could change our politics we could change everything. There’s the rub. Trump said he was doing it but he is instead the latest wrinkle in the big bamboozle. Many in our ruling class seem to like this.

    1. Mikel

      “And if this entirely impractical scheme is really about goosing the stock market then the entire country should revolt–and maybe they will.”

      The suspicion is understandable. It always lingers in my mind that the hype around language models, etc. picked up speed (more like went into Star Trek Warp Factor 10 speed) around the time the rising interest rates were putting a hurting on the stock market.

    2. Jason Boxman

      Ha. I get wasted mail from Duke every month encouraging me to contact them for my home energy efficiency kit, but when I called, they told me to pound sand because I rent; as if my electricity use is ephemeral or something. Lunacy. I still get that damn thing in the mail, every month, for years and years. More waste.

    3. GF

      Our local electric utility, APS, has also announced rate increases that will take effect in 2026 if approved by the republican dominated Arizona Corporation Commission (a given). APS also supplies the electricity to most of the 90 data centers currently operating in the greater Phoenix metro area.

      Here is a quick-and-dirty look at the requested rate increases. Notice that data centers don’t have a separate rate class:

      DOCKET no. E-01345A-25-010

      Bill Impacts

      The following table shows APS’s proposed revenue increase percentages for customer classes… Requested Retail Revenue Increase:

      Residential (class as a whole): 16.44%

      General Service
      XS, S: 9.32%
      M: 9.42%
      L: 15.73%
      XL (E-34, E-35): 23.52%
      XHLF: 47.03%

      Schools: 14.34%
      Houses of Worship: 16.00%
      Irrigation/Municipal: 15.92%
      Outdoor Lighting: 15.95%

      Total Retail: 15.99%

  8. Joe Meck

    Artificial Intelligence’s growing demand for energy is reshaping global costs—expenses that often fall on everyday consumers. For South Africans relying on government support like the SASSA SRD grant, rising utility bills influenced by global tech trends could make a real impact. Understanding how tech industries affect public services and daily expenses is key, especially for those depending on social assistance to manage their household budgets.

  9. LawnDart

    In all seriousness, all of you need to stop using AI. Immediately.

    There’s good reason Meta/Facebook is plunging into AI… not good for any of us, but it’ll be pure gold for the shareholders bottom-line– and for those who wish to shape the thoughts, to control the behaviors, of users, agents of second-hand influence on non-users as well. Maybe we can think of AI as a superpower for Moloch…

    Facebook– the rather quaint evil of old:

    Facebook’s ethical failures are not accidental; they are part of the business model

    “the largest social media companies are antithetical to the concept of reasoned discourse … Lies are more engaging online than truth, and salaciousness beats out wonky, fact-based reasoning in a world optimized for frictionless virality. As long as algorithms’ goals are to keep us engaged, they will continue to feed us the poison that plays to our worst instincts and human weaknesses.”

    Today, knowing that AI frequently lies, we see Meta Moloch 2.0 unleashed:

    The Psychology of AI Persuasion
    How machines learned to influence human minds.

    In the shadowy intersection of psychology and technology, a new form of influence is emerging—one that operates with surgical precision, learning from our digital breadcrumbs to craft messages that bypass our rational defenses. Recent research reveals that artificial intelligence (AI) has crossed a critical threshold: AI chatbots are now more persuasive than humans in online debates 64 percent of the time when provided with minimal demographic information.

    AI chatbots: anything you say can and will be used against you; the resulting psychological profile is up for sale, and is sold often. This may be something for one to consider before engaging with an AI therapist or, god forbid, an AI-based significant-other… hell, just about pretty much everything AI, including queries.

    While we can see that there certainly are applications where AI can be used for good, for the benefit of humanity, it’s obvious to many of us that is not what Meta is up to: in addition to helping to destroy our environment via rapidly metastasizing, power-hungry data centers, Meta is hell-bent on invading our privacy and manipulating our thoughts, robbing us of our autonomy and free will, taking from us our very sanity… Meta future toxic X? Never mind, I digress…

    It appears that under the best governments money can buy– in addition to our tax dollars used for direct subsidies to Meta and its ilk– that via the flip of a lightswitch, the charging of a phone, or now any time that we use an electric utility for service, we are paying towards our own enslavement and misery. Perhaps Mr. Hudson can weigh in here on the economics of “1984” and how this affects us proles.

  10. tegnost

    While Google hasn’t publicly revised its goal of becoming carbon neutral by 2030, the tech firm has admitted that “as we further integrate AI into our products, reducing emissions may be challenging.”

    Jevons paradox, like Dunning-Kruger, is a business school justification for efficiency boosting growth as a natural outcome, and in my musings I ran across the Microsoft CEO’s claim that DeepSeek proved Jevons in that it uses less compute. What the CEO failed to note is that Western bloat is just hoovering up data, planning to figure out how to exploit it later, while the Chinese restricted bloat, and got better performance. It’s a cultural thing.
    My own view of Jevons may be heretical in a sense, but it is that efficiencies can’t lead to reductions but rather only to increased energy use, particularly but not only for the aforementioned CEO, because bloat equals groaf narrative equals money. There is no reverse gear.

  11. 4paul

    You can turn off the annoying AnswerWithAI in search results, but you have to menu dive for it.
    For example, in the Brave browser, you have to go to search.brave.com , click on the gear menu on the page in the upper right, and disable the radio button; and then you have to disable it again in a Private Window.

    The state of Florida is hurtling itself toward un-livability with a huge rate increase from the Florida Power & Light electric utility, which is in part to fund infrastructure for moar data centers. … Also, how do I get a government guaranteed return on investment of 11%???

    https://www.foodandwaterwatch.org/2025/02/28/florida-power-light-requests-largest-rate-increase-in-u-s-history/
    https://floridapolitics.com/archives/748720-what-the-fpl-rate-increase-means-to-floridians/

    Like tegnost said above, electricity for computers is exponential Jevons Paradox … after the colossal energy savings from virtualization (>75% reduction in electricity drawn by data centers) we have completely reversed those savings with crypto and then video chats and now AI; Mr Market somehow always reverses progress.

  12. Quintian and Lucius

    I don’t usually like to just outright boast in an nc comment but I have never once queried a slop generator for any reason. The extent of my experience with them is suffering their output being inflicted on my cognition because other users do not sufficiently respect the sheer awful they are. It’s a delightful feeling of lightness ‘pon the soul.

  13. Peter L.

    If I even think about using AI LLMs my teenager scolds me in the way she does if I leave the lights on in an unoccupied room.

  14. Jason Boxman

    We are all pretty screwed, if my area of tech is any indication; at an employer I know, the farm is being bet on this in every single way possible. Daily use of LLMs for all possible tasks is strongly encouraged, and that call is being answered with daily usage for all manner of tasks. It’s tied into company, department, and individual goals. And this is like no initiative I’ve ever seen: total immersion, complete focus, all in lockstep.

    It’s as if we’re at Web 1.0 and no one wants to miss the train. I can’t deeply enough describe the urgency with which I hear this is being pushed. I can’t formulate strongly enough the focus here. It’s like a hard pivot for a startup that’s found its proposed line of business isn’t going to work. This is seen as existential to future success, the foundation for all that comes tomorrow.

    Self immolation is at hand; this is seen as like the discovery of fire itself.

    1. raspberry jam

      It’s Act II of The Disciplining of a Labor Aristocracy. The mass layoffs in tech starting mid-2023 were Act I.

      Everyone knows Act III is resolution but in this case I don’t think most of these use cases have legs so with the combination of corporate shrinkage and stock buy backs it’s just greater hollowing out of all the mass white collar computer job employment that was built up from the 90s on.

      1. Jason Boxman

        Ha. To quote John Connor in Terminator Genisys, “These people are inviting their own extinction in through the front door and don’t even realize it” in response to everyone being on their digital devices non-stop, hilariously, in a hospital scene. And oblivious is about right when it comes to EHR and computer systems in hospitals today, where doctors talk to devices, rather than you, during exams.

        Definitely I noticed the beginning of the tech recession back in 2023, early 2023. And it’s been ongoing.

    2. david

      I work in the oil industry and all of it seems to be saying AI is the solution to everything. It was Digital Twins but they haven’t even finished implementing that.

      Thing is I’ve not seen anyone actually spell out how AI can help. Just that it will be integral to moving forward in the future.

      Personally I want to withdraw from it all. I’m not interested in these games anymore.

  15. restive

    meanwhile, in China –

    https://huabinoliver.substack.com/p/three-body-computer-constellation

    They will be “creating an AI cloud computing network in space”.

    With this move they avoid all the problems we are encountering trying to build AI capacity in earth-bound facilities: they get all the solar power they need, with no need for water for cooling.

    also this –

    https://gizmodo.com/the-biggest-signs-that-ai-wrote-a-paper-according-to-a-professor-2000634580

    which is a short primer on how to spot AI-generated content.

    Reply
  16. Boshko

    As my neuroscientist brother-in-law reminds me, true intelligence (i.e. the human brain) runs its neural network on roughly the energy it takes to light a lightbulb: a few watts.

    We’ll all be burnt and swimming by the time AI achieves that, if ever.

    Reply
  17. scott s.

    I dug down into the links to try to find the actual data. There’s a lot of confounding in discussions that mix model training costs with deployment (inference) costs. The actual source of the inference-cost data (in terms of CO2) appears to be: Alexandra Sasha Luccioni, Yacine Jernite, and Emma Strubell. 2024. “Power Hungry Processing: Watts Driving the Cost of AI Deployment?”, published in the ACM Conference on Fairness, Accountability, and Transparency (FAccT). The paper is poorly cited in the coverage: the authors are from Hugging Face and Carnegie Mellon, not Cornell.

    Reply
    1. ChrisPacific

      Yes, the quoted numbers don’t pass the smell test for me (and I had a similar difficulty finding raw numbers). Take this claim:

      ChatGPT-4, for example, operates with 1.8 trillion parameters – 20 times more than earlier versions…

      In a scenario where an organisation relies on ChatGPT-4 to answer one million emails per month, Greenly calculates the yearly emissions at 7,138 tCO₂e – equivalent to 4,300 round-trip flights from Paris to New York

      Let’s assume energy costs are roughly proportional to price, and assume a retail price of around $500 for the round-trip ticket (which should be accurate to order of magnitude). That puts the cost of the plane tickets at about $2 million. One million emails per month is 12 million per year, so to come to $2 million, the cost of a single ChatGPT query (or at least the amount OpenAI would need to charge to match airline industry margins) would be in the neighborhood of $0.16.

      ChatGPT currently serves around 2.5 billion queries per day, according to the company. Priced at $0.16 per query, that adds up to a daily expense of around $400 million, or over $100 billion per year. I highly doubt ChatGPT has anything like that level of expenses – both revenue and expenses were less than a tenth of that, from what I could find.
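
      The plausibility check above can be sketched in a few lines. All inputs are the commenter’s stated assumptions (the $500 fare, Greenly’s 4,300 flights, OpenAI’s claimed 2.5 billion queries per day), not measured data:

      ```python
      # Back-of-envelope check: what would Greenly's figures imply per query,
      # and what would that cost at ChatGPT's stated traffic?
      ticket_price = 500                    # assumed round-trip Paris-NY fare, USD
      flights = 4_300                       # equivalent flights quoted by Greenly
      emails_per_year = 1_000_000 * 12      # one million emails per month

      total_cost = ticket_price * flights            # $2.15 million
      cost_per_query = total_cost / emails_per_year  # roughly $0.18 per email

      # Scale to OpenAI's stated traffic of ~2.5 billion queries per day:
      daily_cost = 2_500_000_000 * cost_per_query    # ~$450 million per day
      yearly_cost = daily_cost * 365                 # well over $100 billion per year
      print(round(cost_per_query, 2), round(yearly_cost / 1e9))
      ```

      The exact per-query figure depends on how you round the ticket total, but either way the implied annual cost lands far above anything OpenAI actually spends.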

      Here’s a contrarian source, which does supply numbers and working if you want to pick them apart:

      https://engineeringprompts.substack.com/p/does-chatgpt-use-10x-more-energy

      The estimate in this one is 0.2 watt-hours of energy per query, which would charge your smartphone by only about 1%, not the 16% quoted in the article. The author highlights the bad data underlying a lot of hot takes:

      Seriously… I went down the rabbit hole with these links, and yes, they all lead to the same original source of the data – an outdated Google blogpost, a guesstimate by the Google chairman, and a SemiAnalysis substack. What a world we’re living in!

      I saw the same thing with the Cambridge Analytica panic – what looked like a lot of sources, all of which turned out to be citing one another, and all of them tracing back to the same single paper with a list of unproven claims that were far more anodyne than the headlines.

      This is the difference between AI being the defining climate issue of our time and its being a rounding error next to more pressing concerns like transport or food production. Kind of important to get it right, I’d say. Granted, the energy buildout is real, but whether it is supported by actual usage or is simply the boom part of a boom/bust cycle (like the overbuilding of fibre during the dot-com era) needs scrutiny.
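
      For scale, the substack’s 0.2 Wh per-query figure can be sanity-checked against a phone battery. The ~13 Wh battery capacity here is an assumed typical value, not a measurement:

      ```python
      # How much of a phone battery would one 0.2 Wh query consume?
      query_wh = 0.2        # substack estimate of energy per ChatGPT query
      battery_wh = 13.0     # assumed typical smartphone battery capacity

      fraction = query_wh / battery_wh      # about 1.5% of a full charge
      # The widely repeated "16% of a charge" claim would instead imply:
      implied_query_wh = 0.16 * battery_wh  # about 2 Wh per query
      print(round(100 * fraction, 1), round(implied_query_wh, 2))
      ```

      In other words, the viral claim assumes roughly ten times the energy per query that the contrarian estimate arrives at, which is exactly the kind of gap that warrants checking the underlying sources.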

      Reply
  18. The Infamous Oregon Lawhobbit

    The local circuit court is now having conniptions over attorney use of AI. It’s an ongoing issue, made particularly fun by AI’s tendency to hallucinate cases with very authentic seeming cites.

    To the best of my knowledge I don’t use AI for anything, thus fulfilling Yves’ request. Heck, I have enough trouble with NATURAL intelligence, I can only imagine the trouble I’d get into with the artificial kind….

    Reply
