Yours truly is overdue on an overview of how private-debt-backed AI datacenter deals work, and why it is not a good sign that companies like Meta, which are perfectly capable of borrowing in their own name, are paying 200-300 basis points extra not to do so. But the Wall Street Journal just published a very detailed story on the general outlines of this debt binge, which has gotten more attention in the popular press as borrowing levels have skyrocketed this year:
Big Tech borrowing for AI data centers:
2015-2024 average: $32B/year
Sept-Oct 2025 alone: $75B
> Meta borrowed $30B
> Oracle borrowed $18B
> $META also did $27B off-balance sheet with Blue Owl
AI companies now 14% of IG index
The “money-printing” tech companies are…… pic.twitter.com/IuhBU0LS4M
— junkbondinvestor (@junkbondinvest) November 5, 2025
And from the new Wall Street Journal story:

What the Journal calls a “frenzy” in Wall Street Blows Past Bubble Worries to Supercharge AI Spending Frenzy is awfully reminiscent of the toxic phase of the subprime lending binge, when originators were so desperate for product that, as the jibe then went, they’d fund any borrower who could fog a mirror. It was years later that we were able to piece together how leverage-on-leverage created the even-then well-reported “wall of liquidity”, with the key elements coming from some insiders plus a remarkable pre-crisis analysis by (of all things) an equity market analyst, Henry Maxey of Ruffer Investment. It was also the leverage on leverage (and the fact that systemically important yet fragile financial institutions were heavily exposed) that made the bubble unwind so catastrophic. We have warned that the unwind of an, erm, traditional credit bubble, even a very large one (see Japan, and for a lesser but still nasty version, the S&L crisis), typically produces at worst very deep and protracted recessions and zombification, as opposed to banking-system near-or-actual failures (the runup to the 1929 US crash also featured CDO-like leverage on leverage; see Frank Partnoy’s The Match King for details).
So again, we have yet to see evidence of that meteor-hitting-the-financial-system event being in the offing.
But absence of evidence does not amount to evidence of absence.
And perhaps finance historians can correct me, but I do not recall a historical instance of a massive equity bubble (without 1929-style heavy borrowings directly against those equities) accompanied by so many red flags, particularly operating and financial leverage, in the underlying commercial activity. That includes recursive deals among key companies and, as we’ll discuss a bit in this post and more in posts to come, overly-clever borrowing structures that make sense only as a way to achieve higher levels of leverage than could be reached by traditional means. The Journal points out in passing that one of the mega-deals will pay more than 2% more than a plainer-vanilla offering would. In a fine overview in June of how these financings work, Paul Kedrosky similarly said the premium was 200 to 300 basis points. That’s an awful lot to pay for opacity and supposed balance sheet remoteness.1
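To put that premium in perspective, here is a back-of-the-envelope sketch. The $27 billion figure is the Meta/Blue Owl deal size cited in the tweet above; the 200-300 basis point range is Kedrosky's estimate. This is illustrative arithmetic only, not the actual pricing of any deal:

```python
# Back-of-the-envelope: annual cost of a 200-300 bp spread premium
# on a $27B private structure versus a plain-vanilla bond.

def extra_interest(principal: float, premium_bps: float) -> float:
    """Extra annual interest paid for the spread premium (1 bp = 0.01%)."""
    return principal * premium_bps / 10_000

deal = 27e9  # Meta/Blue Owl off-balance-sheet deal size, per the tweet
low = extra_interest(deal, 200)
high = extra_interest(deal, 300)
print(f"${low / 1e9:.2f}B to ${high / 1e9:.2f}B per year")
# prints "$0.54B to $0.81B per year"
```

In other words, on that one deal alone, the price of opacity and supposed balance sheet remoteness runs to over half a billion dollars a year.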
As a brief introduction to causes for concern about the datacenter boom, see the executive summary from Bubble or Nothing from the Center for Public Enterprise (hat tip Matt Stoller):
● Cash flow uncertainty persists as the cost of providing AI inference services continues to rise. Leading AI inference service providers are not particularly differentiated from one another; this competitive market structure suppresses market participants’ pricing power and prevents them from recovering rising costs.
● The collateral value of a graphical processing unit (GPU), the sector’s keystone asset, looks poised to fall in the near-term. The value of chips fluctuates depending on uncertain user demand as well as the supply dynamics and technical specifications of new GPUs, now released yearly. The cash flow that GPU collateral can demand is suppressed due to the sector’s competitive market structure and the uncertain depreciation schedule of existing GPUs.
● Data center tenants will undertake multiple cycles of intense and increasingly expensive capital expenditure within a single lease term, posing considerable tenant churn risks to data center developers. This asset-liability mismatch between data center developers and their tenants will strain developers’ creditworthiness without guarantees from market-leading tech companies.
● Circular financing, or “roundabouting,” among so-called hyperscaler tenants—the leading tech companies and AI service providers—creates an interlocking liability structure across the sector. These tenants comprise an incredibly large share of the market and are financing each others’ expansion, creating concentration risks for lenders and shareholders.
● Debt is playing an increasingly large role in the financing of data centers. While debt is a quotidian aspect of project finance, and while it seems like hyperscaler tech companies can self-finance their growth through equity and cash, the lack of transparency in some recent debt-financed transactions and the interlocked liability structure of the sector are cause for concern.
The first two points alone, the fact that inference costs are not only not falling but still rising, and the near-term downside in GPU prices, ought to be fatal to debt funding, or at least detrimental to it.
Yet as you’ll see, even though the Journal raises concerns, and even adds to this list by describing how AI “hyperscalers” are putting out duplicate orders for the datacenter capacity, virtually assuring whipsaw, it’s not as sobering as the list above. But it includes signs that some dogs are turning their noses up at the dogfood:
Stock prices normally go up when a company reports record revenue but after Meta did just that on Oct. 29, its shares plummeted 11% instead. The reason: Zuckerberg disclosed he will “aggressively” increase capital spending on AI, drawing questions from analysts about how the company plans to actually make money off the new technology.
What the Journal describes parallels the late-stage subprime lending frenzy. In 2007, CEO Chuck Prince of later-big-bailout recipient Citigroup famously remarked:
When the music stops, in terms of liquidity, things will be complicated. But as long as the music is playing, you’ve got to get up and dance. We’re still dancing…. The depth of the pools of liquidity is so much larger than it used to be that a disruptive event now needs to be much more disruptive than it used to be.
Consider from the Journal the, erm, enthusiasm of lenders and their fear of becoming wallflowers at this party:
Silicon Valley’s biggest players are flush with cash and were able to fund much of the initial AI build-out from their own coffers. As the dollar figures climb ever higher, they are turning to debt and private equity—spreading the risks and potential rewards more broadly across the economy.
Some of the financing is coming from plain-vanilla corporate bond sales, but financiers are making far bigger fees off giant private deals. Virtually every Wall Street player is angling to get a piece of the action, from banks such as JPMorgan Chase and Morgan Stanley to traditional asset managers such as BlackRock.
Investor appetite for data-center debt is so strong that some money managers have booked billion-dollar gains in a matter of days, even before construction of the facilities they are financing is complete.
Still, the longer-term performance is hardly assured. Big tech companies are expected to spend nearly $3 trillion on AI through 2028 but only generate enough cash to cover half that tab, according to analysts at Morgan Stanley.
Big names in the financial world, such as Goldman Sachs CEO David Solomon, are warning about AI-fueled froth in the markets and in capital spending.
At the same time, the fear of missing out is real. Days after Solomon voiced his concerns to analysts, Goldman formed a new team in its banking and markets group focused on AI infrastructure financing.
Later from the Journal:
Funds that invest in AI deals say they carry little risk, because tech companies with deep pockets have ironclad leases that will generate the money to pay investors back. Microsoft has a higher credit rating than the U.S. government, and it told investors on Oct. 29 that it would double its total data-center footprint in the next two years.
Perhaps I am too old, but I recall that IBM and GE were once AAA rated too, and that by 2000 (for IBM) and 2010 (for GE) their luster had taken quite a turn. And remember the pre-crisis pattern that housing prices had never fallen nationwide, only regionally? These supposedly blue-chip tech companies are placing monster bets on AI, so their historical solidity isn’t as germane as it might seem, unless they back off should the fundamental performance of large language models continue to fall well short of promises. The reaction of Meta stockholders to Zuckerberg’s promise of even more big AI spending is confirmation.
Back to the Journal:
Tech executives see more risk in underbuilding than overbuilding…
But some tech companies are weaker financially than others. Oracle…needs to borrow billions more for its spending spree, prompting Moody’s Ratings and S&P Global Ratings to edge closer to reclassifying Oracle’s bonds as junk debt. In recent weeks, the company’s stock price has fallen 32% and its bonds have lost about 7%.
There’s also the risk that the chips tech firms are borrowing to buy could be obsolete in a few years…
The last time Wall Street went all-in on an industry was the fracking boom—then bust—over a decade ago. This time, financiers are marshaling even larger sums.
The article continues with a breathless account of the boomtown effect that these datacenter buildouts are generating. Matt Stoller flagged the concern of growing real-economy dependence on massive spend in what ought to be a niche:
A few months ago, I asked why our economy, despite steady growth in official numbers, feels so creepy and unstable. My conclusion was that the U.S. is in a “Chinese finger trap” economy. We are dependent for growth on monopolies and an AI bubble, which juices the all-important stock market. Trying to grow an economy in a more stable way could lower stock prices which would paradoxically lead to a downturn. So we’re stuck, until some outside event occurs….
Data center developers are now approaching multiple utilities with proposals for the same project, leading to “phantom” forecasts of demand that isn’t actually there. Essentially, it’s hoarding.
Hoarding is what happens in overheated markets….But this can lead to something called the “bullwhip effect.” Buyers overstate how much they want to buy….all of a sudden the demand evaporates because it was never real in the first place. This dynamic can throw an economy into an overheated state, and then a depression; that’s what happened globally after World War I, leading to, among other things, Mussolini’s takeover of Italy….
At this point, data centers are pretty much what’s growing in America. I recently had a conversation with an elected official who told me that data center construction is a huge construction jobs boost in rust belt areas….He posited a tension between political support for the new temporary jobs and political anger over higher electricity prices.
We have a one-legged economy, with the AI build-out serving as a driver of real estate values, the stock market, and GDP growth.
This is a bet-the-economy scheme on models where China has much better mousetraps. There is no way this will end well.
____
1 The crisis demonstrated that theory and practice can be two different things. Banks had long offloaded credit card receivables to investors in supposedly off-balance-sheet deals. When losses on them rose to previously unthinkable levels, the investors successfully revolted and made the banks eat some of those costs. The reason was that the banks’ credit card businesses depended on being able to keep using other people’s credit. I am not sure these AI datacenter borrowers won’t wind up in an analogous position, of being so dependent on ongoing lending that lenders won’t let them walk away from outsized credit losses.




“companies like Meta who are perfectly capable of borrowing in their own name paying 200-300% basis points extra”
Yves, I noticed something that should be fixed: “% basis points”. Incompatible units, right?
Aargh, fixing, thanks.
“…have ironclad leases that will generate the money to pay investors back.”
Bwaahhh!!! That is until those leases hit the smelters of the bankruptcy courts!
Ed Dowd on the US economy, the AI bubble, the stock market, and the housing market.
YouTube, ~1 hr.
Ed Dowd: We’re Already in a Recession, Oil Going to $30 & The Deflation Scare Coming
https://www.youtube.com/watch?v=GHM-BH-SguE
Excellent discussion and economic assessments by Dowd. These are Capitalists to the core.
Meta and much of big tech still have corporate stock buyback plans open.
Incentives drive outcomes: companies have more incentive to lever up and chase the AI headlines than to be seen as a 1989-era IBM (a company missing a secular pivot).
“…China has much better mousetraps.”
Listened recently to an interview with Andrew Ross Sorkin on this topic, taking much the same line as the above article, during which he claimed, as an example of China’s technical and manufacturing prowess, that BYD electric vehicles are the gold standard, better and cheaper than U.S. cars, and that if the U.S. didn’t have tariffs on Chinese cars we would probably not have an auto industry in the U.S. I’m giving serious consideration to encouraging my USian grandchildren to learn Mandarin.
https://www.newyorker.com/video/watch/the-radio-hour-andrew-ross-sorkin-in-conversation-with-andrew-ross-sorkin (Minute 12:33)
When you’ve lost Andrew Ross Sorkin…
I saw a video on the YouTube’s last night so it must be true…
BYD just test drove (introduced?) an EV that will reach speeds over 200 miles an hour. I think they were at 300+ mph.
Believe it or not, an electric motor can generate enormous rotational torque. Combustion engines are no match. Getting to 200-300 mph is more a matter of stability and aerodynamics.
Too many EV’s on a US freeway will create the same congestion as an ICE vehicle.
I’ve driven and owned a number of ICE muscle cars, mostly during my misspent youth, that I found impressively fast off the line. Now I drive a little Toyota hybrid, which frankly feels a bit like riding around in a sardine can, that upon acceleration can give you whiplash. I exaggerate but slightly.
Believe it or not, torque can not be anything but rotational. :) “Linear torque” is force.
Also, torque can be multiplied via gears, while power can’t.
Steam engines and electric motors tend to produce maximum torque close to zero rpm, which is the “believe it or not” part for ICE-only drivers (with diesel engines being better at low-rpm torque than petrol ones). A limited range of rotational speeds with useful torque is why a transmission is a necessity (and also why big trucks have more gears).
Getting to 200-300 mph is a matter of many things, but mostly not crashing. :)
Son in law just received his Tesla cyber truck. He loves it, so far. We talked about the almost instant high torque etc. Tech features are great! Run everything from the smart phone. My nephew has had the EV Rivian pick-up for a couple of years he still loves it.
I saw it Saturday!
I hope my daughter stops calling their cyber truck the “dumpster”.
At 75 I’ll stay with the ICE-SUV.
My one-owner 1977 Chevrolet Cabriolet, with second 305 V8, 270,000 miles, low rev – high torque, runs like new.
A random professional American’s Mandarin will never be as good as their random Chinese equivalent’s English. Particularly as English is the de facto international working language in Asia, which is much more linguistically diverse than Europe with its Proto-Indo-European-descendant languages.
Don’t push Mandarin just because of China in a vacuum (though obviously the thought counts).
China isn’t just Mandarin, it’s a civilizational tradition that evolved independently of the west’s Greco-Roman-Enlightenment lineage—-an obvious statement, but something that’s lost on lots of credentialed, supposedly educated, people. And today’s “Communist China” is just one page in its lengthy history.
“And today’s “Communist China” is just one page in its lengthy history.”
Truly. The pages that have for the most part drawn my interest over the years are those having to do with the development of Buddhism in that country. One thing in particular that strikes me is that the Chinese took the more asocial natural mysticism of India and Southeast Asia and made it more fit for purpose in a materially practical, more socially engaged way of living.
Yes, I communicate with Chinese entrepreneurs, and their English is as good as or better than that of lifelong Americans. I’m told their linguistic skill is not unusual.
I just started listening to an audiobook “Empire of AI”.
One line of thinking in the book is that “size/scale/compute power” is not the only infrastructure model one can choose from to chase the elusive AGI unicorn (another topic the book explores).
This is the predominant line of thinking running thru SV though.
OpenAI seized the narrative and strategically decided to achieve AGI thru sheer scale and size of its compute resources.
3 hours into the book so far, but it’s very interesting.
Thanks for the book tip. I’m currently reading Adam Becker’s More Everything Forever describing beliefs and intentions prevalent among Silicon Valley elites. “Inmates running the asylum” is a phrase that comes to mind. And not in a good way as in King of Hearts.
$133B in questionable debt? Just file it under “other.” Here we have JP Morgan blazing the opacity trail.
Thanks for this background, Yves. To me, this finance perspective confirms some things I think about AI and data centers. An example from a portion of the Journal article you quote:
Why risk so much? Hasn’t stock price been sacred in the post-Greenspan era? Doesn’t this seem to be about more than even money?
I think I’ve found part of the answer in two interviews Ross Douthat, the Roman Catholic New York Times columnist, has done in the past year. The first, Douthat’s interview of Peter Thiel is second perhaps only to Tucker’s sit-down with Nick Fuentes among widely viewed videos. The second one, done in the past few days, is of Paul Kingsnorth, who’s on a book tour promoting his new book, Against the Machine: The Unmaking of Humanity. I was introduced to Kingsnorth through amfortas’s links to the Dark Mountain Project, which was begun by Kingsnorth and Dougald Hine.
I recommend listening to the Thiel interview before the Kingsnorth one because Douthat and Kingsnorth discuss that interview at some length. It’s also interesting contrast the two interviewees’ personalities and characters. (Kingsnorth calls the projects of Thiel and Musk “little boys’ toys,” an example of his barely contained contempt for them.)
I have watched about 15min of the first interview and paused at about 15 min of the second and, whatever one thinks about Kingsnorth he can be labelled as freethinker while Thiel not. The later is heavily influenced by his attachments (to his companies, money, whatever he is attached to) no matter how much he would like to appear as a freethinker. I couldn’t stomach Thiel for longer while I will probably end Kingsnorth interview even if i do not agree completely with him in some instances but he is much more interesting than Thiel. One shakes the brain and the other shakes the stomach.
Let me recommend the latest interview of Emmanuel Todd by Glenn Diesen, which I found interesting. These are people who shake one’s beliefs, and that is good.
Agree with Ignacio about Diesen’s interview of Todd. For once, Diesen pretty much shut up and let Todd talk. Todd’s latest book is about the loss of the West. He gets heavily into the loss of the philosophical “center” of the West, sort of there being no there there. He says that Western society has become a “Zombie” society like the Nazis. Well worth the listen!
Like this: from 2024, In view of this, ladies and gentlemen, what we can conclude is that the adoration of transvestites is an inevitable consequence of liberalism, and that the Inquisition burned too few people.
I will not venture into the twisted nest of vipers that is Thiel’s brain, but Kingsnorth will be worth a listen, thanks.
I’m meeting with my Minnesota AI vs. water people tomorrow. It’ll be interesting to hear their perspective. Data centers might try to propose one project to competing power companies, but there’s only one water source (and regulator). If the volume of permit amendment applications from water utilities/cities has fallen (data centers in MN do not supply their own water), the number of data center projects in MN will have fallen proportionally.
I’m trying to think about what’s similar and different with dotcom.
1) Similar to AI, dotcom had a huge infra build-out of core network fiber, switching, and data centers. Not all of that was scrapped, as some of it has a relatively long useful life, especially fiber and switching, unlike AI GPUs.
2) The demand for dotcom infra was real, growing, and continues to grow. The dotcom bubble was on the supply side. The Internet as such was not vaporware. In AI, it’s not clear how much money there really is in psychopathic search engines and the generation of videos of dogs with 5 legs. A lot of the ultimate “demand” may be hype.
3) As Yves so well presents above, the financials of the AI bubble may involve a lot more leverage than dotcom.
So what are we facing? Something worse than dotcom but not worse than GFC?
The public Internet in the 90’s was not vaporware. It was the long-developing son of the military/university ARPANET (network) from the 70’s. The early Internet was communication over phone lines. As public access grew, cable/fiber optics were seen as necessary to get increased public/business participation. Unfortunately, laying cable was faster than public uptake. Until, of course, Netscape and other graphical user interfaces that didn’t require knowledge of computer programming. Then, with exploding popularity (ease of use), the Internet demanded cable and fiber connectivity.
Something tells me AGI is not going to follow this breakout scenario.
Yea GUI, and they said no one would pay those hundreds of dollars for that heavy CRT.
On #2, you are GREATLY confusing dotcom infrastructure with companies selling “eyeballs”. Most of the bubble was in the latter.
Leaving aside for the moment the overall ridiculous nature of the AI technology boom, and the fact that it is going to come crashing down of its own accord quite soon since LLMs are not fit for their advertised purpose, the above quote seems to highlight another risk-within-a-risk.
GPUs are a very high technology product that most people don’t know much about beyond a few acronyms. Using them as debt collateral seems very fraught in its own right since no one knows enough about them to decide what they are worth and in particular how quickly they’re going to depreciate.
Nvidia recently announced that they are going to a one-year design cycle, meaning all the chips that were purchased in 2025 will be superseded by newer, more powerful chips in 2026, and so on each year. I imagine this will create a big hit to the older models, and I imagine after this happens a few times the older chips will be essentially worthless. I don’t know what the term of the debt is that is being backed by these chips, but if it’s more than a couple of years this is obviously a shipwreck waiting to happen.
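The worry in the comment above can be made concrete with a toy schedule comparing collateral value against a loan balance. The 30% annual decline rate, the $40,000 unit price, and the 5-year straight-line loan below are purely illustrative assumptions, not market figures:

```python
# Toy sketch: GPU collateral value under an assumed annual depreciation
# rate, versus a straight-line loan balance. All numbers are
# illustrative assumptions, not market data.

def collateral_vs_loan(price: float, loan_years: int, decay: float):
    """Yield (year, collateral value, remaining loan balance) per year."""
    for year in range(loan_years + 1):
        value = price * (1 - decay) ** year          # declining-balance collateral
        balance = price * (1 - year / loan_years)    # straight-line amortization
        yield year, value, balance

for year, value, balance in collateral_vs_loan(price=40_000, loan_years=5, decay=0.30):
    flag = "UNDER-COLLATERALIZED" if value < balance else ""
    print(f"year {year}: collateral ${value:,.0f}, loan ${balance:,.0f} {flag}")
```

Under these assumptions, the collateral falls below the loan balance from year one onward, which is the commenter's point: if chips depreciate on a one-year product cycle while the debt amortizes over several years, the lender is exposed for most of the term.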
There are other changes in the technological landscape that can also render existing GPU hardware worthless in one way or another, including changes in software architecture and new manufacturers coming on the scene with cheaper and more powerful products. These kinds of things are both hard to predict and reasonably likely to happen.
Some physical objects, like real estate, vehicles, precious gems, and so on seem to make fairly reasonable collateral (even so, look at the record!), but new and evolving integrated circuits of very high density and complexity don’t fit into that category.
1. I have yet to hear of significant proposals stateside to expand electric generation capacity – plus transmission lines – to support all this projected data center growth. And remember, the US is a fragmented electricity space (10 separate market regions in just the lower 48), with differing regulatory rules (Texas does not equal New York does not equal California), so these need to be done at the local level.
The Russians have publicly talked about using the new modular-type nuclear reactors (including the ones they’d developed for the Burevestnik and the Poseidon weapons), though for now data centers do not seem at the top of their energy or nuke priorities (they’re replacing their aging nukes plus upgrading the infrastructure in historically underserved regions like Yakutia). The Chinese are drastically increasing their nuke capacity, paying the Russians a healthy chunk to do so. The Americans……….eh? Or maybe I’ve just missed it.
But all these plans for data center buildouts without the generation and transmission to go with are bound to run into quite the obstacle once they start drawing enough power to drop a given market region below its “reserve margin” level (typically 12% over peak demand is how much you want in generation), or, if god forbid these idiots build them in Texas, which is the only fully deregulated market, cause “ice storm” level price spikes for customers.
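The reserve-margin arithmetic the comment describes can be sketched as follows. The 12% margin is from the comment; the capacity and load figures are hypothetical:

```python
# Sketch: does adding a data center load drop a market region below its
# reserve margin? 12% margin per the comment; all loads are hypothetical.

def meets_reserve_margin(capacity_mw: float, peak_mw: float,
                         new_load_mw: float, margin: float = 0.12) -> bool:
    """True if capacity still covers peak demand plus the reserve margin."""
    return capacity_mw >= (peak_mw + new_load_mw) * (1 + margin)

capacity, peak = 80_000, 70_000  # hypothetical region, in MW

print(meets_reserve_margin(capacity, peak, 0))      # True: 80,000 >= 78,400
print(meets_reserve_margin(capacity, peak, 2_000))  # False: 80,000 < 80,640
```

In this hypothetical, a region sitting comfortably above its margin flips below it with a single 2 GW datacenter addition, which is why unplanned loads of this size are such a problem for regional grids.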
2. I am not sure what you would use as collateral if you’re doing any secured or ABS type financing for these data centers. The buildings themselves?
Because here’s the thing. If the servers are meant to do AI-LLM type stuff, that means every year, when the new chips/GPUs come out, you have to upgrade them all, at huge cost. But meanwhile, standard data center tasks do not need this extra capacity or this extra bang for the buck. So 1-2-3 years into the deal you’re stuck with a bunch of formerly high-end chips that no one needs, unless you pile even more money into the place?
Or think about it this way. Whenever I build my own PC, instead of buying the latest, most high-end CPU and video card, I buy the ones from ~2 years ago. Because at that point the price comes down 50% or more, while the decline in capability – for home user applications – is fairly negligible. The point is, the depreciation on these assets is insane relative to, say, a commercial building, or a power plant, or a refinery or whatever. Or even a commercial aircraft (assumed 25-year useful life, usually). To do an ABS deal secured by chips, if that is, in fact, what they are doing…that is pure insanity.
3. Does anyone remember IBM’s Watson?
Their whole thing in the 2000s and early-mid 2010s was – we have this “artificial intelligence” thing called Watson, and we’ll sell it to hospitals and such, where it will make diagnoses and help (read: replace) the medical staff in this and that capacity. And…the whole thing failed, because they could not fix what we now call AI hallucinations. Or, rather, IBM tinkered around with it for a while, with some sample projects in this or that hospital that never went anywhere much, and eventually sold it to some PE guys for ~$1 billion, which is rather less than they’d invested into it in the first place.
Of course, the total cost for Watson, at least its medical unit, including all the R&D and whatnot, had to have been at most in the several-billion range, which is something IBM can easily afford. I look at the charts for projected LLM capex spend, and there is just no way to get there from here, unless there is a technological revolution in LLMs no one knows about.
Re #2, the security is the data center lease agreements.
I seldom comment on tech since I stopped working in that field 25 years ago, and I deliberately do not use hardware or software services provided by AI, Facebook, Microsoft, Apple, etc. Nevertheless, I am typing this on an ancient computer running Linux, which I deliberately configured to run minimal unbloated software, and I realize I need to build a new computer from parts because using my ancient parts is getting too risky. I remember 30 years ago when I eagerly anticipated the weekend edition of the newspaper with the weekend Fry’s Electronics ad. Holy moley! There’s a CPU and ATX motherboard combo for $120. I’m getting it. Without hesitation, I would drive there (or get my dad to drive) and immediately splurge. Then, I got right to work assembling my new computer. Wow, how times have changed for me.
I gave you this background to discuss how I have since fallen on hard times, how frugal I am, and how RAM prices have recently soared. Apparently, this happened because of the AI bubble encouraging RAM manufacturers to divert their resources away from desktop computer memory and components on RAM modules to continue feeding the AI bubble. Because I am not too knowledgeable about this, let me link to the Hardware Unboxed videos “DDR5 Pricing Skyrockets, What’s Going On?” and “VRAM Cost Is About To Make GPUs Way More Expensive“. I have been dreaming about buying new parts for a new computer for 6-12 months now. One year ago a 2×16 GB package of DDR5 RAM was approximately $100 USD. Today, that could be well over $200. That difference is enough to discourage me from buying. Three companies dominate the market for memory: Samsung, Micron, and Hynix. The AI bubble seems to have affected all three companies and respective prices for RAM.
Of course, I could try to skimp in other areas. I didn’t pay attention to computers for a decade or more. Thus, I got up to speed recently in anticipation of any supposed Black Friday promotions. Apparently, Intel makes garbage chips now. That leaves one other option, AMD. AMD’s latest and greatest platform is named AM5 (with AM6 forthcoming), but I could deliberately buy an AM4 CPU, AM4 motherboard, and DDR4 RAM. DDR4 RAM did not get excluded from whatever frothiness is going on with AI. It’s all kind of spendy. Argh!
Risk to the big 7 paying off these debts:
Trump fails to prevent China’s DeepSeek from dominating Sam Altman’s too-much-compute, too-much-training-overhead approach. Big 7 intro/monopoly power already gone.
Data center build outs delayed by grid connection, if the grid can get debt to build out. Project and supply chain delays.
Some number of US/EU sources doing programming like DeepSeek.
If the big 7 make any profit, it will be short-lived. HW/SW life cycles are very short.
Any underwriter not following scientific papers is not doing due diligence.
But the Fed balance sheet has over $2.5 trillion in MBS, etc., most with >10 years left. Might as well tranche the big 7 debts and move on.
Buckle up!
Reuters just reported CDS contracts are being written on big AI risk.
Weinstein, Saba….
Is this the report you mean? https://www.reuters.com/business/finance/weinsteins-saba-sells-credit-derivatives-big-tech-as-ai-risks-grow-source-says-2025-11-17/
Yes, thank you for linking it here….
“current prices indicate those risks are still low compared to other sectors.” is a stretch!
Jensen Huang says NVDA will sell $500 to $600 billion in GPU chips next couple of years…. where will the money come from?
I don’t really believe in the ability of companies to supply data centers with enough electricity in the long run. Thus, in my mind, the fate of data centers many years down the line is likely some sort of looting for parts – if not electrical components, then the base metals used in the electrical components.
Additionally, the oversupply of datacenters raises another question: what about the data? How are they getting enough data to compute in these centers? You need either more experimental data being gathered (phones, surveillance, maybe drones) or something from the ab initio guys.
Jeff Bezos has entered the chat…
I think this is becoming as silly as the original Dot Com bubble.
According to the Guardian article you linked, so has Michael Burry.
Thanks for posting this article. I’m still trying to digest it all.
I think this is related – has anybody noticed what’s happening to RAM? Prices are going nuts:
RAM: WTF? https://www.youtube.com/watch?v=9hLiwNViMak
A memory upgrade that I was looking at earlier this year for around $120 is now approaching $500. I don’t know if this will bleed into prices for other devices that use RAM and other ICs used in server grade hardware, but if this keeps up, it will.
All so we can have things like AI-targeted data scraping and ads, layoffs of entry-level workers of all sorts, and OpenAI’s big deal – AI p0rn. Really? Yeah, really:
OpenAI’s ChatGPT will soon allow ‘erotica’ for adults in major policy shift
https://www.cnbc.com/2025/10/15/erotica-coming-to-chatgpt-this-year-says-openai-ceo-sam-altman.html
China is using AI to do things like reduce the cost of healthcare and enhance manufacturing (I think this is a repeat link):
China’s AI hospitals will transform medicine across the world. But not in the United States. https://www.youtube.com/watch?v=XDmFB7AQSR0
I’ll have to keep looking around at how AI will be used in America, hoping to find where it will be used to assist humans doing jobs – I’m sure it’s out there – rather than all the crazy AI stuff that actually pops up unwanted. AI friends for kids? Yikes! NO!
I’ve come across a couple of instances where medical researchers in the U.S. are using AI. In one they’re attempting to discover possible cures to various diseases through the off-label use of existing medications.
The Medical Matchmaking Machine Radiolab (1 hour, 12 minute audio). This is quite a story as the principal researcher was a doctor who, without AI, discovered an off-label use for an existing medication to save his own life and that of others.
Other researchers are using AI to develop new drugs without much success as of yet.
AI Was Supposed To Discover New Drugs. Where Are They? Science Friday (18 minute audio with transcript)
Thanks!
Other researchers are using AI to develop new drugs without much success as of yet. AI Was Supposed To Discover New Drugs. Where Are They?
Years of human trials and millions or billions of dollars of investment will happen before the AI-discovered candidate molecules — and I know of plenty — which make it through the process will arrive.
That’s how the real world works. New drugs don’t just magically appear. Both these podcast transcripts eventually get to that little fact, so the headline/titles of them are essentially clickbait.