AI capex boom strains hyperscalers' cash flow as DRAM makers gain pricing power, says Jefferies

By ANI | Updated: May 2, 2026 08:25 IST

New Delhi [India], May 2 : The AI infrastructure spending spree by US hyperscalers is showing signs of strain, with capital expenditure now consuming nearly all of their operating cash flow, even as Dynamic Random Access Memory (DRAM) suppliers emerge as the key beneficiaries with unprecedented pricing power, global brokerage firm Jefferies said in a report.

According to Jefferies, the four major US hyperscalers are projected to spend 92 per cent of their operating cash flow on capex in 2026, up sharply from 41 per cent in 2023. This reflects the scale of the AI arms race, with total capex from major US tech players expected at USD 700 billion this year and USD 800 billion next year, equivalent to 2 per cent of US Gross Domestic Product (GDP) and 20 per cent of non-residential fixed investment. Remarkably, it also represents nearly 30 per cent of all US non-financial pre-tax profits.

A growing chunk of this spending is flowing into memory chips, with hyperscalers likely to allocate 28 per cent of operating cash flow to DRAM this year, assuming memory accounts for 30 per cent of total capex.
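The 28 per cent figure follows directly from the two ratios Jefferies cites; a minimal sketch of the arithmetic, assuming the projected 92 per cent capex-to-operating-cash-flow ratio and the 30 per cent memory share of capex:

```python
# Implied DRAM spend as a share of operating cash flow,
# multiplying the two ratios cited in the Jefferies report.
capex_share_of_ocf = 0.92     # 2026E capex as a fraction of operating cash flow
memory_share_of_capex = 0.30  # assumed memory share of total capex

dram_share_of_ocf = capex_share_of_ocf * memory_share_of_capex
print(f"{dram_share_of_ocf:.1%}")  # -> 27.6%, i.e. roughly 28 per cent
```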

The structural shift is driven by the effective end of Moore's Law, which means DRAM makers can no longer boost chip density on wafers by 50-100 per cent annually. With only three global DRAM suppliers left, compared with 12 before 2012, supply is now fundamentally constrained.

This has forced large buyers such as Nvidia to lock in 3-5 year supply agreements with DRAM makers, a dynamic that is making the memory industry resemble TSMC's model, where capacity expansion follows concrete demand rather than the traditional boom-bust cycle. However, Jefferies warns that the risk lies in a potential realisation by hyperscalers or investors that they have overinvested in AI infrastructure.

Adding to the uncertainty is the question of AI monetisation. A recent Jefferies report by China tech head Edison Lee notes that rising compute, memory and power costs mean "sustainable profitability is far away for pure model players." Still, in the long run, demand for compute is seen as structural, even if the short-term cycle faces a reset.

Disclaimer: This post has been auto-published from an agency feed without any modifications to the text and has not been reviewed by an editor
