In 1973, the Yom Kippur War erupted. Less than two weeks later, Arab oil producers announced an embargo, and crude quadrupled from roughly $3 to $12 a barrel. That crisis is usually filed under energy history. But it was also a turning point for semiconductors — American electricity prices surged, factory costs soared, and Japan seized the moment by substituting nuclear for thermal power, subsidising industrial electricity, and pouring national resources into chip manufacturing. A decade later, Japan's share of the global memory market had rocketed from under 10% to over 50%.
Electricity prices — perhaps the least glamorous variable in technology — determined where an entire industry's centre of gravity would shift.
Fifty-three years on, the same script is playing out in artificial intelligence.
The scissors
In March 2026, two things happened almost simultaneously. A new frontier AI model launched, pushing weekly active users on major AI platforms toward 900 million and unleashing a tsunami of inference demand. At the same time, the Iran war turned the Strait of Hormuz — conduit for a fifth of the world's oil and vast quantities of LNG — into a conflict zone. Brent crude hovered around $90; investment banks' worst-case scenarios pointed to $130.
A demand-side inference tsunami and a supply-side energy shock: two blades of a scissors closing on AI margins.
Electricity accounts for roughly 20–30% of cloud inference costs. At $90 Brent, data-centre power costs may rise 20–30%, translating into a 5–10% increase in total inference cost and perhaps 3–8 percentage points of margin compression for leading AI companies — painful, but survivable. At $130 under an effective Hormuz blockade, natural gas spot prices could double, pushing power costs up 70–120% and shaving 10–15 points off margins. For smaller AI companies already running thin, that may cross the viability threshold.
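The scenario arithmetic above can be checked on the back of an envelope. A minimal sketch, using only the article's own estimated ranges (these are scenario inputs, not measured data); the resulting ranges land close to the article's 5–10% figure for the $90 case:

```python
# Back-of-envelope sketch of the scenario arithmetic.
# All inputs are the article's own estimates, not audited figures.

def inference_cost_increase(power_share, power_cost_rise):
    """Fractional rise in total inference cost when only power gets dearer."""
    return power_share * power_cost_rise

# $90 Brent scenario: power is 20-30% of inference cost, rising 20-30%.
low = inference_cost_increase(0.20, 0.20)   # 0.04
high = inference_cost_increase(0.30, 0.30)  # 0.09
print(f"$90 Brent: total inference cost up {low:.0%}-{high:.0%}")

# $130 Brent / Hormuz blockade scenario: power costs up 70-120%.
low = inference_cost_increase(0.20, 0.70)   # 0.14
high = inference_cost_increase(0.30, 1.20)  # 0.36
print(f"$130 Brent: total inference cost up {low:.0%}-{high:.0%}")
```

The point of the exercise is the multiplication itself: a large shock to a minority cost line produces a moderate shock to total cost, which then lands almost entirely on margin for companies that cannot reprice.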
But electricity is not the worst of it. East Asian chip fabs depend on Gulf LNG for power; a supply disruption doesn't just make chips more expensive — it may halt production entirely. Qatar supplies roughly a third of global helium, an irreplaceable gas for semiconductor etching and cooling. A Hormuz blockade cutting Qatar's helium exports would impose a hard physical constraint on global wafer output. Even after energy prices normalise, chip supply recovery could lag by months. The scissors may stay closed longer than the energy shock itself.
Guizhou versus Virginia
On March 22, 2026, the chairman of a major Chinese technology company told a Beijing forum that China's AI edge lies not in algorithms but in its power grid. Over the past decade, China invested roughly $90 billion annually in electricity transmission. Last year's newly installed generation capacity was about ten times that of the United States.
In peacetime, those numbers sound like talking points. Against the backdrop of the Iran war, they become a quantifiable competitive advantage.
China's western data-centre clusters — Inner Mongolia, Guizhou, Ningxia, built under the national "East Data, West Computing" initiative — run on electricity priced at roughly $0.05–0.07 per kilowatt-hour, sourced from coal, nuclear, and solar that never transits the Strait of Hormuz. America's main data-centre corridors in Virginia and Texas lean heavily on natural gas. Their peacetime rates of $0.07–0.09/kWh could climb to $0.09–0.12 in a wartime energy crunch, widening the gap from 10–20% to 25% or more.
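To get a feel for what those tariff gaps mean at scale, here is an illustrative annual power bill for a hypothetical 100 MW data-centre load running flat out, using midpoints of the article's quoted ranges (the facility size and utilisation are assumptions for illustration only):

```python
# Illustrative annual power bill at the article's quoted tariffs.
# Facility size (100 MW) and 100% utilisation are assumed, not sourced.

HOURS_PER_YEAR = 8760
LOAD_MW = 100

def annual_bill_usd(price_per_kwh):
    """Annual electricity cost for the assumed load at a flat tariff."""
    return LOAD_MW * 1000 * HOURS_PER_YEAR * price_per_kwh

china_west = annual_bill_usd(0.06)    # midpoint of $0.05-0.07/kWh
us_peace = annual_bill_usd(0.08)      # midpoint of $0.07-0.09/kWh
us_war = annual_bill_usd(0.105)       # midpoint of $0.09-0.12/kWh

for label, bill in [("China west", china_west),
                    ("US peacetime", us_peace),
                    ("US wartime", us_war)]:
    print(f"{label}: ${bill / 1e6:.1f}M per year")
```

At this scale a few cents per kilowatt-hour compound into tens of millions of dollars a year per facility, which is why the tariff gap matters more than it looks.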
Even if export controls successfully restrict access to leading-edge GPUs, lower operating costs can allow open-source models running mixture-of-experts architectures to deliver competitive inference performance on commodity hardware. AI competition, of course, spans chips, talent, data, capital, regulation, and application ecosystems — electricity alone does not determine the outcome. But 1973 demonstrated that when a Middle Eastern war redrew the energy price map, it could reshape a technology industry's centre of gravity within a decade.
Who bears the pressure
If the scissors persist — and the trajectory of a sustained low-intensity conflict suggests they will — the AI industry faces an accelerating shakeout.
Directional winners: hyperscale platforms with captive inference infrastructure and long-term energy contracts; cybersecurity firms benefiting from both rising threats and AI-powered defence tools; and open-source AI ecosystems that gain from structural energy-cost differentials.
Directional losers: smaller companies wrapping foundation-model APIs with little infrastructure or pricing power of their own; pre-revenue AI startups squeezed by high interest rates and rising costs simultaneously; and gas-dependent data-centre operators sitting directly on the cost-transmission chain.
The most thought-provoking possibility: 900 million free or low-cost users generate enormous inference demand while contributing almost no commensurate revenue. The entities bearing the heaviest weight of the scissors may be precisely those that built the most powerful AI.
Coda
After 1973, it took the United States fifteen years to reclaim semiconductor leadership — not by reverting to the old cost structure, but by redefining the terms of competition entirely, shifting from manufacturing to design, from volume to architecture.
The scissors will not stay closed forever. Wars end, oil prices fall, gas supplies recover. But while the blades are pressing together, they will sort companies with genuine technical moats from those merely riding the tide. They may redefine AI competition's core variable — from "who has the most parameters" to "who delivers the most inference per watt."
Energy — the variable everyone assumed was solved — is quietly re-emerging as the decisive card in the most important industry of our time.
Data sources: Morgan Stanley energy scenario analysis, Société Générale global economic outlook, public remarks at the China Development Forum 2026. Cost and margin figures are scenario-based estimates derived from public information, not audited financials.