
TOBY SHAPSHAK: Blame AI for making your next phone or PC more expensive

Demand for memory has created an ‘unprecedented’ global shortage


It’s time to talk about the other AI bubble. Computer and smartphone prices are going to skyrocket this year because of the insatiable demand for key components in AI data centres. As the prices for these soar, consumer electronics makers are braced for lower sales.

The memory chips in our devices alone can account for anywhere from 15% to 40% of their cost. In less than a year, since April 2025, the price of the most commonly used dynamic random access memory (DRAM) variant, DDR4, has shot up 1,360%, according to Bloomberg Intelligence.

Having just recovered from supply chain interference (from Covid to Donald Trump’s tariff war to a container ship blocking the Suez Canal), the technology industry is facing a new threat: the huge new AI data centres need the same innards that power our smartphones and allow teenagers to take selfie videos.

Dell COO Jeffrey Clarke has been warning about this “unprecedented” global shortage of components as “demand is way ahead of supply”. “The cost basis is going up across all products. Everything uses a CPU, has DRAM, and has storage in it,” he warned investors last November. “We have not seen costs move at the rate that we’ve seen.”

The most sought-after component is DRAM, the memory module that powers your phone and laptop, but also all those new AI servers. The variant needed by data centres is high-bandwidth memory (HBM), which uses four times as many silicon wafers as DRAM.

Higher profit margins

The HBM needed by the AI server farms has profit margins of 50% versus 35% for normal DRAM, so guess where the only three manufacturers are focusing. By 2030 HBM will be half of global DRAM manufacturing, a huge leap from 8% in 2023, according to Bloomberg Intelligence.

Meanwhile, the cost of the 12GB of DRAM used in smartphones has shot up by $70 in the past 15 months, Counterpoint Research senior researcher Yang Wang told The Economist. In December he predicted the global smartphone market would shrink by 2% in 2026. Now he’s expecting it to go down by 6%.

All this means your next smartphone and computer will cost more as the laws of supply and demand do what they do. Everyone fishes where the fish are. Sorry, Joe Consumer.

Cynical commentators have warned that the only way those consumer electronics prices will come down is if the AI bubble bursts. Nobody stops to think what that will do to all our pensions. But if you avoid talking about the AI bubble, it might go away — at least this seems to be the prevalent approach to the hypercycle of hyperbolic spending on data centres.

All the big AI firms have a slice of each other, including Nvidia, OpenAI, Microsoft and even AMD, the other big chip maker. This “circle jerk” of investment in each other, as one commentator put it, means it’s a row of dominoes waiting for something to topple the whole thing.

Right now it means your next smartphone will not only cost you more; you’ll get less for more, paying an increased price for downgraded features (like less RAM or storage).

I realised last year I didn’t write often enough about AI to cover all the major innovations. I prefer to write about what people have done, not what they say they will do. That has always been my yardstick for coverage.

What we are seeing is another ChatGPT moment (when it burst onto the scene in November 2022), this time with an agentic coding tool called Claude Code.

I know something significant is happening because all the other geeks are raving about it. Casey Newton, one of the smartest people I follow, is talking about a similar “Claude Code moment”.

“ChatGPT was the moment when people woke up to the power of large language models (LLMs) to generate text,” he wrote in his hugely popular Platformer newsletter. “The Claude Code moment, while orders of magnitude smaller, strikes me as potentially just as significant. It’s waking people up to LLMs’ power to generate tools.”

He highlighted the word “tools”, and indeed they are. His breakthrough was building his own website using text prompts — or, as AI researcher Andrej Karpathy said after ChatGPT launched three years ago, “the hottest new programming language is English”.

“In just about an hour of typing into a box — of programming in English, as Karpathy put it — I made a personal website that far surpassed anything I had been able to put together in Squarespace. Claude regularly made strange choices and outright errors that I would have to correct.

“Without too much effort, though, I was able to get the site that I asked for — as well as several features I never would have thought to build had Claude not suggested them.”

So I tried it myself, building my own new Maven.Africa website. It took me more than an hour. Was it easier than hiring a friend to do it, or just using Wix or Squarespace? No, but I did learn a bunch of things and got a first-hand feel for the new wave that Claude Code represents. Similar functionality is available from OpenAI’s Codex or Google’s Antigravity.

Will I use these new skills or tools in my day-to-day work? Also no. But we’re in the “throwing spaghetti at the wall” stage of finding out what AI is good at. Some will stick; some will fall off. The final dish is by no means ready.

• Shapshak is editor-in-chief of Stuff.co.za.
