
TOBY SHAPSHAK: Apple takes its eye off the ball in AI slump

Bizarre focus on its new design has done nothing to dispel commentators’ concerns that it is in serious danger

Apple's senior vice-president of software engineering Craig Federighi speaks at the World Wide Developers Conference in Cupertino, California, the US, June 9 2025.  Picture: LAURE ANDRILLON/REUTERS

That Apple didn’t make any artificial intelligence (AI) announcements at this week’s developer conference has had the internet in uproar. 

The world’s most valuable tech firm has been the de facto smartphone leader — alongside Samsung — for most of the past 20 years. In the beginning Apple tended to set the tone for the industry, before losing out to superior work being done by Android (on the software front) and hardware manufacturers such as Samsung, Huawei, Xiaomi, Honor and others.

Apple’s fundamental problem is that it overpromised and underdelivered at its big iPhone launch last year. It announced a range of AI features it called Apple Intelligence. None of those have shipped, and Apple’s earliest target to bring out these much-needed features is 2027, Bloomberg reports.

A bunch of iPhone users in the US are suing Apple — obviously — claiming they were duped into buying a new phone that lacked the features it was advertised with. I also bought the iPhone 16 Pro and I’m pretty happy with it, especially the new camera button and the USB-C upgrade.

But this week Apple’s bizarre and overelaborate focus on its new design — called “liquid glass” — has done nothing to dispel commentators’ concerns that it is in serious danger. If nothing else, this week’s announcement is an obvious sign it doesn’t have its eye on the ball.

The problem with AI is that it is used as a catchphrase for everything. It is either a tsunami that’s going to wipe away everything that came before it, or a rising tide that will lift us all higher.

Profound advances are taking place in AI. But often the hype is out of kilter with the actual capabilities of a new service or technology. In other cases — if you listen to any tech executive — the hype fails to match just how advanced AI already is and the threat it poses to existing businesses. What does seem certain is that AI is killing off the first rungs of careers for many juniors seeking work experience and on-the-job training.

The most immediate — and scary — threat is the death of entry-level jobs for professionals in the legal sphere, accounting, finance and software development. I’ve spoken to lawyers, accountants and bankers who are all watching with concern the first wave of these AI services, which are good enough to do what these ranks of juniors used to do. And cost less. 

But if in the next few years there will be no jobs for candidate attorneys or accountants, how will next year’s class of graduates get the training and work experience they need? “My generation is fine,” a senior lawyer told me. “But how will the next generation get trained?”

Similarly, the head of a trading desk for a major bank told me last week that “we just don’t need junior quants”. It’s cheaper and faster to use AI services, he says, and he’s “all in” on AI.

These are two smart people at the top of their respective games, watching the human capital of their specialised industries being decimated by a sudden technology advance. The problem is just as stark for software developers, whose junior ranks will be replaced by some form of AI coding, I’m told by friends in that industry. 

But what does AI mean to us little people? In the case of Apple’s iPhones, and AI in general, what are the real use cases? For instance, I am dictating much of this column to my phone. In the past I’ve used a variety of custom software packages, third-party apps and various hacks to get to this point. Microsoft’s SwiftKey keyboard app has been my go-to for years.

Now I just open a note on my iPhone 16 Pro and a few clicks later I can speak in a natural, easy way — including long pauses as I rethink the structure of a sentence. What I get is useful: a mostly accurate transcription of my thoughts (in seconds) that I can edit in Microsoft Word later. Am I using AI or am I just using a new feature on my phone?

The most significant offering from generative AI in the beginning was its ability to summarise large volumes of information. For lots of people that’s a killer feature. But not for journalists and specifically not for me.

I don’t need articles or white papers summarised for me. That is literally my job. Journalists are the original big data scientists, I have been arguing for the past few years, because we have summarised world events into 400-word news articles for decades.

Sifting through all the facts to prioritise them, order them and ask experts what to think about them is what we humans have been doing for, well, all of our existence. We’ve found mechanical and silicon means to enhance our productivity, for want of a better way to describe evolution. 

Meanwhile, I’m astounded that the design language of the device used by the fewest Apple customers — the clunky, expensive and heavy Vision Pro headset — has become the interface for all of its devices. People are already complaining that the see-through nature of this new interface just makes it harder to read. I can’t wait. 

It seems either absurd or astute that Apple has offered this visual enhancement as a fillip for its lack of progress in AI, the most pressing of technology upgrades, unveiling “liquid glass” as its answer to an obvious innovation slump.

Also, happy Friday the 13th. 

• Shapshak is editor-in-chief of Stuff.co.za.
