NEWS FROM THE FUTURE: AI is the quantum killer app

Quantum computers can slash AI’s energy costs by billions of dollars and still provide reliable results

Dateline: January 29 2030. 

Since the first error-corrected quantum computer chips began to emerge in 2025, quantum computing has advanced in leaps and bounds. From interesting but esoteric lab experiments, quantum has become the future of supercomputing. 

It stands to reason. While classical computer bits can only be in one of two states — either a zero or a one — quantum bits, or qubits, can exist in a superposition of both states at once, with any weighting in between. A qubit’s state is more like a probability than a discrete value, which means quantum machines can do blazingly fast matrix calculations and explore many possibilities simultaneously, rather than computing sequentially. 
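
To make that concrete, here’s a toy classical simulation in Python (a minimal sketch, not real quantum hardware) of what “any weighting in between” means: a qubit is described by two amplitudes, and measuring it yields a zero or a one with probabilities given by the squared amplitudes.

import math
import random

# A single qubit is described by a pair of amplitudes (alpha, beta)
# with |alpha|^2 + |beta|^2 = 1. An equal superposition of 0 and 1:
alpha = beta = 1 / math.sqrt(2)

def measure(alpha, beta):
    # Measurement collapses the superposition: the outcome is 0 with
    # probability |alpha|^2 and 1 with probability |beta|^2.
    return 0 if random.random() < abs(alpha) ** 2 else 1

# Measuring many identically prepared qubits reveals the probabilistic
# behaviour: roughly half zeros, half ones.
samples = [measure(alpha, beta) for _ in range(10_000)]
print(sum(samples) / len(samples))  # close to 0.5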

But because they work in probabilities rather than exact numbers, real-world applications for quantum computing have been limited to things such as optimising routes and resources, or forecasting based on multiple connected parameters, where speed is more important than accuracy. 

Which is where AI comes in. To answer a query, AI models perform inference, essentially predicting the most likely sequence of tokens using billions of parameters. And if you’re doing facial recognition or fraud detection, or just writing witty articles, speed is often more important than being 100% correct. 
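
As a rough illustration in Python (a minimal sketch with made-up numbers, not how any production model is actually served), here’s probabilistic next-token selection: raw scores become a probability distribution via a softmax, and the next token is sampled from it.

import math
import random

def softmax(logits):
    # Turn raw scores into a probability distribution; subtracting the
    # max keeps the exponentials numerically stable.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical three-word vocabulary and scores; a real model derives
# these from billions of learned parameters.
vocab = ["quantum", "classical", "banana"]
logits = [2.1, 1.3, -0.5]

probs = softmax(logits)
next_token = random.choices(vocab, weights=probs, k=1)[0]
print(dict(zip(vocab, [round(p, 3) for p in probs])), "->", next_token)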

AI models and agents are more powerful and busier than ever, requiring enormous computing and energy resources. Quantum computers can slash those requirements by many orders of magnitude — worth billions of dollars — and still provide reliable results, within certain confidence limits. It’s a match made in computing heaven. 

And AI is the quantum killer app. 

First published in Mindbullets January 30 2025.

Quantronics is the next big thing 

Nanomaterials spawn new tech industries 

Dateline: February 18 2025.

Many of us grew up in the exciting world of electronic components replacing electromechanical devices. Transistors, resistors and solid-state controllers; soon it was all about integrated circuits, microprocessors and computer chips. The semiconductor industry was born, and Moore’s Law (that the number of transistors in an integrated circuit doubles about every two years) took over. 
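
As a back-of-the-envelope illustration of that doubling in Python (the starting figure is illustrative, roughly an early-1970s microprocessor, not a precise historical count):

# Moore's Law as compound growth: transistor counts double roughly
# every two years.
transistors = 2_300  # illustrative early-1970s starting point
for year in range(1971, 2021, 2):  # 25 doublings over 50 years
    transistors *= 2
print(f"{transistors:,}")  # about 77 billion, the scale of today's largest chips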

And for decades the exponential improvements — more bang for the buck — seemed to be never-ending. Computers, and the chips that powered them, got smaller, faster and cheaper, until they virtually disappeared into our personal items like phones, watches and glasses, and even became embedded in our bodies. 

But as the silicon circuits dropped in width to just a few nanometres, strange things started to happen. Electrons began to misbehave, leaking out of the circuits meant to contain them, a phenomenon aptly called leakage. With so little power required for processing, any environmental disturbance — such as Wi-Fi or static — could seriously disrupt the semiconductor logic. Undetected errors are the worst kind; you can’t trust the device. 

Now we’re building nano processors out of nanomaterials and metamaterials — exotic devices such as quantum dots, electron wells and “origami” graphene. There’s hardly any silicon in sight, except for the packaging and interconnection. We’re able to trap individual photons and electrons, create superconducting circuits that carry current without any losses, and exploit the power of quantum physics. 

With this exponential acceleration of raw computing power, and a matching reduction in the electrical energy needed to run the systems, Moore’s Law will be discarded as irrelevant. A new breed of “chip” is emerging, and with it a whole new industry built around quantum microelectronics. 

It’s the next big thing and it needs a name; let’s call it “quantronics”. 

First published in Mindbullets February 18 2021.

Despite appearances to the contrary, Futureworld cannot and does not predict the future. The Mindbullets scenarios are fictitious and designed purely to explore possible futures, and challenge and stimulate strategic thinking. 
