
Has quantum computing been cracked?

In recent days there has been a surge in interest in quantum computing - computers that use quantum particles as the equivalent of bits. Out of the blue, I've received several invitations to talk to people about quantum computing as a result of my book, imaginatively named Quantum Computing, which provides an introduction to the field. I suspect this upsurge is down to the recent announcement that the BBC dramatically headlined 'Quantum breakthrough could revolutionise computing'.

This is a topic that has suffered from considerable hype in the past - so is this breakthrough (and there certainly has been one) transformative, or an incremental step towards what is still a fairly distant proposition?

The reason quantum computers are of huge interest is that for certain applications they can, in principle, carry out calculations that would take conventional computers the lifetime of the universe to churn through. The reason that they can do this is that instead of using bits that can store values of 0 and 1, the quantum computer uses qubits - each a quantum particle which can be in a superposition of states - partly 0 and partly 1 simultaneously, with the 'partly' effectively capable of representing an infinitely long real value. The way that qubits link together means that what would usually require sequential processes in a conventional computer can be undertaken simultaneously.
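The idea of amplitudes that are 'partly 0 and partly 1' can be made concrete with a few lines of code. This is a minimal sketch, not anything a quantum computer does internally: it just represents a single qubit as two complex amplitudes and shows how they turn into measurement probabilities.

```python
import math

# A qubit's state is two complex amplitudes (a, b) for the |0> and |1> parts,
# normalised so that |a|^2 + |b|^2 = 1.
a, b = 1 / math.sqrt(2), 1 / math.sqrt(2)   # an equal superposition

# The 'partly 0, partly 1' is continuous: on measurement the qubit gives 0
# with probability |a|^2 and 1 with probability |b|^2.
p0, p1 = abs(a) ** 2, abs(b) ** 2
print(round(p0, 3), round(p1, 3))  # 0.5 0.5 - an even chance of either result
```

Because a and b are continuous values, a single qubit in principle encodes far more than one bit - though, as noted below, reading the answer out still only ever yields a 0 or a 1.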

However, there are also plenty of problems in making quantum computers work. You need to be able to isolate quantum particles from their environment - or the states of the qubits will be lost - while still being able to interact with them. This is not trivial, and as yet it has limited quantum computers to the order of 100 qubits. You also need to undertake error correction, because the process is inherently prone to errors, which means it takes considerably more qubits to undertake a calculation than might otherwise be thought. What's more, you need both a suitable algorithm, specifically devised for a quantum computer, and the ability to get information in and out of the computer, when the typical answer may well just be 0 or 1.
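Why error correction multiplies the qubit count can be seen in the simplest error-correcting idea, shown here classically: store one logical bit in three physical copies and recover it by majority vote. Real quantum codes are far subtler (a qubit cannot simply be copied), but the cost is the same in kind - many physical qubits per logical qubit.

```python
from collections import Counter

def encode(bit):
    # One logical bit becomes three physical copies.
    return [bit, bit, bit]

def decode(noisy):
    # Majority vote survives any single flipped copy.
    return Counter(noisy).most_common(1)[0][0]

codeword = encode(1)
codeword[0] ^= 1             # an error flips one physical copy
print(decode(codeword))      # prints 1 - the logical bit is recovered
```

Even in this toy version, tolerating a single error has tripled the storage needed; quantum schemes typically demand far larger overheads than a factor of three.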

It's important to emphasise that quantum computers are not desktop devices - they may well always require a specially controlled environment, working as shared cloud devices - and they are not general purpose computers, running a relatively limited number of potentially very powerful algorithms. The first two examples produced were an algorithm that effectively makes it easier to crack the encryption used for internet payments (a trifle worrying), and (the reason Google, for example, is very interested) a search algorithm that makes it possible to find something with the square root of the number of searches required by a conventional computer. To emphasise how slow the development of this hardware is, these algorithms were both developed in the mid-1990s, long before anything was available to run them on.
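The square-root saving of that search algorithm (Grover's, from 1996) is worth a back-of-envelope comparison. The sketch below just computes the standard query counts - roughly N/2 checks on average classically against about (π/4)√N quantum queries - rather than simulating the algorithm itself.

```python
import math

def classical_checks(n):
    # Unstructured search: on average half the items must be examined.
    return n // 2

def grover_queries(n):
    # Grover's algorithm needs roughly (pi/4) * sqrt(n) oracle queries.
    return math.ceil((math.pi / 4) * math.sqrt(n))

n = 1_000_000
print(classical_checks(n), grover_queries(n))  # 500000 versus 786
```

For a million items the quantum approach needs hundreds of queries rather than hundreds of thousands - and the gap only widens as the search space grows.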

The breakthrough that is making the news involves one class of quantum computers - those where the qubits are based on ions (atoms that have gained or lost electrons to become electrically charged). Other quantum computers use photons, for example, but ions have the advantage of being relatively easy to keep in place thanks to their electrical charge. A chip to confine and interact with ions requires a lot more space than dealing with the equivalent number of conventional bits: a standard-sized chip can only handle around 100 qubits, where an effective quantum computer might require a few million (still vastly smaller than the billions of bits in a conventional computer processor). The breakthrough involves being able to transfer ions from one chip to another with a very low loss rate and without measurably impacting the 'phase coherence' of the qubit - in simple terms, the qubit keeps the value it is holding.
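The scale of the multi-chip requirement follows directly from those rough figures. Taking the numbers above at face value (around 100 qubits per trapped-ion chip, and two million standing in for 'a few million' - both illustrative values, not specifications from the paper):

```python
import math

qubits_per_chip = 100        # roughly what a standard-sized ion chip handles
qubits_needed = 2_000_000    # illustrative stand-in for 'a few million'

chips_required = math.ceil(qubits_needed / qubits_per_chip)
print(chips_required)  # 20000 chips would have to work together
```

Tens of thousands of chips that must pass ions between them - which is exactly why a low-loss, coherence-preserving transfer between chips matters so much.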

This is an impressive piece of work. It makes it possible in principle to have a quantum computer with many chips that interact with each other, enabling it to support the kind of number of qubits that would make it a truly effective resource. However, it's worth emphasising that there are still plenty of other issues to be dealt with, and that while this is an effective demonstration, it's still some way from being applicable at any scale. Realistically it could be another 5 to 10 years before there is a real product on which large scale, useful quantum algorithms can be deployed. An important step, then, but definitely incremental rather than a revolution.

If you'd like to read more about the technology, the paper is here and is freely downloadable. (Surely it's time the BBC started providing links to papers?)


