
Has quantum computing been cracked?

In recent days there has been a surge in interest in quantum computing - computers that use quantum particles as the equivalent of bits. Out of the blue, I've received several invitations to talk to people about quantum computing as a result of my book, imaginatively named Quantum Computing, which provides an introduction to the field. I suspect this upsurge is because of the recent announcement that the BBC dramatically headlined 'Quantum breakthrough could revolutionise computing'.

This is a topic that has suffered from considerable hype in the past - so is this breakthrough (and a breakthrough there certainly has been) transformative, or an incremental step towards what is still a fairly distant proposition?

The reason quantum computers are of huge interest is that for certain applications they can, in principle, carry out calculations that would take conventional computers the lifetime of the universe to churn through. The reason that they can do this is that instead of using bits that can store values of 0 and 1, the quantum computer uses qubits - each a quantum particle which can be in a superposition of states - partly 0 and partly 1 simultaneously, with the 'partly' effectively capable of representing an infinitely long real value. The way that qubits link together means that what would usually require sequential processes in a conventional computer can be undertaken simultaneously.
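To make superposition a little more concrete, here is a minimal sketch (in Python with numpy, not any real quantum computing library) of how a qubit's state is usually modelled: two complex amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1, with n qubits sharing 2^n amplitudes between them.

```python
# Illustrative sketch only: a qubit as a pair of amplitudes, and how the
# state space grows when qubits are combined.
import numpy as np

# A single qubit in an equal superposition of 0 and 1 (the 'partly 0, partly 1')
qubit = np.array([1, 1]) / np.sqrt(2)
print("P(measure 0) =", abs(qubit[0]) ** 2)   # 0.5
print("P(measure 1) =", abs(qubit[1]) ** 2)   # 0.5

# Three such qubits combined (tensor product): 2**3 = 8 amplitudes that all
# evolve together - the source of the 'simultaneous' processing in the text.
state = qubit
for _ in range(2):
    state = np.kron(state, qubit)
print("Number of amplitudes for 3 qubits:", len(state))  # 8
```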

However, there are also plenty of problems with making quantum computers work. You need to be able to isolate quantum particles from their environment, or the states of the qubits will be lost, while still being able to interact with them. This is not trivial and as yet it has limited quantum computers to the order of 100 qubits. You also need to undertake error correction, because the process is inherently prone to errors, which means it takes considerably more qubits to undertake a calculation than might otherwise be thought. What's more, you need to have both a suitable algorithm, specifically devised for a quantum computer, and the ability to get information in and out of the computer, when the typical answer may well just be 0 or 1.
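To give a feel for why error correction multiplies the qubit count, here is a toy Python illustration using a classical three-bit repetition code. Real quantum error correction schemes are far more elaborate and can need hundreds or thousands of physical qubits per logical qubit, but the underlying point - redundancy costs extra (qu)bits - is the same.

```python
# Toy classical analogy: one logical bit encoded into three physical bits,
# with a majority vote correcting a single flipped bit.
import random

def encode(bit):
    return [bit, bit, bit]          # 1 logical bit -> 3 physical bits

def add_noise(bits, p=0.1):
    # Flip each bit independently with probability p
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):
    return int(sum(bits) >= 2)      # majority vote recovers the logical bit

logical = 1
received = add_noise(encode(logical))
print("received:", received, "-> decoded:", decode(received))
```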

It's important to emphasise that quantum computers are not desktop devices - they may well always require a specially controlled environment, working as shared cloud devices - and they are not general purpose computers, relying instead on a relatively limited number of potentially very powerful algorithms. The first two examples produced were an algorithm that effectively makes it easier to crack the encryption used for internet payments (a trifle worrying), and (the reason Google, for example, is very interested) a search algorithm that makes it possible to find something with the square root of the number of searches required by a conventional computer. To emphasise how much the development of this hardware is a slow process, these algorithms were both developed in the mid-1990s, long before anything was available to run them on.
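For the search example - this is Grover's algorithm, dating from 1996 - the scaling can be sketched in a few lines of Python: a conventional computer needs of the order of N checks to find one item among N, where the quantum algorithm needs roughly (π/4)√N iterations. The numbers below illustrate only that scaling; they are not a simulation of the algorithm itself.

```python
# Back-of-envelope comparison of query counts for unstructured search.
import math

for n_items in (1_000, 1_000_000, 1_000_000_000):
    classical = n_items                                   # worst-case checks
    quantum = math.ceil((math.pi / 4) * math.sqrt(n_items))  # Grover iterations
    print(f"N = {n_items:>13,}: classical ~{classical:,} checks, "
          f"Grover ~{quantum:,} iterations")
```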

The breakthrough that is making the news involves one class of quantum computers - those where the qubits are based on ions (atoms that have gained or lost electrons to become electrically charged). Other quantum computers use photons, for example, but ions have the advantage of being relatively easy to keep in place due to their electrical charge. A chip to confine and interact with ions requires a lot more space than dealing with the equivalent number of conventional bits. A standard-sized chip can only handle around 100 qubits, where an effective quantum computer might require a few million (still vastly smaller than the billions of bits in a conventional computer processor). The breakthrough involves being able to transfer ions from one chip to another with a very low loss rate and without measurably impacting the 'phase coherence' of the qubit - in simple terms, the qubit keeps the value it's holding.

This is an impressive piece of work. It makes it possible in principle to have a quantum computer with many chips that interact with each other, enabling it to support the kind of qubit numbers that would make it a truly effective resource. However, it's worth emphasising that there are still plenty of other issues to be dealt with, and that while this is an effective demonstration, it's still some way from being applicable on any scale. Realistically it could be another 5 to 10 years before there is a real product where large scale, useful quantum algorithms can be deployed. An important step, then, but definitely incremental rather than a revolution.

If you'd like to read more about the technology, the paper is here and is freely downloadable. (Surely it's time the BBC started providing links to papers?)

