
Quantum computing often grabs the headlines. The word "quantum" itself is intriguing enough, and combined with the promise of computational power that surpasses anything we have seen so far, it becomes irresistible. But what exactly is quantum computing?
To get to grips with quantum computing, first remember that an ordinary computer works on 0s and 1s. Whatever task you want it to perform, whether it's calculating a sum or booking a holiday, the underlying process is always the same: an instance of the task is translated into a string of 0s and 1s (the input), which is then processed by an algorithm. A new string of 0s and 1s pops out at the end (the output), which encodes the result. However clever an algorithm might appear, all it ever does is manipulate strings of bits, where each bit is either a 0 or a 1. At the machine level, this either/or dichotomy is represented using electrical circuits which can either be closed, in which case a current flows, or open, in which case no current flows.
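To make this concrete, here is a minimal sketch in Python (our own illustration, not taken from the article): the task of adding 3 and 5 is encoded as strings of bits, processed using nothing but single-bit logic operations, and decoded back into a result.

```python
# An illustrative sketch: adding 3 and 5 purely by manipulating strings of bits.

def to_bits(n, width=8):
    """Encode an integer as a list of bits, least significant bit first."""
    return [(n >> i) & 1 for i in range(width)]

def from_bits(bits):
    """Decode a list of bits (least significant bit first) back into an integer."""
    return sum(bit << i for i, bit in enumerate(bits))

def add_bits(a_bits, b_bits):
    """Ripple-carry addition built only from AND (&), OR (|) and XOR (^) on single bits."""
    result, carry = [], 0
    for a, b in zip(a_bits, b_bits):
        result.append(a ^ b ^ carry)                 # the sum bit
        carry = (a & b) | (a & carry) | (b & carry)  # the carry bit
    return result

input_bits = (to_bits(3), to_bits(5))    # the input: two strings of 0s and 1s
output_bits = add_bits(*input_bits)      # the algorithm: pure bit manipulation
print(from_bits(output_bits))            # the output, decoded: prints 8
```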

Quantum computing is based on the fact that, in the microscopic world, things do not behave in the way we expect them to in our macroscopic world. A quantum computer works with particles, called qubits, that can make use of this strange quantum nature. Because of something called quantum superposition, qubits can not only take on the values 0 or 1, they can also take on the values 0 and 1 simultaneously. Another strange quantum phenomenon, called quantum entanglement, means that a relatively small number of qubits can describe information that would require a huge number of ordinary bits. Superposition and entanglement taken together mean that quantum computers are potentially a lot more powerful than ordinary ones. (You can read more about this strange quantum behaviour in our ridiculously short introduction to some very basic quantum mechanics.)
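The rough Python sketch below (our own illustration, using the standard state-vector picture of qubits) shows why the numbers grow so quickly: a single qubit is described by two numbers, called amplitudes, but a register of n qubits needs 2^n of them.

```python
# An illustrative sketch using the standard state-vector description of qubits.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)   # the state "definitely 0"
ket1 = np.array([0, 1], dtype=complex)   # the state "definitely 1"
plus = (ket0 + ket1) / np.sqrt(2)        # an equal superposition of 0 and 1

# Each extra qubit doubles the number of amplitudes needed to describe the register,
# so writing the state down classically quickly becomes infeasible.
state = plus
for n in range(1, 6):
    print(f"{n} qubit(s) need {state.size} amplitudes")
    state = np.kron(state, plus)         # add one more qubit (tensor product)

# An entangled pair, with equal amplitudes on 00 and 11 only, cannot be split into
# two independent single-qubit descriptions -- that is what entanglement means here.
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
print("Entangled two-qubit state:", bell)
```

A few dozen qubits described this way already stretch the memory of the world's largest supercomputers, which is one way to see why simulating quantum computers classically is so hard.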
This means that quantum computers operate in a very different way to classical computers. You can write very different algorithms for quantum computers that can do things classical computers can't. An example is Shor's algorithm, developed by the American mathematician Peter Shor in 1994, before quantum computers existed. If a quantum computer were able to run Shor's algorithm, it could break the public-key cryptography used today to secure the internet. (You can read more about the impact of quantum computers on cryptography in this article. And you can also read more details on how quantum computers work and what these quantum algorithms look like.)
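To give a flavour of how this works, here is a toy Python sketch (our own heavily simplified illustration) of the recipe behind Shor's algorithm for factoring a number N, which is the problem underpinning the security of RSA. The only step that needs a quantum computer is period finding; below it is replaced by a brute-force search, which becomes hopelessly slow for the huge numbers used in real cryptography.

```python
# A heavily simplified sketch of the classical recipe around Shor's algorithm.
from math import gcd
from random import randrange

def find_period(a, N):
    """Find the smallest r > 0 with a**r % N == 1.
    This brute-force search stands in for the quantum period-finding step."""
    r, value = 1, a % N
    while value != 1:
        value = (value * a) % N
        r += 1
    return r

def find_factor(N):
    """Recover a non-trivial factor of N, assuming N is odd and not a prime power."""
    while True:
        a = randrange(2, N)
        if gcd(a, N) > 1:
            return gcd(a, N)           # lucky guess: a already shares a factor with N
        r = find_period(a, N)          # the step a quantum computer would speed up
        if r % 2 == 0:
            y = pow(a, r // 2, N)
            if y != N - 1:             # standard number theory then yields a factor
                return gcd(y - 1, N)

print(find_factor(15))                 # prints 3 or 5
```

On a real quantum computer the period-finding step is done using the quantum Fourier transform, and that is where the dramatic speed-up over classical factoring comes from.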
Even though there aren't yet quantum computers capable of running Shor's algorithm, protecting our networks against this potential quantum danger is already important. "Luckily there will not be a single 'Q-Day' where everything suddenly becomes vulnerable," said Zygmunt Lozinski from IBM Research, one of the speakers at Quantum Computing: Applications and Challenges, an event organised by the Newton Gateway to Mathematics in November 2024.

Large-scale quantum computers may start coming online in the coming decades, but only a few at a time. And these initial quantum computers will probably be focussed on very high-value research that can't feasibly be done in other ways, such as modelling the reactions inside electric vehicle batteries or the folding of proteins. But the consensus is that quantum computers capable of breaking classical cryptography will eventually become available. "We need policy makers and industry not to kick the can down the road, and identify what interventions are needed today. There's a huge amount of work to be done," said Lozinski.
This introduction to quantum computing is based on our collection What is quantum computing? and on the article Post-quantum cryptography: Making the world's networks quantum safe.
This article was produced as part of our collaborations with the Isaac Newton Institute for Mathematical Sciences (INI) and the Newton Gateway to Mathematics.
The INI is an international research centre and our neighbour here on the University of Cambridge's maths campus. The Newton Gateway is the impact initiative of the INI, which engages with users of mathematics. You can find all the content from the collaboration here.

