It is not an overstatement to say that quantum computing is poised to usher in the next Industrial Revolution. The chart above, from Qureca, shows the estimated global public investment in quantum initiatives in 2022. IBM has already declared the 2020s the Quantum Decade. By 2025, the quantum computing market is projected to be worth $949 million.
So why are multibillion-dollar corporations such as IBM, Google, and Microsoft investing in quantum computing? Are quantum computers nothing but more advanced classical computers? If so, are classical computers on the verge of extinction?
This blog post aims to answer those questions while clarifying the core concepts of quantum computing.
What Is a Quantum Computer?
A quantum computer is a machine that makes use of the quantum behavior of subatomic particles for computational purposes. Whereas the building blocks of a classical computer are known as bits, the building blocks of a quantum computer are called quantum bits or qubits.
A classical bit is binary: it can only have one of two possible values, either 0 or 1. A quantum bit, on the other hand, can be in a “superposition” state of both 0 and 1 at the same time, and this is what makes quantum computers special.
The ability to leverage quantum phenomena such as superposition, interference, and entanglement is expected to give quantum computers an advantage over their classical counterparts, enabling us to solve challenging computational problems that a classical computer cannot handle.
Such is the theory as well as the hope for the forthcoming generation of quantum computational devices.
Superposition and Quantum Information
Here, we use the convention of representing the qubit abstractly as a three-dimensional sphere called the Bloch sphere. Superposition is represented as an arrow within the Bloch sphere. The arrow pointing straight down represents 1; pointing straight up, it represents 0; and in any other position it represents a superposition of 0 and 1.
Because a qubit can be in a superposition state of 0 and 1, it can have any one of an infinite number of possible values. However, its measured value always resolves to 0 or 1. One way of putting it is to say that a measurement “collapses” the qubit’s quantum state: this is the famous “wave function collapse” of quantum physics.
Under superposition, it is impossible to accurately predict the specific outcome of such a measurement. What we can reliably do, though, is calculate the probability of each possible outcome. In Figure 3, since the arrow representing the qubit’s initial state is inclined downward, pointing below the equator, the probability of getting 1 on measurement is higher than that of getting 0.
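The probability calculation can be made concrete with a short sketch in plain NumPy (our choice for illustration; the article prescribes no particular tool). A qubit tilted at polar angle θ on the Bloch sphere has state cos(θ/2)|0⟩ + sin(θ/2)|1⟩, and the Born rule gives each outcome's probability as the squared amplitude:

```python
import numpy as np

# A qubit at polar angle theta on the Bloch sphere:
#   |psi> = cos(theta/2)|0> + sin(theta/2)|1>
theta = 2 * np.pi / 3          # tilted below the equator (theta > pi/2)
state = np.array([np.cos(theta / 2), np.sin(theta / 2)])

# Born rule: the probability of each outcome is the squared amplitude
p0, p1 = np.abs(state) ** 2
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")  # P(1) > P(0) for this tilt

# Each simulated measurement "collapses" the state to a definite 0 or 1
rng = np.random.default_rng(0)
samples = rng.choice([0, 1], size=10_000, p=[p0, p1])
print("fraction of 1s observed:", samples.mean())
```

With the arrow below the equator (θ > π/2), the simulation observes 1 roughly 75% of the time, matching the predicted probability.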
Entanglement
In quantum mechanics, entanglement occurs when, minimally, two particles pair up in such a way that the quantum state of one particle cannot be determined independently of the state of the other particle, irrespective of the distance between the two.
Entanglement can be effectively harnessed by quantum computers. Two or more entangled qubits create a single combined quantum state. Changing the state of one qubit instantaneously changes the state of the other qubits that are entangled with it.
Quantum computing uses entangled qubits to gain a “quantum speedup” in processing capability. Adding a bit to a classical computer increases its capacity only linearly. But each entangled qubit added to a quantum computer doubles the size of its state space, so processing power grows exponentially with the number of qubits.
Two entangled qubits can only be measured on the basis of a probability distribution of all the possible states (corresponding to the values 00, 01, 10, 11).
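A standard example of such a combined state is the Bell state (|00⟩ + |11⟩)/√2, sketched here in plain NumPy (the example and library choice are ours): only the perfectly correlated outcomes 00 and 11 ever occur.

```python
import numpy as np

# Bell state (|00> + |11>)/sqrt(2): amplitudes over the basis 00, 01, 10, 11
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

probs = np.abs(bell) ** 2
print(dict(zip(["00", "01", "10", "11"], probs)))
# Only 00 and 11 can be observed: measuring one qubit as 0 (or 1)
# guarantees the same result on its entangled partner
```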
So, for one qubit we have a probability distribution over two states. For two qubits, the probability distribution is over four states. For three qubits, it is eight, and so on. In general, for N qubits, the probability distribution is over 2^N states.
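The 2^N scaling is easy to verify numerically; in this illustrative NumPy sketch, an N-qubit state is just a vector of 2^N amplitudes whose squared magnitudes sum to 1:

```python
import numpy as np

# N qubits -> a probability distribution over 2**N basis states
for n in (1, 2, 3, 10):
    amps = np.full(2**n, 1 / np.sqrt(2**n))  # uniform superposition
    probs = np.abs(amps) ** 2
    print(f"{n} qubit(s): {probs.size} states, total probability {probs.sum():.1f}")
```

This doubling with every added qubit is exactly why classical simulation of quantum systems becomes infeasible so quickly.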
Interference
Interference is the phenomenon of quantum waves interacting with each other. In quantum computing, interference has the potential to cause the wave functions of the qubits either to reinforce or to cancel each other.
A wave function is the basic mathematical description of everything in quantum physics. To measure the entangled qubits, we add the individual wave functions of each qubit, producing a single wave function of a single quantum state. The adding together of the individual wave functions gives us the interference pattern.
To increase the probability of the correct answer, we use constructive interference (where two wave crests add up, producing a larger wave). To decrease the probability of the incorrect answer, we use destructive interference (where two waves cancel each other out).
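Both effects appear in the simplest possible circuit: applying the Hadamard gate twice. This NumPy sketch (our illustration, not a method from the article) shows the |0⟩ amplitudes adding constructively while the |1⟩ amplitudes cancel destructively:

```python
import numpy as np

H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)  # Hadamard gate
zero = np.array([1.0, 0.0])                           # the state |0>

plus = H @ zero   # equal superposition of |0> and |1>
back = H @ plus   # interfere the amplitudes by applying H again

# The two paths to |0> add constructively; the two paths to |1>
# have opposite signs and cancel exactly
print(np.round(back, 10))  # ~ [1, 0]: we recover |0> with certainty
```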
Quantum Algorithms
The primary motivation behind the invention of quantum algorithms is that there are certain intractable problems that cannot be solved on a classical computer. We need more sophisticated algorithms to deal with them.
There are many quantum algorithms, such as the Deutsch–Jozsa algorithm, the Bernstein–Vazirani algorithm, Simon’s algorithm, the quantum phase estimation algorithm, etc., which we hope to discuss in future articles. Here, we look at one of the most widely appreciated quantum algorithms, Shor’s algorithm (developed by the American mathematician Peter Shor in 1994).
Prime factorization of very large numbers is an intractable problem because the search space of possible factors is very large, which is why prime factorization is used as the basis of present-day encryption in cybersecurity.
Geek note: To put this in the technical language of computational complexity theory, familiar to many classical computer scientists: prime factorization is in the NP class of problems, although we do not know for certain whether it is NP-hard. There is at present no proof that it cannot be solved in polynomial time on a classical computer, although this is widely assumed to be impossible.
Modern RSA encryption rests on the assumption that it is near-impossible to break the cryptography by factoring very large numbers in practical time.
But Shor’s algorithm theoretically cracks large prime factorization in polynomial time, i.e. much faster than the classical algorithms that take exponential time.
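For contrast, here is a sketch of the simplest classical approach, trial division (an illustrative baseline, not a state-of-the-art factoring algorithm): its loop runs up to √n, and since n has roughly 10^d values for d digits, the work grows exponentially with the number of digits.

```python
def trial_division(n):
    """Classical factoring by trial division. The loop runs up to sqrt(n),
    which is exponential in the number of digits of n -- fine for small
    inputs, hopeless for the 600+ digit numbers used in RSA."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)  # whatever remains is prime
    return factors

print(trial_division(15))             # [3, 5]
print(trial_division(3 * 5 * 7 * 11)) # [3, 5, 7, 11]
```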
Geek note: Shor’s algorithm depends on the efficiency of the quantum Fourier transform, a linear transformation on qubits that is the quantum analogue of the discrete Fourier transform and will eventually be executable on quantum computers.
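As a minimal sketch of that analogue (using NumPy, our choice for illustration), the QFT on n qubits is the normalized DFT matrix acting on the 2^n amplitudes; like every quantum gate, it must be unitary:

```python
import numpy as np

def qft_matrix(n_qubits):
    """Matrix of the quantum Fourier transform on n_qubits: the DFT
    matrix on 2**n_qubits amplitudes, normalized to be unitary."""
    N = 2 ** n_qubits
    omega = np.exp(2j * np.pi / N)       # primitive N-th root of unity
    j, k = np.meshgrid(np.arange(N), np.arange(N))
    return omega ** (j * k) / np.sqrt(N)

F = qft_matrix(2)
# A valid quantum operation must be unitary: F^dagger F = I
print(np.allclose(F.conj().T @ F, np.eye(4)))  # True
```

On a single qubit the QFT reduces to the Hadamard gate, which is one way to see it as a native quantum operation rather than an exotic add-on.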
The only problem is that we do not yet have a quantum computer that can run Shor’s algorithm robustly, so standard cyber encryption is secure for now.
Other Potential Applications of Quantum Computing
A wide range of applications – from finance to biotech to logistics – is projected for quantum computing once it becomes commercially viable.
In their informative Harvard Business Review article on the potential near-future applications of quantum computing, Francesco Bova, Avi Goldfarb, and Roger Melko remind us that “a key motivation in Richard Feynman’s initial proposal to build a quantum computer” was the need for an efficient way to simulate quantum mechanics. A quantum computer, being itself a quantum system, is ideally suited for this purpose.
In chemical and biological engineering, which involves the observation and manipulation of subatomic particles, quantum simulation can be highly useful. As Bova et al. argue, “As molecules get more complex, the number of possible configurations grows exponentially.” Complex chemical reactions can be simulated on a quantum computer to understand the behavior of exotic materials at low temperatures, the nature of superconductivity, and so on.
Today, the most powerful supercomputers can simulate no more than 30 particles of a quantum system. It is hoped that future quantum computers will be able to simulate large quantum systems.
It is also hoped that quantum computing will help the finance industry develop better financial modelling. Corporate financial modelling currently relies on Monte Carlo simulations, which are highly time-consuming. Quantum technology is expected to outperform its classical counterpart by improving the quality of the models and significantly reducing the time needed to develop them.
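To make the classical baseline concrete, here is an illustrative Monte Carlo run pricing a European call option (all parameter values are invented for the example). The estimate converges only as 1/√n with the number of simulated paths, which is precisely why such simulations are so time-consuming at production scale:

```python
import numpy as np

# Illustrative Monte Carlo pricing of a European call option under
# geometric Brownian motion (all parameters made up for this example)
rng = np.random.default_rng(42)
S0, K, r, sigma, T = 100.0, 105.0, 0.05, 0.2, 1.0
n_paths = 100_000

# Simulate terminal stock prices, then discount the average payoff
Z = rng.standard_normal(n_paths)
ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)
payoff = np.maximum(ST - K, 0.0)
price = np.exp(-r * T) * payoff.mean()

print(f"estimated option price: {price:.2f}")
# Halving the error requires 4x the paths -- the 1/sqrt(n) bottleneck
```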
Increasingly part of mainstream discourse, “quantum” has become synonymous with the future. Today, the need for better cybersecurity, the advancement of science (predicting climate patterns accurately, better drug design and development), and global competition are all driving the “paradigm shift” towards quantum computing.
IBM, Google, Apple, and other technology giants, as well as national governments and armed forces, are all major players in the field. Amazon offers cloud-based quantum processing through Amazon Web Services. New quantum startups are emerging by the day and are well funded. “Quantum computing startups have increased from a handful in 2013 to nearly 200 in 2020,” reports ZD Net. “US-based startup ColdQuanta, which is building a quantum processor based on cold atoms, launched a 100-qubit processor this year, which it hopes to upgrade to 1,000 qubits in the next three years.”
A few companies are now promising to launch a million qubits in the very near future. Unfortunately, many such boasts, though impressive, remain at the level of hype, because what really counts in computing is not the “q-volume” but logical qubits, i.e. qubits that behave in a well-ordered, controlled manner within quantum circuits and retain their coherence long enough to be useful. There are currently no logical qubits in industry, but there soon will be, if the optimists are to be believed.
This is not to suggest that classical computers have exhausted their usefulness. Importantly, quantum computers are not in conflict with classical computers. Already providers like Amazon Web Services, Microsoft Azure, and IBM offer cloud access to quantum technology. For the foreseeable future at least, classical computers are expected to complement quantum computing. The first quantum devices will be hybrids and will probably remain so for a while.
1. Veena Pureswaran, “The Quantum Decade” (IBM Report, June 11, 2021).[⇑]
2. Francesco Bova, Avi Goldfarb, and Roger Melko, “Quantum Computing Is Coming. What Can It Do?” (Harvard Business Review, July 16, 2021).[⇑]
3. Daphne Leprince-Ringuet, “Quantum computing is at an early stage. But investors are already getting excited” (ZD Net, September 15, 2021).[⇑]
4. Watch this helpful video by Domain of Science, sponsored by IBM’s Qiskit. (Acknowledgement: our infographics in this article were partly guided by this video).[⇑]