With IBM boasting of having launched the first commercial quantum computer in 2019, and with Google’s declared demonstration of quantum supremacy in the same year, it seems that the quantum future is well under way. Or is it all hype and wishful thinking? It is important to understand precisely to what extent it is and to what extent it is not.
It has taken quantum scientists and engineers four decades to realize, if only partially, the vision of Richard Feynman, who proposed the idea of a quantum computer in the early 1980s. Quantum computing (QC) won’t be just another tech fad, nor the quantum computer merely another type of gadget. Quantum has the potential to address some of society’s biggest problems at the most fundamental level — provided hype does not ruin its prospects. That the industry “has a hype problem” has been widely suggested by experts such as Scott Aaronson and publications such as the MIT Technology Review.
This post is an attempt to get at the truth of the matter, without diminishing real quantum accomplishments. We look at examples of both positive (enthusiastic) and negative (skeptical) hype about quantum, trying to discover a healthy balance.
“Quantum Computers” Are Hype
First and foremost, we are still very far away from having a true quantum computer – or anything similar to those imagined full-stack quantum machines that are futuristically predicted to transform human civilization. The “quantum computers” that QC companies report having built are not, in fact, “computers”.
They are experimental, rudimentary quantum processing devices that are still in the early stages of striving to control the fundamental units of quantum information, qubits, named by analogy with the “bits” of classical computing. (See our earlier introduction to the fundamentals of quantum computing for beginners.)
Therefore, the term “quantum computer,” when applied to any device that exists today, is technically hype. Of course, quantum professionals know exactly what these devices are, but the general public can be expected to be misled by the term.
We can sympathize with the hope that the results of these promising experiments will evolve into what are recognizably computers. But even when that happens, they will still be far away from true quantum computers. We will first have “hybrid” computers instead: classical machines with quantum components. Even that will be a major breakthrough, and we are not there yet.
The “Quantum Decade” Is the “NISQ” Era
Despite being theoretically possible, quantum computers are appallingly hard to build due to quantum decoherence – the quick collapse of information-containing quantum states due to environmental noise. Unlike classical bits, qubits are highly fragile, being sensitive to environmental disturbances in the form of temperature changes, cosmic rays, electromagnetic radiation, and so forth. These disturbances are collectively called “noise.” Despite some advances in the methods of quantum error-correction, decoherence persists.
IBM has optimistically declared the 2020s the Quantum Decade. More realistically, it ought to be declared the quantum transition decade. The industry understands this and already labels itself, a tad esoterically, “NISQ.”
NISQ – a term of art used in the industry – was coined by the US theoretical physicist John Preskill in 2018 to refer to the “noisy intermediate-scale quantum” computers of the near future. In the “NISQ” era, the leading quantum processors are not yet advanced enough to achieve quantum advantage, i.e. the ability of a quantum machine to solve a complex computational problem that no classical supercomputer can solve within a realistic timeline. A NISQ computer is not fully error-corrected.
The first generations of quantum algorithms, including some of the best known, such as Shor’s integer factorization algorithm and Grover’s algorithm for searching unstructured databases and unordered lists, were rather “theoretical” and hardware-agnostic. NISQ-era algorithms are more hardware-specific and are designed to be inherently noise-resilient and less error-prone. But algorithms alone do not guarantee success: hardware limitations may play spoilsport.
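To make Grover’s celebrated quadratic speedup concrete, here is a toy classical simulation of the algorithm’s amplitude-amplification loop. The function name and parameters are our own illustrative choices, not part of any quantum SDK, and a state-vector simulation on a laptop obviously confers no quantum advantage; it only shows the mechanics.

```python
import numpy as np

def grover_search(n_qubits, marked):
    """Toy classical simulation of Grover's algorithm: amplify the
    amplitude of the marked basis state using ~(pi/4)*sqrt(N)
    iterations of oracle + diffusion over N = 2**n_qubits states."""
    N = 2 ** n_qubits
    state = np.full(N, 1 / np.sqrt(N))        # uniform superposition
    iterations = int(round(np.pi / 4 * np.sqrt(N)))
    for _ in range(iterations):
        state[marked] *= -1                   # oracle: flip the marked amplitude
        state = 2 * state.mean() - state      # diffusion: inversion about the mean
    return int(np.argmax(state ** 2)), iterations

result, iters = grover_search(10, marked=700)
print(result, iters)  # finds item 700 in ~25 queries vs. ~512 expected classical checks
```

The point of the sketch is the query count: for N = 1024 items, roughly 25 oracle calls suffice, against an expected ~512 for classical linear search.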
In his insightful piece, “Quantum computing has a hype problem”, in the MIT Technology Review, prominent theoretical physicist Sankar Das Sarma welcomes experiments with “noisy” qubits as an excellent research idea but argues that NISQ computers offer no computational advantage over classical computers. “I have asked researchers involved in various startups how NISQ would optimize any hard task involving real-world applications, and I interpret their convoluted answers as basically saying that since we do not quite understand how classical machine learning and AI really work, it is possible that NISQ could do this even faster. Maybe, but this is hoping for the best, not technology,” writes Sarma.
As Sarma observes, this has not reined in investments. While the commercial potential of quantum computing is not in dispute, the sheer pace at which corporations and startups have been plowing money into QC has come as a surprise to quantum researchers in the academic community.
All things considered, a NISQ device is unlikely to achieve error-corrected quantum computation. While numerous NISQ-oriented algorithms are being published, they are yet to outperform classical computation.
The best bet for quantum computing application in the NISQ era seems to be quantum simulation. To perform quantum simulation, a NISQ device will need a minimum of 100-200 qubits with much better coherence times than are achievable today. Even so, the results of such simulations will first be scientifically interesting before being commercially viable.
“NISQ devices,” says John Preskill, the originator of the term, “will be useful tools for exploring many-body quantum physics, and may have other useful applications, but the 100-qubit quantum computer will not change the world right away — we should regard it as a significant step toward the more powerful quantum technologies of the future.”
Quantum Hardware Today
The science for quantum computing is not yet fully worked out, let alone the engineering. Much of the future depends on current and future experimentation. Meanwhile, we are witnessing a race in the industry among several developments based on different hardware approaches.
Contenders for the physical basis include superconductors (pursued by Google Quantum AI, as seen in Google’s 2019 “quantum supremacy” demo, and by AWS, IBM, Intel, IQM, and Rigetti); trapped ions (pioneered by IonQ and Quantinuum); photons (harnessed by ORCA, Photonic Inc, PsiQuantum, Quantropi); “cold” or neutral atoms (Atom Computing, ColdQuanta, Pasqal, QuEra); silicon dots (Equal1 Laboratories), and others. Microsoft’s “topological qubits” have likewise raised many hopes.
We are naming only a few companies in each category in the interests of space. All these trailblazers deserve our admiration for advancing quantum research. But their qubits remain affected by problems of decoherence, connectivity, and scaling.
In the words of Sankar Das Sarma, “quantum computing is indeed one of the most important developments not only in physics, but in all of science.” It is saddled with a multitude of interdisciplinary expectations. But this diversity of hardware developments shows that quantum technology has not yet properly matured.
Geek note: This moment in history recalls the situation in the early stages of classical computing, prior to the 1960s with their IC (integrated circuit) revolution. In the 1960s, transistors became the universal basis of digital computing, having proven themselves vastly superior to vacuum tubes and other earlier switching technologies.
The universal winner in quantum computing – the quantum equivalent of the silicon microchip – is yet to be discovered. And since cutting-edge quantum research is capital-intensive, researchers have an incentive to paint an idyllic picture to investors in a bid for funding.
Meanwhile, as leading quantum computer scientist Scott Aaronson has suggested, it is far from clear if QC is capable of realizing within a reasonable timeframe all the grand hopes that are pinned upon it, such as revolutionizing artificial intelligence and machine learning, reversing climate change, accelerating drug discovery and even curing cancer. As Aaronson notes, so much hype is being generated by industry experts themselves that it is increasingly hard to distinguish between the claims of science and those made by vested interests.
Ultimately, the proof of the quantum pudding will be in the promised commercialization of quantum.
Combining Qubits: Quantum Metrics and Benchmarks
In 2020, IBM announced a “road map” for building quantum computers and promised a 1000-qubit quantum computer by 2023. Although IBM carries plenty of authority, talk of high qubit counts has now been recognized in the industry as a form of “quipe” (quantum hype).
The first thing to note here is that qubit counts refer to physical qubits that a quantum system can create. Physical qubits are necessary, but they do not mean very much in and of themselves. Theoretically, it is possible to combine many qubits into one fault-tolerant logical qubit that is stable and is subject to algorithmic manipulation. Currently, we do not have a single fault-tolerant logical qubit, although some hardware solutions promise to get us there.
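As a back-of-the-envelope illustration of why physical qubit counts alone mean little, consider the often-quoted overhead of the surface code, a leading error-correction scheme. The formula below is a rough textbook approximation (and the function name is our own), not a vendor specification; real overheads depend heavily on physical error rates.

```python
def physical_qubits_per_logical(distance):
    """Rough surface-code overhead: a distance-d patch uses about
    2*d**2 physical qubits (d**2 data qubits plus roughly as many
    measurement qubits). Ballpark figure only."""
    return 2 * distance ** 2

# Code distances in the high teens to twenties are often quoted as
# necessary for useful error suppression at ~1e-3 physical error
# rates, i.e. hundreds to thousands of physical qubits per single
# fault-tolerant logical qubit.
for d in (3, 17, 27):
    print(d, physical_qubits_per_logical(d))
```

On these rough numbers, even a 1000-physical-qubit machine would yield only a handful of logical qubits, which is why headline qubit counts invite quipe.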
Seeing that merely having a high qubit count no longer counts, IBM has come up with its own, improved metric, Quantum Volume (QV), which it hopes will become the industry standard. Explained simplistically, QV is a way of assessing both the performance ability and error rates of a “modest-size” NISQ computer (of about 50 qubits).
QV appeals because it is a single-number metric that can be measured using a concrete protocol, and because it is consistent with Moore’s law.
Geek note: Moore’s law is the observation made in 1965 by Intel cofounder and former chairman Gordon Moore that the number of transistors in a dense integrated circuit (IC) doubles about every two years, resulting in a comparable improvement in performance.
QV does not equal the qubit count, which may confuse outsiders. It does not have one standard definition. IBM itself amended its original 2017 formula in 2019, while Microsoft and Google have referred to “quantum volume” without having adopted IBM’s metric. Still, QV is an informative metric for visualizing IBM’s progress in quantum hardware (see the relevant figure).
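For the curious, here is a simplified reading of IBM’s published protocol, reduced to its arithmetic core (the helper function is our own illustration, not an API from Qiskit or any other library):

```python
def quantum_volume(max_passing_width):
    """Simplified reading of IBM's QV metric: log2(QV) is the largest
    width n for which the device can run random "square" circuits
    (n qubits, depth n) whose measured heavy-output probability
    exceeds 2/3 with high statistical confidence."""
    return 2 ** max_passing_width

# e.g. a device that passes the heavy-output test up to 6-qubit,
# depth-6 circuits has QV = 2**6 = 64
print(quantum_volume(6))
```

This is why QV grows exponentially as hardware improves linearly in circuit width and depth, which makes its doubling cadence naturally comparable to Moore’s law.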
How Does Quipe (Quantum Hype) Arise?
Quipe has a curious way of arising. Much of it seems to come from bloggers, journalists, and other enthusiasts, who are prone to misrepresenting quantum news. The pioneers of quantum engineering are not solely to blame for it, but their own marketing also contributes an element of hype.
On January 8, 2019, at the Consumer Electronics Show (CES) in Las Vegas, IBM unveiled its new Q System One™. The headline of IBM’s official press release read: “IBM Unveils World’s First Integrated Quantum Computing System for Commercial Use.” Note the somewhat guarded if vague scope of “for commercial use.” What IBM meant was that, using this device, businesses can finally get started exploring quantum computing for possible future commercial applications.
The TechCrunch headline of the same date read: “IBM unveils its first commercial quantum computer.” Thus ariseth quipe. Exploring prototypes and proofs of concept is important, but it is not the mainstream definition of “commercial use,” and IBM’s Q System One does not fit the image suggested by the phrase “commercial quantum computer.”
One can see examples of journalistic hype surrounding Microsoft’s recent breakthrough in topological quantum computing. Many hope today that Microsoft’s topological qubits will allow for natural error correction, solving the problem of decoherence. The topological approach to quantum computing is about encoding information “globally” into a large structure, so that local perturbations do not matter as much, as they fail to degrade the topological stability of the whole. Although Microsoft has not yet produced any topological qubits, it has made important advances in proving some of the science that will eventually enable them.
Microsoft was very careful and specific in its announcement of March 14, 2022 regarding its recent breakthrough in advancing the topological approach. Microsoft has not claimed that it has created topological qubits, but merely that it has “proven some of the science” that may make such qubits possible in the long run. Nonetheless, the claim propagated in the press (e.g. in a TechRadar post of March 21) – that “Microsoft has developed a whole new kind of qubit” – is a hype interpretation of the same development.
Geek note: The topological approach to quantum computing was first proposed in the late 1990s by Russian-American physicist Alexei Kitaev. It uses special entities, “quasiparticles” called anyons, whose paths in spacetime (called “worldlines”) intertwine into “braids” in a three-dimensional spacetime (2d plus time). These braids can serve as topological logic gates, and local decoherence does not affect a braid’s overall shape. In a recent breakthrough, Microsoft created a topological heterostructure – a “sandwich” of indium arsenide and aluminum – that has enabled it to produce evidence of the (previously theorized) Majorana zero modes, or MZMs. These can be used to realize non-abelian braiding statistics, which would enable decoherence-resistant topological quantum computation.
Companies have an incentive to overstate their promise, so as to attract capital. The corporate giants have the additional incentive to hype, so as to attract government support. There is the tendency in the tech press to overgeneralize a handful of quantum success stories. Add to that the overexuberant claims of some research organizations, cash-strapped startups, and governments driven by their ambitions of establishing technological sovereignty.
Sabine Hossenfelder, a theoretical physicist at the Frankfurt Institute for Advanced Studies who is also known for her popular quantum channel on YouTube, cautions that investors will be disappointed if the alleged potential of QC does not materialize within the expected timeline of five or so years; when they eventually pull out, the quantum bubble may rapidly deflate.
The British encryption startup Arqit Quantum Inc. is a case in point. In 2021, the company made headlines with its promise of building an encryption system that would protect present-day cybercommunications from the disruptive potential of next-generation computers. However, according to a recent investigation carried out by the Wall Street Journal, Arqit exaggerated its achievements in an attempt to secure large-scale funding: the readiness of its in-house encryption technology is largely suspect, and the company’s revenue and profit estimates are allegedly far-fetched. Regardless of its low technology readiness level and questionable customer base, Arqit had been planning to sell its product to the defense industry and the corporate sector.
It is important, especially where national and major vested interests are involved, to make a clear distinction between tentative proofs of concept and actual implementation and commercialization. According to cryptographer and entrepreneur Steve Weis, Quantum Key Distribution (QKD)—the technology purportedly used by Arqit for developing its encryption system—currently has limited real-world applications, notwithstanding its potential usefulness in the long term. At the moment, QKD cannot be integrated into modern networking technology, and would require a full-scale physical restructuring of the internet.
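For readers curious what QKD actually involves, here is a toy simulation of the key-sifting step of BB84, the canonical QKD protocol. It models no eavesdropper and no channel noise, and every name in it is our own illustrative choice; real QKD additionally requires eavesdropper detection, error reconciliation, and privacy amplification over a genuine quantum channel.

```python
import random

def bb84_sift(n_bits, seed=0):
    """Toy BB84 sketch: Alice sends random bits encoded in randomly
    chosen bases (X or Z); Bob measures in his own random bases; both
    keep only the positions where their bases happened to match
    (about half of them), forming the shared sifted key."""
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [rng.choice("XZ") for _ in range(n_bits)]
    bob_bases   = [rng.choice("XZ") for _ in range(n_bits)]
    # When bases match, Bob reads Alice's bit exactly; when they
    # differ, his outcome would be random, so those positions are
    # publicly discarded without revealing the kept bits.
    return [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases)
            if a == b]

key = bb84_sift(64)
print(len(key))  # roughly half of 64 bits survive sifting
```

Even in this idealized form, every transmitted bit needs a dedicated quantum channel use, which hints at why integrating QKD into today’s networking stack is so demanding.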
Overhyped claims about QC, coupled with the lack of matching results, have led to a public dread of a possible quantum winter. Given the current proliferation of q-based brands and domain names in the quantum space, it is timely to add “quinter” to our coinage of “quipe”!
However, we are far from pessimistic and would not be writing this blog if we did not have ample faith in quantum computing, whose promise is based on solid science.
The real issue is when this promise will start coming to fruition. Some of it – especially in the field of quantum cryptography and cybersecurity – is just around the corner. Other applications will take time to prove themselves, but it’s only a matter of which decade: this (as we may hope and as IBM and other optimists would have us believe) or the next. The jury is still out.
Why Quipe Is Dangerous
One may argue that a certain amount of hype is inevitable in commerce and perhaps even healthy. Victor Galitski, quantum physics professor at the Joint Quantum Institute, University of Maryland, lucidly explains the negative consequences specific to quantum hype.
According to Galitski, hype incentivizes “people with no relevant background” to try and cash in on the “bubble.” Pseudo-quantum startups and entities peddle far-fetched claims to entice investors looking to “spice up” their portfolios with the Q-word. Sustained by unwitting and overoptimistic investors, fraudulent quantum ventures headed by charlatans brazenly propagate “junk science,” jeopardizing the legitimacy of a field that demands, in Galitski’s words, “years of education, work, and dedication.”
Another potential danger of quipe, Galitski observes, is the emergence of “quantum Ponzi schemes” for sustaining QC startups whose purported quantum capabilities resist straightforward verification (given the inherent complexity of quantum science), thus leaving ample space for easy fraud.
Where issues like data security are involved (e.g. in banks), quipe may be used to bait customers into signing up for phony quantum services. “There are of course many more ‘opportunities’ to fool investors and the public in such a complicated space. But the Ponzi schemes cannot continue forever, and sooner or later they tend to crash, wiping millions of dollars of investors’ money,” adds Galitski.
In the final analysis, Galitski anticipates the negative impact of hype on the science at large. “The manifestly false promises of quantum computing,” Galitski contends, “routinely made by unqualified individuals, and the high potential for fraud in the less-than-transparent business schemes of the QC companies operating in a fake-it-until-you-make-it market all but guarantee an eventual collapse.” Such a collapse, Galitski fears, would impair the reputation of scientists and of science.
Galitski’s take on quantum computing is to be reckoned with. The locomotives of quantum progress are the scientists. This physicist’s position captures a true scientist’s indignation – from within the field – at pseudoscientific quantum razzmatazz.
Negative Hype and the Recent Attack on IonQ
Having seen the dangers of hype, we must also understand that pessimism and skepticism suffer from a false sensationalism of their own and can also be manipulated unethically for selfish ends. Negative hype can be damaging to honest businesses and even to industry leaders.
In May 2022, Scorpion Capital, a short-selling activist investor, published a scathing report on IonQ, one of the dominant players in the quantum computing field. In a stinging tactic as if prefigured by Scorpion Capital’s brand name and logo, the 183-page “report” unjustly belittles and mocks IonQ’s achievements in quantum computing, brazenly calling the company a “hoax” and its qubit systems “toys.”
In response, Russ Fein, venture investor with a deep interest in quantum computing, mounted a strong defense of IonQ in his corrective published on Medium, rightly calling the Scorpion Capital report “poorly constructed” and “a big cut-and-paste exercise.”
Fein makes it clear that the author of the report runs a short hedge fund, an entity that borrows and sells stocks in the hope of buying them back later at a drastically lower price. “It is in the author’s immediate self-interest to cause a panic that leads to a steep and fast stock price decline,” points out Fein. Notably, the report passes over IonQ’s many impressive achievements.
Fein also reminds us that companies like Amazon, Google, NEA, and Breakthrough Ventures have all found IonQ good enough to invest in. Microsoft, Amazon, and Google make IonQ’s machines available via their quantum cloud services, and multibillion-dollar corporations such as Accenture, Goldman Sachs, and Hyundai are among IonQ’s customers.
Another group of prominent quantum influencers – David Shaw, Doug Finke, and André M. König – soon published an even more detailed rebuttal of the Scorpion Capital report, defending not only IonQ but quantum computing as a whole. Against the backdrop of Scorpion Capital’s trumped-up charge, the authors maintain that fresh progress in quantum computing, with multiple demonstrations of extra-classical simulations and tentative logical qubits, cannot be wishfully disregarded.
IonQ employs trapped ions as the physical basis for its quantum development. As the authors rightly conclude, IonQ has done a praiseworthy job on this front, having successfully operated its device with up to about 21 qubits, albeit falling short so far of its stated target of 32 qubits. “The IonQ 11Q device deservedly debuted in Nature in 2019. It has continued to perform well in independent benchmarks,” the authors concur.
Negative hype notwithstanding, IonQ is not any kind of “hoax,” but a leader in quantum advancement.
Room for Quantum Optimism
While challenges persist, the majority expert view is that the quantum future is not a pipedream. Most serious analysts, while warning against hype, believe that QC will be realized in the foreseeable future. Some of them are investors in QC, putting their capital where their mouths are. Scott Aaronson is more cautious, on record as having said he is “glad that others are investing” in quantum companies.
French quantum technology expert Olivier Ezratty reminds us that the invention of transistors and lasers in the mid-20th century was the fruit of quantum physics research. Quantum theory underlies much of what constitutes the present digital era. In Ezratty’s opinion, it is important to educate the public about the progress taking place in quantum. Critics must take pains to understand the technical challenges of QC implementation.
“Useful quantum computing is probably bound to be at best a long-lasting quest, spanning several additional decades. Patience will be a virtue for all stakeholders, particularly governments and investors,” Ezratty adds.
We agree with this assessment and are both optimistic and patient. Advances in quantum technology are accelerating, and quantum computing is constantly in the news, tirelessly boosting our adrenaline levels. All signs point in a hopeful direction.
Given the diversity of quantum, it would be useful to remedy the present scattered nature of international research by encouraging global collaboration. Equally, addressing moral concerns and making quantum ethical from the start is a must for this ultramodern industry.
Quantum computational advantage has been demonstrated by Google for their superconductor approach and, most recently, on June 1, 2022, by the Canadian quantum company Xanadu for their photonic approach on their Borealis quantum computer, which they have made available on the cloud. These demonstrations do not generalize yet: they are limited to ad hoc computational tasks with no relevance to commerce, but they show real progress.
Some quantum computing providers already promise a small but definite “commercial advantage.” This may indeed be true in areas where the development and refining of use cases for quantum applications has independent value. However, this is not the kind of “commercial advantage” that we ultimately expect from QC.
While companies periodically make claims of quantum advantage, we are certain that when the commercial advantage of quantum computing is finally demonstrated in a bulletproof manner, the news will be found on the front pages of the major newspapers, impossible to miss. Until such time, we must remain hopeful and optimistic, but rational and critical of quipe.
Sankar Das Sarma, “Quantum computing has a hype problem” (MIT Technology Review, March 28, 2022). [⇑]
Scott Aaronson, “Quantum computing for policymakers and philosopher-novelists” (Shtetl-Optimized, scottaaronson.com, June 6, 2018). [⇑]
John Preskill, “Quantum computing in the NISQ era and beyond” (arXiv:1801.00862v3 [quant-ph], 2018). [⇑]
Scott Aaronson, “QC ethics and hype: the call is coming from inside the house” (Shtetl-Optimized, scottaaronson.com, March 20, 2021). [⇑]
Adrian Cho, “IBM promises 1000 qubit quantum computer—a milestone—by 2023” (science.org, September 15, 2020). [⇑]
Chetan Nayak, “Microsoft has demonstrated the underlying physics required to create a new kind of qubit” (Microsoft Research Blog, March 14, 2022). [⇑]
Daphne Leprince-Ringuet, “Quantum computing is just getting going. But the hype could bring everything crashing down” (ZDNet, November 1, 2021). [⇑]
Byron Tau, Dustin Volz, and Eliot Brown, “British Encryption Startup Arqit Overstates Its Prospects, Former Staff and Others Say” (The Wall Street Journal, April 18, 2022). [⇑]
Victor Galitski, “Quantum computing hype is bad for science” (LinkedIn, June 16, 2021). [⇑]
Russ Fein, “(Quantum) winter is coming… or is it?” (Medium, May 4, 2022). [⇑]
David Shaw, Doug Finke, and André M. König, “Weathering the first quantum short” (Quantum Computing Report, May 7, 2022). [⇑]
Olivier Ezratty, “Mitigating the Quantum Hype” (arXiv:2202.01925v3 [physics.soc-ph], January 23, 2022). [⇑]
Madsen, L.S., Laudenbach, F., Askarani, M.F., et al., “Quantum computational advantage with a programmable photonic processor” (Nature 606, 75–81, 2022). [⇑]