The History of Quantum Computing: From Theory to Systems

Insider Brief

While everyone is scrolling through the AI hype, a deeper revolution is already underway. There is a line from Lewis Strauss that has always stuck: “Amateurs seek the sun. Get eaten. Power stays hidden in the shadows.” Quantum technology is exactly that – hidden in the shadows, largely ignored by the general public, slowly becoming one of the most consequential fields in modern science.

For most of the past century, quantum sounded like science fiction. Pure theory, abstract mathematics, the domain of physicists arguing in lecture halls. But that’s not the case anymore.

Understanding the history of quantum computing helps clarify where the field actually stands today. The path from theoretical physics to working hardware has taken decades and passed through several distinct phases, each one building on unresolved problems from the last. This article traces that path – from the early theory to the engineering work happening right now.

The theoretical groundwork for quantum computing emerged from early 20th-century physics. Max Planck’s 1900 proposal that energy is quantized opened the field of quantum mechanics. Albert Einstein, Niels Bohr, Erwin Schrödinger and Werner Heisenberg each extended the theoretical framework, establishing the principles of superposition, wave functions, the uncertainty principle and quantized energy states that quantum computing would later depend on.

These developments did not immediately suggest computational applications. That connection came later, from physicists thinking about the limits of classical simulation. The idea of computation based on quantum systems took shape in the early 1980s.

In May 1981, Richard Feynman argued at a conference that simulating quantum systems on classical computers becomes exponentially inefficient as system size grows. He proposed that a machine governed by quantum mechanics could simulate quantum systems without that overhead. The paper was published in 1982.

In 1985, David Deutsch at Oxford formalized the concept. His paper in the Proceedings of the Royal Society described a universal quantum computer and introduced quantum parallelism – the ability of a quantum system to process multiple states simultaneously.

The field gained urgency in 1994, when Peter Shor developed a quantum algorithm capable of factoring large integers efficiently. Because integer factorization underlies widely used encryption schemes, including RSA, the result established quantum computing as a cryptographic concern as well as a computational one. In 1996, Lov Grover introduced a quantum search algorithm offering a quadratic speedup over classical search. Both algorithms demonstrated that quantum systems could outperform classical ones on specific, well-defined problems.

Experimental work began in the late 1990s using nuclear magnetic resonance (NMR) systems, which allowed researchers to manipulate small numbers of qubits using molecular spin states. In 1998, researchers at Oxford – Jonathan Jones and Michele Mosca – ran the first experimental demonstration of a quantum algorithm, using a 2-qubit NMR system to solve Deutsch’s problem. That same year, Isaac Chuang, Neil Gershenfeld and Mark Kubinec demonstrated Grover’s search algorithm on a 2-qubit NMR system using chloroform molecules at IBM Almaden and UC Berkeley.
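To make Grover’s quadratic speedup concrete, here is a minimal, hedged sketch: a plain NumPy statevector simulation of Grover search over N = 8 items (3 qubits). It is an illustration of the algorithm’s structure, not a reconstruction of the 1998 NMR experiments; the qubit count and marked index are arbitrary choices.

```python
# Minimal statevector simulation of Grover's search (illustrative only).
# Searches N = 8 items (3 qubits); ~pi/4 * sqrt(N) iterations suffice,
# versus ~N/2 expected queries for classical random search.
import numpy as np

n_qubits = 3                          # illustrative problem size
N = 2 ** n_qubits
marked = 5                            # arbitrary "winner" index

state = np.full(N, 1 / np.sqrt(N))    # uniform superposition (Hadamards)

oracle = np.eye(N)
oracle[marked, marked] = -1           # phase-flip the marked item

diffusion = 2 * np.full((N, N), 1 / N) - np.eye(N)   # inversion about the mean

iterations = int(round(np.pi / 4 * np.sqrt(N)))      # = 2 for N = 8
for _ in range(iterations):
    state = diffusion @ (oracle @ state)

probs = np.abs(state) ** 2
print(f"P(marked) after {iterations} iterations: {probs[marked]:.3f}")
# Prints roughly 0.945 - the marked item dominates after ~sqrt(N) steps.
```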
Multiple hardware approaches developed in parallel during this period: trapped ions, superconducting qubits, neutral atoms, and photonic systems. Each carried different trade-offs in coherence time, gate fidelity, and scalability. None reached a scale suitable for practical computation.

In February 2007, D-Wave publicly demonstrated a 16-qubit quantum annealing prototype at the Computer History Museum in California. Quantum annealing is a specialized approach targeting optimization problems rather than general-purpose computation, and D-Wave’s systems have remained a subject of ongoing debate regarding their classification and computational advantage relative to classical hardware.

IBM’s decision to open cloud access to quantum hardware in 2016 marked a significant change in how researchers and developers could interact with quantum systems. The IBM Quantum Experience allowed users to run programs on real quantum hardware via a browser interface, broadening participation beyond well-funded research labs.

Qubit counts increased gradually through this period, but error rates remained high. John Preskill of the California Institute of Technology coined the term NISQ (Noisy Intermediate-Scale Quantum) in 2018 to describe quantum systems that can perform certain tasks beyond classical reach but remain too noisy for reliable general computation.

In October 2019, Google reported that its 53-qubit Sycamore processor completed a specific sampling task in 200 seconds, a task it estimated would take the world’s fastest supercomputer 10,000 years. IBM disputed the classical estimate within weeks, and in 2022 researchers at the Chinese Academy of Sciences showed the same task could be completed in hours using GPU-accelerated classical algorithms.

Since then, the field has remained within the NISQ regime, but the focus has changed. Rather than increasing qubit counts alone, research has moved toward improving system quality, particularly error rates, logical qubits, and early forms of error correction. There have been several key developments during this period, though the list below is not exhaustive:

IBM – Quantum Utility (June 2023): IBM and UC Berkeley published results in Nature demonstrating that a 127-qubit Eagle processor produced accurate results for a physical simulation at a scale beyond what classical supercomputers could reliably verify using brute-force methods. The experiment used zero-noise extrapolation – running a circuit at deliberately amplified noise levels and extrapolating the results back to the zero-noise limit – for error mitigation rather than full error correction. IBM described this as entering the “quantum utility” era: NISQ devices producing results of genuine scientific value ahead of fault tolerance.

Harvard, QuEra, MIT, NIST/UMD – Logical Quantum Processor (December 2023): A team led by Harvard University, in collaboration with QuEra Computing, MIT, and NIST/UMD, published results in Nature demonstrating the first programmable logical quantum processor at scale: 48 logical qubits executing complex algorithms on up to 280 physical qubits using reconfigurable neutral atom arrays. The system showed that increasing error correction code distance reduced logical error rates – the first experimental confirmation of this relationship at this scale. As Mikhail Lukin, co-director of the Harvard Quantum Initiative, noted, “the fundamental ideas of quantum error correction and fault tolerance are starting to bear fruit.”
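The code-distance relationship the Harvard team confirmed, and the “below-threshold” behavior in the next item, reflect the same standard scaling law from the error correction literature. As context (this is the conventional ansatz, not the exact fit from either paper), the logical error rate $p_L$ of a distance-$d$ code is expected to behave as

$$ p_L \approx A \left( \frac{p}{p_{\mathrm{th}}} \right)^{(d+1)/2} $$

where $p$ is the physical error rate per operation, $p_{\mathrm{th}}$ is the code’s threshold, and $A$ is a code- and decoder-dependent constant. For $p < p_{\mathrm{th}}$, increasing $d$ – that is, spending more physical qubits per logical qubit – suppresses $p_L$ exponentially; for $p > p_{\mathrm{th}}$, larger codes make things worse. That sign flip is exactly what “below threshold” means.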
Google – Willow (December 2024): Google’s Willow chip demonstrated below-threshold quantum error correction – the point where adding more physical qubits reduces rather than increases the logical error rate. This had been a theoretical requirement for scalable error correction since the 1990s but had not previously been demonstrated in a physically relevant system at this scale.

IBM – Quantum Loon (November 2025): IBM’s Quantum Loon processor demonstrated all hardware components required for fault-tolerant quantum computing on a single chip, including a qLDPC-compatible architecture, c-couplers for long-range qubit connectivity, and real-time error decoding under 480 nanoseconds. IBM also reported a 10x speedup in error correction decoding, one year ahead of its internal schedule.

Quantinuum – 94 Logical Qubits Beyond Break-Even (March 2026): Quantinuum demonstrated quantum computations using up to 94 error-protected logical qubits on a trapped-ion processor, achieving beyond-break-even performance in which encoded qubits outperformed unprotected hardware. The work remains partially fault-tolerant and relies on postselection, but it represents the largest logical-qubit computation demonstrated to date on a trapped-ion system.

In parallel, NIST published its first post-quantum cryptography standards in August 2024, with transition guidance calling for RSA-2048 and ECC-256 to be deprecated by 2030 and disallowed after 2035.

Quantum computing is moving from proof-of-concept demonstrations to sustained engineering work on the systems required for practical use. The central challenge is fault-tolerant quantum computing, in which logical qubits encoded across many physical qubits can run long computations reliably. Riverlane’s 2025 survey of over 300 quantum professionals found 2028 emerging as an informal industry deadline for meaningful fault-tolerant integration. IBM’s published roadmap targets Starling in 2029. The US Department of Energy announced a Grand Challenge in April 2026 targeting a first fault-tolerant quantum computer by 2028 – a timeline Yale physicist Steven Girvin described to Science as “a very optimistic but worthy goal.”

Progress depends on continued advances across error correction codes, physical qubit fidelity, decoder speed, and manufacturing yield. None of these are fundamental barriers, but all require sustained engineering work that does not yet have a confirmed delivery date.

Quantum operations are fragile and prone to errors from environmental interference (decoherence), and high error rates mean quantum computations lose accuracy. For practical quantum computing, error rates must be low enough that error correction can reduce errors faster than they accumulate – the threshold Google’s Willow crossed in 2024, a watershed moment for the field.

Physical qubits are the actual quantum devices: superconducting circuits, trapped ions, and so on. Because of errors, multiple physical qubits are needed to reliably encode one logical qubit through error correction. Current systems consist mostly of physical qubits; fault-tolerant computers will have many logical qubits. The transition from physical to logical qubits is one of the primary engineering challenges ahead, as the sketch below illustrates.
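To see in miniature why many noisy physical qubits can make one better logical qubit, here is an illustrative classical Monte Carlo of the 3-qubit repetition code with majority-vote decoding. It protects against bit-flips only; real quantum codes such as the surface code are far more elaborate, and the error rates below are arbitrary choices.

```python
# Minimal illustration of the physical-to-logical qubit trade:
# a 3-qubit repetition code with majority-vote decoding (bit-flips only).
# Real codes (e.g., the surface code used by Willow) are far more involved.
import numpy as np

rng = np.random.default_rng(1)
trials = 200_000

for p in (0.20, 0.05, 0.01):                 # physical bit-flip rates
    flips = rng.random((trials, 3)) < p      # independent errors on 3 copies
    logical_errors = flips.sum(axis=1) >= 2  # majority vote fails
    p_logical = logical_errors.mean()
    print(f"p_physical={p:.2f}  p_logical~{p_logical:.4f}  "
          f"(analytic {3*p**2 - 2*p**3:.4f})")

# Below this toy code's threshold (p < 0.5), encoding beats the bare qubit,
# and the gain grows as physical error rates fall - the same logic, with
# much better codes and decoders, behind below-threshold error correction.
```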
Which hardware platform will win is unclear. Superconducting qubits (IBM, Google), trapped ions (Quantinuum, IonQ), topological qubits (Microsoft), photonic approaches (Xanadu), and neutral atoms each have strengths, and the winner may depend on specific applications – different platforms may dominate different domains. As of 2025, the evidence increasingly suggests that multiple platforms will coexist rather than a single technology dominating.

For more on quantum computing fundamentals, explore our guide to types of quantum computers, and learn about the cutting-edge quantum computing challenges researchers are tackling today. Discover quantum computing applications transforming industries. Interested in the commercial landscape? Check out our coverage of Google’s Willow breakthrough, explore quantum computing startups driving innovation, and learn about leading quantum computing countries competing globally.
