Source: Google News – Quantum Computing
The Future of Quantum Computing (No Crystal Balls Necessary)
What Industry Roadmaps Tell Us About the Coming Years

By Travis L. Scholten
Feb 02, 2026

Welcome! This is The Quantum Stack. Created by Travis L. Scholten, it brings clarity to the topic of quantum computing, with an aim to separate hype from reality and help readers stay on top of trends and opportunities.

"The best way to predict the future is to create it." – Peter Drucker

Introduction

Ten years ago, the first quantum devices were put on the cloud. A milestone moment in the history of the then-nascent quantum industry, it signaled that a new era was being born: quantum was becoming accessible, adoptable, and actionable. Fast-forward to today, coming off the heels of the 2025 International Year of Quantum Science and Technology, and the question of "What comes next?" naturally arises.

After a decade of effort, progress, and maturation, it is becoming increasingly clear that we are approaching an inflection point. That fledgling quantum computing industry? It has matured to the point where credible, realistic projections of advancements and milestones can be made. Ten years ago, speculating on quantum's future was more akin to gazing into a crystal ball than laying out a Gantt chart. Today, making projections is much more akin to project management than prophecy.
That this is the case is clear when considering how companies in the quantum computing industry have, over the past several years, laid out forward-looking roadmaps. A practice started by IBM back in 2020, roadmaps have proliferated in recent years across all qubit modalities.

To see that another inflection point is on the horizon, we can look to the industry's own projections. I have collated the latest roadmaps from across the field into a single reference document. This is intended not as a leaderboard of who is "winning," but as a mosaic of industry-wide conviction. While hardware approaches differ, the convergence of their timelines tells a remarkable story.

My goal isn't to nit-pick specific milestones, but to highlight the overlapping horizons. We are witnessing a clear shift from "demonstrating physics" to "scaling systems." These roadmaps represent "industrial intent," confirming that we are rapidly approaching a massive inflection point in system capability.

A word of caution: these documents use varying "units of measure" and focus on different layers of the quantum stack. Most importantly, the definition of a "logical qubit" varies significantly between providers. For our purposes, the key thing to realize is that if someone says they have a logical qubit, but they can't do any operations on it, then what they have is a logical memory, not a logical qubit. So when talking with potential quantum platform providers, it's good to probe them on what they think a logical qubit is, and how the access they envision providing will actually help you.
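The memory-versus-qubit distinction has a loose classical analogy in error-corrected storage. The sketch below is my own illustration, using a toy classical repetition code (not a real quantum error-correcting code): it shows only the "memory" half of the story, where redundancy plus majority voting suppresses the error rate of stored information, but by itself gives you no way to compute with that information.

```python
from math import comb

def majority_vote_error(p: float, n: int = 3) -> float:
    """Probability that majority voting over n noisy copies of a bit
    fails, given each copy independently flips with probability p.
    A toy *classical* analogy for a logical memory: redundancy
    protects stored information, but says nothing about operating
    on it -- which is the extra thing a logical qubit must provide."""
    assert n % 2 == 1, "use an odd number of copies to avoid ties"
    # The vote fails when a strict majority of the copies flipped.
    return sum(comb(n, k) * p**k * (1 - p) ** (n - k)
               for k in range(n // 2 + 1, n + 1))

# With a 1% per-copy error rate, the 3-copy "memory" fails far less often.
physical, encoded = 0.01, majority_vote_error(0.01, 3)
print(f"physical error: {physical}, encoded error: {encoded:.6f}")
```

The suppression improves as more redundancy is added (5 copies beat 3, and so on), which is the same qualitative behavior that makes fault tolerance worth the overhead.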
I tapped out a few additional thoughts on this matter back in 2024, when my predictions for 2025 included "More companies claim to have built logical qubits (and increased numbers thereof)"; see here.

S-Curves of Innovation in Quantum Computing

When considering these roadmaps as a whole, what emerges is that by the end of this decade, we will likely be accelerating up an S-curve of computational capability and capacity. This inflection point – the arrival of "fault-tolerant quantum computing" – will unlock a new era. (In brief, fault-tolerant quantum computers will be able to generate extremely reliable results at large scale and complexity, even though the act of running the computation that creates those results will still be noisy.) And while the kinds of quantum computers envisioned for the early 2030s are not nearly as powerful as those expected to come a decade after that, their arrival will be no less transformative.

The quantum computing industry has seen these S-curves before. As alluded to above, the first was the arrival of quantum devices on the cloud back in 2016. That S-curve showed quantum taking its first baby steps out of the lab and into the world. A second S-curve was climbed around 2023, when the scale and complexity of the circuits that could be run on quantum computers began to exceed the ability of classical computers to directly brute-force simulate them. That S-curve showed quantum growing into a unique computational tool in its own right. As the industry and community climbed each of those S-curves, new insights emerged into how to adopt quantum computers, develop workflows using them, and deploy them for computational benefit.
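To make the S-curve picture concrete, here is a minimal sketch (my own illustration, not from the article's figure): "capability" is modeled as a stack of logistic curves, one per era, with midpoints at the years discussed above. The magnitudes and steepness are arbitrary; only the shape, a series of plateaus punctuated by inflection points, is the point.

```python
import math

def s_curve(t: float, midpoint: float, steepness: float = 1.0) -> float:
    """Standard logistic function: near 0 well before the midpoint,
    near 1 well after it, with the inflection at `midpoint`."""
    return 1.0 / (1.0 + math.exp(-steepness * (t - midpoint)))

def capability(year: float) -> float:
    """Toy 'capability' as a sum of three S-curves, with midpoints at
    the eras discussed above: cloud access (2016), beyond-brute-force
    circuits (2023), and early fault tolerance (c. 2030)."""
    return sum(s_curve(year, mid) for mid in (2016.0, 2023.0, 2030.0))

# Each sampled year sits mid-climb on one curve, atop earlier plateaus.
for year in (2016, 2023, 2030, 2035):
    print(year, round(capability(year), 2))
```

The y-axis here is as deliberately vague as in the article's figure: it captures "things are becoming more powerful," not any rigorous computational quantity.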
If the future is anything like the past, then as we approach yet another such S-curve, we should expect more insights, clearer goalposts for computational benefit, and broader adoption of quantum computers as tools for scientific and commercial impact.

I've tried to represent this pictorially in the image below. Note that the y-axis is deliberately arbitrary; I was going for something that captures the idea of "becoming more complex," not a rigorous estimate of any particular computational quantity. Each S-curve unlocks a new era, the names of which I made up. For a slightly more technical take on these eras, see Figure 2.2 of the 2024 US Department of Energy Quantum Information Sciences Roadmap (link here).

[Figure: Conceptual representation of S-curves for quantum computing. The y-axis is deliberately kept vague, and is meant to capture the idea that "things are becoming more powerful." Era names are non-standard, and are roughly associated with particular years.]

The Coming Eras of Quantum Computing

As you navigate these roadmaps, I suggest using an art-inspired mental model. Think of the logical qubit count as the scale of the canvas: it governs the complexity or scale of the problem you are trying to solve. Think of the gate count, however, as the richness of the scene: it governs the depth and complexity of what you draw, and the resolution with which you can render detail.

Let's take a look at how both logical qubit count and gate count will evolve over the coming years.

Today. While current systems produce impressive results for specific problems, they resemble a masterful sketch on a small canvas. The essential elements are present, but the scale and complexity of the scenes remain limited.

Early-fault-tolerant era (c. 2030).
As we approach the first S-curve of fault tolerance and enter the early-fault-tolerant era, the anticipated gate count will increase dramatically, but the number of logical qubits will be comparable to the hundreds of physical qubits already available today. In our art analogy, we're still sketching on pieces of paper, but now we can really refine the details. Since the canvas size will remain comparable to today's, the scope of addressable problems will be similar. On the flip side, this means that investments made now in getting quantum-ready – and the use-case prototyping undertaken today – can be easily carried into the early-fault-tolerant era.

Future-fault-tolerant era (c. 2035). Looking a bit further out, a few roadmaps suggest another S-curve inflection point in the mid-2030s, at which both the number of logical qubits and the gate counts are expected to increase dramatically (by factors of roughly 10 to 100). In this era, the size of the canvas expands dramatically, meaning we can create new types of scenes; the range of addressable problems will extend significantly beyond the capabilities of current or late-decade systems. And because gate counts increase as well, so does the complexity of the scenes that can be created. The difference between "state-of-the-art" scenes in 2035 and in 2030 will likely be greater than the difference between 2030 and today.

Visionary-fault-tolerant era (c. 2040+). Of course, quantum computing's future doesn't end in the mid-2030s. And while roadmaps don't yet extend beyond then, ideas are being worked on today regarding quantum networking that would enable scaling quantum computers further, to tens of thousands of logical qubits and beyond. In this "visionary-fault-tolerant" era (starting around, say, 2040, for argument's sake), we can envision very large-scale quantum computing data centers with many quantum computers networked together.
In this era, the convergence of quantum computing and networking will realize systems that, just twenty years prior, seemed like flights of fancy.

Key Take-Aways

As we approach the early-fault-tolerant era, I would encourage you to keep three things in mind.

First, the realization of fault-tolerant quantum computers has moved from theoretical speculation to a credible industrial milestone. Across diverse qubit modalities, industry roadmaps clearly point to this inflection point within the next five years. Through sustained effort, patient capital, and a "commitment to compete," the industry has turned scientific risks into engineering challenges, and is maturing the latter through sound project management. (Again, anticipating the future of quantum computing is now more akin to reading a Gantt chart than consulting a crystal ball!) Of course, significant scientific challenges remain for further scaling (e.g., quantum networking across quantum data centers and beyond), and those will require a lot of hard work in their own right. There are also the scientific challenges of making current and near-future computers increasingly useful for discovery and innovation. So quantum computing is by no means a "solved problem."

Second, when developing use cases for these future quantum computers, remember that the map is not the territory. In particular, people sometimes like to make plots of formal resource estimates, which are, as the name implies, estimates of the capabilities a quantum computer would need in order to solve a specific problem of a given type. (An example of such a plot is given below, taken from Figure 2 of Assessing the Benefits and Risks of Quantum Computers.) While these can be useful, there's a temptation to overlay industry roadmaps on top of such plots, to get a sense of which applications might be unlocked by when. This approach overlooks two key facts.
First, advancements in quantum algorithms and fault-tolerant implementations typically reduce resource requirements, shifting the points "down and to the left." Second, emerging algorithms for quantum-centric supercomputers lack formal resource estimates, leaving potential "green space" as yet undefined.

[Figure: Resource estimates for various problems. Top: separating resource estimates based on whether the problem involves cryptography. Bottom: separating resource estimates based on broad application family. Note 1: the x-axis of these plots counts a very particular and special gate. Note 2: on the y-axis, "circuit width" should be thought of as comparable to "logical qubits." Taken from Figure 2 of Assessing the Benefits and Risks of Quantum Computers.]

Third, inflection points trigger profound and lasting change; consequently, the urgency to become both "quantum-ready" and "quantum-safe" is growing. As noted above, because the "canvas size" won't change much between now and 2030, the work organizations do today in scoping problems relevant to current quantum computers can be easily repurposed at the end of the decade. And assuming a modest amount of time is required to develop organizational proficiency, getting going sooner rather than later becomes an increasingly strategic position to take. As for getting quantum-safe, we haven't touched on the question of when cryptographically relevant quantum computing might arrive. But given what we currently know from hardware roadmaps and resource estimates, it's clear that the window for getting quantum-safe is shrinking.
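The "shrinking window" argument is often formalized as Mosca's inequality: if the time your data must stay secret, plus the time a migration to quantum-safe cryptography takes, exceeds the time until a cryptographically relevant quantum computer (CRQC) arrives, then harvested data is already at risk today. A minimal sketch of that check (the example numbers are purely illustrative, not a prediction):

```python
def at_risk(secrecy_years: float,
            migration_years: float,
            years_to_crqc: float) -> bool:
    """Mosca's inequality: data is exposed when x + y > z, where
    x = how long the data must remain secret,
    y = how long migrating to quantum-safe cryptography takes,
    z = how long until a cryptographically relevant quantum computer."""
    return secrecy_years + migration_years > years_to_crqc

# Illustrative only: records that must stay secret for 10 years, behind
# a 5-year migration, are already exposed if a CRQC is 12 years away,
# because an adversary can harvest ciphertexts now and decrypt later.
print(at_risk(secrecy_years=10, migration_years=5, years_to_crqc=12))
```

The uncomfortable feature of this inequality is that only x and y are under an organization's control, which is exactly why migration timing cannot wait on certainty about z.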
Given the NSA's mandate for national security systems to be quantum-safe by 2035, other organizations should aim for a similar deadline to ensure long-term security.

Wrap Up: Beyond the Roadmap – From Observation to Operation

Ten years ago, a common refrain I heard in my work was: "What is a qubit, and why should I care?" Today, the conversation centers on a much more practical question: "How do I put qubits to work on my specific computational problems?" With the rise of AI accelerating alongside quantum hardware, there's an even broader strategic question to consider: "What is the role of advanced computation in ensuring my organization's long-term competitiveness?"

To capitalize on this inflection point, organizations must move from passive observers to active collaborators. This requires two key strategic shifts:

Activating Domain Experts: Meaningful pilot projects shouldn't just be "innovation theater." They must be designed to inform future strategic considerations, giving domain experts the hands-on experience needed to identify where quantum will provide a true competitive edge.

Normalizing Advanced Computing: We must move quantum from a "curiosity for the innovation team" to an essential component of regular business operations. This shift ensures the technology is integrated into the durable, long-term fabric of the organization.

The Power of "Problem-Inspired" Innovation

A core activity in these pilot projects is what Matt Langione (BCG) calls "demand-side innovation." Instead of just waiting for the hardware to arrive, organizations can help create the future of quantum algorithms themselves. Developing quantum algorithms is a distinct skill set from simply applying known ones.
By leaning into "problem-inspired" algorithm development, the value of the research is clear: it gets you closer to an organizationally relevant advantage on current hardware, while preparing the ground for those "richer scenes" of tomorrow.

The Three Signals of the Inflection Point

If you are still wondering, "How will I know when the inflection is happening?", monitor these three signal events:

Logical Qubit Realization: Multiple providers demonstrating the ability to create and sustain logical qubits (acknowledging the current lack of a universal standard definition).

Beyond-Classical Milestones: The achievement of state-of-the-art, beyond-classical results specifically using those logical qubits.

Production-Ready Tooling: The maturation of software stacks and developer tools to the point where they can be reliably deployed in production environments rather than just research labs.

From Finger Paints to Masterpieces

Think of your first childhood drawings: they weren't masterpieces, but they were essential for learning and growth. You practiced on scraps of paper so that one day, you'd know what to do with a real canvas.

The quantum industry is currently in what might be dubbed a "sketchbook" phase. Our current algorithms and use cases might sometimes feel like those early drawings: limited in scope, low-resolution representations of the richness of our imagination. But the industry roadmaps make one thing clear: by the 2030s, large, high-quality canvases are going to be available. Will you be ready?

Start sketching now, because when those canvases are finally unveiled, the world will be looking for those who already know how to compose.

Was this article useful? Please share it.

P.S.
In addition to The Quantum Stack, you can find me online here.

NOTE: All opinions and views expressed are my own, are not representative or indicative of those of my employer, and in no way are intended to indicate prospective business approaches, strategies, and/or opportunities.

Copyright 2026 Travis L. Scholten
