
Beyond the Qubit: Navigating Quantum Computing Roadmaps and Challenge Pathways

Quantum Computing Report

By Dr. Joe Spencer

I've had this conversation with a number of people recently, and I wanted to share a simplified 'thought of the day' on how we think about quantum computing roadmaps at Global Quantum Intelligence, LLC. When developing hardware and seeking to raise funding, it is clearly important to inform your investors (or, as an investor, to be informed) about the company's roadmap and vision, and how it will reach the panacea of teraquop-scale Fault Tolerant Quantum Computing (FTQC). (Note: a quop is a reliable quantum operation, one measure of the performance level of a quantum computer. A teraquop machine would be one that can successfully run algorithms requiring one trillion reliable quantum operations.)

With a lack of standards, and with announcements of quantum advantage made under loose, disparate and inconsistent definitions that often let one company claim quantum advantage over others in some limited sense, it can be confusing for end-users, adopters and investors to make sense of this roadmap zoo.

[Figure: A snapshot of company roadmaps from various large-scale quantum providers, start-ups and scale-ups]

The figure above shows different roadmaps, different pictures and different unique selling propositions (USPs) as to why 'our company is the best.' And that's certainly not wrong: highlighting the roadmap is key to success, and highlighting your unique differentiator sets you apart from the crowd. But there is a challenge in communicating this: a lack of consistency and awareness. This is why at Global Quantum Intelligence, LLC we have developed a framework and ontology for understanding this ecosystem, allowing you to compare one quantum computer to another. It seems a simple thought, but it is quite key: we compare apples to apples. In doing so, you need to understand your apple. Can you compare a superconducting quantum computer to a photonic quantum computer? That is comparing apples to oranges.
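To make the teraquop target concrete, here is a back-of-envelope sketch of what one trillion reliable operations implies for error correction. The numbers (physical error rate, threshold, prefactor) are illustrative assumptions, and the scaling law used is the standard surface-code heuristic, not a GQI model:

```python
# Back-of-envelope: what does a "teraquop" imply for error correction?
# Uses the common surface-code heuristic p_L ~ A * (p/p_th)^((d+1)/2).
# All parameter values below are assumptions for illustration only.

TARGET_OPS = 1e12      # a teraquop: one trillion reliable quantum operations
p_phys = 1e-3          # assumed physical error rate per operation
p_thresh = 1e-2        # assumed error-correction threshold
A = 0.03               # assumed prefactor in the scaling heuristic

# To complete 1e12 operations reliably, each logical operation must fail
# no more often than about once per 1e12 operations.
p_logical_target = 1.0 / TARGET_OPS

# Find the smallest odd code distance d that meets the target.
d = 3
while A * (p_phys / p_thresh) ** ((d + 1) / 2) > p_logical_target:
    d += 2

# Rough surface-code overhead: ~2*d^2 physical qubits (data + ancilla)
# per logical qubit.
physical_per_logical = 2 * d * d

print(f"required code distance: d = {d}")
print(f"~{physical_per_logical} physical qubits per logical qubit")
```

Under these assumed numbers the sketch lands on a code distance of 21 and roughly 900 physical qubits per logical qubit, which is why teraquop-scale roadmaps are as much about physical-qubit scaling as about qubit quality.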
What about comparing one type of neutral-atom company to another neutral-atom company? That is perhaps Golden Delicious to Granny Smith. This is why we created a framework that allows for such comparisons and clarifies, for both inter-modality and intra-modality compute, which aspects you should care about. The framework is based on our Quantum Tech Stack, which has evolved over a number of years and, in the compute space, is akin to how a supercomputing cluster is developed.

[Figure: The GQI Quantum Computing Stack]

As the figure above shows, our tech stack doesn't just look at the qubit (the quantum plane) but examines the whole quantum computing system. It spans the fundamental physics and engineering of the physics package, the control plane, and the ancillary / auxiliary systems that enable a quantum computer to function. The mid-stack considers the error correction layer and readout. Finally, the top of the stack considers the algorithm being run, the applications, and the user community for use-cases. Using this approach, we can apply a common lens through which to examine modalities and players.

[Figure: Snapshot of the key modality platforms through the GQI lens]

The chart above looks quite noisy and data-heavy, but it is extremely insightful once you understand its layers, and it gives you a technique to compare one qubit modality or player to another on the areas of the stack you care about. Maybe you want to understand the differentiators of the error correction layer between colour centres and superconducting circuits. And indeed, this is a relevant example: Photonic Inc.'s SHYP codes are a more speculative error correction technique (though demonstrated) than standard surface-code readout, with potential for variance in scaling between physical architectures. Put simply, your choice of error correction code matters based on your quantum plane, and some QEC techniques are more compatible with certain qubit modalities.
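The "apples to apples" idea can be sketched as a small data structure: score each player on the same stack layers, then compare only within the layers you care about. The layer names follow the stack described above; the player names and scores are invented placeholders, not GQI data:

```python
# Minimal sketch of a common-lens comparison across a shared tech stack.
# Layer names follow the stack described in the article; players and
# scores are hypothetical placeholders for illustration.

LAYERS = ["physics package", "control plane", "ancillary systems",
          "error correction", "readout", "algorithms & applications"]

players = {
    "NeutralAtomCo A": {"physics package": 4, "control plane": 3,
                        "ancillary systems": 3, "error correction": 2,
                        "readout": 3, "algorithms & applications": 2},
    "NeutralAtomCo B": {"physics package": 3, "control plane": 4,
                        "ancillary systems": 3, "error correction": 3,
                        "readout": 2, "algorithms & applications": 2},
}

def compare(a, b, layers):
    """Per-layer score difference (a minus b), restricted to chosen layers."""
    return {layer: players[a][layer] - players[b][layer] for layer in layers}

# Intra-modality comparison (Golden Delicious vs Granny Smith),
# restricted to the one layer we care about here:
diff = compare("NeutralAtomCo A", "NeutralAtomCo B", ["error correction"])
print(diff)
```

The point of the structure is that both players are scored against the same layer list, so any comparison is by construction like-for-like.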
And we expect perhaps more announcements like The Pinnacle Architecture, which utilises quantum low-density parity check (QLDPC) codes to allow for universal, fault-tolerant quantum computation with a spacetime overhead significantly smaller than that of any competing architecture. Developed by Iceberg Quantum, this technique may also be useful in trapped-ion architectures such as Oxford Ionics (IonQ).

But that is all background. The key point I want to share is how investors and adopters should think about quantum roadmaps and what we call challenge pathways in the development of FTQC.

[Figure: Example of challenge pathways from one QC platform to another]

Above I have drawn up a strawman example of how we view roadmaps, and I will walk through the thinking and give some definitions along the way. First, we need to think about the types of challenges quantum companies face on the road to fault tolerance. We broadly put these into three categories:

Incremental challenges: things to watch out for, but perhaps things you can buy in or develop quite easily. They are relatively low risk.

Engineering challenges: these get more interesting. We've all heard the scientists in quantum companies say 'oh, that's just engineering,' dismissing a problem as purely engineering. And that's reasonable: these are still challenges, with some cost to solving them, but they are medium risk. An example may be device packaging and the development of photonic integrated circuits (PICs). They are not insurmountable if you have the skills and coin to do it.

Scientific challenges: hard problems that require research, academia and serious insight to solve. Perhaps it is a challenge that industry doesn't yet have a solution for, and research needs to be done to solve it. Perhaps it is a physics limit of scaling, or a problem that technology cannot currently manage. An example may be heat-load dissipation and wiring challenges for superconducting quantum companies.
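The three categories above can be captured as a small taxonomy, the kind of structure one might use to annotate roadmap milestones. The category names and risk labels follow the article; the example challenges reuse the article's own examples, and everything else is illustrative:

```python
# A minimal sketch of the three challenge categories as a data structure
# for annotating roadmap milestones. Category names and risk levels
# follow the article; the rest is illustrative, not GQI's actual model.
from dataclasses import dataclass
from enum import Enum

class Category(Enum):
    INCREMENTAL = "incremental"   # buy in or develop easily; low risk
    ENGINEERING = "engineering"   # solvable with skills and budget; medium risk
    SCIENTIFIC = "scientific"     # open research problems; high risk

RISK = {
    Category.INCREMENTAL: "low",
    Category.ENGINEERING: "medium",
    Category.SCIENTIFIC: "high",
}

@dataclass
class Challenge:
    name: str
    category: Category

# Roadmap annotations using the article's own examples:
roadmap = [
    Challenge("device packaging / PIC development", Category.ENGINEERING),
    Challenge("heat-load dissipation and wiring at scale", Category.SCIENTIFIC),
]

for c in roadmap:
    print(f"{c.name}: {RISK[c.category]} risk")
```

Tagging each milestone this way is what lets an investor ask not just "what is on the roadmap?" but "what kind of risk does each step carry?"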
Now that we have defined these, let's think about the example above, where I have drawn up these challenge pathways for two company types that may be developing a roadmap and seeking investment.

Q1 is a player with a large number of qubits now (this could be a superconducting company, for example), and they don't face too many challenges in what they can do currently (I say that lightly; I do appreciate that relative challenge is a matter of perspective). This company can take the early wins and perhaps profess some success in the industry. Q2 is a player with fewer qubits today; they might be a solid-state company struggling to get beyond 8 qubits, having faced lots of hard engineering and scientific challenges in the early days.

However, as these companies evolve, attract investment and users, and grow, the scaling starts to kick in. Q1 may hit a real scientific 'gotcha' moment where they have to expend significant resources solving a critical challenge, slowing down their roadmap development or halting it completely. Q2, meanwhile, might have an easier time scaling their system in the long term.

Now the interesting point of the graph is where they end. Both reach the holy grail of FTQC having faced the same challenges (and perhaps at the same time). However, the journeys are very different. Too often, roadmaps are presented and end-users and investors are not aware of, or do not appreciate, the scientific challenges a company may face down the road. The question they need to ask themselves is: which do I invest in now? Do I invest in Q2, who is having a tough time now with imminent technical risk, but who will scale more easily in the medium to long term? Or do I invest in Q1, with some easy wins now, but who may hit a technical bottleneck down the line? Now I'll clarify here: this isn't advice, nor am I picking one platform over the other.
I am highlighting, as an example, things that people need to consider as they present roadmaps and choose which company they are going to back.
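The Q1/Q2 contrast can be sketched as a toy model: two hypothetical companies accumulate progress toward FTQC, but the difficulty is distributed differently in time. All numbers are invented for illustration; this is not GQI's actual model or data:

```python
# Toy model of the challenge-pathway picture: same endpoint, very
# different journeys. All numbers are invented for illustration.
from itertools import accumulate

# Progress made each year (arbitrary units). Q1 takes early wins, then
# hits a scientific 'gotcha' that slows it down; Q2 grinds through hard
# early challenges and scales more easily later.
q1_progress = [3, 3, 3, 1, 1, 1, 2]
q2_progress = [1, 1, 1, 3, 3, 3, 2]

q1_cum = list(accumulate(q1_progress))
q2_cum = list(accumulate(q2_progress))

# Both reach the same endpoint (FTQC)...
assert q1_cum[-1] == q2_cum[-1]

# ...but a snapshot at year 3 makes Q1 look far ahead, which is exactly
# the trap for an investor who only sees today's qubit counts.
print("progress after year 3:", q1_cum[2], "vs", q2_cum[2])
print("progress after year 7:", q1_cum[-1], "vs", q2_cum[-1])
```

The design point is that any single snapshot of the curves is misleading; only the full pathway, with its risk categories attached, tells an investor where the hard part of the journey sits.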

The National Quantum Computing Centre (NQCC) and the Defense Advanced Research Projects Agency (DARPA) QBI are testing platforms right now, assessing performance and use-cases. But it is also important for these test and evaluation (T&E) facilities to know which layer of the stack to probe, and when to probe it. Understanding roadmaps as a function of time and of technology layer within a general stack is key for this, and will allow users, governments and investors to focus on the probe questions and points that matter.

Dr. Joe Spencer is a GQI UK Director and GQI's Vice-President of Client Success.

March 4, 2026


