Quantum’s next bottleneck is the system, not just the qubit - SDxCentral
⚡ Quantum Brief
A March 2026 roundtable with Nobel laureate John Martinis and quantum leaders warned that scaling quantum systems now depends on broader infrastructure—not just qubits. Fault-tolerant machines with millions of qubits require advances in manufacturing, wiring, and cost reduction. Yonatan Cohen of Quantum Machines emphasized tighter integration between quantum and classical systems. Slow calibration loops block progress, demanding real-time error correction and hybrid architectures to handle growing complexity. MIT’s Fabrizio Berritta revealed qubit energy decay fluctuates rapidly, exposing hidden calibration challenges. Faster measurements uncover new failure modes, pushing for improved fabrication to reduce material defects. Alice & Bob’s Nicolas Didier noted cat qubits could lower error-correction overhead but stressed system-wide integration. Early logical qubit deployment would accelerate testing of HPC and hybrid workflow compatibility. The panel agreed modularity and open frameworks like Quantum Machines’ Open Acceleration Stack are critical. Classical processor choices remain unresolved, but flexibility will outpace rigid architectures in scaling efforts.

Roundtable says scaling quantum systems will hinge on manufacturing, calibration, and quantum-classical integration

March 27, 2026 | By Berenice Baker, SDxCentral

The next phase of quantum computing will be defined less by headline-grabbing processor demos and more by whether the industry can solve a broader systems challenge across hardware, control, and architecture.

That was the message from a recent roundtable featuring 2025 Nobel laureate John Martinis, Quantum Machines co-founder and co-CEO Yonatan Cohen, Alice & Bob head of chip design Nicolas Didier, and MIT researcher Fabrizio Berritta. The panel argued that the path from promising processors to practical fault-tolerant machines will require advances well beyond the qubit itself.

Martinis said the industry cannot afford to treat scaling as a single bottleneck. Moving from hundreds of qubits to fault-tolerant systems with millions will require changes across manufacturing, wiring, control, and system cost. He pointed to the need to reduce the cost of control infrastructure and to improve how qubits are built and connected.

Cohen made a similar argument from the control side, saying quantum systems will only scale if calibration and error correction are brought closer to the hardware and more tightly linked to classical compute. Calibration loops that take too long become a practical blocker as systems grow, he said, adding that quantum machines must be designed as hybrid quantum-classical architectures rather than as isolated processors.

That systems view also underpins Quantum Machines’ newly launched Open Acceleration Stack, which is designed to connect quantum processing units with classical accelerators for calibration, decoding, and eventually applications. Cohen said the industry still faces open questions about which classical processors are best suited to different tasks, but argued that modularity will matter more than locking into a single architecture too early.

Berritta’s recent MIT work added another calibration dimension. He said the team used faster control hardware to track qubit energy decay, or T1, at much finer timescales and found it could change by an order of magnitude within tens of milliseconds. That was scientifically interesting, he said, but also “a nightmare from a calibration point of view,” because it revealed a fast-moving parameter that had previously been averaged out.

For Martinis, the results highlight how better measurement can expose new failure modes, not just improved performance. Faster calibration may help researchers understand qubits in real time, but it also reinforces the case for improving fabrication and reducing materials defects.

Didier argued that cat qubits could ease some of the scaling burden by lowering error-correction overhead. But even then, he said, the wider stack matters. Getting useful logical qubits sooner would allow the industry to evaluate the surrounding ecosystem earlier, including how quantum processors integrate with high-performance computing (HPC) environments and hybrid workflows.

The discussion suggested that scaling challenges are shifting from individual qubits to the systems that support them. Calibration speed, control infrastructure, decoding, and software are now emerging as key constraints as the industry pushes toward fault-tolerant machines.
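Why a fast-moving T1 is “a nightmare from a calibration point of view” can be illustrated with a toy simulation. The sketch below is purely illustrative and uses hypothetical numbers, not figures from the MIT work: it assumes a two-level fluctuator that toggles a qubit’s T1 between 10 µs and 100 µs every ~30 ms. A slow calibration loop that averages over the whole window reports a single in-between value, while a fast loop resolves the order-of-magnitude swing.

```python
import math
import random

random.seed(0)

# Hypothetical parameters for illustration only (not from the article):
T1_HIGH = 100e-6   # 100 µs "good" relaxation time
T1_LOW = 10e-6     # 10 µs "bad" relaxation time: an order-of-magnitude swing
SWITCH_MS = 30e-3  # the fluctuator toggles T1 every ~30 ms
SHOT_DT = 1e-3     # one T1 probe per millisecond
DELAY = 30e-6      # fixed wait before measuring qubit survival

def true_t1(t):
    """Telegraph-like T1: alternates between the two values every SWITCH_MS."""
    return T1_HIGH if int(t / SWITCH_MS) % 2 == 0 else T1_LOW

def probe_t1(t, shots=200):
    """Estimate T1 from the survival probability after a fixed delay."""
    p = math.exp(-DELAY / true_t1(t))
    survived = sum(random.random() < p for _ in range(shots))
    p_hat = max(survived / shots, 1e-6)        # avoid log(0)
    return -DELAY / math.log(min(p_hat, 0.999999))

# Sample T1 once per millisecond for 120 ms.
times = [i * SHOT_DT for i in range(120)]
estimates = [probe_t1(t) for t in times]

# Slow calibration loop: one number averaged over the whole 120 ms window.
coarse = sum(estimates) / len(estimates)

# Fast calibration loop: 10 ms windows resolve the fluctuation.
fine = [sum(estimates[i:i + 10]) / 10 for i in range(0, 120, 10)]

print(f"slow-loop T1 estimate: {coarse * 1e6:.1f} µs (fluctuation averaged out)")
print(f"fast-loop extremes:    {min(fine) * 1e6:.1f} µs .. {max(fine) * 1e6:.1f} µs")
```

The slow loop lands near the middle of the two regimes, a value the qubit never actually has, which is the sense in which averaging hides the parameter a real-time error-correction loop would need to track.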

Read Original

Tags

quantum-hardware

Source Information

Source: Google News – Quantum Computing