QuEra Achieves 1 Error Per Trillion Steps in Memory Result

QuEra has achieved an error rate of one error per trillion steps in a quantum memory demonstration, a step toward the reliability needed for complex computations. The company demonstrated the creation of one reliable logical qubit using just over two physical qubits, a substantial reduction from the hundreds to thousands typically required by other quantum computing approaches. This advance builds on a theoretical breakthrough by Kasai (2026), which revealed the potential of quantum error-correcting codes with encoding rates above 1/2. “Our results show that neutral atoms have a clear path to the Teraquop regime, roughly one error per trillion logical operations,” the researchers state, bringing the prospect of useful quantum computing significantly closer by reducing the scale of machine needed for advanced algorithms.
Neutral Atom Platforms Enable High-Density Qubit Encoding

A system of tens of thousands of physical qubits could deliver the logical qubit counts and low error rates that many proposed algorithms require, according to results from QuEra Computing, Harvard University, and MIT. The advance, a significant leap in quantum computer design, centers on a novel approach to quantum error correction, the critical process for building stable and scalable quantum computers.
The team’s work addresses a fundamental challenge in quantum computing: maintaining the integrity of quantum information long enough to perform meaningful calculations. Quantum bits, or qubits, are susceptible to noise, leading to errors that corrupt computations. Error correction combats this by encoding information across multiple physical qubits to create a logical qubit, which is more resilient to disturbances. The efficiency of this encoding, that is, how many physical qubits are needed per logical qubit, is paramount. “Fewer physical qubits per logical qubit means a smaller, nearer-term system can do the same work,” explains the research team.

Their recent publication details a method achieving an encoding rate of approximately 0.503, meaning nearly one logical qubit is created for every two physical qubits used, a substantial improvement over existing methods. This was demonstrated with a code encoding 580 logical qubits into 1152 physical qubits.

The success is attributed to a co-design approach: tailoring the error-correcting code to the specific capabilities of neutral atom platforms, which use arrays of individual atoms as qubits and manipulate them by physically moving the atoms. Specifically, the team adapted the Kasai (2026) code family to exploit the reconfigurable nature of neutral atom systems and their use of Acousto-Optic Deflectors (AODs) for atom movement. “By choosing a reference APM with large orbits and designing the transition APMs to commute with it, the required motion collapses to simple cyclic shifts within orbits and permutations between them — executable in low AOD depth,” the researchers write. The team emphasizes that this result is a quantum memory achievement, demonstrating the ability to store information reliably; further development is needed for full fault-tolerant computation. Nevertheless, the progress significantly reduces the hardware requirements for building powerful quantum computers, bringing practical quantum computing closer to reality.
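The motion pattern the quote describes, cyclic shifts within orbits plus permutations between whole orbits, can be illustrated with a toy sketch. The orbit sizes, atom labels, and helper functions below are invented for illustration and are not the paper's actual compilation:

```python
# Illustrative sketch only: decompose one movement round into a cyclic
# shift inside each orbit followed by a permutation of whole orbits.

def cyclic_shift(orbit, k):
    """Rotate the atoms in one orbit by k positions."""
    k %= len(orbit)
    return orbit[-k:] + orbit[:-k]

def apply_round(orbits, shifts, orbit_perm):
    """One movement round: shift each orbit, then reorder the orbits."""
    shifted = [cyclic_shift(o, s) for o, s in zip(orbits, shifts)]
    return [shifted[i] for i in orbit_perm]

# Toy layout: three orbits of four atom labels each.
orbits = [["a0", "a1", "a2", "a3"],
          ["b0", "b1", "b2", "b3"],
          ["c0", "c1", "c2", "c3"]]
moved = apply_round(orbits, shifts=[1, 2, 0], orbit_perm=[2, 0, 1])
print(moved)
```

Because every step is either a rigid rotation of one orbit or a relabeling of orbits, each round maps onto a small, constant number of parallel AOD moves rather than per-atom routing.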
Kasai Construction & Affine Permutation Matrices for qLDPC Codes

The pursuit of practical quantum computation increasingly focuses on error mitigation, moving beyond simply building more qubits to extracting reliable results from imperfect hardware. While various quantum error correction schemes exist, a significant bottleneck has been the sheer number of physical qubits required to create even a single dependable logical qubit; most approaches demanded hundreds, if not thousands, of physical qubits for this task. Recent work from a collaboration between QuEra, Harvard, and MIT demonstrates a substantial reduction in this overhead, achieving reliable encoding with “just over two” physical qubits per logical qubit, a figure that dramatically alters the near-term feasibility of scalable quantum systems. This advancement hinges on a novel application of quantum Low-Density Parity-Check (qLDPC) codes, building upon a theoretical foundation established by Kasai (2026). Traditional qLDPC codes have been hampered by low encoding rates, typically requiring ten to one hundred physical qubits for each logical qubit. Kasai’s construction introduced “affine permutation matrices”, a mathematical innovation that allowed for the creation of qLDPC codes with encoding rates exceeding 1/2, theoretically opening the door to more efficient error correction. However, translating this theoretical potential into a practical reality demanded careful consideration of the underlying hardware architecture.
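In the classical APM-LDPC literature, an affine permutation of Z_n maps index i to (a·i + b) mod n, and the circulant permutations used by earlier constructions are the special case a = 1. A minimal sketch under that standard definition (the paper's exact construction may add further structure):

```python
# Sketch of the general notion of an affine permutation of Z_n,
# assumed from the classical APM-LDPC literature, not taken from
# the paper: i -> (a*i + b) mod n is a permutation iff gcd(a, n) = 1.
from math import gcd

def affine_perm(n, a, b):
    """Return the image list of the affine map i -> (a*i + b) mod n."""
    assert gcd(a, n) == 1, "a must be invertible mod n"
    return [(a * i + b) % n for i in range(n)]

print(affine_perm(8, 1, 3))  # a = 1: a circulant, i.e. cyclic shift by 3
print(affine_perm(8, 3, 1))  # a = 3: a genuinely affine permutation
```

Replacing circulants with this larger family gives the code designer more freedom in the parity-check structure, which is what makes rates above 1/2 reachable.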
The team specifically targeted neutral-atom platforms, leveraging their unique capabilities to optimize code performance. “Kasai’s construction… introduced affine permutation matrices… in place of the circulant permutation matrices that earlier quantum LDPC constructions relied on,” the researchers explain. They co-designed the code around the constraints and advantages of reconfigurable neutral-atom systems, which use Acousto-Optic Deflectors (AODs) for atom movement. By strategically structuring the code, they minimized the complexity of atom movement during error-correction cycles. Specifically, they focused on the “orbit structure of the transition APMs,” ensuring that syndrome extraction could be achieved with minimal movement. Validation involved rigorous circuit-level noise simulations, employing a physical error rate of 0.1% and a decoder to accurately estimate logical fidelity.

“Building on a recent theory breakthrough (Kasai, 2026), we show that quantum error-correcting codes with encoding rates above 1/2 could be used in practical settings, and directly verify in simulation that they can achieve error rates as low as one error per trillion steps,” writes the research team, which spans QuEra (Chen Zhao, Casey Duckering), Harvard (Andi Gu), and MIT (Nishad Maskara, Hengyun Zhou).

Quantum Error Correction Performance Results

QuEra Computing, in collaboration with researchers at Harvard and MIT, is demonstrating a significant reduction in the resources needed to build practical quantum computers, moving beyond the limitations of current error correction methods. Most existing approaches require hundreds to thousands of physical qubits to create a single reliable logical qubit, a substantial barrier to scaling. QuEra’s new codes achieve this with just over two physical qubits, a result detailed in their latest paper.
This level of performance, termed the Teraquop regime, is where quantum computers become capable of running the deep algorithms proposed for applications like molecular simulation, optimization, and cryptanalysis. The specific codes developed, designated [[n = 1152, k = 580]] and [[n = 2304, k = 1156]], achieve encoding rates of 0.503 and 0.502 respectively; the [[n = 1152, k = 580]] code protects against up to 5 errors and the [[n = 2304, k = 1156]] code against up to 6. The researchers adapted the Kasai code family, focusing on structural conditions that enable efficient atom movement, resulting in a syndrome cycle fast enough for practical implementation. Key to the approach are “structural conditions that compile to hardware,” allowing constant-depth execution of syndrome extraction.
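The quoted rates follow directly from the [[n, k]] parameters, where n is the number of physical qubits and k the number of logical qubits encoded; a quick arithmetic check:

```python
# Encoding rate of an [[n, k]] code is k / n; the two codes quoted in
# the text both sit just above the rate-1/2 threshold.
for n, k in [(1152, 580), (2304, 1156)]:
    rate = k / n
    print(f"[[{n}, {k}]]: rate = {rate:.3f}")
```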
The team validated the performance using a full circuit-level noise model, simulating realistic error rates and utilizing a decoder to ensure accuracy. “Our simulations use a physical error rate of 0.1%, under a realistic circuit-level noise model,” they state, reflecting the rapidly improving fidelity of neutral atom qubits. This work establishes a clear path towards Teraquop-scale quantum computing, where the foundation of reliable memory enables truly powerful computation.

Teraquop Regime: Achieving 10⁻¹³ Logical Error Rates

The pursuit of practical quantum computation has long been hampered by the fragility of quantum information; recent advances, however, are bringing reliable quantum processing significantly closer to reality. Where most approaches need hundreds to thousands of physical qubits per logical qubit, QuEra’s method, leveraging a novel application of quantum error-correcting codes, achieves reliable encoding with just over two.
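The Teraquop target can be read as a back-of-envelope error budget: with a per-operation logical error rate around 10⁻¹³, the expected number of faults across a trillion logical operations stays below one. The snippet below just restates this arithmetic:

```python
# For small p_L, the expected number of faults over N logical operations
# is approximately N * p_L (union bound).
p_L = 1e-13   # order of the per-round logical error rates discussed here
N = 1e12      # one trillion logical operations
print(f"expected faults: {N * p_L:.1f}")  # well under one
```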
The team’s success is deeply intertwined with the specific architecture of their neutral-atom quantum computer. Unlike superconducting qubits fixed on a chip, neutral atoms offer a reconfigurable platform where atoms can be moved and manipulated. This flexibility allowed the researchers to co-design the error-correcting code with the hardware, optimizing for efficient atom movement. The simulations, conducted with a physical error rate of 0.1% under a realistic circuit-level noise model, demonstrate a per-logical-qubit, per-round error rate of 1.3 (+3.0, −0.9) × 10⁻¹³, firmly establishing the system’s entry into the Teraquop regime and paving the way for quantum computers capable of running the deep algorithms proposed for molecular simulation, optimization, and cryptanalysis.

This is not a coincidence. Neutral-atom platforms offer a combination of properties that align with what QEC algorithms demand: identical qubits by nature, flexible and reconfigurable connectivity, highly parallel operations, and long coherence times relative to gate and transport speeds.

Classical LDPC Codes Inform Quantum Error Correction Design

The pursuit of reliable quantum computation has unexpectedly drawn inspiration from the well-established field of classical data transmission. While quantum bits demand entirely new approaches to error mitigation, the underlying principles of efficient code design are proving remarkably transferable, particularly through the lens of Low-Density Parity-Check (LDPC) codes. These codes, integral to technologies like 5G and deep-space communication, operate by adding redundancy to data, allowing for error detection and correction; modern LDPC codes can achieve rates as high as 90%, a near-optimal use of transmission capacity.
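The classical idea fits in a few lines: a sparse parity-check matrix defines constraints that every valid codeword satisfies, and a nonzero syndrome flags a transmission error. The tiny matrix below is a hand-made illustration, not a real 5G or deep-space LDPC code:

```python
# Toy parity-check illustration of the classical LDPC idea. Each row of
# the sparse matrix H is one parity constraint over a few bit positions.
H = [
    [1, 1, 0, 1, 0, 0],
    [0, 1, 1, 0, 1, 0],
    [1, 0, 1, 0, 0, 1],
]

def syndrome(H, word):
    """Each syndrome bit is the parity of the bits that check touches."""
    return [sum(h * w for h, w in zip(row, word)) % 2 for row in H]

codeword = [1, 0, 1, 1, 1, 0]   # satisfies all three checks
assert syndrome(H, codeword) == [0, 0, 0]

received = codeword[:]
received[2] ^= 1                # one bit flipped in transit
print(syndrome(H, received))    # nonzero syndrome -> error detected
```

Note that the resulting syndrome [0, 1, 1] matches column 2 of H, which is exactly how a decoder localizes the flipped bit; quantum error correction measures analogous syndromes without reading out the encoded data itself.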
Quantum error correction similarly relies on redundancy, but translating the efficiency of classical LDPC codes to the quantum realm has been a significant hurdle: previous quantum LDPC (qLDPC) codes typically required ten to one hundred physical qubits to encode a single logical qubit. Recent advances, however, are challenging this paradigm. “Kasai (2026)’s construction showed that rate 1/2 qLDPC codes are mathematically reachable at large block sizes,” explains the research team, paving the way for more practical encoding schemes. “We adapted the Kasai code family to work well with these design constraints,” the team notes, highlighting the importance of aligning code structure with hardware limitations. For example, the [[n = 1152, k = 580]] code is laid out on a 3 × 32 grid, requiring at most one horizontal and one vertical shift during syndrome extraction. “Constant depth is what makes the syndrome cycle fast enough to be practical, and achieving it requires designing the code and the hardware execution together,” they emphasize, underscoring the importance of this holistic approach.

The results suggest that neutral atom systems are on a clear path toward realizing these benefits, bringing useful quantum computing demonstrably closer to reality. If a quantum computer can’t hold information reliably, it can’t compute reliably either; the encoding efficiency demonstrated here directly determines how large that computer needs to be.

Source: https://arxiv.org/abs/2604.16209
