Quantum Processors Now Correct Errors Using Lost Atoms As a Resource

Quantum Zeitgeist
⚡ Quantum Brief
Researchers at the University of Strasbourg and CNRS transformed atom loss—a major error source in neutral-atom quantum processors—into a resource for error correction by exploiting correlated loss patterns. Their new decoder converts delayed erasure channels into standard erasure channels, simplifying correction and achieving a tenfold reduction in logical error probability while raising the loss threshold from 3.2% to 4%. The team used a dynamic "loss graph" to map atom loss correlations in real time, enabling parallelizable, state-aware error correction that adapts to partially correlated losses common in experiments. State-dependent loss (e.g., excited atoms escaping traps faster) was identified as a key factor, suggesting future decoders could weigh loss probabilities by initial atom states for even greater precision. This breakthrough extends neutral-atom processors’ operational limits, bringing fault-tolerant quantum computing closer by making error correction more efficient and scalable.

Hugo Perrin and colleagues at the University of Strasbourg and CNRS have shown how atom loss, a common source of error in neutral-atom quantum processors, can be used for quantum error correction. They demonstrate a new decoding strategy that exploits the correlated structure of atom loss, effectively converting delayed erasure channels into standard erasure channels. Their surface code, equipped with teleportation-based loss-detection units, dynamically updates loss probabilities using a highly parallelizable loss graph. This approach yields a sharp improvement in performance, achieving up to an order-of-magnitude reduction in logical error probability and raising the loss threshold from 3.2% to 4%, even when dealing with partially correlated loss, a key step towards building more robust and reliable quantum computers.

Correlated atom loss mitigation unlocks improved neutral-atom quantum processor fidelity

The loss threshold rose from 3.2% to 4%, a substantial improvement previously unattainable due to limitations in accounting for atom loss correlations. Atom loss is a dominant source of error in neutral-atom quantum processors, yet existing decoders typically treat each loss event as independent, ignoring the relationships between them. This simplification neglects the fact that, in many physical implementations, the loss of one atom can increase the probability of losing neighbouring atoms due to factors such as imperfect trapping potentials or correlated excitation pathways.

By exploiting these correlations, the new decoder converts complex delayed erasure channels, in which errors appear later in the computation, into simpler standard erasure channels, streamlining the error correction process. Delayed erasure channels are particularly problematic because the error does not manifest immediately, complicating decoding; transforming them into standard erasure channels allows for more efficient correction. The surface code, a leading candidate for fault-tolerant quantum computation, benefits significantly from this improved decoding.

The decoder constructs a ‘loss graph’, mapping connections between lost atoms and dynamically updating loss probabilities, enabling highly parallelizable, real-time operation. This graph represents the probabilistic relationships between atom losses, allowing the decoder to infer information about unobserved losses from observed ones. The algorithm leverages the inherent parallelism of graph-based structures, making it suitable for implementation on classical hardware alongside the quantum processor. Even partially correlated loss events, where the link between the first and second atom loss is not absolute, do not impede this approach. This robustness is crucial, as perfect correlation is rarely achieved in experimental settings.

Simulations reveal a loss threshold of 4%, a substantial gain over previous methods that expands the viable operating parameters for these processors. The loss threshold is the maximum tolerable atom-loss rate beyond which error correction can no longer maintain the integrity of the quantum information; increasing it allows for longer and more complex quantum computations. The tenfold reduction in logical error probability observed in some cases is a substantial step forward for neutral-atom quantum computing. Logical qubits, encoded across multiple physical qubits, are protected from errors, and reducing the logical error rate is paramount for achieving fault tolerance.
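To make the ‘loss graph’ idea concrete, the sketch below shows one way such a structure could track correlated losses and promote suspected (delayed) losses to declared erasures that a standard erasure decoder can handle. It is a minimal illustration only: the class name, the max-based belief update, the 0.5 threshold, and all probabilities are assumptions for this example, not the authors’ implementation.

```python
# Illustrative sketch of a correlation-aware loss tracker. All names,
# update rules, and numbers are assumptions made for this example; the
# paper's actual decoder and data structures may differ.
from dataclasses import dataclass, field

@dataclass
class LossGraph:
    """Tracks per-atom loss beliefs and pairwise loss correlations."""
    # independent prior probability that each atom is lost in a round
    prior: dict[int, float]
    # edges[(a, b)] = assumed P(b is lost | a was observed lost)
    edges: dict[tuple[int, int], float]
    # current belief that each atom has been lost
    belief: dict[int, float] = field(default_factory=dict)

    def __post_init__(self) -> None:
        self.belief = dict(self.prior)

    def observe_loss(self, atom: int) -> None:
        """Record a heralded loss and propagate it to correlated neighbours."""
        self.belief[atom] = 1.0
        for (a, b), p_cond in self.edges.items():
            if a == atom:
                # Raise the neighbour's belief towards the conditional
                # probability (a simple max; a real decoder would apply a
                # proper probabilistic update).
                self.belief[b] = max(self.belief[b], p_cond)

    def promote_to_erasures(self, threshold: float = 0.5) -> list[int]:
        """Declare atoms whose loss belief crosses `threshold` as erased.

        This is the step that turns a 'delayed' erasure (a loss only
        suspected from correlations) into a standard erasure that an
        ordinary erasure decoder can correct.
        """
        return [a for a, p in self.belief.items() if p >= threshold]

# Example: atom 0's loss strongly predicts atom 1's (80%) but only weakly
# predicts atom 2's (30%), i.e. partially correlated loss.
graph = LossGraph(
    prior={0: 0.02, 1: 0.02, 2: 0.02},
    edges={(0, 1): 0.8, (0, 2): 0.3},
)
graph.observe_loss(0)
print(graph.promote_to_erasures())  # -> [0, 1]; atom 2 stays below threshold
```

Because each belief update touches only the edges leaving the observed atom, updates for different observed losses are independent, which is the kind of structure that lends itself to the parallel, real-time operation the article describes.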
This improvement in the loss threshold persists even with partially correlated atom loss, allowing a deeper exploration of the decoder’s capabilities and potential limitations in more complex quantum circuits. Further investigation will focus on the decoder’s performance with increasing numbers of qubits and more intricate error models, including the impact of other noise sources such as control errors and imperfect qubit interactions. Scaling the decoder to larger systems will require optimising the computational resources needed to maintain the loss graph and perform the decoding operations.

Accounting for state-dependent atom loss enhances neutral-atom quantum error correction

Neutral-atom quantum processors promise scalability, but maintaining qubit stability through error correction remains a formidable challenge. Trapped neutral atoms offer a promising platform for quantum computation thanks to their long coherence times and the potential for highly connected qubit arrays. However, these systems are susceptible to atom loss, which can disrupt the quantum state and introduce errors. The new approach acknowledges that atom loss is not random: losing one atom subtly alters the probability of losing its neighbours. In particular, atoms in an excited state are far more likely to be lost, a state dependency not yet fully incorporated into the decoding strategy. This increased loss probability arises from the higher photon-scattering rate of excited atoms, which makes them more likely to escape the trapping potential.

Correlations in how atoms fail thus offer a pathway to more stable quantum processors, and understanding and mitigating them is crucial for improving the fidelity of quantum computations. This work establishes a foundation for understanding how the initial state of an atom influences its susceptibility to loss, building on the ‘loss graph’ that dynamically maps connections between atom failures. The loss graph can be extended to incorporate state-dependent loss probabilities, assigning different weights to connections based on the initial states of the atoms, allowing the decoder to predict and correct atom-loss errors more accurately.

Future research should investigate state-dependent loss probabilities and their impact on decoding strategies, potentially unlocking further gains by refining the model to account for excited-state decay rates and their influence on overall processor fidelity. Detailed characterisation of excited-state lifetimes and branching ratios is essential for accurately modelling state-dependent loss. Exploring the interplay between state-dependent loss and other noise sources will give a more comprehensive picture of the error landscape in neutral-atom quantum processors, and investigating the impact of different trapping geometries and laser-cooling techniques on atom-loss rates could further improve processor stability and performance. The ultimate goal is a robust, scalable quantum error correction scheme that protects quantum information from all sources of noise, paving the way for fault-tolerant quantum computers.
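As a complement to the loss-graph sketch above, the fragment below illustrates how state-dependent loss priors might be fed into such a decoder. The state labels and loss rates are hypothetical placeholders; real values would come from characterising excited-state lifetimes and branching ratios, as the article notes.

```python
# Illustrative state-dependent loss priors. The states and rates below are
# invented for this example; in practice they would be measured from
# excited-state lifetimes and branching ratios.
STATE_LOSS_PRIOR = {
    "ground": 0.01,   # tightly trapped, rarely lost
    "excited": 0.06,  # higher photon-scattering rate, escapes more readily
}

def state_aware_priors(atom_states: dict[int, str]) -> dict[int, float]:
    """Build per-atom loss priors from each atom's assumed initial state."""
    return {atom: STATE_LOSS_PRIOR[state] for atom, state in atom_states.items()}

# These priors could replace the uniform ones in the LossGraph sketch above,
# letting the decoder weight both nodes and edges by initial state.
priors = state_aware_priors({0: "ground", 1: "excited", 2: "ground"})
print(priors)  # {0: 0.01, 1: 0.06, 2: 0.01}
```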
The research demonstrated that accounting for correlated atom loss, where the failure of one atom influences others, improves the performance of quantum error correction. By implementing a decoder that maps these connections via a ‘loss graph’ and dynamically updates loss probabilities, the researchers achieved up to a tenfold reduction in logical error probability and increased the loss threshold to 4%. This matters because atom loss is a major obstacle to building stable quantum computers from neutral atoms, and mitigating it enhances the reliability of quantum calculations. Future work will focus on incorporating the influence of an atom’s initial state, particularly excited-state decay rates, into the loss graph to further refine error correction strategies.

👉 More information
🗞 Correlated Atom Loss as a Resource for Quantum Error Correction
🧠 ArXiv: https://arxiv.org/abs/2603.24237


Tags

government-funding
quantum-computing
quantum-hardware
quantum-error-correction

Source Information

Source: Quantum Zeitgeist