
Quantum Machine Learning Achieves Improvement in One-Third of Models with Noise

Quantum Zeitgeist
⚡ Quantum Brief
Researchers from the University of Washington and the University of Michigan found that moderate quantum noise improves performance in roughly one-third of randomly initialized quantum machine learning models, challenging the assumption that noise is universally harmful. The study analyzed 55 quantum graph neural networks predicting molecular energy gaps using the QM9 dataset, revealing a strong negative correlation (r = −0.62) between initial model performance and noise-induced benefits: under-optimized models gained from noise acting as an implicit regularizer. Optimal noise levels were lower than theoretical predictions, suggesting error cancellation in structured quantum circuits, with improvements appearing at 99.5% gate fidelity (ε = 0.005), a level comparable to current hardware capabilities. Noise effects varied widely: one-third of models improved, a smaller fraction deteriorated, and the rest remained unaffected, indicating that initialization quality dictates noise sensitivity rather than uniform mitigation being necessary. The findings advocate for noise-aware optimization strategies tailored to model structure, potentially enhancing near-term quantum machine learning efficiency by leveraging noise as a computational resource.

Scientists have long considered quantum noise a significant hurdle in the development of reliable quantum computers, prompting considerable effort into its reduction. However, new research led by Linghua Zhu and Yulong Dong, from the University of Washington and the University of Michigan respectively, alongside Ziyu Zhang, Xiaosong Li et al., challenges this established view. Their work, utilising graph neural networks for molecular property prediction, reveals that moderate noise can surprisingly improve learning performance: amongst randomly initialised models sharing the same architecture, roughly one-third exhibit improved performance under moderate noise, while a smaller proportion deteriorate and the rest remain largely unaffected. This discovery is significant because it suggests noise isn't always detrimental, and that a more nuanced, structure-aware approach to optimisation, rather than blanket mitigation, could unlock greater potential in near-term quantum machine learning applications.

The study meticulously examined 55 independently initialised quantum graph neural networks trained on a subset of the QM9 dataset, a collection of quantum chemical properties for over 134,000 organic molecules. Researchers focused on predicting the energy gap between the highest occupied and lowest unoccupied molecular orbitals (the HOMO-LUMO gap), a crucial property governing molecular reactivity and electronic behaviour. By analysing individual model responses rather than population averages, the team uncovered a strong negative correlation (r = −0.62) between a model's initial performance and its benefit from noise: under-optimised models benefited from noise acting as an implicit regularizer, while well-converged models experienced disruption.
This discovery challenges the assumption that noise is universally harmful and highlights the importance of considering initialization quality. Furthermore, the observed optimal noise level fell below theoretical predictions, indicating the presence of error cancellation within the structured quantum circuits employed. This suggests that specific circuit designs can inherently mitigate the negative effects of noise, leading to unexpectedly robust performance. The research establishes that noise effects are critically dependent on initialization quality and are not uniformly detrimental, advocating for a shift from universal noise mitigation towards structure- and noise-aware optimization strategies. This innovative approach could unlock new possibilities for leveraging noise as a computational resource in the NISQ era, potentially improving the efficiency and accuracy of quantum machine learning algorithms.
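The correlation analysis described here can be illustrated with a short sketch. The data below is synthetic, invented purely to mimic the reported pattern (poorly initialised models gaining more from noise); only the model count (55) and the statistic (Pearson correlation) come from the article.

```python
import numpy as np

# Synthetic illustration only -- not the authors' data. We fabricate 55 models
# (matching the study's count) whose noise-induced gain shrinks as baseline
# performance rises, then compute the Pearson correlation between the two,
# the statistic behind the reported r = -0.62.
rng = np.random.default_rng(0)
baseline_r2 = rng.uniform(0.3, 0.9, size=55)                          # noiseless R^2 per model
noise_gain = -30.0 * (baseline_r2 - 0.6) + rng.normal(0, 4, size=55)  # dR^2 in percent
r = np.corrcoef(baseline_r2, noise_gain)[0, 1]
print(f"Pearson r = {r:.2f}")  # strongly negative by construction
```

A negative r of this kind says nothing about causation on its own; the paper's regularization interpretation comes from examining individual training trajectories.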

The team’s quantum graph neural network architecture builds upon the equivariantly diagonalizable unitary (EDU) framework, guaranteeing permutation equivariance so that predictions remain consistent regardless of atom labelling. This architecture utilizes 12 qubits, with specialized assignments for encoding atomic information, processing local interactions, and performing graph-level readout via parameterized controlled rotations. Initializing the master qubit in the |+⟩ state enables quantum graph pooling, effectively learning relationships between atoms within the molecular structure. By meticulously controlling and analysing the impact of noise on these networks, the researchers have provided compelling evidence that noise is not simply an obstacle to overcome, but a potentially valuable tool for enhancing quantum machine learning.

QM9 Molecular Energy Gap Prediction Setup

Scientists investigated an unexpected phenomenon: the potential benefits of noise in quantum computing systems. They randomly sampled 2,000 molecules from the QM9 dataset, a collection of approximately 134,000 organic molecules, each containing up to nine heavy atoms (carbon, nitrogen, oxygen, and fluorine), represented as undirected graphs to predict the energy gap between the highest occupied and lowest unoccupied molecular orbitals. The study split the dataset into training (80%), validation (10%), and test (10%) sets, ensuring consistent divisions across all experimental runs and evaluating performance using the R² score. To quantify the impact of noise, researchers calculated the relative performance change, defined as ΔR² = (R²_noisy − R²_noiseless) / R²_noiseless × 100%, classifying responses as positive (ΔR² > 2%), negative (ΔR² < −2%), or neutral otherwise. A single-layer architecture was adopted to maintain consistent circuit depth and noise accumulation, enabling the team to attribute variations in noise response directly to initialization quality.
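The relative-change metric and its classification can be sketched in a few lines. The 2% positive cutoff is stated in the article; treating the negative cutoff as symmetric (−2%) is an assumption.

```python
def classify_noise_response(r2_noiseless, r2_noisy, threshold=2.0):
    """Classify a model's noise response from its noiseless and noisy R^2 scores.

    Relative change: dR2 = (R2_noisy - R2_noiseless) / R2_noiseless * 100%.
    The 2% positive cutoff follows the article; the symmetric negative
    cutoff (-2%) is an assumption.
    """
    delta = (r2_noisy - r2_noiseless) / r2_noiseless * 100.0
    if delta > threshold:
        return delta, "positive"
    if delta < -threshold:
        return delta, "negative"
    return delta, "neutral"

# Example: a model whose R^2 rises from 0.60 to 0.66 under noise
delta, label = classify_noise_response(0.60, 0.66)
print(f"dR2 = {delta:.1f}% -> {label}")  # dR2 = 10.0% -> positive
```

Normalizing by the noiseless score means the same absolute R² change counts for more in a weaker model, which is consistent with the paper's focus on initialization quality.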
The EDU operations processed molecular bonds based on their type (single, double, triple, or aromatic), with parameters learned separately for each, implementing quantum message passing between connected atoms via parameterized RY(θij), RZ(φij), and RZZ(ψij) gates. Circuit measurements utilized Pauli-Z expectation values on all 12 qubits, generating a 12-dimensional quantum feature vector subsequently processed by a classical feedforward neural network with four layers to predict the HOMO-LUMO gap, as defined by the equation f(G, θ) = NN_classical(z(G, θ)). This approach revealed that noise effects are not uniform, but systematically correlate with baseline model performance, suggesting that noise can act as an implicit regularizer for under-optimized models while disrupting well-converged ones.

Noise surprisingly boosts graph neural network performance

Scientists have discovered that noise, conventionally considered detrimental to computation, can surprisingly enhance the performance of graph neural networks. Experiments conducted on graph neural networks designed for molecular property prediction revealed a heterogeneous response to noise, challenging the long-held belief that noise is always a hindrance. Among 55 independently initialized models with identical architecture, approximately one-third exhibited performance improvements under moderate noise conditions, while a smaller fraction deteriorated and the remainder were marginally affected.
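The per-bond gate sequence (RY, RZ, RZZ, then Pauli-Z readout) can be sketched with explicit matrices on a single two-qubit bond. This is a minimal numpy illustration, not the authors' implementation: the gate ordering and the shared angle per qubit are assumptions, and the real circuit spans 12 qubits with per-bond-type parameters.

```python
import numpy as np

def ry(t):
    """Single-qubit Y rotation."""
    return np.array([[np.cos(t / 2), -np.sin(t / 2)],
                     [np.sin(t / 2),  np.cos(t / 2)]])

def rz(p):
    """Single-qubit Z rotation (phase)."""
    return np.diag([np.exp(-1j * p / 2), np.exp(1j * p / 2)])

def rzz(s):
    """Two-qubit ZZ rotation: phases e^{-is/2} on |00>,|11>, e^{+is/2} on |01>,|10>."""
    return np.diag(np.exp(-1j * s / 2 * np.array([1, -1, -1, 1])))

def bond_update(theta, phi, psi):
    """Apply one EDU-style bond unitary to |++> and return <Z> on each qubit."""
    plus = np.array([1.0, 1.0]) / np.sqrt(2)
    state = np.kron(plus, plus)
    U = np.kron(ry(theta), ry(theta)) @ np.kron(rz(phi), rz(phi)) @ rzz(psi)
    state = U @ state
    Z, I = np.diag([1.0, -1.0]), np.eye(2)
    z0 = np.real(state.conj() @ np.kron(Z, I) @ state)
    z1 = np.real(state.conj() @ np.kron(I, Z) @ state)
    return z0, z1

z0, z1 = bond_update(0.3, 0.7, 1.1)
print(z0, z1)  # two entries of the quantum feature vector fed to the classical head
```

In the full architecture these expectation values form the 12-dimensional vector z(G, θ) that the four-layer classical network maps to the predicted HOMO-LUMO gap.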

The team measured noise levels of ε ∈ {0.000, 0.005, 0.010, 0.015}, corresponding to per-gate fidelities of 100%, 99.5%, 99%, and 98.5% respectively. The lowest nonzero value, ε = 0.005, representing 99.5% fidelity, is comparable to current hardware capabilities, with state-of-the-art two-qubit gate fidelities reaching 99.5% on neutral-atom arrays and up to 99.99% on trapped-ion systems.
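The fidelity figures above follow directly from the per-gate error rate (fidelity = 1 − ε). As a hedged back-of-envelope addition, the sketch below also estimates how such errors compound over a circuit via (1 − ε)^n; the 50-gate count is illustrative, not taken from the paper.

```python
# Per-gate error rate eps vs gate fidelity, plus a rough circuit-level
# estimate (1 - eps)**n_gates for the probability that no gate errs.
# The 50-gate circuit size is illustrative, not from the paper.
for eps in (0.000, 0.005, 0.010, 0.015):
    fidelity = 1.0 - eps
    p_clean_50 = (1.0 - eps) ** 50
    print(f"eps={eps:.3f}  gate fidelity={fidelity:.1%}  "
          f"P(no error, 50 gates)={p_clean_50:.3f}")
```

Even at 99.5% gate fidelity, a modest circuit already has a sizeable chance of at least one error, which is why finding regimes where such errors help rather than hurt matters for near-term hardware.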

Results demonstrate a wide range of responses, spanning a 20.5 percentage point spread, exceeding five standard deviations of typical training variance and statistically significant under a permutation test.

Noise benefits under-optimised molecular prediction models surprisingly well

Scientists have demonstrated that noise, typically considered detrimental to computation, can surprisingly improve the performance of certain machine learning models. Through experiments employing graph neural networks designed for molecular property prediction, researchers discovered heterogeneous responses to noise across independently initialized models: roughly one-third exhibited enhanced performance, while others suffered a decline or remained largely unaffected. This suggests a nuanced relationship between noise and model optimisation, challenging the conventional view of noise as a purely disruptive force.

The findings establish a strong negative correlation between a model's initial performance and its benefit from noise; under-optimised models appear to be regularised by noise, allowing them to escape suboptimal local minima, whereas well-converged models are destabilised. Notably, the optimal noise level observed was lower than predicted by existing theory, hinting at noise cancellation within the structured circuits of the networks.

The authors acknowledge that their study focused on single-layer quantum graph neural networks and a specific molecular property prediction task, limiting the generalisability of these results to more complex architectures and learning problems. Future research should investigate how noise impacts deeper networks and explore the interplay between error accumulation, interlayer correlations, and optimal depth-noise trade-offs, potentially leading to more effective noise-aware optimisation strategies for quantum machine learning.
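The permutation test mentioned above can be sketched generically. This is a standard one-sample sign-flip test on synthetic response values, not a reproduction of the paper's exact test statistic (which compares the response spread against training variance).

```python
import random

def sign_flip_permutation_test(deltas, n_perm=5000, seed=0):
    """One-sample sign-flip permutation test: is the mean noise response
    different from zero? A generic sketch, not the authors' exact statistic."""
    rng = random.Random(seed)
    observed = abs(sum(deltas) / len(deltas))
    hits = 0
    for _ in range(n_perm):
        flipped = [d * rng.choice((-1, 1)) for d in deltas]
        if abs(sum(flipped) / len(flipped)) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)  # add-one smoothing avoids p = 0

# Synthetic dR^2 values (percent): a uniformly positive shift should give small p
p = sign_flip_permutation_test([3.1, 2.4, 4.0, 1.8, 2.9, 3.5, 2.2, 3.8])
print(f"p = {p:.4f}")  # small: an all-positive pattern is unlikely under sign flips
```

Permutation tests are a natural fit here because only 55 models are available and no distributional assumption is needed on the ΔR² values.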
👉 More information
🗞 Rethinking Quantum Noise in Quantum Machine Learning: When Noise Improves Learning
🧠 ArXiv: https://arxiv.org/abs/2601.13275


Tags

energy-climate
quantum-computing
quantum-machine-learning

Source Information

Source: Quantum Zeitgeist