Quantum Extreme Learning on Digital Quantum Processors

Quantum Zeitgeist
⚡ Quantum Brief
IBM researchers, alongside ETH Zurich and the University of Basel, developed a Quantum Extreme Learning Machine (QELM) that overcomes noise and concentration effects in superconducting quantum processors, achieving scalable operation with 124 qubits and 5,000+ two-qubit gates. The QELM framework combines hyperparameter tuning and local eigentask analysis to stabilize performance across tasks like time-series forecasting and satellite image classification, matching classical methods while operating in pre-fault-tolerant regimes. A key breakthrough is the system’s ability to balance stability and expressivity by optimizing eigentasks—principal components of quantum states—that filter noise while preserving critical information, enabling consistent scalability. Multi-objective calibration of observable variability, system capacity, and task performance mitigates hardware imperfections, achieving a five-fold improvement in gate fidelity for large-scale circuits. This advancement demonstrates transferable operating regimes from smaller to larger systems, paving the way for practical quantum machine learning applications in real-world data processing.

Timothée Dao and colleagues at IBM Research Europe, Zurich, in collaboration with ETH Zurich and the University of Basel, have created a Quantum Extreme Learning Machine (QELM) for existing superconducting platforms. The QELM tackles hardware noise and the concentration effects that typically reduce performance as quantum systems grow in size, successfully operating with up to 124 qubits and circuits containing over 5,000 two-qubit gates. By combining a new hyperparameter tuning strategy with a local eigentask analysis, the team identified operating regimes that maintain performance and transferability across different tasks and system sizes, achieving results comparable to established classical methods in time-series forecasting and satellite image classification. This provides a key framework for large-scale, pre-fault-tolerant quantum machine learning and enables more advanced reservoir-based techniques.

Scalable quantum machine learning via a superconducting quantum extreme learning machine

A Quantum Extreme Learning Machine (QELM) now surpasses the previous limit of approximately 100 qubits for scalable quantum machine learning, utilising circuits containing 5,084 two-qubit gates. This advance enables the exploration of complex quantum dynamics previously inaccessible due to the rapid accumulation of errors and the loss of signal distinguishability in larger systems. Reservoir computing, a bio-inspired approach to machine learning, leverages the rich, non-linear dynamics of such systems to process temporal data. Scaling these systems is challenging, however: inherent hardware noise and 'concentration effects', in which the quantum state collapses into a limited number of possibilities, can erase the ability to distinguish between different inputs and outputs.
The QELM framework, implemented on superconducting platforms, employs a new hyperparameter tuning strategy and a local eigentask analysis to identify stable operating points. This ensures strong performance across diverse tasks such as time-series forecasting and satellite image classification, with results competitive with classical methods. Superconducting qubits, based on the principles of superconductivity and quantum mechanics, offer a promising avenue for building scalable quantum computers, though they are particularly susceptible to environmental noise. This work represents a major step towards practical, pre-fault-tolerant quantum computation, establishing a viable path for extending reservoir-based techniques to real-world applications.
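The extreme-learning idea behind the QELM, a fixed, untrained dynamical system whose outputs feed a small trained linear readout, can be illustrated with a purely classical sketch. Everything here (the random tanh feature map, sizes, and the ridge parameter) is an illustrative assumption standing in for the quantum reservoir, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Classical stand-in for the quantum reservoir: a fixed, random, non-linear
# feature map.  In the QELM these features would be expectation values of
# measured observables; the random tanh projection here is purely illustrative.
n_inputs, n_features = 4, 64
W_reservoir = rng.normal(size=(n_inputs, n_features))  # fixed, never trained

def reservoir_features(x):
    return np.tanh(x @ W_reservoir)

# Only the linear readout is trained (here by ridge regression), which is the
# defining trait of extreme learning machines.
X = rng.normal(size=(200, n_inputs))
y = np.sin(X.sum(axis=1))                     # synthetic regression target
Phi = reservoir_features(X)
ridge = 1e-3
W_out = np.linalg.solve(Phi.T @ Phi + ridge * np.eye(n_features), Phi.T @ y)

train_mse = np.mean((Phi @ W_out - y) ** 2)
print(f"training MSE: {train_mse:.4f}")
```

Because only the readout weights are fit, training reduces to a single linear solve, which is what makes the approach attractive for noisy quantum hardware: the expensive, untrained dynamics only need to produce distinguishable features.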

The Statlog Landsat Satellite dataset, a benchmark for classifying land cover types from remotely sensed imagery, further validated performance, demonstrating the QELM's potential beyond simple time-series analysis. This dataset comprises multi-spectral data representing different land cover classes, providing a challenging test of the QELM's classification capabilities. Stability as the number of qubits increased to 124 was maintained through the system's hyperparameter tuning strategy, which simultaneously optimises observable variability, capacity, and task performance. A computationally efficient technique, local eigentask analysis, was also developed to select key features, improving information retrieval and reducing the impact of noise. The 'eigentasks' represent the principal components of the quantum state, capturing the most significant information for a given task. By focusing on these key features, the system can filter out irrelevant noise and improve its overall performance.

Multi-objective calibration stabilises performance in large-scale superconducting quantum circuits

Hyperparameter tuning proved central to the success of this quantum machine learning approach, functioning much like fine-tuning the dials on a radio to achieve the clearest signal. Quantum systems are inherently susceptible to noise from sources such as electromagnetic interference and imperfections in qubit fabrication; without careful calibration, these errors accumulate and obscure meaningful results. The tuning process adjusts parameters that control the quantum circuit, such as the duration and amplitude of the microwave pulses used to manipulate the qubits. Actively seeking regimes where observable variability, system capacity, and task performance are balanced mitigated the impact of hardware imperfections and concentration effects, phenomena that typically degrade performance as the number of qubits increases.
Observable variability refers to the range of possible quantum states the system can access, while system capacity represents its ability to store and process information. This multi-objective strategy allowed the identification of stable operating points, vital for maintaining accuracy in larger systems. Finding these points requires exploring a vast parameter space, often with sophisticated optimisation algorithms.

Balancing stability and expressivity in a 124-qubit quantum extreme learning machine

Reservoir computing holds promise for processing time-dependent data, but struggles to maintain accuracy as quantum systems scale up.
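The multi-objective calibration described above, balancing observable variability, system capacity, and task performance, might be sketched as a scoring rule over candidate operating points. The metric values, candidate names, and the weighted-sum aggregation below are illustrative assumptions, not the paper's actual formulas; a real search could instead track a Pareto front:

```python
import numpy as np

def score(observable_variability, capacity, task_performance,
          weights=(1.0, 1.0, 1.0)):
    """Illustrative multi-objective score: a normalised weighted sum of the
    three calibration targets named in the text (all assumed in [0, 1])."""
    w = np.asarray(weights, dtype=float)
    metrics = np.array([observable_variability, capacity, task_performance])
    return float(w @ metrics / w.sum())

# Hypothetical candidate operating points, e.g. different pulse settings.
# A weak drive is stable but inexpressive; a strong drive is expressive but
# noisy; the balanced point trades the objectives off against each other.
candidates = {
    "weak_drive":   (0.2, 0.9, 0.55),
    "balanced":     (0.6, 0.7, 0.80),
    "strong_drive": (0.9, 0.3, 0.50),
}
best = max(candidates, key=lambda name: score(*candidates[name]))
print(best)  # the balanced candidate wins under equal weights
```

The point of the sketch is that no single objective is maximised: the selected operating point is the one where the three quantities are jointly acceptable, which is what lets the regime remain stable as the system grows.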

The team has now demonstrated a QELM utilising 124 qubits, a significant step towards overcoming these limitations. Analysis reveals a key trade-off between stability and expressivity. Retaining too many 'eigentasks', the key features extracted from the quantum data, introduces noise, while retaining too few limits the system's ability to discern patterns. The number of eigentasks relates directly to the dimensionality of the Hilbert space, the mathematical space describing all possible quantum states. A higher dimensionality allows more complex patterns to be represented, but also increases susceptibility to noise.
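The article describes eigentasks as principal components of the measured quantum data, so the stability-expressivity trade-off can be sketched with a classical principal-component truncation. The synthetic data below (a few informative directions buried in isotropic noise) and all sizes are illustrative assumptions, not the paper's analysis:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in for measured QELM outputs: n_signal informative
# directions plus isotropic noise mimicking shot noise on the readout.
n_samples, n_features, n_signal = 300, 50, 5
signal = rng.normal(size=(n_samples, n_signal)) @ rng.normal(size=(n_signal, n_features))
features = signal + 0.5 * rng.normal(size=(n_samples, n_features))

# Principal components of the centred data play the role of eigentasks.
centred = features - features.mean(axis=0)
U, s, Vt = np.linalg.svd(centred, full_matrices=False)
explained = np.cumsum(s ** 2) / np.sum(s ** 2)

def truncate(k):
    """Project onto the top-k eigentasks: too few limits expressivity,
    too many reintroduces the noise the truncation was meant to filter."""
    return centred @ Vt[:k].T

print(f"variance explained by top {n_signal} components: {explained[n_signal - 1]:.3f}")
```

Because the informative directions dominate the spectrum, a small number of retained components already captures most of the structure; the remaining components are mostly noise, which is the trade-off the text describes.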

The team identified a stable operating regime, observable at smaller scales, that consistently transferred to larger systems, suggesting a path towards building more complex quantum machine learning tools. This transferability is crucial for scalability, as it allows insights gained from smaller systems to be applied to larger, more powerful ones. By carefully balancing system stability against the ability to discern patterns in data, gate fidelity increased five-fold on forecasting and image classification tasks using up to 124 qubits and over 5,000 quantum operations. Gate fidelity measures the accuracy of quantum operations, indicating how well the qubits maintain their quantum state during manipulation. Retaining an optimal number of eigentasks remains a delicate balance between stability and pattern-recognition ability, but this trade-off does not invalidate the progress; rather, it highlights the complexity of using quantum systems for practical computation. Further research will focus on refining the hyperparameter tuning strategy and exploring alternative methods for mitigating noise and concentration effects, paving the way for even larger and more powerful quantum machine learning systems.

👉 More information
🗞 Breaking concentration barriers for quantum extreme learning on digital quantum processors
🧠 ArXiv: https://arxiv.org/abs/2603.13005


Tags

superconducting-qubits
quantum-machine-learning
government-funding
quantum-hardware
partnership

Source Information

Source: Quantum Zeitgeist