Quantum Kernel Methods Show Competitive Radar Classification with 133-Qubit IBM Processor

Quantum Zeitgeist
5 min read
⚡ Quantum Brief
Indian researchers demonstrated a Quantum Support Vector Machine (QSVM) on a 133-qubit IBM processor for radar micro-Doppler classification, achieving competitive performance despite NISQ hardware limitations. The team combined classical feature extraction with quantum kernel encoding to classify aerial targets: PCA reduced the feature dimensionality before quantum encoding via a ZZFeatureMap, enabling efficient processing on IBM's Torino (133-qubit) and Fez (156-qubit) processors. Hardware tests on IBM's Heron r2 architecture showed improved stability and fidelity, mitigating noise and decoherence effects, and measurement shot counts were optimised to improve the accuracy of quantum distributions on physical devices. The QSVM matched classical SVM accuracy while requiring significantly fewer features, highlighting quantum feature efficiency; classical methods still hold a slight edge in prediction accuracy, though the quantum approach excels at dimensionality reduction. Future work targets higher qubit counts, real-time quantum machine learning pipelines, and validation on real-world radar datasets. Error mitigation and hardware advancements remain critical for practical deployment.

Researchers are increasingly exploring quantum machine learning for complex signal-processing tasks, and this study investigates the practical application of quantum kernel methods to radar micro-Doppler classification. Vikas Agnihotri and Jasleen Kaur from the National Institute of Technology, Rourkela, together with Sarvagya Kaushik from the Indian Institute of Technology, Dhanbad, and colleagues, demonstrate a Quantum Support Vector Machine (QSVM) capable of classifying aerial targets from radar signatures, even within the limitations of current noisy intermediate-scale quantum (NISQ) hardware. By combining classical feature extraction with quantum kernel encoding and evaluating performance on both simulators and IBM quantum processors, the work offers a crucial assessment of the feasibility and challenges of deploying quantum algorithms for real-world radar applications, potentially paving the way for more efficient and accurate target-recognition systems.

The central idea is to map micro-Doppler patterns into an expanded quantum state space, where subtle differences in target dynamics become easier to separate. The ZZFeatureMap, an entangling feature map, allows the QSVM to construct an optimal separating hyperplane within a high-dimensional Hilbert space, resolving nonlinearities that challenge classical kernel methods; reducing the feature dimensionality beforehand is crucial for encoding complex radar signals efficiently on current devices. For comparison, the researchers implemented a classical baseline SVM using a Radial Basis Function (RBF) kernel on a 15-dimensional feature set. The study also systematically investigated the impact of noise, decoherence, and measurement shot count on quantum kernel estimation, providing a direct comparison between simulator-based and hardware-based QSVM implementations.
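The paper itself is described here without code, but the classical reference point is easy to picture. The following is a minimal sketch, assuming scikit-learn and random placeholder data in place of the study's 15 extracted micro-Doppler features, of an RBF-kernel SVM baseline of the kind the authors report using.

```python
# Hypothetical sketch of an RBF-kernel SVM baseline on a 15-dimensional feature set.
# The data below is random placeholder data, not the radar features from the study.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 15))            # 15 classical features per radar return (placeholder)
y = rng.integers(0, 2, size=200)          # binary aerial-target labels (placeholder)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# Standardise the features, then fit a support vector classifier with an RBF kernel.
baseline = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
baseline.fit(X_train, y_train)
print("RBF-SVM test accuracy:", baseline.score(X_test, y_test))
```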
Classical features were extracted and then reduced using Principal Component Analysis (PCA) to facilitate efficient quantum encoding. The reduced feature vectors were embedded into a quantum kernel-induced feature space via a fully entangled ZZFeatureMap before classification using a kernel-based QSVM. Experiments initially evaluated performance using a quantum simulator and subsequently validated the results on NISQ-era quantum hardware, specifically the IBM Torino (133-qubit) and IBM Fez (156-qubit) processors.
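A minimal sketch of this pipeline, assuming Qiskit with qiskit-machine-learning and using random placeholder features rather than the study's radar data, might look as follows; the component count, scaling, and kernel settings are illustrative choices, not the authors' configuration.

```python
# Illustrative QSVM pipeline: PCA reduction -> ZZFeatureMap encoding -> quantum kernel SVM.
# Placeholder data and parameter choices; not the configuration used in the paper.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import SVC
from qiskit.circuit.library import ZZFeatureMap
from qiskit_machine_learning.kernels import FidelityQuantumKernel

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 15))             # classical micro-Doppler features (placeholder)
y = np.repeat([0, 1], 20)                 # two target classes (placeholder)

n_qubits = 4                              # assumed number of retained PCA components (one per qubit)
X_red = PCA(n_components=n_qubits).fit_transform(X)
X_red = MinMaxScaler(feature_range=(0, np.pi)).fit_transform(X_red)  # scale into rotation-angle range

# A fully entangled ZZFeatureMap embeds each reduced vector into a 2^n-dimensional Hilbert space.
feature_map = ZZFeatureMap(feature_dimension=n_qubits, reps=2, entanglement="full")
quantum_kernel = FidelityQuantumKernel(feature_map=feature_map)

# Precompute the quantum kernel matrix (simulated here) and train a kernel-based SVM on it.
K_train = quantum_kernel.evaluate(x_vec=X_red)
qsvm = SVC(kernel="precomputed").fit(K_train, y)
print("QSVM training accuracy:", qsvm.score(K_train, y))
```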

Results demonstrate the QSVM achieved competitive classification performance compared to classical SVM baselines, while operating with substantially reduced feature dimensionality.
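On real hardware, each entry of the quantum kernel matrix must be estimated from a finite number of measurement shots, which is why the shot-count and noise effects discussed below matter. The sketch that follows, assuming Qiskit with the Aer simulator as a stand-in for the IBM processors, estimates a single kernel entry with the standard compute-uncompute (fidelity) circuit; the feature values and shot count are arbitrary.

```python
# Estimate one quantum kernel entry K(x, z) = |<phi(x)|phi(z)>|^2 from measurement shots.
# Aer simulator used as a stand-in for the IBM backends; values and shot count are illustrative.
import numpy as np
from qiskit import QuantumCircuit, transpile
from qiskit.circuit.library import ZZFeatureMap
from qiskit_aer import AerSimulator

n_qubits = 4
feature_map = ZZFeatureMap(feature_dimension=n_qubits, reps=2, entanglement="full")
x = np.array([0.3, 1.1, 0.7, 2.0])        # two reduced feature vectors (placeholders)
z = np.array([0.4, 0.9, 0.8, 1.7])

# Compute-uncompute circuit: prepare phi(x), then undo phi(z); the probability of the
# all-zeros measurement outcome equals the kernel entry.
circuit = QuantumCircuit(n_qubits)
circuit.compose(feature_map.assign_parameters(x), inplace=True)
circuit.compose(feature_map.assign_parameters(z).inverse(), inplace=True)
circuit.measure_all()

backend = AerSimulator()
shots = 4096                              # more shots -> lower statistical error in the estimate
counts = backend.run(transpile(circuit, backend), shots=shots).result().get_counts()
kernel_entry = counts.get("0" * n_qubits, 0) / shots
print("estimated K(x, z):", kernel_entry)
```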

The team measured performance across varying feature set sizes, demonstrating the efficiency gains from dimensionality reduction prior to quantum encoding. The entangling ZZFeatureMap creates a quantum state space whose dimensionality grows exponentially with the number of qubits (2^n); mapping data into this expanded space made the micro-Doppler patterns more separable and allowed an optimal separating hyperplane to be constructed within the high-dimensional Hilbert space.

Hardware experiments revealed the impact of noise, decoherence, and measurement shot count on quantum kernel estimation, with improved stability and fidelity observed on the newer Heron r2 architecture. Optimising the measurement shot count proved crucial for stabilising quantum distributions on physical hardware, and the IBM Fez processor demonstrated greater suitability for QSVM circuits thanks to its improved accuracy and robustness. The findings establish that QSVMs can effectively manage the nonlinearities inherent in radar signals, even within the constraints of NISQ-era hardware. While classical SVMs currently maintain a slight edge in prediction accuracy, QSVMs offer enhanced feature efficiency, achieving comparable results after substantial dimensionality reduction through PCA.

The authors acknowledge limitations related to quantum hardware reliability and the need for advanced error mitigation techniques. Future research directions include improved hardware, higher qubit counts for direct feature encoding, variational quantum classifiers, real-time quantum machine learning pipelines for edge devices, and validation on real-world radar datasets to assess robustness and generalisation.

👉 More information

🗞 Practical Evaluation of Quantum Kernel Methods for Radar Micro-Doppler Classification on Noisy Intermediate-Scale Quantum (NISQ) Hardware

🧠 ArXiv: https://arxiv.org/abs/2601.22194

Tags

quantum-machine-learning
telecommunications
quantum-investment
quantum-algorithms
quantum-hardware
ibm
india-quantum-computing
india-nqm

Source Information

Source: Quantum Zeitgeist