
Quantum Computers Now Guide Classical Solvers to Find Better Solutions

Quantum Zeitgeist
⚡ Quantum Brief
Researchers from the Naval Research Laboratory, Fermi National Accelerator Laboratory, and Old Dominion University have developed D-QEO, a hybrid quantum-classical framework that uses a 50-qubit processor to guide classical solvers on high-dimensional optimization problems. The system acts as a "topographical preconditioner," generating high-quality seed points for GPU-accelerated classical solvers, avoiding barren plateaus and the limitations of near-term quantum hardware while reducing computational effort by orders of magnitude. Benchmark tests on the 10-dimensional Rastrigin and Ackley functions showed that D-QEO eliminated the exponential failure rates of classical methods, achieving near-perfect convergence with just 8,000 quantum evaluations. The framework divides the 50-qubit space into 5-qubit subcircuits, creating diverse candidate solutions that reshape the search landscape and drastically reduce local-minima traps before classical refinement. While currently limited to separable functions, the team aims to extend D-QEO to non-separable problems, potentially unlocking applications in finance, materials science, and machine learning.

A new framework, Distributed Quantum-Enhanced Optimisation (D-QEO), addresses optimisation problems in high-dimensional search spaces, where classical algorithms often fail to identify global minima because search volumes grow exponentially. Dominik Soós at the Naval Research Laboratory and colleagues at Fermi National Accelerator Laboratory and Old Dominion University present a system that uses quantum processing as a topographical preconditioner rather than a direct solver. The approach divides a 50-qubit space into manageable sub-spaces via 5-qubit subcircuits to generate high-quality seed points for a classical GPU-accelerated solver, sidestepping the limitations of near-term quantum hardware and the problem of barren plateaus. Benchmarking on established test functions shows that D-QEO mitigates the exponential failure rates of classical methods and sharply reduces the computational effort needed for convergence, offering a pragmatic pathway for integrating quantum resources into complex global search tasks.

Quantum topographical preconditioning overcomes exponential scaling in function optimisation

The D-QEO framework achieved a striking result: it prevented the exponential failure rates seen in purely classical algorithms when optimising high-dimensional functions. On the benchmark problems tested, the 10-dimensional Rastrigin and Ackley functions, classical algorithms typically falter because the search space grows exponentially: as the number of variables increases, the volume of the search space expands exponentially, rendering exhaustive search impractical and stochastic methods increasingly unreliable. This phenomenon, known as the ‘curse of dimensionality’, severely limits the applicability of classical optimisation techniques to complex, real-world problems.
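For context, the two benchmark functions have simple closed forms but a lattice of local minima that multiplies exponentially with dimension. A minimal Python sketch using the standard textbook definitions (the paper's own implementation details are not given here):

```python
import math

def rastrigin(x):
    # Rastrigin: global minimum f(0) = 0; a local minimum near every
    # integer lattice point, so roughly 10^n basins in n dimensions
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

def ackley(x):
    # Ackley: global minimum f(0) = 0, surrounded by an exponentially
    # large number of shallow ripples
    n = len(x)
    s1 = sum(xi * xi for xi in x) / n
    s2 = sum(math.cos(2 * math.pi * xi) for xi in x) / n
    return -20 * math.exp(-0.2 * math.sqrt(s1)) - math.exp(s2) + 20 + math.e

print(rastrigin([0.0] * 10))  # 0.0 at the global optimum
```

Any stochastic solver started at a random point in 10 dimensions is overwhelmingly likely to land in one of the non-global basins, which is exactly the failure mode the quantum preconditioner targets.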
Utilising a 50-qubit quantum processing unit (QPU) as a topographical preconditioner, the system narrows the search area and generates high-quality seed points for a classical GPU-accelerated solver to refine, circumventing the limitations of near-term quantum hardware. The quantum component doesn’t attempt to solve the optimisation problem directly; instead, it intelligently reshapes the search landscape, making it more amenable to classical algorithms. An 8,000-evaluation budget for the quantum preconditioning phase yielded near-perfect convergence across all tested dimensions, a stabilisation of the optimisation process also observed with the Ackley function. This represents a significant improvement over classical methods, whose success rates typically fall rapidly as dimensionality increases.

The methodology divides the 50-qubit quantum computer into multiple 5-qubit subcircuits, each responsible for exploring a portion of the search space. These subcircuits generate a diverse set of candidate solutions, which are evaluated and used to build a probability distribution favouring regions with lower function values. This distribution is then sampled to produce the seed points for the classical solver.

Analysis revealed that the quantum preconditioner sharply reduced the number of trapped local minima within the preconditioned search space, effectively narrowing the area the classical solver needed to explore. Local minima are points where the function value is lower than at nearby points but not the lowest across the entire search space; classical algorithms can easily become stuck in them, preventing them from finding the global optimum.
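The candidate-weight-seed-refine loop described above can be sketched classically. In this sketch the quantum subcircuit sampling stage is simulated with uniform random draws and the GPU solver is replaced by a simple coordinate descent, so every name, weighting choice, and parameter below is illustrative rather than the authors' implementation:

```python
import math
import random

def rastrigin(x):
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

def precondition_seeds(f, dim, n_candidates=800, n_seeds=20, beta=1.0, bound=5.12):
    # Stand-in for the quantum sampling stage: draw diverse candidates.
    # (The paper assigns blocks of the search space to 5-qubit subcircuits;
    # here uniform random sampling simulates that diversity.)
    cands = [[random.uniform(-bound, bound) for _ in range(dim)]
             for _ in range(n_candidates)]
    vals = [f(c) for c in cands]
    # Probability distribution favouring low function values
    # (softmax over -beta * f, shifted for numerical stability)
    m = min(vals)
    weights = [math.exp(-beta * (v - m)) for v in vals]
    return random.choices(cands, weights=weights, k=n_seeds)

def local_refine(f, x0, step=0.5, iters=200):
    # Simple coordinate-wise descent as a stand-in for the GPU solver
    x, fx = list(x0), f(x0)
    for _ in range(iters):
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                y = list(x)
                y[i] += d
                fy = f(y)
                if fy < fx:
                    x, fx, improved = y, fy, True
        if not improved:
            step *= 0.5
            if step < 1e-6:
                break
    return x, fx

random.seed(0)
seeds = precondition_seeds(rastrigin, dim=10)
best = min((local_refine(rastrigin, s) for s in seeds), key=lambda t: t[1])
```

Because the seeds are drawn from a distribution concentrated on low-value regions, the classical refinement starts inside far better basins than a uniformly random restart would, which is the essence of the preconditioning step.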
Even with a quantum evaluation budget of just 200 function evaluations, the success rate remained high, contrasting sharply with the exponential failure rate of purely classical methods at this dimensionality. This demonstrates the efficiency of the quantum preconditioning step: relatively few quantum resources yield a substantial improvement in performance. Further analysis confirmed the effectiveness of the D-QEO framework in shaping the optimisation landscape. The technique demonstrably sidesteps the exponential scaling issues plaguing conventional optimisation methods, offering a practical pathway for near-term quantum devices. A GPU-accelerated classical solver further enhances performance, allowing rapid refinement of the seed points generated by the quantum computer. The reduction in local minima produced faster convergence and improved solution quality, highlighting the benefit of deploying quantum resources before classical refinement.

The combination of quantum preconditioning and classical refinement represents a promising hybrid approach to complex optimisation problems, leveraging the strengths of both computing paradigms. It is particularly relevant in the current era of noisy intermediate-scale quantum (NISQ) technology, where quantum computers remain limited in size and prone to errors.

Limitations of separable function optimisation and potential for broader applicability

While this new Distributed Quantum-Enhanced Optimisation framework offers a compelling route to tackling notoriously difficult optimisation problems, its current reliance on separable functions is a significant constraint. Separable functions are those whose overall value is a sum of individual terms, each depending on only one variable. This allows the problem to be decomposed into independent subproblems that can be solved in parallel.
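To make the separability point concrete, here is a minimal sketch (a standard textbook decomposition, not the paper's code) that minimises the 10-dimensional Rastrigin function, which is separable, as ten independent one-dimensional searches:

```python
import math

def rastrigin_term(xi):
    # One separable term of the Rastrigin function; the full 10-D value
    # is simply the sum of ten such terms
    return xi * xi - 10 * math.cos(2 * math.pi * xi) + 10

def minimise_1d(f, lo=-5.12, hi=5.12, n=10000):
    # Dense 1-D grid search: tractable precisely because each
    # subproblem involves only one variable
    return min((lo + (hi - lo) * i / n for i in range(n + 1)), key=f)

# Solve the 10-D problem as ten independent 1-D problems
solution = [minimise_1d(rastrigin_term) for _ in range(10)]
value = sum(rastrigin_term(xi) for xi in solution)
```

Each subproblem needs only a one-dimensional sweep, so the total cost grows linearly with dimension instead of exponentially; a non-separable function, where the terms couple several variables, offers no such shortcut.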
The authors acknowledge this limitation, and the extent to which the benefits will carry over to the messier reality of non-separable problems, those in which variables are interconnected, remains an open question. Non-separable functions introduce dependencies between variables, making optimisation significantly more complex.

A wealth of prior work explores quantum-behaved particle swarm optimisation and hybrid particle swarm-BFGS strategies, suggesting alternative approaches to navigating complex landscapes, though these often struggle with scalability. Such methods apply quantum-inspired concepts directly to classical optimisation algorithms, but they tend to suffer from the same limitations as traditional methods on high-dimensional, non-separable problems.

That D-QEO functions best on easily divided problems does not diminish its value. With quantum processors acting as ‘topographical preconditioners’ to identify promising starting points, classical computers can find solutions in fewer steps. This hybrid approach is a significant step towards applying quantum computing to real-world challenges, even under current hardware limitations: by using a quantum processing unit (QPU) to precondition the search space before a classical computer refines the solution, the framework avoids the weaknesses of both wholly quantum and wholly classical methods.

The success of D-QEO hinges on its ability to explore the search space efficiently with a relatively small number of qubits. Exploiting the structure of separable functions allowed the 50-qubit quantum system to scale without entanglement overhead, successfully processing 20 million data points, and offers a pathway to optimisation challenges beyond the reach of current classical algorithms.
Future research will focus on extending the framework to non-separable functions, potentially through novel quantum algorithms or the integration of classical decomposition techniques. The potential applications are vast, ranging from financial modelling and materials discovery to machine learning and logistics optimisation.

In summary, the research demonstrated that a hybrid quantum-classical approach, Distributed Quantum-Enhanced Optimisation, prevents the exponential failure rates seen in purely classical algorithms when solving optimisation problems. By using a 50-qubit quantum processor to identify promising starting points, the number of subsequent classical iterations required to find a solution on 10-dimensional functions was significantly reduced. The framework exploits the structure of separable functions to divide the large 50-qubit search space into manageable sub-spaces, avoiding entanglement overhead. The authors intend to extend this work to non-separable functions, broadening the scope of solvable problems.

👉 More information
🗞 Distributed Quantum-Enhanced Optimization: A Topographical Preconditioning Approach for High-Dimensional Search
🧠 ArXiv: https://arxiv.org/abs/2604.20639


Tags

quantum-investment
quantum-computing
quantum-hardware
partnership

Source Information

Source: Quantum Zeitgeist