
Quantum Optimization & Logistics: Supply Chain & Routing Applications

Quantum optimization news covering logistics, supply chains, routing optimization, and QAOA, spanning combinatorial optimization research and enterprise deployments.

2,510 Articles
Updated Daily

Optimization problems—finding the best solution among millions or billions of possibilities—represent the most immediate commercial application for quantum computing. Logistics, supply chain management, manufacturing, and transportation face combinatorial explosion where classical algorithms struggle.

Quantum approaches include quantum annealing solving optimization natively using quantum tunneling; QAOA (Quantum Approximate Optimization Algorithm) as a gate-based alternative; and quantum-inspired algorithms providing immediate business value on classical hardware.
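As a concrete illustration of the kind of problem these methods target, a small load-balancing task can be cast as a QUBO (quadratic unconstrained binary optimization), the input format shared by quantum annealers and QAOA. This is a minimal sketch with invented parcel weights; the brute-force loop stands in for the quantum solver.

```python
import itertools
import numpy as np

# Toy load-balancing problem: split parcel weights between two delivery
# vehicles so the loads are as equal as possible. The weights are
# invented; the point is the QUBO encoding that quantum annealers and
# QAOA both consume.
weights = np.array([4, 7, 2, 5, 6])
total = int(weights.sum())

# With s_i = 2*x_i - 1, minimising (sum_i w_i s_i)^2 expands to
# x^T Q x + total^2 over binary variables x.
Q = 4.0 * np.outer(weights, weights)
np.fill_diagonal(Q, 4 * weights**2 - 4 * total * weights)

def cost(x):
    x = np.asarray(x)
    return float(x @ Q @ x) + total**2

# Brute force over the 2^5 assignments, standing in for the solver.
best = min(itertools.product([0, 1], repeat=len(weights)), key=cost)
side_a = int(weights[np.array(best) == 1].sum())
print(best, (side_a, total - side_a), cost(best))
```

A perfectly balanced split (here 12 versus 12) drives the QUBO cost to zero, which is exactly the ground state an annealer or QAOA run would search for.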

India's Quantum Optimization Landscape

India's National Quantum Mission prioritizes optimization applications given the country's complex logistics challenges. The Indian Railways, one of the world's largest employers and passenger carriers, represents a prime use case for quantum scheduling optimization. The NQM Thematic Hub at IIT Bombay focuses on quantum algorithms for optimization problems.

Tata Consultancy Services (TCS) develops quantum optimization solutions for Indian enterprises including supply chain, logistics, and manufacturing applications. The Quantum Valley Tech Park in Andhra Pradesh, anchored by an IBM Quantum System Two with 156-qubit Heron processor, targets optimization applications among its use cases including supply chain resilience and energy optimization.

The NQM specifically targets quantum computing applications in optimization, with intermediate-scale quantum computers expected to demonstrate utility in logistics and scheduling problems within the mission timeline.


Quantum Models Predict Vanishing Stability Times in Large Systems

Scientists at the University of Malta, led by Emanuel Schwarzhans, have presented a novel analytical expression for approximating the average equilibration time within the Gaussian Unitary Ensemble (GUE). The GUE serves as a foundational framework within random matrix theory (RMT), a branch of theoretical physics concerned with the statistical properties of large, random matrices. Understanding equilibration times, the period required for a closed quantum system to reach a stable, equilibrium state, is crucial for characterising the behaviour of these systems. The derived expression confirms that the average equilibration time is independent of both the initial quantum state and the observable being measured, a consequence of the inherent rotational invariance within the GUE. However, the research reveals a counterintuitive trend: the calculated equilibration time diminishes as the system size increases, ultimately approaching zero in sufficiently large systems. This finding suggests that realistic chaotic many-body systems possess key physical characteristics beyond those captured by the GUE model, which fundamentally govern their true equilibration timescales.

GUE equilibration timescales shorten with increasing quantum state numbers

An analytical expression for the average equilibration time within the Gaussian Unitary Ensemble has been rigorously derived, demonstrating a decrease in this time as the system size expands, a result contrasting with some prior quantitative predictions. The GUE is particularly relevant to the study of quantum chaos, where the behaviour of systems is highly sensitive to initial conditions and exhibits seemingly random fluctuations. The analytical derivation is based on examining the decay of correlations between the initial state and the time-evolved state, utilising techniques from perturbative quantum mechanics and statistical physics. Numerical simulations, employing established algorithms for generating and diagonalising random matrices
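The numerical side of such studies (generating GUE matrices and diagonalising them) can be sketched in a few lines with NumPy; the matrix size and normalisation below are illustrative, not those used in the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

def sample_gue(n):
    """Draw an n x n GUE matrix: a complex Gaussian matrix
    symmetrised into a Hermitian one."""
    a = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    return (a + a.conj().T) / 2

# Hermiticity guarantees real eigenvalues; their spread sets the
# timescales of the equilibration dynamics exp(-iHt).
H = sample_gue(200)
evals = np.linalg.eigvalsh(H)
print(evals.min(), evals.max())
```

Averaging observables over many such samples is what "within the ensemble" means in practice.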

Quantum Zeitgeist

Quantum Algorithm Speeds up Complex Problem Solving at Later Stages

Scientists at the Centre for Exploratory Research have developed a new quantum algorithm, Quantum Riemannian Hamiltonian Descent (QRHD), to enhance continuous optimisation on Riemannian manifolds. Yoshihiko Abe and Ryo Nagai present a method that builds upon existing Quantum Hamiltonian Descent (QHD) techniques by explicitly integrating the geometric structure of the parameter space via a position-dependent metric within the kinetic term of the Hamiltonian. Their formulation, explored through both operator and path integral approaches, reveals the presence of quantum corrections to the action integral, which are demonstrably suppressed as the optimisation process progresses. This suggests that while quantum effects dominate the initial dynamics, the classical potential ultimately governs convergence near optimal solutions. The team estimate a lower bound for convergence time and demonstrate the algorithm’s functionality through numerical analysis, alongside a discussion of its potential implementation via time-dependent Hamiltonian simulation and an estimation of its query complexity.

Geometric optimisation accelerates quantum parameter estimation

A convergence time of 0.11 is reported for Quantum Riemannian Hamiltonian Descent, or QRHD, representing a measurable improvement over standard Quantum Hamiltonian Descent techniques. Traditional continuous optimisation algorithms often struggle with the complexities of high-dimensional and non-convex parameter spaces, frequently becoming trapped in local minima. QRHD addresses this limitation by explicitly incorporating the geometry of the search space, allowing for more efficient exploration of potential solutions. Enabling parameter searches on specified Riemannian manifolds, curved, multidimensional surfaces characterised by a locally defined metric tensor, allows the algorithm to navigate these complex landscapes more effectively. This is particularly relevant in scenarios where the cost function exhibits significant
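QRHD itself requires quantum Hamiltonian simulation, but its key geometric ingredient, descending while respecting the structure of a curved parameter space, has a simple classical counterpart: Riemannian gradient descent on the unit sphere. The cost function and step size below are invented for illustration; this is not the authors' algorithm.

```python
import numpy as np

# Classical sketch of optimisation on a curved parameter space: on the
# unit sphere, the gradient is projected onto the tangent space and each
# step is retracted back onto the manifold. QRHD adds quantum dynamics
# on top of such geometry; this shows only the classical geometric core.
A = np.diag([3.0, 1.0, 0.5])            # illustrative cost f(x) = x^T A x
x = np.ones(3) / np.sqrt(3.0)           # start on the sphere

for _ in range(200):
    g = 2.0 * A @ x                     # Euclidean gradient of f
    g_tan = g - (g @ x) * x             # tangent-space projection
    x = x - 0.05 * g_tan                # gradient step
    x /= np.linalg.norm(x)              # retraction onto the sphere

# On the sphere, f is minimised by the eigenvector with the smallest
# eigenvalue (0.5, the third axis).
print(x, x @ A @ x)
```

Plain Euclidean descent would step off the sphere entirely; the projection and retraction are what "incorporating the geometry of the search space" buys.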

Quantum Zeitgeist

Quantum Annealing Gets a Boost with New, Efficient Encoding Method

Ryoji Miyazaki and colleagues at the National Institute of Advanced Industrial Science and Technology have developed a more efficient method to translate complex computational problems into a format suitable for implementation on existing quantum hardware. The new embedding scheme facilitates the implementation of the parity-encoded Sourlas-Lechner-Hauke-Zoller (SLHZ) approach on current quantum annealers. It addresses a key limitation of the SLHZ scheme, which previously proved difficult to scale up for practical application due to its resource demands. Their qubit-efficient embedding, requiring three physical qubits per spin, demonstrably improves upon existing methods such as those designed for the Pegasus graph and offers flexible potential reduction to single physical qubits.

Reduced qubit requirements enable practical quantum annealing on the Pegasus graph

A novel embedding scheme requires three qubits per spin in the parity Hamiltonian, a substantial reduction compared to previous methods for the Pegasus graph which demanded four or two qubits per spin, depending on the variant. This reduction in qubit requirements is vital as it unlocks the potential for implementing the Sourlas-Lechner-Hauke-Zoller (SLHZ) scheme, a powerful quantum annealing approach, on devices limited by qubit count. Quantum annealing is a metaheuristic for finding the global minimum of a given objective function over a given set of candidate solutions, and its effectiveness is often constrained by the size and connectivity of the quantum hardware. Previously, the high qubit demand hindered practical application of the SLHZ scheme, limiting its ability to tackle larger, more complex optimisation problems. Constructing an interaction graph using two-qubit chains and utilising the Zephyr connectivity, researchers at the University of Edinburgh and Humboldt University of Berlin have created a more efficient pathway to translate complex optimisation problems onto existing quantum annealers.
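Under the per-spin overheads quoted above, the savings can be estimated with simple arithmetic. This sketch assumes an all-to-all problem on N logical spins, so the parity Hamiltonian carries K = N(N-1)/2 parity spins; real embeddings involve further details the summary does not give.

```python
# Back-of-the-envelope physical-qubit totals for embedding the SLHZ
# parity scheme, assuming an all-to-all problem on N logical spins
# (so K = N*(N-1)/2 parity spins) and the per-spin overheads quoted
# above: 3 for the new embedding versus 4 for a prior Pegasus variant.
def parity_spins(n_logical):
    return n_logical * (n_logical - 1) // 2

def physical_qubits(n_logical, per_spin):
    return parity_spins(n_logical) * per_spin

for n in (10, 50, 100):
    print(n, physical_qubits(n, 3), physical_qubits(n, 4))
```

At 100 logical spins the one-qubit-per-spin saving already amounts to nearly 5,000 physical qubits, which is why such overheads decide what fits on today's annealers.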

Quantum Zeitgeist

Faster Quantum Computing Now Possible Despite Increased Error Rates

Scientists from Johannes Kepler University Linz and collaborators demonstrated an approach to improve the reliability of quantum computations on trapped-ion systems. Their work implements local robust shadows, an error mitigation protocol designed to counteract measurement errors. The method recovers computational accuracy when measurement procedures are accelerated, improving the fidelity of results across different quantum states. However, this comes at the cost of increased sampling requirements, resulting in no overall efficiency gain.

High fidelity quantum computation enabled by strong shadow protocol error mitigation

A fidelity of around 0.96 was observed in the experiments, demonstrating that the robust shadows protocol can effectively reduce errors caused by shortened measurement times. Rather than representing a universal improvement over all previous methods, this result highlights the protocol’s ability to recover accuracy in scenarios where faster measurements introduce additional noise. Measurement errors are a major limitation in near-term quantum devices, and reducing the measurement pulse duration—such as to 150 μs—can significantly increase these errors. The robust shadows approach mitigates this effect, helping restore reliable estimates despite the noisier conditions. The protocol was tested across three different quantum states, including a local Haar-random state and two states used in quantum approximate optimisation algorithms (QAOA). In each case, it consistently reduced the impact of measurement errors.

The method alternates between a calibration phase and a shadow estimation phase. A key component is Pauli-X twirling, where random bit-flip operations are applied before measurement to symmetrize errors. This enables the estimation of single-qubit expansion coefficients, which are then used to construct corrected “shadow” representations of the quantum state for classical post-processing. The results show that this approach can reduce bias in
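The calibrate-then-correct structure of the protocol can be caricatured on a single qubit: readout bit-flip noise damps every expectation value by a factor 1 - 2p, a calibration run on a known state estimates that factor, and the raw estimate is divided by it. The flip rate, shot count, and states below are invented; the real protocol handles multi-qubit Pauli observables and uses X-twirling to symmetrise the noise.

```python
import random

random.seed(1)
p_flip = 0.1        # assumed readout bit-flip rate (invented)
shots = 200_000

def measure_z(z_true):
    """Sample +/-1 readout outcomes for a state with <Z> = z_true,
    passing each outcome through bit-flip noise of rate p_flip."""
    total = 0
    for _ in range(shots):
        s = 1 if random.random() < (1 + z_true) / 2 else -1
        if random.random() < p_flip:
            s = -s
        total += s
    return total / shots

# Calibration phase: a known state (<Z> = 1) reveals the damping
# factor 1 - 2*p_flip that the noise applies to every expectation.
f = measure_z(1.0)

# Estimation phase: divide the raw, biased estimate by that factor.
raw = measure_z(0.5)
corrected = raw / f
print(raw, corrected)
```

The raw estimate sits near 0.4 rather than the true 0.5; dividing by the calibrated factor restores it, at the price of extra shots for the calibration run, mirroring the sampling-overhead trade-off noted above.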

Quantum Zeitgeist

Trapped Molecules Unlock a New Route to Efficient Quantum Computation

Sakthikumaran Ravichandran and colleagues at the University of Warsaw show that the structure of optical tweezers, used to control individual molecules, can be used to create efficient quantum gates. Their numerical modelling of dipolar molecules in these traps reveals trap-induced resonances, offering a pathway to state-dependent dynamics and potential applications in both quantum computation and sensing. The work reframes motional dephasing, traditionally viewed as a limitation, as a potentially valuable resource for advancing quantum technologies.

Coupled-channel simulations demonstrate enhanced dipolar interactions via trap-induced resonances

Numerical simulations now reveal that trap-induced resonances enhance dipolar interactions by a factor of up to 50 when compared to previous calculations utilising simplified models. This enhancement is achieved through a rigorous coupled-channel treatment of the molecular dynamics, incorporating experimentally realistic parameters such as trap frequency and molecular polarisability. The substantial increase in interaction strength unlocks the potential for strong quantum gate implementation, previously hampered by the inherently weak, long-range nature of dipolar interactions. Dipolar molecules, possessing a permanent electric dipole moment, interact via electrostatic forces that fall off rapidly with distance, making strong coupling difficult to achieve. These trap-induced resonances emerge as avoided crossings between vibrational states of the trapped molecules and short-range molecular bound states formed by the trapping potential. The precise control afforded by these resonances allows for tailored manipulation of the molecular interactions. The coupled-channel approach employed accounts for the simultaneous excitation of multiple vibrational modes, providing a more accurate description of the system than single-channel approximations. This is crucial for understanding the complex interplay between the trapping potential

Quantum Zeitgeist

Quantum Computation Using Limited Interactions Rivals Standard Models’ Power

A new model termed Exchange Quantum Polynomial Time (XQP) utilises the exchange interaction for quantum computation. Jędrzej Burkat and colleagues at the University of Oxford show that XQP circuits, constructed from restricted operations, occupy a computational complexity between classically tractable and fully quantum algorithms. Simulating these circuits efficiently would have major implications for computational complexity, potentially collapsing the polynomial hierarchy. The team also prove that even a limited subset of these circuits remains challenging to simulate, and reveal connections to established mathematical structures such as the Gelfand-Tsetlin basis and statistical physics models, suggesting suitability for implementation on near-term quantum hardware.

XQP simulation unlocks efficient modelling of broader quantum computational complexity

An efficient additive-error simulation of the Exchange Quantum Polynomial Time (XQP) model would enable efficient additive-error simulation of arbitrary BQP computations, underlining the model's computational power. XQP is established as a new computational model utilising only computational basis states and the isotropic Heisenberg exchange interaction, effectively bridging the gap between classical and quantum computation. Structurally, XQP captures decoherence-free subspace computation without requiring access to singlet states, simplifying potential hardware implementations. This is particularly significant as singlet states are notoriously difficult to create and maintain with high fidelity in physical quantum systems, often being highly susceptible to decoherence. The isotropic Heisenberg exchange interaction, a fundamental interaction in quantum mechanics, provides a natural and potentially robust mechanism for implementing the required qubit interactions. Circuits comprised solely of √SWAP gates, corresponding to a pulse angle of π/4, remain remarkably difficult to simulate even with these restrictions, challenging existing classical simulation
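The √SWAP gate mentioned above has a standard 4x4 matrix form; a quick numerical check confirms it squares to SWAP and is unitary. The matrix itself is textbook material, not taken from this paper.

```python
import numpy as np

SWAP = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]], dtype=complex)

# Textbook sqrt-SWAP: identity on |00> and |11>, a complex rotation
# on the |01>, |10> subspace.
SQRT_SWAP = np.array([[1, 0,            0,            0],
                      [0, (1 + 1j) / 2, (1 - 1j) / 2, 0],
                      [0, (1 - 1j) / 2, (1 + 1j) / 2, 0],
                      [0, 0,            0,            1]])

assert np.allclose(SQRT_SWAP @ SQRT_SWAP, SWAP)                # squares to SWAP
assert np.allclose(SQRT_SWAP @ SQRT_SWAP.conj().T, np.eye(4))  # unitary
print("sqrt-SWAP verified")
```

Composing many such gates is cheap to write down but, as the result above argues, hard to simulate classically.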

Quantum Zeitgeist

Alice & Bob and Partners Awarded $3.9M ARPA-E Grant for Quantum Magnet Design

Alice & Bob, in collaboration with Los Alamos National Laboratory and GE Vernova, has been awarded $3.9 million from the U.S. Department of Energy’s ARPA-E Quantum Computing for Computational Chemistry (QC3) program. The three-year project aims to develop fault-tolerant quantum algorithms to identify rare-earth-free permanent magnets, which are essential components for electric motors and turbines. The effort is directed toward providing a technical alternative to neodymium-iron-boron (NdFeB) magnets, whose supply chains are currently geographically concentrated and subject to political constraints.

The technical objective of the project is to achieve a 10,000-fold speed-up in computing time compared to current state-of-the-art classical simulations. The team will implement a hybrid approach in which classical algorithms, developed by a group led by Professor Emanuel Gull, calculate environmental parameters while Alice & Bob’s quantum algorithms simulate highly correlated electronic systems. Los Alamos National Laboratory will contribute tensor network tools for quantum circuit optimization, and the performance targets will be validated experimentally on Alice & Bob’s cat-qubit hardware as well as through theoretical resource estimates.

Classical computers struggle to accurately model the complex quantum interactions between electrons that define the magnetic behavior of candidate materials. By using quantum processors to model these systems directly, the consortium intends to enable realistic material calculations within a 24-hour window. GE Vernova’s Advanced Research accelerator will support the project by performing a technoeconomic analysis to evaluate the commercial viability of materials discovered through the hybrid algorithm. If successful, the researchers indicate that the framework could be adapted to broader applications in computational chemistry and materials science.

Quantum Computing Report

Monarch Quantum Secures $55M Growth Round to Scale Integrated Photonics Production

Monarch Quantum has closed a $55 million growth round led by Serendipity Capital, with participation from 55 North and Global Innovation Labs. This financing brings the company’s total capital and customer contracts to over $115 million within six months of its 2025 founding. Of this total, approximately $60 million is represented by existing hardware delivery contracts. The San Diego-based company is led by Dr. Timothy Day, a photonics veteran who previously scaled laser systems at Daylight Solutions prior to its 2017 acquisition by Leonardo DRS.

The company’s primary product line, Quantum Light Engines™, consists of integrated photonic control systems for quantum computing, sensing, and networking. This hardware is designed to replace lab-scale optical assemblies with integrated, manufacturable subsystems that utilize advanced packaging and systems engineering. The engines are intended to serve as a standardized infrastructure layer, providing the precise photonic control required for various quantum modalities, including trapped-ion and neutral-atom architectures. The technical focus is on transitioning from bespoke laboratory setups to chip-scale hardware configurations with lower size, weight, and power (SWaP) requirements.

Monarch currently holds commercial contracts with Quantinuum, Infleqtion (NYSE: INFQ), and NASA to support the integration of photonic control into their hardware roadmaps. The $55 million in new capital will be used to increase production capacity, expand the company’s supply chain, and develop global partnerships with quantum original equipment manufacturers (OEMs) and defense integrators. By focusing on the hardware infrastructure layer, Monarch aims to address the bottlenecks associated with the scaling and reliability of the optical systems necessary for deploying quantum technologies in non-laboratory environments.

Quantum Computing Report

Quantum Factoring Needs Just 10,000 Qubits

A new analysis from Madelyn Cain and colleagues at the University of California and Oratomic reveals that Shor’s algorithm, a key tool with implications for modern cryptography, may be achievable with fewer qubits than previously thought. By optimising quantum error correction and circuit design, cryptographically relevant calculations using Shor’s algorithm could potentially be performed with as few as 10,000 reconfigurable atomic qubits. The research sharply lowers the resource requirements for building a quantum computer capable of breaking widely used encryption algorithms, suggesting such a machine may be within reach given continued advances in neutral-atom technology and recent experimental demonstrations of fault-tolerant operations on increasingly large qubit arrays. The team’s analysis indicates that a system with 26,000 physical qubits could solve discrete logarithms on the P-256 elliptic curve in a matter of days.

Shor’s algorithm viability enhanced via reduced qubit counts and optimised atomic architectures

A cryptographically relevant Shor’s algorithm now requires as few as 10,000 reconfigurable atomic qubits, a reduction from previous estimates of millions. This breakthrough crosses a critical threshold, as the immense qubit requirements previously rendered practical implementation of the algorithm, vital for breaking modern encryption, impossible. The reduction was achieved through advances in quantum error-correcting codes, efficient logical instruction sets, and optimised circuit design, densely packing logical qubits to minimise overhead. Under plausible assumptions, a system comprising 26,000 physical qubits could solve discrete logarithms on the P-256 elliptic curve in a matter of days, demonstrating a sharp leap in computational speed. These findings highlight the potential of neutral atoms for fault-tolerant quantum computing and wider scientific applications. Utilising high-rate quantum error-correcting codes, the team achieved encoding rates of

Quantum Zeitgeist

Quantum Code Boosts Data Capacity while Shielding Against Errors

A new quantum error correction code, the Majorana-XYZ code, offers a pathway towards scalable and robust quantum computation. Tobias Busse and Lauri Toikka at Aalto University demonstrate a subsystem code in which logical quantum information exhibits macroscopic scaling and benefits from topologically non-trivial protection. The code, defined by parameters including $n = L^2$ physical qubits and $k = \lfloor L/2 \rfloor$ logical qubits, detects single- and two-qubit errors, alongside higher-weight errors constrained by its distance of $L$. Key undetected errors remain confined to the gauge group, preserving the integrity of logical information, and the code’s structure combines aspects of both topological and local gauge codes to achieve many topological logical qubits. The authors derive this code from a system of Majorana fermions arranged on a honeycomb lattice, using only nearest-neighbour interactions, suggesting potential feasibility for experimental realisation.

Macroscopic scaling of logical qubits via a novel Majorana fermion code

The Majorana-XYZ code encodes approximately $\lfloor L/2 \rfloor$ logical qubits, a substantial improvement over previous topological codes. These earlier designs typically required a number of logical qubits scaling linearly with system size, limiting their scalability for complex quantum algorithms. This breakthrough crosses a critical threshold, enabling macroscopic scaling of logical qubits, the fundamental units of quantum information, within a single system, previously unattainable without sacrificing error protection. The ability to encode a significant number of logical qubits is paramount because the complexity of quantum algorithms often necessitates many of these units to represent and manipulate quantum data effectively. Without sufficient logical qubits, even theoretically powerful algorithms become impractical due to the limitations imposed by the physical hardware. Unlike earlier designs reliant on strictly local connectivity
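Taking the stated parameters at face value ($n = L^2$ physical qubits, $k = \lfloor L/2 \rfloor$ logical qubits, distance $d = L$), the scaling is easy to tabulate; the lattice sizes below are arbitrary examples.

```python
# Scaling of the stated code parameters with lattice size L:
# n = L^2 physical qubits, k = floor(L/2) logical qubits, distance d = L.
# The lattice sizes are arbitrary examples.
def code_params(L):
    return L**2, L // 2, L

for L in (4, 8, 16, 32):
    n, k, d = code_params(L)
    print(f"L={L}: n={n}, k={k}, d={d}, rate k/n = {k / n:.4f}")
```

Both the logical qubit count and the distance grow with $L$, which is the "macroscopic scaling" the summary refers to, though the encoding rate $k/n$ still falls as the lattice grows.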

Quantum Zeitgeist

Alice & Bob Secures ARPA-E Award to Design Rare-Earth-Free Magnets

Alice & Bob has secured a $3.9 million award from the U.S. Department of Energy’s ARPA-E program to pursue the development of fault-tolerant quantum algorithms with the goal of discovering rare-earth-free permanent magnets. These magnets are essential components in electric motors and turbines, and current reliance on materials like neodymium-iron-boron presents geopolitical and supply chain challenges. The company aims to achieve a 10,000-fold speed-up in computing time compared to existing classical simulations, potentially enabling realistic material calculations within a single day. “Designing high-performance magnets without rare earth elements is one of the hardest problems in material science,” said Juliette Peyronnet, U.S. General Manager at Alice & Bob, “as these materials are extremely difficult to simulate with classical computers.” This three-year project, conducted in collaboration with Los Alamos National Laboratory and GE Vernova, could accelerate the creation of more sustainable magnets and offer a broadly applicable solution for complex materials science problems.

ARPA-E Funds Alice & Bob for Rare-Earth-Free Magnet Discovery

The project will leverage a collaborative effort with Los Alamos National Laboratory, GE Vernova’s Advanced Research accelerator, and researchers led by Professor Emanuel Gull, combining classical and quantum computing techniques. Los Alamos will focus on optimizing quantum circuits using tensor network tools, while GE Vernova will assess the economic viability of materials discovered through this hybrid algorithm. Marco Cerezo, Los Alamos scientist and Laboratory lead on the project, emphasized the synergistic nature of the research, stating, “Finding ways to prepare high quality states via tensor network optimization is a critical tool that will help develop fault-tolerant quantum algorithms applied to challenges like rare-earth-free permanent magnets.” Jonathan Owens of GE Vernova added, “Our team is excited to collaborate

Quantum Zeitgeist

QuSecure Collaborating with NIST’s National Cybersecurity Center of Excellence to Address Post-Quantum Algorithm Migration

Insider Brief

- QuSecure is collaborating with NIST’s NCCoE to support industry efforts in migrating from current cryptographic systems to post-quantum cryptography.
- The project focuses on identifying quantum-vulnerable public key algorithms and developing tools and strategies to enable smoother enterprise-wide migration.
- QuSecure will test and evaluate PQC solutions in NCCoE lab environments to improve interoperability, performance, and implementation practices.

PRESS RELEASE — QuSecure™, Inc., the market leader in post-quantum cybersecurity and cryptographic agility, today announced it is collaborating with the National Cybersecurity Center of Excellence (NCCoE) in the Migration to Post-Quantum Cryptography Project Consortium to bring awareness to the issues involved in migrating to post-quantum algorithms and to develop practices to ease migration from current public-key algorithms to replacement algorithms.

Quantum computers capable of breaking public key cryptography threaten current information systems. NIST’s post-quantum cryptography program developed standardized quantum-resistant algorithms to protect digital information. Organizations must identify where vulnerable public key algorithms exist across hardware, software, and services and prioritize migration to NIST post-quantum cryptographic algorithms to protect data and processes from future threats.

“Public-key cryptography is widely used to protect today’s digital information,” said William Newhouse, Security Engineer, NIST National Cybersecurity Center of Excellence. “With the advent of quantum computing, and its potential to compromise many of the current cryptographic algorithms, it is critical that organizations begin to plan for many of the technological and operational challenges that a migration to post-quantum cryptography will present. This project aims to help organizations in that effort.”

As a contributing member of the consortium, QuSecure will collaborate with Automated
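The inventory step described above, identifying where quantum-vulnerable public-key algorithms live, can be sketched as a simple triage. The algorithm lists and the sample inventory are illustrative and are not QuSecure's or NIST's tooling; ML-KEM, ML-DSA, and SLH-DSA are the algorithms standardised in NIST FIPS 203, 204, and 205.

```python
# Toy triage of (system, algorithm) pairs into migration buckets.
# Both algorithm sets and the sample inventory are illustrative.
QUANTUM_VULNERABLE = {"RSA", "ECDSA", "ECDH", "DSA", "DH"}
POST_QUANTUM = {"ML-KEM", "ML-DSA", "SLH-DSA"}   # NIST FIPS 203/204/205

def triage(inventory):
    """Split (system, algorithm) pairs into migration buckets."""
    vulnerable, ready, unknown = [], [], []
    for system, alg in inventory:
        if alg in QUANTUM_VULNERABLE:
            vulnerable.append((system, alg))
        elif alg in POST_QUANTUM:
            ready.append((system, alg))
        else:
            unknown.append((system, alg))
    return vulnerable, ready, unknown

sample = [("vpn-gateway", "RSA"), ("code-signing", "ECDSA"),
          ("new-tls-stack", "ML-KEM")]
vuln, ready, unknown = triage(sample)
print(len(vuln), "systems need migration")
```

Real migrations must also scan certificates, firmware, and protocol handshakes, which is where consortium tooling comes in; this only illustrates the classification logic.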

Quantum Daily

Alice & Bob Secures $3.9M ARPA-E Award to Use Quantum Computing to Design Rare-Earth-Free Magnets

Insider Brief

- Alice & Bob received $3.9 million from ARPA-E to develop fault-tolerant quantum algorithms for designing rare-earth-free permanent magnets.
- The project aims to achieve a 10,000× speed-up over classical simulations using hybrid quantum-classical methods for complex material modeling.
- The three-year effort involves partners including Los Alamos National Laboratory and GE Vernova to advance algorithms, optimization tools, and technoeconomic analysis.

PRESS RELEASE — Alice & Bob, a leader in fault-tolerant quantum computing, has been awarded $3.9 million from the U.S. Department of Energy’s Advanced Research Projects Agency-Energy (ARPA-E) Quantum Computing for Computational Chemistry (QC3) program to develop fault-tolerant quantum algorithms aimed at discovering rare-earth-free permanent magnets – a critical component in electric motors and turbines.

To meet their goal, Alice & Bob will strive to achieve a 10,000-fold speed-up in computing time compared to state-of-the-art classical simulations, enabling realistic material calculations within approximately one day. They will show this speed-up experimentally on Alice & Bob’s fault-tolerant quantum computers, and theoretically with resource estimates.

“Designing high-performance magnets without rare earth elements is one of the hardest problems in material science, as these materials are extremely difficult to simulate with classical computers. A hybrid approach – where classical methods compute environmental parameters and quantum computers simulate highly correlated electronic systems more accurately – could significantly accelerate the discovery of new magnetic materials,” said Juliette Peyronnet, U.S. General Manager at Alice & Bob.

Alice & Bob will lead the three-year project in collaboration with Los Alamos National Laboratory and GE Vernova.

Quantum Daily

Digital Catapult and NQCC Launch New Phase of Quantum Technology Access Programme

Digital Catapult and the National Quantum Computing Centre (NQCC) are expanding their collaboration with a new phase of the Quantum Technology Access Programme (QTAP), designed to accelerate the adoption of quantum computing across UK industry. The initiative will provide up to twelve organizations spanning sectors like energy, healthcare, and finance with training, hands-on experimentation, and expert support to explore potential quantum applications. Participants will focus on two key areas: combinatorial optimization for logistical challenges and quantum machine learning for advanced data analysis. “Our partnership with the NQCC represents an exciting milestone as we continue to prepare UK industry for the next wave of technological transformation,” said Paul Ceely, Director of Technology Strategy, Digital Catapult, emphasizing the program’s aim to equip businesses with the skills to harness quantum computing’s potential and experiment on NQCC systems.

Quantum Technology Access Programme (QTAP) Launches with NQCC

Building on Digital Catapult’s prior work with over 20 companies, this latest phase of QTAP, sponsored by the NQCC’s SparQ program, aims to move businesses beyond theoretical interest and into practical experimentation with quantum technologies. The initiative recognizes a critical juncture in the development of quantum computing, seeking to equip industry with the skills and understanding necessary to capitalize on emerging opportunities. The program will support up to twelve organizations spanning sectors vital to the UK economy, including energy, healthcare, transport, finance, and aerospace; participants will be guided through immersive learning experiences and hands-on work with NQCC’s quantum computing systems. QTAP will concentrate on two specific application areas designed to yield high impact: combinatorial optimization, addressing complex challenges in logistics and resource management, and quantum machine learning, enabling advanced data analysis.

Quantum Zeitgeist

Google Urges Cryptocurrency Community to Transition to Post-Quantum Cryptography

Google is urging the cryptocurrency community to proactively adopt post-quantum cryptography, revealing new research that suggests existing security protocols could be vulnerable to future quantum computers with fewer resources than previously anticipated. A new whitepaper from Google Quantum AI details updated estimates for the quantum computing resources needed to break the elliptic curve cryptography underpinning many blockchains and other security systems; the research indicates this could be achieved with fewer than 500,000 physical qubits. Researchers have compiled quantum circuits implementing Shor’s algorithm for ECDLP-256 using fewer than 1,450 logical qubits and 70 million Toffoli gates, representing a roughly 20-fold reduction in the physical qubit count previously thought necessary for such an attack. “We want to raise awareness on this issue and are providing the cryptocurrency community with recommendations to improve security and stability before this is possible,” said Ryan Babbush, Director of Research, Quantum Algorithms, and Hartmut Neven, VP of Engineering, emphasizing the need for a transition to post-quantum cryptography to ensure the long-term viability of digital currencies.

Google’s 2029 Post-Quantum Cryptography Migration Timeline

Researchers at Google Quantum AI have demonstrated that future quantum computers may compromise the elliptic curve cryptography safeguarding cryptocurrency and other systems with fewer computational resources than previously anticipated, prompting a proactive stance on mitigation. The company’s work, detailed in the whitepaper, focuses on quantifying the quantum resources (qubits and gates) needed to crack the 256-bit elliptic curve discrete logarithm problem, the foundation of much current cryptographic security.
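To see what Shor’s algorithm would be attacking, here is a deliberately tiny classical illustration of the elliptic curve discrete logarithm problem: given points G and Q = kG on a curve, recover k. The brute-force search below costs O(n) group operations, which is why it is hopeless at the 256-bit sizes discussed in the article; the curve parameters used here (p = 17, a = 2, b = 2, G = (5, 1)) are a textbook toy, not a real cryptographic curve.

```python
def ec_add(P, Q, a, p):
    """Point addition on y^2 = x^3 + ax + b over F_p (None is the identity)."""
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None  # P + (-P) = identity
    if P == Q:
        s = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p  # tangent slope
    else:
        s = (y2 - y1) * pow(x2 - x1, -1, p) % p  # chord slope
    x3 = (s * s - x1 - x2) % p
    return (x3, (s * (x1 - x3) - y1) % p)

def ec_dlog_bruteforce(G, Q, a, p):
    """Classical ECDLP: find k with k*G == Q by stepping through multiples
    of G. Assumes Q really is a multiple of G."""
    R, k = G, 1
    while R != Q:
        R = ec_add(R, G, a, p)
        k += 1
    return k

# Toy curve y^2 = x^3 + 2x + 2 over F_17, generator G = (5, 1)
print(ec_dlog_bruteforce((5, 1), (10, 6), 2, 17))
```

Shor’s algorithm solves exactly this problem in polynomial time on a fault-tolerant quantum computer, which is what drives the migration recommendations above.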


ParaQAOA Solves Large Problems with Improved Efficiency and 2% Error

A new parallel framework, ParaQAOA, addresses a key limitation of the Quantum Approximate Optimisation Algorithm (QAOA) when solving large-scale Max-Cut problems. Po-Hsuan Huang and colleagues, in a collaboration between National Taiwan University and National Cheng Kung University, have developed ParaQAOA to improve execution efficiency without compromising solution quality. The framework overcomes the tendency of existing QAOA-based Max-Cut solvers to prioritise accuracy over speed, which restricts their application to realistically sized problems. ParaQAOA uses parallel computing to partition and solve problems more rapidly, demonstrating a speedup of up to 1,600x on 400-vertex problems and successfully solving a 16,000-vertex instance in just 19 minutes, a task that previously required over 13.6 days using established methods.

Rapid Max-Cut optimisation via parallel quantum approximate optimisation algorithms

ParaQAOA utilises a parallel divide-and-conquer strategy to partition complex Maximum Cut problems into smaller, simultaneously solvable subproblems, preserving solution quality throughout. This scalable approach allows tunable control over the accuracy-efficiency trade-off, adapting to diverse performance needs and opening possibilities for tackling previously intractable optimisation challenges. Despite the sharp speed improvements, ParaQAOA’s accuracy remained within 2% of the best-known solutions for Max-Cut problems, indicating a strong balance between speed and quality. A striking feature is the linear-time graph partitioning algorithm, which reduces the complexity of decomposition and enables efficient processing of extensive graphs.
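The paper’s code is not reproduced here, but the divide-and-conquer idea behind ParaQAOA can be sketched classically: partition the vertices into blocks, solve each block’s induced Max-Cut independently (below, a brute-force stand-in for the QAOA subsolver), then greedily flip whole blocks to improve the cut across partition boundaries. The block size, contiguous partitioning, and merge rule are illustrative assumptions, not the framework’s actual algorithm.

```python
from itertools import product

def maxcut_bruteforce(vertices, edges):
    """Exact Max-Cut on a small vertex set by trying every bipartition
    (a classical stand-in for the per-partition QAOA subsolver)."""
    best_cut, best_assign = -1, {}
    vs = list(vertices)
    for bits in product([0, 1], repeat=len(vs)):
        assign = dict(zip(vs, bits))
        cut = sum(1 for u, v in edges if assign[u] != assign[v])
        if cut > best_cut:
            best_cut, best_assign = cut, assign
    return best_assign

def divide_and_conquer_maxcut(n_vertices, edges, block_size=8):
    """Partition vertices into contiguous blocks, solve each block's induced
    Max-Cut independently, then greedily flip whole blocks when that
    increases the number of cut edges crossing block boundaries."""
    blocks = [list(range(i, min(i + block_size, n_vertices)))
              for i in range(0, n_vertices, block_size)]
    assign = {}
    for block in blocks:
        bset = set(block)
        sub_edges = [(u, v) for u, v in edges if u in bset and v in bset]
        assign.update(maxcut_bruteforce(block, sub_edges))
    # Merge step: flipping a block toggles every cross-block edge it touches.
    for block in blocks:
        bset = set(block)
        cross = [(u, v) for u, v in edges if (u in bset) != (v in bset)]
        cut_now = sum(1 for u, v in cross if assign[u] != assign[v])
        if len(cross) - cut_now > cut_now:  # flipping the block gains edges
            for u in block:
                assign[u] = 1 - assign[u]
    cut = sum(1 for u, v in edges if assign[u] != assign[v])
    return assign, cut

# Example: an 8-cycle, whose optimal cut uses all 8 edges.
_, cut = divide_and_conquer_maxcut(8, [(i, (i + 1) % 8) for i in range(8)],
                                   block_size=4)
print(cut)
```

The appeal of this decomposition is that the subproblems are independent and can run in parallel, which is the source of the reported speedups; the greedy merge keeps the result close to the global optimum on well-partitioned graphs.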


QuTech Chairs Conference Focused on Scaling Spin Qubit Systems

QuTech will chair Spin Qubit 7, the 7th International Conference and Workshop on Spin-Based Quantum Information Processing, bringing together leading researchers in the field this July. The five-day event, to be held at TU Delft from July 13 to 17, focuses on spin qubits, a promising approach to quantum computing that encodes information in the spin of electrons within semiconductor structures. The conference comes as QuTech demonstrates advances in high-performance semiconductor spin qubits, including methods for shuttling and scaling these systems. The organisers describe Spin Qubit 7 as “an important meeting point for the international community,” with a program covering diverse implementations, from silicon donors to germanium nanowires, alongside crucial enabling technologies such as cryogenic control electronics. With 45 speakers confirmed and over 12 sponsors, the conference aims to accelerate progress towards robust and scalable quantum technologies.

Spin Qubit 7 Conference Details: TU Delft, July 2026

The program will address a diverse range of spin-based qubit implementations, encompassing donors in silicon, quantum dots in various semiconductor materials, and hole-based systems such as germanium nanowires; beyond qubit design, it will also emphasize supporting technologies such as fabrication methods, quantum algorithms, and cryogenic control electronics. Chaired by QuTech principal investigators Maximilian Rimbach-Russ and Stefano Bosco, the conference has already secured its speakers and sponsors, and early-bird registration is available until April 10 via SpinQubit7.com. Researchers interested in attending can find more information, including a complete list of confirmed speakers, on the conference website.


Quantum Search Speeds up with a New Mathematical Shortcut

Scientists at the Beijing International Center for Mathematical Research, Peking University, in collaboration with colleagues at Tsinghua University, have developed a refined quantum search technique that substantially improves precision and efficiency. Zhijian Lai and colleagues present a methodology leveraging the Riemannian modified Newton method to optimise Grover’s algorithm, a fundamental procedure in quantum computation. The resulting method achieves a quadratic convergence rate, manifesting as a double-logarithmic dependence on the desired precision, mathematically expressed as O(√N log log(1/ε)). This represents a considerable advancement over existing optimisation techniques and, crucially, maintains compatibility with established quantum operations, facilitating practical implementation and efficient classical precomputation of parameters.

Riemannian optimisation yields faster Grover search complexity

For unstructured search problems of size N, the new approach achieves a computational complexity of O(√N log log(1/ε)), a marked improvement over previous methods exhibiting a complexity of O(√N log(1/ε)); the speedup is double-logarithmic in the precision ε. Grover’s algorithm, originally proposed in 1996, provides a quadratic speedup over classical algorithms for unstructured search: it can find a specific item within a database of N items in approximately √N steps, compared to the classical requirement of roughly N steps. However, optimising the implementation of Grover’s algorithm to achieve this theoretical speedup has remained a significant challenge. The advance presented by Lai et al. arises from applying the Riemannian modified Newton (RMN) method to the problem, which is reformulated as a maximisation problem on a “unitary manifold” representing the space of allowed quantum operations.
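Grover’s √N behaviour is easy to verify numerically without a quantum simulator: after k iterations the amplitude on the marked item is sin((2k + 1)θ) with θ = arcsin(1/√N), so roughly (π/4)√N iterations maximise the success probability. The helper names below are illustrative, and this sketch checks only the baseline algorithm, not the Riemannian refinement the article describes.

```python
import math

def grover_success_probability(n_items: int, iterations: int) -> float:
    """Probability of measuring the marked item after k Grover iterations,
    from the exact amplitude formula sin((2k + 1) * theta),
    where theta = arcsin(1 / sqrt(N))."""
    theta = math.asin(1.0 / math.sqrt(n_items))
    return math.sin((2 * iterations + 1) * theta) ** 2

def optimal_iterations(n_items: int) -> int:
    """Iteration count maximising the success probability,
    approximately (pi / 4) * sqrt(N)."""
    theta = math.asin(1.0 / math.sqrt(n_items))
    return round(math.pi / (4 * theta) - 0.5)

# For a million-item search: ~785 quantum queries versus ~500,000
# expected classical queries, with near-certain success.
N = 1_000_000
k = optimal_iterations(N)
print(k, grover_success_probability(N, k))
```

The ~785 iterations for N = 10⁶ illustrate the quadratic speedup; the optimisation work described above concerns how cheaply the algorithm’s parameters can be tuned to hit a target precision ε, not the √N scaling itself.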


Viewbix’s Nuclear Quantum Progresses to Industry Engagement for Quantum Algorithms

Viewbix Inc. has announced that Nuclear Quantum, a portfolio company of its subsidiary Quantum X Labs, is moving beyond algorithm development to actively engage with industry partners for its quantum-based simulation technology. This strategic shift aims to modernize nuclear and engineering simulations by integrating Nuclear Quantum’s algorithms into existing modeling platforms, accelerating adoption and minimizing disruption for companies seeking to upgrade capabilities. The technology addresses a critical limitation in the field, the trade-off between simulation precision and computational time, with the potential to enhance accuracy in areas like safety assessments and material property prediction. “We are initiating a focused strategy to collaborate with established simulation companies, offering integration pathways for our quantum algorithmic engine into existing modeling platforms,” stated the company, outlining a plan to unlock new performance thresholds through quantum computation. A Valuates report projects that the nuclear power simulation software market will grow from 226 million in 2024 to 321 million by 2031, indicating growing demand for advanced simulation tools.

Nuclear Quantum Advances Simulation Algorithm Development

Traditional high-fidelity simulations demand extensive processing power, limiting their practical application in real-world engineering scenarios; Nuclear Quantum aims to overcome this limitation by harnessing the principles of quantum computation. The company is now shifting its focus toward collaborative integration with established simulation firms, offering a pathway to upgrade existing modeling platforms without requiring complete overhauls. This strategy is designed to accelerate adoption and minimize disruption for industry partners, allowing them to enhance their capabilities without replacing existing infrastructure.
