Quantum Computing Market Analysis: Industry Trends & Investment
The quantum computing market is transitioning from research to commercial reality, with market-size estimates of roughly $1 billion in 2024 and forecasts reaching $125 billion by 2032, depending on progress toward fault-tolerant systems.
Market segmentation by offering type includes quantum hardware (30%), quantum software (25%), and quantum services (45%). By application: optimization (35%), simulation (30%), machine learning (20%), and cryptography (15%).
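The growth implied by those endpoints is easy to check. A minimal sketch of the implied compound annual growth rate, using the round figures quoted above (which are projections, not measured data):

```python
# implied CAGR from the projection above: ~$1B (2024) -> ~$125B (2032)
start, end, years = 1.0, 125.0, 8
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")   # roughly 83% per year
```

An implied growth rate this steep is a reminder that the high end of the forecast assumes fault-tolerant systems arriving on schedule.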
India's Quantum Market Landscape
India's National Quantum Mission represents a ₹6,003.65 crore ($720 million) government investment through 2030-31, making it one of the top 5 government quantum programs globally. The mission aims to capture a significant share of the growing quantum market by developing indigenous capabilities across computing, communication, sensing, and materials.
India's quantum startup ecosystem received government support through NQM and NM-ICPS (National Mission on Interdisciplinary Cyber-Physical Systems). The eight startups selected in November 2024:
- QNu Labs (Bengaluru): quantum-safe networks and QKD systems
- QpiAI India (Bengaluru): superconducting quantum computer development
- Dimira Technologies (IIT Mumbai): cryogenic cables for quantum computing
- Prenishq (IIT Delhi): precision diode-laser systems
- QuPrayog (Pune): optical atomic clocks
- Quanastra (Delhi): advanced cryogenics and superconducting detectors
- Pristine Diamonds (Ahmedabad): diamond materials for quantum sensing
- Quan2D Technologies (Bengaluru): superconducting nanowire single-photon detectors
Tata Consultancy Services (TCS) partners with IBM on quantum computing with significant investment in quantum algorithm development. The Quantum Valley Tech Park in Andhra Pradesh represents a major public-private quantum computing investment.
Quanscient and Haiqu Demonstrate 15-Step Nonlinear Fluid Simulation on IBM Quantum Hardware
Insider Brief: Researchers from Quanscient and Haiqu have developed and tested a new algorithm that reduces the number of qubits required to run computational fluid dynamics simulations on quantum computers, conducting a 15-step nonlinear fluid benchmark with an obstacle on IBM’s Heron R3 quantum processor. The One-Step Simplified Lattice Boltzmann Method algorithm is based on a quantum Lattice Boltzmann Method that generalizes classical CFD techniques, with Haiqu’s algorithmic and runtime layer reducing circuit depth and applying error-reduction techniques to enable multi-step complex workflows on current quantum hardware. The demonstration represents what the researchers describe as one of the most physically complex, publicly documented Quantum Lattice Boltzmann Method hardware demonstrations to date, offering a potential path toward industrial-scale quantum CFD solutions for the aerospace, automotive, and energy sectors.

PRESS RELEASE — Researchers from Quanscient, a leader in cloud-based multiphysics simulation technology and quantum algorithms, and Haiqu, a leading developer of quantum middleware, today announced a new algorithm that can significantly advance the use of quantum computing in real-world engineering applications. The teams conducted a 15-step nonlinear fluid benchmark with an obstacle, making this the most physically complex, publicly documented variant of a Quantum Lattice Boltzmann Method (QLBM) hardware demonstration to date. Developed and tested on IBM's largest-available quantum computer, the IBM Heron R3, the algorithm reduces the number of qubits required to run complex simulations in computational fluid dynamics (CFD) on quantum computers, demonstrating a viable path toward future industrial-scale solutions. CFD is widely used to model how air, water, and other fluids behave around objects, such as airflow over an aircraft wing. It plays a critical role in product development and testing across industries, including aerospace, automotive, and energy…
Source: Quantum Insider
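The Lattice Boltzmann Method the release refers to is, classically, a stream-and-collide update on particle distribution functions. A minimal classical sketch of that update (a toy D1Q3 lattice with BGK collision and illustrative parameters; this is not Quanscient's quantum algorithm, just the classical scheme a QLBM generalizes):

```python
import numpy as np

# toy 1D lattice Boltzmann (D1Q3, BGK collision): illustrative only
nx, steps, tau = 64, 15, 0.8
w = np.array([2/3, 1/6, 1/6])            # lattice weights
c = np.array([0, 1, -1])                 # lattice velocities
rho0 = np.ones(nx); rho0[28:36] = 1.5    # density bump in the middle
f = w[:, None] * rho0[None, :]           # start from rest equilibrium

for _ in range(steps):
    rho = f.sum(axis=0)                  # macroscopic density
    u = (f * c[:, None]).sum(axis=0) / rho             # macroscopic velocity
    feq = w[:, None] * rho * (1 + 3*c[:, None]*u)      # local equilibrium
    f += (feq - f) / tau                 # collide: relax toward equilibrium
    for i in (1, 2):
        f[i] = np.roll(f[i], c[i])       # stream along lattice velocities

print(f.sum())   # total mass is conserved: 64*1.0 + 8*0.5 = 68.0
```

The quantum variant encodes the distributions in amplitudes, which is where the qubit-count and circuit-depth savings the release describes come in.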
CavilinQ Secures $8.8M Seed Round to Architect the Interconnect Layer for Scalable Quantum Systems
Insider Brief: CavilinQ raised $8.8 million in seed funding to develop interconnect hardware aimed at scaling quantum computers beyond single-processor systems. The company is building photonic links to connect multiple quantum processors into modular, distributed computing architectures. Funding will support lab development, team expansion, and early demonstrations of its quantum networking technology.

PRESS RELEASE — CavilinQ, a quantum hardware startup, today announced it has raised $8.8 million in seed funding to develop the interconnect hardware necessary to scale quantum computers beyond today’s single-processor limits. The round was led by QVT, with participation from Safar Partners, MFV Partners, Serendipity Capital, and Harper Court Ventures. The quantum industry has reached exciting milestones by performing verifiable calculations that challenge classical supercomputers. However, achieving broad, reliable real-world impact remains limited by the scaling challenge. To address this, CavilinQ is developing cavity-enhanced photonic links that enable individual quantum processors to operate together as modular, high-performance clusters. “While we’ve seen impressive demonstrations of quantum utility on specialized tasks, solving real-world problems has been limited by the physical limits of current isolated processors,” said Shankar G. Menon, CEO of CavilinQ. “We are building the interconnects that unify isolated processors into one distributed processor, providing the infrastructure to make large-scale, fault-tolerant computing a reality.” The company’s approach leverages high-fidelity light-matter interfaces, a field pioneered by its scientific co-founders Mikhail Lukin (Harvard University) and Hannes Bernien (University of Chicago / University of Innsbruck). While the technology is platform agnostic, CavilinQ will initially demonstrate integration with neutral atom quantum processors, a leading modality for large-scale quantum processing. “With recent advance…
Source: Quantum Daily
Hybrid Classical–Quantum Optimization of Wireless Routing Using QAOA and Quantum Walks
arXiv:2604.01250 (quant-ph), submitted 1 Apr 2026
Title: Hybrid Classical–Quantum Optimization of Wireless Routing Using QAOA and Quantum Walks
Authors: Eric Howard, Hardique Dasore, Hom Nath Dhungana, Radhika Kuttala, Samuel Murphy, Emma Soo, Shah Haque
Abstract: Routing in wireless communication networks is shaped by mobility, interference, congestion, and competing service requirements, making route selection a high-dimensional constrained optimization problem rather than a simple shortest-path task. This paper investigates the use of hybrid classical–quantum methods for wireless routing, focusing on the Quantum Approximate Optimization Algorithm (QAOA) and quantum walks as candidate mechanisms for exploring complex routing spaces. The paper examines how wireless routing can be expressed as a constrained graph optimization problem in which routing objectives, flow constraints, connectivity requirements, and interference effects are mapped into quantum-compatible Hamiltonian representations. It then discusses how these approaches can be integrated into a hybrid architecture in which classical systems perform network monitoring, graph construction, pre-processing, and deployment, while quantum subroutines are used for selected optimization components. The analysis shows that the potential value of quantum routing lies primarily in the treatment of difficult combinatorial subproblems rather than end-to-end replacement of classical routing frameworks. The paper also highlights practical limitations arising from state preparation, constraint encoding, oracle construction, hardware noise, limited qubit resources, and hybrid execution overhead. It is argued that any meaningful near-term advantage will depend on careful problem decomposition, compact encoding, and tight…
Source: arXiv Quantum Physics
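The mapping the abstract describes (route cost plus flow constraints folded into a cost Hamiltonian that is diagonal in the computational basis) can be illustrated classically on a toy network. The sketch below is our own illustration, not the paper's encoding: edge-selection bitstrings play the role of basis states, a penalty term enforces a valid s-t path, and brute force stands in for the QAOA/quantum-walk search. The graph, weights, and penalty form are all made up for the example.

```python
import itertools

# toy network: (u, v, weight); goal: cheapest s->t route as the ground
# state of a diagonal cost Hamiltonian H(x) over edge-selection bits x
edges = [(0, 1, 1.0), (1, 3, 1.0), (0, 2, 1.0), (2, 3, 3.0), (1, 2, 0.5)]
s, t, n_nodes, P = 0, 3, 4, 10.0   # P: constraint-penalty weight

def energy(x):
    cost = sum(w for (_, _, w), xi in zip(edges, x) if xi)
    deg = [0] * n_nodes
    for (u, v, _), xi in zip(edges, x):
        deg[u] += xi; deg[v] += xi
    pen = (deg[s] - 1) ** 2 + (deg[t] - 1) ** 2        # endpoints: degree 1
    pen += sum(d * d * (d - 2) ** 2                    # interior: degree 0 or 2
               for i, d in enumerate(deg) if i not in (s, t))
    return cost + P * pen

best = min(itertools.product([0, 1], repeat=len(edges)), key=energy)
print(best, energy(best))   # selects edges (0,1) and (1,3): route 0->1->3, cost 2.0
```

QAOA would sample low-energy bitstrings of exactly this kind of diagonal Hamiltonian instead of enumerating all of them, which is where the paper's interest in hard combinatorial subproblems comes from.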
Exhaustive Optimisation of Automorphism Groups for Stabiliser Codes
arXiv:2604.01282 (quant-ph), submitted 1 Apr 2026
Title: Exhaustive Optimisation of Automorphism Groups for Stabiliser Codes
Authors: Aisling Mac Aree, Mark Howard
Abstract: An important measure of utility for a quantum code is the identification of which logical operations can be implemented fault-tolerantly on its codespace. We introduce a framework which leverages the automorphism groups of associated classical codes, the choice of logical basis and exploitation of code equivalence to construct all distinct implementable realisations of each valid logical operation for a given $[[n,k,d]]$ code. We establish conjugacy classes and group transversals (unrelated to transversality) as key explanatory concepts. We subsequently motivate and calculate two figures-of-merit that can be optimised with this framework. Our results yield a table of optimal logical operations and their corresponding physical circuits for all small stabiliser codes with $n \leq 7$ and $k \leq 2$, drawn from quantum databases. This exhaustive table of results provides the optimal physical implementations of logical operations which may be advantageous for both magic state cultivation and experimental purposes.
Subjects: Quantum Physics (quant-ph). DOI: https://doi.org/10.48550/arXiv.2604.01282
Source: arXiv Quantum Physics
Resource Estimation via Efficient Compilation of Key Quantum Primitives
arXiv:2604.01376 (quant-ph), submitted 1 Apr 2026
Title: Resource Estimation via Efficient Compilation of Key Quantum Primitives
Authors: Colin Campbell, Rich Rines, Victory Omole, Tina Oberoi, Palash Goiporia, Rayat Roy, R. Peyton Cline, Eric B. Jones, Teague Tomesh
Abstract: Resource estimation is a significant challenge in evaluating fault tolerant quantum computers. Existing approaches often rely on either fixed architectural assumptions or coarse analytical models that fail to capture the interaction between hardware constraints and circuit compilation. This challenge is particularly acute for neutral atom quantum computers, where architectural features such as atom movement, measurement zones, and multi-species arrays introduce a broad design space for implementing fault tolerant computation. Addressing the need for a tighter feedback loop between hardware design and practical application development, we present a compilation-driven framework for quantum resource estimation that translates arbitrary quantum circuits into logical primitive operations with known physical resource costs. This framework allows for easily configurable hardware assumptions that enable rapid comparison of different architectural design choices. We apply our approach to two early fault tolerant quantum simulation and optimization workloads, assuming the use of the surface code, revealing several architectural trends. While the production of magic states continues to be the dominant source of overhead for these benchmarks, access to movement can save time on cultivation and important transversal gates. As problem size grows, routing and qubit movement become dominant bottlenecks, highlighting the need for movement-aware compiler optimizations and frugal routing strategies. Finally, our results suggest that neutral atom…
Source: arXiv Quantum Physics
Quantum polymorphism characterisation of commutativity gadgets in all quantum models
arXiv:2604.01408 (quant-ph), submitted 1 Apr 2026
Title: Quantum polymorphism characterisation of commutativity gadgets in all quantum models
Authors: Eric Culf, Josse van Dobben de Bruyn, Peter Zeman
Abstract: Commutativity gadgets provide a technique for lifting classical reductions between constraint satisfaction problems to quantum-sound reductions between the corresponding nonlocal games. We develop a general framework for commutativity gadgets in the setting of quantum homomorphisms between finite relational structures. Building on the notion of quantum homomorphism spaces, we introduce a uniform notion of commutativity gadget capturing the finite-dimensional quantum, quantum approximate, and commuting-operator models. In the robust setting, we use the weighted-algebra formalism for approximate quantum homomorphisms to capture corresponding notions of robust commutativity gadgets. Our main results characterize both non-robust and robust commutativity gadgets purely in terms of quantum polymorphism spaces: in any model, existence of a commutativity gadget is equivalent to the collapse of the corresponding quantum polymorphisms to classical ones at arity $|A|^2$, and robust gadgets are characterized by stable commutativity of the appropriate weighted polymorphism algebra. We use this characterisation to show relations between the classes of commutativity gadget, notably that existence of a robust commutativity gadget is equivalent to the existence of a corresponding non-robust one. Finally, we prove that quantum polymorphisms of complete graphs $K_n$ have a very special structure, wherein the noncommutative behaviour only comes from the quantum permutation group $S_n^+$. Combining this with techniques from combinatorial group theory, we construct separations between…
Source: arXiv Quantum Physics
Codimension-controlled universality of quantum Fisher information singularities at topological band-touching defects
arXiv:2604.01515 (quant-ph), submitted 2 Apr 2026
Title: Codimension-controlled universality of quantum Fisher information singularities at topological band-touching defects
Authors: C. A. S. Almeida
Abstract: Topological phase transitions in generic multiband systems are mediated by band-touching defects whose codimension -- the number of momentum directions along which the gap closes linearly -- varies across universality classes. Although singular behavior of fidelity susceptibilities and quantum Fisher information (QFI) has been computed for specific models, no unifying principle connecting these results has been identified: it has remained unclear whether the controlling variable is spatial dimensionality, band structure, or an intrinsic geometric property of the defect. We resolve this question by showing that the singular contribution to the QFI with respect to the tuning parameter $m$ obeys a universal power-law scaling $\sim |m|^{p-2}$ for $p \neq 2$, with a logarithmic divergence $\sim \ln(1/|m|)$ at the marginal codimension $p = 2$, where $p$ denotes the codimension of the band-touching defect. This exponent is independent of spatial dimensionality, anisotropies, ultraviolet regularization, and additional gapped bands, and is protected by renormalization-group arguments at the linearized fixed point. The result unifies previously isolated observations for SSH chains ($p=1$), Chern insulators ($p=2$), and Weyl semimetals ($p=3$) as instances of a single codimension-dependent universality class, and reveals that only defects with $p \leq 2$ generate divergent information-geometric responses. This establishes a direct and previously missing link between topological classification in momentum space and quantum distinguishability in parameter space.
Source: arXiv Quantum Physics
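The codimension-controlled power law in the abstract can be made plausible with a one-line scaling argument. The following is our own heuristic sketch, not the paper's renormalization-group derivation; the integrand is an assumed generic form with the tuning parameter $m$ as infrared scale.

```latex
% Heuristic scaling sketch (illustration only, not the paper's derivation):
% take the singular QFI contribution to come from the p gap-closing
% momentum directions, with an assumed generic integrand:
\[
  F_{\mathrm{sing}}(m) \;\sim\; \int_0^{\Lambda} \frac{k^{p-1}}{k^{2}+m^{2}}\,dk
  \;=\; |m|^{p-2} \int_0^{\Lambda/|m|} \frac{x^{p-1}}{x^{2}+1}\,dx .
\]
% For p < 2 the rescaled integral converges as \Lambda/|m| \to \infty,
% giving F_sing ~ |m|^{p-2} (divergent as m -> 0); at p = 2 it grows as
% \ln(\Lambda/|m|), matching the quoted logarithmic divergence; for p > 2
% the |m|^{p-2} piece stays finite, consistent with the abstract's claim
% that only defects with p <= 2 yield divergent responses.
```

The same bookkeeping explains why the exponent depends only on the codimension $p$ and not on the spatial dimension.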
A Differentiable Physical Framework for Goal-Driven Spin-State Engineering in Magnetic Resonance Spectroscopy
arXiv:2604.01722 (quant-ph), submitted 2 Apr 2026
Title: A Differentiable Physical Framework for Goal-Driven Spin-State Engineering in Magnetic Resonance Spectroscopy
Authors: Gaocheng Fu, Shiji Zhang, Kai Huang, Xue Yang, Huilin Zhang, Daxiu Wei, Ye-Feng Yao
Abstract: Magnetic Resonance Spectroscopy (MRS) offers a unique non-invasive window into metabolic processes, yet its potential remains strictly constrained by severe spectral congestion and intrinsic insensitivity. Traditional pulse sequence design, tethered to human intuition, predominantly targets simple quantum states, thereby overlooking the vast majority of the exponentially scaling operator space, which consists of complex spin superpositions. Here, we introduce a spectrum-driven, end-to-end differentiable physical framework that transcends these heuristic limitations. By integrating physical laws with automatic differentiation, our approach directly navigates the high-dimensional spin dynamics space, bypassing the intractable inverse problem of state preparation. This enables the discovery of non-intuitive, complex mixed states that simultaneously satisfy the dual objectives of selective excitation and interferometric signal enhancement. We validate this paradigm by achieving the robust separation of Glutamate and Glutamine, a longstanding neuroimaging challenge, in the human brain at 3T, demonstrating spectral fidelity superior to conventional methods. By unlocking the "dark" informational content of nuclear spin ensembles, our work establishes a generalizable paradigm for goal-driven quantum state engineering in magnetic resonance and beyond.
Subjects: Quantum Physics (quant-ph); Applied Physics (physics.app-ph); Medical Physics (physics.med-ph)
Source: arXiv Quantum Physics
Quantum Encryption’s Hidden Weakness Exposed by New Eavesdropping Attack
A new attack, termed ‘Manipulate-and-Observe’, exploits information leaked during key reconciliation in quantum key distribution systems. William Tighe and colleagues at the School of Physics and Astronomy demonstrate that the attack intercepts a portion of qubits and injects errors to remain undetected. Simulations using the BB84 protocol and Cascade reconciliation show this can sharply diminish security, potentially allowing full recovery of the secret key material. The findings highlight the need to reassess the combined security of both quantum key exchange and the classical post-processing steps that follow, particularly for protocols relying on parity-based error correction.

Parity leakage enables complete key recovery in quantum key distribution systems

The attack reduces the search space for an n-bit reconciled key from 2^n down to a single candidate, a feat previously impossible against existing security protocols. The ‘Manipulate-and-Observe’ attack targets quantum key distribution systems by exploiting parity leakage during reconciliation, subtly altering quantum data and analysing the resulting errors. Reconciliation protocols are error-correction routines that, while fixing errors, inadvertently reveal information to potential eavesdroppers. Simulations utilising the BB84 protocol and Cascade reconciliation revealed that this attack can diminish security below theoretical limits and, in the worst case, fully recover the secret key material, highlighting vulnerabilities in the classical post-processing stage of quantum communication. Interception of between zero and 11 percent of qubits occurs during key exchange, injecting errors while remaining undetected and thus allowing probing of parity leakage during reconciliation. Quantum key distribution (QKD) aims to provide unconditionally secure communication by leveraging the principles of quantum mechanics, such as the Heisenberg uncertainty principle and the no-cloning theorem, to guarantee the security of…
Source: Quantum Zeitgeist
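The core mechanism (parity disclosures shrinking the keyspace) is easy to see in a toy model. The sketch below is our own illustration, not the paper's attack simulation: each Cascade-style round reveals one block parity, and each independent parity cuts the set of consistent candidate keys roughly in half. Key length, block size, and round count are arbitrary toy values.

```python
import itertools
import random

random.seed(7)
n = 12
key = tuple(random.randint(0, 1) for _ in range(n))

# Each reconciliation round discloses the parity of a block of key bits.
# Count how many n-bit candidate keys stay consistent with the disclosed
# parities: every independent parity halves the remaining search space.
candidates = list(itertools.product([0, 1], repeat=n))
remaining = []
for _ in range(n):
    block = random.sample(range(n), 4)          # block whose parity leaks
    parity = sum(key[i] for i in block) % 2
    candidates = [c for c in candidates
                  if sum(c[i] for i in block) % 2 == parity]
    remaining.append(len(candidates))

print(remaining[0], remaining[-1])   # 2048 after one parity; few survivors at the end
```

The true key always survives the filtering, which is exactly why an eavesdropper who can probe these parities (as the article's attack does by injecting errors) gains so much.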
Enhanced Quantum Control Beats Previous Squeezing Limits
Optimising a single collective transverse field now enhances spin squeezing in a two-dimensional system with dipolar interactions. Achieving substantial squeezing in finite-range interacting systems proved challenging until now, but this strategy surpasses the two-axis-twisting benchmark. Ang Li of Tsinghua University and colleagues used rotor-spin-wave theory with a power-law interaction exponent of α = 3 to achieve this breakthrough in squeezing enhancement.

A new method enhances ‘spin squeezing’, a key resource for building more sensitive quantum technologies. The technique overcomes existing limitations when controlling interactions within quantum systems, enabling stronger and more dependable squeezing than previously attainable. The advance uses rotor-spin-wave theory to efficiently manage complex calculations for larger systems, demonstrating that optimising a single control field is sufficient to sharply improve squeezing. Spin squeezing gently compresses a cloud of quantum particles to reduce uncertainty in measurements, much like focusing a blurry image, and enhancing it allows for more precise readings. This advance addresses a longstanding challenge in controlling interactions within quantum systems, particularly those with finite-range interactions where particles do not influence each other equally. Li and colleagues employed rotor-spin-wave theory, a simplified map for complex quantum interactions, to efficiently calculate optimal control strategies for larger systems. Remarkably, optimising just one control field surpasses the performance of the two-axis-twisting benchmark, a standard measure of squeezing quality.

Optimised transverse fields enable enhanced spin squeezing beyond two-axis twisting

Spin squeezing, a key resource for quantum technologies, now surpasses the two-axis-twisting (TAT) benchmark, achieving a squeezing parameter exceeding that of traditional methods by a substantial margin.
Source: Quantum Zeitgeist
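For readers unfamiliar with the squeezing parameter the article compares against, here is how it is computed for the textbook one-axis-twisting baseline (not the paper's optimized transverse-field protocol, and not TAT): evolve a coherent spin state under exp(-iθJz²) and measure the minimal transverse variance relative to the mean spin. N and θ are illustrative values.

```python
import numpy as np

N = 20; J = N / 2
m = np.arange(J, -J - 1, -1)                 # m = J .. -J (Dicke basis)
c = np.sqrt(J * (J + 1) - m[1:] * (m[1:] + 1))
Jp = np.diag(c, 1)                           # raising operator
Jx = (Jp + Jp.T) / 2
Jy = (Jp - Jp.T) / 2j
Jz = np.diag(m)

# coherent spin state along +x: rotate |J, m=J> by pi/2 about y
lam, V = np.linalg.eigh(Jy)
psi = V @ (np.exp(-1j * (np.pi / 2) * lam) * (V.conj().T @ np.eye(N + 1)[0]))

# one-axis twisting exp(-i theta Jz^2) is diagonal in the Dicke basis
theta = 0.1
psi = np.exp(-1j * theta * m**2) * psi

def var(op):
    mean = psi.conj() @ op @ psi
    return np.real(psi.conj() @ op @ op @ psi - mean**2)

# Wineland squeezing parameter: minimal transverse variance vs mean spin
vmin = min(var(np.cos(p) * Jy + np.sin(p) * Jz)
           for p in np.linspace(0, np.pi, 721))
Sx = np.real(psi.conj() @ Jx @ psi)
xi2 = N * vmin / Sx**2
print(xi2)   # < 1 indicates squeezing below the coherent-state limit
```

The article's claim is that an optimized single transverse field pushes this ξ² below what even the stronger two-axis-twisting benchmark achieves in finite-range systems.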
CERN and ADVACAM Deploy Radiation Monitoring System for NASA’s Artemis II
NASA’s Artemis II mission, launched earlier this morning, carries a radiation monitoring system developed through a collaboration between CERN and ADVACAM, marking the first human journey to the Moon since 1972. The system utilizes six Timepix chips to measure the radiation environment within the Orion spacecraft, a critical step toward ensuring the safety of future lunar and deep space exploration, as astronauts are expected to receive tens of millisieverts of radiation during the ten-day journey, more than ten times the annual dose experienced on Earth. Unlike missions in low Earth orbit, Artemis II will venture beyond the protective geomagnetic field, exposing the crew and sensitive electronics to increased levels of galactic cosmic rays and trapped particles. Understanding and managing this exposure is essential for continued safe space exploration, highlighting the importance of real-time data collection for assessing risk and responding to sudden events like coronal mass ejections.

CERN Timepix Chips Monitor Artemis II Radiation Environment

Understanding this exposure is paramount for ensuring the safety of future deep-space exploration. These Timepix chips are integral to NASA’s Hybrid Electronic Radiation Assessor (HERA) system, designed to characterize the radiation in real time. The system analyzes the composition, intensity, and energy of incoming particles, providing crucial data for assessing risks to both the astronauts and the spacecraft’s sensitive electronics. In such environments, real-time radiation monitoring and characterization, along with a rapid response capability, are essential, particularly during sudden radiation events such as coronal mass ejections. Developed by the CERN-hosted Medipix2 Collaboration, the chips utilize a matrix of pixels to detect individual particles and measure their energy deposition, allowing for identification of different radiation types. This is not the first time Timepix technology has left Earth; it was initially…
Source: Quantum Zeitgeist
Qubits Ventures Hosts Quantum Day Pasadena to Connect Southern California Tech Community
Qubits Ventures will host Quantum Day Pasadena on April 14, 2026, at the California Institute of Technology, connecting the growing Southern California quantum technology ecosystem. The event coincides with World Quantum Day and Innovate Pasadena’s Connect Week, and is designed to bridge the gap between academic research and commercial application for quantum computing, a field built on quantum effects such as superposition and entanglement in basic units of information known as qubits. A panel discussion, “From Lab to Market, Accelerating Commercialization and Securing Funding,” and a fireside chat featuring a physicist from the Jet Propulsion Laboratory are planned, along with exhibits from local startups and universities. “Quantum Day Pasadena will bring together students, academics, founders, scientists, researchers, and investors, and foster growth for the Southern California tech community,” said Nardo Manaloto, Qubits Ventures Managing Partner.

Qubits Ventures Hosts Pasadena Quantum Tech Networking Event

Qubits Ventures, which invests in early-stage quantum and future-of-computing technologies globally, intends for the afternoon event to be an accessible entry point into deep tech for the wider public, facilitating connections between funders and emerging companies. The panel will be followed by a fireside chat featuring JPL physicist Lin Yi and Qubits Ventures Managing Partner Nardo Manaloto, exploring the development of a robust quantum ecosystem; Manaloto explained, “With so many leading academic programs in the area, including Caltech, USC, UCSB, UCLA, and UCI, as well as the Jet Propulsion Laboratory and a growing quantum technology startup community, an event like this is long overdue.” Beyond the scheduled talks, the event will showcase quantum devices, projects, and applications from local startups and academic institutions, alongside dedicated networking opportunities. Speakers will include Alan Ho, CEO of Qolab, and Farzaneh Afshinmanesh, CEO of PINC Technologies, a Caltech spin-off…
Source: Quantum Zeitgeist
Higgs Boson Decays Could Rule Out Faster-Than-Light Entanglement Signals
Lawrence Lee and colleagues at the University of Tennessee present a method to probe entanglement through the analysis of $H\rightarrow\tau^+\tau^-$ decays at a future electron-positron Higgs factory. Simulations of events at an energy of 240 GeV demonstrate the possibility of reconstructing tau lepton decay vertices and measuring spin correlations based on the time between the decays. The work offers the first spacetime-resolved measurement of electroweak quantum entanglement at a particle collider, and with 0.75 ab$^{-1}$ of data it could exclude entanglement signal propagation speeds below approximately two times the speed of light at 95% confidence level, providing a unique test of fundamental physics.

Higgs boson decays constrain superluminal entanglement propagation with high confidence

Entanglement signal propagation speeds below approximately 2c can now be excluded with 95% confidence, representing a substantial improvement over previous limitations that could not definitively rule out superluminal signalling. Analysing simulated data equivalent to 0.75 ab$^{-1}$ of integrated luminosity, a measure of the total data collected, at a future electron-positron Higgs factory enabled this exclusion. This makes the first proposed spacetime-resolved measurement of electroweak quantum entanglement at a particle collider possible, opening a new avenue for testing fundamental physics beyond the Standard Model.

A future electron-positron Higgs factory offers the potential to rigorously test quantum entanglement in Higgs boson decays. Simulations of collisions at an energy of $\sqrt{s}=240$ GeV, generating data equivalent to 0.75 ab$^{-1}$ of integrated luminosity, allowed reconstruction of tau lepton decay vertices and measurement of spin correlations. Propagation speeds of entanglement signals below approximately two times the speed of light were excluded, a sharp improvement over previous limitations. The analysis focused on events where the Z boson decayed into muons and…
Source: Quantum Zeitgeist
Researchers Simulate Thermal Effects to Track Quantum Evolution with High Precision
Researchers led by G. X. A. Petronilo from the Universidade Federal do Pará, together with collaborators from SENAI CIMATEC and the Universidade Federal do Oeste da Bahia, have developed a gate-based quantum algorithm that prepares and evolves the finite-temperature vacuum of Thermofield Dynamics. The protocol, utilising only single-qubit rotations and nearest-neighbour CNOT gates, exhibits a circuit depth that scales linearly with system size and is therefore suitable for near-term quantum computers. Benchmarking on the PennyLane simulator confirms the algorithm accurately reproduces known results for a spin-$1/2$ particle, including temperature-dependent damping, and provides a foundation for exploring thermal quantum simulations, dissipative phase transitions, and thermal machine-learning models on existing hardware.

Thermodynamic properties of spin-1/2 particles verified with machine precision using a novel approach

A fundamental quantum property, the magnetization of a spin-1/2 particle, now aligns with analytical predictions to machine precision, a level of accuracy previously unattainable in thermal quantum simulations. Earlier methods lacked the necessary precision to verify core theoretical results, with discrepancies arising from the limitations of simulating thermal effects on near-term quantum devices. Researchers at the Universidade Federal do Pará have developed a new gate-based quantum algorithm, utilising Thermofield Dynamics, a technique for encoding temperature into quantum systems, to achieve this unprecedented level of agreement. Thermofield Dynamics, originally formulated in the 1970s, provides a framework for describing quantum systems in thermal equilibrium by introducing a ‘duality’ between particles and anti-particles, effectively doubling the Hilbert space to incorporate thermal degrees of freedom. This allows for the treatment of temperature as a parameter within the quantum mechanical formalism itself, rather than as a classical statistical…
Source: Quantum Zeitgeist
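The Hilbert-space doubling at the heart of Thermofield Dynamics can be verified numerically for the single-spin case the article mentions: the thermal vacuum is a pure state on system plus "tilde" copy whose reduced expectation values reproduce the Gibbs ensemble exactly. A minimal sketch (illustrative β and ω; this checks the TFD construction itself, not the paper's gate-based circuit):

```python
import numpy as np

# Thermofield double for one spin-1/2: |0(beta)> = sum_i sqrt(p_i) |i>|i~>
beta, omega = 1.2, 1.0
E = np.array([+omega / 2, -omega / 2])   # energies of |0>, |1> for H = (omega/2)*sigma_z
p = np.exp(-beta * E); p /= p.sum()      # Gibbs weights

tfd = np.zeros(4)
tfd[0], tfd[3] = np.sqrt(p[0]), np.sqrt(p[1])   # |00> and |11> components

sz = np.diag([1.0, -1.0])
mag_tfd = tfd @ np.kron(sz, np.eye(2)) @ tfd    # <sigma_z x I> on the TFD state
mag_thermal = p[0] - p[1]                        # = -tanh(beta*omega/2)
print(mag_tfd, mag_thermal)   # agree to machine precision
```

This machine-precision agreement between the purified state and the thermal ensemble is exactly the benchmark quantity the article reports for the magnetization.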
The Forgotten Blue Chip Stock That's Been Quietly Compounding at 15%+ a Year
By Anders Bylund, Apr 2, 2026 at 1:13PM EST

Key Points:
- eBay has delivered a 15.8% annualized total return over the past decade, beating the S&P 500 by a small margin.
- The company's share count has dropped 62% since 2015 thanks to aggressive buybacks.
- Its pending $12 billion acquisition of Etsy's Depop brings 56 million younger users to the platform.

When investors think of e-commerce giants, eBay (NASDAQ: EBAY) rarely tops the list. The lack of fanfare hasn't stopped the online marketplace from beating the market in the long term, though. As of April 1, the 30-year-old company has been compounding at 14.3% annually over the last decade. Throw in a dividend policy yielding 1.5% annually and you get a total return of 15.8% per year. That's enough to outpace the S&P 500's 14.2% total return without ever trending on social media. (Chart: EBAY Total Return Level, data by YCharts.)

The numbers behind the quiet compounder

The company isn't trying to be everything to everyone. Unlike Amazon or Alibaba, it's not building enterprise-class artificial intelligence (AI) tools or same-day shipping networks. It's just connecting buyers and sellers of used sneakers, vintage watches, and rare Pokémon cards while steadily growing revenue. Meanwhile, the company's capital allocation has been shareholder-friendly. It returned approximately $3 billion to shareholders in 2025 through buybacks and dividends. And those buybacks make a big difference: eBay's share count is down by a staggering 62% since the end of 2015. eBay is deliberately prioritizing steady, predictable margins over boundless, expensive expansion. That might be exactly what you're looking for in a robust, long-term investment.

(Snapshot: $93.75, +0.67%; market cap $42B; 52-week range $58.71-$101.15; dividend yield 1.27%.)

Can the compounding continue?

I can't promise that eBay's market-beating compound…
The Motley Fool
Superconducting System Achieves 98 Per Cent Accuracy in Quantum Calculations
Scientists have designed and simulated a new architecture for continuous-variable (CV) quantum computation utilising superconducting circuits. Bruno A. Veloso of the Universidade Federal de São Carlos (Brazil) and colleagues created a two-layer system capable of performing all five interactions (rotation, displacement, squeezing, Kerr, and beam splitter) necessary for universal CV computation, within practical experimental parameters. The system addresses a key challenge in the field, as superconducting platforms previously lacked a scalable architecture for universal CV computation, and achieves gate fidelities exceeding 98 percent. The modular design establishes a clear route towards building high-fidelity, universal CV quantum computers based on superconducting technology.

High fidelity universal continuous-variable gate implementation in a superconducting circuit

Gate fidelities now reach 99% for Gaussian operations, a substantial improvement over previous superconducting CV devices, which struggled to surpass 98% fidelity and lacked the capacity for universal computation. The advance stems from the newly designed two-layer superconducting architecture, which implements all five interactions essential for universal CV quantum computation within experimentally viable parameters.

Continuous-variable quantum computation differs fundamentally from qubit-based approaches by leveraging the infinite-dimensional Hilbert space of bosonic modes, offering potential advantages in certain computational tasks and encoding strategies. The inherent complexity of manipulating these continuous degrees of freedom has, however, presented significant engineering challenges. The system employs a DC-SQUID to encode continuous variables, a fluxonium qubit to mediate nonlinear interactions, and two ancillary qubits to broaden computational possibilities.
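For context, the five interactions named above correspond to the standard CV gate set. These are textbook conventions, not the paper's own notation; a and b denote bosonic annihilation operators:

```latex
% Standard continuous-variable gate set (textbook conventions,
% not the paper's notation); a, b are bosonic mode operators.
\begin{align*}
  R(\theta) &= e^{i\theta\, a^\dagger a}
    && \text{rotation (phase shift)} \\
  D(\alpha) &= e^{\alpha a^\dagger - \alpha^* a}
    && \text{displacement} \\
  S(\zeta)  &= e^{\frac{1}{2}\left(\zeta^* a^2 - \zeta\, a^{\dagger 2}\right)}
    && \text{squeezing} \\
  K(\kappa) &= e^{i\kappa\, (a^\dagger a)^2}
    && \text{Kerr (non-Gaussian)} \\
  B(\theta) &= e^{\theta\left(a^\dagger b - a b^\dagger\right)}
    && \text{beam splitter}
\end{align*}
```

The first four single-mode gates plus the beam splitter are sufficient in the Lloyd-Braunstein sense: the Gaussian operations alone are classically simulable, and adding any single non-Gaussian element such as the Kerr interaction yields universality, which is why all five must be realised on one platform.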
Quantum Zeitgeist
I found that parameterized quantum circuits always produce outputs inside the Mandelbrot set — even across 130 inputs and 100 random seeds
I’ve been studying the geometry of parameterized quantum circuit (PQC) outputs, and I stumbled onto something unexpected. If you take adjacent qubit probabilities from a PQC, center them around 0.5, and treat them as a complex number c, then iterate the Mandelbrot map z → z² + c, the orbit never escapes.

Across:

- 130 inputs
- 35 mathematical families
- a 6-qubit, 6-layer PQC
- 4096–8192 shots per input

…every single output landed inside the Mandelbrot set. I stress-tested this across 2–20 qubits, 1–50 layers, and 100 random parameter seeds. The result held 97.5% of the time. The only failures were circuits with just one entanglement layer; with 2+ layers, boundedness was universal.

The mechanism seems to be entanglement: CNOT cascades prevent adjacent qubits from simultaneously reaching extreme probabilities, which keeps |c| ≤ 0.606, well inside the Mandelbrot cardioid.

This doesn’t encode input identity (p = 0.83), so it’s not a classifier. But it is a geometric constraint on PQC output space that I haven’t seen reported before.

Full paper (Zenodo): https://zenodo.org/records/19367794

submitted by /u/Clean-Swordfish-5977
Reddit r/QuantumComputing (RSS)
Quanscient and Haiqu Demonstrate Algorithm for Scalable Computational Nonlinear Fluid Simulations
Quanscient and Haiqu have announced a new quantum algorithm designed to accelerate Computational Fluid Dynamics (CFD) simulations. The researchers successfully executed a 15-step nonlinear fluid benchmark involving an obstacle, which currently stands as the most physically complex hardware demonstration of a Quantum Lattice Boltzmann Method (QLBM). The benchmark was conducted on the IBM Heron R3 quantum processor, demonstrating that complex fluid behaviors can be simulated with significantly fewer qubits and computational operations than previously required.

The technical core of the breakthrough is a novel One-Step Simplified LBM (OSSLBM). Traditionally, QLBM implementations are resource-heavy, often exceeding the qubit counts or circuit depths available on near-term hardware. By utilizing Haiqu’s middleware and runtime layer, the team reduced the circuit depth and applied targeted error-reduction techniques. This allowed the system to maintain convergence toward a steady state even in the presence of hardware noise. The OSSLBM framework is notably more flexible than earlier models, allowing a wider range of physics, from linear acoustics to nonlinear Navier-Stokes problems, to be modeled within a hybrid quantum-classical loop.

Industrial CFD is a cornerstone of the aerospace, automotive, and energy sectors, where simulating airflow or liquid turbulence can take weeks on classical supercomputers. This collaboration outlines a practical path for moving beyond simple linear demonstrations toward realistic engineering applications. While current hardware still faces challenges with non-unitarity and amplitude dissipation, the Quanscient and Haiqu results suggest that with the right algorithmic “middleware,” industrially relevant fluid simulations may reach commercial viability sooner than expected as quantum systems continue to scale. For the full technical results and
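To ground the terminology: a lattice Boltzmann method evolves particle populations through alternating collision (relax toward a local equilibrium) and streaming (shift populations along their velocity) steps until a steady state is reached. The toy below is a minimal classical D1Q2 model of that loop, purely to illustrate what a QLBM encodes in quantum circuits; it is not the OSSLBM algorithm from the announcement:

```python
import numpy as np

def lbm_d1q2_step(f_plus, f_minus, tau=1.0):
    # One collision + streaming step of a minimal classical D1Q2 lattice
    # Boltzmann model (an illustration of the classical method that QLBM
    # maps onto quantum circuits; not Quanscient/Haiqu's OSSLBM).
    rho = f_plus + f_minus          # local density (collision invariant)
    feq = rho / 2.0                 # equilibrium: equal split over the +1/-1 velocities
    f_plus = f_plus + (feq - f_plus) / tau
    f_minus = f_minus + (feq - f_minus) / tau
    # Streaming: each population moves one site in its velocity direction
    # (periodic boundaries for simplicity).
    return np.roll(f_plus, 1), np.roll(f_minus, -1)

# Relax an initial density bump toward the uniform steady state.
n = 33
rho0 = np.ones(n)
rho0[n // 2] += 1.0
fp, fm = rho0 / 2, rho0 / 2
for _ in range(2000):
    fp, fm = lbm_d1q2_step(fp, fm)
rho = fp + fm
```

The quantum-hardware difficulty mentioned in the article shows up exactly here: collision is a non-unitary relaxation, so implementing it on a quantum processor requires workarounds for non-unitarity and amplitude dissipation.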
Quantum Computing Report
Commonwealth Fusion Systems leans on magnets for near-term revenue
Commonwealth Fusion Systems said on Thursday it would sell high-temperature superconducting magnets to Realta Fusion, the second in a string of deals that suggests the company will lean heavily on its magnet technology in the coming years to bring in much-needed revenue. “It’s the largest deal of this kind to date for CFS,” Rick Needham, the company’s COO, told reporters on a call.

Commonwealth Fusion Systems, or CFS, previously sold magnets to the WHAM experiment at the University of Wisconsin, with which fusion startup Realta collaborates closely. The physics behind WHAM underpins Realta’s approach to fusion power, which is known as a magnetic mirror reactor. In a magnetic mirror, plasma is confined into a shape that resembles two 2-liter soda bottles connected at the base. On each end, powerful magnets pinch the plasma and force it back toward the center. Weaker magnets encircle the middle of the bottle shape. To make a more powerful reactor, Khosla-backed Realta would only need to expand the middle section, and because those magnets are less powerful, they’re cheaper. Per-kilowatt-hour costs should fall as Realta’s reactors increase in size.

CFS is pursuing another form of magnetic confinement fusion called a tokamak. In a tokamak, D-shaped magnets cast powerful fields to keep plasma circulating in a doughnut-like shape inside. Over the years, the company has refined its magnets in pursuit of putting electrons on the grid from Arc, its future commercial-scale reactor that’s slated to be built in Virginia.

Both CFS’ and Realta’s existence stems from the magnets themselves. CFS was founded in 2018 after scientists at MIT realized that a new class of commercially available high-temperature superconductors could underpin a viable tokamak design. Realta was founded a few years later when physicists at the University of Wisconsin “saw that there was a new technology, a game changer that would enable us to go back to the [magnetic] mirror and avai
TechCrunch
Humans head to the moon for the first time in years
For more than half a century, the moon has been something humans could only look at. Wednesday, April 1, changed that. At 6:35 p.m. EDT, a 322-foot rocket carrying four astronauts lifted off from Kennedy Space Center in Florida and pointed toward the lunar sky, per NASA. "We have a beautiful ...
TheStreet