
Is RSA Safe? New Study Argues Quantum Computers Face a Hard Ceiling

Quantum Daily
6 min read
Insider Brief

- A paper published in Proceedings of the National Academy of Sciences proposes a revision to quantum theory that, if validated, would place a strict upper bound on the power of quantum computers. That limit could mean RSA, a cryptographic system protecting much of today's data, may not be threatened after all.
- The study introduces a framework called Rational Quantum Mechanics, or RaQM, which suggests that the mathematical space underlying quantum systems is not continuous, as long assumed, but fundamentally discrete.
- The framework suggests that quantum computers may never achieve the large-scale performance needed to break modern encryption or deliver the full exponential speedups long promised by the field.

The study, written by Tim Palmer of the University of Oxford, reframes quantum mechanics as an approximation of a deeper, information-limited system. While standard quantum theory allows systems to scale indefinitely in complexity, RaQM imposes a finite capacity on how much information a quantum system can encode. According to the paper, that capacity translates into a maximum number of qubits, the basic units of quantum information, that can be meaningfully entangled and used in computation.

Estimates in the study place that limit between roughly 200 and 400 qubits for current technologies, and suggest it may never exceed about 1,000 qubits under any physical implementation. Beyond that threshold, the theory predicts, quantum computers would lose their computational edge even if engineers succeed in building larger and more stable machines.

The central claim of the paper is that the continuous mathematical structure used in quantum mechanics, known as Hilbert space, is an idealization. In practice, according to the study, it may be composed of discrete, rational elements that impose strict constraints on quantum systems.

In conventional quantum mechanics, a system of N qubits occupies a state described by an exponentially large number of parameters. This exponential scaling underpins the promise of quantum computing, enabling algorithms such as Shor's method to factor large numbers far faster than classical machines.

RaQM challenges that premise by introducing a concept called "qubit information capacity," which grows only linearly with the number of qubits. When the exponential number of quantum states outpaces the available information capacity, the system can no longer represent all possible configurations. At that point, the researcher suggests, quantum mechanics itself ceases to apply in its standard form.

The practical consequence is that quantum algorithms that rely on fully exploiting the exponential state space, particularly those involving large-scale entanglement, would stop delivering advantages over classical computation once systems exceed a certain size.
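To make that scaling argument concrete, here is a minimal sketch in Python, ours rather than the paper's, comparing the 2^N amplitudes of a standard N-qubit state against a hypothetical information budget that grows linearly with N. The constant BITS_PER_QUBIT is an illustrative assumption, not a value taken from the study.

```python
# Minimal sketch (illustrative only): compares the exponential number of
# amplitudes in a standard N-qubit state with a hypothetical "qubit
# information capacity" that grows linearly in N, as RaQM proposes.
# BITS_PER_QUBIT is an assumed constant, not a value from the paper.

BITS_PER_QUBIT = 100  # hypothetical information budget per qubit, in bits

def state_parameters(n_qubits: int) -> int:
    """Complex amplitudes needed to describe an n-qubit pure state."""
    return 2 ** n_qubits

def information_capacity(n_qubits: int) -> int:
    """Hypothetical linear information budget, in bits."""
    return BITS_PER_QUBIT * n_qubits

for n in (10, 50, 100, 200, 400, 1000):
    amplitudes = state_parameters(n)
    budget = information_capacity(n)
    # Once 2**n exceeds the budget, the state can no longer be
    # fully encoded within the available information.
    print(f"{n:>4} qubits: 2^n = {float(amplitudes):.2e}, "
          f"budget = {budget} bits, fully representable: {amplitudes <= budget}")
```

With any fixed number of bits per qubit, the exponential term overtakes a linear budget after only a few dozen qubits; the study's 200-to-400-qubit estimates come from its gravity-based capacity arguments rather than from a simple count like this one.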
The findings bear directly on the breaking of public-key cryptography, one of the most widely cited long-term applications of quantum computing. Shor's algorithm, a quantum method for factoring large integers, is widely regarded as a future threat to RSA encryption.

The study directly addresses this scenario, arguing that a quantum computer capable of factoring a 2,048-bit RSA key would require more qubits than the proposed limit allows. As a result, the paper concludes, such encryption schemes may remain secure, not because of technological barriers but because of fundamental physical constraints.

The implications extend beyond cryptography. If RaQM is correct, the study suggests, the trajectory of the quantum computing industry would need to shift from pursuing large-scale, fault-tolerant systems toward more targeted applications that operate within the proposed limits.

Near-term quantum computers, often referred to as noisy intermediate-scale quantum (NISQ) devices, would remain useful. These systems operate with relatively small numbers of qubits and are already being explored for applications in chemistry, materials science and optimization. In those domains, the study suggests, RaQM and standard quantum mechanics would produce indistinguishable predictions. The divergence appears only at larger scales, where the limits of information capacity become significant.

The study builds its argument by changing a basic assumption of quantum theory: that the numbers used to describe quantum states can take any value on a continuum, to arbitrary precision. Instead, RaQM restricts these parameters to rational numbers, fractions that can be described using a finite amount of information. This effectively replaces the smooth, continuous structure of Hilbert space with a granular one.

To represent quantum states under this framework, the paper describes them as finite-length bit strings rather than continuous wavefunctions. In this formulation, the amount of information available to describe a system is explicitly limited.
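What such a finite description might look like can be illustrated with a short Python sketch, again ours rather than the paper's: amplitudes are rounded to fractions with a bounded denominator, so the whole state fits in a fixed number of bits. MAX_DENOMINATOR and the rounding scheme are illustrative assumptions, not Palmer's construction.

```python
# Minimal sketch (illustrative only): stores qubit amplitudes as rational
# numbers with a bounded denominator, so the state description is a
# finite-length bit string. The bound is an assumed budget for
# illustration, not the discretization constructed in the RaQM paper.
from fractions import Fraction
import math

MAX_DENOMINATOR = 2 ** 16  # hypothetical finite-information budget

def rationalize(x: float) -> Fraction:
    """Round a real amplitude to the nearest fraction within the budget."""
    return Fraction(x).limit_denominator(MAX_DENOMINATOR)

# A single-qubit state cos(theta)|0> + sin(theta)|1> with irrational amplitudes.
theta = math.pi / 7
alpha, beta = rationalize(math.cos(theta)), rationalize(math.sin(theta))

# Each rational amplitude occupies finitely many bits (numerator + denominator).
bits = sum(f.numerator.bit_length() + f.denominator.bit_length()
           for f in (alpha, beta))

print(f"alpha ~ {alpha}, beta ~ {beta}")
print(f"description length: {bits} bits")
print(f"norm error: {float(abs(alpha**2 + beta**2 - 1)):.2e}")
```

In a continuous Hilbert space the amplitudes would in general require unbounded precision; capping the denominator caps the description length, which is the kind of constraint RaQM elevates from approximation to physical principle.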
The theory further links this discretization to gravity. Palmer proposes that gravitational effects, often considered negligible in quantum systems, play a fundamental role in determining the structure of quantum state space. Using models of gravitationally induced state reduction, the study estimates the scale at which discretization becomes relevant; these estimates yield the proposed limits on qubit capacity.

The paper also outlines a potential experimental test. Quantum algorithms that require maximal entanglement across many qubits, such as the quantum Fourier transform used in Shor's algorithm, would serve as a proving ground. If performance plateaus or degrades beyond a certain number of qubits, it could indicate the presence of the proposed limit.

The theory remains speculative and departs from widely accepted principles of quantum mechanics. Standard quantum theory has been extensively validated across a broad range of experiments, and no clear evidence has yet emerged for the type of discretization RaQM proposes. The paper acknowledges that RaQM and conventional quantum mechanics are indistinguishable for small systems, which complicates efforts to test the theory in current experimental setups.

Another open question concerns the role of error correction. Modern quantum computing roadmaps rely on encoding logical qubits across many physical qubits to suppress noise. RaQM suggests that increasing the number of qubits would not circumvent the fundamental limit, but this claim remains untested.

The study also raises conceptual questions about the nature of quantum theory itself. By framing quantum mechanics as a limiting case of a deeper, discrete system, RaQM challenges long-standing assumptions about continuity, randomness and the role of measurement.

Those wondering whether RaQM holds up may not have to wait long. The paper points to near-term quantum computing experiments as a potential path to validation. As hardware improves and systems approach hundreds of qubits, researchers may be able to test whether performance continues to scale as standard theory predicts.

If quantum computers demonstrate sustained exponential speedups beyond the proposed limits, RaQM would be falsified. If not, the theory could gain traction as a candidate for reconciling quantum mechanics with gravity, one of the central challenges in modern physics.

Beyond computing, the framework offers a new lens for interpreting quantum phenomena. The study suggests that features such as entanglement and measurement may arise from underlying information constraints rather than intrinsic randomness.

Because the PNAS paper remains behind a paywall, the arXiv version was used for this article. Note that there may be differences between the preprint and the final accepted version.

