You’re not the weakest link, your network is

Sponsored | With quantum computing closing in and every network only as strong as its weakest link, securing a path to quantum-safe resilience has never been more imperative.

December 16, 2025 | By Farah Johnson-May, DatacenterDynamics

With the era of quantum computing edging ever closer, the long-term security of today’s data is at risk. Operators across the industry are beginning to rethink the foundations of cryptography, key management, and network architecture in anticipation of a world where quantum machines can overturn long-standing security assumptions.

In a recent SDx>Broadcast episode, ‘Understanding quantum-safe networks’, Sameh Yamany, Viavi’s CTO and chief AI officer, talks about understanding your cryptographic assets, testing mitigation techniques in real environments, and building agility into your roadmap.

A quantum-safe network is built to remain secure amid the emergence of large-scale quantum computers – capable of breaking much of today’s cryptographic landscape – which are rapidly becoming a reality. Current encryption schemes rely on the known limits of classical computing, and quantum computing dismantles that assumption: what we consider secure today may be obsolete tomorrow.

Preparing for this shift requires a multilayered approach: deploying post-quantum cryptography (PQC), exploring quantum key distribution (QKD), and implementing agile, comprehensive key management systems that can evolve alongside emerging threats. In this context, resilience is not just about stronger algorithms, but about securing every link in the chain. After all, as Yamany puts it:

“A quantum-safe network is only as strong as its weakest link in the chain of end-to-end quantum-safe technology.”

Assembling the pieces

PQC relies on advanced mathematical algorithms designed to withstand quantum attacks, protecting data as it travels across networks. QKD, on the other hand, safeguards the exchange of encryption keys themselves, using the laws of quantum physics to ensure that they cannot be intercepted or tampered with – a form of protection rooted in physical hardware rather than computation.

In practical deployments, PQC will serve as the foundational layer that secures most data movement, while QKD will reinforce the underlying infrastructure for ultra-high-value, latency-tolerant links. The result is a hybrid security model that blends mathematical defense with physical assurance.

“The hybrid model will dominate quantum protection for decades, particularly across national networks, financial institutions, data centers, and other mission-critical infrastructure.”

What Yamany underscores is that building quantum-safe networks is not simply a matter of ‘fixing cryptography’ – quantum capability challenges the very assumptions of classical encryption. The real shift lies in transforming operations so cryptography becomes a living system – continuously monitored, assured, and adaptable as the dynamics of quantum attacks evolve.
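To make the PQC building block concrete, here is a minimal sketch of a post-quantum key encapsulation (KEM) round trip. It assumes the open-source liboqs-python bindings and NIST’s ML-KEM algorithm – neither is named in the broadcast – and illustrates the generic KEM flow rather than any vendor’s implementation.

```python
# Minimal sketch of a post-quantum key-encapsulation (KEM) round trip.
# Assumes the open-source liboqs-python bindings (pip install liboqs-python);
# older liboqs releases expose this algorithm under the name "Kyber768".
import oqs

ALGORITHM = "ML-KEM-768"  # NIST-standardized ML-KEM at security level 3

# The receiver creates a keypair and publishes the public key.
with oqs.KeyEncapsulation(ALGORITHM) as receiver:
    public_key = receiver.generate_keypair()

    # The sender encapsulates against the public key, obtaining a
    # ciphertext and a shared secret in one step.
    with oqs.KeyEncapsulation(ALGORITHM) as sender:
        ciphertext, secret_at_sender = sender.encap_secret(public_key)

    # The receiver decapsulates the ciphertext with its secret key
    # and recovers the same shared secret.
    secret_at_receiver = receiver.decap_secret(ciphertext)

assert secret_at_sender == secret_at_receiver  # both ends share one key
```

The point of the exercise is that PQC slots into existing software key-exchange patterns, which is why it can serve as the foundational layer Yamany describes.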
The data farming threat

As cyberthreats grow more sophisticated, addressing quantum-safe security is no longer a future concern but an immediate priority. Standards bodies such as the US National Institute of Standards and Technology (NIST) are already driving global efforts to prepare for the quantum era. Yamany explains:

“People know the concept of ‘harvest now, decrypt later.’ With quantum computing expected within the next five to ten years, many attackers are saying, ‘I’ll be able to decrypt later, so I’ll harvest now.’ For long-lived data, the risk is enormous. That’s why NIST and governments worldwide are pushing to embed PQC into systems today – you have to start building secure networks now.”

These ‘harvest now, decrypt later’ attacks are already underway, collecting sensitive information with the expectation that it can be unlocked once quantum capabilities mature. This is especially dangerous for national and critical infrastructure, financial and healthcare systems, and intellectual property – all domains where data must remain secure for decades.

Telecom operators sit at the center of this risk landscape: they own and run the infrastructure that carries the data of every other sector, making them responsible not just for protecting their own networks, but for enabling quantum-safe cryptography across entire ecosystems.

This raises the critical question of how quantum-safe operations change the way operators test, validate, and assure their networks, as so many new variables come into play. Quantum-safe technologies introduce entirely new performance considerations, from algorithmic complexity and latency to key exchange scalability, hardware acceleration, and interoperability. One example is ‘handshake delay,’ the added latency introduced during the initial connection phase when using new PQC algorithms. It is a real, measurable challenge during the transition to quantum-secure systems.

“Operators have to think about continuous automated testing for continuous assurance – understanding how these dynamics behave in a real network, not just in the lab,” says Yamany.
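As a small taste of what such automated measurement looks like, the sketch below times repeated TLS handshakes against a host using only Python’s standard library. It measures today’s classical handshake; pointed at a PQC-enabled endpoint, the same harness would expose handshake delay. The target host and sample count are illustrative assumptions, not from the article.

```python
# Hedged sketch: timing TLS handshakes to establish a latency baseline,
# so PQC-induced 'handshake delay' becomes measurable. Standard library
# only; HOST and SAMPLES are hypothetical placeholders.
import socket
import ssl
import statistics
import time

HOST, PORT, SAMPLES = "example.com", 443, 20

context = ssl.create_default_context()
timings = []
for _ in range(SAMPLES):
    with socket.create_connection((HOST, PORT), timeout=5) as sock:
        start = time.perf_counter()  # clock starts after TCP connect
        # wrap_socket performs the full TLS handshake before returning.
        with context.wrap_socket(sock, server_hostname=HOST):
            timings.append(time.perf_counter() - start)

print(f"median handshake: {statistics.median(timings) * 1000:.1f} ms")
print(f"worst handshake:  {max(timings) * 1000:.1f} ms")
```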
Managing error and validation in real time

Because quantum-safe technologies are new, introducing them brings a range of new complexities. When PQC is deployed, its advanced mathematical computations increase processing demands, which can affect service-level agreements (SLAs) and require operators to redefine performance expectations.

For example, if a network operator were to deploy PQC without preparation and suddenly find that a core security device, like a firewall, suffers a 60 percent performance drop due to the added computational load, the impact would be immediate and unacceptable. This is why understanding how a network will realistically behave is essential, especially as hybrid environments emerge.

This is where network test and monitoring leaders like Viavi come in. They go beyond traditional lab testing with synthetic or emulated protocols to incorporate continuous field measurements, providing far more accurate insight into how an entire system will perform end-to-end under real-world conditions.

“It’s difficult for smaller players to build and validate an entire ecosystem, so we’re creating a multi-environment, multi-vendor framework to integrate PQC and deliver it as a service. Financial institutions and operators can bring their systems to us and test them in a realistic environment – so they can be confident it will work in the field,” Yamany illustrates, adding:

“Our quantum self-test lab is developing optimization techniques using alternative computing hardware, like GPUs or ASICs, to run these algorithms, enabling continuous performance improvement.”

Digital twins to ensure interoperability

It is becoming increasingly clear that technologies like digital twins will be essential, particularly in quantum-safe networks, because one of the biggest real-world challenges they help address is interoperability.

No single vendor is likely to deliver a fully end-to-end quantum-safe network. Some links may use PQC, others may rely on QKD, and still others may combine different PQC schemes or even satellite-based connections. You cannot deploy QKD everywhere, and once these mixed technologies are combined, it becomes extremely difficult to understand how true interoperability will look across the entire system.

“That’s why interoperability test frameworks and scenarios are so important in bringing ecosystems together and figuring out how to make them work. We are proposing a new alliance, similar to the Open Radio Access Network (RAN) – we call it the Open Quantum Safe Alliance.

“This is how you bring all the ecosystems together to build use cases for different industries and verticals. Financial services may need something very different from critical infrastructure or automotive. So we have to develop blueprints that can be certified and trusted, ensuring that when they are deployed in real networks, interoperability is achieved across a multi-vendor environment.”

Ecosystems readiness

As quantum-safe security advances, new optimization strategies are emerging – particularly in areas where AI and specialized hardware can help accelerate performance. Yet, even with this momentum, the broader ecosystem is still evolving.

Standards bodies such as the European Telecommunications Standards Institute (ETSI) and NIST are laying essential groundwork for standardizing new PQC encryption algorithms, but significant gaps remain in assuring performance and interoperability across the broader systems-level architectures that operators and industries rely on.

Where PQC has been making meaningful progress thanks to NIST’s early standardization efforts and its inherently software-based nature, QKD is only just gaining traction as it contends with the constraints of hardware- and physics-dependent implementation.

Fiber infrastructure is another important factor. Two or three decades ago, deploying fiber was expensive and complex; today, it is far simpler and more cost-effective. Yamany expects QKD’s adoption curve to follow a similar trajectory as its technologies and deployment models mature.

Even if PQC and QKD technologies themselves continue to improve, Yamany points out, one of the biggest gaps lies in the management of cryptographic keys across different networks, vendors, and technologies – that is, key management interoperability:

“Key management sits above everything – it’s where cross-domain orchestration happens, hybrid PQC-QKD workflows are defined, and realistic, high-scale, low-latency emulation takes place. That layer doesn’t exist yet in a mature form, which is why we’re putting key management interoperability at the forefront of our testing today.”
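To show what a hybrid PQC-QKD workflow can look like at its smallest scale, the sketch below folds two independently obtained shared secrets into a single session key, so the link stays protected if either mechanism is later broken. The concatenate-then-HKDF construction mirrors hybrid TLS key exchange; the cryptography package and every name here are assumptions for illustration, not a description of Viavi’s key manager.

```python
# Hedged sketch: deriving one session key from two shared secrets
# (e.g., one delivered by QKD hardware, one from a PQC KEM), so the
# session stays secure if either source fails. Assumes the widely used
# 'cryptography' package; all names are illustrative.
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF


def combine_secrets(qkd_secret: bytes, pqc_secret: bytes) -> bytes:
    """Concatenate both secrets and run them through HKDF-SHA256."""
    return HKDF(
        algorithm=hashes.SHA256(),
        length=32,                          # 256-bit session key
        salt=None,
        info=b"hybrid-pqc-qkd-session-v1",  # hypothetical context label
    ).derive(qkd_secret + pqc_secret)


# Stand-ins for secrets a real key manager would fetch from a QKD
# key-delivery interface and a PQC decapsulation, respectively.
session_key = combine_secrets(os.urandom(32), os.urandom(32))
```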
If a cryptographic transition is on the cards

In short, Yamany’s advice is to treat the adoption of quantum-safe cryptography as a lifecycle transformation, not a one-and-done upgrade.

“You have to build the culture and teams that understand this is a lifecycle. Start with inventory – you need to know exactly what you have today, because if you overlook even one part of quantum safety, the whole effort becomes null.”

His advice is as follows, with a minimal inventory sketch after the list:

- Begin with testing and architecture redesign
- Build agility into every layer
- Assume nothing is permanent and that algorithms will continue to evolve
- With each step forward in quantum computing, certain algorithms will inevitably become obsolete, so design your processes and organization with that reality in mind
- And leverage AI
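Picking up the ‘start with inventory’ advice, here is a minimal sketch of a first-pass cryptographic inventory: pulling each server’s certificate and recording its public-key type and signature hash, the raw material for deciding what must migrate to PQC first. The host list and the use of the cryptography package are assumptions for illustration.

```python
# Hedged sketch of a first-pass cryptographic inventory: record each
# host's certificate key type/size and signature hash. Assumes the
# 'cryptography' package (pip install cryptography); HOSTS is a
# hypothetical placeholder for a real asset list.
import ssl

from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import ec, rsa

HOSTS = ["example.com", "example.org"]

for host in HOSTS:
    pem = ssl.get_server_certificate((host, 443))
    cert = x509.load_pem_x509_certificate(pem.encode())
    key = cert.public_key()
    if isinstance(key, rsa.RSAPublicKey):
        key_desc = f"RSA-{key.key_size}"   # quantum-vulnerable RSA
    elif isinstance(key, ec.EllipticCurvePublicKey):
        key_desc = f"EC-{key.curve.name}"  # e.g. secp256r1
    else:
        key_desc = type(key).__name__
    sig = cert.signature_hash_algorithm
    print(f"{host}: key={key_desc}, sig-hash={sig.name if sig else 'n/a'}")
```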
While many worry that AI will accelerate quantum threats, fewer recognize that AI can also strengthen quantum-safe networks. As the saying goes, ‘don’t bring a knife to a gunfight’ – match the sophistication of the threat. As Yamany explains, AI can help pinpoint vulnerabilities, accelerate hardware optimization, and improve cryptographic performance:

“The smarter operators today are using AI to identify and optimize their weakest points. As a provider of the data those AI systems rely on, we can partner with operators to build evolutionary scenarios and highlight the capabilities they need to bring in.”

So, with significant innovation and algorithmic evolution still ahead, Yamany’s strategic message is to begin testing now in real, multi-vendor environments with realistic data, adopt quantum-safe technologies and digital twin approaches, and work toward building quantum-safe testing hubs that unite the ecosystem. Ultimately, don’t wait for quantum computers to mature; prepare your roadmap now.

Watch the full broadcast episode with Viavi here. Subscribe to The Networking Channel for regular news round-ups, market reports, and more.
Read Original

Tags

quantum-computing
quantum-cryptography

Source Information

Source: Google News – Quantum Computing