
Industry

Commercial developments, partnerships, acquisitions, and business news from quantum computing companies.

201 Articles
Updated Daily


The hidden cost of MFT vulnerabilities
industry

Beware MFT vulnerabilities, says Kiteworks’ John Lynch. (Image: summer studio / Shutterstock)

When Fortra disclosed CVE-2025-10035 in GoAnywhere MFT last month, many security teams would have experienced a familiar sinking feeling. Another critical vulnerability. Another emergency patch cycle. Another race against ransomware operators. But this one revealed something more troubling: the fundamental fragility of how organisations handle their most sensitive data transfers.

This is an industry-wide crisis hiding in plain sight. Legacy managed file transfer (MFT) systems have suffered similar critical vulnerabilities in recent years, and each follows an eerily similar pattern: authentication bypass or code execution flaws that grant attackers the keys to the kingdom. The reason is structural, not coincidental. These solutions exist at the intersection of maximum value and maximum exposure. MFT systems handle everything from financial transactions to healthcare records, intellectual property to government secrets. Yet they must also connect disparate networks, bridge security domains, and accommodate external partners with varying security postures. This inherent tension creates attack surfaces that grow with every integration point.

The uncomfortable truth for businesses is that if a firm’s strategy relies primarily on patching vulnerabilities quickly, it has already lost. The problem isn’t the patches; the architecture itself turns every vulnerability into an existential threat. Fortunately, modern architectural patterns offer a different path. Think of security as layers of Swiss cheese: any single layer has holes, but stacking them creates defence in depth. Sandboxing isolates risky components, preventing deserialisation flaws from achieving system compromise. Zero-trust networking assumes breach and limits blast radius. …
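The deserialisation risk mentioned above is the easiest of these layers to show in code. Below is a minimal, hypothetical Python sketch of the allow-listing idea (not Fortra’s or Kiteworks’ implementation): a restricted unpickler refuses to load any class outside an explicit allow-list, so a crafted payload cannot instantiate arbitrary objects even if it reaches the parser.

```python
import io
import pickle

# Hypothetical allow-list: only these (module, class) pairs may be deserialised.
SAFE_CLASSES = {
    ("builtins", "dict"),
    ("builtins", "list"),
    ("builtins", "str"),
    ("builtins", "int"),
}

class RestrictedUnpickler(pickle.Unpickler):
    """Unpickler that fails closed on anything outside the allow-list."""

    def find_class(self, module, name):
        if (module, name) in SAFE_CLASSES:
            return super().find_class(module, name)
        # Unknown types are treated as hostile rather than resolved.
        raise pickle.UnpicklingError(f"blocked deserialisation of {module}.{name}")

def restricted_loads(data: bytes):
    """Deserialise untrusted bytes through the restricted unpickler."""
    return RestrictedUnpickler(io.BytesIO(data)).load()

if __name__ == "__main__":
    benign = pickle.dumps({"file": "report.csv", "size_bytes": 1024})
    print(restricted_loads(benign))  # harmless payload round-trips normally
```

The same fail-closed posture, applied at the process and network layers, is what sandboxing and zero-trust segmentation add on top of this.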

OQC Advances Integration of Quantum Computing and AI Supercomputing with NVIDIA NVQLink
industry

PRESS RELEASE

London, UK, 28 October 2025 — Oxford Quantum Circuits (OQC), a global leader in quantum computing, is working with NVIDIA on NVQLink, a groundbreaking open system architecture that provides real-time, low-latency connectivity between quantum and AI supercomputing systems.

This initiative builds on OQC’s earlier announcement of its collaboration with NVIDIA on the industry’s first quantum-AI data centre in New York City with Digital Realty – a landmark step towards scalable, hybrid quantum-classical computing in the heart of financial and national infrastructure. OQC’s vision for the future of hybrid compute is one where quantum processors and classical supercomputers work seamlessly together to unlock new levels of performance and scalability. NVQLink represents a major step towards this future, offering the open, interoperable foundation needed to connect the world’s most advanced computing technologies.

“NVIDIA NVQLink is a milestone for the industry, providing the platform that will help accelerate the transition towards truly hybrid computing,” said Simon Phillips, Chief Technology Officer at OQC. “At OQC, we believe the future of computing will be built on collaboration between the world’s most advanced quantum and classical systems – and NVQLink helps make that vision possible.”

Bridging Quantum and AI Supercomputing

At the heart of NVQLink lies a real-time, low-latency and high-throughput connection layer, optimised for hybrid quantum-classical orchestration. Built on NVIDIA CUDA-Q, NVQLink allows developers and hardware partners to interlace compute resources (GPUs, CPUs and QPUs) …
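For readers unfamiliar with CUDA-Q, the hybrid orchestration model the release describes is easier to picture with a small example. The following is a generic, minimal sketch in Python (assuming the `cudaq` package is installed; it is a textbook Bell-state kernel, not OQC’s NVQLink integration): a quantum kernel is defined on the host and sampled by classical code, the kind of tight CPU/GPU-to-QPU round trip whose latency NVQLink is designed to shrink.

```python
import cudaq

@cudaq.kernel
def bell():
    # Prepare a two-qubit Bell state: H on qubit 0, then CNOT from 0 to 1.
    qubits = cudaq.qvector(2)
    h(qubits[0])
    x.ctrl(qubits[0], qubits[1])
    mz(qubits)  # measure both qubits

# Classical side: submit the kernel and collect measurement statistics.
# On real hardware, this submit-and-read-back loop is where round-trip
# latency between the QPU and the controlling CPUs/GPUs matters most.
counts = cudaq.sample(bell, shots_count=1000)
print(counts)  # expect roughly equal '00' and '11' outcomes
```

By default the kernel runs on a local simulator; retargeting it to real hardware is a matter of selecting a different backend, which is where an interconnect such as NVQLink sits in the stack.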

Open source is under threat, says a prominent think tank. Left unprotected, Europe’s “entire digital agenda” could be left perilously exposed.
industry

Much more needs to be done to protect and fund open source software projects, argue activists in the EU. (Photo: Zakharchuk / Shutterstock)

Like all the best Jenga towers, the internet is simultaneously solid and dangerously reliant on one or two crucial building blocks of open-source software. Such is the theme of ‘Dependency’, a widely shared web comic portraying this shared global communications network as a crenellated superstructure whose continued existence is dependent on a minuscule ‘project some random person in Nebraska has been thanklessly maintaining since 2003’ not caving under its immense weight. In reality, there are several such thin, overburdened building blocks holding up the modern internet – but the principle remains the same.

It’s a problem championed by the growing digital sovereignty movement, which advocates for greater control of technology and data within national boundaries and depends on open source to function. In response, a new open letter published by the OpenForum Europe think tank urges the European Union to create a €350mn Sovereign Tech Fund – and warns that inaction puts Europe at risk of “stagnation” that threatens the success of its “entire digital agenda”.

The letter arrives as digital sovereignty fast climbs Brussels’ policy agenda. European politicians are increasingly wary of US-made tech, as Silicon Valley appears to inch closer to the policy goals of the Trump administration. In February, US Vice President JD Vance’s ‘Munich speech’ caused a rift between Washington and Brussels. European politicians broadly understood the event as an attack on their values. Since then, US tech firms have courted the White House and occasionally appeared to act as its enforcers, such as when Microsoft shut down the email address of the International Criminal Court’s chief prosecutor, recently sanctioned by the Trump administration. …

Why Europe’s data privacy framework needs a common blueprint
industry

Is GDPR really enough to guarantee user privacy and innovation in the 2020s, asks Wire’s Ben Schilz. (Photo: Rough-Media / Shutterstock)

It has been six years since the General Data Protection Regulation (GDPR) came into force across the EU, and privacy is no longer up for debate. Europe has already made its ethical position on the matter clear: personal data must be protected, and trust must be engineered into every system that handles it. What is less clear is how that principle holds up under the weight of Europe’s expanding web of regulation. The AI Act and the Data Governance Act both build on GDPR’s foundations, but together they form a patchwork of compliance that even seasoned experts might struggle to navigate.

For businesses developing AI, scaling cloud infrastructure, or moving data across borders, concerns are beginning to mount about how workable this regulatory landscape really is. As data becomes the currency of economic and geopolitical power, the coherence of Europe’s privacy framework will determine whether the continent leads the next era of digital innovation or is slowed by the very protections that once set it apart.

Is GDPR’s legacy a foundation of trust?

When GDPR came into force in 2018, it reset the moral and operational standards of the whole digital economy. It compelled organisations to treat data less like a limitless resource and more like something that must be earned, justified, and protected. In some ways, that shift in mindset became Europe’s greatest export. Nations from Brazil to Japan modelled their own frameworks on GDPR, and consumers worldwide came to associate European regulation with ethical stewardship. The regulation also gave businesses a common language for data accountability, forcing transparency into decision-making and creating a baseline of trust essential for digital services to function at scale. …

The cloud will fail again, so make sure your leadership doesn’t
industry

The AWS outage was a warning to all CIOs that, while their reliance on the cloud may be non-negotiable, their ability to react to outages is not. (Photo: xxposure / Shutterstock)

Earlier this week, a failure at Amazon Web Services (AWS) – the cloud computing infrastructure that underpins huge swathes of the internet – triggered a cascading outage. From communication and collaboration tools like Slack and Zoom to payment services, critical websites and banking services from Lloyds, Bank of Scotland and Halifax, the knock-on effects impacted users worldwide.

These events are no longer shocks; they are the new operational reality. The systems and suppliers that now provide the operating model for the world are heavily interconnected and rely on each other. When something breaks, the impact can cascade, affecting countless other services. In this environment, the question is no longer ‘if’ the next major shock will hit, but ‘when’.

The greatest barrier to surviving this reality is a leadership mindset stuck on prevention. Leaders are understandably focused on stopping incidents, but today’s interconnectedness means we can no longer fully control our own operational reality. When our services depend on a complex supply chain of other providers, we cannot guarantee ‘perfect safety’. Something will always break. The real work, therefore, is shifting from prevention alone to a deliberate balance of preparation, response, and recovery. The most important question for a leader is not “Have we stopped all bad things from happening?” but “When a bad thing happens, how quickly can we recover?” This recovery speed is the true measure of organisational resilience. Resilience is built, not bought. It’s a muscle strengthened through consistent practice.

Building resilience

Building this muscle cannot be delegated solely to the IT department; it is a “whole-organisation challenge”. …