IBM Quantum System Two Arrives in Chicago This September

IBM will deploy its Quantum System Two to Chicago this September, a move directly linked to the foundational work of Hanhee Paik, the IBM researcher whose early improvements to the transmon qubit proved superconducting quantum computers were possible. Now overseeing quantum initiatives across Illinois, Paik is central to a new collaboration with the University of Illinois Urbana-Champaign (UIUC) and the University of Chicago to establish a National Quantum Algorithm Center (NQAC) designed to bridge the gap between theoretical potential and practical application. The partnership will grant UIUC researchers simultaneous access to IBM Quantum computers and the university’s National Center for Supercomputing Applications (NCSA) Delta and DeltaAI supercomputers, an arrangement built around quantum-centric supercomputing. “Without algorithms, we can’t really use quantum computing to solve problems and develop the economy,” Paik explains, highlighting the center’s focus on translating quantum power into tangible results.

Hanhee Paik’s Transmon Qubit Pioneered Superconducting Computing

The viability of superconducting quantum computing rests on work completed years ago: by concentrating on improving the “coherence time” of the transmon qubit, Paik’s early research addressed a critical hurdle and proved the concept was achievable. That work established the foundation on which IBM has built its current fleet of quantum computers, a legacy she now extends in her role overseeing projects in Chicago. The collaboration goes beyond a simple research partnership: IBM has formalized a new agreement with UIUC to expand the Discovery Accelerator Institute, aiming to advance quantum-centric supercomputing and to develop algorithms that combine high-performance computing (HPC) with quantum processing.
This architecture isn’t about replacing traditional supercomputing, but augmenting it. When performing quantum computations, Paik explains, “you can’t leave the HPC idle because thousands of people are using those systems together. They’re shared assets.” The practical implications of this combined power are already being demonstrated: researchers at IBM and RIKEN recently computed the ground state of the iron-sulfur molecule Fe4S4, a key component in cellular energy production, a calculation impossible with brute-force HPC methods alone.

Paik’s journey reflects this evolution from hardware development to algorithmic application. Initially focused on qubit design, she transitioned to exploring algorithms and data workflows. “I wanted to explore quantum algorithms and use one of the quantum computers I’d spent my entire career trying to build,” she explains, acknowledging the shift in focus after years dedicated to hardware. The shift has also broadened her understanding of the challenges facing wider quantum adoption, leading her to advocate for abstraction layers, such as IBM’s open-source framework Qiskit, that make quantum computing more accessible. Her trajectory, from improving qubit coherence to overseeing complex algorithmic development, underscores the interconnectedness of these fields and the long-term vision driving IBM’s quantum strategy.

UIUC Collaboration Advances Quantum-Centric Supercomputing

The pursuit of practical quantum computing has moved beyond simply building increasingly stable qubits; the focus is now firmly on integrating quantum processing with existing HPC infrastructure. This shared access is not merely about combining resources, but about addressing a fundamental bottleneck in quantum computing.
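The division of labor described above, a classical host driving bursts of quantum work, can be sketched in plain Python. This is a toy illustration, not IBM’s actual workflow: `simulated_qpu_expectation` is a hypothetical stand-in for a real QPU call, and the classical loop plays the HPC role, using the standard parameter-shift rule to minimize a one-parameter “energy.”

```python
import math

def simulated_qpu_expectation(theta: float) -> float:
    """Stand-in for a QPU call: returns <Z> for the state Ry(theta)|0>.
    For that state, <Z> = cos(theta) exactly."""
    return math.cos(theta)

def hybrid_minimize(steps: int = 200, lr: float = 0.2) -> tuple[float, float]:
    """Classical gradient-descent loop (the 'HPC' side) repeatedly
    dispatching quantum evaluations (the 'QPU' side)."""
    theta = 0.1  # start near, but not at, a stationary point
    for _ in range(steps):
        # Parameter-shift rule: d<Z>/dtheta = (<Z>(t + pi/2) - <Z>(t - pi/2)) / 2
        grad = (simulated_qpu_expectation(theta + math.pi / 2)
                - simulated_qpu_expectation(theta - math.pi / 2)) / 2
        theta -= lr * grad  # classical parameter update
    return theta, simulated_qpu_expectation(theta)

theta, energy = hybrid_minimize()
print(f"theta = {theta:.3f}, minimal <Z> = {energy:.3f}")
```

Running the loop drives theta toward pi, where the simulated expectation reaches its minimum of -1; in a real quantum-centric workflow the expectation values would come from hardware shots, making the orchestration between shared classical resources and the QPU the central engineering problem.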
Paik’s involvement is particularly noteworthy: her early work improving the coherence time of transmon qubits proved the feasibility of superconducting quantum computers, establishing a direct lineage from foundational hardware development to the current algorithmic focus. IBM is further solidifying its commitment with the establishment of the National Quantum Algorithm Center (NQAC) in Chicago, a move signaling significant national investment in the field. The architecture being developed prioritizes efficient workflow integration, recognizing that quantum computations don’t operate in isolation. “Sometimes you bring QPUs, sometimes GPUs and CPUs,” Paik notes, highlighting the complexity of managing these diverse systems. IBM’s experience with RIKEN in Japan, where the two pioneered a workflow to manage quantum resources alongside traditional HPC, is informing this approach.

Looking ahead, the collaboration intends to explore algorithms applicable to materials science, condensed matter physics, and the design of more efficient qubits themselves. “We’d like to discover more algorithms that can use both resources in an efficient way,” Paik states, emphasizing the need for quantum-plus-AI algorithms. Accessibility is also a priority, with IBM continuing to develop abstraction layers and open-source tools like Qiskit to lower the barrier to entry for quantum programming. “I’ve always felt like Alice, in Alice in Wonderland, chasing the bunny to the next place,” Paik reflects. “I focused on one goal, and then the next, and the next, until I ended up where I am right now.”

Fe4S4 Molecule Solved via Integrated Quantum & HPC Workflows

This expertise now informs her oversight of projects designed to bridge the gap between quantum hardware and real-world problem solving. A recent demonstration of this combined power involved the successful computation of the ground state of the iron-sulfur molecule Fe4S4.
Researchers at IBM and RIKEN achieved this feat using a “quantum-centric supercomputer,” a system integrating quantum processing units (QPUs) with high-performance computing (HPC) resources. The molecule, crucial for energy production within cellular mitochondria, presented a challenge insurmountable for traditional HPC methods alone. “You can’t compute its ground state with brute force HPC,” Paik explained, highlighting the limitations of classical computing for certain complex simulations. The success with Fe4S4 is a compelling demonstration of this integrated approach, showing how quantum algorithms can leverage the strengths of both quantum and classical systems. Central to this advancement is a new agreement between IBM and UIUC expanding the Discovery Accelerator Institute, with an architecture designed to facilitate the testing of novel algorithms and the exploration of quantum use cases. “Together with RIKEN,” Paik noted, “IBM pioneered the workflow to manage the quantum computational resources and execute the algorithms more efficiently.” The goal is not simply to combine resources, but to create a seamless system where quantum and classical computations complement each other, maximizing efficiency and accessibility.

Qiskit and Abstraction Layers Expand Quantum Accessibility

The promise of quantum computing extends beyond theoretical potential; practical application hinges on broadening access to the technology, and recent developments spearheaded by IBM directly address this challenge. Paik’s early work establishing the viability of superconducting qubits through improvements to “coherence time” now underpins a larger strategy to democratize quantum resources, evidenced by her current oversight of projects in Chicago.
This isn’t simply a continuation of hardware development, but a deliberate shift toward usability, recognizing that powerful quantum processors are ineffective without the software and expertise to harness them. The NQAC isn’t merely a research hub; it represents a significant national investment in quantum algorithm development, explicitly aiming to solve real-world problems and stimulate economic growth. “Our vision for quantum-centric supercomputing is to integrate CPUs, GPUs, and QPUs,” explains Paik, highlighting the necessity of combining quantum and classical computing power. This approach lets researchers test algorithms and explore use cases in a practical setting, moving beyond isolated quantum simulations.

IBM’s commitment to accessibility extends to software as well, with the open-source quantum programming framework Qiskit playing a crucial role. “We will continue to build abstraction layers that make quantum computers even more accessible,” Paik asserts, acknowledging that a significant barrier to entry is the perceived need for deep quantum-mechanical knowledge. Qiskit is designed to lower this barrier, enabling a wider range of researchers and developers to contribute to the field. Paik also emphasizes the importance of data workflows, noting that “computing languages are essential for making these systems more accessible to users without too much extra learning effort,” and drawing on lessons learned during a recent assignment in Japan about managing shared computing resources. The ultimate goal, she explains, is to move beyond simply building quantum computers to using them effectively, chasing “the bunny to the next place” and pursuing the next interesting challenge.

Source: https://research.ibm.com/blog/hanhee-paik-interview-on-qcsc-algorithms
