April 4, 2023 — NVIDIA has announced a new system that takes classical computing to the next level by coupling it with quantum computing. This month, NVIDIA debuted DGX Quantum, the first system to pair GPUs with quantum computing. NVIDIA’s new Grace Hopper system has demonstrated 10x better performance for applications processing terabytes of data. Speed increases like that are extremely valuable to researchers working with immense data sets and simulations.
Imagine if a year-long project could be finished in just over a month. That’s the kind of increase quantum computing can bring to the table today. As advances in quantum computing continue, and as more supercomputing centers embrace the technology, these speedups will only grow.
NCSA is one of the supercomputing centers partnering with NVIDIA to utilize these supercharged quantum processing units (QPUs). A new dedicated GPU resource will be installed in the National Petascale Computing Facility on the University of Illinois Urbana-Champaign campus. This new resource will be connected to QPUs that the Illinois Quantum Information Science and Technology Center (IQUIST) will house in its lab in the Engineering Sciences Building on campus.
Santiago Nuñez-Corrales, NCSA research scientist, will be leading NCSA’s quantum computing efforts. “NCSA has taken its first strides toward a long-term quantum computing strategy, designed to complement ongoing efforts at IQUIST,” Nuñez-Corrales said when speaking about NVIDIA’s announcement. “Our target comprises three core activities: understanding and harnessing the potential of existing real and simulated quantum devices as a new form of advanced computing, making quantum technologies accessible to a wide spectrum of users, and identifying application areas where quantum may become a game changer. All three of them draw upon our robust history and expertise with new cyberinfrastructure development, accelerating science-making and meeting the needs of future users. The recent announcement by NVIDIA, hence, arrives serendipitously.”
To many unfamiliar with the technology, quantum computing is a tricky topic to define. Unlike “classical computers,” it can’t even be explained using traditional physics. A quantum computer is a device that harnesses aspects of quantum mechanics, the laws that govern phenomena at the scale of atoms. Put very simply, what scientists and engineers are attempting to crack is the ability to solve hard problems much faster by exploiting quantum mechanics.
Classical computers, the computers most people use every day, represent information by encoding it as 1s and 0s. The collection of all 1s and 0s in memory at any given time corresponds to the state of the computer, which can be changed by programs operating on it. Think of it as a large sequence of on and off switches; despite the sophistication of contemporary microprocessors, classical computers have operated using similar mathematical rules since their inception.
Quantum mechanics turns this on its head by expanding our vocabulary of what the state of a computer and a program can be. Instead of a bit being on or off as in a classical computer, a qubit, quantum computing’s version of a bit, can be in both states simultaneously, a superposition of these states. Much like Schrödinger’s cat, a qubit remains an uncertain combination of a 1 and a 0 until it is measured. While creating a fault-tolerant quantum machine is still a ways off, scientists and engineers have devised algorithms that benefit from quantum computing architectures to potentially speed up the solution of problems that are hard for classical ones. With these new quantum resources, certain classes of calculations may happen much faster thanks to a broader palette of operations.
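The superposition idea above can be sketched with a tiny state-vector simulation. This is a generic illustration in Python with NumPy, not NVIDIA’s cuQuantum or CUDA Quantum API: a qubit’s state is represented by two complex amplitudes, and the squared magnitudes of those amplitudes give the probabilities of measuring a 0 or a 1.

```python
import numpy as np

# A qubit's state is a vector of two complex amplitudes.
# |0> corresponds to the classical bit value 0.
ket0 = np.array([1, 0], dtype=complex)

# The Hadamard gate puts |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0
probs = np.abs(state) ** 2  # measurement probabilities for 0 and 1

print(probs)  # ~[0.5, 0.5]: equal chance of observing a 0 or a 1
```

Until measured, the qubit genuinely occupies both possibilities at once; measurement collapses it to a definite 0 or 1 with the probabilities computed above.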
In regards to NVIDIA’s recent announcement, Nuñez-Corrales explains, “DGX Quantum has the potential to decrease the complexity of HPC-QPU integration projects at the hardware level thus lowering the risk of implementation of quantum-classical hybrid cyberinfrastructure. CUDA Quantum extends a mature programming model for GPUs into the QPU world, which will facilitate developing and integrating new quantum kernels across scientific applications. Finally, the ability to access GPU-powered simulators such as those in cuQuantum will help identify new software and scientific pipeline development practices for users to transition from classical to quantum problem-solving.”
Greg Bauer, senior technical program manager at NCSA, commented: “NCSA is preparing itself to support the adoption of QPUs by research computing projects similar to how NCSA led, in part, the transition to GPUs for research computing with the early evaluation of a PlayStation cluster and deployment of GPU-centric HPC resources.”
Increasing adoption of a wide variety of quantum computing technologies at NCSA will have direct benefits for researchers utilizing our resources. “At NCSA,” Nuñez-Corrales says, “we have identified an initial set of users that may benefit from this collaboration with NVIDIA in terms of access to simulated QPUs and programming models, and later real QPUs.” Nuñez-Corrales’ team will use what they learn from this initial project to refine future applications of quantum computing. “From this experience,” Nuñez-Corrales continues, “we will gradually become proficient at establishing user support models and resources on campus that remain accessible to our academic community and business partners. More immediately, we are working to integrate these tools into existing GPU-intensive resources such as Delta and provide early access to resources and training for the UIUC research community.”
NCSA’s Santiago Nuñez-Corrales, research scientist, contributed to this story.
Source: Megan Meave Johnson, NCSA