As quantum computing comes closer to the mainstream, it is widely agreed that these systems won't replace classical computing. That raises the question: where exactly do quantum computers fit in computing infrastructure?
Governments, supercomputing labs and companies are starting by examining how quantum processing units, or QPUs, could fit into high-performance computing environments.
The Leibniz Supercomputing Centre is creating a hub in its datacenters near Munich where it is testing out new types of chips and systems. The center is deploying Cerebras’ AI system in its labs, and also has a quantum computing center.
The center wants to match these new types of AI chips and quantum systems to the broad base of workloads it handles, said Laura Schulz, head of strategy at LRZ.
The LRZ initiative, called Future Compute, is evaluating exascale-class supercomputers with accelerated computing provided by quantum and AI systems.
“We are tending to think less about HPC systems and AI systems. There are some arguments to be made for the characterizations of quantum systems coming up. We’re kind of thinking about this in terms of the characterization of the workflows and the work to be done and what makes sense for the architectures,” Schulz said.
Argonne National Laboratory is also loading up on quantum chips from different companies as it investigates qubit systems. Google and IBM are developing superconducting qubit systems, while PsiQuantum is focused on building a quantum computing factory with a 1-million qubit system. Intel is chasing a system based on quantum dots with chips that can be made in its existing factories.
IBM’s roadmap focuses mainly on standalone quantum systems through 2025; starting in 2026, it calls for quantum systems scaling beyond 10,000 qubits with “quantum and classical communication.”
Cloud providers are also jumping into the quantum race. Amazon offers several types of quantum systems, including an annealer from D-Wave, ion-trap quantum processors from IonQ and superconducting qubit systems from Rigetti.
Quantum computing is becoming more accessible as cloud providers create layers that let customers simply explore the viability of these systems for their applications, said Carl Dukatz, quantum program lead at Accenture.
Developers can choose libraries that run calculations either classically or on quantum devices, and pick whichever resource suits the job. As these libraries become more popular and easier to use, data scientists are increasingly accessing quantum computing through the cloud rather than going directly to quantum providers.
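The pattern of routing the same calculation to a classical or a quantum backend can be sketched as a simple dispatch table. This is an illustrative toy, not any vendor's real API: the backend names, the `run()` helper and the coin-flip workload are all assumptions made up for the example.

```python
import random
from typing import Callable, Dict

def classical_expectation(n_shots: int) -> float:
    # Classical stand-in: return the exact expectation of a fair coin.
    return 0.5

def simulated_quantum_expectation(n_shots: int) -> float:
    # Quantum-style stand-in: estimate the same value by sampling,
    # the way shots on a real QPU would.
    random.seed(7)  # fixed seed so the sketch is reproducible
    hits = sum(random.random() < 0.5 for _ in range(n_shots))
    return hits / n_shots

# Hypothetical registry mapping a backend name to an implementation.
BACKENDS: Dict[str, Callable[[int], float]] = {
    "classical": classical_expectation,
    "quantum_simulator": simulated_quantum_expectation,
}

def run(backend: str, n_shots: int = 1000) -> float:
    """Dispatch the same workload to whichever backend the caller picks."""
    return BACKENDS[backend](n_shots)

print(run("classical"))           # exact: 0.5
print(run("quantum_simulator"))   # sampled estimate near 0.5
```

Because both backends expose the same interface, a user can test and compare classical and quantum runs of the same problem by changing one string, which is essentially the workflow Dukatz describes.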
“It became natural for people who knew how to use those cloud service providers already and were familiar with the algorithms to easily start to test and compare classical and quantum systems together,” Dukatz said.
Quantum computers will emerge as an accelerator and still require a conventional CPU to run operations, said Philippe Notton, CEO of SiPearl, a France-based chipmaker that is developing an Arm-based CPU called Rhea for European exascale systems scheduled to go online next year.
“Even if you have a quantum processor or quantum accelerator in the middle, it’s another form of acceleration, and you still need plenty of CPUs surrounding it,” Notton said, adding that Rhea could function to manage quantum acceleration.
GPU company Nvidia isn’t developing a quantum processor, but it offers the cuQuantum software development kit for quantum circuit simulation. The company’s cuQuantum DGX hardware appliance integrates a software container with a full-stack quantum circuit simulator. The system uses Nvidia’s A100 GPUs, which also appear in many supercomputers, to accelerate quantum simulations and workloads.
“You can take this container, install it, and for all intents and purposes, you have a very powerful, perhaps the most powerful quantum processor in the world, because you’ve got a full simulation stack that drops on top of that hardware,” said Tim Costa, group manager for HPC and quantum computing at Nvidia.
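What a statevector circuit simulator of this kind actually does can be shown in miniature: hold all 2^n complex amplitudes of an n-qubit state in classical memory and apply gates as small matrix updates. The sketch below is a pure-Python toy under those assumptions, not Nvidia's cuQuantum API; GPU stacks do the same math at vastly larger scale.

```python
import math

def apply_single_qubit_gate(state, gate, target):
    """Apply a 2x2 gate to the `target` qubit of a statevector."""
    new_state = state[:]
    step = 1 << target
    for i in range(len(state)):
        if i & step == 0:  # pair each amplitude with its target-bit partner
            a, b = state[i], state[i | step]
            new_state[i] = gate[0][0] * a + gate[0][1] * b
            new_state[i | step] = gate[1][0] * a + gate[1][1] * b
    return new_state

def apply_cnot(state, control, target):
    """Flip the target bit of every basis state whose control bit is set."""
    new_state = state[:]
    for i in range(len(state)):
        if i & (1 << control) and not i & (1 << target):
            j = i | (1 << target)
            new_state[i], new_state[j] = state[j], state[i]
    return new_state

H = [[1 / math.sqrt(2), 1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]  # Hadamard gate

# Prepare a 2-qubit Bell state: start in |00>, apply H, then CNOT.
state = [1.0, 0.0, 0.0, 0.0]
state = apply_single_qubit_gate(state, H, target=0)
state = apply_cnot(state, control=0, target=1)

probs = [abs(amp) ** 2 for amp in state]
print(probs)  # ~[0.5, 0, 0, 0.5]: only |00> and |11> are ever measured
```

The memory cost doubles with every added qubit, which is why a "full simulation stack" on powerful GPUs, as Costa describes, can stand in for a quantum processor only up to a few dozen qubits.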
There’s a general consensus that GPUs have an immediate part to play in hybrid quantum-classical computing, Costa said.
“Applications with advantage will not be pure quantum applications. It won’t be that you’ll do your first full weather simulation on a quantum processor. That’s not where things are going to go. Rather it’s going to be a combined effort between classical infrastructure, probably large-scale classical infrastructure like we use today,” Costa said.
Some portions of code could be treated as acceleration offloads to GPUs and quantum processors, or to quantum simulation on GPUs, depending on the workload.
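The offload pattern described above can be sketched as a classical driver that runs most of a pipeline on the CPU and hands a selected hot kernel to an accelerator. Everything here is a made-up illustration: the routing table, stage names and stand-in kernels are assumptions, not any vendor's framework.

```python
def cpu_stage(data):
    # Bulk classical pre-processing stays on the host CPU.
    return [x * 2 for x in data]

def gpu_offload(data):
    # Stand-in for launching a GPU kernel on the hot loop.
    return [x + 1 for x in data]

def quantum_offload(data):
    # Stand-in for dispatching a circuit to a QPU or a GPU-based
    # quantum simulator.
    return [x % 2 for x in data]

# Hypothetical per-workload routing: the driver picks the accelerator
# that suits each kernel, exactly as it would pick among GPUs today.
OFFLOAD = {"linear_algebra": gpu_offload, "sampling": quantum_offload}

def run_pipeline(data, hot_kernel):
    data = cpu_stage(data)            # classical pre-processing
    data = OFFLOAD[hot_kernel](data)  # accelerated hot section
    return sum(data)                  # classical post-processing

print(run_pipeline([1, 2, 3], "linear_algebra"))  # (2+1)+(4+1)+(6+1) = 15
print(run_pipeline([1, 2, 3], "sampling"))        # 0+0+0 = 0
```

The point of the sketch is the shape of the program: the quantum step is one replaceable stage inside a classical workflow, not a whole application running on a QPU.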
“That’s an area that we’re very interested in as we march forward,” Costa said.