Aug. 9, 2024 — Moore’s Law is dead.
For five decades, Moore’s Law drove supercomputer scaling: doubling the number of transistors on an integrated circuit roughly corresponded to doubling overall performance. But now the limits of physics have been reached. Transistors just can’t get much smaller.
Just as AI workloads demand more from hardware, the scale-out systems engineers are building keep hitting other bottlenecks: network bandwidth, heat density, and more. Incremental innovations move the needle a bit further every day, but it will take a real paradigm shift, a change in information processing itself, to build the next generation of high-performance computing (HPC) systems.
Enter the Quantum Computer
Quantum computers scale differently than traditional hardware. In a classical machine, performance is a roughly linear function of transistor/bit count: triple the number of transistors, and the computer runs calculations about three times as fast.
Quantum computers scale exponentially as qubit count increases: thanks to superposition and entanglement (concepts beyond the scope of this article), a register of n qubits spans a state space of 2^n dimensions. So that same tripling is many orders of magnitude more powerful.
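To put rough numbers on that contrast, here is a minimal Python sketch (ours, purely illustrative, with invented function names) comparing the two regimes:

```python
# Illustrative contrast between classical (linear) and quantum
# (exponential) scaling; the function names are our own.

def classical_speedup(transistor_multiple: float) -> float:
    # Classical performance tracks transistor count roughly linearly.
    return transistor_multiple

def quantum_state_space(qubits: int) -> int:
    # An n-qubit register spans 2**n basis states.
    return 2 ** qubits

print(classical_speedup(3))      # 3 -> triple the transistors, ~triple the speed
print(quantum_state_space(10))   # 1024 states
print(quantum_state_space(30))   # 1073741824 states: tripling the qubit
                                 # count multiplied the space by 2**20
```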
Most experts do not think that quantum computers are going to make traditional computers obsolete. Instead, those at the R&D forefront envision fully integrated hybrid architectures, systems in which information passes back and forth between quantum computers and HPC systems depending on the task at hand.
“We’re not QC purists,” said Yuval Boger, CMO at QuEra. “Algorithms don’t have to run entirely on quantum computers. There are many opportunities to integrate HPCs and quantum computers together, just as HPC managers have previously done with CPUs and GPUs.”
And the implications are massive. In the next five to 20 years, we may see a transformation as significant as the 1950s transition from vacuum tubes to solid-state electronics. Currently intractable problems, ones too complex or involving too many variables, are theoretically solvable by quantum computers. And, at least in some cases, quantum supremacy has already been demonstrated.
Hybrid Computing
What does it actually look like for an HPC system and a quantum computer to work together? Fundamentally, getting these two computing modalities to function as a seamless whole, able to run a single job across both simultaneously, is a problem of systems engineering and distributed computing.
“There are certain ‘mechanical’ aspects of a hybrid system that must be figured out, such as job queuing and data stream management between the different modalities and dealing with the different speeds at which they operate,” said Dr. Remy Notermans, Director of Strategic Planning at Atom Computing. “The HPC industry is exceedingly good at this, and there is no doubt they will be able to solve this problem.”
Notermans continued, “The bigger problem is to define an algorithmic framework that decides at what point during a computational problem a handoff should happen between the two modalities: what part of the computational problem should be split off and sent to a quantum computer for processing? What is the acceptable level of complexity that the quantum computer can provide a reasonable answer in a reasonable amount of time? The first applications will probably be executed using a heuristic approach, but a long-term algorithmic solution will require significant interdisciplinary collaboration between HPC and QC.”
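To make that heuristic approach concrete, here is a deliberately simplified sketch, not any vendor’s scheduler: a classical driver walks a queue of subproblems and offloads to the quantum processor only the pieces that look small and structured enough for near-term hardware. Every name in it (Subproblem, run_on_qpu, run_on_hpc, MAX_QPU_SIZE) is a placeholder assumption.

```python
# Hypothetical sketch of a heuristic classical/quantum handoff.
# All names here are placeholders, not a real scheduler or vendor API.

from dataclasses import dataclass

@dataclass
class Subproblem:
    size: int          # e.g., number of variables
    structured: bool   # does it map well onto a quantum ansatz?

MAX_QPU_SIZE = 40      # assumed qubit budget of near-term hardware

def run_on_qpu(p: Subproblem) -> str:
    return f"qpu-result(size={p.size})"   # stand-in for a circuit execution

def run_on_hpc(p: Subproblem) -> str:
    return f"hpc-result(size={p.size})"   # stand-in for a classical solver

def solve(p: Subproblem) -> str:
    # The heuristic: offload only small, well-structured pieces to the QPU;
    # everything else stays on the classical cluster.
    if p.structured and p.size <= MAX_QPU_SIZE:
        return run_on_qpu(p)
    return run_on_hpc(p)

for job in [Subproblem(12, True), Subproblem(5000, False)]:
    print(solve(job))   # -> qpu-result(size=12), then hpc-result(size=5000)
```

In a production hybrid system, that routing decision would come from the kind of algorithmic framework Notermans describes, not a two-line if statement.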
Machine learning is a great example of the division of labor between quantum and classical systems. It’s likely that the quantum computer will be ideal for the training portion, but everything before and after, from preparing and storing data to putting the model to work through inference, will continue to rely on traditional HPC systems.
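A minimal sketch of that division of labor, again with hypothetical names rather than a real framework, might route only the training step to a quantum routine while classical code keeps ownership of preparation and inference:

```python
# Hypothetical hybrid ML pipeline: classical pre- and post-processing
# around a quantum training step. quantum_train is a stand-in, not a real API.

def preprocess(raw: list[float]) -> list[float]:
    # Classical HPC: normalize the data before training.
    top = max(raw)
    return [x / top for x in raw]

def quantum_train(features: list[float]) -> float:
    # Placeholder for a QPU-backed trainer; returns a toy "model" threshold.
    return sum(features) / len(features)

def infer(model: float, x: float) -> str:
    # Classical HPC: serve predictions from the trained model.
    return "high" if x > model else "low"

model = quantum_train(preprocess([3.0, 7.0, 10.0]))
print(infer(model, 0.9))   # -> "high" (0.9 > ~0.667)
```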
When combined, QC and HPC create a force multiplier, a whole that’s greater than the sum of its parts. By leaning into the strengths of each, hybrid computing architectures will create a new world of possibilities.
What Will We Create?
What comes next? What’s the ultimate effect on humans and society at large? While QC still isn’t ready for prime time, some applications are starting to come to life.
“In partnership with IQM Quantum Computers, we’re developing quantum algorithms that categorize molecules for drug discovery,” said Dr. Konstantina Alexopoulou, Business Development at HQS Quantum Simulations. “The QC solution lets us calculate complex, nontrivial properties.”
“With our technical expertise in quantum computing systems and HQS Quantum Simulations’ experience in quantum software development, we believe our partnership has the potential to deliver an application useful for quantum advantage,” said Dr. Peter Eder, Head of Strategic Partnership at IQM Quantum Computers.
Because quantum computers are so well suited to combinatorial optimization problems, these same effects will also continue to play out in realms like materials science, logistics, financial market analysis, and manufacturing and process design. Anything from optimizing power grid performance to streamlining triage in an emergency room can benefit from QC + HPC.
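To make “combinatorial optimization” concrete: many of these problems can be cast as a QUBO (quadratic unconstrained binary optimization). The toy example below, ours rather than anything from the quoted teams, solves a three-variable QUBO by classical brute force; the point is that the candidate space grows as 2^n, exactly the kind of landscape quantum approaches such as annealing and QAOA are designed to attack.

```python
# Toy QUBO: minimize sum of Q[i,j] * x[i] * x[j] over binary x.
# Brute force works at 3 variables; at n variables there are 2**n candidates.

from itertools import product

Q = {(0, 0): -1.0, (1, 1): -1.0, (2, 2): -1.0,   # linear (diagonal) terms
     (0, 1): 2.0, (1, 2): 2.0}                    # pairwise penalties

def energy(x):
    return sum(w * x[i] * x[j] for (i, j), w in Q.items())

best = min(product([0, 1], repeat=3), key=energy)
print(best, energy(best))   # -> (1, 0, 1) -2.0, the lowest-energy assignment
```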
Significant work is going into proving these early applications, but, at least for right now, the most important thing quantum computers are creating is knowledge. The truth is that today’s quantum computers aren’t that useful. Qubit counts, while rising, remain relatively low, and mitigating noise and reducing error rates remain ongoing concerns.
But with global market intelligence provider IDC predicting that customers (not investors) will spend $2.7 billion on quantum tech by 2027, organizations are betting on gaining a quantum advantage. Learning how to operate and program quantum computers requires climbing a notoriously steep learning curve, and businesses recognize their people need the education and practice now to make full use of tomorrow’s QC landscape.
The era of quantum utility is coming. Nobody wants to be left behind as their competitors figure out how to streamline their operations, make scientific progress, and create better AI systems.
Quantum’s Debt to HPC
Engineering quantum technology is a massive feat. The tools of the trade are too numerous to count, and they vary by the type of quantum computer being built: superconducting, neutral atom, ion trap, and so on. What most quantum tech developers have in common, however, is that they have to crunch large data sets and simulate complex quantum phenomena.
“Unsurprisingly, such emulation tasks are computationally demanding and memory intensive, so the researchers must use HPC strategies like data and algorithm distribution when modeling even modestly-sized present-day quantum experiments,” wrote a group of researchers for the AWS HPC blog.
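The arithmetic behind that memory pressure is straightforward. A full state-vector emulation stores one complex amplitude per basis state, so the footprint doubles with every added qubit; the quick calculation below assumes double-precision complex values (16 bytes per amplitude).

```python
# Back-of-the-envelope memory cost of state-vector emulation:
# 2**n amplitudes for n qubits, 16 bytes each (complex128).

BYTES_PER_AMPLITUDE = 16

for n in (30, 40, 50):
    gib = (2 ** n) * BYTES_PER_AMPLITUDE / 2 ** 30
    print(f"{n} qubits -> {gib:,.0f} GiB")

# 30 qubits ->         16 GiB  (a workstation)
# 40 qubits ->     16,384 GiB  (a large HPC cluster)
# 50 qubits -> 16,777,216 GiB  (~16 PiB, beyond any machine today)
```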
And even before building quantum tech became an engineering challenge, discovering and understanding quantum physics was (and still is) a research challenge. Those researchers also rely on HPC every single day.
“From system engineering and workflow management to use cases, several challenges need to be tackled, and they need the HPC and QC communities to work closely together and share ideas,” said Dr. Bruno Taketani, Product Manager of HPC-QC at IQM Quantum Computers. “Today we are laying the foundations for a future where we take for granted that HPC includes quantum computers.”
Dr. Niccolo Somaschi, CEO at Quandela, echoed this sentiment, explaining that “HPC creates tools to benchmark quantum computers and redevelop better systems.”
In the end, quantum technology owes a great debt of gratitude to HPC. Yes, QC + HPC will be the future of supercomputing. But without classical HPC, there probably wouldn’t be any quantum computers to begin with.
Register and discover the latest and greatest for yourself at SC24.
Source: John Himes, SC24