Nov. 9, 2021 — In the emerging world of quantum computing, NVIDIA just broke a record with big impact, and it’s making its software available so anyone can do this work.
Quantum computing will propel a new wave of advances in climate research, drug discovery, finance and more. By simulating tomorrow’s quantum computers on today’s classical systems, researchers can develop and test quantum algorithms more quickly and at scales not otherwise possible.
Driving toward that future, NVIDIA created the largest ever simulation of a quantum algorithm for solving the MaxCut problem using cuQuantum, our SDK for accelerating quantum circuit simulations on a GPU.
In the math world, MaxCut is often cited as an example of an optimization problem no known computer can solve efficiently. MaxCut algorithms are used to design large computer networks, find the optimal layout of chips with billions of silicon pathways and explore the field of statistical physics.
MaxCut is a key problem in the quantum community because it’s one of the leading candidates for demonstrating an advantage from using a quantum algorithm.
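To make the problem concrete, here is a minimal classical sketch (not NVIDIA's method): MaxCut asks for a split of a graph's vertices into two sets that maximizes the number of edges crossing between them. Brute force checks every bipartition, which is why the problem scales so badly and why it attracts quantum algorithms.

```python
from itertools import product

def cut_value(edges, assignment):
    """Count edges whose endpoints land on opposite sides of the cut."""
    return sum(1 for u, v in edges if assignment[u] != assignment[v])

def max_cut(n, edges):
    """Brute-force MaxCut: try all 2**n bipartitions (exponential time)."""
    return max(cut_value(edges, bits) for bits in product([0, 1], repeat=n))

# Toy example: a 5-cycle. The best cut crosses 4 of its 5 edges.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
print(max_cut(5, edges))  # → 4
```

At 3,375 vertices, the search space of this brute-force approach would be 2^3375 bipartitions, which is what makes the simulated quantum heuristic interesting.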
We used the cuTensorNet library in cuQuantum running on NVIDIA’s in-house supercomputer, Selene, to simulate a quantum algorithm to solve the MaxCut problem. Using GPUs to simulate 1,688 qubits, we were able to solve a graph with a whopping 3,375 vertices. That’s 8x more qubits than the previous largest quantum simulation. Our solution was also highly accurate, reaching 96% of the best known answer.
Our breakthrough opens the door for using cuQuantum on NVIDIA DGX systems to research quantum algorithms at a previously impossible scale, accelerating the path to tomorrow’s quantum computers.
Keys to the Quantum World
You can test drive the same software that set this world record.
Starting today, the first library from cuQuantum, cuStateVec, is in public beta and available to download. It uses state vectors to accelerate simulations with tens of qubits.
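To see why state-vector simulation is limited to tens of qubits, here is an illustrative NumPy sketch (not the cuStateVec API): an n-qubit state is a vector of 2^n complex amplitudes, and each gate is applied by contracting one axis of that vector.

```python
import numpy as np

# An n-qubit state holds 2**n complex amplitudes; at 30 qubits that is
# already ~16 GB in complex128, which is what caps state-vector methods.
n = 3
state = np.zeros(2**n, dtype=np.complex128)
state[0] = 1.0  # start in |000>

H = np.array([[1, 1], [1, -1]], dtype=np.complex128) / np.sqrt(2)

def apply_1q_gate(state, gate, target, n):
    """Apply a single-qubit gate by reshaping the state into a rank-n
    tensor and contracting the gate against the target qubit's axis."""
    psi = state.reshape([2] * n)
    psi = np.tensordot(gate, psi, axes=([1], [target]))
    psi = np.moveaxis(psi, 0, target)
    return psi.reshape(-1)

for q in range(n):  # Hadamard on every qubit -> uniform superposition
    state = apply_1q_gate(state, H, q, n)

print(np.allclose(np.abs(state) ** 2, 1 / 2**n))  # → True
```

Libraries like cuStateVec apply the same kind of gate contraction, but on the GPU and heavily optimized.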
The cuTensorNet library that helped us set the world record uses tensor networks to simulate up to hundreds or even thousands of qubits on some promising near-term algorithms. It will be available in December.
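The key idea behind the tensor-network approach can be sketched in a few lines of NumPy (again, an illustration rather than the cuTensorNet API): treat every gate and qubit as a small tensor and contract the whole network at once, so the full 2^n state vector is never materialized. The example below computes a single output amplitude of a two-qubit Bell circuit this way.

```python
import numpy as np

# Each gate and qubit state is a small tensor; np.einsum contracts the
# whole network directly, never building the full state vector.
zero = np.array([1.0, 0.0])                              # |0>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)             # Hadamard
CNOT = np.eye(4)[:, [0, 1, 3, 2]].reshape(2, 2, 2, 2)    # C[o0,o1,i0,i1]

# Amplitude <00| CNOT (H ⊗ I) |00> of the Bell circuit: 1/sqrt(2)
amp = np.einsum('a,b,ia,opib,o,p->', zero, zero, H, CNOT, zero, zero)
print(round(float(amp), 4))  # → 0.7071
```

For circuits where a good contraction order exists, this is what lets tensor-network simulators reach qubit counts far beyond the state-vector limit.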
Get the Latest News at GTC
We invite you to try cuQuantum, get dramatically accelerated performance on your simulations and go break some big records.
Learn more about cuQuantum’s partner ecosystem here.
Source: Samuel Stanwyck, NVIDIA