Quantinuum, a pioneer in ion trap quantum computing, today reported setting two performance records in quick succession, with its H1-1 system achieving a quantum volume (QV) of 16,384 (2^14), and then 32,768 (2^15). “The achievements represent a high-water mark for the quantum computing industry, based on the widely-recognized QV benchmark, which was originally developed by IBM to reflect a quantum computer’s general capability,” said the company in the official announcement.
The latest reported QV is a big jump. One prominent potential quantum computer user, Marco Pistoia, head of global technology applied research, JPMorgan Chase, noted, “This is a remarkable milestone for quantum computing and in line with the technology we have seen from Quantinuum. [In] our research, we have produced groundbreaking algorithms on their quantum computers for the past several years, which has allowed us at JPMorgan Chase to be on the leading edge of quantum computing.”
Quantinuum has posted a blog (Quantum Volume reaches 5 digits for the first time: 5 perspectives on what it means for quantum computing) detailing many of the improvements leading to the latest QV score. By comparison, IBM, the originator of the QV measure, reported achieving a QV of 512 on its Falcon r10 last spring and noted it as a highlight at its year-end IBM Quantum Summit. IBM is using superconducting qubits.
Gauging progress in quantum hardware development can be tricky. There are few agreed-upon benchmarks. Quantum Volume – a single number metric that has several ingredients (gate fidelity, connectivity, qubit count, etc.) baked into it – is seen as a good starting point for measuring base system quality. Other benchmarks measure attributes such as speed (circuits/s) or comparative application/algorithm performance. The Quantum Economic Development Consortium (QED-C) has ongoing benchmark development and at least one company – Super.Tech – offers a suite (SupermarQ) of different benchmarks for use.
QV was one of the first quantum benchmarks. Here’s some insight into what QV tells us, taken from a 2022 paper (Quantum Volume in Practice: What Users Can Expect from NISQ Devices) by Los Alamos National Laboratory researchers:
“Quantum volume (QV) has been designed as a benchmark measure for Noisy Intermediate-Scale Quantum (NISQ) devices. Informally speaking, a NISQ backend that has passed a QV protocol test of 2^m will largely correctly execute any quantum circuit on m qubits with up to m random 2-qubit gates on each of those qubits, thus giving a good guideline to users of the device as to what circuit sizes and depths appear reasonable to run on the device.” A table from this paper showing QV results for several vendors is at the end of the article. Quantinuum was by far the top performer at the time of the testing.
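For readers who want a concrete picture of what the protocol entails, here is a minimal, self-contained Python sketch of the ideal (noise-free) side of the standard QV construction: depth-m “model circuits” built from Haar-random two-qubit unitaries applied to randomly paired qubits, with “heavy” outputs defined as bitstrings whose ideal probability exceeds the median. This is illustrative only — not Quantinuum’s or the LANL team’s code — and the real test compares a device’s measured heavy-output fraction against the 2/3 threshold.

```python
import numpy as np

rng = np.random.default_rng(7)

def haar_random_u4():
    """Haar-random 4x4 unitary (two-qubit gate) via QR of a complex Gaussian.
    (Global phase is irrelevant for output probabilities.)"""
    z = (rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))   # phase fix for the Haar measure

def embed_two_qubit(u4, i, j, m):
    """Embed a 4x4 unitary acting on qubits i and j of an m-qubit register."""
    dim = 2 ** m
    full = np.zeros((dim, dim), dtype=complex)
    for col in range(dim):
        bits = [(col >> k) & 1 for k in range(m)]
        sub_in = 2 * bits[i] + bits[j]
        for sub_out in range(4):
            new_bits = list(bits)
            new_bits[i], new_bits[j] = sub_out >> 1, sub_out & 1
            row = sum(b << k for k, b in enumerate(new_bits))
            full[row, col] += u4[sub_out, sub_in]
    return full

def model_circuit_probs(m):
    """Ideal output distribution of one depth-m QV model circuit on m qubits."""
    state = np.zeros(2 ** m, dtype=complex)
    state[0] = 1.0
    for _ in range(m):                        # m layers
        perm = rng.permutation(m)
        for k in range(0, m - 1, 2):          # random pairing, one random SU(4) per pair
            state = embed_two_qubit(haar_random_u4(), perm[k], perm[k + 1], m) @ state
    return np.abs(state) ** 2

m = 4                                         # passing at m qubits would mean QV = 2**m
probs = model_circuit_probs(m)
heavy = probs > np.median(probs)              # the "heavy" bitstrings
# Ideally the heavy-output probability approaches ~0.85; the QV test requires a real,
# noisy device to stay above 2/3 (with statistical confidence) at this width and depth.
print(f"ideal heavy-output probability: {probs[heavy].sum():.3f}")
```

On hardware, the same random circuits are executed with a finite number of shots, the observed fraction of heavy bitstrings is compared against the 2/3 threshold, and the largest m at which the device passes sets QV = 2^m.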
Analyst Paul Smith-Goodson, vice president and principal analyst, quantum, AI & space, Moor Insights & Strategy, told HPCwire, “Quantinuum’s continuous improvement of its quantum volume over the past eight years is a good indication that quality improvements are being made in the operations and components of its trapped-ion quantum computer. Among other things, the recent QV could be reflecting higher fidelities of such things as single and two-qubit gates, fewer errors for preparing and measuring qubits, and lower errors caused by qubit cross talk. To make such a large numerical jump, I suspect the researchers recently improved a few things they have been working on.”
In the blog, Brian Neyenhuis, director of commercial operations, credits reductions in the phase noise of the computer’s lasers as a key factor in the QV increase.
“We’ve had enough qubits for a while, but we’ve been continually pushing on reducing the error in our quantum operations, specifically the two-qubit gate error, to allow us to do these Quantum Volume measurements,” he said. The Quantinuum team improved memory error and elements of the calibration process as well. “It was a lot of little things that got us to the point where our two-qubit gate error and our memory error are both low enough that we can pass these Quantum Volume circuit tests,” he said.
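To make “passing” concrete: in one commonly used form of the acceptance criterion (following IBM’s original QV formulation), the mean heavy-output fraction over many random model circuits must exceed 2/3 with roughly two-sigma confidence. The sketch below uses invented per-circuit numbers purely for illustration; it is not Quantinuum’s data or its exact acceptance procedure.

```python
import numpy as np

# Illustrative per-circuit heavy-output fractions (made up, not measured data)
rng = np.random.default_rng(1)
heavy_fractions = rng.normal(loc=0.73, scale=0.02, size=400)

n_circuits = len(heavy_fractions)
h_mean = heavy_fractions.mean()
# A simple binomial-style error bar on the mean heavy-output probability
sigma = np.sqrt(h_mean * (1 - h_mean) / n_circuits)

# Pass if the lower two-sigma bound still clears the 2/3 threshold
passes = (h_mean - 2 * sigma) > 2 / 3
print(f"mean heavy-output fraction: {h_mean:.3f} +/- {sigma:.3f} -> pass: {passes}")
```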
The work of increasing Quantum Volume means improving all the subsystems and subcomponents of the machine individually and simultaneously, while ensuring all the systems continue to work well together. Such a complex task takes a high degree of orchestration across the Quantinuum team, with the benefits of the work passed on to H-Series users.
Also in the blog, Quantinuum argues an improved signal-to-noise ratio is important for applications. “As application developers, the signal-to-noise ratio is what we’re interested in. If the signal is small, I might run the circuits 10 times and only get one good shot. To recover the signal, I have to do a lot more shots and throw most of them away. Every shot takes time. The signal-to-noise ratio is sensitive to the gate fidelity. If you increase the gate fidelity by a little bit, the runtime of a given algorithm may go down drastically,” said Henrik Dreyer, managing director and scientific lead at Quantinuum’s office in Munich. “For a typical circuit, as the plot shows, even a relatively modest 0.16 percentage point improvement in fidelity could mean that it runs in less than half the time.” (The plot he references appears in the blog.)
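Dreyer’s point can be made with back-of-the-envelope arithmetic. Under a deliberately simple model — each two-qubit gate succeeds with probability F, so a circuit with G such gates runs error-free with probability roughly F^G, and the shots needed to collect a fixed number of “good” shots scale as F^-G — a 0.16 percentage-point fidelity gain compounds quickly. The fidelity and gate-count values below are assumptions chosen for illustration, not numbers from the blog.

```python
# Back-of-the-envelope sketch of how a small fidelity gain compounds over a circuit.
f_before = 0.9980            # assumed two-qubit gate fidelity before the improvement
f_after = f_before + 0.0016  # a 0.16 percentage-point improvement
gates = 450                  # assumed number of two-qubit gates in a "typical" circuit

p_before = f_before ** gates # fraction of error-free shots before...
p_after = f_after ** gates   # ...and after the improvement
speedup = p_after / p_before

print(f"error-free fraction: {p_before:.3f} -> {p_after:.3f}")
print(f"shots (and runtime) reduced by a factor of ~{speedup:.2f}")
```

With these assumed numbers, the same signal is recovered in roughly half the shots, consistent with the “less than half the time” claim in the blog.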
Achieving the high QV is impressive; putting it into the broader context of advancing quantum computing is more difficult. The natural question is what it means for the timeline for reaching quantum advantage, which, of course, depends on more than what QV measures.
“Quantum Volume is a collective look at how the entirety of the system is performing and progressing. Despite the large quantum volume number, it is not an indicator that quantum advantage has been achieved, only that we are doing the right things and going in the right direction to achieve it. Quantum advantage is likely still 3-5 years away, and there are pros and cons for each of the various quantum technologies,” said Smith-Goodson.
Like others, Quantinuum has been aggressively building collaborations and software tools and developing use cases. In 2022, the company launched InQuanto, a development platform to “enable both computational chemists and quantum algorithm developers to easily mix and match the latest quantum algorithms with advanced subroutines and error mitigation techniques to get the best out of today’s quantum platforms.” In cryptography, for example, it has partnered with Fujitsu to build quantum resilience into software-defined networks and demonstrate quantum random number generation (QRNG) integration on Fujitsu’s SD-WAN.
Just last week, Quantinuum brought on a new CEO – long-time semiconductor industry executive Raj Hazra. At the time of the announcement, Darius Adamczyk, Quantinuum’s Chairman of the Board, said, “The time is perfect to bring Raj into the company, as we build momentum to drive the next chapter of quantum industries. He will help the company drive high-speed innovation and entrepreneurship in the quantum industry.” Outgoing CEO Ilyas Khan will remain with the company.
Opinions vary on how close the young quantum computing industry is to broader commercialization. Stay tuned.
Table of QV Showings from LANL Paper
Link to press release: https://www.hpcwire.com/off-the-wire/quantinuum-sets-industry-record-for-hardware-performance-with-new-quantum-volume-milestone/
Link to Quantinuum blog: https://www.quantinuum.com/news/quantum-volume-reaches-5-digits-for-the-first-time-5-perspectives-on-what-it-means-for-quantum-computing
Link to Los Alamos National Lab paper: https://arxiv.org/pdf/2203.03816.pdf