IBM today outlined its ambitious quantum computing technology roadmap at its virtual Quantum Summit. The eye-popping million-qubit figure is still far out, IBM concedes, but perhaps not as far out as once thought. Just as eye-popping is IBM’s nearer-term plan for a 1,000-plus qubit system, named Condor, expected around the end of 2023.
IBM is perhaps the dominant commercial player pursuing quantum computing, and it has bet big on superconducting qubit technology (transmons). One huge challenge has been quantum state fragility. IBM and nearly everyone else have been forced to use dilution refrigerators to keep quantum processors ice cold – within hundredths of a degree of absolute zero – to escape the disruptive ‘noise’ that confounds quantum circuits. It’s one of the issues that has kept qubit counts low.
In announcing IBM’s quantum technology plans, Jay Gambetta, IBM Fellow and VP of IBM Quantum, noted in his blog post, “As we explore realms beyond the thousand-qubit mark, today’s commercial dilution refrigerators will no longer be capable of effectively cooling and isolating such potentially large, complex devices. We’re also introducing a 10-foot-tall and 6-foot-wide ‘super-fridge,’ internally codenamed ‘Goldeneye,’ a dilution refrigerator larger than any commercially available today. Our team has designed this behemoth with a million-qubit system in mind—and has already begun fundamental feasibility tests.
“Ultimately, we envision a future where quantum interconnects link dilution refrigerators each holding a million qubits like the intranet links supercomputing processors, creating a massively parallel quantum computer capable of changing the world.”
That is quite a vision, and today’s detailed plan seems like an important moment in quantum computing if IBM can deliver. Many issues remain – scalability, error correction/mitigation, rival qubit technologies, to name just a few. Indeed, quantum computing is a domain in which progress has proven devilishly difficult to predict. Yet IBM was surprisingly specific today, as shown by the bulleted timeline here:
- 2020 – Hummingbird (65 qubits). Released to IBM Q Network members earlier this month, Hummingbird “features 8:1 readout multiplexing, meaning we combine readout signals from eight qubits into one, reducing the total amount of wiring and components required for readout and improving our ability to scale, while preserving all of the high performance features from the Falcon generation of processors,” said Gambetta.
- 2021 – Eagle (127 qubits). Eagle features several upgrades in order to surpass the 100-qubit milestone: crucially, through-silicon vias (TSVs) and multi-level wiring provide the ability to effectively fan-out a large density of classical control signals while protecting the qubits in a separated layer in order to maintain high coherence times.
- 2022 – Osprey (433 qubits). Design principles established for IBM’s smaller processors “set us on a course to release a 433-qubit IBM Quantum Osprey system in 2022,” said Gambetta. “More efficient and denser controls and cryogenic infrastructure will ensure that scaling up our processors doesn’t sacrifice the performance of our individual qubits, introduce further sources of noise, or take up too large a footprint.”
- 2023 – Condor (1,000-plus qubits). “We think of Condor as an inflection point, a milestone that marks our ability to implement error correction and scale up our devices, while simultaneously complex enough to explore potential Quantum Advantages—problems that we can solve more efficiently on a quantum computer than on the world’s best supercomputers,” said Gambetta.
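As a back-of-the-envelope check, the year-over-year scaling implied by the announced qubit counts is easy to compute. This sketch uses only the figures from the timeline above; since Condor is described as “1,000-plus,” 1,000 is used here and its ratio is a lower bound:

```python
# Year-over-year qubit scaling implied by IBM's announced roadmap.
# Counts are taken from the timeline above; Condor is "1,000-plus",
# so 1,000 is used and its growth ratio is a lower bound.
roadmap = {2020: 65, 2021: 127, 2022: 433, 2023: 1000}

years = sorted(roadmap)
for prev, cur in zip(years, years[1:]):
    growth = roadmap[cur] / roadmap[prev]
    print(f"{prev} -> {cur}: {roadmap[prev]} -> {roadmap[cur]} qubits (x{growth:.2f})")
```

Each generation roughly doubles to more than triples the qubit count, which fits the “inflection point” framing of Condor.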
Putting all of this technology to good use is, of course, the goal. The quantum community has been chasing quantum advantage – the point at which for some application, a quantum computer performs sufficiently better than a classical computer to warrant switching. While that point is still distant, progress is being demonstrated. One example is recent work by IBM rival Google on quantum chemistry (see HPCwire article, Google’s Quantum Chemistry Simulation Suggests Promising Path Forward).
Measuring the quality of quantum computer performance – so as to gauge progress and make comparisons between various quantum computers – is another challenge. IBM has proposed the Quantum Volume (QV) benchmark, which bakes in several attributes (gate error rates, decoherence times, qubit connectivity, operating software efficiency, and more). Thus far a few other quantum system makers are also using QV. Lots of qubits isn’t helpful if overall system performance is poor.
IBM achieved a QV of 64 earlier this summer and has said it will be able to double the QV of its systems yearly.
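Taking those two public statements at face value – QV 64 this summer and doubling yearly – the implied trajectory can be extrapolated. This is an illustrative projection by the editor, not an IBM roadmap figure, and the `projected_qv` helper is hypothetical:

```python
# Illustrative projection: Quantum Volume if it doubles every year,
# starting from the QV of 64 IBM reported in summer 2020.
# This extrapolation is a sketch, not an IBM commitment.

def projected_qv(year: int, base_year: int = 2020, base_qv: int = 64) -> int:
    """QV in a given year, assuming a clean doubling each year after base_year."""
    return base_qv * 2 ** (year - base_year)

for year in range(2020, 2026):
    print(year, projected_qv(year))
```

Under the commonly used definition, QV = 2^n where n is the width and depth of the largest “square” model circuit a machine can run successfully, so doubling QV each year amounts to reliably handling circuits one qubit wider and one layer deeper.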
Said Gambetta, “The biggest challenge facing our team today is figuring out how to control large systems of these qubits for long enough, and with few enough errors, to run the complex quantum circuits required by future quantum applications.
“We maintain more than two dozen stable systems on the IBM Cloud for our clients and the general public to experiment on, including our 5-qubit IBM Quantum Canary processors and our 27-qubit IBM Quantum Falcon processors—on one of which we recently ran a long enough quantum circuit to declare a Quantum Volume of 64. This achievement wasn’t a matter of building more qubits; instead, we incorporated improvements to the compiler, refined the calibration of the two-qubit gates, and issued upgrades to the noise handling and readout based on tweaks to the microwave pulses. Underlying all of that is hardware with world-leading device metrics fabricated with unique processes to allow for reliable yield.”
Given the heightened attention from government (i.e. spending) on quantum computing and the rapid fleshing out of a larger quantum computing ecosystem (software tools, consultants, DOE testbeds, and more), it will be interesting to track how IBM performs against its own goals.
Link to IBM blog: https://www.ibm.com/blogs/research/2020/09/ibm-quantum-roadmap/