The post Quantum Processor Hits 99.9 Percent Reliability Target appeared first on HPCwire.

*The five cross-shaped elements are "Xmon" qubits, the team's variant of the transmon qubit, placed in a linear array.*

A team from the lab of John Martinis, UCSB professor of physics, has demonstrated a new level of reliability for superconducting qubits, paving the way for large-scale, fault-tolerant quantum circuits. The details of the research appear in this week's issue of the journal *Nature*.

Quantum computers promise unimaginable speedups compared with today’s fastest number-crunchers, but at this stage, the technology suffers from reliability issues due to the fragile nature of quantum states.

Thanks to the strange laws of quantum mechanics and a phenomenon called superposition, the qubit ("quantum bit") can exist in multiple states at once. Instead of being relegated to a one or a zero, like the classical bit, the qubit can represent a one and a zero and all points in between. A computer composed of qubits is thus inherently parallel and theoretically capable of conducting multiple computations simultaneously. The trouble with qubits, though, is their instability – they tend to "forget" their state very quickly. Quantum error correction, which distributes a logical state among many qubits by means of quantum entanglement, goes a long way toward protecting the state, but until now fidelity results were still shy of the 99 percent goal. This week in the journal *Nature*, the UCSB physicists report that they've created a small quantum computing array that performs with enough accuracy to make error correction viable.
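The superposition idea described above can be written compactly in standard quantum notation (this is textbook formalism, not specific to the UCSB paper): a single qubit occupies a weighted combination of the classical states, and an *n*-qubit register occupies a space whose dimension grows exponentially.

```latex
% A single qubit is a normalized superposition of |0> and |1>:
\[
  |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
  \qquad |\alpha|^2 + |\beta|^2 = 1 .
\]
% An n-qubit register lives in a 2^n-dimensional state space,
% which is the source of the "inherently parallel" character:
\[
  |\Psi\rangle = \sum_{x \,\in\, \{0,1\}^n} c_x\,|x\rangle,
  \qquad \sum_x |c_x|^2 = 1 .
\]
```

Measurement collapses this superposition to a single classical outcome, which is why decoherence ("forgetting" the amplitudes α and β) is so damaging and why error correction is needed.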

“Quantum hardware is very, very unreliable compared to classical hardware,” notes Austin Fowler, a staff scientist in the physics department, whose theoretical work prompted the experiments. “Even the best state-of-the-art hardware is unreliable. Our paper shows that for the first time reliability has been reached.”

The experimental system, composed of five superconducting qubits arranged in a linear array, is the first of its kind to cross the 99 percent accuracy threshold, setting the stage for even larger quantum arrays. The team achieved an average fidelity of 99.92 percent for a single-qubit logic gate and 99.4 percent for a two-qubit logic gate. Error correction was implemented with a surface code approach, which is based on nearest-neighbor coupling and rapidly cycled entangling gates.
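To see why crossing the threshold matters, the standard surface-code scaling relation is useful: below threshold, the logical error rate is suppressed exponentially in the code distance, roughly as P_L ≈ A·(p/p_th)^⌊(d+1)/2⌋. The sketch below is illustrative only – the threshold, prefactor, and mapping from gate fidelity to physical error rate are assumptions for the example, not figures from the paper.

```python
def logical_error_rate(p, d, p_th=0.01, prefactor=0.1):
    """Rough surface-code logical error estimate.

    p          -- physical error rate per gate (assumed uniform)
    d          -- code distance (odd integer)
    p_th       -- assumed threshold error rate (~1 percent)
    prefactor  -- assumed constant A in P_L ~ A * (p/p_th)^((d+1)//2)
    """
    return prefactor * (p / p_th) ** ((d + 1) // 2)

# A two-qubit gate fidelity of 99.4 percent corresponds to a
# physical error rate of roughly 0.6 percent -- just below threshold.
p = 0.006
for d in (3, 5, 7):
    print(f"d={d}: P_L ~ {logical_error_rate(p, d):.2e}")
```

Because p/p_th < 1, every increase in code distance multiplies the logical error rate by another factor of p/p_th, which is why the quoted push toward 10⁻³ physical error rates makes scaling so much more favorable.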

“Motivated by theoretical work, we started really thinking seriously about what we had to do to move forward,” says John Martinis, a professor in UCSB’s Department of Physics. “It took us a while to figure out how simple it was, and simple, in the end, was really the best.”

The UCSB team’s superconducting multi-qubit processor is a representative architecture for a “universal quantum computer,” one that can handle any algorithm given to it. This stands in contrast to the quantum annealing machines made by the Canadian company D-Wave, which are only good at solving a specific set of tasks, called optimization problems.

Having passed this crucial threshold, the team will continue to work on reducing errors while scaling the system. Will a practical quantum computer be far off?

“If you want to build a quantum computer, you need a two-dimensional array of such qubits, and the error rate should be below 1 percent,” Fowler explains. “If we can get one order of magnitude lower – in the area of 10⁻³ or 1 in 1,000 for all our gates – our qubits could become commercially viable. But there are more issues that need to be solved. There are more frequencies to worry about and it’s certainly true that it’s more complex. However, the physics is no different.”
