Whether the D-Wave quantum computer is actually a quantum computer is a debate that HPCwire has been following since the project hatched in 2007. Early critics claimed that the system wasn't a "real" quantum computer. Since then, D-Wave has been winning over supporters, including Google, NASA and Lockheed Martin, but others (MIT's Scott Aaronson, for one) remain skeptical. Amid the continuing controversy, the Washington Post's Timothy B. Lee recently took up the matter with D-Wave's vice president of processor development, Jeremy Hilton.
There is some disagreement over what exactly constitutes a quantum computer. The classic blueprint is the gate model of quantum computing, but an alternative model, adiabatic quantum computing, was introduced by MIT researchers in the early 2000s, and it's what D-Wave's computer is based on. Some critics of D-Wave's technology contend that it's not the real deal.
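In rough terms (this is the standard textbook formulation, not anything D-Wave-specific), an adiabatic computation encodes the answer to a problem in the ground state of a "problem" Hamiltonian $H_P$ and slowly interpolates toward it from an easy-to-prepare starting Hamiltonian $H_B$:

```latex
H(s) = (1 - s)\,H_B + s\,H_P, \qquad s: 0 \to 1
```

The adiabatic theorem says that if the interpolation is slow enough relative to the minimum energy gap along the way, the system stays in its instantaneous ground state, so reading out the final state yields the solution.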
“What D-Wave built is not universal quantum computing,” Hilton readily admits, but he maintains that it’s been proven in the literature – not by D-Wave – that the adiabatic model is an equivalent model of quantum computation.
D-Wave's founders went with the adiabatic approach because they thought it had the best chance of enabling real work in a reasonable time frame without getting into the really difficult, NP-hard class of problems.
Asked to be “concrete” in describing their hardware, Hilton responds:
“D-Wave has focused on the superconducting side of things to benefit from the infrastructural advancement the semiconductor [industry] has made. The fabrication of superconductors is all [mature] semiconductor technology. We fabricate [our chips] at Cypress Semiconductor. We don’t have exotic tools to make those devices. That was an important aspect for D-Wave; we want to scale up to a high level. If all those problems have already been solved, we’ll be able to take advantage more quickly. [If we had used] ion trap technology, new technologies would have needed [to be developed] to scale up.”
As for why this system should be considered impressive even though it’s only “comparable or slightly better” than classical computing technology, Hilton affirms that even being “in the ballpark of the conventional algorithms in the field was very exciting.” The company and its backers are focused on their future roadmap and on the large improvements they are seeing between generations. For example, transitioning from a 128-qubit to a 512-qubit processor yielded a 300,000x improvement in performance. A 1,000-qubit processor is planned for release sometime this year, and a 2,000-qubit processor is on the horizon as well.
“We’re at a point where we see that our current product is matching the performance of state-of-the-art classical computers,” Hilton adds. “Over the next few years, we should surpass them. The ideal is to get into a space that is fundamentally intractable with classical machines. In the short term all we focus on is showing some scaling advantage and being able to pull away from that classical state of the art.”
In the remainder of the Q&A, Hilton uses a hills-and-valleys metaphor to describe how the D-Wave machine compares with its conventional computing cousins (“entanglement allows those valleys to interact and interfere in a way that allows the system to find its lowest-energy optimization”). He also explains why D-Wave hasn’t focused on Shor’s algorithm (“it’s not an interesting market segment for a business”) and attributes the company’s reputation for secrecy to its history: the early years were focused on building a scalable technology, not on publication.
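The hills-and-valleys metaphor maps onto the kind of Ising-style energy-minimization problems an annealer works on. As a rough classical analogy only (this is thermal simulated annealing in plain Python, not D-Wave's quantum process, and the function names are purely illustrative), the sketch below shows a configuration settling into the lowest valley of a tiny energy landscape:

```python
import math
import random

def ising_energy(spins, J, h):
    """Ising energy: E = -sum_ij J_ij * s_i * s_j - sum_i h_i * s_i."""
    e = -sum(h[i] * s for i, s in enumerate(spins))
    for (i, j), coupling in J.items():
        e -= coupling * spins[i] * spins[j]
    return e

def anneal(J, h, n, steps=20000, t_hot=5.0, t_cold=0.01, seed=0):
    """Metropolis single-spin flips under a geometric cooling schedule.

    As the temperature drops, uphill moves out of a valley become rare,
    so the state settles toward a low-energy minimum.
    """
    rng = random.Random(seed)
    spins = [rng.choice([-1, 1]) for _ in range(n)]
    energy = ising_energy(spins, J, h)
    for step in range(steps):
        t = t_hot * (t_cold / t_hot) ** (step / steps)  # cooling schedule
        i = rng.randrange(n)
        spins[i] = -spins[i]                 # propose flipping one spin
        new_energy = ising_energy(spins, J, h)
        if new_energy <= energy or rng.random() < math.exp((energy - new_energy) / t):
            energy = new_energy              # accept the move
        else:
            spins[i] = -spins[i]             # reject: undo the flip
    return spins, energy

# A 4-spin ferromagnetic chain; the aligned configurations (all +1 or
# all -1) are the two lowest-energy "valleys", each with energy -3.0.
n = 4
J = {(i, i + 1): 1.0 for i in range(n - 1)}
h = [0.0] * n
spins, energy = anneal(J, h, n)
print(spins, energy)
```

The classical sampler above escapes shallow valleys only by thermal hops; Hilton's point is that a quantum annealer instead exploits tunneling and entanglement to "allow those valleys to interact and interfere" on the way to the lowest-energy configuration.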
In the final analysis, Mr. Lee asks all the right questions, but the responses, while frank, can come off as frustratingly vague – a paradox befitting the subject matter, perhaps, or something more calculated if you’re a critic.
Richard Feynman has been quoted as saying: “If you think you understand quantum mechanics, you don’t understand quantum mechanics.” But he is also credited with saying: “If you can’t explain it to a six-year-old, you don’t really understand it.”