Should you start exploring quantum computing? Yes, said a panel of analysts convened at Tabor Communications' HPC and AI on Wall Street conference earlier this year.
Without doubt, the quantum computing landscape remains murky. Yet in the past ~5 years virtually every aspect of quantum computing has raced forward. At least one 1,000-plus-qubit system is edging toward user access now and another is expected by year-end. There’s been a proliferation of software offerings up and down the “quantum stack,” though it’s hardly complete. Most promising, what were a few proof-of-concept (POC) use-case explorations have mushroomed into many efforts across many sectors.
What are we waiting for? Against the backdrop of astonishing progress are also very hard technical problems. Error correction/mitigation tops the list. Effective quantum networking is another. Polished applications. Too many qubit types to choose from (at least for now). Scale matters – it’s expected that millions of qubits may be needed for practical quantum computing. These aren’t trivial challenges. Why bother?
The best reason to proceed, perhaps, is there’s little choice. The roaring geopolitical rivalry around getting to practical quantum computing fast – which includes robust spending from the U.S., the U.K., the EU, and China, as examples – is concrete evidence.
Panelist Bob Sorensen, Hyperion Research’s chief quantum watcher, zeroed in on quantum’s rush to integrate into what is an otherwise stalling HPC (hardware) picture.
“It’s no secret in the HPC world that the trajectory of performance gains is flattening out. We’re reaching a bunch of ends, if you will: the end of Moore’s law, the ability to pack more transistors on a chip; Dennard scaling, you can only put so much power into a chip; the idea of lithographic capabilities running out. We’re at sub-nanometer line width lithography, [and] there’s only one company in the world that makes advanced lithography components, ASML out of the Netherlands, that can only supply two really competent silicon foundries to produce the advanced chips that the HPC world needs – TSMC and Samsung,” said Sorensen.
“So, the trajectory of HPC performance is falling off, and the timing for quantum is perfect. It’s the next turn of the crank in accelerating performance. What that means is if you want to continue on your journey of advanced computing, you have to look for the next big thing. What’s interesting about quantum is that its potential is the most attractive and it is on a different trajectory than where classical HPC is going right now. That’s really the promise. So, if you want to get started, you have to do a few things.
“We look at quantum as not a separate island unto itself of a new compute capability. It’s really more about accelerating the most complex, advanced workloads that the HPC world is always tackling. So, we view this as another turn of the crank in terms of accelerating opportunities in advanced computing. How you get started is you look at your most complicated, vexing computational problems.”
It was a fascinating discussion, and Tabor has archived the full video (linked at the end of this article). The focus was not on exotic quantum technologies – important but not easily accessible to most of us – but on how and why to get started exploring them.
Panelists included Heather West, IDC’s lead quantum analyst and a research manager in IDC Infrastructure Systems, Platforms and Technologies Group; Sorensen, Hyperion’s senior vice president of research and chief quantum analyst; Jay Boisseau, CEO of Vizias and a former prominent Dell Technologies executive and a founder of the Texas Advanced Computing Center. HPCwire editor John Russell moderated.
West presented a handful of slides, nicely mapping the emerging quantum information sciences market, and then the panel tackled why now is the right time and offered tips on how to do it. Central to their argument is that quantum is coming on fast, that getting access to tools and QPUs is fairly easy and inexpensive via web platforms such as AWS Braket and Strangeworks, and that failure to get involved now is likely to slow your progress later.
Presented here are just a few comments from the panelists. Let’s start with a few slides depicting quantum’s development, presented by West. Her full slide deck is presented in the video.
West noted that quantum forecasts are dynamic in that conditions can change quickly and that IDC incorporates changes as their impact becomes clearer. For example, IDC scaled back its total spending forecast for 2027 from ~$8.6B to $7.6B based on market shifts. Despite these shifts, quantum spending plans are growing significantly as a portion of the IT budget.
“Over the course of the last 20 years, we’ve seen [quantum computing] move from an academic achievement to now small-scale systems [that can be used] for small-scale experimentation. Hopefully, within the next few years, we’ll be able to see systems that leverage error correction and mitigation techniques, as well as a little bit of scaling, to deliver some sort of near-term advantage,” said West.
IDC does a nice job slicing up quantum segments. Looking at the proliferation of quantum hardware developers, she said, “We divide them into two different categories, hardware developers versus hardware vendors. The difference between the two is that the vendors have graduated to the point where they’re able to offer access to their systems and services for a premium fee so that organizations such as yours are able to use them, leverage them for some experimentation, use-case identification, etc.” (see slide below)
Taking a lesson from the past, Sorensen and Boisseau recalled the historically high cost of adopting next-gen HPC systems.
Sorensen said, “What’s so magical about quantum right now is the beauty of the low barrier to entry. In the old days if you wanted to get an HPC, and Jay knows this, you had to drop $25 million to bring a Cray in. You had to hire 25 guys and they lived downstairs in the basement. They never came out and they wrote code all the time and they spoke a language that you didn’t understand, and you had to pay them an awful lot of money to do that. The barriers to entry to get into HPC were high.
“The barrier to entry in quantum is you sit down, you go to AWS or Strangeworks. You pick your cloud access model of choice, you sign up for a couple of bucks, you grab a couple of new hires that just came out of school with a degree in quantum chemistry or something, and you go and you play, and you figure out how that’s going to work. So, the barriers to entry of quantum are amazing. I’ve said it before, and I’ll say it again, if it wasn’t for cloud access, none of us would be sitting here vaguely interested in quantum; it’s what really is driving interest.”
Boisseau had a similar take. “You don’t have to choose a partner. You don’t have to make that decision. In fact, I think it’d be a bad play to make that decision now. You can go to any of the CSP-based infrastructure providers (with quantum gateways) and say I want to run this job on a D-Wave system, I want to run this on IonQ, and I want to run on Rigetti systems, and you can do that rather seamlessly,” he said.
“The interesting thing, and I’m an electrical engineer so I tend to look at things very pragmatically, is that right now, a lot of the software that’s running out there is quote-unquote, hardware-agnostic, which means you can run it on any (quantum) hardware you want. So again, you don’t have to make these choices yet, because it’s really too early to tell who’s going to win, who’s going to lose. Hardware-agnostic is really great in the early days, but eventually we’re going to turn the crank and people are going to start to say we need to optimize our code to run on certain things. But right now, the freedom to explore is what really matters most,” said Boisseau.
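The hardware-agnostic pattern Boisseau describes can be sketched in a few lines of plain Python. To be clear, this is an illustrative toy, not any vendor's actual API: real platforms such as AWS Braket expose a similar shape (build a circuit once, pick a backend at run time), but the `Circuit` and `LocalSimulator` classes below are hypothetical stand-ins, and the simulator is a minimal statevector model just big enough to run a Bell-state circuit.

```python
import math
import random

class Circuit:
    """A circuit is just a recorded list of gate operations, independent of any backend."""
    def __init__(self):
        self.ops = []
    def h(self, qubit):
        self.ops.append(("h", qubit))
        return self
    def cnot(self, control, target):
        self.ops.append(("cnot", control, target))
        return self

class LocalSimulator:
    """A tiny statevector simulator standing in for one interchangeable backend."""
    def __init__(self, num_qubits=2):
        self.n = num_qubits
    def run(self, circuit, shots=1000):
        amps = [0.0] * (2 ** self.n)
        amps[0] = 1.0  # start in |00...0>
        for op in circuit.ops:
            if op[0] == "h":  # Hadamard: mix each amplitude pair differing in bit q
                q = op[1]
                for i in range(2 ** self.n):
                    if not (i >> q) & 1:
                        j = i | (1 << q)
                        a, b = amps[i], amps[j]
                        amps[i] = (a + b) / math.sqrt(2)
                        amps[j] = (a - b) / math.sqrt(2)
            elif op[0] == "cnot":  # flip target bit wherever control bit is 1
                c, t = op[1], op[2]
                for i in range(2 ** self.n):
                    if (i >> c) & 1 and not (i >> t) & 1:
                        j = i | (1 << t)
                        amps[i], amps[j] = amps[j], amps[i]
        # Sample measurement outcomes from the Born-rule probabilities.
        probs = [a * a for a in amps]
        counts = {}
        for _ in range(shots):
            outcome = random.choices(range(2 ** self.n), weights=probs)[0]
            bits = format(outcome, f"0{self.n}b")
            counts[bits] = counts.get(bits, 0) + 1
        return counts

# Build the circuit once; only the backend choice would change to target
# different hardware -- the circuit definition is untouched.
bell = Circuit().h(0).cnot(0, 1)
counts = LocalSimulator(num_qubits=2).run(bell, shots=1000)
print(counts)  # only the correlated outcomes "00" and "11" should appear
```

The point of the separation is exactly the one Boisseau makes: because the circuit object knows nothing about the device that executes it, swapping backends is a one-line change, and committing to a hardware vendor can be deferred.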
There was, of course, more to the broad panel discussion, including advice on choosing the right problems, for example, and a measure of caution, including a brief discussion of a paper published last spring (Disentangling Hype from Practicality: On Realistically Achieving Quantum Advantage) by Matthias Troyer of Microsoft and colleagues. And Microsoft is firmly in the quantum hunt! (See HPCwire coverage, Microsoft and ETH Take Aim at Quantum Computing’s Hype (and Promise).)
West noted, “Not everybody’s as optimistic, and some people are still deterred from adopting quantum because of costs, because of the maturity or lack of maturity of the systems, and whether or not it actually will be relevant to the problems that they’re willing to solve. However, those organizations really should start to take note, because the quantum era is approaching faster than some might want to say. We still need to put that in a little bit of context: quickly approaching and being able to deliver a near-term advantage, that’s probably five to seven years out. So, quick is not going to be in the next six months. It’s not going to be next year, but quicker than the decades and decades which were thought earlier.”
Best to watch the full video: https://www.hpcaiwallstreet.com/conference/quantum-computing-analyst-panel-one-year-later/
Top image is a photo of the first deployment of an onsite private sector IBM-managed quantum computer in the United States installed at Cleveland Clinic.