As it’s become clearer that hybrid quantum-classical computing solutions will likely be necessary to achieve practical quantum computing, there’s been an increasing emphasis on developing software platforms to build hybrid applications and workflows. Nvidia’s recently-announced QODA (quantum optimized device architecture) is among the notable new arrivals. Zapata Computing, one of the early QC software pioneers, has bet on its Orquestra platform for building quantum-enabled hybrid applications that deliver quantum advantage.
Timothy Hirzel, chief evangelist for Orquestra, recently provided HPCwire with a progress update on the heels of Nvidia’s QODA announcement, which named Zapata as an early collaborator. (See HPCwire coverage, Nvidia Dives Deeper into Quantum, Announces QODA Programming Platform.)
Nvidia’s intent, of course, is to integrate its GPU accelerator technology with quantum systems for use in hybrid classical-quantum workflows. Zapata has a broader aim. Its Orquestra product integrates with tools such as QODA (when it becomes available) for building and deploying applications that run on a wider variety of hybrid quantum-classical resources. This isn’t a new idea. Agnostiq’s Covalent platform, for example, is similar but aimed at the R&D environment for prototyping and testing. Orquestra, says Hirzel, is aimed squarely at enterprise deployment and production environments.
Don’t get the wrong idea. No one is using Orquestra as a production deployment tool yet, but one collaborator – Andretti Autosport – is in the process of an infrastructure buildout that includes plans for using Orquestra in production. The auto racing powerhouse intends to use Orquestra and blended quantum-classical resources for a variety of applications, such as tire degradation models and race simulations to estimate the probability of a yellow flag (caution) coming out.
“Some of these models have to be run at the track side and in real-time,” said Hirzel. “Andretti has been a fantastic forcing function on the software, because things that weren’t as critical for an R&D user are now very important.”
Most of Zapata’s early customers are still working in R&D, building out quantum roadmaps and figuring out what high-value problems can benefit from quantum resources. That said, the appearance of platforms such as Orquestra, Covalent, and QODA for developing hybrid applications and managing hybrid quantum-classical resources is an important trend. The market is still unfolding and will be worth watching. Increasingly it looks like quantum computing will require a blended classical-quantum environment.
Hirzel said, “We see quantum and classical growing together. In no way do we see quantum actually replacing HPC, or classical. As quantum devices grow, as qubit counts grow, and the gate-depth grows, the challenge of compiling circuits is going to grow along with it. In a simple experiment, you could be running millions of shots on a quantum device. You’re going to have to be wielding a substantial amount of classical processing alongside quantum, not just for circuit compilation, but any of the data pre-processing and post-processing.”
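To make Hirzel’s point concrete, here is a minimal sketch of such a hybrid loop: a classical optimizer repeatedly submits a parameterized circuit, then post-processes the raw shot data into an expectation value. The “device” below is a classical stand-in (a single-qubit rotation simulated with NumPy), not a real QPU or any Orquestra/QODA API, but even this toy run consumes roughly two million shots.

```python
import numpy as np

rng = np.random.default_rng(42)

def run_circuit(theta, shots):
    """Stand-in for a quantum device: a single-qubit rotation whose
    probability of measuring |1> is sin^2(theta / 2)."""
    p1 = np.sin(theta / 2) ** 2
    return rng.binomial(1, p1, size=shots)       # raw 0/1 shot outcomes

def expectation(theta, shots=10_000):
    # Classical post-processing: estimate <Z> = 1 - 2 * P(1) from raw shots
    return 1.0 - 2.0 * run_circuit(theta, shots).mean()

# Classical outer loop: steer the circuit toward |1> (minimize <Z>),
# estimating the gradient with the parameter-shift rule.
theta = 0.1
for _ in range(100):
    grad = (expectation(theta + np.pi / 2) - expectation(theta - np.pi / 2)) / 2
    theta -= 0.5 * grad

print(f"theta = {theta:.3f}, <Z> = {expectation(theta):+.3f}")
```

The classical side here is trivial, but in a real workflow the same slot is filled by circuit compilation, error mitigation, and whatever pre- and post-processing the application demands.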
Besides these necessary processing tasks, there’s a lot of work underway to understand which portions of applications (optimization, simulations, etc.) are best handled by classical resources and which by quantum. Zapata has long been a champion of this line of thinking. For example, Zapata has worked heavily with quantum-enhanced solvers using machine learning and quantum-generated random numbers to improve performance.
CEO Christopher Savoie told HPCwire a year ago:
- “Any machine learning basically starts with a generative model, at least in modeling things, a probability distribution that’s made up of a random bit string. You start with a tunable, random bit string to get closer and closer – like a Born machine (quantum circuit Born machine, QCBM) – to model the distribution that will give you a certain result. That’s your prior. [Next], you feed a distribution in and hopefully get a better handwriting sample, a better portfolio, a better optimization of something, chemistry, whatever, at the end. That’s the outcome. [To do that] you need a random bit string that has a tunable parameterized connection to it. The distribution of those bit strings is the source of richness in your model. That’s the premise.
- “What we showed in our papers is if you have a quantum source of bit strings you get a better distribution and a better handwriting sample and a better portfolio optimization than you can with classical machine work. We can do that today. What we’ve done is we’ve taken a quantum-enhanced neural network approach to basically put our quantum spy on that solver. We’re feeding into a classical neural network workflow that already exists and is already in production in a lot of places. The great thing is their cost function, their model, already exists too; we just put a little quantum spy on it. We think it’s pretty clever, and it’s patented, and very disruptive. We’re able to basically model the distribution of the good answers for any software.”
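Savoie’s “tunable, parameterized” bit-string distribution can be illustrated with a toy classical stand-in for a Born machine: squared, normalized amplitudes give a probability distribution over bit strings, and a classical loop tunes the parameters until that distribution matches a target prior. The parameterization, target distribution, and training loop below are invented for illustration and are far simpler than a real QCBM.

```python
import numpy as np

rng = np.random.default_rng(0)
K = 8  # distribution over the eight 3-bit strings (000 .. 111)

# Target distribution the tunable prior should learn to imitate (made up).
target = np.array([0.05, 0.05, 0.30, 0.10, 0.10, 0.30, 0.05, 0.05])

def probs(theta):
    # Born-rule-style: probabilities are squared, normalized amplitudes.
    a = theta ** 2
    return a / a.sum()

def kl(theta):
    # KL divergence from the model distribution to the target.
    return float(np.sum(target * np.log(target / probs(theta))))

theta = rng.uniform(0.5, 1.5, size=K)
eps, lr = 1e-4, 0.5
for _ in range(2000):
    g = np.zeros(K)
    for i in range(K):       # finite-difference gradient of the divergence
        d = np.zeros(K); d[i] = eps
        g[i] = (kl(theta + d) - kl(theta - d)) / (2 * eps)
    theta -= lr * g

# Tuned bit strings sampled from the learned prior, ready to feed a solver.
samples = rng.choice(K, size=5, p=probs(theta))
print(kl(theta), samples)
```

The claim in Zapata’s papers is that when the bit strings come from a genuinely quantum source, the resulting distribution is richer than what a comparable classical model produces.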
Using quantum-generated random numbers as input to improve classical solvers is just one example and probably represents low-hanging fruit in developing hybrid classical-quantum solutions. But the point is there are going to be more complicated problems, portions of which are better suited to classical computation and others to quantum. Nvidia already argues that matrix multiplies, for example, are better handled on GPUs than on QPUs.
Platforms to develop these quantum-enhanced applications and workflows will be required. Enter Orquestra, QODA, Covalent. There will be more such offerings, covering various portions of the required hybrid stack, if you will. Zapata believes it has an advantage because its focus has always been on hybrid approaches.
Looking at the figure above, the areas in solid green represent Zapata proprietary technology, the light green areas combine Zapata and user-supplied technology, and the gray areas are non-Zapata resources, hardware and software, that Orquestra interacts with. The Orquestra stack works with a variety of popular quantum languages and libraries as well as classical simulators and quantum hardware. The idea is to provide a broadly usable platform leveraging existing and new tools.
Hirzel notes the number of available qubit types (e.g. ion trap, superconducting, etc.) is a moving target and each requires its own lower level protocols and tools. While the number of qubit technologies is currently limited to a handful, that may change. Zapata supports the major qubit technologies now in use and expects to expand support as required.
A core question that hybrid development platforms should help answer, said Hirzel, is which compute engine (classical or a particular qubit technology) is better, and why, for a given application, including which portion of the application. That’s in addition to developing and executing the workflow.
“The decision of where you want to send your compute job becomes its own challenge. In a smaller way, we see this even today with how organizations are learning to leverage GPUs. This is really where we want Orquestra to live,” said Hirzel. “Stepping back a bit, it’s not just where things are computed. You also need to think about where data is stored; what type of orchestration layer you have sitting on top of that? Are you sending jobs via Slurm to HPC? Is this running on cloud? And the ability to visualize the data that’s coming out of it.”
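The routing decision Hirzel describes can be reduced to a sketch: a dispatch table that sends each workflow task to a compute target. Nothing below is Orquestra’s actual API; the task kinds and targets are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    kind: str   # e.g. "circuit", "dense_linear_algebra", "postprocess"

def route(task: Task) -> str:
    """Hypothetical routing table for one hybrid workflow."""
    targets = {
        "circuit": "qpu",                 # quantum hardware or simulator
        "dense_linear_algebra": "gpu",    # e.g. a GPU node reached via Slurm
        "postprocess": "cpu",             # cheap classical cloud instance
    }
    return targets.get(task.kind, "cpu")  # default to classical compute

jobs = [Task("ansatz_eval", "circuit"), Task("covariance", "dense_linear_algebra")]
print([route(t) for t in jobs])
```

In practice the decision also has to account for where the data lives and how results are collected and visualized, which is the layer Hirzel says Orquestra is meant to occupy.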
These are still fairly early days for the platform and users tend to bring a fair amount of expertise. The Quantum-Ready Application functionality, as described by Hirzel, is currently more about taking an existing application and using Orquestra tools to prepare it to run on various resources, including quantum or classical (e.g. GPU) and then running comparison simulations.
Hirzel said, “It’s about building useful tools and this is something that we potentially see our own internal service teams doing. But ultimately, partners can take on writing applications themselves and using Orquestra to deliver applications. It’s not a toolkit for building GUIs and the like, but instead it’s for building the services that support existing end user applications. An end user may have an existing tool – a tool for computational fluid dynamics used for airplane wing design – and is really looking to accelerate a piece of that tool.”
He says the user can use Orquestra tools to break up the application and simulate various approaches to determine effectiveness among diverse device types. The quantum-enabled version(s) could then be deployed from Orquestra.
Zapata is also working on assembling algorithm suites for various functions and devices. For example, “Our hardware team is working on algorithms to help users select the device that’s most appropriate for that circuit. Ultimately, we’d look to put that into something like a quantum hardware broker.” Users could then provide selection criteria to the broker – availability, cost, fidelity, speed, etc. – and it would provide options.
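A hardware broker of the kind described might, in sketch form, rank candidate devices by weighted selection criteria. The device names, metrics, and weights below are invented for illustration; they are not real Orquestra functionality or vendor data.

```python
# Hypothetical quantum hardware broker: score each device against the
# user's weighted criteria (fidelity up; cost and queue time down).

devices = {
    "ion_trap_a":        {"cost": 0.90, "fidelity": 0.99, "queue_hours": 6.0},
    "superconducting_b": {"cost": 0.20, "fidelity": 0.95, "queue_hours": 0.5},
    "simulator_gpu":     {"cost": 0.05, "fidelity": 1.00, "queue_hours": 0.0},
}

def score(metrics, weights):
    return (weights["fidelity"] * metrics["fidelity"]
            - weights["cost"] * metrics["cost"]
            - weights["queue"] * metrics["queue_hours"])

def broker(devices, weights):
    # Return device names ranked best-first for these selection criteria.
    return sorted(devices, key=lambda d: score(devices[d], weights), reverse=True)

# A latency-sensitive user penalizes queue time as well as cost.
prefs = {"fidelity": 1.0, "cost": 0.5, "queue": 0.2}
print(broker(devices, prefs))
```

The interesting engineering is in keeping the metrics honest, since fidelity, availability, and pricing all drift as hardware and vendor queues change.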
Don’t underestimate the value of being able to estimate the costs of running a quantum workflow, said Hirzel. “We’ve noticed in our time running these quantum simulations or running on real quantum devices, that it is very easy to run up a big bill quickly. It’s a pain point that a lot of folks either in cloud or HPC experience.” Providing cost comparisons for various compute methods during simulation, or just before execution, is something “we see as actually quite integral and can be an exciting feature in Orquestra.”
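A back-of-the-envelope version of that cost comparison is a linear model: a per-task fee plus a per-shot charge, applied to a hypothetical workflow. All prices and shot counts here are made up for illustration, not vendor quotes.

```python
def estimate_cost(n_circuits, shots_per_circuit, price_per_shot, price_per_task=0.0):
    """Simple linear cost model: per-task fee plus per-shot charge."""
    return n_circuits * (price_per_task + shots_per_circuit * price_per_shot)

workflow = {"n_circuits": 500, "shots_per_circuit": 10_000}

quote_qpu = estimate_cost(**workflow, price_per_shot=0.00035, price_per_task=0.30)
quote_sim = estimate_cost(**workflow, price_per_shot=0.0, price_per_task=0.08)

print(f"QPU:       ${quote_qpu:,.2f}")   # shot charges dominate on hardware
print(f"Simulator: ${quote_sim:,.2f}")   # flat per-run fee only
```

Even a crude estimate like this, surfaced before a job is submitted, is the kind of guardrail Hirzel is describing.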
Orquestra is an early entry to the market for hybrid classical-quantum software development platforms. It is still evolving, as is the market segment. Zapata, spun out of Harvard in 2017, has grown rapidly; headcount is already 100-plus. It’s hoping early-mover status and deep technical expertise translate into success. Then again, that sounds like many young companies on the quantum computing landscape. Stay tuned.