IBM-led Webinar Tackles Quantum Developer Community Needs

By John Russell

July 21, 2020

Quantum computing has many needs before it can become the transformative tool many expect. High on the list is a robust software developer community, not least because developers rarely follow ‘intended use’ rules; instead, they bend and break rules in unforeseen ways that transform new technology into applications that catalyze market growth, sometimes explosively.

In fact, efforts to fill out the quantum computing ecosystem (hardware and software) have continued to expand rapidly. Just today, for example, the Trump Administration announced the establishment of three new Quantum Leap Challenge Institutes around quantum sensing, hybrid classical-quantum systems, and large-scale quantum system development (See HPCwire coverage). Making full use of the new class of machines – whenever they arrive – will require a robust and sufficiently large quantum developer community.

Roughly a week ago, IBM held the first of a planned series of webinars on quantum computing – this one on The Future of Quantum Software Development – and it was a treat. Moderated by long-time Forrester analyst Jeffrey Hammond, the panel included three prominent quantum community voices with diverse quantum expertise – Blake Johnson (control systems delivery lead, IBM, and formerly Rigetti), Prineha Narang (Harvard professor and CEO/founder, Aliro Quantum), and tech entrepreneur William Hurley (CEO/founder, Strangeworks).

It was a “glass-half-full” crowd, so consider their enthusiasm when evaluating comments, but it was also a well-informed group not dismissive of challenges.

“You have to realize where we’re actually at,” said Hurley. “We’re not in the days of AMD versus Intel. We’re in the days of like little mechanical gates versus vacuum tubes and other solutions. These things aren’t computers from my pure developer standpoint. At this moment in time they’re really great equipment for exploring the quantum landscape and are the foundation for building machines.”

Indeed, today’s quantum systems are fragile and complicated, and even the notion of gates can be confusing. “At what point is a gate so complicated that a developer is never going to touch it or understand it anyway? Is that 1,000 (qubits)? Is it 100,000? Is it a million? Because you hear people with all these stories about, you know, millions-of-qubits machines. If you had it, my question is, who would program it? Because that sounds really difficult based on where we’re at, and where we’re trying to go,” said Hurley.

Here are a few themes from the discussion.

  • Fast Followers Will Lose! Quantum computing’s inflection point will be such that if you’re not already in the game, you won’t catch up. “You can’t be a fast follower. You have to either be placing a bet now or deciding to take the risk of not being involved at the point where something that is clearly a tremendous change in computing happens,” said Hurley with agreement from Johnson.
  • It Will be a Hybrid World. Yes, there will be ‘general purpose’ quantum computers although for a limited set of quantum-appropriate problems. There will also be specialization with various qubit technologies (ion trap, cold atom, superconducting, etc.) excelling on different applications. Lastly, all quantum computing will be done in a hybrid classical-quantum computing environment.
  • QA isn’t Far Away (Maybe). There was adamant agreement by panelists that a decade is too pessimistic…but there was waffling on just how soon quantum advantage (QA) would be achieved. Two-to-five years was the consensus guess-of-choice although Narang declined to make any guess. One had the sense their belief in quantum computing’s inevitable breakthrough trumped worry over when. Meanwhile the QA watch continues.
  • Expanded (Developer) Conversation Needed. The quantum conversation now is mostly between system developers who tend to be physicists and algorithm developers who tend to be physicists. That has to change. It will require better quantum computers, wider access to various qubit technologies, better tools, and a level of software abstraction that lets developers do what they do without worrying about quantum physics.

The panel discussion was casual and substantive if not technically deep, and IBM has posted a link to it. Next up is a webinar on Building a Quantum Workforce (July 28), and there are plans for another in late August (no date yet) on Commercial Use of Quantum Computers.

Clearly, each of the participating companies has its own agenda but nevertheless the give-and-take had an insider feel.

IBM, of course, is the biggest player in quantum computing today with deep expertise in hardware and software and its IBM Q network which offers various levels of access to quantum resources. IBM quantum systems use semiconductor-based superconducting qubits. Panelist Johnson is a relatively recent IBM import from Rigetti. His work is fairly deep in the weeds and focuses on control systems which convert conventional instructions (electrical signals) into quantum processor control signals.

Aliro and Strangeworks are start-ups focused on software.

Aliro describes its offering as a “hardware-independent toolkit for developers of quantum algorithms and applications. The development platform is implemented as a scalable cloud-based service. Features include: access to multiple QC hardware vendors and devices via an intuitive GUI as well as REST API; quantum circuit and hybrid workflow visualization and debugging tools; cross-compilation between high and low-level languages; and hardware-specific optimizations enabling best execution on every supported hardware platform.” CEO Narang is also an assistant professor of computational materials science at Harvard.

Strangeworks says its “platform is a hardware-agnostic, software inclusive, collaborative development environment that brings the latest advancements, frameworks, open source, and tools into a single user interface.” CEO Hurley is a veteran tech entrepreneur who also chairs the IEEE Quantum Computing Work Group.

These are early days for both of these young companies, and they are broadly representative of a growing number of start-ups seeking to fill out the quantum computing ecosystem. It is probably best to watch/listen to the conversation directly to get a sense of the issues facing software development in quantum computing, but also to gain a glimpse into the mindset of the young companies entering the quantum computing fray.

 

Presented here are a few soundbites (lightly edited) from the panel.

WHAT’S THE STATE OF QUANTUM PROGRAMMING AND WHAT ARE SOME OF THE CHALLENGES?  

Johnson: Recognizing that there was something maybe intimidating about quantum, IBM chose to first develop a graphical interface – a drag-and-drop way to build quantum circuits, which are the fundamental unit of quantum compute. That’s what’s available today in IBM Quantum Experience. You can drag and drop gates, which are the logical operations that manipulate qubits (quantum bits), to build up a program. For more real tasks, you need a real programming interface, and we have Qiskit, an open source framework developed by IBM that provides a Python interface for building quantum circuits and for building algorithms that take advantage of quantum processors.
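The circuit model Johnson describes – gates applied in sequence to qubits to build up a program – can be made concrete with a small state-vector sketch. This is plain NumPy, not Qiskit itself, just an illustration of what two dragged-and-dropped gates (a Hadamard followed by a CNOT) do to a two-qubit register:

```python
import numpy as np

# Gates as matrices: single-qubit Hadamard, two-qubit CNOT.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])
I = np.eye(2)

# Start in |00>: a 4-element state vector over basis |00>,|01>,|10>,|11>.
state = np.zeros(4)
state[0] = 1.0

# Apply two gates: H on qubit 0, then CNOT with qubit 0 as control.
state = np.kron(H, I) @ state
state = CNOT @ state

# Result is the entangled Bell state (|00> + |11>)/sqrt(2):
# a measurement yields 00 or 11, each with probability 0.5.
probs = np.abs(state) ** 2
print(probs.round(3))  # [0.5 0.  0.  0.5]
```

In Qiskit the same two-gate circuit is built with method calls rather than explicit matrices, but the underlying model – a sequence of logical operations on qubits – is exactly what this sketch simulates.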

Narang: What I like about Qiskit is that it’s very accessible. The challenge is, with that abstraction, you lose a lot of the control over the actual hardware; you don’t necessarily have all of the tools to directly program the system. So the pulse-level control that IBM has made available is a good way to bridge that. I wonder, as we go towards other types of hardware, how some of the programs that are written for superconducting circuits will be translated to those other types of hardware and, if everything is not based on the same pulse-level control scheme, what would be a good way of translating? And I don’t have an answer to this.

Hurley: Our approach is to let all of the languages battle it out. We’re big Qiskit fans. I say that not because we’re on IBM, but because when we first started [there] there were already tons of people working with it. We’re big supporters, soon to be making our first contributions to it. But you can’t take a developer and make them a quantum developer overnight. Some things are fundamentally different. For example, if I’m programming on any other platform in the world and I run into an error, I can find it or I know how it works. Whereas [with quantum] what we see happen with developers is they get in and they can instantiate a teleportation thing through Microsoft Quantum Katas or IBM Qiskit, or whatever. Then the moment it breaks, if they don’t understand the fundamental physics behind it, they’re at a dead end.

Narang: A lot of things are not yet possible with quantum hardware that we take for granted in a classical computer. [I’m] thinking about conditional statements and intermediate measurements – things that are not trivial to do in a quantum circuit at the moment, but that are going to be very important for writing more complex quantum programs in the future. As those advances come from the hardware side, we think about how to translate them into something that you can use on the software side.

GIVEN THE NASCENT STAGE OF QUBIT TECHNOLOGIES – SUPERCONDUCTING, TRAPPED ION, COLD ATOMS, ETC. – WILL THERE BE SPECIALIZED MACHINES?

Narang: Tricky question. I’m trying to see how to answer it without making all of my colleagues angry at me. My personal view is there will be certain problems that will run just fine on a variety of hardware, and some that might be more specialized to particular types of hardware – and that’s just associated with the physics underlying that type of hardware. This will be especially important as we try to map problems from condensed matter and chemistry onto some of these devices. We’ll see that not every technology is ideal, or even possible, for all kinds of problems. But we don’t have that kind of experience yet.

Photo of IonQ’s ion trap chip with image of ions superimposed over it. Source: IonQ

[Currently] there are only a few different trapped ion systems out there. It’s very hard to get access to those, and not many of them have gone the route that IBM did of making at least the smallest devices available very broadly. So there’s a whole lot more that needs to be done on a simulator before you can actually try it on real hardware. And of course, systems like cold atoms or photonic circuits are really niche. Getting time on those is very expensive and almost unaffordable for the average developer. The best you can hope to offer them is to say, “Hey, if you write something that works on this genre of systems, we can get you to a point where it runs on other systems” – and that’s something that my group is trying to accomplish at the moment.

Hurley: On my desk [I have an] iPhone and iPad and a 16-inch laptop, okay? And those things can all do email and they can all surf the web, and they do them really well. But I can’t open Xcode and compile a big program on my iPhone. It’s going to be like that in quantum, and it’s going to take it in directions none of us can imagine – it would be foolish to try. There are already 16 startups I’m following who are making quantum processors for specific applications, exactly like Pri described. Look at supercomputing today and high performance computing. There are computers built on Wall Street just for doing trades or [even] specific types of trades. We’ve seen this throughout computing history, right? I don’t think quantum will be any different. I hope that there are as many hardware and software solutions available as possible.

WHEN WILL WE ACHIEVE QUANTUM ADVANTAGE?

Johnson: The problem is we don’t know exactly how powerful a quantum machine needs to be in order to get to quantum advantage. We’ve committed to, and have been on a track of, doubling quantum volume[i] (a broad performance metric) every year. We put up and made publicly available our first machine with a quantum volume of 32 back in April. We now have eight machines with a quantum volume of 32 available in IBM Quantum Experience, and we continue to march along that path of doubling QV [yearly]. At what point is it a powerful enough machine for quantum advantage? I’m not sure. Personally, I’d be surprised if we have to get all the way to fault tolerance to find a single application where you can do something with quantum advantage – whether that’s time-to-solution, cost, or whatever – against a classical resource.
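Johnson’s “doubling every year” cadence is easy to put numbers on. A back-of-the-envelope projection – purely an illustration of the stated cadence from the QV 32 systems of 2020, not an IBM roadmap commitment – looks like this:

```python
# Naive projection of the stated cadence: quantum volume (QV)
# doubles each year, starting from QV 32 in 2020.
def projected_qv(year, base_year=2020, base_qv=32):
    return base_qv * 2 ** (year - base_year)

for year in range(2020, 2026):
    print(year, projected_qv(year))
# 2020 -> 32, 2021 -> 64, ... 2025 -> 1024
```

Even at that pace the numbers stay modest for years, which is why Johnson hedges on when a machine becomes “powerful enough” for quantum advantage.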

Hurley: If you’re in this industry, if you want to be a developer in this industry, it needs to be a long-term play, a long-term vision that you have. Pessimists think [quantum payoff is] 20 years out, the optimists think it’s three years out, and the reality is, if you want to be involved, then you should be preparing now. Because all I can tell you is at some point between tomorrow and some future tomorrow it will happen, and the inflection point will be steep.

A rendering of IBM Q System One, the world’s first fully integrated universal quantum computing system, currently installed at the Thomas J Watson Research Center. Source: IBM

Johnson: Those of us that build hardware understand that the most critical thing preventing us from reaching quantum advantage is the hardware, and so our first tools are really focused on those domain experts, to give them the tools they need to build better hardware. That’s an important audience that we like, and we will continue to make better and better tools to serve that audience. But as Pri mentioned, people are doing applications research with the devices that exist today, and are finding they maybe can’t yet solve a given problem better than they could with a classical resource, but they can solve problems. They’re starting to figure out what the limitations are, how they can squeeze the most utility out of the devices today, and then get ready for the devices that exist tomorrow.

So we’re starting to build tools that try to lower the barriers to entry for those people – not the domain experts, but a new audience. We don’t want them to have to learn everything about quantum computing in order to get started. The idea here was to reach out to this new audience of developers [such that] they can write their programs by describing the problem that they want to solve – in this case, an optimization problem. So they can write down a quadratic formula, they can write down some description of the constraints of their optimization problem. They can choose different solvers, both quantum solvers and classical solvers, because a lot of these developers are trying to [understand] the value today and [how] quantum works versus the classical.

Hurley: Looking at it from a pure 30 years of seeing new tech coming down and doing the development, I think 10 years sounds very pessimistic now. What do most people imagine as a quantum computer in 10 years? Is it a full general purpose machine, whatever? Who knows? But I don’t think you’re going to wait 10 years. I think it’s more in the two-to-five-year range to where there are things that start to become economically advantageous to enterprises, and it will probably be in chemistry and materials science.

HOW BIG MUST THE DEVELOPER COMMUNITY BECOME AND WHAT’S THE KEY TO DOING THAT?

Hurley: Most of the people building machines aren’t actually talking to developers. They’re talking to physicists who can download development tools and play the role of developers. Software developers are not necessarily the greatest physicists, and physicists are not all the greatest software developers, right? [We need] to drive it to a point where it’s not 200 people who can program the machines; it’s 2 million people. I believe this is the first real leap; in the next 10 years computing will change more than it has in the last hundred.

Johnson: Open source, I think, is a critical piece of accelerating these kinds of developments, because we want these tools to be available to the broadest audience possible. Open source is a great accelerator for making that happen. [But even] on the fastest timescale, we’re a long way off from the App Store experience where, on my phone, I can get an app that takes advantage of quantum resources. But I think we’re not that far away from the developer equivalent of that, which is, you know, package management systems where I can just say pip install or brew install some package which is a quantum library for some application domain. The goal is to have the equivalent of iPhone experiences today.

If I’m an iPhone developer and I want to develop a new app that uses augmented reality to check the position of a basketball, that itself is a non-trivial machine vision task, right? But we don’t ask every iPhone developer to be a machine vision expert. They just plug in what they want to use with ARKit, which is Apple’s augmented reality solution, and off they go. We need to get to that point – and I don’t think it’s that far away – where a developer can use quantum resources without having to be an expert in quantum computing.

Google’s Sycamore quantum chip

Hurley: If you think back to 2007, there were 400 of us a week after [the iPhone introduction] with iPhones and people hacking on them. It went to tens of thousands almost instantly, right? Within six months to a year. And then over the course of 10 or 11 years, you get to 23 million people who are doing that. That mass of developers being involved drives apps, exactly as Blake just said. You have to have a mass of developers to do that. That’s where quantum computing faces its biggest challenge: when it gets, what I call, out of the lab into the real world, and all of a sudden there’s a million developers. Because developers rarely use things in the exact way they were intended to be used. They will find more uses, they will find the bugs, they will find the weaknesses. So the faster we can get to that point, the better. I mean, Pri, what are your thoughts?

Narang: We’re taking some of the circuits run on superconducting silicon to trapped ion [systems] and realizing that some don’t work the same way. And yeah, forget about developers breaking things in new ways – even experienced people break things in ways they didn’t anticipate, and have to call an engineer and say, hey, how do I fix this? If we expect, you know, a million developers entering the community to have to get answers from an expert engineer, that’s probably not a very scalable model. Something that could be useful is having better simulators that allow you to replicate some of the noise associated with current hardware, to see how things are performing. Also, simple stuff like getting runtime estimates, or getting a yea or nay on whether your circuit is actually going to fit on the device you’re trying to run it on. That’s a problem I’ve seen a lot of people have. They have a beautiful idea, and they assume that it can run on this really tiny device. I think there are different levels to how we make it easier for developers entering the field.

Link to panel video, https://www.youtube.com/watch?v=fBP6qTc_fGU&feature=youtu.be

[i] Quantum Volume (QV) is a hardware-agnostic metric that we defined to measure the performance of a real quantum computer. Each system we develop brings us along a path where complex problems will be more efficiently addressed by quantum computing; therefore, the need for system benchmarks is crucial, and simply counting qubits is not enough. As we have discussed in the past, Quantum Volume takes into account the number of qubits, connectivity, and gate and measurement errors. Material improvements to underlying physical hardware, such as increases in coherence times, reduction of device crosstalk, and software circuit compiler efficiency, can point to measurable progress in Quantum Volume, as long as all improvements happen at a similar pace. https://www.ibm.com/blogs/research/2020/01/quantum-volume-32/
