IBM Touts Quantum Network Growth, Improving QC Quality, and Battery Research

By John Russell

January 8, 2020

IBM today announced its Q (quantum) Network community had grown to more than 100 members – Delta Air Lines and Los Alamos National Laboratory are among the most recent additions – and that an IBM quantum computer had achieved a quantum volume (QV) benchmark of 32, in keeping with plans to double QV yearly. IBM also showcased proof-of-concept (POC) work with Daimler using a quantum computer to tackle materials research for battery development.

Perhaps surprisingly, the news was released at the 2020 Consumer Electronics Show taking place in Las Vegas this week – “Very few ‘consumers’ will ever buy a quantum computer,” agreed IBM’s Jeff Welser in a pre-briefing with HPCwire.

That said, CES has broadened its technology compass in recent years and Delta CEO Ed Bastian delivered the opening keynote touching upon technology’s role in transforming travel and the travel experience. Quantum computing, for example, holds promise for a wide range of relevant optimization problems such as traffic control and logistics. “We’re excited to explore how quantum computing can be applied to address challenges across the day of travel,” said Rahul Samant, Delta’s CIO, in the official IBM announcement.

Jeff Welser, IBM

IBM’s CES quantum splash was mostly about demonstrating the diverse and growing interest in quantum computing (QC) by companies. “Many of our clients are consumer companies themselves who are utilizing these systems within the Q network,” said Welser, who wears a number of hats for IBM Research, including VP of exploratory science and lab director of the Almaden Lab. “Think about the many companies who are trying to use quantum technology to come up with new materials that will make big changes in future consumer electronics,” said Welser.

Since putting its first quantum computer on the cloud in 2016, IBM has aggressively sought to grow the IBM Q Network and its available resources. IBM now has a portfolio of 15 quantum computers, ranging in size from 53 qubits down to a single-qubit ‘system’, as well as extensive quantum simulator capabilities. Last year IBM introduced Quantum Volume, a new metric for benchmarking QC progress, and suggested others should adopt it. QV is a composite measure encompassing many attributes – gate fidelity, noise, coherence times, and more – not just qubit count; so far QV’s industry-wide traction has seemed limited.

Welser emphasized IBM Q Network membership has steadily grown and now spans multiple industries including airline, automotive, banking and finance, energy, insurance, materials and electronics. Newest large commercial members include Anthem, Delta, Goldman Sachs, Wells Fargo and Woodside Energy. New academia/government members include Georgia Institute of Technology and LANL. (A list and brief description of new members is at the end of the article.)

“IBM’s focus, since we put the very first quantum computer on the cloud in 2016, has been to move quantum computing beyond isolated lab experiments conducted by a handful of organizations, into the hands of tens of thousands of users,” said Dario Gil, director of IBM Research in the official announcement. “We believe a clear advantage will be awarded to early adopters in the era of quantum computing and with partners like Delta, we’re already making significant progress on that mission.”

IBM’s achievement of a QV score of 32 and the recent Daimler work are also significant. When IBM introduced the QV concept broadly at the American Physical Society meeting last March, it had achieved a QV score of 16 on its fourth-generation 20-qubit system. At that time IBM likened QV to the Linpack benchmark used in HPC, calling it suitable for comparing diverse quantum computing systems. Translating QV into a specific target score that will be indicative of being able to solve real-world problems is still mostly guesswork; indeed, different QV ratings may be adequate for different applications.
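
For reference, IBM defines QV operationally: random “square” model circuits (depth equal to width) are run at increasing widths, and a width passes if the measured heavy-output probability exceeds two-thirds with statistical confidence; QV is then 2^n for the largest passing width n. The Python sketch below illustrates just that decision rule with hypothetical measurements (not IBM data); the full protocol also involves confidence intervals and circuit-construction details omitted here.

# Sketch of the quantum volume decision rule (simplified): for each circuit
# width n, random "square" model circuits (depth == width) are run and the
# fraction of shots landing on "heavy" outputs (bitstrings whose ideal
# probability is above the median) is estimated. A width passes if that
# fraction exceeds 2/3; QV = 2**n for the largest passing width.

HEAVY_OUTPUT_THRESHOLD = 2.0 / 3.0

def achieved_quantum_volume(heavy_output_probs):
    """heavy_output_probs: dict mapping circuit width n to the mean measured
    heavy-output probability at that width (hypothetical numbers here)."""
    passing = [n for n, p in heavy_output_probs.items()
               if p > HEAVY_OUTPUT_THRESHOLD]
    return 2 ** max(passing) if passing else 1

# Hypothetical measurements, not IBM data: widths 2 through 6
measured = {2: 0.84, 3: 0.79, 4: 0.74, 5: 0.69, 6: 0.61}
print(achieved_quantum_volume(measured))  # prints 32, i.e. log2(QV) = 5

On that scale, last March’s QV of 16 corresponds to width-4 model circuits, and the new QV of 32 to width-5 circuits.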

IBM published a blog post today discussing the latest QV result, which was achieved on a new 28-qubit system named Raleigh. IBM also elaborated somewhat on internal practices and timetable expectations.

Writing in the blog, IBM quantum researchers Jerry Chow and Jay Gambetta note, “Since we deployed our first system with five qubits in 2016, we have progressed to a family of 16-qubit systems, 20-qubit systems, and (most recently) the first 53-qubit system. Within these families of systems, roughly demarcated by the number of qubits (internally we code-name the individual systems by city names, and the development threads as different birds), we have chosen a few to drive generations of learning cycles (Canary, Albatross, Penguin, and Hummingbird).”

It gets a bit confusing, and it is best to consult the blog directly for a discussion of error mitigation efforts across the different IBM systems. Each system undergoes revisions aimed at improving and experimenting with topology and error mitigation strategies.

Chow and Gambetta write, “We can look at the specific case for our 20-qubit systems (internally referred to as Penguin), shown in this figure:

“Shown in the plots are the distributions of CNOT errors across all of the 20-qubit systems that have been deployed, to date. We can point to four distinct revisions of changes that we have integrated into these systems, from varying underlying physical device elements, to altering the connectivity and coupling configuration of the underlying qubits. Overall, the results are striking and visually beautiful, taking what was a wide distribution of errors down to a narrow set, all centered around ~1-2% for the Boeblingen system. Looking back at the original 5-qubit systems (called Canary), we are also able to see significant learning driven into the devices.”
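
The narrowing Chow and Gambetta describe is the kind of change simple summary statistics capture. The sketch below tabulates hypothetical per-revision CNOT error lists (illustrative values only, not IBM calibration data) to show how such a comparison might be made.

import statistics

# Hypothetical CNOT error rates (as fractions) for an early and a late
# device revision -- illustrative values only, not IBM calibration data.
revisions = {
    "early revision": [0.021, 0.034, 0.058, 0.029, 0.047, 0.080, 0.025],
    "late revision":  [0.012, 0.015, 0.011, 0.018, 0.014, 0.013, 0.016],
}

for name, errors in revisions.items():
    median = statistics.median(errors)
    spread = max(errors) - min(errors)
    print(f"{name}: median CNOT error {median:.1%}, spread {spread:.1%}")
# A tight spread centered near 1-2% is the behavior the blog attributes
# to the latest (Boeblingen-era) revision.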

Looking at the evolution of quantum computing by decade, IBM says:

  • 1990s: fundamental theoretical concepts showed the potential of quantum computing
  • 2000s: experiments with qubits and multi-qubit gates demonstrated quantum computing could be possible
  • 2010s (the decade just completed): evolution from gates to architectures and cloud access, revealing a path to real demand for quantum computing systems

“So where does that put us with the 2020s? The next ten years will be the decade of quantum systems, and the emergence of a real hardware ecosystem that will provide the foundation for improving coherence, gates, stability, cryogenics components, integration, and packaging,” write Chow and Gambetta. “Only with a systems development mindset will we as a community see quantum advantage in the 2020s.”

On the application development front, the IBM-Daimler work is interesting. A blog post describing the work was published today by Jeannette Garcia (global lead for quantum applications in quantum chemistry, IBM), who is also an author on the paper (Quantum Chemistry Simulations of Dominant Products in Lithium-Sulfur Batteries). She framed the challenge nicely in the blog post:

“Today’s supercomputers can simulate fairly simple molecules, but when researchers try to develop novel, complex compounds for better batteries and life-saving drugs, traditional computers can no longer maintain the accuracy they have at smaller scales. The solution has typically been to model experimental observations from the lab and then test the theory.

“The largest chemical problems researchers have been so far able to simulate classically, meaning on a standard computer, by exact diagonalization (or FCI, full configuration interaction) comprise around 22 electrons and 22 orbitals, the size of an active space in the pentacene molecule. For reference, a single FCI iteration for pentacene takes ~1.17 hours on ~4096 processors and a full calculation would be expected to take around nine days.

“For any larger chemical problem, exact calculations become prohibitively slow and memory-consuming, so that approximation schemes need to be introduced in classical simulations, which are not guaranteed to be accurate and affordable for all chemical problems. It’s important to note that reasonably accurate approximations to classical FCI approaches also continue to evolve and is an active area of research, so we can expect that accurate approximations to classical FCI calculations will also continue to improve over time.”
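
The scaling Garcia describes can be made concrete. Ignoring spin and point-group symmetry, the FCI space for a closed-shell system of N electrons in M spatial orbitals contains on the order of C(M, N/2)^2 Slater determinants; the back-of-the-envelope Python sketch below (not taken from the paper) shows why 22 electrons in 22 orbitals sits near the edge of what exact classical treatment can handle.

from math import comb

def fci_dimension(n_electrons, n_orbitals):
    """Rough count of determinants in a closed-shell FCI problem:
    choose n_electrons/2 alpha and n_electrons/2 beta electrons among
    n_orbitals spatial orbitals (symmetry reductions ignored)."""
    n_alpha = n_beta = n_electrons // 2
    return comb(n_orbitals, n_alpha) * comb(n_orbitals, n_beta)

# Pentacene-sized active space from the quote: 22 electrons in 22 orbitals
print(f"{fci_dimension(22, 22):.2e}")  # roughly 5e11 determinants

Adding electrons or orbitals grows that count rapidly, which is why exact diagonalization stalls at roughly this size and approximation schemes, or quantum hardware, take over.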

IBM and Daimler researchers, building on earlier algorithm development work, were able to simulate the dipole moments of three lithium-containing molecules, “which brings us one step closer to the next-generation lithium sulfur (Li-S) batteries that would be more powerful, longer lasting and cheaper than today’s widely used lithium ion batteries.”

Garcia writes, “We have simulated the ground state energies and the dipole moments of the molecules that could form in lithium-sulfur batteries during operation: lithium hydride (LiH), hydrogen sulfide (H2S), lithium hydrogen sulfide (LiSH), and the desired product, lithium sulfide (Li2S). In addition, and for the first time ever on quantum hardware, we demonstrated that we can calculate the dipole moment for LiH using 4 qubits on IBM Q Valencia, a premium-access 5-qubit quantum computer.”
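
The paper’s calculations ran on quantum hardware and simulators; as a purely classical illustration of the variational principle that underlies typical near-term ground-state estimates of this kind, the sketch below minimizes the energy of a toy two-qubit Hamiltonian over a simple parametrized state. The Hamiltonian coefficients and the ansatz are hypothetical stand-ins, not the LiH model from the paper.

import numpy as np
from scipy.optimize import minimize

# Pauli matrices
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# Toy two-qubit Hamiltonian (illustrative coefficients, not a molecular model)
H = 0.5 * np.kron(Z, I2) + 0.5 * np.kron(I2, Z) + 0.25 * np.kron(X, X)

def ry(theta):
    """Single-qubit Y rotation."""
    return np.array([[np.cos(theta / 2), -np.sin(theta / 2)],
                     [np.sin(theta / 2),  np.cos(theta / 2)]], dtype=complex)

def energy(params):
    """Expectation value <psi(params)|H|psi(params)> for a product-state ansatz."""
    psi = np.kron(ry(params[0]), ry(params[1])) @ np.array([1, 0, 0, 0], dtype=complex)
    return float(np.real(psi.conj() @ H @ psi))

result = minimize(energy, x0=[0.1, 0.1], method="COBYLA")
print("variational estimate:", result.fun)
print("exact ground energy:", np.linalg.eigvalsh(H).min())
# The small gap between the two numbers reflects the deliberately simple,
# unentangled ansatz; real experiments use richer, entangling ansatzes.

Dipole moments are obtained the same way in principle: as expectation values of the dipole operator evaluated in the optimized state.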

She notes Daimler hopes quantum computers will eventually help it design next-generation lithium-sulfur batteries, because they have the potential to compute and precisely simulate the batteries’ fundamental behavior. Current QCs are too noisy and limited in size, but the POC work is promising. It also represents a specific, real-world opportunity.

Link to IBM blog: https://www.ibm.com/blogs/research/2020/01/quantum-volume-32/

Link to Daimler paper: https://arxiv.org/abs/2001.01120

Feature image: Photo of the IBM Q System One quantum computer being shown at CES. Source: IBM

 

List of IBM Q New Members Excerpted from the Release (unedited)

Commercial organizations:

  • Anthem: Anthem is a leading health benefits company and will be expanding its research and development efforts to explore how quantum computing may further enhance the consumer healthcare experience. Anthem brings its expertise in working with healthcare data to the Q Network. This technology also has the potential to help individuals lead healthier lives in a number of ways, such as helping in the development of more accurate and personalized treatment options and improving the prediction of health conditions.
  • Delta Air Lines: The global airline has agreed to join the IBM Q Hub at North Carolina State University. They are the first airline to embark on a multi-year collaborative effort with IBM to explore the potential capabilities of quantum computing to transform experiences for customers and employees and address challenges across the day of travel.

Academic institutions and government research labs:

  • Georgia Tech: The university has agreed to join the IBM Q Hub at the Oak Ridge National Laboratory to advance the fundamental research and use of quantum computing in building software infrastructure to make it easier to operate quantum machines, and developing specialized error mitigation techniques. Access to IBM Q commercial systems will also allow Georgia Tech researchers to better understand the error patterns in existing quantum computers, which can help with developing the architecture for future machines.
  • Los Alamos National Laboratory: Joining as an IBM Q Hub will greatly help the Los Alamos National Laboratory research efforts in several directions, including developing and testing near-term quantum algorithms and formulating strategies for mitigating errors on quantum computers. The 53-qubit system will also allow Los Alamos to benchmark the abilities to perform quantum simulations on real quantum hardware and perhaps to push beyond the limits of classical computing. Finally, the IBM Q Network will be a tremendous educational tool, giving students a rare opportunity to develop innovative research projects in the Los Alamos Quantum Computing Summer School.

Startups:

  • AIQTECH: Based in Toronto, AiQ is an artificial intelligence software enterprise set to unleash the power of AI to “learn” complex systems. In particular, it provides a platform to characterize and optimize quantum hardware, algorithms, and simulations in real time. This collaboration with the IBM Q Network provides a unique opportunity to expand AiQ’s software backends from quantum simulation to quantum control and contribute to the advancement of the field.
  • BEIT: The Kraków, Poland-based startup is hardware-agnostic, specializing in solving hard problems with quantum-inspired hardware while preparing the solutions for the proper quantum hardware, when it becomes available. Their goal is to attain super-polynomial speedups over classical counterparts with quantum algorithms via exploitation of problem structure.
  • Quantum Machines: QM is a provider of control and operating systems for quantum computers, with customers among the leading players in the field, including multinational corporations, academic institutions, start-ups and national research labs. As part of the IBM and QM collaboration, a compiler between IBM’s quantum computing programming languages, and those of QM is being developed and offered to QM’s customers. Such development will lead to the increased adoption of IBM’s open-sourced programming languages across the industry.
  • TradeTeq: TradeTeq is the first electronic trading platform for the institutional trade finance market. With teams in London, Singapore, and Vietnam, TradeTeq is using AI for private credit risk assessment and portfolio optimization. TradeTeq is collaborating with leading universities around the globe to build the next generation of machine learning and optimization models, and is advancing the use of quantum machine learning to build models for better credit, investment and portfolio decisions.
  • Zurich Instruments: Zurich Instruments is a test and measurement company based in Zurich, Switzerland, with the mission to progress science and help build the quantum computer. It is developing state-of-the-art control electronics for quantum computers, and now offers the first commercial Quantum Computing Control System linking high-level quantum algorithms with the physical qubit implementation. It brings together the instrumentation required for quantum computers from a few qubits to 100 qubits. They will work on the integration of IBM Q technology with the companies’ own electronics to ensure reliable control and measurement of a quantum device while providing a clean software interface to the next higher level in the stack.