A Q&A with Quantum Systems Accelerator Director Bert de Jong

By Aliyah Kovner

September 30, 2024

Editor’s Note: The five U.S. National Quantum Information Science Research Centers are all doing important and fascinating work. Unfortunately, we don’t get to hear about that work often enough. We’re reposting (below) a Q&A with Quantum Systems Accelerator (QSA) Director Bert de Jong that was published today on the Lawrence Berkeley National Lab website. He does a nice job highlighting broad and specific examples of QSA’s work. You may not know, for example, that QSA is the lone QIS center working with atom-based technologies. de Jong touches on QIS progress, QSA’s commercial and academic collaborations, and the real tasks being tackled by quantum computing.

de Jong says, “I’m a computational chemist by training, I do quantum research, but I also enjoy working with experimentalists solving real-world chemistry problems. One of the things I’m working on is how we can develop better materials to capture carbon from the air. The calculations that go into simulating materials for this purpose are very expensive. We would need access to exascale computers for pretty much months at a time, here at Berkeley Lab’s National Energy Research Scientific Computing Center (NERSC) or at Oak Ridge National Laboratory. Even with exascale computers, we greatly simplify the model of the reactions so we can run it on such a classical computer. Our hope is that we won’t have to reduce our models.”

Enjoy!

Quantum technologies may still be in development, but these systems are evolving rapidly and existing prototypes are already making a big impact on science and industry. One of the major hubs of quantum R&D is the Quantum Systems Accelerator (QSA), led by Lawrence Berkeley National Laboratory (Berkeley Lab). QSA is one of five National Quantum Information Science Research Centers funded by the U.S. Department of Energy (DOE) to develop technologies that can solve longstanding challenges in physics, chemistry, materials, and biology that can’t be addressed with classical computers.

QSA brings together 15 member institutions in North America, including Sandia National Laboratories as lead partner. By collaborating on all aspects of quantum technology, the center is helping shift the field from theories to real-world tools.

We spoke to QSA Director Bert de Jong to learn more about QSA’s progress in the past four years, exciting plans for the future, and what kind of breakthroughs might await as quantum systems grow and mature.

Q. How is QSA getting us closer to scalable, more efficient quantum computers that are less error-prone?

de Jong: We are doing research that advances quantum computing technologies and we do it in a unique way. We work on three of the most advanced qubit technologies: trapped ions, neutral atoms, and superconducting qubits. We are the only DOE National Quantum Initiative (NQI) center that pursues atom-based technologies. As a center, we focus on co-design of quantum technologies with control systems, algorithms, and end-user applications. And we are not just advancing the technology to get more qubits and bigger quantum computers. We’re also trying to improve the qubits themselves – the fundamental computational component of a quantum computer. We’re doing this by designing better materials for the qubits, more accurate controls, and more accurate measurements. There are many sources of “noise” that muddy the functioning of the system and make quantum computers very error-prone right now. And finding ways to address them is hard because nature is noisy!
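To make the noise problem concrete, here is a minimal, illustrative Python sketch (our own toy model, not a QSA method): it repeatedly applies a single-qubit depolarizing channel with an assumed per-gate error rate and tracks how state fidelity decays as circuits get deeper.

```python
# Toy model: fidelity decay of one qubit under a depolarizing channel.
# Illustrative only; the error rate is an arbitrary assumption, not a QSA figure.
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def depolarize(rho, p):
    """Apply a depolarizing channel with error probability p to density matrix rho."""
    return (1 - p) * rho + (p / 3) * (X @ rho @ X + Y @ rho @ Y + Z @ rho @ Z)

p = 1e-3                                          # assumed per-gate error rate
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho_ideal = np.outer(plus, plus.conj())

for depth in (10, 100, 1000):
    rho = rho_ideal.copy()
    for _ in range(depth):
        rho = depolarize(rho, p)                  # ideal gate omitted; only the noise acts on |+>
    fidelity = np.real(plus.conj() @ rho @ plus)
    print(f"depth {depth:5d}: fidelity {fidelity:.4f}")
```

Even a one-in-a-thousand error per operation visibly degrades a circuit a few hundred gates deep, which is why better materials, controls, and measurements matter so much.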

We also develop the algorithms that actually run on these quantum computers and study how they run with increasing qubit count. Again, all of this is done in a co-design fashion. So, researchers investigating all these different components of quantum systems work together in QSA. We want to advance quantum computers that are better suited to solve the scientific problems that are key to the Department of Energy’s mission.

Graduate student Ben Saarel adjusts the settings on a vacuum apparatus used to make trapped ion qubits on the UC Berkeley campus. Credit: Thor Swift/Berkeley Lab

Q. How has QSA already led to new scientific discoveries or new approaches to doing research?

de Jong: We have really advanced technology, and I’ll give you a couple of examples of how we’re using it now to do real science. For neutral atoms, we’ve reached quantum simulators that are the size of 250 atoms, which is 250 qubits – that’s really large. That team, led by Harvard University, is not just able to control 250 atoms in the right way, they’re also able to do scientific simulations with them in a very accurate way. Now, once you get to 250 qubits, you can actually look at a lot of interesting problems.
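A quick back-of-the-envelope calculation shows why 250 qubits counts as really large: simply storing the full quantum state of n qubits on a classical machine requires 2^n complex amplitudes (the 16-bytes-per-amplitude figure below is a standard double-precision assumption).

```python
# Back-of-the-envelope: why 250 qubits is beyond brute-force classical simulation.
# Assumes 16 bytes per complex amplitude (double precision); illustrative only.
n = 250
amplitudes = 2 ** n
bytes_needed = amplitudes * 16
print(f"2^{n} amplitudes   = {amplitudes:.3e}")
print(f"memory required ~= {bytes_needed:.3e} bytes")
# For comparison, an exabyte is 1e18 bytes -- dozens of orders of magnitude short.
```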

We have also done some modeling of what scientists call “many-body systems,” which include materials that have a very complicated quantum behavior. Understanding these materials is critical to developing next-generation batteries, catalysts, and solar cells. By simulating these systems, we can start to observe behaviors of superconducting materials, and measure multiple phases of matter, which is something that is not trivial to do in a real experimental setting or to compute on classical computers.
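As a minimal sketch of what a many-body calculation looks like (our own toy example, not the models the QSA teams actually run), the snippet below exactly diagonalizes a small transverse-field Ising chain and sweeps the field strength to show the crossover between ordered and disordered phases; note that the matrix dimension is 2^N, which is what makes larger systems classically intractable.

```python
# Minimal sketch of a many-body calculation: exact ground state of a small
# transverse-field Ising chain, H = -J sum Z_i Z_{i+1} - h sum X_i.
# Illustrative only; not the models used by the QSA teams.
import numpy as np

def op_on_site(op, site, n):
    """Embed a single-site operator at `site` in an n-spin chain via Kronecker products."""
    mats = [op if i == site else np.eye(2) for i in range(n)]
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

X = np.array([[0, 1], [1, 0]], dtype=float)
Z = np.array([[1, 0], [0, -1]], dtype=float)

def ising_hamiltonian(n, J, h):
    H = np.zeros((2 ** n, 2 ** n))
    for i in range(n - 1):
        H -= J * op_on_site(Z, i, n) @ op_on_site(Z, i + 1, n)
    for i in range(n):
        H -= h * op_on_site(X, i, n)
    return H

n = 8                                     # 2^8 = 256 states; the cost doubles per added spin
for h in (0.2, 1.0, 2.0):                 # weak, near-critical, and strong transverse field
    H = ising_hamiltonian(n, J=1.0, h=h)
    energies, states = np.linalg.eigh(H)
    ground = states[:, 0]
    # average <Z_i Z_{i+1}> correlation signals the ordered vs. disordered phase
    zz = np.mean([ground @ (op_on_site(Z, i, n) @ op_on_site(Z, i + 1, n)) @ ground
                  for i in range(n - 1)])
    print(f"h = {h:.1f}: ground energy {energies[0]:.3f}, <ZZ> = {zz:.3f}")
```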

The neutral atoms can also be used as quantum sensors. Our team at JILA recently used neutral atoms to make very accurate measurements of the gravitational redshift within a very small atomic sample. This is unique; it’s a new way of sensing for fundamental physics questions.
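To get a sense of how small that signal is, the standard weak-field formula for the fractional frequency shift between two heights is Δf/f ≈ gΔh/c². A quick calculation for a millimeter of height difference (an illustrative number, not the experiment’s exact geometry):

```python
# Fractional gravitational redshift between two points separated by height dh:
# df/f ~ g * dh / c^2 (weak-field approximation). The 1 mm separation is an
# illustrative choice, not the exact geometry of the JILA measurement.
g = 9.81          # m/s^2
c = 2.998e8       # m/s
dh = 1e-3         # 1 millimeter
shift = g * dh / c**2
print(f"fractional frequency shift ~ {shift:.2e}")   # on the order of 1e-19
```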

Beyond that, people are starting to use the atoms for quantum simulations of particle physics and high energy physics models that are going to become difficult to do on classical computers.

For trapped ion-based quantum computers, our partner Sandia developed the first system that can potentially hold 200 ions, which they have shipped to Duke University to integrate into their system. The next stage is exploring how to control the ions at that scale, and then we can start to do simulations and science with the system.

Using neutral atom devices, Harvard has been looking at simulating open quantum systems. That’s important, because nature is an open quantum system. Everything is noisy outside of the closed systems we create. Understanding and modeling those real, noisy physical systems – like a molecule or material that is interacting with light or electromagnetic fields – are critical for better control over the chemical processes and materials that are relevant for a lot of science applications.
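As a minimal illustration of what “open quantum system” means in practice (our own toy model, not Harvard’s simulation), the sketch below evolves a driven two-level atom that can spontaneously decay into its environment, using a crude Euler integration of the Lindblad master equation.

```python
# Toy open quantum system: a driven two-level atom with spontaneous decay,
# evolved with a simple Euler integration of the Lindblad master equation.
# Parameters are arbitrary illustrative choices.
import numpy as np

sm = np.array([[0, 1], [0, 0]], dtype=complex)   # lowering operator |g><e|
sp = sm.conj().T
X = np.array([[0, 1], [1, 0]], dtype=complex)

omega = 1.0      # drive strength (Rabi frequency)
gamma = 0.2      # decay rate into the environment
dt, steps = 0.01, 5000

H = 0.5 * omega * X
rho = np.array([[1, 0], [0, 0]], dtype=complex)  # start in the ground state

for _ in range(steps):
    coherent = -1j * (H @ rho - rho @ H)
    decay = gamma * (sm @ rho @ sp - 0.5 * (sp @ sm @ rho + rho @ sp @ sm))
    rho = rho + dt * (coherent + decay)

print("excited-state population after evolution:", np.real(rho[1, 1]).round(3))
```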

Teams in QSA have also done similar work with superconducting qubits. The team at Berkeley Lab is designing and fabricating new types of qubits, called grid qubits, that are much less prone to noise. The team led by MIT and MIT Lincoln Laboratory is focused on scaling superconducting systems, and recently used advanced superconducting devices to explore how entanglement works in complex quantum systems.

Sandia National Laboratories’ Enchilada Trap. Credit: Sandia Labs

Q. What about overlap with other fields? Tell me about your field of applying quantum computers to chemistry.

de Jong: Chemistry has long been touted as one of the early domains to take advantage of quantum computers. But it’s not easy to run chemistry calculations on a quantum computer. That’s why we have theorists and experimentalists working together to design chemistry simulations that get the most out of quantum computers. Let me give you one example of a recent chemistry experiment. The QSA team at Duke used their trapped ion quantum computer to simulate a nuclear magnetic resonance (NMR) experiment on a molecule, which is actually funny because an NMR experiment is effectively a quantum experiment too. If you’re wondering what NMR is, it’s pretty similar to an MRI machine in a hospital. So, using a quantum computer to model a quantum system is a logical way to do it. This is what Richard Feynman really envisioned for quantum computers – modeling quantum systems.
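The basic trick behind such simulations is Trotterization: approximating evolution under a spin Hamiltonian by alternating short evolutions under its individual terms, each of which maps onto native quantum gates. Below is a minimal NumPy sketch for two coupled spins (a generic toy Hamiltonian and parameters of our own choosing, not Duke’s NMR model), comparing a first-order Trotter approximation against exact evolution.

```python
# Minimal sketch of Trotterized time evolution for two coupled spins, the basic
# idea behind simulating NMR-like spin dynamics on a quantum computer.
# The Hamiltonian and parameters are generic toy choices, not Duke's model.
import numpy as np
from scipy.linalg import expm

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# H = h1*Z1 + h2*Z2 + J*Z1Z2 + b*(X1 + X2): local fields plus a ZZ coupling
h1, h2, J, b = 0.8, 0.5, 0.3, 0.4
HZ = h1 * np.kron(Z, I2) + h2 * np.kron(I2, Z) + J * np.kron(Z, Z)
HX = b * (np.kron(X, I2) + np.kron(I2, X))
H = HZ + HX

t, n_steps = 2.0, 100
dt = t / n_steps

U_exact = expm(-1j * H * t)
U_step = expm(-1j * HX * dt) @ expm(-1j * HZ * dt)   # one first-order Trotter step
U_trotter = np.linalg.matrix_power(U_step, n_steps)

psi0 = np.array([1, 0, 0, 0], dtype=complex)          # both spins up
overlap = abs(np.vdot(U_exact @ psi0, U_trotter @ psi0)) ** 2
print(f"Trotterized vs. exact evolution overlap: {overlap:.6f}")
```

On a quantum computer, each short evolution under HZ or HX would be implemented as a layer of gates rather than a matrix exponential, but the approximation being made is the same.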

Q. How are Centers like QSA working with industry and the wider research community to share advances, collaborate, and grow the field as a whole?

de Jong: We have many connections with industry, because the technological and engineering advances being made here, at Sandia, at MIT Lincoln Laboratory, and at our other partners draw interest from a range of companies.

One notable example of how we’re helping the field is Berkeley Lab’s work advancing QubiC, an open-source, FPGA-based qubit control system, which is being adopted by academic institutions, among others.

And it goes both ways, actually. QSA is supporting tech transfer of QSA research and inventions into industry as well as supporting the creation of companies. I mentioned our advances in neutral atoms; most of that work was done at Harvard. QuEra is a company built on the pioneering research at Harvard. So, a lot of the QSA technology has now shown up in a commercial offering by QuEra, and there is a continued research partnership. MIT, another QSA member, has been working on superconducting qubits, and they just started a spinoff company called Atlantic Quantum. We also work with industry, such as IonQ, a company that commercializes trapped ion quantum computers used for basic research and by other commercial entities like Microsoft, NVIDIA, and General Electric. They have a long history of getting trapped ion systems from Sandia.

Gang Huang and Yilun Xu from Berkeley Lab’s ATAP division led the development of QubiC, an open-source classical control system for quantum processors. Credit: Thor Swift/Berkeley Lab

Q. Data encryption is talked about a lot as a commercial application for quantum systems, but what are some other uses that people are hoping to bring to fruition soon?

de Jong: In the next couple of years, we should be able to start to do simulations of materials in a way that really helps industry develop better energy technologies, as I mentioned earlier, and potentially better materials for classical computing. I also think a lot of companies are definitely gunning for applications in life science, where there are a lot of potential applications, like drug discovery for example. But that’s further out.

Quantum computers are also very good at optimization. Think of the electric grid, managing airplane traffic, road traffic, financial transactions, or anything that operates as a complex network of systems – these are very complicated to optimize, and classical computers have a hard time with them.
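A concrete example of this kind of combinatorial optimization is Max-Cut: split a network’s nodes into two groups so that as many connections as possible cross between them. The exact classical approach checks every split, which scales as 2^n; quantum heuristics such as QAOA target precisely this class of problem. Here is a tiny brute-force sketch on a toy graph of our own:

```python
# Brute-force Max-Cut on a tiny toy graph, to illustrate the kind of
# combinatorial optimization problem quantum heuristics (e.g., QAOA) target.
# The graph is an arbitrary example; checking every split scales as 2^n.
from itertools import product

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]   # a 4-node toy network
n = 4

best_cut, best_split = -1, None
for bits in product([0, 1], repeat=n):             # every way to split the nodes
    cut = sum(1 for u, v in edges if bits[u] != bits[v])
    if cut > best_cut:
        best_cut, best_split = cut, bits

print(f"best cut = {best_cut}, partition = {best_split}")
# With n nodes there are 2^n splits -- fine at n = 4, hopeless at n = 400.
```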

So, there are many areas that could benefit from quantum computing and sensing, as well as quantum networking, which is the very nascent field of creating powerful networks of quantum computers like we have networks of classical computers. Those three technologies have the potential to really make a step change in how our society and economy operate. Are we there yet? No. But that’s why I would say almost every Fortune 500 company has at least a couple of people looking at how quantum can help them.

Q. One of the huge draws of quantum computers is the potential to solve problems that are intractable with current computing technologies. What are some examples of new types of research that you are excited about?

de Jong: Oh, so many. I’m a computational chemist by training, I do quantum research, but I also enjoy working with experimentalists solving real-world chemistry problems. One of the things I’m working on is how we can develop better materials to capture carbon from the air. The calculations that go into simulating materials for this purpose are very expensive. We would need access to exascale computers for pretty much months at a time, here at Berkeley Lab’s National Energy Research Scientific Computing Center (NERSC) or at Oak Ridge National Laboratory. Even with exascale computers, we greatly simplify the model of the reactions so we can run it on such a classical computer.

Our hope is that we won’t have to reduce our models. We can actually start running full models of reactions we’re interested in on a quantum computer. This comes with the additional benefit of saving a lot of power, as classical supercomputers require a great deal of energy to operate.

There are a lot of other fields with similar challenges, including high-energy physics, where researchers hope to use quantum tools to increase our understanding of the foundational laws of the universe.
