Quantum Watch: Neutral Atoms Draw Growing Attention as Promising Qubit Technology

By John Russell

January 25, 2022

Currently, many qubit technologies are vying for prominence in quantum computing. So far, superconducting (IBM, Google) and trapped-ion (IonQ, Quantinuum) qubits have dominated the conversation. Microsoft’s proposed topological qubit, which relies on the existence of a still-unproven particle (the Majorana), may be the most intriguing. Recently, neutral atom approaches have quickened pulses in the quantum community. Advocates argue the technology is inherently more scalable, offers longer coherence times (key for error correction), and point to proof-of-concept 100-qubit systems that have already been built.

Atom Computing, founded in 2018, is one of the neutral atom quantum computing pioneers. Last week, it announced a successful Series B funding round ($60M). It currently has a 100-qubit system (Phoenix) and says it will use the latest cash infusion to build its launch system (Valkyrie), which will be much larger in qubit count, will likely be formally announced in 2022, and will come to market in 2023. It’s also touting a 40-second coherence time (nuclear spin qubit), which the company says is a world record.

Rob Hays, CEO, Atom Computing

“Our first product will launch as a cloud service initially, probably with a partner. And we know which one or two partners [we] want to go with, and just haven’t signed any contracts yet. We’re still kind of negotiating the T’s and C’s,” said Rob Hays, Atom Computing’s relatively new (July ’21) CEO and president. Most recently, Hays was the chief strategy officer at Lenovo. Before that, he spent 20-plus years in Intel’s datacenter group, working on Xeon, GPU and OmniPath products.

Company founder Ben Bloom shifted to CTO with Hays’ arrival. Bloom’s Ph.D. work was cold atom quantum research, done with Jun Ye at the University of Colorado. Notably, Ye recently won the 2022 Breakthrough Prize in Fundamental Physics[i] for “outstanding contributions to the invention and development of the optical lattice clock, which enables precision tests of the fundamental laws of nature.” Ye is on Atom’s science advisory board. Company headcount is now roughly 40, and the next leg in its journey is to bring a commercial neutral atom system to market.

If this formula sounds familiar, that’s because it is. There’s been a proliferation of quantum computing startups founded by prominent quantum researchers who, after producing proof-of-concept systems, bring on veteran electronics industry executives to grow the company.

All of the newer emerging qubit technologies are drawing attention. Quantum market watcher Bob Sorensen of Hyperion Research told HPCwire, “I am somewhat of a fan of neutral atom qubit technology. It’s room temp, it has impressive coherence times, but to me, most importantly, it shows good promise for scaling to a large single qubit processor. Atom, along with France’s Pasqal, is committed to the technology, and they are getting the funding and additional support to keep on their development and deployment track. So we do need to keep an eye on their progress.”

So what is neutral atom-based quantum computing? Bloom and Hays recently briefed HPCwire on the company’s technology and plans.

Broadly, neutral atom qubit technology shares much with trapped ion technology — except, obviously, the atoms aren’t charged. Instead of confining ions with electromagnetic forces, neutral atom approaches use light to trap atoms and hold them in position. The qubits are the atoms themselves, whose nuclear spin states (levels) are manipulated to set the qubit state. Atom has published a recent paper (Assembly and coherent control of a register of nuclear spin qubits) describing its approach.

Ben Bloom, founder and CTO, Atom Computing

Bloom said, “We use atoms in the second column of the periodic table (alkaline earth metals). All those atoms share properties. We use strontium, but it doesn’t actually have to be strontium; it could have been any [element] in that column. Similar to trapped ion technology, we capture single atoms, and we optically trap them. We create this optical trapping landscape with lasers. The nice thing about this is every atom you trap and put in those light traps is exactly the same. The coherence times you can make are really, really long. It was kind of only theorized you could create them that long, but now we’ve shown that you can.”

Hays describes the apparatus: “We put some strontium crystals [in] a little oven next to the vacuum chamber. There’s a little tube that [takes in the] gaseous form of strontium as it gets heated up and off-gassed. The atoms are sucked into the vacuum chamber. Then we shine lasers through the little windows in the vacuum chamber to [form] a grid of light, and the little individual atoms that are floating around in there get stuck, like a magnet, to those spots of light. Once we get them stuck in space, we can actually move them if we want, and we can write quantum information with them using a separate set of lasers at a different wavelength. We’ve got a camera that sits under the microscope objective in the top of the system that reads out the results.

“All that gets fed back into a standard rack of servers that’s running our software stack, you know, the classic compute system off to the side. That’s running our operating system, our scheduler, all the APIs for access, programming, data storage. That rack also has our proprietary radio frequency control system, which is how we control the lasers. We’re basically just controlling how many spots of light there are, and what the frequency, phase and amplitude of those spots of light are. People interact with it remotely.”

It’s pretty cool. Think of a cloud of atoms trapped in the vacuum chamber. Lasers are shined through the cloud along the X and Y axes (2D). Wherever the beams intersect, a sticky spot is created, and nearby atoms get stuck in those spots. You don’t get 100 percent of the sticky spots filled on the first pass, but Atom has demonstrated the ability to move individual atoms to fill in open spots. The result is a 10×10 array of trapped neutral atoms that serve as qubits at addressable locations. The trapped atoms are spaced four microns apart, which is far enough to prevent nuclear spin (qubit state) interaction.
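The load-then-rearrange step can be pictured with a toy simulation. Everything here (the loading probability, grid sizes, and the `load_and_rearrange` routine) is invented for illustration; it is not Atom Computing’s actual algorithm, just the general idea of over-provisioning trap sites and moving surplus atoms into vacancies:

```python
import random

def load_and_rearrange(side=10, fill_prob=0.55, seed=7):
    """Toy model: atoms load stochastically into an optical-trap grid,
    then surplus atoms are moved to fill vacancies in the target region."""
    random.seed(seed)
    # Over-provision trap sites so enough atoms load to fill a side x side target.
    sites = [(r, c) for r in range(side * 3) for c in range(side)]
    loaded = {s for s in sites if random.random() < fill_prob}
    target = [(r, c) for r in range(side) for c in range(side)]

    moves = 0
    spares = sorted(loaded - set(target))        # atoms outside the target region
    for site in target:
        if site not in loaded and spares:        # vacancy: pull in a spare atom
            loaded.discard(spares.pop())
            loaded.add(site)
            moves += 1
    filled = sum(1 for s in target if s in loaded)
    return filled, moves

filled, moves = load_and_rearrange()
print(f"target sites filled: {filled}/100 after {moves} moves")
```

With enough spare sites, the target array ends up defect-free after a modest number of moves, which is the essence of the rearrangement demonstration described above.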

Entanglement between qubits is accomplished by pumping the atoms up into a Rydberg state. This basically puffs up the atoms’ outer electron shell, enlarging their spatial footprint and permitting entanglement with neighbors. This is how Atom Computing implements two-qubit gates.
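As a generic illustration of why a controlled-phase interaction (the effect a Rydberg blockade provides) is enough for entanglement, here is a minimal two-qubit state-vector sketch. This is textbook gate algebra, not Atom Computing’s actual control sequence:

```python
import math

# Two-qubit state vector; amplitudes ordered |00>, |01>, |10>, |11>.

def h(state, q):
    """Apply a Hadamard gate to qubit q (0 or 1): |b> -> (|0> + (-1)^b |1>)/sqrt(2)."""
    s = 1 / math.sqrt(2)
    mask = 1 << (1 - q)                  # qubit 0 is the high-order bit
    out = [0.0] * 4
    for i, amp in enumerate(state):
        b = (i & mask) != 0              # current value of the target bit
        base = i & ~mask                 # index with the target bit cleared
        out[base] += s * amp                             # |0> component
        out[base | mask] += s * amp * (-1 if b else 1)   # (-1)^b |1> component
    return out

def cz(state):
    """Controlled-Z: phase-flip the |11> amplitude only."""
    out = list(state)
    out[3] = -out[3]
    return out

state = [1.0, 0.0, 0.0, 0.0]             # start in |00>
state = h(state, 0)
state = h(state, 1)
state = cz(state)
state = h(state, 1)                      # H-CZ-H sandwich entangles the pair
print([round(a, 3) for a in state])      # amplitudes of the Bell state (|00>+|11>)/sqrt(2)
```

The conditional phase flip, combined with single-qubit rotations, turns a product state into a maximally entangled one; that conditional phase is exactly what the blockade between neighboring Rydberg atoms supplies.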

“To scale the system, we just create more spots of light. So instead of a 10 by 10 array, if we went to a 100 by 100 array of lasers, then we get to 10,000 qubits, and if we went to 1,000 by 1,000 we get to a million qubits,” said Hays. “So, at four microns [apart], to get to a million qubits we’re still less than a millimeter on a side in a cube. And it’s all, again, wireless control. We don’t have to worry about cabling up the different chips together and then putting them in a dilution refrigerator and all that kind of stuff; we just put more spots of light in the same vacuum chamber and read them with the control systems and the cameras.”
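Hays’ numbers are easy to check. A quick sketch, taking only the four-micron spacing from the article as given (the `array_stats` helper is ours, purely for illustration):

```python
def array_stats(side, spacing_um=4):
    """Qubit count and linear extent (in mm) of a square side x side trap array."""
    return side * side, side * spacing_um / 1000

for side in (10, 100, 1000):
    qubits, extent_mm = array_stats(side)
    print(f"{side}x{side}: {qubits:,} qubits, ~{extent_mm} mm per side")

# "Less than a millimeter on a side in a cube": a million qubits arranged as a
# 100x100x100 3D array at 4-micron spacing spans only ~0.4 mm per side.
print(f"100x100x100 cube: {100 * 4 / 1000} mm per side")
```

Note that a million qubits laid out in 2D (1,000 × 1,000) would span about 4 mm per side; the sub-millimeter figure Hays quotes corresponds to the 3D cube arrangement.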

Hays noted that achieving 3D arrays is possible, but much trickier; currently, Atom Computing is focused on 2D arrays. The current 100-qubit system was built with lots of hand-tuning and was intended for experimental flexibility. Moving forward, said Hays, CAD tools, with an emphasis on manufacturability and efficiency of use, will guide development of the Valkyrie system. Bloom and Hays declined to say how many qubits it will have.

It will be interesting to watch the ongoing jostling among qubit technologies.

Sorensen said, “I still think it is too early to start picking winners and losers in the qubit modality race… and isn’t that part of the fun right now? In reality, there are lots of variables to consider besides qubit count and other qubit-specific technical parameters. To me, increasingly, the goal to focus on is not how to build a qubit, but how to build a processor. That is why when I look at a modality, I consider its overall architectural potential: does it scale, can you do reasonable I/O to the classical side, does it have ready solutions for networking, and does it require esoteric equipment to manufacture and/or operate in a traditional compute environment?”

The issue, says Sorensen, is that there are many factors to consider here, so a specific modality may not be the only valid indicator of the eventual winner: “IBM, Quantinuum, Rigetti, and IonQ are quite visible in the sector, representing a range of modalities, but they recognize that they need to bring more to the table in terms of vision, experience, market philosophy, and end use relevance. The smart players know that it is entirely possible, as we have seen in the past, that the best pure technology does not always win in the final market analysis.”

Hays emphasizes that Atom Computing is a hardware company and will work with the growing ecosystem for other tools: “We’re focusing on the hardware and the necessary software levels – operating system, scheduler, APIs, etc. – that allow people to interact with the system. [For other needs] we’re working with the ecosystem. We’re going to support Qiskit (we support it internally), and for whichever cloud service provider we choose to go to market with, we’ll support their tool suite as well. Then there are companies like QC Ware, Zapata, Classiq and others that are building their own platforms. We’re going to be very partner-friendly.”

Atom Computing says it has early collaborators but it’s hard to judge progress without fuller public access to the system. It will be interesting to see just how big (qubit count) the forthcoming system ends up being, and also what benchmarks Atom Computing supplies to the community along the way.

Figure describing Atom Computing approach from its paper.

[i] The Breakthrough Prize in Fundamental Physics is awarded by the Fundamental Physics Prize Foundation, a not-for-profit organization dedicated to rewarding physicists involved in fundamental research. The foundation was founded in July 2012 by Russian physicist and internet entrepreneur Yuri Milner.

As of September 2018, it is the most lucrative academic prize in the world, worth more than twice the amount given to Nobel Prize awardees. The media have dubbed it the “XXI Century Nobel”.
