Q&A Part Two: ORNL’s Pooser on Progress in Quantum Communication

By John Russell

March 30, 2020

Quantum computing seems to get more than its fair share of attention compared to quantum communication. That’s despite the fact that quantum networking may be nearer to becoming a practical reality. In this second installment of HPCwire’s interview with Raphael Pooser, PI for DoE’s Quantum Testbed Pathfinder project and a member of Oak Ridge National Laboratory’s Quantum Information Science group, he discusses the state of quantum communication research.

Pooser notes, for example, that the lack of robust quantum repeaters remains an obstacle to creating a quantum internet, while the use of quantum key distribution (QKD[i]) to secure communications is already in limited use in both government and industry. He also emphasizes that, at least in theory, it is possible to create an unhackable quantum communication network; it just isn't easy to do. Pooser also offers some thoughts on the quantum hype cycle – it's not all bad, he says, and most companies have a realistic view of quantum's likely timetable.

HPCwire: I know there’s been a lot of work in quantum communications to make it more robust and cover longer distance. What’s happening in that area?

Raphael Pooser, ORNL

Raphael Pooser: That’s actually a big area of research in our quantum information science group. We have three teams – communication, sensing, and computing. We’ve found a lot of companies want to hear about quantum communications because they’re quite concerned with cyber security. At Oak Ridge we’ve even licensed products to startup companies that are specifically based around quantum cyber security. For example, we’ve licensed quantum random number generators to a start-up so they can build quantum communication devices whose security rests on physics rather than [classical] computational complexity.

Not a lot of people know quantum communication is a big area of interest to the U.S. electric grid. So again, back to DOE, but not the science arm of DOE; it’s the power and energy side of DOE that is responsible for protecting the electric grid. DOE is funding a lot of research at various national labs, [such as] Los Alamos, Oak Ridge, and Brookhaven, to study how to secure the grid with quantum key distribution (QKD). A lot of companies ask us about QKD. It’s one of the things, for example, Kaiser is interested in, and I’ll talk about that tomorrow as well [when I visit there].

HPCwire: What makes quantum communications secure? There seem to be conflicting claims about what quantum communications can and can’t do with regard to cyber-cryptography?

Raphael Pooser: There are limitations, and maybe what you’re picking up on are ways that people are hacking quantum key distribution. Quantum key distribution, or quantum communication generally, first came out a long time ago. One of the most popular schemes was invented in 1984, for example. Since then people have been trying to hack it, because the original claim was that it isn’t hackable. The key point is that it is unhackable by traditional classical means. You can’t do things like intercept-and-resend, man-in-the-middle eavesdropping, or cracking a code, because there’s no code to crack. But people resorted to interesting new ideas for hacking, basically physics-based hacking. Now people attack the physics of these systems and try to discover ways to steal the secret keys by making the physics-based assumptions upon which the security rests untrue. It’s a whole different type of hacking. There have been a few successful demonstrations of bona fide hacking of quantum key distribution systems using physics-based approaches.

The good news is there are unconditionally secure quantum key distribution and quantum communication schemes out there. The reason some implementations may not be absolutely secure under all conditions is that they relax some of the physics assumptions in order to make a more practically buildable system. The systems with true, fully unconditional security based on the physics are more difficult to build. They have, however, been built and demonstrated in the laboratory. So you really can build unconditionally secure systems; it’s just that they’re a bit harder.

HPCwire: Is quantum communication closer to practical reality and wide-spread use than quantum computing?

Raphael Pooser: That’s a very interesting question. In some ways, yes. In some ways, no. Here’s what I mean by that. If you have a quantum communications network that is regional, you know, not very long range, then absolutely it’s useful, and you can look around and see various quantum communication networks actually operating right now. [Think] secured voting results. You’ve probably heard of the example in Vienna, where they secured voting results by sending them via quantum communication down the street from the polling station to the Capitol building. And banks use it already. You can see quantum communication used in quite a few places.

Now, if you’re thinking about really widespread quantum communication, like a quantum internet, where you have a nationwide network of quantum communication from coast to coast, that is probably as far off as fault-tolerant quantum computing. So there are different levels of usability. You can network quantum mechanically right now on a regional scale fairly well. But national-scale, internet-level networking is very far off, and it’s because of a key requirement called a quantum repeater. That doesn’t exist yet.

HPCwire: I saw a recent report by researchers in China who were using atomic ensembles as a basis for repeaters. They reported more predictable results and better distance, around 50 miles, and it looked promising as a potential technology for quantum repeaters.

Raphael Pooser: What’s interesting about this result is they’re using a quantum memory. When you talk about an ensemble of atoms, you’re thinking about using them as a quantum memory. You couple the photons, the quantum information from the photons, into the atoms, and then you couple the quantum information back out of the atoms when you’re ready to proceed with the repeater operations. What’s interesting is integrating that [kind of quantum memory] into a quantum communications system. The distance of 50 miles itself is not so impressive; it doesn’t exceed any current record for repeater-less communication. The next step would be to incorporate the memory-based repeater into a system that goes a longer distance than you currently could without a repeater.

So there are two steps here. What they’ve done is step one, which is to integrate this quantum memory system into their communication system. They demonstrated the functionality, but not the distance. The next step is to show that it actually gets you a longer distance, because that’s what bridges the coasts in a quantum network: the quantum repeater. There’s a lot of work on repeaters right now in the U.S. as well [as China]. There’s a lot of quantum memory work going on, especially within DOE, [including] memory-based repeater work at Brookhaven. At Oak Ridge we are working on what we call memory-less repeaters. Those are repeaters that don’t require coupling into and out of atoms; they are all-optical devices.

HPCwire: Isn’t signal loss still a problem there?

Raphael Pooser: It’s a problem with every repeater, including the memory ensembles, because what’s important is how much quantum information you can get in and out. When you have lots of loss, one of the things you can do is try to correct for it using quantum error correction inside the repeater. Another way is to try to distill out a very high-quality quantum state that can still be used as a resource for communication. Those two are definitely paths forward to help correct for this information loss.

HPCwire: Will you present next week at the APS meeting? [This meeting was eventually cancelled as result of the COVID-19 pandemic]

Raphael Pooser: Not personally, but many of my team will be there. I’m actually on my way right now to a conference on the West Coast at Kaiser Permanente (health care). They want to know about quantum computing, and I’m on my way to talk to them about the general idea of quantum technology and what it could be good for, more broadly.

A rendering of IBM Q System One, the world’s first fully integrated universal quantum computing system, currently installed at the Thomas J. Watson Research Center. Source: IBM

HPCwire: Isn’t it a bit early still for real applications? What are your thoughts on controlling the hype, which seems to be everywhere? It still seems like it will be a long time before Kaiser Permanente will be able to use quantum computing.

Raphael Pooser: Maybe. It may be a long way off, and there is a question of hype. I think that we are in a state of high hype. We are in a state of high-risk, high-reward research, and it’s important for people to explain to the public that this is high-risk research. It’s not a 100 percent done deal that quantum computing, especially of the fault-tolerant type, is going to solve most problems of interest, or that it’s around the corner.

I think the reason the hype is high is because there have been some key results in recent years that encouraged everyone. Folks like Kaiser, people from health care companies to oil companies and gasoline producers, are interested in quantum computing because they’ve seen promising early results in areas like chemistry, nuclear physics, computational fluid dynamics, and also in AI and machine learning. What I like about this hype cycle is that it’s a substance-driven hype cycle. In other words, it’s driven by real scientific results. It’s definitely possible that things are getting overhyped right now. But at the same time, these companies don’t want to miss out on understanding what’s going on. A lot of the companies that are actively asking us what’s going on with quantum technology are coming in with their heads on straight. They are asking about quantum information science more generally, and they don’t ask, what can I do with it tomorrow? They ask, what does this mean for us in the future, because we don’t want to lose our competitive edge, even if it’s 10 years out?

I’ll just say something else about these companies. They’re not just interested in quantum computing. There are some nearer-term technologies they’re interested in, namely quantum sensing and, potentially, quantum networking or quantum communications. Quantum computing is a big driver of their interest, but they want to hear about these other quantum technologies, which are getting a lot less hype but are potentially very impactful as well.

HPCwire: Thanks for your time.

[i] (Brief QKD Backgrounder – Quantum key distribution is currently the main way to implement quantum-secured communications. In essence, a quantum system generates a random key, which is used to encrypt a message. The key is shared between parties in such a way that any attempt to discover the key – in physical terms, to measure it – causes detectable changes, so both parties immediately know a third party has tried to read it. That would trigger a resend cycle using a new key. Implementation schemes vary. Quantum cryptography is only used to produce and distribute the key, not to transmit any message data. Here are links to a few explanations: Wikipedia, Wired, the EC Quantum Flagship Project, and Physics.org.)
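To make the measurement-disturbance idea in the backgrounder concrete, here is a minimal, hypothetical sketch of a BB84-style key exchange in Python. It is purely illustrative, not ORNL's work or any production QKD implementation; the function bb84_run and all of its parameters are invented for this example. It shows the sifting step and how an intercept-and-resend eavesdropper pushes the observed error rate to roughly 25 percent, which is the detectable change both parties look for.

```python
# Illustrative BB84-style sketch (not a real QKD system): Alice encodes random
# bits in random bases, Bob measures in random bases, and they keep only the
# positions where their bases agree ("sifting"). An intercept-resend
# eavesdropper measures and re-sends each qubit, introducing errors the
# legitimate parties can detect.
import random

def bb84_run(n_qubits: int, eavesdrop: bool, seed: int = 0):
    rng = random.Random(seed)

    # Alice's random bits and bases (0 = rectilinear, 1 = diagonal).
    alice_bits = [rng.randint(0, 1) for _ in range(n_qubits)]
    alice_bases = [rng.randint(0, 1) for _ in range(n_qubits)]

    channel_bits = list(alice_bits)
    channel_bases = list(alice_bases)

    if eavesdrop:
        # Eve measures each qubit in a random basis and re-sends what she saw.
        for i in range(n_qubits):
            eve_basis = rng.randint(0, 1)
            if eve_basis != channel_bases[i]:
                channel_bits[i] = rng.randint(0, 1)  # wrong basis: random outcome
            channel_bases[i] = eve_basis  # re-sent qubit now carries Eve's basis

    # Bob measures in his own random bases.
    bob_bases = [rng.randint(0, 1) for _ in range(n_qubits)]
    bob_bits = []
    for i in range(n_qubits):
        if bob_bases[i] == channel_bases[i]:
            bob_bits.append(channel_bits[i])
        else:
            bob_bits.append(rng.randint(0, 1))  # wrong basis: random outcome

    # Sifting: keep only positions where Alice's and Bob's bases agree.
    sifted = [(a, b) for a, b, ab, bb in
              zip(alice_bits, bob_bits, alice_bases, bob_bases) if ab == bb]
    errors = sum(1 for a, b in sifted if a != b)
    qber = errors / len(sifted) if sifted else 0.0
    return len(sifted), qber

for eavesdrop in (False, True):
    length, qber = bb84_run(n_qubits=4000, eavesdrop=eavesdrop)
    print(f"eavesdropper={eavesdrop}: sifted key length={length}, error rate={qber:.2%}")
# Without Eve the error rate is ~0%; with intercept-resend it jumps to ~25%.
```

In practice the parties compare a random sample of their sifted bits over a classical channel; an error rate well above the channel's expected noise tells them the key has been observed and triggers the resend cycle described above.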
