The Race to Ensure Post-Quantum Data Security

By John Russell

July 21, 2022

Fault-tolerant quantum computers won’t exist for years – a decade is the most common estimate. When they do arrive, thanks to Shor’s now-famous algorithm, they will be able to crack the most widely used encryption methods, which rest on the hardness of factoring and related problems. Earlier this month, the National Institute of Standards and Technology (NIST) settled on four algorithms – one for public-key encryption/key establishment (a KEM, or key-encapsulation mechanism) and three for digital signatures – based on lattice problems and hash functions, for incorporation into new post-quantum encryption standards.
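
To make the factoring dependence concrete, here is a toy textbook-RSA sketch in Python (illustrative parameters only; real deployments use 2048-bit-plus keys and padding): the public modulus is a product of two secret primes, and anyone who can factor it – which is exactly what Shor’s algorithm would do efficiently – can reconstruct the private key.

```python
# Toy textbook RSA: security rests entirely on the difficulty of
# factoring n. Tiny primes for illustration only.
p, q = 61, 53            # secret primes
n = p * q                # public modulus: 3233
phi = (p - 1) * (q - 1)  # derivable only if you know p and q
e = 17                   # public exponent, coprime with phi
d = pow(e, -1, phi)      # private exponent (modular inverse, Python 3.8+)

msg = 42
cipher = pow(msg, e, n)          # encrypt with the public key (e, n)
assert pow(cipher, d, n) == msg  # decrypt with the private key d

# Factoring n (trivial here, infeasible classically at real key sizes,
# efficient on a fault-tolerant quantum computer via Shor) recovers d:
f = next(c for c in range(2, n) if n % c == 0)
d_cracked = pow(e, -1, (f - 1) * (n // f - 1))
assert pow(cipher, d_cracked, n) == msg
```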

These are deliverables of NIST’s post-quantum cryptography standardization project (PQC), begun in 2016 and involving multiple rounds of submissions by industry, academia, and public entities, and assessment by NIST. This was the third round. A final fourth round is planned to consider four more algorithms.

NIST has issued a thorough report detailing the PQC process, sharing, for example, benchmark data across multiple processor types, and explaining its rationale for the selections. Three selection criteria were used: 1) security (most important), 2) cost and performance, and 3) algorithm and implementation characteristics. The selections themselves are hardly news; the quantum community and virtually all enterprise data security professionals have been closely tracking NIST’s PQC efforts.

Even as NIST works to formalize the new standards, it has begun a new project – Migration to Post-Quantum Cryptography – in collaboration with industry to develop tools and migration practices to protect data. That project is being run by NIST’s National Cybersecurity Center of Excellence (NCCoE). Here’s a snapshot of the program’s main goals:

  • Demonstrate the use of automated discovery tools to identify where quantum-vulnerable public-key algorithms are in use, in which dependent systems, and for what purposes (a minimal discovery sketch follows this list).
  • Once the public-key cryptography components and associated assets in the enterprise are identified, prioritize the applications that need to be migrated first.
  • Finally, the project will describe systematic approaches for migrating from vulnerable algorithms to quantum-resistant algorithms across different types of organizations, assets, and supporting technologies.
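
As a minimal sketch of the first goal, the following Python snippet scans PEM certificates with the open-source cryptography library and flags quantum-vulnerable key types. The directory path is hypothetical, and real discovery tools go much further, inspecting protocols, application code, and key stores rather than just certificates.

```python
# Sketch of automated discovery: walk a directory of PEM certificates
# and flag public keys based on quantum-vulnerable algorithms.
from pathlib import Path
from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import rsa, ec, dsa

VULNERABLE = (rsa.RSAPublicKey, ec.EllipticCurvePublicKey, dsa.DSAPublicKey)

def scan(cert_dir: str) -> None:
    for pem in Path(cert_dir).glob("**/*.pem"):
        cert = x509.load_pem_x509_certificate(pem.read_bytes())
        key = cert.public_key()
        if isinstance(key, VULNERABLE):
            print(f"{pem}: {type(key).__name__} -- quantum-vulnerable")

scan("/etc/ssl/certs")  # hypothetical path; point at your own cert store
```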

NIST issued a Cooperative Research and Development Agreement (CRADA) to work on the project and last week disclosed the list of vendors who’ve signed on to participate so far: Amazon Web Services, Inc. (AWS), Cisco Systems, Inc., Crypto4A Technologies, Inc., Cryptosense SA, InfoSec Global, ISARA Corporation, Microsoft, Samsung SDS Co., Ltd., SandboxAQ, Thales DIS CPL USA, Inc., Thales Trusted Cyber Technologies, and VMware, Inc.

Bottom line: The race to develop tools to protect data from quantum computer-based attacks has begun in earnest.

It’s been reported that more than 20 billion devices will need their software upgraded to PQC before quantum computers can crack RSA encryption. “Adversaries are already engaged in Store Now Decrypt Later (SNDL) attacks — stealing and storing encrypted data (e.g., financial records, intellectual property, medical records, etc.) to crack and exploit later when quantum computers become readily available,” says Google spinout and CRADA participant SandboxAQ.

The first step, of course, was selection of the first set of PQC algorithms, intended for public-key encryption/key establishment (KEMs) and digital signatures. HPCwire recently talked with Vadim Lyubashevsky, a prominent cryptographer and IBM researcher whose contributions have been important to the NIST PQC effort. Lyubashevsky provided insight into the development and characteristics of effective encryption.

The history of cryptography is fascinating. It blends political and industrial intrigue with clever mathematics, and it has affected the outcomes of wars. The Allied breaking of Germany’s Enigma machine ciphers during World War II is perhaps the most dramatic modern example. Cryptography features an exotic vocabulary – one-way functions, computational hardness, average-case and worst-case hardness, NP-completeness, supersingular isogeny key exchange, and lattice problems – that is opaque to most except cryptographers and mathematicians. Yet securing data and communications is fundamental to so many areas of life.

Interestingly, most modern encryption schemes begin as problems that mathematicians cannot solve efficiently; RSA’s factoring-based encryption is the canonical example.

Lyubashevsky said, “The way we gauge [cryptographic] strength is by having people try to solve the underlying mathematical problem. If the community dedicates a lot of effort to it for a few decades and it’s not successful, then we think that it’s secure, the underlying mathematical problem. Then we build cryptography, based on that problem. It’s the reason we think that factoring is secure against [attack from] classical computers; people have been trying to solve factoring for over 40-50 years and haven’t been successful so we think it’s okay.”

“The problems that they originally came up with in the 1970s – factoring, discrete logarithms, and all the things about elliptic curves – stayed classically hard. Of course, better algorithms for factoring appeared, but in some sense there wasn’t some catastrophic break where all of a sudden somebody just broke it. But once a [fault-tolerant] quantum computer exists, it will be completely broken.”

“The same thing with lattice algorithms. People have been trying for a shorter amount of time, let’s say seriously trying for 20-25 years, to break these underlying mathematical problems,” he said. So far, no game-changing algorithm for solving lattice problems on a quantum computer has arisen.

The lattice problem, upon which three of the NIST PQC selections are based, is commonly illustrated by the knapsack problem, which dates back to the late 1890s. This description is from Wikipedia: “Given a set of items, each with a weight and a value, determine the number of each item to include in a collection so that the total weight is less than or equal to a given limit and the total value is as large as possible. It derives its name from the problem faced by someone who is constrained by a fixed-size knapsack and must fill it with the most valuable items. The problem often arises in resource allocation where the decision-makers have to choose from a set of non-divisible projects or tasks under a fixed budget or time constraint, respectively.”
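
As a concrete illustration (a toy brute-force solver, not NIST’s construction), the only generic way to solve a 0/1 knapsack instance exactly is to examine subsets of items, and the number of subsets doubles with every item added, which is the kind of exponential blow-up that hard-problem-based cryptography leans on:

```python
# Brute-force 0/1 knapsack: try all 2^n subsets of items. Fine for a
# handful of items, hopeless as n grows -- the count doubles per item.
from itertools import combinations

items = [(12, 4), (2, 2), (1, 1), (1, 2), (4, 10)]  # (weight, value)
capacity = 15

best_value, best_subset = 0, ()
for r in range(len(items) + 1):
    for subset in combinations(items, r):
        weight = sum(w for w, _ in subset)
        value = sum(v for _, v in subset)
        if weight <= capacity and value > best_value:
            best_value, best_subset = value, subset

print(best_value, best_subset)  # 2^5 = 32 subsets here; 2^100 is infeasible
```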

IBM Research has even published a short video describing the knapsack problem and its relevance to quantum cryptography.

Lattice reduction algorithms can be used to tackle the problem but the computational resources required are impractical. Wikipedia, once again: “The conjectured intractability of such problems is central to the construction of secure lattice-based cryptosystems: Lattice problems are an example of NP-hard problems which have been shown to be average-case hard, providing a test case for the security of cryptographic algorithms. In addition, some lattice problems which are worst-case hard can be used as a basis for extremely secure cryptographic schemes. The use of worst-case hardness in such schemes makes them among the very few schemes that are very likely secure even against quantum computers.”

Hash-function cryptography likewise has a lengthy history. Hash functions are said to be one-way functions, meaning it is practically infeasible to invert or reverse the computation. Ideally, the only way to find a message that produces a given hash is to attempt a brute-force search of possible inputs to see if they produce a match, or to use a rainbow table of matched hashes.
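
Here is that brute-force search in miniature, using Python’s standard hashlib (the target string is a made-up example): inverting the hash of a three-letter input takes a few thousand tries, but every added character multiplies the work, and a full 256-bit input space is utterly out of reach.

```python
# One-wayness by example: the only generic way to invert a hash is to
# try inputs until one matches. Feasible for a tiny input space only.
import hashlib
from itertools import product
from string import ascii_lowercase

target = hashlib.sha3_256(b"qcx").hexdigest()  # pretend we only see this

for candidate in product(ascii_lowercase, repeat=3):  # 26^3 = 17,576 tries
    guess = "".join(candidate).encode()
    if hashlib.sha3_256(guess).hexdigest() == target:
        print("preimage found:", guess)               # b'qcx'
        break
# Against arbitrary 256-bit inputs the same search needs ~2^256 tries.
```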

Lyubashevsky points out that work on these difficult mathematical problems often starts out seeking solutions for useful applications.

“These are natural mathematical problems. If somebody does find a fast algorithm for them, it will have a positive effect in areas like optimization,” he said. “So these are questions people are interested in solving positively. But since they couldn’t, you know, we base cryptography on it. As far as quantum computing goes, it’s kind of the same story. People who specialize in quantum algorithms know these lattice problems. A lot of these lattice cryptography problems originated from people working in quantum algorithms trying to find new types of [problems] that a quantum computer can solve, and they didn’t succeed.”

In its PQC program, NIST asked for security equivalent to AES-128 and AES-192 (AES, the Advanced Encryption Standard) against quantum computers; the 128 and 192 refer to key sizes in bits. A still-higher AES-256-equivalent category was also defined (see the strength categories at the end of the article), though NIST encouraged submitters to focus on the lower categories.
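
For a rough sense of scale (a back-of-the-envelope illustration, not NIST’s cost model): a classical brute-force key search on AES-128 takes roughly 2^128 trials, while Grover’s quantum search algorithm would cut that to on the order of 2^64 evaluations, effectively halving a symmetric key’s bit strength. A few lines of Python make the pattern visible:

```python
# Rough query counts for exhaustive key search (illustration only; the
# real NIST categories also weigh gate counts, memory, depth and energy).
for bits in (128, 192, 256):
    print(f"AES-{bits}: classical ~2^{bits} trials, "
          f"Grover ~2^{bits // 2} quantum evaluations")
```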

The PQC program included five security strength categories “based on the computational resources required to perform certain brute-force attacks against the existing NIST standards for AES and SHA in a variety of different models of the cost of computation, both classical and quantum…The idea is that in order to meet, for example, category 1, the best attack violating the security definition of a parameter set should cost more than a brute-force key search attack on a single instance of AES-128, according to any plausible assumption regarding the relative cost of the various computational resources involved in a real-world attack.[i]”  (A list of the five security strengths is included at the end of the article.)

As acknowledged by NIST, there’s room for some disagreement about what constitutes relative computational cost – “including quantum gates, classical gates, quantum memory, classical memory, hardware, energy, and time” – and about how much of a given resource an attack actually requires.

Talking about the submissions to NIST, Lyubashevsky said, “We had to provide them the parameters for which we believe these [lattice] problems are secure at these levels. Now, of course, it’s a bit messy. Things do change a bit even during the standardization effort. There were attacks that lowered security by, you know, five bits or 10 bits. This does require us to increase parameters.”

This type of ongoing change has happened with factoring schemes, said Lyubashevsky, “because the original things that were proposed in the 1970s can be broken on a cell phone now, and not just because better technology exists, but because better algorithms have been developed.”

Focusing on the NIST PQC effort, he said, “All these lattice algorithms in this [NIST PQC] competition kind of existed before. The main positive effect of the standardization effort is that it got a lot more cryptanalysts looking at these problems. I think most cryptanalysts work for government agencies like the NSA, GCHQ, and places like that.”

Given all the scrutiny, Lyubashevsky thinks we can be confident about the security of the PQC algorithms selected. Moreover, he thinks advancing computational hardware technology isn’t a big threat.

“Cryptography should not be affected by any of this [advancing supercomputers], because we believe the hardness of the [lattice] problem is exponential. If you increase your computing power by a factor of two, you just increase the key by one bit. I mean, there is a limit to how much computing can increase; you cannot do more operations beyond some energy threshold.” There’s only so much energy available on the planet, he wryly notes: “The security of the cryptography kind of goes above this threshold, and it’s not hard to get it above it.”
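
Lyubashevsky’s arithmetic is easy to check (a trivial illustration, not his calculation): exhaustive search over k-bit keys costs 2^k operations, so doubling an attacker’s computing power is cancelled by a single extra key bit.

```python
# Doubling attacker throughput buys exactly one key bit: 2^(k+1) == 2 * 2^k.
k = 128
assert 2 ** (k + 1) == 2 * 2 ** k
# Even a hypothetical planet-scale budget of, say, ~2^100 operations still
# falls short of 2^128 by a factor of 2^28 (about 268 million).
```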

The lattice algorithms used for public-key encryption are faster and smaller than hash function algorithms, according to Lyubashevsky.

“You can’t get public key encryption from hash functions; you can only get digital signatures. It’s slower and bigger, but one big advantage is that the security of hash functions is not really based on any mathematical problem; it’s just random stuff that should be hard to invert. People will be very, very surprised if something like SHA-3 (Secure Hash Algorithm 3) falls to quantum computing,” he said. “This would be a complete shock, and a lot of symmetric-key cryptography would be broken. This is not something that we’re even discussing in terms of this NIST standardization effort.”

NIST intends to select at least one more KEM for standardization after the fourth and final round of assessments. Lyubashevsky notes there are always tradeoffs between encryption schemes, and between implementations of the same scheme. For example, the submitted algorithms may vary in size, require different amounts of computational resources, and have different performance characteristics.

“Maybe with respect to some new candidates that’s something NIST will consider in round four. For example, there’s this other type of mathematics based on supersingular isogeny math. It provides solutions smaller than lattices, but it’s very, very slow. So that’s something they may want to standardize in the fourth round. They want different digital signatures [and] there you also have a lot of tradeoffs. Something can have a much smaller signature but a much larger public key. The lattice-based scheme is very balanced, whereas other kinds of specialist schemes can have an advantage in one area and a big disadvantage from another,” he said.

Despite the work still to be done, NIST wants industry and the encryption community to push ahead and not wait. “Even though the third round is ending and NIST will begin to draft the first PQC standards, standardization efforts in this area will continue for some time. This should not be interpreted to mean that users should wait to adopt post-quantum algorithms. NIST hopes for rapid adoption of these first standardized algorithms,” is the guidance from the just-released report.

Lyubashevsky says the need to revamp security has highlighted the ad hoc way in which encryption has often been applied and spotlights the need for a more orderly approach going forward.

“I was a bit surprised when the folks at IBM who actually work on this stuff [deployment and products] said finding the cryptography in data is hard because the crypto is intermingled with everything else, with all the other code and all the other programs. It’s hard to find and [therefore] hard to yank out and put something else in. This was all an artifact from the ’80s and ’90s,” he said.

What’s needed now, said Lyubashevsky, is a more modular approach. “If we’re going to change everything, let’s at least do it right. Let’s make everything modular so if something needs to be changed in the future, for whatever reason, it’s easy to take out and plug in something new. This is also important in quantum safe cryptography because you don’t have an algorithm that is best at everything; you have some that are very fast, you have some that are very slow, but have slightly smaller keys. There will be scenarios where one is better than the other. And maybe you don’t quite know which one it will be and you want to be agile. You want to be able to quickly put algorithms in and take algorithms out.”
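
A minimal sketch of that agility in Python (the interface and registry here are hypothetical illustrations, not IBM’s design or any real library’s API): application code depends only on an abstract KEM interface, so swapping algorithms becomes a configuration change rather than a code hunt.

```python
# Crypto-agility sketch: callers depend on an abstract KEM interface, so
# algorithms can be swapped by name. Interface and registry are
# hypothetical illustrations.
from abc import ABC, abstractmethod

class KEM(ABC):
    @abstractmethod
    def keygen(self) -> tuple[bytes, bytes]: ...      # (public, secret)
    @abstractmethod
    def encapsulate(self, pk: bytes) -> tuple[bytes, bytes]: ...
    @abstractmethod
    def decapsulate(self, sk: bytes, ct: bytes) -> bytes: ...

REGISTRY: dict[str, type[KEM]] = {}

def register(name: str):
    def wrap(cls: type[KEM]) -> type[KEM]:
        REGISTRY[name] = cls   # each implementation self-registers by name
        return cls
    return wrap

def get_kem(name: str) -> KEM:
    return REGISTRY[name]()    # one config change swaps the algorithm

# e.g. @register("kyber768") class Kyber768(KEM): ...
# Application code only ever writes: kem = get_kem(config["kem"])
```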

Link to latest NIST report

Link to NIST migration project

Strength Categories from NIST

1)  Any attack that breaks the relevant security definition must require computational resources comparable to or greater than those required for key search on a block cipher with a 128-bit key (e.g. AES-128)

2)  Any attack that breaks the relevant security definition must require computational resources comparable to or greater than those required for collision search on a 256-bit hash function (e.g. SHA-256/SHA3-256)

3)  Any attack that breaks the relevant security definition must require computational resources comparable to or greater than those required for key search on a block cipher with a 192-bit key (e.g. AES-192)

4)  Any attack that breaks the relevant security definition must require computational resources comparable to or greater than those required for collision search on a 384-bit hash function (e.g. SHA-384/SHA3-384)

5)  Any attack that breaks the relevant security definition must require computational resources comparable to or greater than those required for key search on a block cipher with a 256-bit key (e.g. AES-256)

[i] Status Report on the Third Round of the NIST Post-Quantum Cryptography Standardization Process, https://csrc.nist.gov/publications/detail/nistir/8413/final
