Peter Shor Wins Breakthrough Prize in Fundamental Physics

September 26, 2022

Sept. 26, 2022 — Peter Shor, the Morss Professor of Applied Mathematics at MIT, has been named a recipient of the 2023 Breakthrough Prize in Fundamental Physics. He shares the $3 million prize with three others for “foundational work in the field of quantum information”: David Deutsch at the University of Oxford, Charles Bennett at IBM Research, and Gilles Brassard of the University of Montreal.

In announcing the award, the Breakthrough Prize Foundation highlighted Shor’s contributions to the quantum information field, including the eponymous Shor’s algorithm for factoring extremely large numbers and an algorithm for correcting errors in quantum computers.

“These ideas not only paved the way for today’s fast-developing quantum computers; they are now also at the frontiers of fundamental physics, especially in the study of metrology — the science of measurement — and of quantum gravity,” the award announcement reads.

“I’m very grateful to see the prize going to quantum information and quantum computation theory this year,” Shor commented to MIT News. “My three co-winners were the most influential people in founding this field. I consider them friends, and they all clearly deserve it.”

In addition, an MIT alumnus, Daniel A. Spielman PhD ’95, has won the 2023 Breakthrough Prize in Mathematics for “contributions to theoretical computer science and mathematics, including to spectral graph theory, the Kadison-Singer problem, numerical linear algebra, optimization, and coding theory.”

“I am ecstatic to see both Peter Shor and Dan Spielman be recognized with Breakthrough Prizes in Fundamental Physics and Mathematics, respectively,” says Michel Goemans, the RSA Professor and head of MIT’s Department of Mathematics. “Both would have been natural nominees of the Breakthrough Prize in Theoretical Computer Science, if such a prize existed. Peter and Dan are PhD graduates of our math department, both have held tenured appointments in our department and have been members of the theory group at CSAIL, and both have received the same prizes. It is a testimony of the importance of theoretical computer science across disciplines, in particular mathematics and physics.”

Quantum Seeds

The first seeds of quantum computing’s potential were planted through the early algorithms derived by Deutsch, Bennett, Brassard, and Shor.

In the early 1980s, Deutsch began thinking of problems whose solutions could be sped up using quantum algorithms — computational procedures that exploit the laws of quantum mechanics, rather than classical physics. He was the first to develop a quantum algorithm that could solve a simple, albeit contrived, problem far more efficiently than a classical algorithm.

Meanwhile, Bennett and Brassard were also looking for uses of quantum information. In 1984, they developed the first quantum cryptography protocol, BB84. They put forth the idea that two distant parties could agree on a secret encryption key that would be secure against eavesdroppers, based on the quantum principle that measuring a quantum state disturbs it: any attempt to intercept the key would corrupt it and reveal the intrusion.
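
The core of BB84 can be illustrated with a toy classical simulation. This is only a sketch under simplifying assumptions (no eavesdropper, no error estimation or privacy amplification, and the function and variable names here are invented for illustration): Alice sends random bits in randomly chosen bases, Bob measures in his own random bases, and the two keep only the positions where their bases happened to agree.

```python
import random

def bb84_sift(n_bits, seed=0):
    # Toy BB84 run with no eavesdropper. "+" and "x" stand for the two
    # measurement bases; a matching basis lets Bob read Alice's bit
    # exactly, while a mismatched basis yields a random outcome.
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [rng.choice("+x") for _ in range(n_bits)]
    bob_bases   = [rng.choice("+x") for _ in range(n_bits)]
    bob_bits = [b if ab == bb else rng.randint(0, 1)
                for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    # Sifting: publicly compare bases (not bits) and keep the matches.
    key_a = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
    key_b = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]
    return key_a, key_b

ka, kb = bb84_sift(16)
assert ka == kb  # without eavesdropping, the sifted keys agree
```

In the real protocol, an eavesdropper measuring in the wrong basis would randomize some of Bob's matched-basis bits, and comparing a sample of the key would expose that disturbance.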

Their work demonstrated the first practical application of quantum information theory. It was also Shor’s first introduction to the field. The mathematician was working at AT&T Bell Labs at the time, and Bennett came to give a talk on his new quantum key encryption system. “Their work inspired me to do a little thinking and research on quantum information,” Shor recalls. “But I didn’t really get anywhere at the time.”

A decade later, in 1994, Shor introduced his own landmark algorithm. Shor’s algorithm describes how a sufficiently large quantum computer could efficiently factor extremely large numbers — a task that would take the most powerful classical supercomputers longer than the age of the universe.

Most data encryption schemes today rely on the difficulty of factorization to keep information secure. Shor’s algorithm was the first to show that, in theory, a quantum system could break through most modern data security walls. To do this practically, however, would require a system of many precisely controlled quantum bits. Even then, scientists assumed that the tiniest noise in the environment would disrupt the delicate qubits, and set off a ripple of errors in their calculations that could not be corrected without further disturbing the qubits.
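
The heart of Shor's algorithm is a classical reduction from factoring to order finding: the quantum computer's only job is to find the period r of a^x mod N quickly. The sketch below (function names are illustrative) brute-forces the order classically, which is exponentially slow in general but shows how a factor falls out once r is known.

```python
from math import gcd

def order(a, n):
    # Smallest r > 0 with a^r ≡ 1 (mod n). This is the step a quantum
    # computer performs efficiently; here it is brute-forced to
    # illustrate the reduction on tiny numbers.
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n, a):
    # Classical post-processing: if r is even and a^(r/2) is not
    # ≡ -1 (mod n), then gcd(a^(r/2) ± 1, n) yields nontrivial factors.
    assert gcd(a, n) == 1
    r = order(a, n)
    if r % 2:
        return None  # unlucky base; retry with another a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None
    return gcd(y - 1, n), gcd(y + 1, n)

print(shor_factor(15, 7))  # (3, 5): 15 = 3 × 5
```

For N = 15 and a = 7 the order is r = 4, so 7² = 49 ≡ 4 (mod 15), and gcd(3, 15) = 3 and gcd(5, 15) = 5 recover the factors.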

“When I first came up with this factoring algorithm, people thought it would remain theoretical forever because there was this argument that you could not correct errors on a quantum computer,” Shor says.

Shortly thereafter, in 1995, Shor worked out another algorithm, this time for quantum error correction, which showed that errors in a quantum system could in fact be detected and fixed without disturbing the encoded quantum information, thereby leaving the computation intact. The vision of a practical quantum computer suddenly became tangible.
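
The trick of fixing errors without reading the data can be seen in the simplest quantum code, the three-qubit repetition code for bit-flip errors (a building block of Shor's full nine-qubit code). The statevector simulation below (helper names are invented for illustration) measures only parities between neighboring qubits; those parities pinpoint which qubit flipped while revealing nothing about the encoded amplitudes α and β.

```python
import numpy as np

def x_on(state, q):
    # Apply a bit flip (Pauli-X) to qubit q (0 = leftmost) of a
    # 3-qubit statevector by permuting basis amplitudes.
    out = np.empty_like(state)
    for i in range(8):
        out[i ^ (1 << (2 - q))] = state[i]
    return out

def parity(state, q, r):
    # Expectation of Z_q Z_r; exactly ±1 for code and error states,
    # and independent of the encoded amplitudes.
    val = 0.0
    for i in range(8):
        bq, br = (i >> (2 - q)) & 1, (i >> (2 - r)) & 1
        val += abs(state[i]) ** 2 * (-1) ** (bq + br)
    return round(val)

# Encode α|000⟩ + β|111⟩.
alpha, beta = 0.6, 0.8
encoded = np.zeros(8, dtype=complex)
encoded[0b000], encoded[0b111] = alpha, beta

corrupted = x_on(encoded, 1)  # a bit flip strikes the middle qubit

# The syndrome (Z0Z1, Z1Z2) identifies the flipped qubit without
# measuring — and hence without destroying — the logical state.
syndrome = (parity(corrupted, 0, 1), parity(corrupted, 1, 2))
flipped = {(1, 1): None, (-1, 1): 0, (-1, -1): 1, (1, -1): 2}[syndrome]
recovered = x_on(corrupted, flipped) if flipped is not None else corrupted
assert np.allclose(recovered, encoded)
```

Correcting phase errors as well requires concatenating this idea with a second layer of encoding, which is what Shor's nine-qubit code does.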

“With these two bombshell contributions, Peter set the stage for quantum computing to become the huge field that it is now,” says Alan Guth, the Victor F. Weisskopf Professor of Physics at MIT, who, as a former recipient of the Breakthrough Prize, was the one who called Shor to deliver the news of this year’s award.

“It was a real pleasure for me to be able to tell him that he is one of the winners,” Guth says. “His algorithms took the world by surprise, and ignited the field of quantum computing. And despite his spectacular contributions, Peter continues to be a warm, friendly, smiling colleague to all around him.”

“Peter is a wonderful colleague and is totally unique,” adds Goemans. “His thought process seems to parallel the quantum algorithms he designs and invents: Out of entangled ideas and a superposition of states, a brilliant solution often emerges in a Eureka moment!”

“One of the best things about MIT is that we have great students,” says Shor, who earned a PhD in applied mathematics from MIT in 1985. He then spent one year as a postdoc at the Mathematical Sciences Research Institute before moving on to work at AT&T Bell Labs, where he developed Shor’s algorithm. In 2003, he returned to MIT, where he has continued his research and teaching for the past 20 years.

Today, he is working to formulate a theory of quantum information, which would describe how data can be stored and transmitted using the principles of quantum physics. Will there come a day when quantum computers are advanced enough to break through our classical security systems?

“In five or 10 years, we could be at the start of a Moore’s Law, where quantum computers will steadily improve every few years,” Shor predicts. “I suspect they’ll improve fast enough that within two or three decades we will get quantum computers that can do useful stuff. Hopefully by the time quantum computers are that large, we’ll be using different crypto systems that aren’t susceptible to quantum computers.”

Shor credits his father with fostering his early interest in mathematics. As a young boy, he would flip through his father’s issues of Scientific American to find his favorite section.

“Martin Gardner had a column, ‘Mathematical Games,’ which was really amazing,” Shor recalls. “It was sometimes a puzzle, sometimes a report on a new discovery in mathematics, and it was often at a level that I could understand. I looked forward to reading it every month, and that was something that turned me onto math early on.”

Beautiful Breakthroughs

Daniel Spielman, this year’s recipient of the Breakthrough Prize in Mathematics, received his PhD in applied mathematics from MIT in 1995, advised by Michael Sipser, the Donner Professor of Mathematics and former dean of the MIT School of Science. Spielman then joined the math department and remained on the MIT faculty until 2005, when he moved to Yale University, where he is currently the Sterling Professor of Computer Science, Mathematics, Statistics and Data Science.

Spielman specializes in the design and analysis of algorithms, many of which have yielded insights “not only for mathematics, but for highly practical problems in computing, signal processing, engineering, and even the design of clinical trials,” notes the Breakthrough Foundation in their announcement today.

“Dan has made a number of important and beautiful breakthroughs over the years, from expander-based error-correcting codes, to the smoothed analysis of algorithms, to spectral sparsification of graphs, all characterized by innovative mathematics,” says Goemans.

Among numerous discoveries, Spielman is best known for solving the Kadison-Singer problem, which for decades was thought to be unsolvable. The problem can be interpreted as posing a fundamental question for quantum physics: In a quantum system, can new information be deciphered, if only some of the system’s properties are observed or measured? The answer, most mathematicians agreed, was no.

Over decades, the Kadison-Singer problem was reformulated and shown to be equivalent to problems across a wide range of mathematical fields. And in 2013, Spielman and his colleagues resolved one of these equivalent formulations involving linear algebra and matrices, proving the answer to be yes — indeed, it was possible to determine a quantum system’s sum from its parts.

The Breakthrough Prizes are a set of international awards that recognize the achievements of scientists in three categories — fundamental physics, mathematics, and life sciences. The prizes were founded by Sergey Brin; Priscilla Chan and Mark Zuckerberg; Julia and Yuri Milner; and Anne Wojcicki, and have been sponsored by foundations established by them. The 2023 prizes will be presented at a gala award ceremony, and prize recipients will take part in lectures and discussions.


Source: MIT
