ORNL-Led Collaboration Solves a Beta-Decay Puzzle with Advanced Nuclear Models

March 11, 2019

OAK RIDGE, Tenn., March 11, 2019 — An international collaboration including scientists at the Department of Energy’s Oak Ridge National Laboratory solved a 50-year-old puzzle: why beta decays of atomic nuclei are slower than expected based on the beta decays of free neutrons.

The findings, published in Nature Physics, fill a long-standing gap in physicists’ understanding of beta decay, an important process stars use to create heavier elements, and emphasize the need to include subtle effects—or more realistic physics—when predicting certain nuclear processes.

“For decades, scientists have lacked a first-principles understanding of nuclear beta decay, in which protons convert into neutrons, or vice versa, to form other elements,” said ORNL staff scientist Gaute Hagen, who led the study. “Our team demonstrated that theoretical models and computation have progressed to the point where it is possible to calculate some decay properties with enough precision to allow for direct comparison to experiment.”

To solve the problem, the team simulated tin-100 decaying into indium-100, a neighboring element on the periodic table. The two nuclei contain the same number of nucleons (protons and neutrons), with tin-100 possessing 50 protons to indium-100’s 49.

Calculating beta decay precisely required the team not only to simulate the structure of the mother and daughter nuclei accurately but also to account for the correlated interactions between pairs of nucleons during the transition. This additional treatment posed an extreme computational challenge because of the combination of strong nuclear correlations and interactions involving the decaying nucleon.

In the past, nuclear physicists worked around this problem by adjusting a fundamental constant to reconcile the observed beta-decay rates of neutrons inside and outside the nucleus, a practice known as “quenching.” But with machines like ORNL’s Titan supercomputer, Hagen’s team demonstrated that this mathematical crutch is no longer necessary.
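
In practice, quenching amounted to reducing the strength that governs the decay. Written schematically (the notation here is ours, not taken from the paper), the axial coupling constant g_A, whose free-neutron value is about 1.27, was replaced by an effective value

    g_A^{\mathrm{eff}} = q \, g_A, \qquad q < 1,

with the factor q fitted to experiment rather than derived from theory.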

“Nobody really understood why this quenching factor worked. It just did,” said ORNL computational scientist Gustav Jansen. “We found that it could largely be explained by including two nucleons in the decay—for example, two protons decaying into a proton and a neutron, or a proton and a neutron decaying into two neutrons.”

The team, which included partners from Lawrence Livermore National Laboratory, the University of Tennessee, the University of Washington, TRIUMF in Canada, and the Technical University of Darmstadt in Germany, performed a comprehensive study of beta decays from light to medium-heavy nuclei up to tin-100.

The achievement gives nuclear physicists increased confidence as they search for answers to some of the most perplexing mysteries related to the formation of matter in the universe. Beyond ordinary beta decay, scientists are looking to compute neutrinoless double beta decay, a theorized form of nuclear decay that, if observed, would reveal important new physics and help determine the mass of the neutrino.

Tin to In

Many elements have isotopes that decay over long periods of time. For example, the half-life of carbon-14, the nucleus used in carbon dating, is 5,730 years. Other nuclei, however, exist only for fractions of a second before ejecting particles in an attempt to stabilize.
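
That difference in lifetimes follows from the same exponential law. As a textbook illustration (not a calculation specific to this study), the amount of a radioactive isotope remaining after a time t is

    N(t) = N_0 \left(\tfrac{1}{2}\right)^{t/T_{1/2}},

so half of a carbon-14 sample remains after one half-life of 5,730 years, a quarter after 11,460 years, and so on.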

In neutron beta decay, an electron and an anti-neutrino are emitted. When tin-100 transforms into indium-100, the nucleus undergoes beta-plus decay, expelling a positron and a neutrino when converting a proton to a neutron.
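
Written as reactions, in standard notation, the two processes are

    n \to p + e^- + \bar{\nu}_e            (free-neutron beta decay)
    p \to n + e^+ + \nu_e                  (beta-plus decay inside a nucleus such as tin-100)

where the second process is possible only for a proton bound in a nucleus, since a free proton is lighter than a free neutron.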

With its equal number of protons and neutrons, tin-100 exhibits an unusually high rate of beta decay, giving the ORNL team a strong signal against which to verify its results. Furthermore, the tin-100 nucleus is “doubly magic,” meaning its nucleons fill out well-defined shells that make the nucleus strongly bound and relatively simple in structure. The ORNL team’s NUCCOR code, which solves the nuclear many-body problem, excels at describing doubly magic nuclei up and down the nuclear chart.

“A doubly magic nucleus like tin-100 isn’t as complicated as many other nuclei,” said Thomas Papenbrock, a researcher at the University of Tennessee and ORNL. “This means we can reliably compute it using our coupled cluster method, which calculates properties of large nuclei by accounting for forces between the individual nucleons.”
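
In coupled-cluster theory, the correlated wave function is built on top of a simple reference state. Schematically (this is the standard form of the method, not a detail taken from NUCCOR itself),

    |\Psi\rangle = e^{T} |\Phi\rangle, \qquad T = T_1 + T_2 + \dots,

where |\Phi\rangle is the uncorrelated reference state and the operators T_1, T_2, … generate excitations of one and two nucleons at a time.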

To model beta decay, however, the team also had to calculate the structure of indium-100, a more complex nucleus than the doubly magic tin-100. This required a more precise treatment of the strong correlations between the nucleons. By borrowing ideas from quantum chemistry, which treats electrons as waves, Hagen’s team successfully developed techniques to model these processes.

“In our case we are dealing with nucleons instead of electrons, but the quantum chemistry concepts have helped us branch out from doubly magic nuclei and expand into these open-shell regions,” said ORNL physicist Titus Morris.

Guiding experiment

Now that Hagen’s team has shown its understanding of beta decay is on par with experiment, it’s looking to take advantage of new machines like ORNL’s Summit, the world’s most powerful supercomputer, to guide current and future experiments.

Researchers are currently using Summit to simulate how calcium-48, another doubly magic nucleus, would undergo neutrinoless double beta decay—a process in which two neutrons beta decay into protons, but without emitting any neutrinos. The results could aid experimentalists in the selection of an optimal detector material for the potential discovery of this rare phenomenon.
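
For calcium-48 the hypothetical process would read, schematically,

    ^{48}\mathrm{Ca} \to {}^{48}\mathrm{Ti} + 2e^-,

with no neutrinos emitted, in contrast to ordinary double beta decay, which also releases two antineutrinos.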

“Currently, calculations using different nuclear models of neutrinoless double beta decay may differ by as much as a factor of six,” Hagen said. “Our goal is to provide a benchmark for other models and theories.”

This research was supported by the DOE Office of Science.

UT-Battelle LLC manages ORNL for DOE’s Office of Science, the single largest supporter of basic research in the physical sciences in the United States. The Office of Science is working to address some of the most pressing challenges of our time. For more information, please visit https://science.energy.gov/.


Source: ORNL
