ORNL-Led Collaboration Solves a Beta-Decay Puzzle with Advanced Nuclear Models

March 11, 2019

OAK RIDGE, Tenn., March 11, 2019 — An international collaboration including scientists at the Department of Energy’s Oak Ridge National Laboratory has solved a 50-year-old puzzle: why beta decays of atomic nuclei are slower than expected based on the beta decay of free neutrons.

The findings, published in Nature Physics, fill a long-standing gap in physicists’ understanding of beta decay, an important process stars use to create heavier elements, and emphasize the need to include subtle effects—or more realistic physics—when predicting certain nuclear processes.

“For decades, scientists have lacked a first-principles understanding of nuclear beta decay, in which protons convert into neutrons, or vice versa, to form other elements,” said ORNL staff scientist Gaute Hagen, who led the study. “Our team demonstrated that theoretical models and computation have progressed to the point where it is possible to calculate some decay properties with enough precision to allow for direct comparison to experiment.”

To solve the problem, the team simulated the decay of tin-100 into indium-100, whose element sits one spot lower on the periodic table. The two nuclei share the same total of 100 nucleons (protons and neutrons), with tin-100 possessing 50 protons to indium-100’s 49.

Calculating beta decay precisely required the team not only to simulate the structure of the mother and daughter nuclei accurately but also to account for the correlated interactions between pairs of nucleons during the transition. This additional treatment posed an extreme computational challenge because of the combination of strong nuclear correlations and interactions involving the decaying nucleon.

In the past, nuclear physicists worked around this problem by adjusting the value of a fundamental coupling constant to reconcile the observed beta-decay rates of neutrons inside and outside the nucleus, a practice known as “quenching.” But with machines like ORNL’s Titan supercomputer, Hagen’s team demonstrated that this mathematical crutch is no longer necessary.

“Nobody really understood why this quenching factor worked. It just did,” said ORNL computational scientist Gustav Jansen. “We found that it could largely be explained by including two nucleons in the decay—for example, two protons decaying into a proton and a neutron, or a proton and a neutron decaying into two neutrons.”
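In practice, quenching amounts to scaling down the axial-vector coupling constant measured in free-neutron decay. A minimal sketch of the conventional parameterization follows, where the numerical quenching factor is a typical empirical value assumed for illustration, not a number from this study:

    g_A^eff = q × g_A,   with g_A ≈ 1.27 and q ≈ 0.75 (assumed typical value)

Because decay rates scale with the square of the coupling, a factor of this size suppresses calculated Gamow-Teller transition rates by roughly q² ≈ 0.56; the new calculations recover the observed rates from first principles without any such hand-tuned factor.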

The team, which included partners from Lawrence Livermore National Laboratory, the University of Tennessee, the University of Washington, TRIUMF (Canada), and the Technical University of Darmstadt (Germany), performed a comprehensive study of beta decays from light to medium-heavy nuclei up to tin-100.

The achievement gives nuclear physicists increased confidence as they search for answers to some of the most perplexing mysteries related to the formation of matter in the universe. Beyond ordinary beta decay, scientists want to compute neutrinoless double beta decay, a theorized form of nuclear decay that, if observed, would reveal important new physics and help determine the mass of the neutrino.

Tin to In

Many elements have isotopes that decay over long periods of time. For example, the half-life of carbon-14, the nucleus used in carbon dating, is 5,730 years. Other nuclei, however, exist only for fractions of a second before ejecting particles in an attempt to stabilize.
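As a quick illustration of the half-life arithmetic, here is a minimal Python sketch (purely illustrative, with no connection to the team’s codes) of the standard decay law N(t) = N₀ · 2^(−t / t_half):

    T_HALF_C14 = 5730.0  # carbon-14 half-life in years, as quoted above

    def surviving_fraction(t_years, half_life_years=T_HALF_C14):
        """Fraction of the original nuclei still present after t_years."""
        return 2.0 ** (-t_years / half_life_years)

    print(surviving_fraction(5730))   # 0.5 after one half-life
    print(surviving_fraction(57300))  # ~0.001 after ten half-lives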

In ordinary beta decay, a neutron converts into a proton, emitting an electron and an antineutrino. When tin-100 transforms into indium-100, the nucleus instead undergoes beta-plus decay, converting a proton into a neutron and expelling a positron and a neutrino.
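Written out, the two processes are the standard decay equations (general notation, not specific to the paper):

    n → p + e⁻ + ν̄ₑ    (beta-minus, as in free-neutron decay)
    p → n + e⁺ + νₑ    (beta-plus, as in tin-100 → indium-100)

A free proton cannot undergo the second reaction; it happens only inside a nucleus, where the energy released by the transition to the daughter nucleus makes it possible.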

With its equal number of protons and neutrons, tin-100 exhibits an unusually high rate of beta decay, giving the ORNL team a strong signal against which to verify its results. Furthermore, the tin-100 nucleus is “doubly magic”: both its proton and neutron counts sit at the magic number 50, so the nucleons completely fill well-defined shells, making the nucleus strongly bound and relatively simple in structure. The ORNL team’s NUCCOR code, which is programmed to solve the nuclear many-body problem, excels at describing doubly magic nuclei up and down the nuclear chart.

“A doubly magic nucleus like tin-100 isn’t as complicated as many other nuclei,” said Thomas Papenbrock, a researcher at the University of Tennessee and ORNL. “This means we can reliably compute it using our coupled cluster method, which calculates properties of large nuclei by accounting for forces between the individual nucleons.”
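In its textbook form, the coupled-cluster ansatz builds the correlated wave function by applying an exponentiated excitation operator to a simple reference state; the generic expression below is standard in the field rather than a detail taken from the paper:

    |Ψ⟩ = e^T |Φ⟩,   T = T₁ + T₂ + …

Here |Φ⟩ is a mean-field (product-state) reference, and each Tₙ excites n nucleons at a time. Truncating the expansion at low rank keeps the computational cost polynomial in the number of nucleons, which is what makes a nucleus as heavy as tin-100 tractable.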

To model beta decay, however, the team also had to calculate the structure of indium-100, a more complex nucleus than the doubly magic tin-100. This required a more precise treatment of the strong correlations between the nucleons. By borrowing ideas from quantum chemistry, which treats electrons as waves, Hagen’s team successfully developed techniques to model these processes.

“In our case we are dealing with nucleons instead of electrons, but the quantum chemistry concepts have helped us branch out from doubly magic nuclei and expand into these open-shell regions,” said ORNL physicist Titus Morris.

Guiding experiment

Now that Hagen’s team has shown its understanding of beta decay is on par with experiment, it’s looking to take advantage of new supercomputers like ORNL’s Summit, the world’s most powerful, to guide current and future experiments.

Researchers are currently using Summit to simulate how calcium-48, another doubly magic nucleus, would undergo neutrinoless double beta decay, a theorized process in which two neutrons decay into two protons without emitting any neutrinos. The results could aid experimentalists in selecting an optimal detector material for the potential discovery of this rare phenomenon.
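Schematically, the sought-after process for calcium-48 and the conventional relation between its half-life and the neutrino mass read as follows (this is the standard parameterization used across the field, not a result of this work):

    ⁴⁸Ca → ⁴⁸Ti + 2e⁻
    [T½(0ν)]⁻¹ = G(0ν) · |M(0ν)|² · ⟨m_ββ⟩² / mₑ²

Here G(0ν) is a calculable phase-space factor, M(0ν) is the nuclear matrix element that theory must supply (the quantity on which models currently disagree), and ⟨m_ββ⟩ is the effective neutrino mass an observed half-life would determine.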

“Currently, calculations using different nuclear models of neutrinoless double beta decay may differ by as much as a factor of six,” Hagen said. “Our goal is to provide a benchmark for other models and theories.”

This research was supported by the DOE Office of Science.

UT-Battelle LLC manages ORNL for DOE’s Office of Science. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States. The Office of Science is working to address some of the most pressing challenges of our time. For more information, please visit https://science.energy.gov/.


Source: ORNL
