Exascale Advocates Stand on Nuclear Stockpiles

By Nicole Hemsoth

May 23, 2013

When it comes to investment in scientific research, the U.S. government tends to have an open ear for new ideas. In this era of tight budgets and heightened national security, however, federal coffers open more readily when a threat is in play, whether to global competitiveness or to the safety and security of the nation.

According to a group of leading voices in high performance computing who gathered before yesterday’s U.S. House Subcommittee on Energy, all of these national priorities are at stake without sustained investment in exascale systems.

While exascale funding hearings are nothing new, yesterday’s appeal struck a different chord, harmonizing with the urgency of ensuring U.S. nuclear capabilities—a note that has been resonating in headlines lately.

Instead of pitching the “big science” projects that lack a direct call to action, the witnesses put the threat of encroaching dominance from China and others, internal security, continued economic viability, and even the ability to predict tornado paths (a top news item during yesterday’s hearing, following a devastating EF5 in Oklahoma) at center stage, casting exascale as a requirement rather than just another expensive scientific endeavor.

Dr. Roscoe Giles, Chairman of the Advanced Scientific Computing Advisory Committee; Dr. Rick Stevens, Associate Director for Computing, Environment and Life Sciences at Argonne; Dona Crawford, Associate Director for Computation at Lawrence Livermore; and Dr. Dan Reed, VP of Research and Economic Development at the University of Iowa, all weighed in on the expected components of exascale’s future (architecture, power and cooling, memory, etc.) before ringing the urgency alarm.

The hearing’s purpose was to examine draft legislation as it relates to the Department of Energy’s goals to build an exascale system. While the scientific payload of exascale was an important topic, the real meat, particularly when the floor was opened for questions, was how exascale will fit into larger national security goals, including nuclear stockpile stewardship—a rather familiar subject in the context of historical HPC funding.

The government has in hand a $465.59 million proposal for FY 2014 to fund the DOE Office of Science’s Advanced Scientific Computing Research program, which will help spearhead U.S. exascale efforts. Additionally, the National Nuclear Security Administration (NNSA) is requesting a tick over $400 million for its Advanced Simulation and Computing program, which helps the U.S. maintain the safety and viability of its nuclear weapons stockpile without active underground or small above-ground tests.

If the Advanced Simulation and Computing program rings a bell, it’s because it was part of the original DOE Stockpile Stewardship and Management plan, which took the dirt and grit out of physically testing nuclear weapons and plugged the problem into supercomputers and new instruments instead. Since even the youngest devices in the U.S. arsenal are at least 20 years old, a great deal of testing is needed to understand how they will hold up under the stresses of aging, in terms of both stability and viability, should the unfortunate need arise.

From the beginning, the stewardship effort and its associated Simulation and Computing program pulled in funding, breathing new life into research endeavors at a number of national labs, most notably Sandia, Lawrence Livermore and Los Alamos, and by extension channeling funds into the private technology sector. To avoid a tangent here, a separate analysis covers some of the program’s strengths and weaknesses in terms of computational horsepower.

Using the arsenal of current tools, the NNSA continuously assesses each nuclear weapon to certify its reliability and to detect or anticipate any potential problems that may come about as a result of aging. All weapon types in the U.S. nuclear stockpile require routine maintenance, periodic repair, replacement of limited-life components and surveillance (a thorough examination of a weapon), all tasks that Crawford and colleagues say require exaflop-capable resources.

In short, this approach proved convincing in the 1990s, when modeling and simulation capabilities were increasing rapidly, but the question is whether that same call to action will be enough to lend exascale the $400 million level of urgency being requested. Combined, however, with the dramatic and timely issue of nuclear threats aimed at allies, not to mention cooling U.S. competitiveness on multiple industrial and economic fronts, the appeal might carry more weight than it would have even this time last year.

As Dona Crawford explained, exascale systems now represent the only way to truly understand how to keep the U.S. nuclear stockpile safe, secure and in top condition. It is the same argument that propelled a great deal of investment into tech companies back in the 1990s, when the NNSA first looked to simulation and supercomputing to carry the stewardship load.

“Computing is the integrating element of maintaining the safety, security and reliability of our nuclear weapons stockpile without returning to underground tests,” said Crawford. “By integrating element, I mean that right now we have old test data, above-ground small test data, a lot of theory and some new models.” These inputs cannot be used effectively, she added, unless scientists have access to far higher-fidelity simulations.

Even setting aside exascale’s role in ensuring nuclear stockpile safety and security, the side effect of lagging investment is a dwindling of U.S. competitive prowess.

When asked why the U.S. doesn’t look to more international collaboration to reach its exascale ambitions, Dr. Stevens said that this makes sense on the software level, especially since so many large-scale systems use the same open source packages that are then pushed out to the community. However, he argued that it would not be suitable for us to share resources on the hardware front, pointing to what might happen if we were to trust our secure operations to run on hardware built in China.

The competitive threat wasn’t difficult for the speakers to lay out for the committee: they pointed to exascale investments underway in China and Japan, making it clear that these are not insignificant funding efforts.

Dan Reed made the argument that we are facing an uncertain future in HPC as other nations make critical investments in supercomputing, noting, “Global leadership isn’t a birthright.” Even if the nuclear stockpile can make do with its current petascale capabilities, winning silver, bronze, or no medal at all in the exascale race presents a bevy of potential problems.
