White House Launches National HPC Strategy

By John Russell and Tiffany Trader

July 30, 2015

Yesterday’s executive order by President Barack Obama creating a National Strategic Computing Initiative (NSCI) is not only a powerful acknowledgment of the vital role HPC plays in modern society but also an indication of the government’s mounting worry that failure to coordinate and nourish HPC development on a broader scale would put the nation at risk. Not surprisingly, early reaction from the HPC community has been largely positive.

“My first reaction is that this is a long needed recognition of both the critical role that HPC, including HPC at the very limits of what is possible, plays in science and engineering, and the tremendous challenges facing computing as we reach the limits of current technologies. The statement that advances in HPC will require a holistic approach, including algorithms, software, and hardware, is most welcome,” said William Gropp, director of the Parallel Computing Institute and chief scientist at NCSA, University of Illinois Urbana-Champaign, and co-editor of the HPCwire Exascale Edition.

“Cray is excited to see the executive order creating a national strategic computing initiative and the focus it will provide for supercomputing. Supercomputing is critical to our national competitiveness,” shared Barry Bolding, chief strategy officer at Cray. “This executive order’s call for coherence between modeling and simulation and data analytic computing will spur needed innovation and improve competitiveness. We are in an era where the convergence between supercomputing and big data is changing our lives daily. Because of this convergence, we face technological challenges that will require sustained engagements between government, academia and industry and Cray sees this executive order as a very positive step in global competitiveness.”

“IBM commends this effort by the Administration to sustain America’s position at the forefront of advanced computing,” commented Dave Turek, vice president of high performance computing market engagement at IBM. “Doing so is vital not only to helping our country compete in the global race to innovate, but also to giving our researchers and scientists powerful tools to unlock new discoveries.”

“The NSCI effort is terrific news for the research community and the HPC industry,” stated Ian Buck, vice president of Accelerated Computing at NVIDIA. “Reaching exascale computing levels will require new technologies that maximize performance and minimize power consumption, while making it easier for programmers and researchers to take full advantage of these new systems to drive innovation. At NVIDIA, we’ve helped build some of our nation’s largest supercomputers, including the future CORAL pre-exascale systems. With NSCI moving forward, we are now poised to drive further technology advancements to help make exascale a reality.”

“Now, what we’re seeing in President Obama’s Executive Order is a major proof point of the importance of high-end computer technology in bolstering and redefining national competitiveness,” commented Jorge Titinger, president and CEO of SGI. “In the past, a country’s competitiveness and global power was defined by economic growth and defense capabilities. But now we’re seeing the advent of actionable technological insight — especially derived from the power of big data — becoming a factor of a country’s power.”

“HPC has become such a competitive weapon. IDC ROI research, sponsored by DOE, is showing that national investments in HPC resources can provide an extremely large return on investment, on the order of over $500 in revenue for each dollar invested in HPC. Industries like oil & gas, finance, automotive, aerospace, pharmaceuticals, and healthcare are showing massive ROIs from their investments in HPC,” said Earl Joseph, IDC program vice president and executive director of the HPC User Forum.

As outlined in the executive order, the NSCI has four overarching principles and five objectives, both listed below.

NSCI principles:

  1. The United States must deploy and apply new HPC technologies broadly for economic competitiveness and scientific discovery.
  2. The United States must foster public-private collaboration, relying on the respective strengths of government, industry, and academia to maximize the benefits of HPC.
  3. The United States must adopt a whole-of-government approach that draws upon the strengths of and seeks cooperation among all executive departments and agencies with significant expertise or equities in HPC while also collaborating with industry and academia.
  4. The United States must develop a comprehensive technical and scientific approach to transition HPC research on hardware, system software, development tools, and applications efficiently into development and, ultimately, operations.

NSCI objectives:

  1. Accelerating delivery of a capable exascale computing system that integrates hardware and software capability to deliver approximately 100 times the performance of current 10 petaflop systems across a range of applications representing government needs.
  2. Increasing coherence between the technology base used for modeling and simulation and that used for data analytic computing.
  3. Establishing, over the next 15 years, a viable path forward for future HPC systems even after the limits of current semiconductor technology are reached (the “post-Moore’s Law era”).
  4. Increasing the capacity and capability of an enduring national HPC ecosystem by employing a holistic approach that addresses relevant factors such as networking technology, workflow, downward scaling, foundational algorithms and software, accessibility, and workforce development.
  5. Developing an enduring public-private collaboration to ensure that the benefits of the research and development advances are, to the greatest extent, shared between the United States Government and industrial and academic sectors.
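
For readers who want to see where the “exascale” label in objective 1 comes from, here is a minimal back-of-envelope sketch (in Python, purely illustrative and not part of the executive order): roughly 100 times a 10-petaflop system works out to 10^18 floating-point operations per second, or one exaflop.

    # Back-of-envelope arithmetic for NSCI objective 1 (illustrative only;
    # the round figures below are the ones cited in the executive order,
    # not a specification of any particular machine).
    PETAFLOPS = 1e15  # FLOP/s in one petaflop/s
    EXAFLOPS = 1e18   # FLOP/s in one exaflop/s

    current_system = 10 * PETAFLOPS        # "current 10 petaflop systems"
    target_system = 100 * current_system   # "approximately 100 times the performance"

    print(f"Target: {target_system:.0e} FLOP/s = {target_system / EXAFLOPS:g} exaflop/s")
    # Prints: Target: 1e+18 FLOP/s = 1 exaflop/s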

Many of the objectives echo plans already underway in the current Exascale Computing Initiative run by the DOE and the National Nuclear Security Administration. This effort, however, seems broader, and as the roster of planned NSCI participants indicates, it will be a huge undertaking.

Three agencies will lead: the Department of Energy (DOE), the Department of Defense (DOD), and the National Science Foundation (NSF). Also named are two foundational research and development agencies – the Intelligence Advanced Research Projects Activity (IARPA) and the National Institute of Standards and Technology (NIST). The five deployment agencies identified are the National Aeronautics and Space Administration, the Federal Bureau of Investigation, the National Institutes of Health, the Department of Homeland Security, and the National Oceanic and Atmospheric Administration.

An NSCI Executive Council, co-chaired by the director of the Office of Science and Technology Policy (OSTP) and the director of the Office of Management and Budget (OMB), will oversee NSCI activities. The council has been charged to “establish an implementation plan to support and align efforts across agencies” within 90 days and to update it annually. It’s well worth reviewing the relatively short full text of the executive order, which spells out in greater detail the roles and responsibilities of the various NSCI participants.

Clearly, making such an ambitious program work will be challenging.

“Ultimately, the success or failure of this ambitious effort hinges on the ability of the US Government to actively engage and include both the US academic and industrial sectors to help drive US gains in this critical field. Simply developing high-end systems to meet individual agency missions will not be enough; the project needs to foster a vibrant R&D as well as commercial HPC capability to ensure that the US can continue to build and market the most effective HPCs in the world,” said Bob Sorensen, now a research vice president for HPC with IDC, who previously served as a longtime senior HPC technology analyst supporting senior US Government policy makers on global HPC developments.

This program shouldn’t be about building the fastest and most powerful high-performance computer in the world, said Sorensen, but about establishing a broad-based ecosystem that can support the most ambitious US public and private scientific research agendas while helping the growing base of US industries that rely on these systems to design, test, and build products – such as automobiles, aircraft, and even specialty pharmaceuticals.

The devil, of course, is in the details and hopefully more will be revealed in the plan the NSCI Council is set to deliver before the end of the year.

On the technology front, the NSCI outline touches not only on familiar HPC challenges but also acknowledges the growing convergence of data-intensive and compute-intensive computing. Noted in the White House press release announcing the initiative is that “in the last 10 years, a new class of HPC system has emerged to collect, manage and analyze vast quantities of data arising from diverse sources, such as Internet web pages and scientific instruments. These ‘big data’ systems will approach scales measured in exabytes (10^18 bytes)… By combining the computing power and the data capacity of these two classes of HPC systems, deeper insights can be gained through new approaches that combine simulation with actual data.” This recognition is drawing attention and approval.

“Importantly, the NSCI embraces the idea of big data and HPC convergence, something I believe is crucial to the future of computing – for scientific discovery, for national security and for economic competitiveness. Many of the tools and technologies for big data analytics and scientific computing are similar, yet the cultures and communities are largely disparate. We must bring them together, for the benefit of both and for societal benefit. The NSCI will help do that,” said Dan Reed, vice president for research and economic development at the University of Iowa and an author of the recent ASCAC review of the DOE’s Exascale Computing Initiative.

Reed also endorsed the multi-agency approach: “One of the NSCI’s key elements is interagency collaboration, with differential roles based on each agency’s unique strengths and capabilities. These include the development and deployment of exascale systems, research on new algorithms, software and enabling technologies (including post-silicon ones), and workforce development to address the critical shortage of computing experts. I’m also pleased to see that the key role of NSF in science at the exascale (or extreme scale) is specifically called out.”

As the executive order notes, for more than six decades, “US computing capabilities have been maintained through continuous research and the development and deployment of new computing systems with rapidly increasing performance on applications of major significance to government, industry, and academia.

“Maximizing the benefits of HPC in the coming decades will require an effective national response to increasing demands for computing power, emerging technological challenges and opportunities, and growing economic dependency on and competition with other nations. This national response will require a cohesive, strategic effort within the Federal Government and a close collaboration between the public and private sectors.”

We’ll see.
