White House Launches National HPC Strategy

By John Russell and Tiffany Trader

July 30, 2015

Yesterday’s executive order by President Barack Obama creating a National Strategic Computing Initiative (NSCI) is not only a powerful acknowledgment of the vital role HPC plays in modern society but also a sign of the government’s mounting worry that failing to coordinate and nourish HPC development on a broader scale would put the nation at risk. Not surprisingly, early reaction from the HPC community has been largely positive.

“My first reaction is that this is a long needed recognition of both the critical role that HPC, including HPC at the very limits of what is possible, plays in science and engineering, and the tremendous challenges facing computing as we reach the limits of current technologies. The statement that advances in HPC will require a holistic approach, including algorithms, software, and hardware, is most welcome,” said William Gropp, director of the Parallel Computing Institute and chief scientist at NCSA, University of Illinois Urbana-Champaign, and co-editor of HPCwire Exascale Edition.

“Cray is excited to see the executive order creating a national strategic computing initiative and the focus it will provide for supercomputing. Supercomputing is critical to our national competitiveness,” shared Barry Bolding, chief strategy officer at Cray. “This executive order’s call for coherence between modeling and simulation and data analytic computing will spur needed innovation and improve competitiveness. We are in an era where the convergence between supercomputing and big data is changing our lives daily. Because of this convergence, we face technological challenges that will require sustained engagements between government, academia and industry and Cray sees this executive order as a very positive step in global competitiveness.”

“IBM commends this effort by the Administration to sustain America’s position at the forefront of advanced computing,” commented Dave Turek, vice president of high performance computing market engagement at IBM. “Doing so is vital not only to helping our country compete in the global race to innovate, but also to giving our researchers and scientists powerful tools to unlock new discoveries.”

“The NSCI effort is terrific news for the research community and the HPC industry,” stated Ian Buck, vice president of Accelerated Computing at NVIDIA. “Reaching exascale computing levels will require new technologies that maximize performance and minimize power consumption, while making it easier for programmers and researchers to take full advantage of these new systems to drive innovation. At NVIDIA, we’ve helped build some of our nation’s largest supercomputers, including the future CORAL pre-exascale systems. With NSCI moving forward, we are now poised to drive further technology advancements to help make exascale a reality.”

“Now, what we’re seeing in President Obama’s Executive Order is a major proof point of the importance of high-end computer technology in bolstering and redefining national competitiveness,” commented Jorge Titinger, president and CEO of SGI. “In the past, a country’s competitiveness and global power was defined by economic growth and defense capabilities. But now we’re seeing the advent of actionable technological insight — especially derived from the power of big data — becoming a factor of a country’s power.”

“HPC has become such a competitive weapon. IDC ROI research, sponsored by DOE, is showing that national investments in HPC resources can provide an extremely large return on investment, on the order of over $500 in revenue for each dollar invested in HPC. Industries like oil & gas, finance, automotive, aerospace, pharmaceuticals, and healthcare are showing massive ROIs from their investments in HPC,” said Earl Joseph, IDC program vice president and executive director of the HPC User Forum.

As outlined in the executive order, the NSCI has four overarching principles and five objectives, both enumerated below.

NSCI principles:

  1. The United States must deploy and apply new HPC technologies broadly for economic competitiveness and scientific discovery.
  2. The United States must foster public-private collaboration, relying on the respective strengths of government, industry, and academia to maximize the benefits of HPC.
  3. The United States must adopt a whole-of-government approach that draws upon the strengths of and seeks cooperation among all executive departments and agencies with significant expertise or equities in HPC while also collaborating with industry and academia.
  4. The United States must develop a comprehensive technical and scientific approach to transition HPC research on hardware, system software, development tools, and applications efficiently into development and, ultimately, operations.

NSCI objectives:

  1. Accelerating delivery of a capable exascale computing system that integrates hardware and software capability to deliver approximately 100 times the performance of current 10 petaflop systems across a range of applications representing government needs.
  2. Increasing coherence between the technology base used for modeling and simulation and that used for data analytic computing.
  3. Establishing, over the next 15 years, a viable path forward for future HPC systems even after the limits of current semiconductor technology are reached (the “post-Moore’s Law era”).
  4. Increasing the capacity and capability of an enduring national HPC ecosystem by employing a holistic approach that addresses relevant factors such as networking technology, workflow, downward scaling, foundational algorithms and software, accessibility, and workforce development.
  5. Developing an enduring public-private collaboration to ensure that the benefits of the research and development advances are, to the greatest extent, shared between the United States Government and industrial and academic sectors.
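The arithmetic behind the first objective is worth making explicit: an exaflop is 10^18 floating-point operations per second, which is exactly 100 times the 10-petaflop (10^16 FLOPS) baseline the order cites. A minimal back-of-the-envelope check (figures are illustrative, not from the order itself):

```python
# Back-of-the-envelope check of NSCI objective 1:
# "approximately 100 times the performance of current 10 petaflop systems."
PETA = 10**15  # FLOPS in one petaflop
EXA = 10**18   # FLOPS in one exaflop

baseline = 10 * PETA      # a current 10-petaflop system
target = 100 * baseline   # the roughly 100x goal

assert target == EXA      # 100 x 10 PF = 1 exaflop
print(f"exascale target: {target:.0e} FLOPS")
```

The same prefix ladder applies to the “big data” side of the initiative: an exabyte is 10^18 bytes, the storage-scale analogue of the exaflop compute target.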

Many of the objectives echo plans already underway in the current Exascale Computing Initiative run by the DOE and the National Nuclear Security Administration. This effort, however, seems broader, and as the roster of planned NSCI participants indicates, it will be a huge undertaking.

Three agencies will lead: the Department of Energy (DOE), the Department of Defense (DOD), and the National Science Foundation (NSF). Also named are two foundational research and development agencies – the Intelligence Advanced Research Projects Activity (IARPA) and the National Institute of Standards and Technology (NIST). The five deployment agencies identified are the National Aeronautics and Space Administration, the Federal Bureau of Investigation, the National Institutes of Health, the Department of Homeland Security, and the National Oceanic and Atmospheric Administration.

An NSCI Executive Council, co-chaired by the director of the Office of Science and Technology Policy (OSTP) and the director of the Office of Management and Budget (OMB), will oversee NSCI activities. This council has been charged to “establish an implementation plan to support and align efforts across agencies” within 90 days and to update it annually. It’s well worth reviewing the relatively short full text of the executive order, which spells out in greater detail the roles and responsibilities of the various NSCI participants.

Clearly making such an ambitious program work will be challenging.

“Ultimately, the success or failure of this ambitious effort hinges on the ability of the US Government to actively engage and include both the US academic and industrial sectors to help drive US gains in this critical field. Simply developing high-end systems to meet individual agency missions will not be enough; the project needs to foster a vibrant R&D as well as commercial HPC capability to ensure that the US can continue to build and market the most effective HPCs in the world,” said Bob Sorensen, now a research vice president for HPC with IDC, who previously served as a longtime senior HPC technology analyst supporting senior US Government policymakers on global HPC developments.

This program shouldn’t be about building the fastest and most powerful high-performance computer in the world, said Sorensen, but about establishing a broad-based ecosystem that can support the most ambitious US public and private scientific research agendas while helping the growing base of US industries that rely on these systems to design, test and build products – such as automobiles, aircraft, and even specialty pharmaceuticals.

The devil, of course, is in the details and hopefully more will be revealed in the plan the NSCI Council is set to deliver before the end of the year.

On the technology front, the NSCI outline touches not only on familiar HPC challenges but also acknowledges the growing convergence of data-intensive computing with compute-intensive computing. Noted in the White House press release announcing the initiative is that “in the last 10 years, a new class of HPC system has emerged to collect, manage and analyze vast quantities of data arising from diverse sources, such as Internet web pages and scientific instruments. These ‘big data’ systems will approach scales measured in exabytes (10^18 bytes)…By combining the computing power and the data capacity of these two classes of HPC systems, deeper insights can be gained through new approaches that combine simulation with actual data.” This recognition is drawing attention and approval.

“Importantly, the NSCI embraces the idea of big data and HPC convergence, something I believe is crucial to the future of computing – for scientific discovery, for national security and for economic competitiveness. Many of the tools and technologies for big data analytics and scientific computing are similar, yet the cultures and communities are largely disparate. We must bring them together, for the benefit of both and for societal benefit. The NSCI will help do that,” said Dan Reed, vice president for research and economic development at the University of Iowa and an author of the recent ASCAC review of the DOE’s Exascale Computing Initiative.

Reed also endorsed the multi-agency approach: “One of the NSCI’s key elements is interagency collaboration, with differential roles based on each agency’s unique strengths and capabilities. These include the development and deployment of exascale systems, research on new algorithms, software and enabling technologies (including post-silicon ones), and workforce development to address the critical shortage of computing experts. I’m also pleased to see that the key role of NSF in science at the exascale (or extreme scale) is specifically called out.”

As the executive order notes, for more than six decades, “US computing capabilities have been maintained through continuous research and the development and deployment of new computing systems with rapidly increasing performance on applications of major significance to government, industry, and academia.

“Maximizing the benefits of HPC in the coming decades will require an effective national response to increasing demands for computing power, emerging technological challenges and opportunities, and growing economic dependency on and competition with other nations. This national response will require a cohesive, strategic effort within the Federal Government and a close collaboration between the public and private sectors.”

We’ll see.
