NSCI Discussion at HPC User Forum Shows Hunger for Details

By John Russell

April 20, 2016

Is the National Strategic Computing Initiative in trouble? Launched by Presidential Executive Order last July, the initiative has yet to yield public details of the draft implementation plan, which was delivered to the NSCI Executive Council back in October. Last week on the final day of the HPC User Forum, held in Tucson, Saul Gonzalez Martirena (OSTP, NSF) gave an NSCI update talk that contained, really, no new information beyond what was presented at SC15.

As he was heading out, and aware that an open discussion on NSCI was scheduled for later in the day, Gonzalez Martirena asked one of the meeting’s organizers (IDC Research VP, Bob Sorensen) to take good notes, adding “if there is any possibility send them to me by tomorrow. We are really looking for good ideas.” It’s too bad he missed the discussion.

Starved for details and perhaps becoming tone-deaf to NSCI aspirations, the late afternoon discussion was wide-ranging and concern-ridden relative to NSCI’s reality. The first member of the gathered group to venture an opinion, a very senior member of the HPC community, said simply, “It’s all [BS].” What followed was candid conversation among the forty or so attendees who stuck around for the final session of the forum.

NSCI, of course, is a grand plan to ensure the U.S. maintains a leadership position in high performance computing. Its five objectives, listed below, represent frank recognition by the U.S. government, or at least the current administration, that HPC leadership is vital for advancing science, ensuring national defense, and maintaining national economic competitiveness. Now, nearly nine months after its start, there’s still not much known about the plan to operationalize the vision.

The five NSCI objectives, excerpted from the original executive order, are:

  1. “Accelerating delivery of a capable exascale computing system that integrates hardware and software capability to deliver approximately 100 times the performance of current 10 petaflop systems across a range of applications representing government needs.
  2. Increasing coherence between the technology base used for modeling and simulation and that used for data analytic computing.
  3. Establishing, over the next 15 years, a viable path forward for future HPC systems even after the limits of current semiconductor technology are reached (the “post-Moore’s Law era”).
  4. Increasing the capacity and capability of an enduring national HPC ecosystem by employing a holistic approach that addresses relevant factors such as networking technology, workflow, downward scaling, foundational algorithms and software, accessibility, and workforce development.
  5. Developing an enduring public-private collaboration to ensure that the benefits of the research and development advances are, to the greatest extent, shared between the United States Government and industrial and academic sectors.”

Sorensen was a good choice for moderator. Before joining IDC he spent 33 years in the federal government as a science and technology analyst covering HPC for DoD, Treasury, and the White House. He was involved during the early formation of NSCI and remains an advocate; that said, he has since written that more must be done to ensure success (see NSCI Update: More Work Needed on Budgetary Details and Industry Outreach).

NSCI discussion points

Views were decidedly mixed in the audience, which was mostly drawn from academia, national labs, and industry. Sorensen kicked things off with a list of discussion points (shown at left), but discussion wandered extensively.

What did seem clear is that lacking a concrete NSCI implementation plan to react to, members of the audience defaulted to ad hoc concerns and attitudes, sometimes predictably characteristic of the segment of the HPC community to which they belonged, but also often representative of the diversity of opinion within segments. Two of the more contentious issues were the picking of winners and losers by government and the challenges in creating an enduring national HPC ecosystem through public-private efforts.

Uncle Sam Not Good At Picking Winners
Every time an RFP goes out, there’s a winner and a loser, said Sorensen. One attendee recalled the government-funded effort to develop a national aerodynamic simulator in the 1980s as something less than successful. “They funded Control Data Corp. and Burroughs Corporation. Somebody asked Cray how come you’re not going after any of that. Seymour [Cray] said ‘when I build my machine they will decide that’s the machine they really want.’ And he built the Cray 2 and Burroughs dropped out of the business. [In the end] CDC supplied a dead machine and Cray won the business. The point is for years the government has tried to pick winners and losers and hasn’t been successful.”

Bob Sorensen, IDC

Sorensen, in his introductory remarks, further noted that there is an inherent “dichotomy” in the program. “The folks who are doing this – DOE, NNSA – want the best HPC systems in the world [because] leadership here means greater potential for greater national security, [while] at the same time we want a vibrant HPC infrastructure that builds the best equipment in the world and sells it to anyone that has the money.”

Indeed, mention was made of the recent report that China – denied Intel’s latest chips roughly one year ago by the U.S. Commerce Department – would soon bring on two 100 petaflops machines made with Chinese components, with plans to benchmark one in time for the next Top500 list (June). One comment was “The Chinese are giving a gift to this program. Imagine what Trump is going to say. We are going to be portrayed as being way behind the Chinese and get out the check book because we have to catch up.”

It was hardly smooth sailing for the sprawling NSCI blueprint. Still, it would be very inaccurate to say the mood was anti-NSCI; rather, so much uncertainty remains that there was little to focus on. The devil is in the details, said one attendee. Funding, HPC training, software issues (modernization and ISV interest), big box envy, the politically charged environment, clarity of NSCI goals, and program metrics were all part of the discussion mix.

Acknowledged but not discussed at length was the fact that the NSCI might not survive the charged political atmosphere of an election year and might not be supported by the next administration. During Q&A following his earlier presentation Gonzalez Martirena was cautiously optimistic that bipartisan support around national security and national competitiveness issues was possible.

Broadly, the difficulties of democratizing HPC dominated concerns. Buying and building supercomputers for national and academic purposes is a more traveled road where best practices (and stumbling blocks) are better known.

Here is a brief sampling of a few issues raised:

The “ISV Problem”
In a rare show of consensus, many thought enticing ISVs to embrace HPC would be a major hurdle. Indeed, software challenges on several fronts – from modernizing code to run on exascale machines to simply making HPC software more widely available to industry – were discussed.

The general opinion was that unless ISVs see larger-scale HPC as a lucrative market, they won’t have the incentive to scale their software. Consequently, companies completely dependent on commercial applications would find their movement into the HPC world limited by software availability and cost.

Moreover, NSCI’s seemingly intense attention on hardware could become problematic. Throughput, at least for industrial HPC, is far more important than impressive machine specs. Perhaps, suggested one attendee, what’s needed is an X Prize of sorts to incentivize ISVs to take on this ‘world’s hardest,’ meaningful work.

The Big Box Syndrome
A fair amount of discussion was given to DOE’s and NSCI’s apparent focus on producing exascale machines. Talking about the early NSCI planning, Sorensen noted, “We talked long and hard about using exascale. It really came down to we don’t need an exascale machine, we need exascale technologies, that could be sitting on someone’s desktop. I remember the day the NSCI came out, the headline in Washington was ‘New Supercomputer’. It’s like no, don’t you understand. We are not talking about the top ten systems anymore; we need to at least deal with 100,000 technical servers out there.”

Certainly academia, national labs, and DOE do care about big machines. One person said these programs always make him wonder if there’s a hidden agenda by “people who just always want to get the fastest system and NSCI is sort of being steered in that direction.”

HPC Workforce
The HPC skill shortage is a widely acknowledged problem. Young talent races to the start-up world, not HPC. Several approaches were bandied about, ranging from better use of formal training at the national labs and DOE to creation of new outreach programs. Even so, one attendee also wondered if a small company with limited engineering talent would be able or willing to allow those resources to get needed training.

Getting the word out about existing training resources is an issue, said one attendee, who noted DOE doesn’t have a marketing budget per se to alert companies that training is available at centers. “It’s not like a company that has a marketing budget like Intel and IBM that’s going out and telling people all the time about this. That’s probably a barrier to getting the word out about what resources are available.”

What Should Success Look Like?
For all its grand goals, the gathering wondered what NSCI success should look like, particularly if the aim is to achieve more than incremental gains in economics or science, rather than simply to build an exascale computer.

Merle Giles, director of Private Sector Programs and Economic Impact at the National Center for Supercomputing Applications, and co-editor of the text, Industrial Applications of High-Performance Computing: Best Global Practices, said “Look at the game-changing events that affected the economy in this country. They were all an order of magnitude of 10X to 100X changes. It was railroad, [etc]. We don’t need to extract those last ten percent of performance of the machine. We need 10X to 100X and we can be really sloppy and still be really good. The 10X to 100x is not just the technology – it’s not exascale that will change the entire nation. It is greater access [to HPC resources] for those who can take advantage of that access.”

In that vein, another attendee added that NSCI is a projection of what was done in the past. What’s needed instead is to fundamentally think differently, saying, “Probably the biggest advantage comes from miniaturization of systems, not the biggest systems.”

One missing element to the entire program, agreed Gonzalez Martirena after his presentation, is more extensive interaction with industry. He showed a slide of responses to the RFI issued by NSF last fall (shown here) indicating roughly 200 academia/national lab responses with just eight from industry. Perhaps industry should form a group of representatives that works with NSCI, suggested Gonzalez Martirena to HPCwire.

Sorensen indicated IDC would send along the group’s comments to NSCI and Gonzalez Martirena, who recently moved back from OSTP to his position as program director of the division of physics at NSF. It seemed clear from the breadth of the discussion that the lack of a definite NSCI plan has created something of a vacuum for the HPC community and an impatience for more detail.

One attendee offered, “How many people are left in the room, 40? I’d be willing to bet there are 40 different visions about what [NSCI] success looks like. We’re having lots of conversation but I think we are down in the weeds.”

NSCI Resources
NSCI Update: More Work Needed on Budgetary Details and Industry Outreach; http://www.hpcwire.com/2016/03/10/nsci-update-more-work-needed-on-budgetary-details-and-industry-outreach/

Speak Up: NSF Seeks Science Drivers for Exascale and the NSCI; http://www.hpcwire.com/off-the-wire/sc15-releases-latest-invited-talk-spotlight-randal-bryant-and-tim-polk/

HPC User Forum Presses NSCI Panelists on Plans; http://www.hpcwire.com/2015/10/06/speak-up-nsf-seeks-science-drivers-for-exascale-and-the-nsci/

Podcast: Industry Leaders on the Promise & Peril of NSCI; http://www.hpcwire.com/2015/08/27/podcast-industry-leaders-on-the-promise-peril-of-nsci/

New National HPC Strategy Is Bold, Important and More Daunting than US Moonshot; http://www.hpcwire.com/2015/08/06/new-national-hpc-strategy-is-bold-important-and-more-daunting-than-us-moonshot/

White House Launches National HPC Strategy; http://www.hpcwire.com/2015/07/30/white-house-launches-national-hpc-strategy/

President Obama’s Executive Order ‘Creating a National Strategic Computing Initiative’; http://www.hpcwire.com/off-the-wire/creating-a-national-strategic-computing-initiative/
