HPC User Forum Presses NSCI Panelists on Plans

By John Russell

September 17, 2015

In less than two months, the National Strategic Computing Initiative (NSCI) Executive Council must present its implementation plan. Just what that will look like remains a mystery, but budgets, governance, and public-private partnering models were on the minds of attendees at last week’s HPC User Forum in Broomfield, CO, where the first public panels with NSCI agencies offered a wide-ranging glimpse into agency thinking. It was also a chance for HPC industry execs to press for more details.

At the moment, industry enthusiasm for NSCI is high, and panelists strove to reinforce that goodwill and reassure attendees that disabling missteps could be avoided. On technology issues, despite differences around the edges, there was broad agreement among the panelists and attendees on the problems needing solving, with large-scale data analytics as the new but important kid on the block. Governance and collaboration challenges drew a warier response from the audience, although panelists insisted conflicts would be dealt with amicably and equitably.

In the end, the conversation around non-technology issues was the most revealing, with many attendees wondering what potential obstacles the NSCI panelists foresaw. Funding, perhaps not surprisingly, was a touchy topic, particularly given the many calls for NSCI to emulate the U.S. Apollo program, which galvanized public opinion and loosened federal purse strings.

Playing devil’s advocate, Barry Bolding, Cray (NASDAQ: CRAY) senior vice president and chief strategy officer, said to panelists during Q&A, “I’d like to push the panel in the direction of pitfalls a little bit and hear about things you think could be gotchas. You’ve mentioned the space program a few times and how this might be a corollary and one can look at the space program and all of us agree it benefited the country a great deal. [But] one can also look at it and say, oh it’s been 45 years and we’re only beginning to have a vision for unmanned space program and only beginning to get private industry into space programs. So it wasn’t very successful.”

Put another way: what did we get for the money, and, given the times, can we get those kinds of budgets going forward?

Randy Bryant, OSTP

This drew a pragmatic response from OSTP representative Randy Bryant: “It’s hard to get significant federal funding output [now]. Flat is the new normal; that’s true across the entire research budget and I see that as a core problem. The Apollo program was a great program but it consumed a significant fraction of the U.S. GDP. It was a huge investment, and I don’t anticipate in our current budget climate that would be possible,” he said. “It helped that there was an existential threat of the Soviet Union at the time. I don’t see anything that’s going to make us step up at that level.”

Rob Leland, one of the original organizers of the NSCI proposal and a representative from Sandia National Laboratories, countered by saying the real gotcha is to not sufficiently fund and execute NSCI.

“The U.S. used to dominate investment in this space quite dramatically. In fact, up until about 2010 or so, U.S. investment was equal to the rest of the world combined, and [it’s] now about a third of the total investment. More worrying is the disparity in growth rates. The U.S. growth rate in investment is about 2.5% [while] the average for the rest of the world that is engaged in this space is about 12%, and I think China is up to about 23%.

“If that disparity persists for five or 10 years we will not dominate this space technologically the way we have previously,” Leland said.
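
Taken at face value, Leland’s numbers imply a rapid erosion of U.S. share. The following is a back-of-the-envelope sketch (ours, not Leland’s), assuming the U.S. starts at roughly one third of world investment and the quoted growth rates hold for a decade:

    # Rough illustration of how the growth-rate gap Leland cites compounds.
    # Assumptions for the sketch: U.S. starts at ~1/3 of world HPC investment
    # and grows 2.5%/yr, while the rest of the world grows 12%/yr.
    us, rest = 1.0, 2.0   # U.S. is ~1/3 of the total, so rest-of-world is ~2x the U.S.
    for year in range(10):
        us *= 1.025
        rest *= 1.12
    print(f"U.S. share after 10 years: {us / (us + rest):.0%}")   # roughly 17%

Under those assumptions, the U.S. share of world investment falls from about a third to roughly a sixth in ten years, which is the trajectory Leland is warning about.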

Robert Leland, Sandia NL

If that wasn’t compelling enough, Leland emphasized that the “erosion of Moore’s Law” has upped the ante in HPC competition. “If we don’t rally effectively as a society around that challenge, the technical path forward is very unclear. I think there are also good indicators we’re coming to the end of the MPP era, and so if we don’t make a transition to some new architecture approach, I think we will be on a path of less relevance.”

In many ways, the event marked the beginning of NSCI’s public outreach to industry for a program that, among other things, is designed to energize public-private partnering for the good of both. There were two panels: 1) US Plans for Advancing HPC: Potential Implications of the White House Executive Order and NSCI, and 2) Open Forum Discussion and Q&A of the NSCI Plans and Directions.

The roster of panelists was impressive: Bryant, OSTP; Irene Qualters, National Science Foundation; Doug Kothe, Oak Ridge National Laboratory; Will Koella, DOD/NASA; Bert Still, Lawrence Livermore National Laboratory; Piyush Mehrotra, NASA Advanced Supercomputing Division; Bill Kramer, NCSA; Nathan Baker, Pacific Northwest National Laboratory; and Leland, Sandia. Bob Sorensen of IDC moderated both panels.

As a rule, panelists directed their comments to one or another of NSCI’s five strategic objectives excerpted from the Executive Order here:

  1. Accelerating delivery of a capable exascale computing system that integrates hardware and software capability to deliver approximately 100 times the performance of current 10 petaflop systems across a range of applications representing government needs.
  2. Increasing coherence between the technology base used for modeling and simulation and that used for data analytic computing.
  3. Establishing, over the next 15 years, a viable path forward for future HPC systems even after the limits of current semiconductor technology are reached (the “post-Moore’s Law era”).
  4. Increasing the capacity and capability of an enduring national HPC ecosystem by employing a holistic approach that addresses relevant factors such as networking technology, workflow, downward scaling, foundational algorithms and software, accessibility, and workforce development.
  5. Developing an enduring public-private collaboration to ensure that the benefits of the research and development advances are, to the greatest extent, shared between the United States Government and industrial and academic sectors.

Not surprisingly, panelists’ comments largely reflected their specific agency missions, which helpfully made clear that priorities differ somewhat. To a large extent the list of technology issues tackled was very familiar to anyone in HPC: the end of Moore’s Law; the death of single-thread performance; power management; the need for higher-fidelity models; the flood of data from scientific and other instruments and sensors; code modernization; and future computing (quantum, neuromorphic, et al.). You get the picture.

The problems are plentiful and solutions scarce, but that’s rather the point of NSCI. Big data and code modernization took center stage. Here are a few examples:

  • Kothe (ORNL) pushed the importance of codesign, citing ongoing work: “In DOE there are three of those centers and I know the NNSA labs are heavily involved in that activity. In the ECI (the DOE Exascale Computing Initiative, which is being somewhat subsumed into NSCI) project we see that activity continuing and growing. It’s critical because we are looking at some fairly substantial challenges at least on the applications side; the scariest are the deep memory hierarchies, probably more so than hybrid floating point.”
  • Mehrotra (NASA) locked in on big data issues: “We are very interested in the convergence of data analytics with HPC. Our satellites produce petabytes of data every year streaming down. This is observational data. How do we handle that data, how do we manage that data, and then how do we actually extract any knowledge out of that in conjunction with not just observational data [but with] model data. We are very concerned about how to bring the two environments together so that we can do quantitative simulation along with large-scale data analytics.”
  • Still (LLNL) sounded a familiar note on code modernization: “You’ve heard the acronym IC used for Intelligence Community; we use it in a slightly different way, for integrated code. The gist is our IC [effort] is multi-million lines of code across the three labs within the NNSA; it’s kind of a $6B investment in code. We can’t rewrite it overnight and take advantage of each new architecture that shows up because the codes are decade-old type codes. We have to make modifications or reengineer and revalidate. So performance portability is an absolute key inside the NSCI. We are all about trying to make usable machines. That is a key component as far as we’re concerned.”
  • Baker (PNNL) offered: “The amount of data that comes off a big instrument is too high a bandwidth to even write out to a box. So you’ve got a ‘baby and bathwater’ conundrum. We spend billions of dollars looking for rare particles and yet the data is coming out at a rate that we may have to triage; we may lose what you’re looking for. How do you design robust algorithms that can handle that? How do you design algorithms that can detect what you need to detect and, although you’d love to keep all the data, triage what you have to triage?”
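
Baker’s question about triaging data arriving too fast to store has no single answer, but one common pattern is to keep everything that clears a detection threshold plus a statistically fair sample of the rest. The sketch below is purely illustrative and is not any lab’s pipeline; the threshold, reservoir size, and synthetic stream are assumptions made for the example:

    import random

    # Illustrative stream triage: keep every event above a detection threshold,
    # plus a fixed-size uniform reservoir sample of the background for later study.
    def triage(stream, threshold=0.999, reservoir_size=1000):
        kept, reservoir, seen = [], [], 0
        for event in stream:
            if event >= threshold:            # candidate rare event: always keep
                kept.append(event)
            else:                             # background: reservoir-sample it
                seen += 1
                if len(reservoir) < reservoir_size:
                    reservoir.append(event)
                else:
                    j = random.randrange(seen)
                    if j < reservoir_size:
                        reservoir[j] = event
        return kept, reservoir

    # Synthetic million-event stream standing in for an instrument feed.
    kept, sample = triage(random.random() for _ in range(1_000_000))
    print(len(kept), len(sample))

Real pipelines are far more sophisticated, but even this toy captures the point Baker raises: whatever the filter keeps is all you will ever have.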

Transferring NSCI-generated technology advances to industry received perhaps shorter shrift than one might expect. Kramer (NCSA) strongly suggested that HPC-as-a-service is a necessary component of any realistic approach to inducing widespread use of HPC technology across most of industry.

“We’re taking the management of IP and partnerships and relationships very seriously,” said Kothe (ORNL), “and that scope is probably not as deep and broad as it should be. I neglected to show our structure, which calls for councils: an industry council, a science council, a board of directors. Not that boxology fixes everything, but I think at least we’re implementing lessons learned and best practices from past projects.”

Attendees dedicated a fair amount of discussion to coordination challenges within the NSCI organizational framework. Competition among government agencies for power and budget is hardly rare. Suggestions ranging from close coordination to loose coordination, a sole lead agency, a collective agency lead, and other arrangements were all raised at some point.

Irene Qualters, NSF

Qualters (NSF) said simply, “I think this is a very aggressive program and there’s not one path forward. I think one has to be careful. One wants a fair amount of innovation at this stage and diversity. So there can be coordination but I think the [idea] that you just have everyone marching in one line is wrong too.”

What came through is a hunger for a good model for success. NSCI is a sprawling program, and sprawling programs have been tackled before (e.g., the Large Hadron Collider), producing many lessons. One audience member looked back to the rural electrification project of the late 1930s as a good model, particularly from a public-private partnership perspective: it was long-lived and it worked. The HPC initiative of the early 1990s (the High Performance Computing Act of 1991) seemed to be the most favored.

As an early architect of the NSCI directive, Leland offered this: “I think there is an excellent analog in the HPC initiative of the early 1990s, which is generally viewed as quite successful, and I think we can hope to replicate that success. [If] you look at history, I think each major new era in computing has been preceded by five to seven years by a forward-looking investment by the government in R&D. [The pattern can be] traced back at least five cycles. I think that can be true again here. There are many indicators that we are approaching a wall and need to make a substantial jump in our capabilities and a change in our approach. I think all the preconditions are here for us to replicate that history once again with a sixth cycle.”

It will be interesting to see how opinions shift once the implementation plan comes out.
