Almost exactly a year ago, the White House announced an ambitious collaboration across academia, government and private business: the COVID-19 HPC Consortium. Over the intervening 12 months, the consortium brought together 43 high-profile members, 600 petaflops of compute power and extensive expertise to provide 98 teams across 17 countries with the computational resources they required to fight COVID-19 through cutting-edge research. “And all of it was achieved without exchange of money or contracts, but rather simply thanks to the donation of time, talent and determination,” said Dario Gil, director of IBM Research. “The consortium is proof that we were able to act fast and act together.”

But even with the pandemic waning, the consortium is still meeting weekly – and the individuals and institutions behind the massive effort aren’t resting on their laurels. In the one-year update for the COVID-19 HPC Consortium, many of them spent their time throwing their support behind an even more ambitious, longer-term project: a National Strategic Computing Reserve (NSCR).
What would the NSCR look like?
“Now, it’s time to go even further,” Gil said. “We should use the lessons we’ve learned so far thanks to the consortium, the knowledge and the experience gained this past year, to address future global crises. We should create a broader international organization exactly for that. … A few months ago, building on the consortium’s foundation, we started developing [the NSCR].”
The NSCR, Gil explained, would not serve a specific class of crisis. Rather, it would be a crisis catch-all, ready to spin up at a moment’s notice to serve any urgent need in the public interest. “Computing is a core element of so many important capabilities. It’s essential to properly respond to public crises, to ensure public health and safety and to protect critical resources and infrastructure,” Gil said, offering space missions, hurricanes, earthquakes, oil spills, wildfires, and yes, pandemics as examples of potential applications for the NSCR.
In a panel discussion during the one-year update, much of the conversation revolved around the idea of the NSCR.
“It’s clear from the COVID-19 experience … that computing and data analysis will play an increasingly important role in addressing future national emergencies, whether they will be pandemics or other events,” said Manish Parashar, office director of the Office of Advanced Cyberinfrastructure at the National Science Foundation (NSF). “Our national advanced computing infrastructure that spans academia, government, industry and non-profits can be a strategic national asset and can serve as an important tool in our response to these events if it can be mobilized and made available to researchers quickly and in an agile way, as we did in response to COVID-19.”
The NSCR, Parashar explained, was not only necessary to take advantage of opportunities in computing, but also to address problems that have arisen during the operation of the COVID-19 HPC Consortium.
“The consortium has also taught us that using the ad-hoc structural processes – as we did in the case of the HPC consortium due to the urgency of the situation – without longer-term planning can have some undesired impacts,” Parashar said. Researchers, he elaborated, suddenly had to put other important science and engineering projects on the back burner, delaying advances in the broader research ecosystem and impacting U.S. competitiveness. Furthermore, many of the participating researchers had to work overtime during an already psychologically stressful pandemic, stretching them thin.

What would the NSCR need to succeed?
The NSCR has already moved out of the purely conceptual stages – if just barely. Just before Christmas, the NSF and the White House Office of Science and Technology Policy (OSTP) issued a Request for Information (RFI) on “Potential Concepts and Approaches for a National Strategic Computing Reserve (NSCR).” “The NSCR,” the RFI reads, “may be envisioned as a coalition of experts and resource providers that could be mobilized quickly to provide critical computational resources (including compute, software, data, and technical expertise) in times of urgent need.”
The RFI received several responses, Parashar revealed in the panel. “The responses consistently and very strongly expressed support for this concept of a National Strategic Computing Reserve, as well as its potential positive impacts,” he said, adding that they highlighted the need for well-defined operations, as well as strong governance and oversight structures. “For example, we need clearly defined processes for activating the reserve and for returning to normal operations when the role of the reserve has completed, right?”
The panelists acknowledged that the NSCR will need to be carefully structured to both maximize benefits and minimize pitfalls. Parashar, for his part, highlighted the importance of working out data-sharing and IP agreements ahead of time to head off the delays such concerns can impose at the onset of a crisis. Pat Falcone, deputy director for science and technology at Lawrence Livermore National Laboratory (LLNL), also spoke to battle-readiness, emphasizing the need for any NSCR to have both the commitment and the resources to practice: “Rehearsals matter. Practice matters,” she said.
Kelvin Droegemeier, who directed the OSTP through the pandemic until January, introduced two points for consideration: first, the general need for the NSCR to minimize bureaucracy to expedite innovation; second, the need for clear, consistent messaging from a “single authoritative source” to minimize disinformation and confusion.
Finally, Geralyn Miller, senior director of the AI for Good Research Lab at Microsoft, stressed the importance of building a more proactively inclusive consortium through diversity in the types of proposals and the people and institutions submitting them.
“Maybe there is a way in the future to provide a little more upfront formal mentoring for the technical review process,” she said. “We might have a diverse set of proposals coming in from people – for example, early-career PIs – or it could be people who might not typically apply for this type of research, and helping them shore up that proposal … not only helps us review it properly but also helps the investigator, as well.”
A way to rebuild
“This consortium is a realization of my long-held belief that when the world needs us, the science and technology community will be there to help,” said Maria Zuber, vice president for research at MIT, adding: “[The NSCR] is one of the many ways that we can address President Biden’s stated intention to rebuild trust in science and ensure the power of science is at the forefront as America build[s] back better.”
In closing, Gil compared the idea of the NSCR to institutions like DARPA and NASA that emerged from the stressors of the Cold War. “Perhaps also in the context of these crises, we will see evolving institutions in science and technology – and … [that] could be the very beginning of what we’re trying to create here together.”