COVID-19 isn’t over – not even close. With about six months until broad vaccine distribution is expected, the world will likely face a long, difficult winter before the pandemic begins to truly wane. Still, with the anniversary of the pandemic swiftly approaching, many in the scientific community are beginning to look back on the early months of COVID-19 to examine what worked (and what didn’t) when the world scrambled to respond to the global crisis. At SC20, four panelists gathered to discuss the roles HPC has played in this pandemic – and what will need to change for advanced computing to respond more successfully to the next one.
The plenary session — titled “Advanced Computing and COVID-19: It’s More Than HPC” — featured four distinguished panelists.
Rommie Amaro is a professor and endowed chair of chemistry and biochemistry at the University of California, San Diego. She studies computational methods in biophysics, including work to develop the first complete all-atom model of SARS-CoV-2’s viral envelope.
Alessandro “Alex” Vespignani is a professor of physics and the director of the Network Science Institute at Northeastern University. Vespignani describes his technosocial research on COVID-19 as “what is done by numerical weather forecasting, but in the area of infectious diseases.”
Ilkay Altintas is a data and computer scientist who serves as the chief data science officer of the San Diego Supercomputer Center (SDSC). Altintas is also the director of the WIFIRE Lab, which uses data science to fight wildfires, a challenge she likens to fighting COVID-19.
Rick Stevens is associate lab director for computing, environment and life sciences at Argonne National Laboratory. Stevens has been applying machine learning techniques to drug discovery for COVID-19 in an effort to compress the early stage of drug development, which normally takes years to complete.
HPC rose to the occasion…
HPC, of course, has been crucial to fighting the pandemic. As early as the final months of 2019, computing systems were alerting researchers and policy-makers to the rise of the coronavirus; over the following year, the world’s supercomputers and cloud systems moved with common purpose to understand SARS-CoV-2, stem its spread and quickly develop therapeutics and vaccines to fight its progression in the human body.
The panelists spoke to their personal experiences with this fight. Amaro’s all-atom model, for instance, will help identify crucial drug targets on the virus’ spike protein by modeling its glycan shield; Vespignani has been working to model the spread of COVID-19 using commuter network data; Altintas launched the TemPredict project, which is using wearable health monitors to create an early alert system; and Stevens’ machine learning approach has identified 40 promising molecules that are currently undergoing further experimental analysis.
… but experimental bottlenecks stood in the way.
“The rate-limiting step right now isn’t our ability to train machine learning models,” Stevens said. “It’s really the experimental process at the other end. We’re able to make a lot of predictions much faster than we can assay them, and certainly much faster than we can get compounds.”
The other panelists agreed. Offering a “sobering perspective,” Vespignani lamented delays in data reporting and the generally low granularity of available data, both of which had limited the efficacy of high-performance approaches to data analysis. “It’s really the real-world data that is holding us back,” he said.
“There really are a lot of challenges,” Amaro added, saying that “good structural starting data points” were necessary, and that data collection processes and workflows could be better optimized.
Stevens went a step further, calling for automated data acquisition, automated experiments, automated chemical synthesis and more. “Alex [Vespignani] talks about delays in reporting data – well, why the hell aren’t we having mechanisms that can collect this data automatically?” he exclaimed. “Why do people have to be in that loop?”
“… why the hell aren’t we having mechanisms that can collect this data automatically?”
“I think it’s important to recognize that there are many ways we can collect data today,” Altintas said, citing solutions ranging from social media analysis to sensors embedded in infrastructure, like sewage monitoring systems that can detect SARS-CoV-2. “I think the solution to data collection will come from integrating different streams of data into a knowledge environment that many types of questions can be informed with. … It all scales from the atom to the person to the societal level.”
Worries – and hopes – remain for the future
Even with this clear vision of the future in mind, the panelists worried about whether or not it could be realized in time for the next global crisis.
“One of the things that sort of does keep me up at night is: how do we help to make these sort of cultural swings?” Amaro said. “Is it with these sort of leading-edge examples? … Is there another way that we help to catalyze the social construct change?”
The panelists offered their thoughts on the roadblocks standing in the way of these transformative changes. Altintas said that some of the problem was legal, with impediments stemming from the challenges and responsibilities that come with data handling. Stevens mused on whether it would be better to work outside the healthcare system (particularly in the U.S.) to avoid trudging through the bureaucracy of health data management. Amaro cited the resistance she had experienced from the federal government on coordinating small molecule discovery on a national level, and expressed hope that such an effort might be revisited under another administration.
Some of the panelists also expressed concern that the wartime lessons of COVID-19 would not be retained by a peacetime world: would the infrastructure built during COVID-19 be dismantled after some years without another pandemic? Would governments invest in the additional policies, training and infrastructure necessary to respond quickly to the next health crisis?
“This is our preparation for SARS 3.”
“This is our preparation for SARS 3,” Amaro said. “This is our preparation for when we lose antibiotics, this is our preparation for when it’s so hot that everything starts to die. We’re going to have to adjust to working in super-high-stress, super-integrated environments.”
“We’re gonna have this crisis of domain expertise,” Stevens said. “We’re gonna get lots and lots of datasets, and we’ll have CS people, and architectures that can do AI, but we’re gonna have this rate-limiting step, which is gonna be … actually understanding anything.”
While the future remains disquietingly unclear, the aspiration couldn’t have been clearer.
“We have to push into the experimental space to build a balanced ecosystem where the simulation, the AI and the experiments are gonna come along at kind of the same rate,” Stevens insisted. “It’s the business, of course, of SC to dream these things.”