Using HPC Cloud, Researchers Investigate the COVID-19 Lab Leak Hypothesis

By Oliver Peckham

May 27, 2021

At the end of 2019, strange pneumonia cases began cropping up in Wuhan, China. As Wuhan (then China, then the world) scrambled to contain what would go on to become the worst pandemic in a century, speculation about its origins reached a quick, if hasty, conclusion. The virus, many reasoned, must have come from Wuhan’s wet markets: open-air markets selling seafood and meat, and, in Wuhan’s case, many illicit animals (such as bats and pangolins) known to be major disease vectors. Chinese bats, in particular, are thought to be the origin of the original SARS virus (now known as SARS-CoV-1).

But what if that’s not what happened? Over the last year, a team of Australian researchers worked to decipher the origins of SARS-CoV-2 using molecular dynamics and cloud-based HPC resources – and the results may lend further plausibility to a less natural scenario for those first COVID-19 infections. In a talk for Virtual ICM Seminars (part of Supercomputing Frontiers Europe 2021), David Winkler (a professor at La Trobe University, Monash University and the University of Nottingham) discussed his team’s work.

Mysterious origins

Over the course of the pandemic – now past the year-and-a-half mark – the so-called “lab leak hypothesis,” which argues that SARS-CoV-2 escaped from the Wuhan Institute of Virology, has gained traction – and, increasingly, legitimate traction. What was once dismissed as a far-fetched conspiracy theory (and one quickly shot down by the World Health Organization) is being revisited by researchers around the world. Most recently, those flames were stoked by revelations that a group of workers from the Wuhan Institute of Virology fell ill in early November 2019 – news that was quickly followed by President Biden ordering the U.S. intelligence apparatus to determine the origins of the deadly virus.

Some, however, have been skeptical of the virus’ origins for much longer. “There are a number of questions that arise,” Winkler said. “How did the disease arise? How did that jump from some other species into men? How did it adapt to men once it was in men?”

At first, he, like many, applied Occam’s razor, suspecting bats. “Bats were probably the most likely candidate, because there’s a bat coronavirus that’s quite similar to the current SARS-CoV-2,” he said. “But there needs to be an intermediate animal to bridge between bats and humans.” Without a viable intermediate animal to serve as a bridge, natural zoonotic transmission would appear far less likely – and lab escape, far more.

Beginning early in 2020, Winkler and others were wrangled by Nikolai Petrovsky, a professor in the College of Medicine and Public Health at Flinders University and chairman and research director of the Australian biotech company Vaxine (which itself has developed an as-yet-unreleased COVID-19 vaccine). “Nick is quite an entrepreneurial guy,” Winkler said, “and he got a team of people together working on molecular dynamics and homology modeling and so forth to try to understand what was going on here.” 

“We wanted to look at this question: where did the virus come from?” Winkler said. “No one really knows. There’s been, I guess, a bit of an active effort to not canvass the escape-from-a-lab situation, but I think we need to look at all possibilities because all of them are feasible.”

The binding process between the spike protein and the ACE2 protein. Image courtesy of the researchers.

The researchers decided to approach the question from the angle of binding affinity – that is, testing how well the spike protein from SARS-CoV-2 binds to the ACE2 receptors of host organisms. This, they reasoned, would indicate how likely it was that the virus infected, say, a bat, and in turn how likely it would be for the infected bat to infect a human.

“What’s the susceptibility of these species that have been implicated as being potential intermediates in the transmission?” Winkler said. “And do we have to worry about other animals like companion animals – dogs, cats, birds – and farm animals – like horses, sheep, chickens – being a reservoir for the disease? And potentially for us to pass the disease to them, and then from them to pass back to us, which would be rather disastrous?”

With these questions in mind, the team got to work.

Working from the ground up

“So, what we wanted to do is to look at the very first interaction of the virus with humans,” Winkler said. There was one issue: at the time, no 3D structure of the spike protein was available. So, using a viral genome sequence retrieved from the NCBI GenBank database in January 2020, the team built their own. (Later, when such structures became widely available, theirs held up well.)
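
As a rough illustration (the article does not name the team’s tooling), the sketch below shows how such an early genome record could be pulled from GenBank with Biopython’s Entrez interface and the spike coding sequence extracted. The accession (MN908947, the Wuhan-Hu-1 isolate deposited in January 2020) and the contact email are illustrative placeholders, not details from the talk.

```python
# Illustrative sketch only: fetch an early SARS-CoV-2 genome record from
# NCBI GenBank and extract the spike ("S") coding sequence with Biopython.
from Bio import Entrez, SeqIO

Entrez.email = "researcher@example.org"  # NCBI asks for a contact address

handle = Entrez.efetch(db="nucleotide", id="MN908947",  # placeholder accession
                       rettype="gb", retmode="text")
record = SeqIO.read(handle, "genbank")
handle.close()

# Translate the spike coding sequence; this amino-acid sequence is the
# starting point for building a 3D model of the spike protein.
for feature in record.features:
    if feature.type == "CDS" and "S" in feature.qualifiers.get("gene", []):
        spike_aa = feature.extract(record.seq).translate(to_stop=True)
        print(f"Spike protein: {len(spike_aa)} residues")
        break
```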

The next part of the puzzle was the ACE2 receptor – or rather, ACE2 receptors. “[The] human [ACE2 receptor model] was available, of course, but we wanted to look at a range of other animals,” Winkler said. “So we had to build models of the ACE2 proteins for other species, because there were no crystal structures available.” 

The species at hand included not only humans, bats and pangolins, but also monkeys (due to their similarity to humans), pets (dogs and cats), farm animals (horses and cattle), common lab test subjects (civets, ferrets and mice), snakes (the king cobra) and tigers (some of which had been diagnosed with COVID-19). The researchers built models of the ACE2 proteins for all of them. (Subsequently, some of the crystal structures appeared in the Protein Data Bank and the researchers were able to use those more accurate versions – but they found their own structures to be highly accurate.)
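
Homology modeling of this kind typically works by threading each species’ ACE2 sequence onto a known structure. The article does not say which modeling package the team used; the sketch below shows the general shape of such a job using MODELLER, one widely used option, with entirely hypothetical file and code names.

```python
# Hypothetical sketch of building a species-specific ACE2 homology model with
# MODELLER. The alignment file, template code and sequence code are placeholders.
from modeller import environ
from modeller.automodel import automodel

env = environ()
env.io.atom_files_directory = ['.']      # directory holding the template PDB

a = automodel(env,
              alnfile='ace2_cat.ali',    # target-template sequence alignment
              knowns='ace2_template',    # code of the known (template) structure
              sequence='ace2_cat')       # code of the target sequence (e.g. cat ACE2)
a.starting_model = 1
a.ending_model = 5                       # build five candidate models
a.make()                                 # writes model PDBs plus a score table
```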

The HPC of it all

With the structures in hand, the researchers turned to the heavy-duty computational elements of the research: simulating the virus’ spike proteins binding to the slew of ACE2 receptors in a realistic manner that accounted for uncertainties. 

They began with HDOCK, a state-of-the-art protein-protein docking package, which performed the initial docking calculations. The docked complexes were then optimized using the 2020 edition of GROMACS, a molecular dynamics package dating back to the early ’90s that remains one of the most popular tools for MD simulations and has been widely applied during the pandemic to study spike protein interactions.
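
The article gives no setup details beyond the software versions, but a typical GROMACS 2020 pipeline for turning a docked spike-ACE2 complex into a runnable MD system looks roughly like the sketch below. The force field, water model, box size and file names are illustrative assumptions, not the team’s settings.

```python
# Rough sketch of a standard GROMACS preparation pipeline driven from Python.
import subprocess

def gmx(*args):
    """Run a gmx subcommand and fail loudly if it errors."""
    subprocess.run(["gmx", *args], check=True)

# Build a topology for the docked complex (force field and water model are
# just common choices here, not details from the article).
gmx("pdb2gmx", "-f", "docked_complex.pdb", "-o", "complex.gro",
    "-p", "topol.top", "-ff", "amber99sb-ildn", "-water", "tip3p")

# Define a periodic box around the complex and fill it with water.
gmx("editconf", "-f", "complex.gro", "-o", "boxed.gro",
    "-c", "-d", "1.0", "-bt", "cubic")
gmx("solvate", "-cp", "boxed.gro", "-cs", "spc216.gro",
    "-o", "solvated.gro", "-p", "topol.top")

# Compile and run an energy minimization (em.mdp supplied separately).
gmx("grompp", "-f", "em.mdp", "-c", "solvated.gro", "-p", "topol.top",
    "-o", "em.tpr")
gmx("mdrun", "-deffnm", "em")
```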

GROMACS is a heavy-duty tool, and performing the intensive simulations required to test the hypothesis necessitated correspondingly high-powered computing. In that regard, the team was supported by Oracle Cloud, which has a wide range of programs in place to support researchers with computational time and expertise. “We had computational resources very kindly given to us by the Oracle Cloud system and the Oracle corporation,” Winkler said. “They were very generous with their computational time.” Oracle supplied the researchers with (presumably Nvidia-powered) GPU nodes, allowing the team to use the GPU-accelerated version of GROMACS.
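
For reference, GPU offload in GROMACS 2020 is requested through standard mdrun flags; the sketch below shows one plausible invocation. The hardware split (one GPU per run, eight CPU threads) and the file prefix are assumptions, not details from the talk.

```python
# Minimal sketch of launching a GPU-accelerated GROMACS 2020 production run.
import subprocess

subprocess.run([
    "gmx", "mdrun",
    "-deffnm", "md_prod",  # input/output prefix (placeholder name)
    "-nb", "gpu",          # offload short-range nonbonded interactions
    "-pme", "gpu",         # offload long-range PME electrostatics
    "-bonded", "gpu",      # offload bonded interactions
    "-ntmpi", "1",         # one thread-MPI rank per GPU
    "-ntomp", "8",         # OpenMP threads per rank (hardware-dependent)
], check=True)
```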

First, they ran a 500-nanosecond production run simulating the binding between the spike protein and the human ACE2 protein. After confirming that 50 nanoseconds was enough to capture convergence, they simulated 100 nanoseconds for each of the remaining bindings. All of these were put through multiple production runs using various random starting seeds to capture the uncertainties in the calculations and models. (“We did this as carefully as we possibly could,” Winkler stressed. “We ran the calculations multiple times and so forth, trying to estimate the accuracy.”) After all these runs, a GROMACS tool was used to calculate the binding energies.
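
A minimal sketch of how such replicas might be configured is shown below. Only the run lengths and the use of varied random starting seeds come from the article; the 2 fs timestep, the thermostat settings and the replica count are illustrative assumptions.

```python
# Sketch: generate GROMACS .mdp run-control files for replica simulations,
# varying only the random seed used to generate starting velocities.
DT_PS = 0.002        # 2 fs timestep, expressed in picoseconds (assumed)
NS_TO_PS = 1000

def write_mdp(path, length_ns, seed):
    nsteps = int(length_ns * NS_TO_PS / DT_PS)
    with open(path, "w") as f:
        f.write(f"""\
integrator  = md
dt          = {DT_PS}
nsteps      = {nsteps}      ; {length_ns} ns of simulated time
gen_vel     = yes           ; regenerate starting velocities...
gen_seed    = {seed}        ; ...from a different random seed per replica
tcoupl      = V-rescale     ; thermostat settings here are assumptions
tc-grps     = Protein Non-Protein
tau_t       = 0.1 0.1
ref_t       = 300 300
""")

# Three replicas per complex (replica count is an assumption), each with its own seed.
for species, length_ns in [("human", 500), ("bat", 100), ("pangolin", 100)]:
    for replica, seed in enumerate([1234, 5678, 9012], start=1):
        write_mdp(f"md_{species}_rep{replica}.mdp", length_ns, seed)
```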

And the vector is…

“Surprisingly, we found that humans had the highest affinity,” Winkler said. “You would expect the virus – like the flu virus or a coronavirus – to adapt to its host over time and become more tightly bound, but this was a structure that we had from the very, very early part of the pandemic before the virus would have had time to adapt to a human host.”

Affinity (blue) and infectivity (orange) by species. Image courtesy of the researchers.

“Pangolins were a little bit lower,” he continued. “Bat was quite a long way down. That was where people considered the virus originated, and it needed to pass through an intermediate animal, which at this stage [was] most likely to be a pangolin based on these calculations. But there are other things, like [the] snake, which was also considered a potential source of the coronavirus, and it was way, way down on the binding energy – so it seems very unlikely.”

The researchers did their best to vet those affinities, correlating them with infectivity by species where such data was available to establish a “reasonable sort of qualitative correlation between the observed permissivity of infection and degree of infection and the binding energies[.]” And, Winkler said, the available data supported the results.
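
That kind of qualitative check can be expressed as a rank correlation between computed binding energies and observed infectivity, as in the sketch below. The numbers are made-up placeholders rather than the team’s results; the point is only the method (tighter binding means a more negative energy, so a strong relationship shows up as a large negative rank correlation).

```python
# Illustrative rank-correlation check between computed binding energies and
# observed infectivity. All values are placeholders, not the team's data.
from scipy.stats import spearmanr

binding_energy = {"human": -60.0, "pangolin": -55.0, "cat": -50.0,
                  "bat": -40.0, "snake": -20.0}   # more negative = tighter binding
infectivity = {"human": 1.0, "pangolin": 0.7, "cat": 0.6,
               "bat": 0.3, "snake": 0.0}          # arbitrary observed scale

species = sorted(binding_energy)
rho, p = spearmanr([binding_energy[s] for s in species],
                   [infectivity[s] for s in species])
print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")  # expect rho near -1 here
```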

But those results, he said, lack the kind of decisive answer that might have precluded the lab leak hypothesis.

“We didn’t really expect the human to come out on top,” he said. “Because we would have thought that if the virus had come from a bat, probably that would come out to be the top – possibly a pangolin. So based on this information, you can’t really exclude the possibility that the virus could have escaped from a lab. I’m not saying it did, but those calculations suggest that it could have come from a lab.”

Winkler cautioned that the results – which have remained in preprint for many months after submission to several journals – are not evidence of deliberate manipulation of the virus (so-called “gain of function” research) by Chinese scientists, a more far-fetched hypothesis that he said seemed to have little, if any, supporting evidence. Much more likely, he said, was an accidental release, though the pangolin remained a plausible bridge between bats and humans.

In any case, he said, “further deliberations” were needed to move closer to the truth.

Update: the research has since been published in Scientific Reports.
