If you’re not worried enough by the ongoing pandemic or rapidly accelerating climate change, you’re in luck: you can worry about space weather, too. In the more than 160 years since the last extreme space weather event – a massive 1859 solar storm that fried telegraph systems – severe space weather has become a far more worrying prospect, as the world has grown ever more dependent on the electronic systems such an event would disrupt. Now, researchers at the University of Michigan are leveraging supercomputing to improve space weather forecasting.
“There are only two natural disasters that could impact the entire U.S.,” said Gabor Toth, a professor of climate and space science at the University of Michigan, in an interview with Aaron Dubrow for the Texas Advanced Computing Center (TACC). “One is a pandemic and the other is an extreme space weather event. … We have all these technological assets that are at risk. If an extreme event like the one in 1859 happened again, it would completely destroy the power grid and satellite and communications systems – the stakes are much higher.”

Toth works on the foremost space weather prediction model, known simply as the Geospace Model. The Geospace Model, whose ongoing development is supported by the joint NSF-NASA Space Weather with Quantified Uncertainties (SWQU) program, simulates the magnetohydrodynamics of the space environment surrounding Earth, predicting geomagnetic disturbances on the ground from incoming solar wind conditions. The model was updated to version 2.0 in February of this year.
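For readers wondering what “the magnetohydrodynamics of the space environment surrounding Earth” looks like in practice: at the heart of such models is a system of conservation laws for the plasma’s mass density ρ, momentum ρu, magnetic field B and total energy e. The standard ideal-MHD system is sketched below purely for illustration; the operational Geospace code couples additional physics (inner-magnetosphere and ionosphere models, for example) on top of equations of this kind.

$$
\begin{aligned}
&\partial_t \rho + \nabla\cdot(\rho\mathbf{u}) = 0,\\
&\partial_t(\rho\mathbf{u}) + \nabla\cdot\!\Big[\rho\mathbf{u}\mathbf{u} + \Big(p + \tfrac{B^2}{2\mu_0}\Big)\mathbf{I} - \tfrac{\mathbf{B}\mathbf{B}}{\mu_0}\Big] = 0,\\
&\partial_t \mathbf{B} - \nabla\times(\mathbf{u}\times\mathbf{B}) = 0,\\
&\partial_t e + \nabla\cdot\!\Big[\Big(e + p + \tfrac{B^2}{2\mu_0}\Big)\mathbf{u} - \tfrac{(\mathbf{u}\cdot\mathbf{B})\,\mathbf{B}}{\mu_0}\Big] = 0.
\end{aligned}
$$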
“We’re constantly improving our models,” Toth said. The new version replaces version 1.5, which had been in operation since November 2017. “The main change in version 2 was the refinement of the numerical grid in the magnetosphere, several improvements in the algorithms and a recalibration of the empirical parameters.”
Even after the update, the Geospace Model can only provide around half an hour’s warning – and that’s where supercomputing comes in, with Toth’s team aiming to extend that lead time from minutes to days.
The sun sits an average of around 93 million miles from Earth, but right now, Toth said, the team relies on data from a satellite measuring the solar wind plasma just a million miles upstream of Earth. By supplementing that data with remote observations of the sun itself, processed by advanced algorithms, the team hopes to push the forecast window out dramatically. The team has made similarly ambitious algorithmic advances in the past, such as combining the kinetic and fluid descriptions of plasma into a single simulation model. For this algorithmic work, they’ve turned to TACC’s Frontera, a 23.5-Linpack-petaflops system. “Without Frontera, I don’t think we could do this research,” Toth said.
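Those two distances explain both the current half-hour ceiling and the hoped-for improvement: a satellite parked roughly a million miles upstream only sees a disturbance shortly before it arrives, while characterizing an eruption at the sun itself buys days. A rough back-of-the-envelope estimate, using typical published solar wind speeds rather than figures from the article, is sketched below in Python:

```python
# Rough estimate of forecast lead times; speeds are typical solar wind values,
# not numbers from Toth's team.
L1_DISTANCE_KM = 1.5e6   # upstream monitor, ~1 million miles from Earth
SUN_EARTH_KM = 150e6     # ~93 million miles

for speed_km_s in (400, 800):  # slow vs. fast solar wind
    from_monitor_min = L1_DISTANCE_KM / speed_km_s / 60
    from_sun_days = SUN_EARTH_KM / speed_km_s / 86400
    print(f"{speed_km_s} km/s wind: ~{from_monitor_min:.0f} min warning from the "
          f"upstream monitor vs. ~{from_sun_days:.1f} days if spotted at the sun")
```

At typical speeds this works out to roughly 30 to 60 minutes of warning from the upstream satellite, versus two to four days if the disturbance can be characterized as it leaves the sun, which is why forecasting from solar observations is the team’s target.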
Currently, the team is also working on getting the Geospace Model to run efficiently on modern, heterogeneous supercomputers. They recently ported the model to GPUs using Nvidia’s Fortran compiler, allowing them to run the full model faster than real time – and faster than it runs on 100 CPU cores – using a single GPU on TACC’s Longhorn system. “It took a whole year of code development to make this happen,” Toth said.
Making the Geospace Model lightweight and efficient is key to the next step of the team’s plan.
“The goal is to run an ensemble of simulations fast and efficiently to provide a probabilistic space weather forecast,” Toth said. “Should we worry in Michigan or only in Canada? What is the maximum induced current particular transformers will experience? How long will generators need to be shut off? To do this accurately, you need a model you believe in. Whatever we predict, there’s always some uncertainty. We want to give predictions with precise probabilities, similar to terrestrial weather forecasts.”
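To make the ensemble idea concrete, here is a deliberately simplified Python sketch of how a probabilistic forecast can be assembled from many model runs. The function names, numbers and the toy response model are invented for illustration; in practice each ensemble member would be a full GPU run of the Geospace Model driven by perturbed solar wind inputs.

```python
import random

def run_geospace(solar_wind_speed_km_s: float) -> float:
    """Toy stand-in for one model run: returns a fictitious peak
    geomagnetically induced current (amps) at a given transformer."""
    return max(0.0, 0.12 * solar_wind_speed_km_s - 40 + random.gauss(0, 15))

def exceedance_probability(nominal_speed: float, threshold_amps: float,
                           members: int = 100) -> float:
    """Perturb the uncertain driving conditions, run every ensemble member,
    and report the fraction that exceeds the damage threshold."""
    exceed = 0
    for _ in range(members):
        perturbed_speed = random.gauss(nominal_speed, 50)  # input uncertainty
        if run_geospace(perturbed_speed) > threshold_amps:
            exceed += 1
    return exceed / members

if __name__ == "__main__":
    p = exceedance_probability(nominal_speed=700, threshold_amps=75)
    print(f"Chance this transformer sees more than 75 A of induced current: {p:.0%}")
```

The faster and cheaper each individual run becomes, thanks to the GPU work described above, the more ensemble members can be afforded and the sharper the resulting probabilities.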
To learn more about this research, read the reporting from TACC’s Aaron Dubrow here.