The relative rarity of data from major earthquakes is a double-edged sword: on one hand, it reflects the fact that major earthquakes themselves are rare, which is good; on the other, it makes it harder to learn from the past in order to better anticipate and understand future major earthquakes, which is bad. To tackle this data scarcity problem, researchers from the Southern California Earthquake Center (SCEC) at the University of Southern California used a pair of massive supercomputers to gaze 800,000 years into California’s seismic past.
“We haven’t observed most of the possible events that could cause large damage,” explained Kevin Milner, a computer scientist at the SCEC, in an interview with TACC’s Aaron Dubrow. “Using Southern California as an example, we haven’t had a truly big earthquake since 1857 – that was the last time the southern San Andreas broke into a massive magnitude 7.9 earthquake. A San Andreas earthquake could impact a much larger area than the 1994 Northridge earthquake, and other large earthquakes can occur too. That’s what we’re worried about.”
The researchers used a new framework combining a prototype earthquake simulator called RSQSim with another code called CyberShake. In tandem, the codes can simulate hundreds of thousands of years of earthquake history while calculating the ground shaking produced by each quake. “For the first time, we have a whole pipeline from start to finish where earthquake occurrence and ground-motion simulation are physics-based,” Milner said. “It can simulate up to hundreds of thousands of years on a really complicated fault system.”
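The core idea is a two-stage pipeline: a long synthetic catalog of ruptures is generated first, and the shaking from each event is then computed. The toy Python sketch below illustrates only that division of labor; the function names, event rates, magnitude draw, and attenuation formula are invented for illustration and do not reflect the actual RSQSim or CyberShake codes or their interfaces.

```python
import random

# Toy illustration of a two-stage "catalog then shaking" pipeline.
# All numbers and formulas below are made up for demonstration purposes.

def simulate_catalog(years, annual_rate=0.05, seed=42):
    """Stage 1 (catalog): draw a crude synthetic list of (year, magnitude) events."""
    rng = random.Random(seed)
    events = []
    for year in range(years):
        if rng.random() < annual_rate:
            # Rough Gutenberg-Richter-like magnitude draw, capped at M8 for simplicity.
            mag = min(8.0, 5.0 + rng.expovariate(2.3))
            events.append((year, round(mag, 2)))
    return events

def ground_motion(magnitude, distance_km):
    """Stage 2 (shaking): a placeholder attenuation relation, not a real GMPE."""
    return 10 ** (0.5 * magnitude - 1.5) / (distance_km + 10) ** 1.3

if __name__ == "__main__":
    catalog = simulate_catalog(years=1000)
    for year, mag in catalog[:5]:
        pga = ground_motion(mag, distance_km=25.0)
        print(f"year {year:4d}  M{mag}  ~shaking {pga:.3f} (toy units)")
```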
To run this advanced framework over such a long time-scale, the researchers turned to a series of heavy-hitting supercomputers over the course of several years. These included Blue Waters at the National Center for Supercomputing Applications (NCSA); Frontera at the Texas Advanced Computing Center (TACC), which placed ninth on the most recent Top500 list; and Summit at Oak Ridge National Laboratory (ORNL), which placed second.
“We’ve made a lot of progress on Frontera in determining what kind of earthquakes we can expect, on which fault, and how often,” said Christine Goulet, the SCEC’s executive director for Applied Science, also involved in the work. “We don’t prescribe or tell the code when the earthquakes are going to happen. We launch a simulation of hundreds of thousands of years, and just let the code transfer the stress from one fault to another.”
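Goulet’s description captures the essence of a stress-transfer simulation: faults load steadily, rupture when stressed past their strength, and hand part of the released stress to their neighbors, which can trigger cascades. The minimal sketch below illustrates that feedback loop with made-up faults and numbers; RSQSim’s actual rate-and-state physics and fault model are far more sophisticated, and nothing here reflects its real implementation.

```python
# Conceptual toy: faults load each year, rupture past their strength,
# and transfer a fraction of the released stress to neighboring faults.
# Fault names, strengths, loading rates, and the transfer fraction are invented.

FAULTS = {
    "A": {"stress": 0.0, "strength": 10.0, "loading": 0.11, "neighbors": ["B"]},
    "B": {"stress": 0.0, "strength": 12.0, "loading": 0.09, "neighbors": ["A", "C"]},
    "C": {"stress": 0.0, "strength": 11.0, "loading": 0.10, "neighbors": ["B"]},
}
TRANSFER = 0.3  # fraction of released stress passed to each neighbor (illustrative)

def step(year, catalog):
    # Tectonic loading on every fault.
    for fault in FAULTS.values():
        fault["stress"] += fault["loading"]
    # Rupture any fault whose stress exceeds its strength; repeat to allow cascades.
    ruptured = True
    while ruptured:
        ruptured = False
        for name, fault in FAULTS.items():
            if fault["stress"] >= fault["strength"]:
                drop = fault["stress"]
                fault["stress"] = 0.0
                catalog.append((year, name, round(drop, 2)))
                for neighbor in fault["neighbors"]:
                    FAULTS[neighbor]["stress"] += TRANSFER * drop
                ruptured = True

catalog = []
for year in range(2000):  # simulate 2,000 model years; no quakes are prescribed
    step(year, catalog)
print(f"{len(catalog)} ruptures; first few: {catalog[:3]}")
```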
Overall, the project used around eight days of continuous computing on Frontera across 3,500 processors and a similar amount of computing on Summit. The result: one of the largest earthquake simulation catalogs ever produced. Based on the results, the researchers believe that the catalog represents a reasonably accurate facsimile of the past, and they look forward to using it to help anticipate where earthquakes will occur in California’s future.