Simulating Earthquakes for Science and Society

By Paul Tooby

January 27, 2006

Earthquakes are a fact of life in California. The southern section of the San Andreas fault, however, has not seen a major earthquake since about 1690, and the accumulated plate motion, as much as six meters of unreleased slip, has set the stage for an earthquake as large as magnitude 7.7: the “big one.”
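
As a rough, back-of-the-envelope check on those figures (not part of the SCEC analysis), the magnitude of such an event can be estimated from the seismic moment, the product of crustal rigidity, rupture area, and average slip. The sketch below assumes a generic 200 km by 15 km rupture area and 30 GPa rigidity; only the six meters of slip comes from the article.

```python
# Back-of-the-envelope moment-magnitude estimate (illustrative only).
# Rupture length, width, and rigidity are generic textbook-style
# assumptions, NOT values from the TeraShake study.
import math

rigidity = 3.0e10   # Pa, typical crustal shear modulus (~30 GPa), assumed
length = 200e3      # m, assumed rupture length along strike
width = 15e3        # m, assumed seismogenic width (down-dip)
slip = 6.0          # m, accumulated slip cited in the article

seismic_moment = rigidity * length * width * slip              # N*m
moment_magnitude = (2.0 / 3.0) * (math.log10(seismic_moment) - 9.1)

print(f"Seismic moment: {seismic_moment:.2e} N*m")
print(f"Moment magnitude: {moment_magnitude:.1f}")   # comes out near 7.7-7.8
```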

To understand the basic science of earthquakes and to help engineers better prepare for such an event, scientists want to identify which regions are likely to experience the most intense shaking, particularly in the populated sediment-filled basin of Los Angeles and similar areas in Southern California and northern Mexico. This understanding can be used to improve building codes in high-risk areas and to help engineers design safer structures, potentially saving lives and property.

[Image: Instantaneous movement in the fault-parallel x direction, 110 seconds after the start of the northwest-moving rupture on the San Andreas Fault near the Salton Sea.]

But the challenges in modeling earthquakes are daunting. Accurate simulations must span an enormous range of scales: spatial scales from meters near the earthquake source to hundreds of kilometers across the entire region, and time scales from hundredths of a second, needed to capture the higher frequencies that have the greatest impact on buildings, to the hundreds of seconds of the full event. Adding to the challenge, ground motion from earthquake waves is strongly influenced by the complex 3-D subsurface soil structure, which is not fully known and can be observed only indirectly.
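
That range of scales is tied together by a standard finite-difference rule of thumb: the grid must sample the shortest wavelength of interest with several points, and the time step must resolve the fastest waves. The sketch below applies that rule with assumed, illustrative values (a 500 m/s minimum shear-wave speed, an 8 km/s maximum wave speed, a 0.5 Hz target frequency, and five points per wavelength); none of these numbers are taken from the TeraShake configuration.

```python
# Rule-of-thumb grid spacing and time step for a finite-difference
# wave simulation. The velocities, target frequency, and points per
# wavelength are generic assumptions, not TeraShake parameters.
v_min = 500.0               # m/s, assumed slowest shear-wave speed (soft sediments)
v_max = 8000.0              # m/s, assumed fastest wave speed at depth
points_per_wavelength = 5   # assumed sampling requirement
f_max = 0.5                 # Hz, assumed highest frequency to resolve

# Shortest wavelength that must be resolved sets the grid spacing:
lambda_min = v_min / f_max
dx = lambda_min / points_per_wavelength

# A stability (CFL) condition ties the time step to the fastest wave:
cfl = 0.5                   # assumed safety factor
dt = cfl * dx / v_max

print(f"Grid spacing <= {dx:.0f} m, time step <= {dt:.3f} s")
```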

Now, building on previous simulations at the San Diego Supercomputer Center (SDSC), earthquake scientists from the Southern California Earthquake Center/Community Modeling Environment (SCEC/CME) have run enhanced simulations at SDSC using the improved TeraShake 2 earthquake model. The new simulations, which used the Anelastic Wave Model (AWM), a fourth-order finite-difference code developed by Kim Olsen, associate professor of geological sciences at San Diego State University (SDSU), are the most realistic yet of where the most intense ground motion may occur in Southern California during a magnitude 7.7 San Andreas Fault earthquake.
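
As a minimal illustration of what a fourth-order finite-difference wave code does (a 1-D toy problem, not the AWM code itself), the sketch below advances the scalar wave equation in time using the standard five-point, fourth-order approximation of the second spatial derivative.

```python
# Toy 1-D scalar wave equation solver: fourth-order finite differences
# in space, second-order central differencing in time. A minimal
# illustration of the method class, not the AWM code.
import numpy as np

nx, dx, c = 400, 200.0, 3000.0    # grid points, spacing (m), wave speed (m/s), all assumed
dt = 0.4 * dx / c                 # time step within the stability limit
x = np.arange(nx) * dx

# Gaussian pulse as the initial displacement field
u_prev = np.exp(-((x - x.mean()) / (5 * dx)) ** 2)
u = u_prev.copy()

def d2x_fourth_order(u, dx):
    """Fourth-order accurate second spatial derivative (interior points only)."""
    d2 = np.zeros_like(u)
    d2[2:-2] = (-u[:-4] + 16 * u[1:-3] - 30 * u[2:-2]
                + 16 * u[3:-1] - u[4:]) / (12 * dx ** 2)
    return d2

for _ in range(500):              # march the wavefield forward in time
    u_next = 2 * u - u_prev + (c * dt) ** 2 * d2x_fourth_order(u, dx)
    u_prev, u = u, u_next

print(f"Peak amplitude after 500 steps: {u.max():.3f}")
```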

An important scientific goal of SCEC scientists, including the TeraShake research group, is to improve model realism by incorporating more fundamental physics into earthquake simulations. “To make the TeraShake 2 simulations more realistic, we added a new physics-based dynamic rupture component to the simulation, run at very high 100 m resolution, to create the earthquake source description for the San Andreas Fault,” said Steven Day, professor of geological sciences at SDSU. The dynamic rupture component models friction-based slip on the fault surface. This is more physically realistic than the kinematic source descriptions used previously, which were adapted from recorded earthquake data.
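
Dynamic rupture models of this kind typically let shear stress on the fault evolve according to a friction law instead of prescribing the slip in advance. The snippet below sketches one widely used choice, a linear slip-weakening law; the stress levels and critical slip distance are made-up illustrative values, not the parameters used by Olsen and Day.

```python
# Linear slip-weakening friction: fault strength drops from a static
# to a dynamic level as slip accumulates over a critical distance d_c.
# All values below are illustrative, not TeraShake 2 parameters.

def fault_strength(slip, tau_static=80e6, tau_dynamic=60e6, d_c=0.5):
    """Shear strength (Pa) on the fault as a function of slip (m)."""
    if slip >= d_c:
        return tau_dynamic                                   # fully weakened
    # Linear drop from static to dynamic strength over the distance d_c
    return tau_static - (tau_static - tau_dynamic) * slip / d_c

for s in (0.0, 0.1, 0.25, 0.5, 1.0):
    print(f"slip = {s:4.2f} m  ->  strength = {fault_strength(s) / 1e6:5.1f} MPa")
```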

Using the results of the improved rupture simulation developed by Olsen and Day as the earthquake source, the researchers modeled the 3-D ground velocity throughout the volume and at the surface of the simulated region. To fully capture this large-scale natural event, the researchers needed to encompass the entire region in their model: a slab 600 kilometers long, 300 kilometers wide, and 80 kilometers deep (a volume of more than 14 million cubic kilometers) that includes all major population centers in Southern California and runs from the Ventura Basin, Tehachapi, and the southern San Joaquin Valley in the north, down to Los Angeles, San Diego, out to Catalina Island, and as far as the Mexican cities of Mexicali, Tijuana, and Ensenada in the south.

The highly detailed TeraShake 2 simulation, run at 200 m resolution over the immense volume with some 1.8 billion grid points, ran for four days on 240 processors of the newly expanded 15.6-teraflops DataStar supercomputer, requiring a complex choreography of data movement between DataStar, disk, and archival storage. The 10 terabytes of output data are archived in the SCEC Digital Library, managed by the SDSC Storage Resource Broker (SRB) at SDSC, where they are easily available to researchers for further analysis.
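
The grid-point count quoted above follows directly from the dimensions of the simulation volume, as the short calculation below shows; the per-grid-point storage figures are assumptions added here for illustration, not details of the TeraShake runs.

```python
# Grid-point count implied by the 600 x 300 x 80 km volume at 200 m spacing,
# plus a rough memory estimate. The variables-per-point and bytes-per-value
# figures are assumptions for illustration, not TeraShake's actual layout.
dx = 200.0                   # m, grid spacing from the article
nx = int(600e3 / dx)         # 3000 points along strike
ny = int(300e3 / dx)         # 1500 points across
nz = int(80e3 / dx)          # 400 points in depth
n_points = nx * ny * nz
print(f"Grid points: {n_points:,}")          # 1,800,000,000

variables_per_point = 12     # assumed: velocities, stresses, material properties
bytes_per_value = 4          # assumed single precision
memory_gb = n_points * variables_per_point * bytes_per_value / 1e9
print(f"Rough in-memory footprint: {memory_gb:.0f} GB, spread across the processors")
```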

These improved simulations are giving scientists new insights into where strong ground motions may occur in the event of such an earthquake, which can be especially intense and long-lasting in sediment-filled basins such as the Los Angeles area.

Two factors have been especially important to this advance in computational science. One is SDSC's focus on large-scale, end-to-end data cyberinfrastructure, with data-oriented resources such as DataStar supported by a General Parallel File System (GPFS) that makes more than a petabyte of online disk available to users, and reliable data archiving with more than six petabytes of HPSS and SAM-QFS tape. One petabyte is one million gigabytes, about 10,000 times the capacity of today's typical PC hard drives.

The second factor in enabling this research is the close, long-term collaboration between SCEC researchers and experts from groups across SDSC in a Strategic Applications Collaboration. “To solve the novel challenges that emerge when scientists run their codes at the largest scales and data sets grow to immense size, we worked closely with the scientists through months of code porting, new feature integration, and optimization,” said SDSC computational scientist Yifeng Cui.

The collaboration incorporated the new dynamic rupture feature and modifications to speed up the AWM code, and the optimized dynamic TeraShake 2 code can now scale up to 2,048 processors. During and after the run, many other SDSC staff followed up with data movement, application and data support, SAM-QFS and SANergy targeting, SRB support, I/O and GPFS expertise, and overall run management.

SDSC also provided important visualization services to the collaboration, helping SCEC scientists monitor the simulations and find new insights in integrated views of the immense 10-terabyte data set. To produce the visualizations, the researchers used over 30,000 hours on SDSC resources, including 20,000 hours on DataStar alone to create more than 100,000 images. The visualizations also help make the results more understandable to nonspecialists, and dramatic movies of the simulations can be viewed online (see links below).

Another product of the SDSC Strategic Applications Collaboration with the SCEC scientists is the enhanced TeraShake code, capable of large-scale runs and maintained at SDSC, which is now available to the U.S. earthquake community for future earthquake simulations.

TeraShake 2 demonstrates SDSC's role as a leading site for end-to-end data cyberinfrastructure, showing how far its capabilities have grown to support large-scale simulations with correspondingly large data challenges.

The SCEC TeraShake 2 project is led by Thomas H. Jordan of the University of Southern California, Jean-Bernard Minster of the Institute of Geophysics and Planetary Physics (IGPP) at SIO/UCSD, Kim Olsen and Steven Day of SDSU, and Reagan Moore of SDSC/UCSD.

Related Links:

The Southern California Earthquake Center (SCEC) – http://www.scec.org/
SCEC Community Modeling Environment – http://www.scec.org/cme/
QuickTime movie (15 MB) – http://visservices.sdsc.edu/projects/scec/terashake/movies/Terashake2.1-volume-Vy-800-map.mov
Windows Media movie (2 MB) – http://visservices.sdsc.edu/projects/scec/terashake/movies/Terashake2.1-volume-Vy-800-map.wmv

This article was provided courtesy of the San Diego Supercomputer Center.
