The Water Institute of the Gulf (Water Institute) runs its storm surge and wave analysis models on Amazon Web Services (AWS)—a task that sometimes requires large bursts of compute power. These models are critical for forecasting hurricane storm surge events (like Hurricane Laura in August 2020), evaluating flood risk for Louisiana and other coastal states, helping governments prepare for future conditions, and managing the coast proactively. These challenges are compounded by the land loss plaguing Louisiana: the United States Geological Survey calculates that the state loses the equivalent of one football field of land every 100 minutes.
The Water Institute was founded through a collaborative effort involving the State of Louisiana, Senator Mary Landrieu, and the Baton Rouge Area Foundation (BRAF). The Water Institute links academic, public, and private research partnerships and conducts applied research to better inform the decisions facing communities and industries around the world.
Zach Cobell, a numerical modeler at the Water Institute, says, “We’re trying to provide the best scientific information we can about an uncertain future to stakeholders.” Cobell leads the hurricane storm surge and wave analysis modeling team at the Water Institute.
One of Cobell’s largest projects is storm surge analysis for Louisiana’s Coastal Master Plan, which is updated by the Coastal Protection and Restoration Authority of Louisiana (CPRA) every six years. The master plan aims to estimate how the landscape of the state’s coastline will change over the next 50 years and what steps can be taken to protect and restore it.
The storm surge model is an important component of the analysis that informs updates to Louisiana’s Coastal Master Plan. “The storm surge model is computationally demanding because we’re simulating large portions of the Atlantic Ocean, Gulf of Mexico, and coastal Louisiana in great detail to ensure that the hurricanes push surge and waves realistically into the areas where they make landfall. Trying to correctly represent how these large storms are able to move the ocean water takes large amounts of computational power,” according to Cobell.
The team has traditionally run these workloads at university and government supercomputing centers. A consistent challenge was gaining timely access to many processors at once for long periods of time. To solve this challenge, the team turned to AWS. “With the AWS Cloud, we are able to rethink our workflow. Instead of focusing on maximum speed for individual simulations, we focus on maximum throughput. We can run greater numbers of simulations on fewer cores. For a recent set of simulations, we used 48 processors with AWS for a job that typically runs on 256 at traditional computing centers. This allowed us to run 200 simultaneous simulations, whereas we typically would only have four to six running in parallel at traditional computing centers,” said Cobell.
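The throughput-over-speed trade-off Cobell describes can be illustrated with back-of-the-envelope arithmetic. The sketch below is purely illustrative: it assumes ideal linear scaling (a run on 48 cores takes 256/48 times as long as on 256 cores), which real storm surge models rarely achieve, and the timing units are arbitrary.

```python
# Illustrative throughput comparison: few fast jobs vs. many slower jobs.
# Assumes ideal linear scaling, which is optimistic for real models.
import math

SIMULATIONS = 645  # hurricane events to simulate (per the project)

# Traditional supercomputing center: fast individual runs, few in parallel.
hpc_cores_per_job = 256
hpc_concurrent_jobs = 6

# Cloud approach described by the team: slower individual runs, many in parallel.
cloud_cores_per_job = 48
cloud_concurrent_jobs = 200

# Let one simulation take 1.0 time unit on 256 cores; under ideal scaling
# it takes 256/48 units on 48 cores.
hpc_job_time = 1.0
cloud_job_time = hpc_cores_per_job / cloud_cores_per_job

# Wall-clock time = number of sequential batches * time per batch.
hpc_wall_time = math.ceil(SIMULATIONS / hpc_concurrent_jobs) * hpc_job_time
cloud_wall_time = math.ceil(SIMULATIONS / cloud_concurrent_jobs) * cloud_job_time

print(f"traditional center: {hpc_wall_time:.1f} time units")
print(f"cloud, throughput-oriented: {cloud_wall_time:.1f} time units")
```

Even though each individual simulation is slower on 48 cores, the campaign as a whole finishes several times faster because far more simulations run concurrently.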
For storm surge modeling to support Louisiana’s Coastal Master Plan, the team needed to run 645 simulations of potential hurricane events. Running this full suite of simulations might have taken approximately three months at a traditional supercomputing center, depending on machine availability and how quickly the simulations could gain priority in the queue while competing with jobs from other users. With AWS, the team completed all 645 simulations in three and a half days.
“Finishing in a fraction of the time, in days, was really beneficial. We had a tight deadline. Instantaneous, on-demand access to compute power was the only reason we were able to get this done on time,” said Cobell. To complete their models, the team used Amazon Elastic Compute Cloud (Amazon EC2) Spot Instances, Amazon FSx for Lustre, and AWS ParallelCluster, resulting in a faster workflow with reduced costs.
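A setup combining these services could be sketched with an AWS ParallelCluster (v3) cluster configuration. The fragment below is a hypothetical sketch, not the Institute’s actual configuration: the instance types, subnet ID, queue name, counts, and storage capacity are placeholder assumptions chosen only to show how Spot capacity and an FSx for Lustre scratch file system fit together.

```yaml
Region: us-east-1                   # placeholder region
Image:
  Os: alinux2
HeadNode:
  InstanceType: c5.xlarge           # placeholder head node size
  Networking:
    SubnetId: subnet-REPLACE_ME     # placeholder subnet
Scheduling:
  Scheduler: slurm
  SlurmQueues:
    - Name: surge                   # hypothetical queue name
      CapacityType: SPOT            # EC2 Spot Instances reduce compute cost
      ComputeResources:
        - Name: compute
          InstanceType: c5n.18xlarge  # placeholder compute instance type
          MinCount: 0
          MaxCount: 300             # scale out for many simultaneous runs
      Networking:
        SubnetIds:
          - subnet-REPLACE_ME
SharedStorage:
  - MountDir: /fsx
    Name: scratch
    StorageType: FsxLustre          # shared high-throughput scratch space
    FsxLustreSettings:
      StorageCapacity: 1200         # GiB; placeholder capacity
```

With a configuration along these lines, each simulation can be submitted as an ordinary Slurm job, and the cluster scales compute nodes up and down with demand.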
Read more about the project on the AWS Blog.