In the spring of 2019, environmental modelers at the Lake Michigan Air Directors Consortium (LADCO) had a new problem to solve. Emerging research on air pollution along the shores of the Great Lakes in the United States showed that properly simulating pollution episodes in the region required applying our models at a finer spatial granularity than the computational capacity of our in-house high performance computing (HPC) cluster could support. We turned to AWS ParallelCluster to access the HPC resources needed to do this modeling faster and to scale for our member states.
LADCO provides technical assistance to the states in the Great Lakes region on urban- to regional-scale air quality problems. We use complex computer models of weather, emissions, and atmospheric chemistry to investigate the main drivers of air pollution in the region. We work with our member states—Illinois, Indiana, Michigan, Minnesota, Ohio, and Wisconsin—to use the models for exploring air pollution control programs pursuant to their clean air goals.
We began our work on AWS with the goal of creating a modeling platform that all member states could use, but no clear path to meeting that goal. We did have a few constraints: our infrastructure needed to be cost-effective, reliable, and secure. So we got to work developing and testing modeling platforms on Amazon Elastic Compute Cloud (Amazon EC2).
After extensive prototyping and testing of a weather modeling platform on Amazon EC2 Spot Instances, we configured the system to simulate atmospheric chemistry. LADCO then ran our first operational application on AWS: modeling ground-level ozone in Chicago. We used the results of this simulation to support an Environmental Protection Agency (EPA) regulatory process called a State Implementation Plan (SIP).
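For readers curious what a Spot-backed HPC cluster of this kind looks like, the sketch below shows a minimal AWS ParallelCluster configuration in the current (v3) YAML format. It is illustrative only, not LADCO's actual configuration: the instance types, counts, subnet ID, and key name are placeholders you would replace for your own environment.

```yaml
# Hypothetical ParallelCluster v3 config: a Slurm cluster whose compute
# queue runs on EC2 Spot Instances and scales to zero when idle.
Region: us-east-2
Image:
  Os: alinux2
HeadNode:
  InstanceType: c5.xlarge          # small On-Demand head node
  Networking:
    SubnetId: subnet-0123456789abcdef0   # placeholder subnet
  Ssh:
    KeyName: my-keypair            # placeholder EC2 key pair
Scheduling:
  Scheduler: slurm
  SlurmQueues:
    - Name: compute
      CapacityType: SPOT           # bid for spare capacity at a discount
      ComputeResources:
        - Name: hpc-nodes
          InstanceType: c5n.18xlarge   # example HPC-class instance
          MinCount: 0              # scale to zero between simulations
          MaxCount: 10
      Networking:
        SubnetIds:
          - subnet-0123456789abcdef0
```

With a configuration like this, `pcluster create-cluster` stands up the head node, and Slurm launches Spot compute nodes only while jobs are queued, which keeps costs down for bursty modeling workloads.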
To learn how LADCO ran these weather simulations on AWS, read the full blog post.