March 28, 2013

Yellowstone Helps Predict Air Pollution

Ryan Hill

The Yellowstone supercomputer is a 1.5-petaflop IBM iDataPlex system at peak performance. With 72,288 processor cores, the machine is powerful enough to rank No. 13 on the Top500 list. It was first tasked with 11 compute-intensive projects as part of the Accelerated Scientific Discovery (ASD) initiative.

Based on IBM’s iDataPlex architecture, Yellowstone delivers 29 times the workload throughput of NCAR’s Bluefire, which was decommissioned on January 31. It can perform one-and-a-half quadrillion operations per second and stores eleven petabytes of data, roughly one thousand times the total print holdings of the Library of Congress.
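As a quick sanity check on those figures, the peak rate per core follows from simple division. The inputs below come from the article; the per-core number is derived arithmetic, not a figure the article states:

```python
# Back-of-the-envelope check of Yellowstone's quoted peak numbers.
# 1.5 petaflops spread across 72,288 cores implies the per-core rate below.
PEAK_FLOPS = 1.5e15   # 1.5 petaflops = 1.5 quadrillion operations per second
CORES = 72_288        # processor cores in the system

per_core_gflops = PEAK_FLOPS / CORES / 1e9
print(f"{per_core_gflops:.2f} Gflops per core")  # prints "20.75 Gflops per core"
```

Roughly 20.75 Gflops per core is a plausible peak for server processors of that era, which lends the headline numbers internal consistency.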

The ASD initiative provides these large-scale computational resources to a small number of projects for a short period. These projects help give the system a workout and allow researchers to pursue scientific objectives that would not be possible through normal allocation opportunities.

These projects, chosen at the National Center for Atmospheric Research (NCAR), were part of the system’s original purpose. Over a two-month period, the supercomputer carried out a large volume of computing on timely questions about Earth and its atmosphere, such as creating better long-range weather forecasts and closing the spatial gap between model cloud dynamics and cloud microphysics.

Yellowstone includes the customized Geyser and Caldera clusters, specialized data analysis and visualization resources within its data-centric environment. These systems provide a 20-fold increase in the Computational and Information Systems Laboratory’s (CISL) dedicated data analysis and visualization resources. Geyser, with 16 large-memory nodes and 1 TB of memory per node, is designed for large-scale data analysis and post-processing tasks, including 3D visualization. Caldera also has 16 nodes, each with two NVIDIA Tesla GPUs, to support parallel processing, visualization, and the development and testing of general-purpose GPU code.

Taken together, these components improve capabilities central to NCAR’s mission, such as supporting the development of climate models, weather forecasting, and other critical research.

One of the projects selected by NCAR involved predicting North American air quality through the year 2055. Gabriele Pfister of NCAR led the project, which was allocated 6.25 million core-hours on Yellowstone. The study ran simulations with the nested regional climate model with chemistry (NRCM-Chem) to examine possible changes in weather and air quality over North America between the present day and two future periods: 2020-2030 and 2045-2055. The results will provide insight into expected changes in air quality and will also be used for dynamical downscaling (of meteorology and air quality) of global climate simulations performed at NCAR.
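To put that allocation in perspective, a back-of-the-envelope calculation converts the 6.25 million core-hours into equivalent whole-machine runtime. The core count is quoted earlier in the article; the runtime figure is derived here, not stated in the article:

```python
# Hypothetical illustration: express the project's core-hour allocation
# as time on the full machine. Inputs are from the article; the result
# is derived arithmetic.
CORE_HOURS = 6.25e6   # allocation for the air-quality project
CORES = 72_288        # Yellowstone's processor cores

full_machine_hours = CORE_HOURS / CORES
print(f"{full_machine_hours:.1f} hours")      # prints "86.5 hours"
print(f"{full_machine_hours / 24:.1f} days")  # prints "3.6 days"
```

In other words, the allocation amounts to about three and a half days of the entire supercomputer, though in practice such jobs share the machine with other workloads.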

Related Articles:

TACC Unveils Dell HPC System Stampede

NSF-funded Superhero Supercomputer Helps Battle Autism

Extreme Networks Provides 40GbE Network For ‘Blue Waters’ Supercomputer