Weather and climate simulations are often extraordinarily demanding, requiring the integration of a wide range of volatile, interdependent variables over a large geographic area. As a result, most of those simulations either operate at a coarse spatial resolution or focus on a very specific area or timescale. But as the exascale era approaches, those tradeoffs are diminishing, and researchers are beginning to simulate weather over large areas for long periods of time with relatively fine resolution. Now, Summit, the most powerful supercomputer in the United States, has enabled an unprecedented weather simulation: the entire atmosphere of the Earth at one-kilometer resolution for a four-month season.
The simulation, conducted by the European Centre for Medium-Range Weather Forecasts (ECMWF) in partnership with Oak Ridge National Laboratory, used the Integrated Forecasting System (IFS), the code that currently runs ECMWF's operational weather forecasts on a nine-kilometer grid. To achieve the much higher resolution, the researchers optimized the code for Summit's GPU capabilities, as well as its memory hierarchy and network.
“In this project, we have now shown for the first time that simulations at this resolution can be sustained over a long time span – a full season – and that the large amount of data that are produced can be handled on a supercomputer such as Summit,” said Nils Wedi, head of Earth System Modelling at ECMWF, in an interview with ORNL’s Coury Turczyn.

The higher-resolution simulation produced meaningful improvements. Many high mountain peaks, for instance, were no longer averaged out into lower elevations spread over nine-kilometer squares; instead, the model recognized them at or near their full heights, better informing its representation of airflow through those areas. The higher-resolution model also better represented small weather formations, such as tropical thunderstorms.
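The averaging effect can be sketched numerically: coarsening a fine elevation grid into large blocks smooths an isolated peak into the surrounding terrain. The grid sizes and heights below are purely illustrative and not taken from the IFS model.

```python
# Illustrative sketch (not from the article): averaging a fine terrain grid
# into coarse cells flattens mountain peaks. All values here are made up.
import numpy as np

def coarsen(elevation, factor):
    """Average factor x factor blocks of fine-grid cells into one coarse cell."""
    h, w = elevation.shape
    return elevation.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

# A 9x9 "fine" grid: flat terrain at 500 m with a single 3000 m peak.
fine = np.full((9, 9), 500.0)
fine[4, 4] = 3000.0

coarse = coarsen(fine, 9)  # the whole 9x9 block collapses into one coarse cell

print(fine.max())    # 3000.0 -- the fine grid retains the full peak
print(coarse.max())  # ~530.9 -- the coarse cell averages the peak away
```

A one-kilometer grid keeps far more of such cells per mountain, so the peak survives in the model's orography rather than being diluted across a single large cell.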
“We are not simply looking at whether we can make an improvement at any given locality, but ideally all these things should translate into a global circulation system,” said Val Anantharaj, an ORNL computational scientist who serves as data liaison on the project. “If we can resolve the global atmospheric circulation pattern better, then we should be able to produce better forecasts.”
The researchers are still analyzing the data resulting from the simulation and preparing the results for publication, after which they will make the data publicly available. While it may take some time for such high-resolution forecasts to become the norm, the team is hopeful that their results can be used to inform lower-resolution forecasting and better anticipate results from future exascale systems.
“The data handling challenges that we have overcome during this project are still very large, and our simulations are at the edge of what is achievable today,” Wedi said. “However, I believe that it will be possible to run simulations with one-kilometer grid-spacing routinely in the future – in the same way we are running simulations with nine-kilometer grid-spacing today. To continue to push these boundaries is important.”
Further details are available in Coury Turczyn's reporting for ORNL on this research.