EXCLAIM: New High-Resolution Models Merge Weather and Climate

August 6, 2021

Aug. 6, 2021 — Torrential rain, hailstorms and floods in the Alpine region and northwest Europe: the past few weeks have highlighted the impacts of severe thunderstorms. But how exactly are extreme weather events connected to global warming? This is one of the central questions for researchers studying and modelling the interaction between weather and climate.

By representing the underlying fundamental physical processes, models are a very powerful tool to understand these interactions. But current models and the computing infrastructure they require have hit a wall, limiting the extent to which researchers can draw conclusions about how, for example, climate change affects extreme weather. To overcome this issue, ETH Zurich has teamed up with partners to launch the EXCLAIM research initiative. This project aims to dramatically increase the spatial resolution of the models, thereby enhancing their accuracy in simulating the weather on a global scale in a future, warmer world.

Seamless weather simulations in climate models

“Thanks to their high resolution, the new, global models will simulate key processes such as storms and weather systems in much more detail than before, allowing us to study the interaction of climate change and weather events much more accurately,” says Nicolas Gruber, EXCLAIM lead PI and Professor of Environmental Physics.

EXCLAIM is interdisciplinary: alongside climate researchers from the ETH Center for Climate Systems Modeling (C2SM), the project involves ETH computer scientists, the Swiss National Supercomputing Centre (CSCS), the Swiss Data Science Center (SDSC), the Swiss Federal Laboratories for Materials Science and Technology (Empa) and MeteoSwiss, the Federal Office of Meteorology and Climatology. Not only will this collaboration improve the modelling of climate, it will also make the weather forecasts provided by MeteoSwiss more reliable. International project partners include Germany’s National Meteorological Service, Deutscher Wetterdienst (DWD), and the Max Planck Institute for Meteorology (MPI-M), which together developed the ICON (Icosahedral Nonhydrostatic) Model – the basis of EXCLAIM – as well as the European Centre for Medium-Range Weather Forecasts (ECMWF), of which Switzerland is a full member.

With EXCLAIM, researchers are aiming to radically scale up the spatial resolution of the weather and climate models. To simulate global weather and climate with all its regional detail, such models place a virtual, three-dimensional grid over the Earth. Researchers then use the laws of physics to calculate the respective climate conditions for each cell in their models. Current global climate models typically have grid cells with a width of 50 to 100 kilometres. In the long run, EXCLAIM researchers aim to increase the resolution to just one kilometre.
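To get a feel for what this jump in resolution means computationally, the back-of-the-envelope sketch below (purely illustrative, not part of EXCLAIM’s tooling) estimates how the number of horizontal grid cells grows as the grid spacing shrinks. It assumes roughly uniform square cells covering the whole globe; real ICON grids are icosahedral, so exact counts differ.

```python
# Back-of-the-envelope estimate of how the number of horizontal grid cells
# grows as the grid spacing shrinks. Assumes roughly uniform cells covering
# the whole globe; real ICON grids are icosahedral, so exact counts differ.

EARTH_SURFACE_KM2 = 510e6  # approximate surface area of the Earth in km^2

def cell_count(grid_spacing_km: float) -> float:
    """Approximate number of horizontal cells for a given grid spacing."""
    cell_area = grid_spacing_km ** 2          # one cell covers roughly dx * dx
    return EARTH_SURFACE_KM2 / cell_area

for dx in (100, 50, 10, 1):
    print(f"{dx:>4} km spacing -> ~{cell_count(dx):.2e} horizontal cells")

# Halving the spacing quadruples the number of horizontal cells, and the
# shorter time step needed for numerical stability adds roughly another
# factor of two, so cost grows very quickly with resolution.
```

Going from 100 km to 1 km spacing thus multiplies the number of horizontal cells by roughly a factor of 10,000, before even accounting for the shorter time steps a finer grid requires.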

In the past, given the limited computing power of modern supercomputers, only regional weather could be simulated with such a fine grid – and for relatively short periods of time at most. With the new models, the researchers now hope to attain this fine resolution worldwide, enabling them to simulate weather patterns from a global climate perspective and with a much sharper focus. This is like giving global climate models an additional zoom function for small-scale events. “What’s more, the new models will pave the way for ‘forecasting’ weather in the future climate, providing the answers as to how extreme weather events like the torrential rain we experienced this summer might look in the future,” says Christof Appenzeller, Head of Analysis and Forecasting at MeteoSwiss.

Powerful infrastructure for climate simulations

Customised computer infrastructure is essential to get the best out of the new models. Weather and climate models are some of the most complex, most data-intensive computational problems there are, which is why the EXCLAIM models are being developed in parallel with the hardware and software for supercomputers. “The computing and data infrastructure is being tailored to the exact requirements of the weather and climate models,” says Thomas Schulthess, Director of the Swiss National Supercomputing Centre (CSCS) in Lugano. For example, the new “Alps” supercomputing system is configured to allow the high-resolution climate models to properly resolve convective systems, such as thunderstorms.

To effectively simulate weather and climate on a global scale over several decades with a grid width of just a few kilometres, the model will have to run approximately 100 times faster than is currently possible. The first option for achieving this goal is to deploy faster, more powerful computers. Switching from the current supercomputer at CSCS to the “Alps” system will be instrumental in this regard.

One challenge is the end of “Moore’s law”, which holds that processor performance doubles approximately every 20 months. “As processors haven’t increased in serial performance for about 15 years, the only way of improving supercomputer performance is to improve their parallel processing architecture,” Schulthess says. “Furthermore, it’s worth setting up the supercomputer architecture specifically to allow it to solve classes of research problems in an optimal manner.” The key to providing the requisite computing power lies in a hybrid computer architecture, in which conventional CPUs (central processing units), which perform calculations and move data between memory and the other components, are deployed in conjunction with GPUs (graphics processing units).

The second option concerns the software, namely optimising the model code so that it fully benefits from the hybrid computer architecture. EXCLAIM is taking a revolutionary approach by splitting the source code into two parts: a front end that serves as the interface for model developers and users, and an underlying software infrastructure in which the model’s central algorithms are implemented with high efficiency for the respective hardware. CSCS, MeteoSwiss and C2SM have already used this approach in the current MeteoSwiss weather model with great success, and it is now being applied to the ICON weather and climate model. “We were able to accelerate the MeteoSwiss weather model by a factor of ten, improving the reliability of the MeteoSwiss forecasts as a result,” Schulthess says.
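The sketch below illustrates the general idea of this separation in simplified form; it is not EXCLAIM or ICON code, and the backend classes and the simple diffusion operator are hypothetical stand-ins. The model-facing routine is written once against a small interface, and interchangeable backends supply the hardware-specific implementations (a plain NumPy backend for CPUs, and an optional CuPy backend for GPUs).

```python
# Minimal sketch of separating the model-facing interface from the
# hardware-specific implementation. Illustrative only, not EXCLAIM/ICON code;
# the backend names and the simple diffusion operator are hypothetical.
import numpy as np

class NumpyBackend:
    """Reference CPU implementation."""
    def laplacian(self, field):
        # 5-point Laplacian on a 2-D field with periodic boundaries
        return (np.roll(field, 1, 0) + np.roll(field, -1, 0) +
                np.roll(field, 1, 1) + np.roll(field, -1, 1) - 4 * field)

class CupyBackend:
    """GPU implementation; requires the optional CuPy package."""
    def __init__(self):
        import cupy as cp   # only imported if this backend is selected
        self.cp = cp
    def laplacian(self, field):
        cp, f = self.cp, self.cp.asarray(field)
        result = (cp.roll(f, 1, 0) + cp.roll(f, -1, 0) +
                  cp.roll(f, 1, 1) + cp.roll(f, -1, 1) - 4 * f)
        return cp.asnumpy(result)

def diffuse(field, nu, dt, backend):
    """Model-facing code: written once, unaware of the hardware it runs on."""
    return field + nu * dt * backend.laplacian(field)

field = np.random.rand(128, 128)
field = diffuse(field, nu=0.1, dt=0.5, backend=NumpyBackend())
```

The benefit of such a split is that domain scientists keep working against a stable, readable interface, while performance engineers can retune or replace the backend for new hardware without touching the model code.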

Managing the flood of data

Computing speed alone is not the decisive factor. Increasing the resolution of the models also leads to a data explosion. Furthermore, weather and climate research require and produce a high diversity of data. To ensure effective throughput, it is equally crucial that the computers are able both to access the data and to write the results to storage media as quickly as possible. The computing processes have to be organised accordingly, while memory bandwidth is maximised and costly data transfers avoided. “For the new weather and climate models to produce useful results, we have to optimise the entire infrastructure. To this end, we’re leveraging the expertise gained from many years of working with MeteoSwiss and the ETH domain,” Schulthess says.
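A rough, illustrative calculation shows why data volume becomes a concern at kilometre scale. The number of vertical levels, the number of output variables and the use of single precision below are assumptions chosen only to make the arithmetic concrete, not EXCLAIM’s actual output configuration.

```python
# Rough, illustrative estimate of the output volume of a single global
# snapshot at 1 km grid spacing. The vertical levels, variable count and
# 4-byte precision are assumptions for illustration only.

horizontal_cells = 510e6 / 1.0**2   # ~5.1e8 cells at 1 km spacing (see above)
vertical_levels = 100               # assumed
variables = 10                      # assumed: wind, temperature, humidity, ...
bytes_per_value = 4                 # single precision

snapshot_bytes = horizontal_cells * vertical_levels * variables * bytes_per_value
print(f"One snapshot: ~{snapshot_bytes / 1e12:.1f} TB")   # ~2 TB
```

Under these assumptions, a single three-dimensional snapshot already lands in the terabyte range, which is why output strategy, memory bandwidth and data placement have to be designed together with the model.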

A new high-performance weather model leads to more precise estimates of greenhouse gas emissions

In ETH’s EXCLAIM project, in which Empa is involved as an external partner, a highly efficient weather and climate model is being developed that makes optimum use of the capabilities of the latest generation of high-performance computers and breaks new ground in programming to achieve this. The starting point for this development is the ICON model, which was mainly developed by Deutscher Wetterdienst (German Weather Service) and the Max Planck Institute for Meteorology, and which will be used in the future by MeteoSwiss for its weather forecasts.

Atmospheric models, however, can be used not only for weather forecasting and climate predictions, but also to simulate air quality or the dispersion of pollution emission plumes, for example from volcanic eruptions or nuclear incidents.

Empa uses such models to estimate greenhouse gas emissions from individual sources or entire countries by comparing simulated concentrations with measurements, for example Empa’s measurements at the Jungfraujoch. Their estimates of Swiss emissions of methane and nitrous oxide are published in the National Greenhouse Gas Inventory, which is delivered annually by Switzerland to the UNFCCC under the Paris Climate Agreement. Empa thereby provides a valuable, independent review of the annually published inventory.
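The toy sketch below illustrates the basic principle of such an emission estimate in its simplest possible form: if the simulated concentration enhancements scale linearly with the emission rate, the prior emission can be rescaled so that the simulation best matches the measurements. All numbers are invented, and Empa’s actual inversions involve many sources, transport-model ensembles and proper uncertainty treatment.

```python
# Toy illustration of estimating an emission rate by comparing simulated
# and measured concentrations. Assumes concentrations scale linearly with
# the emission rate (true for a passive tracer); all numbers are made up.
import numpy as np

prior_emission = 100.0                             # assumed prior, e.g. kt/yr

# Concentration enhancements simulated with the transport model for the
# prior emission (one value per observation time at the measurement site).
simulated = np.array([1.2, 0.8, 1.5, 0.6, 1.1])    # e.g. ppb above background
measured  = np.array([1.5, 1.0, 1.7, 0.8, 1.3])    # observed enhancements

# Least-squares scaling factor: how much the prior must be scaled so that
# the simulated enhancements best match the measurements.
scale = np.dot(simulated, measured) / np.dot(simulated, simulated)
posterior_emission = scale * prior_emission

print(f"scaling factor: {scale:.2f}")
print(f"posterior emission estimate: {posterior_emission:.1f} kt/yr")
```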

In order to perform simulations at a previously unattainable resolution in the range of a few kilometres, Empa will in future rely on the powerful model being developed in EXCLAIM. This will require simulating up to several hundred different realizations of the concentrations of a greenhouse gas – a complex process that in the past was only possible with a coarse spatial resolution. It will also make it possible to use measurements from future satellites, which will measure the global distribution of CO2 and methane, for emission estimation. (Prof. Dominik Brunner, Amanda Caracas, Empa)


Source: Florian Meyer, ETH Zurich
