EXCLAIM: New High-Resolution Models Merge Weather and Climate

August 6, 2021

Aug. 6, 2021 — Torrential rain, hailstorms and floods in the Alpine region and northwest Europe: the past few weeks have highlighted the impacts of severe thunderstorms. But how exactly are extreme weather events connected to global warming? This is one of the central questions for researchers studying and modelling the interaction between weather and climate.

By representing the underlying fundamental physical processes, models are a very powerful tool to understand these interactions. But current models and the required computer infrastructure have reached a wall, limiting the extent to which researchers can draw conclusions about how, for example, climate change affects extreme weather. To overcome this issue, ETH Zurich has teamed up with partners to launch the EXCLAIM research initiative. This project aims to dramatically increase the spatial resolution of the models, thereby enhancing their accuracy in simulating the weather on a global scale in a future, warmer world.

Seamless weather simulations in climate models

“Thanks to their high resolution, the new, global models will simulate key processes such as storms and weather systems in much more detail than before, allowing us to study the interaction of climate change and weather events much more accurately,” says Nicolas Gruber, EXCLAIM lead PI and Professor of Environmental Physics.

EXCLAIM is interdisciplinary: alongside climate researchers from the ETH Center for Climate Systems Modeling (C2SM), the project involves ETH computer scientists, the Swiss National Supercomputing Centre (CSCS), the Swiss Data Science Center (SDSC), the Swiss Federal Laboratories for Materials Science and Technology (Empa), and MeteoSwiss, the Federal Office of Meteorology and Climatology. Not only will this collaboration improve climate modelling, it will also make the weather forecasts provided by MeteoSwiss more reliable. International project partners include Germany’s national meteorological service, Deutscher Wetterdienst (DWD), and the Max Planck Institute for Meteorology (MPI-M), which together developed the ICON (Icosahedral Nonhydrostatic) model – the basis of EXCLAIM – as well as the European Centre for Medium-Range Weather Forecasts (ECMWF), of which Switzerland is a full member.

With EXCLAIM, researchers are aiming to radically scale up the spatial resolution of the weather and climate models. To simulate global weather and climate with all its regional detail, such models place a virtual, three-dimensional grid over the Earth. Researchers then use the laws of physics to calculate the respective climate conditions for each cell in their models. Current global climate models typically have grid cells with a width of 50 to 100 kilometres. In the long run, EXCLAIM researchers aim to increase the resolution to just one kilometre.
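A back-of-the-envelope count makes clear what this jump in resolution means computationally. The sketch below assumes square cells tiling Earth’s surface – a simplification, since ICON actually uses an icosahedral triangular grid – but the orders of magnitude are what matter:

```python
# Rough estimate of the horizontal grid cells needed at various resolutions.
# Assumes square cells tiling Earth's surface (~510 million km^2); ICON's
# real icosahedral grid differs in detail, but the scaling is the same.

EARTH_SURFACE_KM2 = 510e6

def horizontal_cells(dx_km: float) -> int:
    """Approximate number of grid cells for a grid spacing of dx_km."""
    return round(EARTH_SURFACE_KM2 / dx_km**2)

for dx in (100, 50, 10, 1):
    print(f"{dx:>3} km grid: ~{horizontal_cells(dx):,} cells")
```

Refining from 100 km to 1 km thus multiplies the horizontal cell count ten-thousand-fold, from tens of thousands of cells to roughly half a billion.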

In the past, given the limited computing power of modern supercomputers, only regional weather could be simulated with such a fine grid – and for relatively short periods of time at most. With the new models, the researchers now hope to attain this fine resolution worldwide, enabling them to simulate weather patterns from a global climate perspective and with a much sharper focus. This is like giving global climate models an additional zoom function for small-scale events. “What’s more, the new models will pave the way for ‘forecasting’ weather in the future climate, providing the answers as to how extreme weather events like the torrential rain we experienced this summer might look in the future,” says Christof Appenzeller, Head of Analysis and Forecasting at MeteoSwiss.

Powerful infrastructure for climate simulations

Customised computer infrastructure is essential to get the best out of the new models. Weather and climate models are some of the most complex, most data-intensive computational problems there are, which is why the EXCLAIM models are being developed in parallel with the hardware and software for supercomputers. “The computing and data infrastructure is being tailored to the exact requirements of the weather and climate models,” says Thomas Schulthess, Director of the Swiss National Supercomputing Centre (CSCS) in Lugano. For example, the new “Alps” supercomputing system is configured to allow the high-resolution climate models to properly resolve convective systems, such as thunderstorms.

To effectively simulate weather and climate on a global scale over several decades with a grid width of just a few kilometres, the model will have to run approximately 100 times faster than is currently possible. The first option for achieving this goal is to deploy faster, more powerful computers. Switching from the current supercomputer at CSCS to the “Alps” system will be instrumental in this regard.

One challenge is the end of “Moore’s law”, which holds that processor performance doubles approximately every 20 months. “As processors haven’t increased in serial performance for about 15 years, the only way of improving supercomputer performance is to improve their parallel processing architecture,” Schulthess says. “Furthermore, it’s worth setting up the supercomputer architecture specifically to allow it to solve classes of research problems in an optimal manner.” The key to providing the requisite computing power here lies in a hybrid computer architecture in which conventional CPUs (central processing units), which handle general computation and move data between memory and other components, are deployed in conjunction with GPUs (graphics processing units).

The second option concerns the software, namely the optimisation of the model code to ensure it fully benefits from the hybrid computer architecture. EXCLAIM is taking a revolutionary approach by splitting the source code into two parts: a first part that represents the interface to the model developers and users; and an underlying software infrastructure part in which the model’s central algorithms are implemented with a high degree of efficiency for the respective hardware. CSCS, MeteoSwiss and C2SM have already used this approach in the current MeteoSwiss weather model with great success. This approach is now being applied to the ICON weather and climate model. “We were able to accelerate the MeteoSwiss weather model by a factor of ten, improving the reliability of the MeteoSwiss forecasts as a result,” Schulthess says.
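The idea behind this split can be illustrated with a toy stencil operator. In the sketch below – all names are hypothetical, and EXCLAIM’s actual software infrastructure is far more elaborate – model developers call one stable interface, while interchangeable backends implement the same operation for different hardware:

```python
# Sketch of the two-layer idea: model code calls a small, hardware-agnostic
# interface; interchangeable backends implement the operation efficiently
# for each architecture. Names here are hypothetical illustrations.
import numpy as np

def laplacian_naive(field):
    """Reference backend: plain Python loops (slow but obviously correct)."""
    out = np.zeros_like(field)
    for i in range(1, field.shape[0] - 1):
        for j in range(1, field.shape[1] - 1):
            out[i, j] = (field[i + 1, j] + field[i - 1, j]
                         + field[i, j + 1] + field[i, j - 1]
                         - 4 * field[i, j])
    return out

def laplacian_vectorized(field):
    """Optimized backend: the same stencil as whole-array operations.
    On a GPU machine, this layer would dispatch to GPU kernels instead."""
    out = np.zeros_like(field)
    out[1:-1, 1:-1] = (field[2:, 1:-1] + field[:-2, 1:-1]
                       + field[1:-1, 2:] + field[1:-1, :-2]
                       - 4 * field[1:-1, 1:-1])
    return out

BACKENDS = {"naive": laplacian_naive, "vectorized": laplacian_vectorized}

def laplacian(field, backend="vectorized"):
    """The stable interface the model developer programs against."""
    return BACKENDS[backend](field)
```

Because both backends produce identical results, model developers never need to touch the hardware-specific layer when the machine underneath changes.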

Managing the flood of data

Computing speed alone is not the decisive factor. Increasing the resolution of the models also leads to a data explosion. Furthermore, weather and climate research require and produce a high diversity of data. To ensure effective throughput, it is equally crucial that the computers are able both to access the data and to write the results to storage media as quickly as possible. The computing processes have to be organised accordingly, while memory bandwidth is maximised and costly data transfers avoided. “For the new weather and climate models to produce useful results, we have to optimise the entire infrastructure. To this end, we’re leveraging the expertise gained from many years of working with MeteoSwiss and the ETH domain,” Schulthess says.
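How quickly a kilometre-scale model generates data can be sketched with rough, purely illustrative numbers – roughly half a billion horizontal cells, 100 vertical levels, single-precision values:

```python
# Rough size of one 3D output snapshot at ~1 km global resolution.
# Assumptions (illustrative only): ~510 million horizontal cells,
# 100 vertical levels, single-precision (4-byte) values.

CELLS = 510_000_000
LEVELS = 100
BYTES_PER_VALUE = 4

def snapshot_gb(n_variables: int) -> float:
    """Gigabytes for one output step holding n_variables 3D fields."""
    return CELLS * LEVELS * BYTES_PER_VALUE * n_variables / 1e9

print(f"One 3D field per step:  ~{snapshot_gb(1):,.0f} GB")
print(f"Ten 3D fields per step: ~{snapshot_gb(10):,.0f} GB")
```

At hundreds of gigabytes per output step, writing results to storage can become as much of a bottleneck as computing them – hence the emphasis on optimising the entire infrastructure, not just the processors.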

A new high-performance weather model leads to more precise estimates of greenhouse gas emissions

In ETH’s EXCLAIM project, in which Empa is involved as an external partner, a highly efficient weather and climate model is being developed that makes optimum use of the capabilities of the latest generation of high-performance computers and breaks new ground in programming to achieve this. The starting point for this development is the ICON model, which was mainly developed by Deutscher Wetterdienst (German Weather Service) and the Max Planck Institute for Meteorology, and which will be used in the future by MeteoSwiss for its weather forecasts.

Atmospheric models, however, can be used not only for weather forecasting and climate predictions, but also to simulate air quality or the dispersion of pollution emission plumes, for example from volcanic eruptions or nuclear incidents.

Empa uses such models to estimate greenhouse gas emissions from individual sources or entire countries by comparing simulated concentrations with measurements, for example Empa’s measurements at the Jungfraujoch. Their estimates of Swiss emissions of methane and nitrous oxide are published in the National Greenhouse Gas Inventory, which is delivered annually by Switzerland to the UNFCCC under the Paris Climate Agreement. Empa thereby provides a valuable, independent review of the annually published inventory.
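The principle behind this comparison is an inverse problem: a transport model tells you how sensitive each measurement is to each source, and the emissions are then chosen to best explain the observed concentrations. The minimal sketch below uses invented numbers and a plain least-squares fit; real inversions incorporate prior estimates and uncertainty quantification:

```python
# Minimal sketch of emission estimation as an inverse problem: a transport
# model provides the sensitivity of each measurement to each source
# (matrix H); the emissions x are then solved for so that y ~ H @ x best
# matches the observed concentrations y. All numbers here are invented.
import numpy as np

rng = np.random.default_rng(0)

n_obs, n_sources = 50, 3
H = rng.uniform(0.0, 1.0, size=(n_obs, n_sources))   # model sensitivities
true_emissions = np.array([10.0, 4.0, 7.0])           # unknown in reality
y = H @ true_emissions + rng.normal(0, 0.1, n_obs)    # noisy observations

# Least-squares estimate of the emissions from the observations.
estimate, *_ = np.linalg.lstsq(H, y, rcond=None)
print("estimated emissions:", np.round(estimate, 1))
```

With enough independent measurements, the fit recovers the source strengths despite the measurement noise – the same logic, at far greater scale, that lets Empa check national inventories against atmospheric data.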

In order to perform simulations at a previously unattainable resolution in the range of a few kilometers, Empa will in future rely on the powerful model being developed in EXCLAIM. This will require simulating up to several hundred different realizations of the concentrations of a greenhouse gas – a complex process that in the past was only possible with a coarse spatial resolution. It will also make it possible to use measurements from future satellites, which measure the global distribution of CO2 and methane, for emission estimation. (Prof. Dominik Brunner, Amanda Caracas, Empa)


Source: Florian Meyer, ETH Zurich
