Could Machine Learning Replace the Entire Weather Forecast System?

By Oliver Peckham

April 27, 2020

Just a few months ago, a series of major new weather and climate supercomputing investments were announced, including a £1.2 billion order for the world’s most powerful weather and climate supercomputer and a tripling of the U.S.’ operational supercomputing capacity for weather forecasting. Weather and climate modeling are among the most power-hungry use cases for supercomputers, and research and forecasting agencies often struggle to keep up with the computing needs of models that are, in many cases, simulating the atmosphere of the entire planet as granularly – and as regularly – as possible.

What if that all changed?

In a virtual keynote for the HPC-AI Advisory Council’s 2020 Stanford Conference, Peter Dueben outlined how machine learning might (or might not) begin to augment – and even, eventually, compete with – heavy-duty, supercomputer-powered climate models. Dueben is the coordinator for machine learning and AI activities at the European Centre for Medium-Range Weather Forecasts (ECMWF), a UK-based intergovernmental organization that houses two supercomputers and provides 24/7 operational weather services at several timescales. ECMWF is also the home of the Integrated Forecast System (IFS), which Dueben says is “probably one of the best forecast models in the world.”

Why machine learning at all?

The Earth, Dueben explained, is big. So big, in fact, that building a representational model of the Earth’s weather and climate systems brick by brick is not only laborious, it also doesn’t deliver the accuracy you might imagine. Despite the computing firepower behind weather forecasting, most models remain at a roughly 10-kilometer resolution that is too coarse to represent clouds, and the chaotic dynamics of the atmosphere and its occasionally opaque interactions further complicate model outputs.

“However, on the other side, we have a huge number of observations,” Dueben said. “Just to give you an impression, ECMWF is getting hundreds of millions of observations onto the site every day. Some observations come from satellites, planes, ships, ground measurements, balloons…” This data – collected over the last several decades – amounts to hundreds of petabytes once simulations and climate modeling results are included.

“If you combine those two points, we have a very complex nonlinear system and we also have a lot of data,” he said. “There’s obviously lots of potential applications for machine learning in weather modeling.”

Potential applications of machine learning

“Machine learning applications are really spread all over the entire workflow of weather prediction,” Dueben said, breaking that workflow down into observations, data assimilation, numerical weather forecasting, and post-processing and dissemination. Across those areas, he explained, machine learning could be used for anything from weather data monitoring to learning the underlying equations of atmospheric motions.

By way of example, Dueben highlighted a handful of current, real-world applications. In one case, researchers had applied machine learning to detecting wildfires caused by lightning. Using observations of 15 variables (such as temperature, soil moisture and vegetation cover), the researchers constructed a machine learning-based decision tree to assess whether or not satellite observations included wildfires. The team achieved an accuracy of 77 percent – which, Dueben said, “doesn’t sound too great in principle,” but was actually “quite good.”
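As a rough illustration of that kind of classifier, the sketch below trains a decision tree on tabular features and reports its accuracy. The 15 feature columns, the synthetic labels and the scikit-learn model are all assumptions made for illustration; they are not the actual data or code behind the wildfire study.

```python
# Minimal sketch of a decision-tree classifier of the kind described above.
# The features, labels and hyperparameters are illustrative only.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Stand-in for observations of 15 variables (e.g. temperature, soil moisture,
# vegetation cover) at locations flagged as possible fires.
n_samples, n_features = 10_000, 15
X = rng.normal(size=(n_samples, n_features))
# Synthetic labels: 1 = lightning-caused wildfire present, 0 = no wildfire.
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=1.0, size=n_samples) > 1.0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

clf = DecisionTreeClassifier(max_depth=6, random_state=0)
clf.fit(X_train, y_train)

print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```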

Elsewhere, another team explored the use of machine learning to correct persistent biases in forecast model results. Dueben explained that researchers were examining the use of a “weak constraint” formulation of the 4D-Var data assimilation algorithm, “which is a kind of algorithm that would be able to learn this kind of forecast error and correct it in the data assimilation process.”

A visualization of the 4D-Var bias correction, with the lighter blue segments representing lower biases over time as the model learned. Image courtesy of Peter Dueben.

“We learn, basically, the bias,” he said, “and then once we have learned the bias, we can correct the bias of the forecast model by just adding forcing terms to the system.” Once 4D-Var was implemented on a sample of forecast model results, the biases were ameliorated. Though Dueben cautioned that the process is “still fairly simplistic,” a new collaboration with Nvidia is looking into more sophisticated ways of correcting those forecast errors with machine learning.
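The core idea – learn a systematic forecast drift, then apply it back as a forcing term – can be illustrated with a deliberately simple toy, shown below. The one-dimensional grid, the identity “true dynamics” and the averaging of forecast-minus-analysis increments are assumptions made for the sketch; the real weak-constraint 4D-Var machinery is far more involved.

```python
# Toy sketch of learning a persistent forecast bias and removing it as a forcing term.
# Everything here is synthetic and illustrative.
import numpy as np

rng = np.random.default_rng(1)
n_grid = 100  # toy 1-D grid

true_bias = 0.3 * np.sin(np.linspace(0, 2 * np.pi, n_grid))  # hidden systematic model error

def forecast_step(state):
    """One toy forecast step. The true dynamics are identity (the state should not
    change), so the drift added here plays the role of systematic model error."""
    return state + true_bias + rng.normal(scale=0.05, size=n_grid)

# "Training": average forecast-minus-analysis increments over many past cycles.
learned_bias = np.zeros(n_grid)
n_cycles = 200
for _ in range(n_cycles):
    analysis = rng.normal(size=n_grid)      # stand-in for an analysis (best estimate of the state)
    forecast = forecast_step(analysis)      # short-range forecast started from it
    learned_bias += (forecast - analysis) / n_cycles

# "Correction": subtract the learned bias as a forcing term in new forecasts.
state = rng.normal(size=n_grid)
corrected = forecast_step(state) - learned_bias
print("residual mean drift:", abs((corrected - state).mean()))
```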

Dueben also outlined applications in post-processing. Much of modern weather forecasting focuses on ensemble methods, where a model is run many times to obtain a spread of possible scenarios – and as a result, probabilities of various outcomes. “We investigate whether we can correct the ensemble spread calculated from a small number of ensemble members via deep learning,” Dueben said. Once again, machine learning – when applied to a ten-member ensemble looking at temperatures in Europe – improved the results, reducing error in temperature spreads.
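A minimal sketch of that kind of post-processing step appears below: a small regression network maps statistics of an underdispersive ten-member ensemble to the spread of a larger ensemble. The synthetic ensembles, the chosen features and the scikit-learn MLP are assumptions for illustration, not the actual ECMWF experiment.

```python
# Sketch: correct the spread of a small ensemble so it better matches a large one.
# Ensembles and model choices are synthetic and illustrative.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
n_cases, small_m, large_m = 5_000, 10, 50

true_sigma = rng.uniform(0.5, 3.0, size=n_cases)
# The small ensemble is made systematically underdispersive; the large one is not.
small = rng.normal(scale=0.7 * true_sigma[:, None], size=(n_cases, small_m))
large = rng.normal(scale=true_sigma[:, None], size=(n_cases, large_m))

X = np.column_stack([small.mean(axis=1), small.std(axis=1)])  # small-ensemble statistics
y = large.std(axis=1)                                         # target: large-ensemble spread

model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=1000, random_state=0)
model.fit(X[:4000], y[:4000])

raw_error = np.abs(small.std(axis=1)[4000:] - y[4000:]).mean()
ml_error = np.abs(model.predict(X[4000:]) - y[4000:]).mean()
print(f"uncorrected spread error: {raw_error:.3f}, ML-corrected: {ml_error:.3f}")
```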

Can machine learning replace core functionality – or even the entire forecast system?

“One of the things that we’re looking into is the emulation of different parametrization schemes,” Dueben said. Chief among those, at least initially, has been the radiation scheme of forecast models, which accounts for the fluxes of solar radiation between the ground, the clouds and the upper atmosphere. As a trial run, Dueben and his colleagues are using extensive radiation output data from a forecast model to train a neural network. “First of all, it’s very, very light,” Dueben said. “Second of all, it’s also going to be much more portable. Once we represent radiation with a deep neural network, you can basically port it to whatever hardware you want.”
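The sketch below shows the general shape of such an emulator: a small fully connected network trained to reproduce the output of a reference scheme on column-like inputs. The number of vertical levels, the input and output variables, and the stand-in “reference scheme” are placeholders for illustration – the actual IFS radiation code and training data are not shown here.

```python
# Sketch of emulating a parametrization scheme (a stand-in for radiation) with a
# small neural network. Sizes and the "reference scheme" are illustrative only.
import torch
import torch.nn as nn

torch.manual_seed(0)
n_levels = 60                          # vertical levels in a column (illustrative)
n_in, n_out = 4 * n_levels, n_levels   # e.g. T, q, cloud, pressure -> heating rates

def reference_scheme(x):
    """Stand-in for the expensive physical scheme the network is trained to emulate."""
    return torch.tanh(x[:, :n_out]) + 0.1 * x[:, n_out:2 * n_out]

emulator = nn.Sequential(
    nn.Linear(n_in, 256), nn.ReLU(),
    nn.Linear(256, 256), nn.ReLU(),
    nn.Linear(256, n_out),
)

opt = torch.optim.Adam(emulator.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(500):
    x = torch.randn(128, n_in)    # random synthetic atmospheric columns
    y = reference_scheme(x)       # "expensive" scheme output used as the label
    opt.zero_grad()
    loss = loss_fn(emulator(x), y)
    loss.backward()
    opt.step()

print("final training loss:", float(loss))
```

Once trained, the emulator is just a stack of matrix multiplications, which is why it can be both fast and easy to port to different hardware.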

Showing a pair of output images, one from the machine learning model and one from the forecast model, Dueben pointed out that it was hard to notice significant differences – and even refused to tell the audience which was which. Furthermore, he said, the model had achieved around a tenfold speedup. (“I’m quite confident that it will actually be much better than a factor of ten,” Dueben said.)

A comparison of radiation outputs from a machine learning emulator and the original model. Image courtesy of Peter Dueben.

Dueben and his colleagues have also scaled their tests up to more ambitious realms. They pulled hourly data on geopotential height at 500 hPa (Z500) – essentially the altitude of a mid-atmosphere pressure surface – and trained a deep learning model to predict future changes in Z500 across the globe using only that historical data. “For this, no physical understanding is really required,” Dueben said, “and it turns out that it’s actually working quite well.”
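Conceptually, such a data-driven forecast can look like the sketch below: a small convolutional network learns to map a gridded field at one time to the same field a few hours later, and is then stepped forward repeatedly. The grid size, lead time and synthetic “advection” data are assumptions for illustration; the real experiments train on decades of reanalysis fields.

```python
# Sketch of a purely data-driven forecast of a single field (a stand-in for Z500).
# Grid, lead time and training data are synthetic and illustrative.
import torch
import torch.nn as nn

torch.manual_seed(0)
n_lat, n_lon = 32, 64  # coarse global grid (illustrative)

model = nn.Sequential(
    nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 1, kernel_size=3, padding=1),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

def synthetic_pair(batch=16):
    """Stand-in for (field at time t, field at t + 6h) pairs from an archive."""
    x = torch.randn(batch, 1, n_lat, n_lon)
    y = torch.roll(x, shifts=2, dims=-1) + 0.05 * torch.randn_like(x)  # toy "advection"
    return x, y

for step in range(300):
    x, y = synthetic_pair()
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    opt.step()

# Roll the learned model forward to make a multi-step forecast (where the
# instability Dueben mentions can appear).
with torch.no_grad():
    state, _ = synthetic_pair(batch=1)
    for _ in range(10):
        state = model(state)
print("forecast field shape:", tuple(state.shape))
```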

Still, Dueben forced himself to face the crucial question.

“Is this the future?” he asked. “I have to say it’s probably not.”

There were several reasons for this. First, Dueben said, the simulations were unstable, eventually “blowing up” if they were stretched too far. “Second of all,” he said, “it’s also unknown how to increase complexity at this stage. We only have one field here.” Finally, he explained, there were only forty years of sufficiently detailed data with which to work.

Still, it wasn’t all pessimism. “It’s kind of unlikely that it’s going to fly and basically feed operational forecasting at one point,” he said. “However, having said this, there are now a number of papers coming out … where people are looking into this in a much, much more complicated way than we have done with really sophisticated convolutional networks … and they get, actually, quite good results. So who knows!”

The path forward

“The main challenge for machine learning in the community that we’re facing at the moment,” Dueben said, “is basically that we need to prove now that machine learning solutions can really be better than conventional tools – and we need to do this in the next couple of years.”

There are, of course, many roadblocks to that goal. Forecasting models are extraordinarily complicated; iterations on deep learning models require significant HPC resources to test and validate; and metrics of comparison among models are unclear. Dueben also outlined a series of major unknowns in machine learning for weather forecasting: could our explicit knowledge of atmospheric mechanisms be used to improve a machine learning forecast? Could researchers guarantee reproducibility? Could the tools be scaled effectively to HPC? The list went on.

“Many scientists are working on these dilemmas as we speak,” Dueben said, “and I’m sure we will have an enormous amount of progress in the next couple of years.” Outlining a path forward, Dueben emphasized a “mixture of a top-down and a bottom-up approach to link machine learning with weather and climate models.” Per his diagram, this would combine neural networks informed by human knowledge of Earth systems with reliable benchmarks, scalability and better uncertainty quantification.

As far as where he sees machine learning for weather prediction in ten years?

“It could be that machine learning will have no long-term effect whatsoever – that it’s just a wave going through,” Dueben mused. “But on the other hand, it could well be that machine learning tools will actually replace almost all conventional models that we’re working with.”
