Topology Can Help Us Find Patterns in Weather

By James Reinders

December 6, 2018

Topology, the study of shapes, seems to be all the rage. You could even say that data has shape, and shape matters. Shapes are comfortable and familiar concepts, so it is intriguing to see that many applications are being recast to use topology; one such application is looking for weather and climate patterns.

The Quest for Explainability

Interestingly, a key motivation for looking to topology is explainability (interpretability). One might say that, for many, the honeymoon period with “AI” (Artificial Intelligence, including machine learning) is over. Now, we hear talk of “XAI” (Explainable Artificial Intelligence). The goals for XAI, as expressed by researchers at DARPA (Defense Advanced Research Projects Agency), are to (1) produce more explainable models while maintaining a high level of learning performance (prediction accuracy), and (2) enable human users to understand, appropriately trust, and effectively manage the emerging generation of artificially intelligent partners.

One XAI research area, called topological data analysis, offers us the opportunity to express the results of data analysis in terms of shapes.

Interest in Extreme Weather

Thanks to high-performance computing, weather predictions have become both more accurate and more precise (localized) in recent years. While this is true for most weather, it is far less the case for extreme weather events. It turns out that extreme weather, such as thunderstorms, blizzards, heavy rains, dry spells, and hurricanes, is more challenging to forecast than ordinary weather. The immediate and tangible benefits of better forecasting of extreme weather are obvious. There are longer-term trends to consider as well. In this vein, puzzling over the apparent supercharging of extreme weather events by human activity is one of the youngest and most important branches of climate science.

Weather vs. Climate

Scientists speak distinctly of weather forecasts vs. climate forecasts. Today, the two types of modeling and forecasting use different mathematical models and computer programs. The fundamental reason for the differences is that computer resources are limited in scope and speed. If we had infinitely fast computers at our command, the models for weather and climate would converge. As it is, we are very far from that, and thus weather and climate modeling are very different beasts in practice. Despite these differences, techniques such as topological data analysis will find a place in both weather and climate forecasting.

The concept of weather versus climate can be thought of this way: a weather forecast seeks to help us understand if it will rain on Thursday, while a climate forecast seeks to help us understand if a drought will continue for the next decade. A weather forecast for a hurricane in Central Florida should help us deploy emergency workers now, while a climate forecast for years of drought could guide planning for water rationing programs, longer-term investments in locating new water sources, or efforts to reduce demand. It is not unusual for climate models to be run on supercomputers only when the computer is not being used for the first priority, which is weather forecasting. This makes sense when you consider that there is no immediate risk if a climate computation takes a bit longer to run, but the timeliness of a weather forecast can be critical.

Patterns from Topological Analysis

Researchers at the University of Liverpool[1], working with researchers at Lawrence Berkeley National Laboratory, are exploring the use of topological data analysis for detecting and classifying patterns (shapes) in climate data.

A pattern of interest is the event called an atmospheric river: a long, narrow, high-moisture filament that resembles a river in many ways, including its shape. Atmospheric rivers have been called “rivers in the sky.”

Given this “shape” thinking, it is not surprising that atmospheric rivers can have very different widths and lengths, yet they have connectivity like a river and holes like small islands in the path of a river. From shape alone, we can tell the difference between a river and a string of unconnected lakes. Atmospheric rivers play a key role in water movement; a strong atmospheric river can carry seven to fifteen times the flow at the mouth of the Mississippi River. Using topological analysis, atmospheric rivers can be identified and separated from atmospheric events that do not have the shape of an atmospheric river.
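To make the shape intuition concrete, here is a minimal Python sketch, not the researchers' code, that thresholds a gridded integrated water vapor (IWV) field, labels its connected components, and keeps only the elongated, river-like ones. The threshold and aspect-ratio values are illustrative placeholders, and the published method builds richer topological summaries than this single-threshold test.

import numpy as np
from scipy import ndimage

def find_river_like_shapes(iwv, iwv_threshold=20.0, min_aspect_ratio=4.0):
    """Label elongated high-moisture regions in a 2D IWV field (kg/m^2).

    iwv_threshold and min_aspect_ratio are hypothetical tuning values.
    """
    # Connected components of the high-moisture mask: the
    # "connectivity" part of the shape story.
    mask = iwv > iwv_threshold
    labels, count = ndimage.label(mask)

    river_like = np.zeros_like(labels)
    for i in range(1, count + 1):
        rows, cols = np.nonzero(labels == i)
        height = rows.max() - rows.min() + 1
        width = cols.max() - cols.min() + 1
        # Crude elongation test: a river is long and narrow, while a
        # string of unconnected lakes appears as separate compact blobs.
        if max(height, width) >= min_aspect_ratio * min(height, width):
            river_like[labels == i] = i
    return river_like

A full topological treatment would also track how components and holes appear and persist as the IWV threshold varies, which is the kind of information topological data analysis captures systematically.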

Atmospheric rivers have the shape of rivers in the sky; they can yield extreme rainfall and floods that take away life, or normal rainfall that supports life.

Atmospheric rivers that contain the largest amounts of water vapor and the strongest winds can lead to extreme rainfall and floods when they stall over watersheds vulnerable to flooding. Such events can disrupt travel, induce mudslides, and cause catastrophic damage to life and property. Not all atmospheric rivers cause damage; the majority are simply responsible for the rain or snow that animals and plants depend upon for life.

Wet weather for Seattle: an atmospheric river, commonly called the Pineapple Express, is easy to see in this image.

No atmospheric river appears in this image.

Shape tells the story: the familiar shape of a “river” jumps out in the first image and is absent in the second. It seems intuitive that the image with an apparent river will result in a lot of rainfall in the U.S. Pacific Northwest. The first image shows an atmospheric river (AR), this particular one commonly called the “Pineapple Express,” characterized by a strong flow of moisture originating from the waters near the Hawaiian Islands and associated with heavy precipitation. The second image is an example of a non-AR event: the moisture does not form a narrow corridor of highly concentrated atmospheric moisture reaching the Pacific coast of North America. Both images are courtesy of [email protected]; they come from an integrated water vapor (IWV, kg/m^2) product of version 5.1 of the Community Atmosphere Model (CAM 5.1), simulated at the National Energy Research Scientific Computing Center (NERSC), Lawrence Berkeley National Laboratory, CA, USA.

The researchers combined ideas from topological data analysis with machine learning for detecting, classifying, and characterizing extreme weather events, such as certain atmospheric rivers. While the researchers developed their techniques to analyze climate model output, the approach will have applicability to weather model output as well. They have successfully demonstrated it on the Cori supercomputer, one of the world’s dozen most powerful machines, built with high-performance Intel multicore processors and operated by the National Energy Research Scientific Computing Center (NERSC).[2]

The researchers have published results showing that their accuracy (up to 90%) is higher than any previously published result for detection and classification of atmospheric rivers. They applied their algorithm to climate model output spanning nearly four decades, at four different spatial resolutions and two different temporal resolutions. Computing on up to 480 high-performance Intel Xeon (Haswell) processor cores, their typical run times were on the order of 10 minutes for the topological analysis followed by a few hours for the classification algorithm. Their implementation used C++ code for the topological data analysis and Python scikit-learn for the machine learning classification algorithm, an SVM (Support Vector Machine). For the SVM, good scaling was achieved because the Intel Data Analytics Acceleration Library (DAAL) was installed to accelerate Python.
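For the classification stage, a rough sketch of the scikit-learn side might look like the following. The feature matrix and labels are randomly generated stand-ins for the per-snapshot topological summaries, not data from the study.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Stand-in features: one row per climate snapshot, with columns such
# as component counts and sizes produced by the (C++) topological stage.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))       # hypothetical topological features
y = rng.integers(0, 2, size=500)    # 1 = atmospheric river, 0 = not

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Standardize the features, then fit an SVM classifier.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
model.fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")

Because acceleration libraries such as Intel DAAL plug in underneath the Python layer, code at this level need not change to benefit from the speedups described above.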

Shape of Weather to Come

We all have a vested interest in seeing climate and weather models improve, and this is especially true for extreme weather, which can literally be a matter of life or death. This particular work shows that thinking in terms of shapes via topological data analysis, combined with machine learning, may provide a uniquely powerful approach for identifying and analyzing extreme weather. Aside from providing a more accurate method, the use of topological data analysis might lead to better interpretability of the predictions. Whether we are considering deploying a thousand emergency workers now, or considering a multi-billion-dollar infrastructure investment, we want explanations from those responsible for the forecast that motivates our actions. Humans ultimately need to be able to defend their predictions, even when they come from “artificially intelligent partners” (AI programs). Topological data analysis offers to help scientists with this challenge, and to do so in the familiar language of shapes.


[1] Machine Learning and Topological Data Analysis: Application to Pattern Classification in Fluid and Climate Simulations, by Vitaliy Kurlin and Grzegorz Muszynski of the University of Liverpool, and Michael Wehner, Karthik Kashinath, and Prabhat of Lawrence Berkeley National Laboratory; presented at the Big Data Summit.

[2] Number 10 on the TOP500.org list when the work was done; number 12 as of the November 2018 list.

James Reinders is an HPC enthusiast and author of eight books, with more than 30 years of industry experience, including 27 years at Intel Corporation (retired June 2016).
