Climate-Induced Storm Flood Data Wins 2023 DesignSafe Dataset Award

June 7, 2023

A dataset depicting ‘plausible worst-case scenario’ flooding in California has received a 2023 DesignSafe Dataset Award, given in recognition of the dataset’s diverse contributions to natural hazards research.

The dataset’s story begins in 2010, when the U.S. Geological Survey (USGS) developed a ‘what-if’ scenario of an extremely powerful rainstorm striking California, which it called ARkStorm (1.0), combining ‘AR’ for atmospheric river with ‘k’ for 1,000. The USGS’s motivation came from its sediment research showing that such ‘megastorms’ have recurred historically every 100 to 200 years.

Adapted from the visualization for ARkStorm 2.0 by James Done (NCAR), with the color scale based on Scripps/CW3E’s AR scale. Credit: Daniel Swain, UCLA.

ARkStorm 1.0 found that a 25-day deluge of atmospheric rivers would cause widespread flooding and wind damage along California’s Central Valley, at least hundreds of billions of dollars in damage to property and infrastructure, and the evacuation of millions of people.

ARkStorm 1.0 was meant as a stress test for emergency response systems: reveal the weak points before a real emergency strikes so that agencies can take steps to address them.

But one thing ARkStorm 1.0 didn’t account for was climate change, which scientists predict will increase atmospheric water vapor and could therefore intensify megastorms.

Climate Makes ARkStorm 2.0

Enter ARkStorm 2.0 — an update to ARkStorm 1.0 that takes climate change into account by embedding a high-resolution weather model inside a climate model, using the climate model’s output as the boundary conditions.

Scientists embarked on the first of three phases of ARkStorm 2.0 in 2022: development of the atmospheric scenario. They completed a dataset that includes the initial condition, forcing, and configuration files for the Weather Research and Forecasting (WRF) model simulations used to develop the hypothetical extreme storms.

It also includes WRF output meteorological data such as precipitation, wind speed, snow water equivalent, and surface runoff to characterize natural hazard impacts from the storm simulations.

Award-Winning Dataset

The dataset, PRJ-3499 | ARkStorm 2.0: Atmospheric Simulations Depicting Extreme Storm Scenarios Capable of Producing a California Megaflood, received a 2023 DesignSafe Dataset Award, given in recognition of the dataset’s diverse contributions to natural hazards research. It is publicly available on the NHERI DesignSafe cyberinfrastructure (https://doi.org/10.17603/ds2-mzgn-cy51).

The ARkStorm 2.0 dataset team consisted of Xingying Huang (NCAR) and Daniel Swain of the University of California, Los Angeles (UCLA).

“The ARkStorm 2.0 dataset is curated from an impacts-based perspective,” said climate scientist Daniel Swain of UCLA’s Institute of the Environment and Sustainability. Swain is also affiliated with the National Center for Atmospheric Research’s Capacity Center for Climate and Weather Extremes and is a California Climate Fellow with The Nature Conservancy.

“We tried to include the variables that are most important for folks who might be doing follow-on analyses to understand the impacts to people, infrastructure, ecosystems of these scenarios,” Swain said.

“Practically, this dataset has been shared with the U.S. Federal Emergency Management Agency (FEMA), the California Department of Water Resources, the Governor’s Office of Emergency Services, and other entities because there is real interest in gaming this scenario out from an infrastructure and risk assessment perspective,” Swain added.

Published Results

The ARkStorm 2.0 dataset is the primary result of a study published August 2022 in Science Advances. The study received widespread attention, including coverage by the New York Times.

“The key findings are twofold. One is, we assess this as a scenario and study what it actually looks like. Yes, clearly, it rains a lot. But how much is a lot, how quickly, and which areas are hit hardest,” Swain said.

Snowfall associated with California megastorm scenarios. (A and B) Cumulative 30-day gross SWE (mm) during ARkHist (A) and ARkFuture (B). (C) Difference in cumulative SWE (mm) between ARkFuture and ARkHist. (D and E) Mean snow fraction (snow-to-rain ratio, in percent) during ARkHist (D) and ARkFuture (E). (F) Difference (%) in mean snow fraction between ARkFuture and ARkHist. Credit: DOI: 10.1126/sciadv.abq0995.

In these scenarios and datasets, the scientists focused on hourly precipitation intensity, which revealed a notable difference between the future and historical scenarios.

“It’s not just that the future scenario is wetter overall,” Swain said. “The heaviest localized downpours also get considerably more intense. Some of the peaks of the future scenario look like the heavy Texas-style downpours. These intense hourly rainfall rates — unusual for California — can cause big problems in urban and other populated areas.”

The study also conducted a broader assessment of the changing risk of a warming climate, narrowing down the likelihood of a megastorm in the past, present, and future.

“We found that even in this era today of severe droughts and wildfires in California, the risk of a megaflood has probably doubled relative to about a century ago,” Swain said.

Xingying Huang of NCAR (L) and Daniel Swain of UCLA (R).

In practical terms, an event once expected every 200 years is now roughly a 100-year flood event, and with continued climate change it could become a once-in-50-years flood event.

“That’s a profound difference with just a few degrees of warming,” Swain said.
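
The return-period shift maps directly onto annual probabilities: a 1-in-200-year event has a 0.5% chance of occurring in any given year, a 1-in-100-year event a 1% chance, and a 1-in-50-year event a 2% chance. A minimal sketch of that arithmetic (the 30-year planning horizon is an illustrative assumption, not a figure from the study):

```python
# Illustrative return-period arithmetic; not part of the ARkStorm 2.0 dataset.

def annual_probability(return_period_years: float) -> float:
    """Annual chance of at least one exceedance for a T-year event."""
    return 1.0 / return_period_years

def probability_within(return_period_years: float, horizon_years: int) -> float:
    """Chance of at least one exceedance within a multi-year horizon."""
    p = annual_probability(return_period_years)
    return 1.0 - (1.0 - p) ** horizon_years

for T in (200, 100, 50):
    print(f"1-in-{T}-year flood: {annual_probability(T):.1%} per year, "
          f"{probability_within(T, 30):.0%} chance over a 30-year horizon")
```

Even a modest-sounding annual probability compounds quickly: a 1-in-100-year event has roughly a one-in-four chance of occurring at least once over 30 years.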

DesignSafe Assistance

The datasets produced by the ARkStorm 2.0 scenario are voluminous, making DesignSafe a good choice to host and share them, since DesignSafe imposes no direct caps on dataset size.

“DesignSafe — it’s in the name. This is very much infrastructure, design and risk-assessment relevant. The repository is well-designed to accommodate sharing with various state and federal agencies,” Swain said.

Initially, the dataset didn’t meet DesignSafe’s strict curation criteria, and it wasn’t accepted. Fortunately, Swain and Huang reached out to DesignSafe staff, who worked with them to improve the dataset’s structure and accessibility.

“It helped to have professional data curators at DesignSafe available for things like this. With their help, we increased the value of the repository from an open science perspective, and we made it easier for scientists to engage with our data. We can just send people a direct URL where they can click on a file name and download it,” he added.

Climate Signal

Said Swain: “Climate change is here. It’s no longer a prediction about the future; it’s an observable reality in the present. The changes that we expect to see — revealed in the data — when it comes to certain kinds of extreme events, especially temperature- and precipitation-related ones (and precipitation, of course, is most directly related to flooding), are large, because the atmospheric response to warming, following the Clausius–Clapeyron curve, is exponential in the case of moisture. This climate signal is becoming increasingly obvious, even in places like California that people more often associate with water scarcity, drought, and wildfires.”
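
The Clausius–Clapeyron relation Swain refers to makes the atmosphere’s moisture-holding capacity grow roughly exponentially with temperature, on the order of 6–7% per degree Celsius near typical surface temperatures. A minimal sketch using the Magnus approximation for saturation vapor pressure (the 15 °C baseline is an illustrative assumption, not a value from the study):

```python
import math

def saturation_vapor_pressure_hpa(t_celsius: float) -> float:
    """Magnus approximation to saturation vapor pressure over water (hPa)."""
    return 6.112 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

baseline = saturation_vapor_pressure_hpa(15.0)
for dt in (1.0, 2.0, 3.0):
    warmed = saturation_vapor_pressure_hpa(15.0 + dt)
    print(f"+{dt:.0f} °C: {100 * (warmed / baseline - 1):.1f}% "
          "more moisture-holding capacity")
```

Because the growth is exponential rather than linear, each additional degree of warming adds more absolute moisture capacity than the degree before it — which is why “just a few degrees” translates into substantially heavier extreme precipitation.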

About DesignSafe

DesignSafe is a comprehensive cyberinfrastructure that is part of the NSF-funded Natural Hazards Engineering Research Infrastructure (NHERI) and provides cloud-based tools to manage, analyze, understand, and publish critical data for research to understand the impacts of natural hazards. The capabilities within the DesignSafe infrastructure are available at no cost to all researchers working in natural hazards. The cyberinfrastructure and software development team is located at the Texas Advanced Computing Center (TACC) at The University of Texas at Austin, with a team of natural hazards researchers from the University of Texas, the Florida Institute of Technology, and Rice University comprising the senior management team. NHERI is supported by multiple grants from the National Science Foundation, including the DesignSafe Cyberinfrastructure, Award #2022469.


Source: DesignSafe
