Amid wildfire and drought season, worries are growing that another natural disaster is looming over the West Coast: megafloods. While concurrent threats from megafloods and droughts may seem at odds with each other, researchers at the National Center for Atmospheric Research (NCAR) recently conducted a supercomputer-powered study showing that climate change is greatly exacerbating the risk of catastrophic flooding in California.
California hasn’t seen a megaflood in more than 150 years. The last one, the Great Flood of 1862, killed more than 4,000 people (about one percent of the state’s population at the time) and destroyed the equivalent of more than $3 billion in property, adjusting for inflation.
Megafloods are closely linked to atmospheric rivers, the long, narrow bands of airborne moisture that flow over the West Coast. But climate change, of course, is drastically altering what might otherwise be predictable weather patterns — so the researchers set out to understand how those changes are affecting the likelihood of California experiencing another megaflood.
To do that, the research team coupled NCAR’s Community Earth System Model and its Weather Research and Forecasting Model — the former representing the global climate, the latter representing regional weather events. In a baseline scenario, they recreated the kinds of storms associated with the Great Flood of 1862 under 1996-2005 climate conditions; then, they reran those storms under the much warmer climate projected for 2071-2080 by an aggressive emissions scenario.
“Imagine the wettest storms you can remember in your lifetime. Then imagine they all came back to back, in quick succession, over the course of a month,” explained co-lead author Daniel Swain, a climate scientist affiliated with NCAR and based at the University of California, Los Angeles, in an interview with NCAR’s David Hosansky. “That’s essentially our historical scenario. Add a couple of additional storms that are worse than anything you can remember in your lifetime, and that’s equivalent to our future scenario.”
The simulations were run on NCAR’s in-house Cheyenne supercomputer, housed at the NCAR-Wyoming Supercomputing Center (NWSC). Cheyenne, built by HPE and powered by Intel Broadwell CPUs, delivers 4.79 Linpack petaflops, placing it 109th on the most recent Top500.
The result of the simulations: in the baseline scenario, the storms typically produced about 20-40 inches of precipitation. In the future scenario, that rose to around 55 inches, with some unlucky areas receiving up to 100 inches. To make matters worse, the risk is already elevated in the baseline scenario relative to the (comparatively cool) early 1900s — all of which adds up to a worst-case scenario where, by 2060, megafloods are up to seven times more likely than they were in 1920.
“Although California has recently experienced historically severe drought and the broader Southwest is facing an accelerating water scarcity crisis, it’s important to remember that this is still a region susceptible to rare but potentially severe floods,” Swain said. “It may seem paradoxical that climate change is increasing the risks associated with both droughts and floods in a place like California, but that’s exactly what the scientific evidence suggests.”
To learn more about this research, read the reporting from NCAR’s David Hosansky here.
You can also read the research paper, written by Xingying Huang and Daniel Swain, here.