Rare, severe flooding struck both Kentucky and Missouri in the last week alone — and with climate change accelerating, such events are likely to continue. However, flood modeling remains computationally expensive. Now, researchers from Oak Ridge National Laboratory (ORNL) and Tennessee Technological University have created the TRITON toolkit, which leverages GPUs to quickly produce accurate 2D inundation maps for a given area.
“The unique thing about TRITON is not just that it uses GPUs — it’s not the only GPU-accessible flood model,” explained Shih-Chieh Kao, lead researcher on the project and a group leader at ORNL, in an interview with ORNL’s Betsy Sonewald. “But it is customized to use multiple GPUs simultaneously, which makes it suitable for solving flood problems on Summit.”
Summit, of course, is ORNL’s flagship supercomputer, which delivers 148.6 Linpack petaflops, boasts tens of thousands of Nvidia GPUs and ranked fourth on the most recent Top500 list.
TRITON — which stands for “Two-dimensional Runoff Inundation Toolkit for Operational Needs” — can simulate both pluvial floods (flash floods driven by intense rainfall) and riverine floods (which originate from overflowing streams or rivers).
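TRITON itself solves the full 2D shallow water equations on a grid; the article does not describe its numerics in detail. As a rough intuition for what grid-based inundation mapping involves, here is a deliberately simplified, mass-conserving toy routing scheme — entirely illustrative, and not TRITON’s algorithm or API:

```python
import numpy as np

def route_flood(dem, depth, steps=100, k=0.2):
    """Toy mass-conserving flood routing on a 2D elevation grid.

    Not TRITON's solver (TRITON solves the 2D shallow water
    equations): each step here simply moves a fraction k of the
    water-surface difference toward each lower 4-neighbor.
    """
    h = depth.astype(float).copy()
    for _ in range(steps):
        w = dem + h                                # water surface elevation
        flux = np.zeros_like(h)
        for axis in (0, 1):                        # vertical, horizontal
            for sign in (1, -1):                   # both neighbor directions
                wn = np.roll(w, sign, axis=axis)   # neighbor's surface
                dw = w - wn                        # positive means outflow
                edge = [slice(None), slice(None)]
                edge[axis] = 0 if sign == 1 else -1
                dw[tuple(edge)] = 0.0              # no flow across the grid edge
                # per-direction outflow, capped so a cell never sends
                # more water than it holds (k/4 and h/4 per direction)
                out = np.minimum(np.clip(dw, 0.0, None) * k / 4.0, h / 4.0)
                flux -= out                             # leaves this cell...
                flux += np.roll(out, -sign, axis=axis)  # ...arrives next door
        h += flux
    return h

# Drop a unit volume of water on the corner of a bowl-shaped DEM;
# it should pool in the central pit while total volume is conserved.
dem = np.array([[2.0, 2.0, 2.0],
                [2.0, 0.0, 2.0],
                [2.0, 2.0, 2.0]])
depth = np.zeros((3, 3))
depth[0, 0] = 1.0
h = route_flood(dem, depth, steps=200)
print(h.sum(), h[1, 1])  # volume stays ~1.0; most water ends in the pit
```

A real solver like TRITON tracks momentum as well as depth and handles boundary conditions, rainfall inputs and river inflows — this sketch only conveys the per-cell, per-step grid structure that makes the problem map so naturally onto GPUs.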
“In order to really understand flood impact, we need to understand inundation, which includes how deep a river is and accounts for different flood events: riverine and flash floods,” Kao said. “Conventional flood models usually only address riverine floods. TRITON can address both and provide more information about the flood impact. If you have this inundation information, you can overlay it on assets and evaluate which are at risk and which are not.”
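The overlay step Kao describes — intersecting an inundation map with asset locations to flag what is at risk — can be sketched in a few lines. The depth raster, asset names and 0.3 m threshold below are all made up for illustration:

```python
import numpy as np

# Hypothetical simulated maximum flood depth (meters) on a small grid.
depth = np.array([[0.0, 0.1, 0.9],
                  [0.0, 0.5, 1.4],
                  [0.0, 0.0, 0.2]])

# Hypothetical assets, located by (row, col) grid cell.
assets = {"clinic": (0, 2), "school": (1, 1), "depot": (2, 0)}
threshold = 0.3  # illustrative depth considered damaging

# Overlay: an asset is at risk if its cell floods above the threshold.
at_risk = {name for name, (r, c) in assets.items()
           if depth[r, c] > threshold}
print(sorted(at_risk))  # → ['clinic', 'school']
```

In practice the raster would come from a TRITON run and the assets from a GIS layer, but the principle is the same: a per-cell depth map turns flood output into a direct risk screen.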
To test TRITON’s mettle, the researchers simulated the flooding in Houston caused by 2017’s Hurricane Harvey, which has become a reference point for hurricane and flood modeling due to both the severity of the flooding and the shortcomings of forecasts of the storm’s progression. In the test case, the researchers simulated a 10-day period on both CPU-based and GPU-based configurations, finding that TRITON on a single GPU node outperformed a 64-node CPU simulation.
The researchers say that TRITON is open source, scalable from laptops to supercomputers, and under continuous development.
“TRITON will be a foundation for us to keep building on, and we call it a toolkit for a reason,” Kao said. “We keep building to make it more useful — that’s our vision. As computing power increases, and the prices go down, eventually everyone should have more access to use these capabilities to better simulate floods.”
To learn more about this research, read the reporting from ORNL’s Betsy Sonewald.