“Our long-term goal is to estimate the structure of the Earth with UQ [uncertainty quantification],” Bui-Thanh explained. “If you can image the Earth quite well and solve for how an earthquake propagates in real time, you can help decision-makers know where there will be potential earthquakes, and use that information to set building codes, determine where and when to evacuate, and save lives.”

The research also has important applications in energy discovery, potentially helping companies locate new oil resources and determine how much fossil fuel remains in existing wells. The mathematical methods will be general enough that researchers can apply them to a host of other inverse problems, such as medical imaging and weather forecasting.

Overcoming the Curse of Dimensionality

The problem at the heart of Bui-Thanh’s research is known as the ‘curse of dimensionality.’ As one tries to gain more resolution or clarity in solving inverse problems, the difficulty of the calculations increases exponentially, frequently making them computationally intractable.
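
To make that growth concrete, here is a minimal sketch, not taken from the article, of why brute-force exploration breaks down: covering a parameter space with even ten grid points per dimension requires ten raised to the number of parameters evaluations. The function name and counts below are illustrative assumptions only.

```python
# Illustrative sketch (assumed numbers, not from the article): the cost of
# exhaustively exploring a d-dimensional parameter space grows exponentially in d.

def naive_grid_evaluations(num_parameters: int, points_per_dimension: int = 10) -> int:
    """Evaluations needed to cover the space with a regular grid: k**d."""
    return points_per_dimension ** num_parameters

for d in (2, 10, 100):
    print(f"{d:>4} parameters -> {naive_grid_evaluations(d):.2e} evaluations")
# 2 parameters   -> 1.00e+02 evaluations
# 10 parameters  -> 1.00e+10 evaluations
# 100 parameters -> 1.00e+100 evaluations, far beyond any supercomputer,
# which is what the 'curse of dimensionality' means in practice.
```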

For instance, even on the high-performance computers at the Texas Advanced Computing Center (TACC), which are among the fastest in the world, a single simulation, also known as a sample, of the Earth’s makeup can take minutes or hours.

“If a problem needs 1,000 samples, we don’t have the time,” Bui-Thanh said. “But it may not be a thousand samples we need. It can require a million samples to obtain reliable uncertainty quantification estimations.”
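
A back-of-the-envelope calculation shows why. Assuming, purely for illustration, ten minutes per simulation (within the “minutes or hours” range described above), the total cost is simply the number of samples times the time per sample:

```python
# Back-of-the-envelope sketch; the 10-minute figure is an assumption within the
# "minutes or hours" per simulation mentioned above, not a measured number.

def total_compute_hours(num_samples: int, minutes_per_sample: float) -> float:
    """Wall-clock hours if every sample requires one full simulation."""
    return num_samples * minutes_per_sample / 60.0

for samples in (1_000, 1_000_000):
    hours = total_compute_hours(samples, minutes_per_sample=10)
    print(f"{samples:>9,} samples -> {hours:>12,.0f} hours ({hours / 24:,.1f} days)")
# 1,000 samples     ->        167 hours (about a week)
# 1,000,000 samples ->    166,667 hours (roughly 19 years of sequential runs)
```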

For that reason, even with supercomputers getting faster every year, traditional methods can only get researchers so far. Bui-Thanh will augment traditional inverse methods with machine learning to make these problems tractable. In the case of seismic wave propagation, he plans a multidisciplinary approach that uses machine learning for fast approximations over the typically large regions of lesser importance, while focusing high-resolution simulations on the typically small parts of the problem deemed most critical, as sketched below.
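
The sketch below illustrates that division of labor in the simplest possible terms. It is a conceptual toy, not Bui-Thanh’s actual algorithm: the solver, surrogate, and confidence threshold are all hypothetical stand-ins.

```python
# Conceptual toy (hypothetical names and logic, not the published method):
# a cheap machine-learning surrogate handles most evaluations, and the expensive
# high-resolution solver is reserved for samples flagged as critical.
import random

def high_fidelity_solve(params):
    """Stand-in for an expensive seismic simulation (minutes to hours each)."""
    return sum(p * p for p in params)

def surrogate_predict(params):
    """Stand-in for a fast learned approximation of the same map.
    Returns a prediction plus a rough confidence score in [0, 1]."""
    prediction = sum(p * p for p in params) * (1.0 + random.uniform(-0.05, 0.05))
    confidence = random.random()
    return prediction, confidence

def evaluate(params, confidence_threshold=0.2):
    """Trust the surrogate when it is confident; otherwise pay for the full solve."""
    prediction, confidence = surrogate_predict(params)
    if confidence < confidence_threshold:  # deemed critical or unreliable
        return high_fidelity_solve(params), "high-fidelity"
    return prediction, "surrogate"

if __name__ == "__main__":
    for _ in range(10):
        sample = [random.gauss(0.0, 1.0) for _ in range(3)]
        value, source = evaluate(sample)
        print(f"{source:>13}: {value:.3f}")
```

In such a scheme most samples cost almost nothing, while the expensive solves are concentrated where they matter most, which is what makes the overall workload tractable.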

“We will develop new mathematical algorithms and rigorously justify that they can be accurate and effective,” he said. “We’ll do this in the context of big data and will apply it to new problems.”

In 2017-2018, Bui-Thanh and colleagues at UT Austin and other universities published preliminary results from their work on TACC systems in Inverse Problems, the Journal of Computational Physics, the SIAM Journal on Scientific Computing, and Water Resources Research. The papers applied new scalable methods to various inverse modeling problems to mitigate the curse of dimensionality.

Using the Stampede1 supercomputer at TACC, they effectively used up to 16,384 computing cores and solved large, complex problems with close to linear, rather than exponential, scaling. Bui-Thanh will expand on this research, continuing to take advantage of TACC’s large computing resources.

“I have been very fortunate to have direct and instant support from TACC, which has provided me with computing hours and timely software troubleshooting,” said Bui-Thanh. “These have enabled my group to produce various preliminary results published in many papers, which in turn have helped establish the credibility of the research proposed in my NSF CAREER award.”

The CAREER award complements four other grants that Bui-Thanh received in 2018 from NSF, King Abdullah University of Science and Technology, the UT System, and the UT Austin Portugal Program, which together total $1.2 million. It also complements 2017 grants from the Department of Energy’s Fusion Energy Sciences and Advanced Scientific Computing Research programs, the Defense Threat Reduction Agency, and ExxonMobil that apply his inverse modeling methods to a range of critical problems.

“Since my proposed mathematical algorithms are designed for current and future large-scale computing systems, TACC will play an important role in the success of my research work,” Bui-Thanh said.


Source: Aaron Dubrow, TACC