In a recent blog post, Cray Senior Practice Leader Geert Wenes argues that “a perfect storm in seismic processing requirements” is ensuring that the oil and gas industry will be an early adopter of exascale computing technologies.
The industrial sector is not far behind the DOE when it comes to interest in exascale, contends Wenes, especially the integrated oil and gas (O&G) companies, or IOCs, that have a substantial presence in the Gulf of Mexico.
“The business case for exascale in O&G is extremely compelling, and — as anyone who has read Daniel Yergin’s ‘The Prize’ will appreciate — goes to the very core of why IOCs exist,” writes Wenes. “In the search for oil and gas in the Gulf of Mexico — one of the richest hydrocarbon basins in the world that continues to reinvent itself for exploration plays — the biggest prizes lie in ultra-deep water.”
The center of this zone lies about 300 miles southwest of New Orleans and nearly 30,000 feet down. Here, formations from the Paleogene period, also known as the Lower Tertiary, hold high-temperature, high-pressure reservoirs often buried under thick salt sheets.
New technology has been essential in unlocking the vast stores of oil held in Lower Tertiary reservoirs. Successful exploration hinges on the acquisition of accurate seismic data. Seismic imaging tools make it possible for explorers to see through layers of salt deposits that had previously limited geological mapping.
Wenes explains that modern surveys can encompass thousands of square kilometers of ocean surface and take months to complete. A job of this magnitude would likely require three vessels producing sound waves and five support vessels, each towing multiple streamers, kilometers long and carrying thousands of listening devices. Once acquired, the petabytes of raw seismic data must be processed into subsurface maps so that geophysicists and geologists can decide whether it is worthwhile to drill an exploration well. The stakes are substantial: drilling a dry hole is a $100 million venture.
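A rough back-of-envelope calculation shows how a survey of that magnitude reaches petabyte scale. In the sketch below, every parameter (shot count, channel count, record length, sampling rate) is an illustrative assumption rather than a figure from the article:

```python
# Back-of-envelope estimate of raw data volume for a large marine survey.
# All parameters are illustrative assumptions, not figures from the article.

shots = 200_000          # source activations over a months-long survey (assumed)
channels = 40_000        # hydrophones recording per shot (assumed)
record_s = 12            # seconds recorded per shot (assumed)
sample_hz = 500          # 2 ms sampling interval (assumed)
bytes_per_sample = 4     # 32-bit float amplitude

raw_bytes = shots * channels * record_s * sample_hz * bytes_per_sample
print(f"raw field data: {raw_bytes / 1e12:.0f} TB")  # ~192 TB
```

Under these assumptions the raw field data alone runs to hundreds of terabytes; the sorted gathers, intermediate volumes, and migrated images generated during processing multiply that footprint into the petabyte range.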
Consider the technical challenges. With rocks this deep, sound wave reflections are very weak. The formations are also structurally complex, so it is hard to tell where those weak reflections are coming from, yet fidelity and resolution in imaging fault lines and traps are critical. This is where novel processing schemes, together with leadership-class supercomputers, will provide a huge competitive advantage.
So where does exascale computing come in? At the 2011 Rice HPC Forum, a group of industry representatives presented a chart showing where the industry’s processing demands are headed and how closely they track the HPC roadmap.
“It was immediately recognized that seismic processing is really (exa)scale computing, not just (exa)flops — that is, it’s a merger of big data and big compute,” says Wenes. “Second is the inflection point in the processing demand and requirements that occurred in the early 2000s when the O&G industry moved to medium model-complexity in immediate response to the HPC industry breaking the petascale barrier; the latter mostly enabled by many-core co-processors and accelerators.” He adds that the industry was also quick to embrace reverse time migration and full waveform inversion.
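For a feel for what those methods compute, reverse time migration repeatedly solves the acoustic wave equation on a dense grid, propagating wavefields forward from the source and backward from the receivers, then correlates the two to form an image. The sketch below shows a minimal 2D version of that propagation kernel; the grid size, velocity model, and stepping parameters are all illustrative assumptions, and production codes add absorbing boundaries, realistic source wavelets, and the imaging condition itself:

```python
import numpy as np

# Minimal 2D acoustic finite-difference time stepping, the propagation
# kernel at the heart of reverse time migration (RTM). Toy parameters,
# chosen only to illustrate the compute pattern.

nx, nz, nt = 400, 400, 1000     # grid points and time steps (assumed)
dx, dt = 10.0, 0.001            # 10 m spacing, 1 ms time step (assumed)
v = np.full((nz, nx), 2500.0)   # uniform 2,500 m/s velocity model (assumed)

prev = np.zeros((nz, nx))       # wavefield at t - dt
curr = np.zeros((nz, nx))       # wavefield at t
curr[nz // 2, nx // 2] = 1.0    # impulsive point source at the grid center

c2 = (v * dt / dx) ** 2         # Courant number squared (0.0625, stable)
for _ in range(nt):
    # 5-point Laplacian on the interior of the grid
    lap = np.zeros_like(curr)
    lap[1:-1, 1:-1] = (curr[2:, 1:-1] + curr[:-2, 1:-1] +
                       curr[1:-1, 2:] + curr[1:-1, :-2] -
                       4.0 * curr[1:-1, 1:-1])
    # second-order-in-time update of the wave equation
    prev, curr = curr, 2.0 * curr - prev + c2 * lap
```

A production RTM run repeats propagation like this over 3D grids with billions of cells for tens of thousands of shots, and full waveform inversion wraps it all in an iterative optimization loop, which is exactly the merger of big data and big compute that Wenes describes.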
These arguments build to Wenes’s prediction that “IOCs may surprise us with their rapid adoption and deployment of exascale computing as it becomes available.”