Oil & Gas
The Office of Fossil Energy's National Energy Technology Laboratory is the proud owner of a brand new SGI supercomputer. Named High-Performance Computer for Energy and the Environment, or HPCEE for short, the 500-teraflop machine will help NETL scientists undertake a broad range of energy and environmental research, with a focus on coal, natural gas and oil.
British multinational BP revealed it is building a new datacenter in Houston to house a 2-petaflop supercomputer. When installed in 2013, it will likely be the most powerful system deployed by a commercial entity, at least of the ones that have been publicly revealed. The upcoming petaflopper will support the company's oil and gas exploration efforts and other research objectives.
Successful oil and gas exploration today requires ever-faster upstream processing. To shorten the compute time needed to get actionable information, organizations need to reduce survey processing run times from months to weeks and be capable of scaling to handle the explosive data growth.
NVIDIA has introduced its first Kepler-generation GPU product for high performance computing, and revealed some of the inner workings of the new architecture. The announcement took place at the kickoff of the company's GPU Technology Conference, taking place this week in San Jose, California.
The short-list of HPC cloud providers just got a little longer. Infrastructure-as-a-Service provider SoftLayer has added high-end NVIDIA Tesla GPUs to its line of dedicated servers.
Supercomputer-maker Cray is helping oil and gas companies benefit from the most advanced reservoir modeling approach yet. Called Permanent Reservoir Monitoring, or PRM, the technique requires innovative data warehousing technology and data analysis techniques.
Scientists at GE Global Research are using the multi-petaflop Titan supercomputer at Oak Ridge National Laboratory to study the way that ice forms as water droplets come in contact with cold surfaces. They are working to develop “icephobic” materials that prevent ice formation and accumulation.
GE’s Steve Pavlosky explains how high-performance computing technology helps users in the power industry, where changes happen very fast. Whether it’s changes in demand or a power failure, the control system has to be able to react very quickly.
Anybody who drives one of Ford’s recent vehicles spends a little less money on gasoline thanks to HPC work the carmaker undertook with Oak Ridge National Laboratory, where more than one million processor hours were spent getting a handle on the complex fluid dynamics governing airflow under the hood.
With that in mind, Datapipe hopes to establish itself as a green-savvy HPC cloud provider with its recently announced Stratosphere platform. Datapipe markets Stratosphere as a green HPC cloud service, partnering with Verne Global, whose Icelandic datacenter is known for its green computing credentials.
Off the Wire
Sept. 9 — High-level, directive-based programming models have been rapidly gaining traction as a portable, productive means to develop application code for multicore platforms and accelerators. Due to their usability and portability, programming APIs such as OpenMP and OpenACC are increasingly being adopted as an alternative to lower-level APIs such as CUDA and OpenCL. This Read more…
NUREMBERG, Germany, Sept. 3 – As new oil and gas reserves become more elusive, companies like Total rely increasingly on high performance computing (HPC) to find opportunities in an ocean of seismic data. With SUSE Linux Enterprise Server as the operating system for its new SGI supercomputer, Total now has an optimal combination of performance, price and efficiency. The solution – Read more…
Aug. 22 — The University of Houston (Dr. Barbara Chapman), in collaboration with the other members of OpenACC, will be hosting an Oil and Gas (O&G) Workshop. We would like to reach out to O&G domain scientists and researchers who are keen to hear about alternative high-level programming models used to port seismic codes to accelerators. High-level Read more…
SANTA CLARA, Calif., Aug. 19 — DataDirect Networks (DDN) today announced impressive revenue traction for the first six months of 2014, demonstrating a strong trajectory of profitable growth for the company heading into the second half of the year. The company added more than 20 net new petabyte-class customers including one of the world's largest banks, a leader Read more…
Aug. 5 — As global demand for hydrocarbon-based fuel continues to grow, particularly in Asia, and as Western economies recover, there is more pressure on oil companies to invest in new reserves, sweat the assets of mature fields and even revisit what were once considered to be exhausted wells. Exploration techniques and technology advancement have Read more…