Introduction
Successful oil and gas exploration today requires ever-faster upstream processing. To shorten the compute time needed to get actionable information, organizations need to reduce survey processing run times from months to weeks and be capable of scaling to handle the explosive data growth.
With growing competition to open new fields and get more out of existing wells, getting answers faster gives organizations an advantage in overall costs, time to market, competitive bidding, and time-sensitive projects. Removing IT infrastructure obstacles that slow upstream processing improves an organization's chance of success.
The biggest hurdle to time-to-oil is the massive increase in the amount of data used and generated in support of a single project.
What’s needed to be most productive (i.e., run the most jobs in a given time and make decisions faster) is a storage solution that is highly scalable and that can handle the mixed workloads – large sequential bandwidth and small random I/O together – that are increasingly important in upstream projects. Such a solution would accelerate data access and time-to-results by supporting high-speed ingest to the broad range of custom and commercial applications used in processing and modeling.
Evolving requirements
Over the past decade, the geographical size of an average study has increased tenfold, and advances in study techniques, new sensors, and the transition to 4D have raised the average study dataset size to gigabytes or even terabytes. In fact, it is not unusual for a completed project to end up in the hundreds of terabytes range.
Additionally, some companies have sought to increase production from existing oil wells using innovative techniques. For example, this year BP announced a method whereby salt is removed from sea water before it is pumped into an oil field. Compared to older techniques, the company expects this desalination step, added to the traditional “waterflooding” technique, to allow it to extract an extra 42 million barrels from its Clair Ridge oilfield west of Shetland, off the Scottish coast.
The constant development and implementation of new extraction procedures means organizations will need to reexamine raw seismic and probe data, re-running analyses and simulations. That means data will need to be cost-effectively stored for long periods, located when new analysis is needed, and placed on high-performance storage to ensure upstream processing is not slowed when this data is re-run.
Taking all of these factors into account will help define the required characteristics of a scale-out storage solution that speeds upstream processing.
The storage solution must be high performance. The ability to handle large sequential I/O is no longer enough on its own. With so much data in every phase of every project, effective storage solutions need to handle small random I/O with equal grace. In this way, massive amounts of data and metadata can be moved and processed efficiently.
To be effective in upstream, a solution must offer massive scalability. New seismic processing techniques produce hundreds of terabytes of data per project. Across multiple projects, this regularly develops into a need to store, access, and manage multiple such datasets simultaneously. As such, a storage solution must be able to consolidate hundreds of terabytes to petabytes of data onto a single platform.
Given that the goal is to speed upstream processing, storage-related downtime must be avoided. A solution must offer a full set of high availability features such as redundant components and paths, multiple RAID levels, and failover across multiple nodes.
A solution must also provide organizations with the flexibility to store data for longer times on appropriate cost/performance devices, while offering data management tools to migrate and protect that data.
DDN as your technology partner
Traditional storage solutions can introduce major performance and management problems when scaled to meet today’s increased requirements for upstream processing in oil and gas exploration. This is why many of the leading exploration companies are partnering with DataDirect Networks (DDN).
DDN offers an array of storage solutions with different I/O and throughput capabilities to meet the cost/performance requirements of any upstream processing effort. The solutions are extremely scalable in capacity and density. Based on its Storage Fusion Architecture, the DDN SFA 12K line offers a number of firsts, including up to 40 GB/s host throughput for both reads and writes, 3.6 PB per rack, and the ability to scale to more than 7.2 PB per system. Furthermore, DDN lets organizations control their cost and performance profile by mixing a variety of media in the same system – SSD, SAS, and SATA – to achieve the appropriate cost/performance mix for their applications.
By consolidating on DDN storage, organizations get fast, scalable storage that solves performance inconsistency issues and provides easy-to-manage, long-term data retention.
Additionally, DDN offers the industry’s leading storage appliances – GRIDScaler and EXAScaler – which integrate leading HPC parallel file systems with DDN’s SFA storage to eliminate performance bottlenecks while simplifying deployment and management.
The bottom line is that DDN offers storage solutions that are ideally suited to the needs of organizations that want to accelerate their upstream processing.
For more information about DDN solutions for oil and gas exploration, visit www.ddn.com.