January 2013
Columns

What’s new in exploration

A picture worth 2×10^15 flops

Nell Lukosavich / World Oil

I remember playing hide-and-seek in the cold, buzzing server room at the steel company where my mom worked in the mid-1980s. To me, the whirling machines and never-ending stretches of paper reports chugging out of the printers seemed like a super-sized playground. Little did I know that 25 years later, the entire 2,500-sq-ft “processing playground” would be condensed into a 4.5-by-3-in. smartphone. One day, my kids will probably laugh at me when I try to explain to them what a server room was, or how the biggest thrill I had as a kid was getting my people and oxen to safety while playing Oregon Trail on my elementary school’s single black-and-green computer monitor.

Advancing data processing at this pace has been no easy feat for computer scientists and engineers. In fact, in the seismic industry, the first “computers” were people calculating acoustic first breaks on paper records. Commercial data processing was first developed in the early 1950s by a UK-based catering company. Early programmers coded data onto punched cards or paper tape, which was very time- and cost-intensive. With advances in technology, and mass production of equipment that reduced costs, punched-card systems evolved into automatic processors, which morphed into standardized digital data storage and computing systems. Microchips, software and hardware continued to advance at lightning speed, allowing for quicker, more accurate processing systems.

In the seismic world, even simple data processing was difficult enough when exploration prospects were shallow, but as exploration moved into deeper, more remote locations, including sub-salt formations, mapping complex subsurface formations became a necessity. And as faster, cheaper and more reliable processing systems became available, new terminology had to be developed to quantify their capacity. Considering that the first commercially marketed and mass-produced personal computer, the 1974 Altair 8800, had a whopping 256 bytes of memory, the idea of a “petaflop” 30 years ago was more likely to elicit images of a mythical creature than of any humanly conceivable quantity.

By the early 1980s, personal computer memory had grown to a million bytes. Apple’s Lisa, introduced in 1983, was among the first personal computers to ship with 1 MB of memory and a 5-MB external hard drive. Megabytes turned into gigabytes, which turned into terabytes. Now, welcome to the era of the petabyte.

BP’s current high-performance computing (HPC) center, in Houston, Texas, was the world’s first commercial research center to achieve a petaflop of processing speed. The upcoming opening of BP’s new HPC center, the largest supercomputing complex for commercial research in the world, sets a new milestone for the data processing industry: 536 terabytes of memory and 23.5 petabytes of disk space. The center is expected to process seismic data at a rate of up to 2 petaflops next year. That’s 2,000 trillion calculations per second.

To put this in perspective: whereas the entire memory in a mid-1980s server room would now fit on an iPhone, this new supercomputer’s disk space equals that of 147,000 160-GB iPods.
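For readers who like to check the math, here is a rough back-of-envelope sketch in Python, assuming decimal units (1 PB = 1,000 TB and 1 TB = 1,000 GB) and the figures reported above:

    # Rough back-of-envelope check of the iPod comparison (assumes decimal units).
    disk_pb = 23.5     # reported disk space, petabytes
    memory_tb = 536    # reported memory, terabytes
    ipod_gb = 160      # capacity of one 160-GB iPod

    disk_ipods = disk_pb * 1_000_000 / ipod_gb    # ~146,875, i.e. roughly 147,000
    memory_ipods = memory_tb * 1_000 / ipod_gb    # ~3,350

    print(f"Disk: ~{disk_ipods:,.0f} iPods; memory: ~{memory_ipods:,.0f} iPods")

The 147,000 figure, in other words, corresponds to the center’s disk space; the memory alone would still fill several thousand iPods.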

Just 40 years ago, geophysicists used to draw horizons and faults on paper seismic sections. Now supercomputing resources can help geoscientists integrate seismic and geological data for drilling precisely into reservoir sweet spots. Images courtesy of Gulf Publishing and Paradigm.

All of these computing resources have value in terms of discovering major fields and improving recovery. BP has used its seismic capabilities to discover major fields, such as Jack and Tiber in the deepwater Gulf of Mexico. Other operators, such as Statoil, are looking to extend the application of seismic technology to increase recovery in older oil fields. Statoil is deploying its Permanent Reservoir Monitoring (PRM) technology, which utilizes 700 km of seismic cables placed in trenches on the seabed. The clearer, more frequent 4D seismic imaging delivered by PRM will allow geoscientists to identify bypassed sweet spots and recover as much as 30 MMbbl from its Snorre and Grane fields.

So, what’s up next for computing speed? With the prefix peta- representing a calculation rate of 10^15/second, the next quantifiers are exa- (10^18), zetta- (10^21) and yotta- (10^24). We will know what lies beyond “yotta-” in the not-too-distant future.

As I play the Oregon Trail game app on my Android, the all-too-familiar message pops up: “The river is too deep to ford. You lose: six wagon wheels and four oxen.” Only six oxen left, I think. This game is a lot harder than I remember it being. At least by now, I know better than to lament to my mom about my less-than-stellar attempts at braving the Midwest on my smartphone. “I did my entire Master’s thesis on punch cards,” she’d remind me. “So you get no sympathy out of me.”
