Subsurface trend analysis in the Gulf of Mexico
Since 1859, over 6 million wells have been drilled worldwide to depths ranging from a few hundred meters to more than 13,000 m. The availability and quality of wellbore data sets vary, yet these resources are used broadly for subsurface interpretation and analyses.
The highest concentration of wellbore data sets is associated with oil and gas drilling in sedimentary basins; however, many non-hydrocarbon regions also have drilling records. These data have proven value for constraining the likely distribution of subsurface properties; however, they contain a high degree of spatial and temporal variability that can misinform predictions where data are sparse and where geologic context is not considered.
As human exploration of the subsurface increases, there is a need for better data- and knowledge-driven methods to improve prediction of subsurface properties. Even regions with concentrated subsurface exploration still face uncertainties that can impede safe and efficient operations. Informing subsurface predictions with geologic systems knowledge, in combination with available subsurface data, such as well records, is useful for predicting subsurface properties and constraining architecture using deductive and probabilistic methods, including for areas with little or no wellbore data.
BACKGROUND
Improved subsurface characterization, based on geological analysis for resource and geohazard prediction and for real-time drilling risk reduction, can increase the safety and efficiency of offshore operations and reduce hazards and cost, including for enhanced oil recovery (EOR). Geologic interpretations of the subsurface have long been driven by observation-based principles and extrapolation of surface geologic data. Observation-based theories enable prediction of lithologic, structural, and secondary alteration patterns in the subsurface.
Geologic process models and theories include the petroleum system method, which describes processes that contribute to formation and concentration of hydrocarbons in the subsurface. Sequence stratigraphy is another example, one that describes sedimentary depositional processes and patterns, based on the relationship between sediment sources, time, sea level, and accommodation space.
A third example of a model is plate tectonic theory that describes the large-scale motion of lithospheric plates on the earth’s outer surface and provides an interpretive framework for smaller-scale structural, tectonic, and volcanic processes worldwide. These theories were developed from observation and describe geologic systems that can, in turn, aid in analyzing geologic data. Collectively, they offer broad insights into specific geologic systems, which constrain properties and prediction of subsurface architecture.
Geostatistical approaches aim to provide spatially and/or temporally constrained estimates of uni-, bi-, and/or multivariate subsurface properties. In contrast to conventional statistical approaches, spatiotemporal statistical methods are based on the assumption that properties of most earth systems are not randomly distributed. These approaches are based on the principle of autocorrelation (the measure of similarity between observations, including similarity across space and time—as, for example, the measure of temperatures across days of the week). The presence of spatial or temporal autocorrelation (either positive or negative) in a data set allows for the prediction of values at unknown locations or times, based on the knowledge of values at known locations or times.
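The day-of-week temperature example can be made concrete with a minimal sketch. The daily values below are hypothetical, chosen only to illustrate the calculation: lag-1 temporal autocorrelation compares each day's temperature with the next day's, and a value near +1 indicates strong positive autocorrelation.

```python
# Hypothetical daily temperatures (deg C) for one week -- illustration only.
temps = [18.0, 19.5, 21.0, 20.5, 22.0, 23.5, 22.5]

def lag1_autocorrelation(series):
    """Correlation between series[t] and series[t+1], normalized by total variance."""
    n = len(series)
    mean = sum(series) / n
    # Covariance between the series and itself shifted by one step
    num = sum((series[t] - mean) * (series[t + 1] - mean) for t in range(n - 1))
    # Total sum of squared deviations
    den = sum((x - mean) ** 2 for x in series)
    return num / den

r = lag1_autocorrelation(temps)
print(f"lag-1 autocorrelation: {r:.2f}")  # positive: warm days tend to follow warm days
```

A positive result here reflects exactly the property geostatistics exploits: knowing today's value tells us something about tomorrow's.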
For geologic systems, time and space are important variables. In the context of subsurface analysis, autocorrelation often is present in geologic properties, such as the orientation or concentration of structural features (e.g., faults and fractures) and trends in physical properties (e.g., pressure and temperature). Thus, this analytical approach differs from traditional statistical techniques that assume independence among observations. A geostatistical approach uses and preserves the spatial and/or temporal context of the data to provide contextualized analytical results.
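As a sketch of how positive spatial autocorrelation supports prediction at unmeasured locations, the following uses inverse-distance weighting, one of the simplest spatial estimators. The well coordinates and temperatures are hypothetical, and this is a generic estimator, not the STA method itself.

```python
import math

# Hypothetical (x_km, y_km, temperature_degC) measurements at known wellbores.
wells = [(0.0, 0.0, 80.0), (4.0, 0.0, 90.0), (0.0, 3.0, 85.0)]

def idw_estimate(x, y, samples, power=2.0):
    """Estimate a property at (x, y) by weighting nearby samples more than distant ones."""
    num = den = 0.0
    for sx, sy, value in samples:
        d = math.hypot(x - sx, y - sy)
        if d == 0.0:
            return value  # exact hit on a sampled location
        w = 1.0 / d ** power
        num += w * value
        den += w
    return num / den

# Predict temperature at an undrilled location between the wells.
est = idw_estimate(1.0, 1.0, wells)
```

The estimate always falls within the range of the observed values and is pulled toward the nearest well, a direct consequence of assuming positive spatial autocorrelation.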
Spatiotemporal statistics seek to capture and use spatial dependency to improve the certainty of the analysis and to deduce the degree of spatial heterogeneity. Thus, within the field of spatiotemporal statistics, a crucial first step in any analysis is to characterize autocorrelative relationships.
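That characterization step is commonly done with an empirical semivariogram, which measures how dissimilarity between sample pairs grows with separation distance. The transect positions and porosity values below are hypothetical, chosen so the result shows the typical pattern of rising semivariance with lag.

```python
from collections import defaultdict

# Hypothetical (position_km, porosity_fraction) samples along a 1-D transect.
samples = [(0, 0.20), (1, 0.21), (2, 0.19), (3, 0.24), (4, 0.26), (5, 0.27)]

def empirical_semivariogram(samples):
    """Mean half-squared difference of all sample pairs, grouped by lag distance."""
    sums, counts = defaultdict(float), defaultdict(int)
    for i in range(len(samples)):
        for j in range(i + 1, len(samples)):
            h = abs(samples[j][0] - samples[i][0])          # lag distance
            sums[h] += 0.5 * (samples[j][1] - samples[i][1]) ** 2
            counts[h] += 1
    return {h: sums[h] / counts[h] for h in sorted(sums)}

gamma = empirical_semivariogram(samples)
for h, g in gamma.items():
    print(f"lag {h} km: gamma = {g:.5f}")
```

Semivariance rising with lag (here, from short lags to the longest) indicates that nearby samples are more alike than distant ones, i.e., that the property is autocorrelated and worth modeling geostatistically.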
METHODS
Uncertainty about the subsurface can be reduced, even for areas with little or no subsurface measurement, by methodically combining science-driven geologic knowledge with data. NETL developed the Subsurface Trend Analysis (STA) method (Fig. 1), a hybrid spatiotemporal statistical-geologic approach in which deductive geologic methods are integrated with autocorrelative spatiotemporal statistics, to improve understanding of subsurface systems and prediction of their properties. The STA method assumes that the present-day subsurface is not random but is a product of its history, the sum of its systematic geologic processes.


Application of STA targets subsurface properties for which autocorrelation can be established. Subsurface properties include, but are not limited to, lithologic thickness, lithologic composition, porosity, pressure, temperature, permeability, density of natural fractures, and secondary alteration effects, such as mineralization or dissolution. See Table 1 for a list.
The STA method involves four key stages: 1) resource acquisition and identification, including identifying subsurface data sets, gathering information from geologic studies, and testing the data sets for autocorrelative behavior in the subsurface properties of interest; 2) using the geologic systems information to postulate subsurface domains with spatially and temporally consistent geologic histories; 3) testing the robustness of those subsurface domains through statistical validation methods; and 4) if validated, using the finalized domains to drive additional analyses or simulations.
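The four stages can be sketched as a toy pipeline. Every function name, input value, and threshold below is a hypothetical stand-in for illustration; the actual STA tool is NETL software under development, not this code.

```python
def sta_pipeline(properties, well_to_domain, holdout_error, error_threshold=0.1):
    """Toy walk-through of the four STA stages; all inputs are hypothetical."""
    # Stage 1: keep only properties that show autocorrelative behavior
    # (a precomputed score stands in for a variogram or similar analysis).
    usable = [p for p, score in properties.items() if score >= 0.3]

    # Stage 2: postulate subsurface domains from geologic systems knowledge
    # (here, a simple mapping of wells to geologically consistent domains).
    domains = sorted(set(well_to_domain.values()))

    # Stage 3: statistically validate the domains, e.g., against held-out wells.
    validated = holdout_error <= error_threshold

    # Stage 4: if validated, the domains drive further analyses or simulations.
    return {"properties": usable, "domains": domains, "validated": validated}

result = sta_pipeline(
    properties={"porosity": 0.6, "temperature": 0.8, "noise_channel": 0.05},
    well_to_domain={"well_A": "salt_minibasin", "well_B": "abyssal_plain",
                    "well_C": "salt_minibasin"},
    holdout_error=0.07,
)
```

The point of the sketch is the control flow: screening for autocorrelation happens before domains are postulated, and domains reach later analyses only after statistical validation.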
To demonstrate and validate the potential of the STA method, it was applied in an analysis of the northern Gulf of Mexico (GOM). This evaluation was prepared using only existing, publicly available well data and geologic literature. This information was implemented in the STA method to predict subsurface trends for in situ pressure, in situ temperature, porosity, and permeability. Results of the analysis were validated against new reservoir data and contrasted with previous studies; both comparisons indicated that the STA predictions were an improvement.
Overall, STA results can provide critical information to evaluate and reduce risks, improve characterization in areas of scarce or discontinuous data, and provide inputs for multi-scale modeling efforts, from reservoir scale to basin scale. In this way, the STA method offers an ideal framework for guiding future science-based machine learning and natural language processing to optimize subsurface analyses and predictions.
OVERVIEW
The STA method (Rose et al., 2020) and corresponding software tool (in development) are members of NETL's Offshore Risk Modeling (ORM) suite. ORM is a multi-component suite, developed initially from 2011 to 2016, that simulates and predicts the behavior of offshore engineered-natural systems, incorporating lessons learned from previous deleterious events and spanning natural and anthropogenic offshore hydrocarbon activities.
Along with the Energy Data eXchange (EDX), NETL's data curation and collaboration website, ORM provides data, tools, and technologies to assist with evaluating potential risks and identifying possible technology gaps, using science-based, data-driven assessments. The modules comprising ORM support analysis from the subsurface to the coastline, to address impacts of energy activities, such as oil spills. The STA method was developed to address the need to characterize the subsurface and to reduce risks associated with subsurface heterogeneity and uncertainty.
Much of the initial effort on the STA method was focused on developing a framework and validating the methodology, as outlined in the previous section. In its current research phase, the STA is being converted to a software tool that acts as a virtual assistant, guiding users through the components of the STA. To further enhance the method, NETL researchers are integrating machine learning and artificial intelligence (ML/AI) capabilities, to accelerate data collection and deepen insights into statistical trends of the subsurface environment. These enhancements also will incorporate additional data sets and extend the STA methodology to the assessment of geologic properties and subsurface uncertainty in new offshore areas at multiple scales.
APPROACH
In current STA development and research, the approach is two-fold. Development of a 3D, real-time software smart tool, based on the STA framework, is underway. As the tool matures, it will be tested and validated, utilizing logging-while-drilling (LWD) and seismic-while-drilling (SWD) data sets and analyses of structural complexity in the GOM.
This process will identify potential subsurface hazards and innovate new, advanced data computing methods to improve prediction of subsurface properties that inform resource, environmental and operational needs. This research will use data and models from the ORM with intelligent databases, ML/AI, big data, and other advanced computing technologies to address subsurface industry challenges, such as characterization and mapping of geologic hazards, safe operations, equipment reliability, and cost optimization.
The STA tool for 2D and 3D analytics will be released in the next year. The STA tool will be copyrighted and available for licensing and use through open-source and/or commercial agreements. The analyses and results will be available via Geocube, hosted on EDX.
EXPECTED OUTCOME
Several goals for this R&D effort have been set and are expected to be achieved:
- Development of a science-based, data-driven machine learning/natural language processing (ML/NLP)-STA tool that can be used at various scales, from basin to wellbore, for subsurface exploration and real-time geohazard monitoring of sedimentary systems.
- Enhancement of the STA model into an ML-enhanced tool that will increase efficiency and safety, optimize drilling operations to reduce costs, and improve resource predictions, thereby optimizing access to domestic oil and gas resources, through:
  - Present and future 2D work, including:
    - Additional integration of ML/NLP
    - Suggestions of relevant literature (NLP)
    - Identification and suggestion of domains to the researcher (supervised ML)
    - Ability to ingest real-time data
    - Prediction of reservoir properties during in-field operations
  - Extension into 3D data analysis:
    - The structures being analyzed are 3D in nature, which influences characteristics critical to resource operations
    - More detailed data, yielding better predictions
    - Use of a fuzzy logic tool in combination
    - Custom 3D visualizations to gain perspective on data and subsurface predictions
    - Integration of geologic systems knowledge and real-time data (e.g., LWD) to improve instantaneous predictions
REFERENCES
- Myers et al., 1982
- Beucher et al., 1993
- Almeida and Frykman, 1994
- Chambers et al., 1994
- Ravenne et al., 2002
- Osborn et al., 2011
- Morris et al., 2015
- Rose, 2016
- Magoon and Valin, 1994
- Vail et al., 1977
- Rose et al., 2020