An Integrated Rock Catalog for E&P Geotechnologists




By Graham Cain, Petris Technology; Janusz Buczak, Petris Technology; and Joe Pumphrey, Logicom E&P
Presented at the Petroleum Network Education Conference (PNEC), 2012

Abstract

Core data is one of the most important types of data for an oil company. The acquisition of cores is tedious and very expensive, yet core data is traditionally significantly under-exploited. Moreover, since different disciplines rely on different types of experimental data, no one group seems to own core data or really keep it organized. This paper describes ongoing activity associated with the deployment of an Integrated Rock Catalog for E&P Geotechnologists. The Catalog leverages PetrisWINDS Recall, an existing industry-standard borehole data management system with associated interpretation and modeling applications, and Q-SCAL, an existing conventional and special core data analysis and interpretation system. The Rock Catalog facilitates the storage of both routine and special core data, and includes tools to interpret this data. Data integration is also enhanced, since raw and interpreted core data are stored alongside, and can be displayed together with, all other types of borehole data such as logs, zonations and geomechanical information. Through a new, advanced Search Engine, the Catalog provides innovative and versatile study possibilities, in particular as relates to searches for analogues and for risk reduction in reservoir characterization and modeling.

Introduction

For many oil companies, the ability to exploit an integrated global Rock Catalog that would allow their geotechnologists to characterize Rock Types with respect to all available rock parameters and tests, and then to search this Rock Catalog intelligently and on demand, represents a form of "Holy Grail". Such a capability would have very significant business value for any oil company. This paper describes ongoing activity associated with the deployment of "An Integrated Rock Catalog for E&P Geotechnologists".
This Rock Catalog leverages PetrisWINDS Recall, Petris' existing industry-standard borehole data management system with associated interpretation and modeling applications, and Q-SCAL, Logicom's existing conventional and special core data analysis and interpretation system. This paper illustrates that the realization and utilization of a global Integrated Rock Catalog is now within reach for any oil company. We describe how this Rock Catalog "Platform" was established, and explain how the necessary additional workflows and tools were defined and then deployed on top of the existing PetrisWINDS Recall borehole data management and application suite. We also show results from analogue search examples within a Rock Catalog that we believe are truly innovative, and that clearly demonstrate the added value of managing core data alongside, and in conjunction with, all other data within an integrated borehole data repository.

Background

Core data is one of the most important types of data for an oil company. The acquisition of cores is tedious and very expensive, and laboratory experiments (core analysis) on core samples are often very time consuming. Nevertheless, core measurements are irreplaceable and of primary importance for any form of subsurface evaluation, and are inherently multidisciplinary.

This fact is well illustrated by the following examples of how different disciplines use core data:

- Geology: core descriptions, facies analysis, mineral identification, depositional information, formation age, x-ray analysis, etc.
- Petrophysics: calibrating rock properties (such as porosity, permeability and grain density), identifying rock mechanical parameters, obtaining saturation from capillary pressure measurements, studying electrical and acoustic properties, etc.
- Reservoir Engineering: defining relative permeability parameters, capillary pressure curves, pore volume compressibility, critical gas saturation, etc.

Reservoirs are typically characterized by the pore types present (pores with varying shapes and sizes based upon their origin), since variations in pore types result in different porosity-permeability relationships. In order to derive relationships between porosity and permeability, pore types are grouped into Rock Types, where a Rock Type is often defined as an interval of rock with a unique pore geometry. Performing Rock Type classifications requires data derived from the analysis of core samples. This analysis is normally conducted in a laboratory using a well-defined Special Core Analysis workflow, and the derived data is typically referred to as SCAL measurement data (or just SCAL data). Special core analysis typically provides, for example, porosity, fluid saturations, permeability and relative permeability, capillary pressure, pore throat and grain size distributions, grain density and mineral composition, electrical properties, and hydrocarbon analysis. It is worth noting here, from a data management standpoint, that there is no widely used industry-standard format for the delivery of core data and laboratory results other than Excel; and Excel does not provide a standard. This means that there is no core data equivalent of the LAS, LIS and DLIS formats used systematically for wireline data.
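The paper does not prescribe a particular Rock Typing method; one widely used porosity-permeability approach is the Flow Zone Indicator (FZI). The sketch below uses entirely hypothetical core-plug values and illustrative bin edges:

```python
import numpy as np

def flow_zone_indicator(porosity, permeability_md):
    """FZI (microns) from fractional porosity and permeability in millidarcies.

    RQI = 0.0314 * sqrt(k / phi); phi_z = phi / (1 - phi); FZI = RQI / phi_z.
    Samples with similar FZI share a pore geometry, i.e. a candidate Rock Type.
    """
    phi = np.asarray(porosity, dtype=float)
    k = np.asarray(permeability_md, dtype=float)
    rqi = 0.0314 * np.sqrt(k / phi)   # Reservoir Quality Index, microns
    phi_z = phi / (1.0 - phi)         # normalized (void-ratio style) porosity
    return rqi / phi_z

# Hypothetical core-plug measurements (fractional porosity, permeability in mD)
phi = [0.12, 0.18, 0.25, 0.08]
k = [1.5, 40.0, 900.0, 0.2]
fzi = flow_zone_indicator(phi, k)
# Group samples into Rock Types by binning FZI (bin edges are illustrative)
rock_type = np.digitize(fzi, bins=[0.5, 2.0, 6.0])
```

Each resulting group carries its own porosity-permeability trend, which is the practical payoff of a Rock Type classification.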
Following this data management theme a little further, we also observe that SCAL data is often delivered months or even years after a well is completed (and not necessarily all at once, since certain laboratory procedures take much longer to complete than others). This means that core data is typically never properly quality controlled upon delivery to an oil company, and that it is not systematically captured and stored in a single location. Combined with the multidisciplinary nature of core data usage, this has often meant that core data has found itself "scattered" around an oil company, collected and locally managed by the disciplines that use it most, and then simply "resurrected" for local re-analysis and use when required. In fact, it is often the case that core data is only looked at seriously when a new field or regional study is started. From a data asset management perspective, this is alarming, since there is absolutely no guarantee that all the relevant data will actually be found again at that time. Despite its importance, then, core data continues to be significantly under-managed and largely under-exploited. Also, since different disciplines rely on different types of measurements from core samples, no single oil company group actually seems to own core data or to really keep it all properly organized.

Core Data Management and Exploitation Initiatives

The first solicitations received by Petris for assistance with the intelligent management of all types of core data date back to mid-2007. Since then, Petris has worked progressively with a number of key customers to add a variety of core data types to the overall borehole data footprint managed by PetrisWINDS Recall. By late 2009, the digital core data management footprint of PetrisWINDS Recall included conventional core data, special core analysis (SCAL) data, core images, SWS (Side Wall Sample) and MCT (Mechanical Coring Tool) data, compaction data, and general laboratory reports and other files of interest. Also, the discrete (and sparse) sampling of core data ("discrete curves") as well as the relationships within multi-dimensional SCAL experiments ("structured curves") was fully honored. The result is that core data can now be integrated with, and managed alongside, all other forms of borehole data within PetrisWINDS Recall. It can be manipulated and displayed in Logger's or Driller's depth references as required, and, in particular, can be correlated with other borehole data as well as presented as zonations and zonal parameters. However, the heterogeneity of core data types and file formats necessitated the definition of strict naming conventions and flexible data dictionaries, as well as the development of core data loaders that could handle the multitude of "same but different" data presentations found in historic and even current data files. Although the "we must organize this core data" initiative was quite daunting, the results have been well worth the time and effort invested by the oil companies that have principally participated in, and helped to drive, the initiative. The following two images (Figure 1 and Figure 2) visually illustrate the integration of several types of core data with open-hole wire-line and zonal data within PetrisWINDS Recall.
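An alias-driven loader of the kind described above can be sketched as follows; the dictionary entries and mnemonics here are invented for illustration, not Recall's actual conventions:

```python
import csv
import io

# Hypothetical alias dictionary: many historic spellings of the "same but
# different" column headings map onto one canonical mnemonic
ALIASES = {
    "core por": "CPOR", "core porosity": "CPOR", "porosity (%)": "CPOR",
    "kair": "CKH", "perm (md)": "CKH", "air permeability": "CKH",
    "grain density": "CGD", "rhog": "CGD",
}

def load_core_csv(text):
    """Read a delimited core-analysis file and rename columns via the dictionary.

    Unknown headings fall back to an upper-cased copy rather than being dropped,
    so nothing in a historic file is silently lost.
    """
    rows = list(csv.reader(io.StringIO(text)))
    header = [ALIASES.get(h.strip().lower(), h.strip().upper()) for h in rows[0]]
    return [dict(zip(header, r)) for r in rows[1:]]

sample = "Depth,Core Porosity,Perm (mD)\n2431.5,14.2,55.0\n"
records = load_core_csv(sample)
# records[0] -> {'DEPTH': '2431.5', 'CPOR': '14.2', 'CKH': '55.0'}
```

In practice such a loader would also carry units, depth-reference metadata and data-dictionary validation, but the normalization step shown is the heart of taming heterogeneous deliveries.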

Figure 1: Integration of core and wire-line data in PetrisWINDS Recall

Figure 2: Integration of core and wire-line data in PetrisWINDS Recall

In a similar way, Logicom was also solicited to help customers resolve problems related to the use of core data, and in particular the (re)interpretation of special core analysis measurements. Logicom noted that the multidisciplinary diversity of core data, along with the distributed nature of its storage, often posed problems when the data was (re)assembled and compiled to conduct, for example, a new regional study. Logicom identified that the reinterpretation of individual types of test, and the grouping of different types of test for interpretation together, were extremely difficult activities to pursue, since no single tool was available to bring together multiple types of test from multiple sources for this specific purpose. As a consequence, Logicom worked with customers to develop a tool (Q-SCAL) that can be used to analyze and interpret multiple types of special core analysis data, with a view to ensuring the integrity and value of SCAL data when used for study purposes. The following image (Figure 3) illustrates the conversion of capillary pressure to reservoir conditions (saturation height) following data correction (including adjustment for closure, stress, clay and equilibrium state) and quality control.

Figure 3: Conversion of capillary pressure to reservoir conditions using Q-SCAL

The Integrated Geomechanics Platform Initiative

Geomechanical behavior prediction is critical to resolving problems related to oil and gas exploration, drilling, completions, reservoir management and production. The ongoing operational shift towards more difficult environments (deepwater, high pressure/high temperature, tighter reservoirs, unconventional gas, etc.), the requirement for complex reservoir studies (coupling fluid flow and rock mass deformation), the take-up of shale gas exploitation (requiring prediction of fracture propagation in already discontinuous media), and the need for ambitious drilling plans (shallower and longer horizontal sections, close to faults, etc.) have all reinforced the importance of geomechanical characterization and modeling. Geomechanical characterization requires the detection, measurement and estimation of rock properties and in-situ stresses. The success of the estimation relies upon accurate characterization of geomechanical model inputs such as lithology, stratigraphy, structural geology, geophysical logs, cores, seismic, well tests, drilling parameters and observed well integrity.

To characterize the more elusive of the geomechanical model components, an interpreter needs a versatile and comprehensive data management solution that permits efficient investigation of multiple alternative interpretations and consideration of the best possible analogues.

Whereas such data management solutions are commonplace in other E&P disciplines, in the specific domain of geomechanical characterization, industry change and challenge had significantly outpaced the advancement of the integrated data management, interpretation and modeling technologies necessary for full-field geomechanical studies. Petris and Logicom, together with Eni E&P, were fully aware that the progression of reservoir geomechanics as a front-line discipline had significantly outpaced "normal" technological advancement, and this lack of available technology became a key driver for the Integrated Geomechanics Platform initiative that Eni decided to pursue in collaboration with Petris and Logicom between mid-2008 and late 2010. In pursuing the Geomechanics Platform initiative, Eni was very focused on data integration in terms of data footprint as well as application integration in terms of data access. As such, Eni decided to maximize leverage from its existing PetrisWINDS Recall borehole data management solution and associated applications (provided by Petris), and to integrate with this the SCAL data interpretation tool Q-SCAL (provided by Logicom). Today, the Geomechanics Platform facilitates the storage of raw geomechanical data, and includes the tools necessary to interpret and exploit that data. Raw and interpreted geomechanical data are stored alongside all other borehole data to facilitate data integration, and specially developed modules provide important new possibilities for the integration of all available and relevant data into a single geomechanical study, with an extensive ability to search for analogues. This, together with other specially developed statistical and inversion features, means that the Geomechanics Platform can provide complete support for fully integrated geomechanical characterization workflows, with the ability to facilitate studies from the core scale, to the well scale, and then to full-field scale (see Figure 4).

Figure 4: Schematic illustration of the Integrated Geomechanics Platform with support for complex geomechanical characterization and modeling workflows.

Building the Geomechanics Platform

PetrisWINDS Recall was already able to provide the foundation for an Integrated Geomechanics Platform, and Q-SCAL was already able to analyze and interpret many different forms of SCAL data. So, in order to move forward and deploy the Geomechanics Platform (see Figure 4) with maximum leverage from these existing products, Eni, Petris and Logicom together defined a solution roadmap with the following principal objectives:

- Extend the existing data management facilities in PetrisWINDS Recall to provide full support for raw geomechanical laboratory test data types, and add the capacity to store interpreted results that are specific to each such experiment type;
- Provide a dedicated geomechanical laboratory test data interpretation module by extending the existing Q-SCAL tool, and then integrate the Q-SCAL tool with PetrisWINDS Recall from both a data management and an application perspective;
- Provide a versatile and efficient Search Engine capable of sifting through core, wire-line and other borehole data, and capable of identifying analogues; and
- Provide statistical tools for the characterization of the key parameters used for geomechanical modeling, and facilitate the correlation of target properties with the actually available data.
Data Management Considerations

From a data management perspective, the requirements for the Integrated Geomechanics Platform were essentially as follows:

- Geomechanical experiment data and associated interpretations need to be loaded and stored alongside all other borehole data for easier exploitation;
- The data needs to be organized in a manner that facilitates later exploitation and use;
- Users need to be able to visualize the data on demand in a straightforward manner;
- Users need to be able to conduct complex searches within the entire corporate borehole database to locate data of interest;
- It must be possible to find data that are mutually supporting, with a view to grouping such data for interpretation; and
- It must be possible to support complex workflows.

Data Interpretation Module

To ensure that geomechanical laboratory test data could be consistently quality controlled and eventually re-interpreted in single-sample, multi-sample and even multi-depth and multi-well configurations, a dedicated interpretation module for geomechanical test data was added to Q-SCAL, and Q-SCAL was then fully integrated with PetrisWINDS Recall from both a data management and an application perspective. The special requirements for this interpretation module were that it:

- Store both the raw data from geomechanical laboratory tests on core samples and the interpreted results from these tests (various rock mechanical properties) alongside all other borehole data within the PetrisWINDS Recall data repository, to improve data integration and enhance the potential for later searches; and
- Provide dedicated graphical tools for interpreting the laboratory test data (oedometric and triaxial tests, etc.) under two distinct categories, namely the interpretation of single tests, and the interpretation of multiple tests that might be from several depths or even from several wells.

The following two images (Figure 5 and Figure 6) illustrate the management of the geomechanical experiment data within PetrisWINDS Recall and the interpretation of this data using the dedicated Q-SCAL module.

Figure 5: Data from the consolidation phase of a triaxial test shown as structured curves in the PetrisWINDS Recall data browser, together with an example plot of void ratio versus effective stress.

Figure 6: Example interactive display from the interpretation of a triaxial test using the Q-SCAL module built to interpret geomechanical laboratory tests on core samples.
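As a simplified illustration of the kind of single-test interpretation such a module performs, the sketch below fits Young's modulus and Poisson's ratio over the quasi-linear portion of a synthetic triaxial loading curve; the window fractions and data are hypothetical, not Q-SCAL's actual algorithm:

```python
import numpy as np

def elastic_constants(axial_strain, axial_stress_mpa, radial_strain,
                      lo=0.2, hi=0.6):
    """Fit E (MPa) and Poisson's ratio over the quasi-linear portion of a
    triaxial loading curve, picked here as a fraction of peak axial stress."""
    eps_a = np.asarray(axial_strain, dtype=float)
    sig = np.asarray(axial_stress_mpa, dtype=float)
    eps_r = np.asarray(radial_strain, dtype=float)
    peak = sig.max()
    mask = (sig >= lo * peak) & (sig <= hi * peak)   # quasi-linear window
    E = np.polyfit(eps_a[mask], sig[mask], 1)[0]     # slope = Young's modulus
    nu = -np.polyfit(eps_a[mask], eps_r[mask], 1)[0] # -d(radial)/d(axial)
    return E, nu

# Synthetic linear-elastic sample: E = 20 GPa (20000 MPa), nu = 0.25
eps_a = np.linspace(0.0, 0.004, 50)
sig = 20000.0 * eps_a
eps_r = -0.25 * eps_a
E, nu = elastic_constants(eps_a, sig, eps_r)
```

Real test curves are nonlinear, hysteretic and stress-path dependent, which is precisely why an interactive graphical module (rather than a blind fit) is needed in practice.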

Advanced Search Engine

During discussions with the target User base concerning the requirement specifications for the Integrated Geomechanics Platform, a typical "User extrapolation" phenomenon was observed. Once discussion had started concerning the correction of a known data exploitation deficiency, the Users immediately began to extrapolate beyond this immediate problem (assuming it would be resolved) to issues significantly beyond it. This was, with hindsight, very significant for the development of the Platform. The overwhelming messages that came from the expectant target Users were as follows:

- Storing raw geomechanical data alongside all other borehole data is a good start, but by no means the end of the requirement. The data repository must also contain interpreted information such as elastic and plastic constants.
- A User must be able to analyze data and make useful correlations, such as the porosity dependence of Young's modulus. This must be possible by way of cross-plot displays, and with the further ability to perform linear and/or non-linear regression.
- A User needs to be able to search globally, and not be limited to a specific well or field. For instance, a User needs to be able to quickly find all clastic reservoirs that exist in the global corporate portfolio.
- The Search Engine, at the very minimum, must be able to discriminate by lithology, porosity range, drained versus undrained, plastic versus elastic, depth interval, etc.

The fundamental concept behind the Advanced Search Engine, then, was that it facilitate the integration of geomechanical data with all other forms of borehole data, and help define geomechanical zonations and field-scale geomechanical models. This, in turn, would also provide support for fully integrated workflows, as well as the ability to move easily from the core scale, to the well scale, and then to full-field scale.

As such, it was evident that the Advanced Search Engine would need to:

- Be particularly well adapted to locating core samples using particular property management criteria;
- Be able to consider "depths" and values at depth, and list and report on individual experiments;
- Be able to exploit core-to-depth matching information (driller-to-logger depth-shift curves) as necessary;
- Be able to discriminate using all informational items typically used to characterize laboratory test experiments;
- Be able to discriminate by zone, or on values from wire-line logs or data derived from them;
- Provide an extremely straightforward and intuitive "real language" method for building a Search; and
- Facilitate the construction and archiving of standardized searches.

In order to illustrate the significant potential of the Advanced Search Engine, the following "real language" Search sequence is illustrated in Figures 7 to 10:

Find all Wells in all "Countries" for which we have data... that have "Core Samples"... and that cross a zone called "Permian"... and then... For these Wells, display values of interpreted "Young's Modulus"... for all experiments whose lithology is recorded as "Limestone"... and for which corresponding TVD depths are less than 2674 meters... and for which core porosity exceeds 12 percent... and then... View the distribution of the identified interpreted values and examine the associated statistics... and then... Save this Permian Limestone (PERMLIM) Zone information to the PetrisWINDS Recall data repository by providing a Zone Name (such as LIMP12TV2674) for the result, noting that the zonal information will be associated with each Well from which the core samples were retrieved.

Figure 7: Search results identify all Wells that have core samples and which cross the Permian zone.

Figure 8: Table of interpreted Young's Modulus values for all experiments that match the required search criteria.
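The filtering steps of this search sequence can be mimicked with a simple predicate-based filter; the records and field names below are invented stand-ins for the repository's actual data model:

```python
# Hypothetical in-memory stand-in for the repository's core-sample records
samples = [
    {"well": "W-1", "country": "Italy", "zone": "Permian", "lith": "Limestone",
     "tvd_m": 2500.0, "cpor": 0.15, "youngs_gpa": 31.0},
    {"well": "W-2", "country": "Egypt", "zone": "Permian", "lith": "Limestone",
     "tvd_m": 2800.0, "cpor": 0.14, "youngs_gpa": 28.0},
    {"well": "W-3", "country": "Italy", "zone": "Permian", "lith": "Sandstone",
     "tvd_m": 2100.0, "cpor": 0.20, "youngs_gpa": 18.0},
]

def search(records, **criteria):
    """Keep records matching every criterion; callables act as predicates,
    other values require equality. Mirrors the 'real language' Search steps."""
    def ok(rec):
        return all(f(rec[k]) if callable(f) else rec[k] == f
                   for k, f in criteria.items())
    return [r for r in records if ok(r)]

# Permian limestone, TVD < 2674 m, core porosity > 12 percent
hits = search(samples, zone="Permian", lith="Limestone",
              tvd_m=lambda v: v < 2674.0, cpor=lambda v: v > 0.12)
values = [h["youngs_gpa"] for h in hits]
```

Only W-1 survives all four criteria here; the real engine additionally resolves driller-to-logger depth shifts and spans the entire corporate database rather than an in-memory list.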

Figure 9: Histogram of Young's Modulus values for experiments that match the required search criteria, shown with a table of associated statistics.

Figure 10: Interpreted zonation parameters shown in the PetrisWINDS Recall data browser for the PERMLIM zone called LIMP12TV2674.

Statistical Tools

The Advanced Search Engine is well equipped to handle the discriminators needed for geomechanical unit parameterization. The Statistical Tools extend the geomechanical unit characterization capabilities of the Advanced Search Engine further, by providing statistical analysis methods that operate on the search results themselves, for example, to:

- Display histograms and calculate statistics, fit best normal or log-normal distributions and calculate chi-squared values, etc., and then save characterizing zonal parameters to the PetrisWINDS Recall data repository; and
- Display cross-plots and calculate correlation coefficients and best-fit regression formulae, and then save and reuse these formulae elsewhere.
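A minimal sketch of the histogram-plus-chi-squared step is shown below; the values are invented and the Platform's own implementation is not published, so this is only the textbook computation:

```python
import numpy as np
from math import erf, sqrt

# Hypothetical interpreted Young's modulus values (GPa) returned by a search
E = np.array([28.1, 30.4, 29.2, 31.0, 27.5, 30.1, 28.8, 29.6, 30.9, 28.3])

# Best-fit normal distribution (sample mean and standard deviation)
mu, sigma = E.mean(), E.std(ddof=1)

# Chi-squared statistic comparing binned counts with normal expectations,
# using expectations evaluated only within the sampled range (a simplification)
counts, edges = np.histogram(E, bins=4)
cdf = np.array([0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))
                for x in edges])
expected = len(E) * np.diff(cdf)
chi2 = float(((counts - expected) ** 2 / expected).sum())
```

The fitted mu and sigma (and, if acceptable, the chi-squared value) are exactly the kind of characterizing zonal parameters the text describes saving back to the repository.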

In order to illustrate the significant potential of the Statistical Tools, the following sequence (Figures 11 to 13) continues on from the example provided above for the Advanced Search Engine:

Using the same "Core Samples" identified in the earlier Search example (from multiple Wells in multiple Countries)... view the relationship between core porosity and wire-line density... and then... Fit a curve to describe this relationship... and then save all this information to PetrisWINDS Recall... and then... Use the saved relationship in one of the Wells to predict core porosity from wire-line density... and then compare the actual core porosity with the predicted core porosity.

Figure 11: Correlation curve derived from statistical analysis of a cross-plot of core porosity and wire-line density.

Figure 12: Zonation parameters for the LIMP12TV2674 Zone now include a core porosity formula, as shown in the PetrisWINDS Recall browser.
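The fit-and-predict step amounts to a regression like the following sketch; all numbers are invented and the fit is deliberately the simplest possible (linear), whereas the Platform also supports non-linear regression:

```python
import numpy as np

# Hypothetical paired measurements: wire-line bulk density vs core porosity
rhob = np.array([2.20, 2.31, 2.42, 2.50, 2.58])   # g/cc
cpor = np.array([0.27, 0.21, 0.15, 0.10, 0.05])   # fractional porosity

# Best-fit linear relationship, the "saved formula" of the example
slope, intercept = np.polyfit(rhob, cpor, 1)

def porosity_from_density(rho):
    """Predict core porosity from a wire-line density reading."""
    return slope * rho + intercept

# Apply the saved relationship in a well, then compare against measured cpor
pred = porosity_from_density(2.42)
```

Comparing `pred` against the measured core porosity at the same depth, as in Figure 13, is the quality check that tells the User whether the saved formula transfers to that well.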

Figure 13: Comparison of actual core porosity and core porosity predicted from a density log using a derived formula.

Exploitation of the Geomechanics Platform

Since late 2010, the Geomechanics Platform has been operational at Eni and in use within both the reservoir department and the drilling department, where it also has straightforward application for reservoir engineering and modeling through the use of field-calibrated 1D geomechanical models as input for 3D rock deformation models for use in fluid flow simulators. In fact, the very operational existence of the Geomechanics Platform has encouraged and underpinned several further initiatives. One of these initiatives was the implementation of a complex pre-drill analysis workflow for wellbore stability, where the main objectives were:

- to provide drilling engineers with a complete tool for wellbore stability analysis, in order to carry out routine studies as part of new standardized practices in the development of drilling programs; and
- to provide expert drilling engineers with a complete tool for advanced pre-drill wellbore stability analysis that can cope with any drilling context and situation.

One of the important aspects of the pre-drill workflow initiative is that the Geomechanics Platform provides a truly integrated link between geomechanical characterization and the drilling activities themselves, and this helps make geomechanics genuinely part of the process, not just an input. In fact, the pre-drill workflow strongly leverages operational drilling activity data to help understand the mechanical behavior of the formation, and this constitutes a fundamental change (i.e. an operational improvement) in the way that pre-drill analysis work is conducted. This, together with the well-defined methodology behind the workflow, ensures that the Geomechanics Platform is usable by Users with different skill sets and different levels of expertise.

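The maximum horizontal stress needed by such an analysis is itself constrained pre-drill using frictional faulting theory: slip on optimally oriented faults caps the effective principal stress ratio at ((mu^2 + 1)^(1/2) + mu)^2. The following is a minimal numerical sketch of that bound; the friction coefficient and stress values are assumptions, not calibrated field data.

```python
import math

def frictional_limit(mu):
    """Upper bound on (sigma1 - Pp) / (sigma3 - Pp) sustainable by
    optimally oriented faults with friction coefficient mu."""
    return (math.sqrt(mu * mu + 1.0) + mu) ** 2

# Assumed minimum horizontal stress and pore pressure (MPa), with the
# commonly used default friction coefficient mu = 0.6:
shmin, pp, mu = 45.0, 25.0, 0.6

# With SHmax as the maximum and Shmin as the minimum principal stress,
# frictional equilibrium caps SHmax at:
shmax_upper = pp + frictional_limit(mu) * (shmin - pp)
# Observed wellbore failures (breakouts, induced tensile fractures)
# then narrow the estimate inside this polygon.
```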
Figure 14 illustrates a template used by the pre-drill study workflow to estimate maximum horizontal stress.

Figure 14: The magnitude of maximum horizontal stress is inverted using the concept of frictional faulting (the stress polygon combined with shear and tensile failure at the wellbore wall). The combination of parameters (rock strength, pore pressure, etc.) and evidence of failures (breakouts and/or induced tensile fractures) allows the magnitude to be estimated.

Many other E&P companies have followed the progress of Eni's Geomechanics Platform initiative with interest; one such company is Shell. During 2011, Shell and Petris collaborated on a pilot project to evaluate the potential of the Geomechanics Platform to become the global data repository for all of Shell's geomechanical data (Shell already uses PetrisWINDS Recall as its corporate data repository for well log data). This pilot project was successful in achieving its deliverables and was completed with positive recommendations. One very significant point of the pilot was the intense focus on verifying that the Advanced Search Engine was able to "deliver as advertised".

An Integrated Rock Catalog for E&P Geotechnologists

Whereas the focus of the Integrated Geomechanics Platform initiative (see Figure 4) was the wider exploitation of geomechanical laboratory test data and interpretations, it was this initiative that also provided the essential components (in particular the Advanced Search Engine) for "An Integrated Rock Catalog for E&P Geotechnologists". In fact, all of the workflows and tools that were defined and deployed during the construction of the Integrated Geomechanics Platform (as described above) are directly applicable to the Integrated Rock Catalog, since the Integrated Rock Catalog is a straightforward generalization of the data management and organization components of the Integrated Geomechanics Platform.
The ongoing activity to extend the Integrated Geomechanics Platform and deploy the Integrated Rock Catalog relates primarily to expanding the existing PetrisWINDS Recall core data model to provide specific "containers" for interpreted results coming from other forms of SCAL measurement data (such as capillary pressure and relative permeability). It is noteworthy that such data can already be analyzed using Q-SCAL, the established conventional and special core data analysis and interpretation system from Logicom, now integrated with PetrisWINDS Recall.

Other ongoing activity relates to the implementation of a data dictionary of "categories" to be exploited as an actual Rock Catalog instance evolves at an E&P company. These categories relate to all forms of experimental samples and include, for example, lithological classification, facies classification, formation name, basin name, geographical reference, driller and logger depths, etc. The categories will be cross-referenced with other available information, such as interpreted results from SCAL experiments, sample images, core images, etc.

Conclusions

The Integrated Rock Catalog for E&P Geotechnologists will provide E&P companies with the ability to characterize Rock Types with respect to all available rock parameters and tests, and then to search this Rock Catalog intelligently on demand. As an evolution of the Integrated Geomechanics Platform, the Rock Catalog will also help mitigate the chronic shortage of skill and expertise that has become a serious dilemma for our industry.
Key benefits of the Rock Catalog are:

- the capability to easily find and exploit all relevant data through a single common interface that maintains references to information originating from different discipline groups;
- the ability for users with different levels of expertise to work with the same tool and to take advantage of relevant analyses previously conducted by other experts and stored in the Catalog;
- a significant reduction in the time needed to collect the data necessary for a particular study (including searching for and identifying analogues if necessary), and the consequent gain in time that can be devoted to detailed analysis using the guided workflows; and
- the ability to share (store and reuse) information and interpretations in a manner that truly improves the capture and transfer processes of a company's knowledge management program.

In the Abstract of this paper we stated that core data is one of the most important types of data for an oil company, but that no one seems to own core data or really keep it organized. The Rock Catalog now provides a solution to that problem and, in particular, will instill confidence that the data in the Catalog has been validated in a structured geological and engineering context. As such, the "it looks OK to me" era for core data management is over.

Acknowledgements

The authors would like to thank Eni E&P, Petris Technology and Logicom E&P for permission to publish this paper.

2013 Halliburton. All Rights Reserved. H0100XX 02/2013