Data Governance on Well Header: Not Only Is It Possible, Where Else Would You Start?
Agenda
- Intro (Noah)
- A (not so) Brief History
- Methodology
- Prioritization
- Why Well Header
- Attribute versus Process Oriented Data Types
- Standards Development
- Other Findings
Noah by the Numbers
- Launched in 2008
- 50%+ year-over-year growth rate
- 135+ people, all focused on Information Management and Energy
- 4 full-time professionals at Noah focused on staffing and recruiting
  - Rigorous qualifying and hiring process
- 30% of our staff come from the industry
  - Subject Matter Experts from Hess, ExxonMobil, Chevron, Kerr-McGee, Schlumberger, Halliburton, Chesapeake and others
- Services deliver value across the entire Information Management spectrum
  - Proven delivery methodology, templates, artifacts and accelerators
- 30 clients, 40+ projects, consisting of: Information Strategy, Information Architecture, Data Architecture, Data Quality, Master Data Management, Data Integration, Data Governance, Unstructured Data, Business Intelligence, Data Virtualization, and Big Data Solutions
Noah Consulting Services
(diagram: Information Management disciplines mapped against IT domains — Strategy & Planning, Infrastructure, Business Process & Applications)
- Strategy & Planning services: Data Management Strategy, Operating Model Definition, Business Process Design, Business Value Assessment, Maturity Model Mapping, Architecture Services
- Information Management disciplines: ECM (Well File, etc.), Strategic Data Quality, SAP Content Management (xECM), Data Virtualization, Data Integration, Technical Data Services, MDM (Well Master, Equipment Master, Asset Master, Facilities Master, etc.), Data Governance, Data Sciences & Analytics, Integrated Operations, Analytics (Drilling, Engineering, Production, etc.), Big Data (Hadoop, Appliances, Logs, Seismic, Land, etc.)
A Few of Our Clients (past & present)
A (not so) Brief History
- One of Noah's first engagements in Calgary (2008)
  - At least one attempt before that (2005?)
- Initiated again in late 2012
  - Led by IT
  - Broad scope
  - Resourced through conscription (up to 50% of people's time)
  - Large groups, meeting-based
  - Very theoretical: PowerPoint after PowerPoint
- Trouble getting traction through 2013 with such a large group
- End result: frustration
A (not so) Brief History
- In mid-2013, the business took control
  - Major business units used representation instead of 100% inclusion in the Data Governance structure
  - Included Corporate and IT
- Representatives needed to see something specific
  - What will this look like when it's done?
  - How will it change what I do?
- Showed the business users the end-to-end solution
  - Rather than doing everything by committee, sat down with individuals and understood their process
  - Reviewed the process diagram for accuracy
- Leaders from those business units are now leading
- Everything's not rosy by any means, but issues are dealt with as they come up, facilitated by people who are trusted
A (not so) Brief History
Out of 2013, a few key elements were in place:
- Data Governance structure in place and individuals assigned roles**
- Familiarity with the Noah methodology (one iteration)
- A list of data types they wanted to pursue
- Commitment to continue from both Business and IT
- Management support of the DG Council to do the right thing
  - Still accountable to the DG Board but largely empowered to make it happen

** The structure includes:
- DG Board: executive sponsorship; approves roadmap/priorities; determines budget
- DG Council: core group of representatives; develops/reviews/revises the roadmap; approves standards
- DG Office: day-to-day DG operations; supports the DG Council; facilitates working groups
- Data Stewards: SMEs accountable for a specific data type
- Working Groups: teams of SMEs and project team members chartered to develop standards for a data type
- DG Project Team: full-time personnel dedicated to delivering standards
Methodology: Prioritization
Prioritization
Prioritization

Data Type                      | Business Readiness | Business Impact | Data Readiness | Average (all aspects)
Seismic Survey Data            |                    |                 |                |
Seismic Navigation             | 3.0                | 2.8             | 2.9            | 2.90
Seismic Interpretations        | 4.6                | 4.0             | 2.6            | 3.73
Velocity Models                |                    |                 |                |
Well Header                    | 4.4                | 4.0             | 3.4            | 3.93
Non-operated Well Logs         |                    |                 |                |
Well Core Samples              |                    |                 |                |
Directional Survey             | 3.6                | 4.5             | 3.0            | 3.70
Borehole Geophysical Analysis  |                    |                 |                |
Core Analysis                  | 3.8                | 4.3             | 3.3            | 3.80
Pressure Data                  |                    |                 |                |
ngis Metadata Editor Catalogue |                    |                 |                |
PSDM Site Surveys              |                    |                 |                |

(Calgary General assessment; blank rows were not scored.)
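The scoring behind the table can be reproduced in a small sketch: each data type is rated on three dimensions and ranked by the unweighted mean. The scores come from the table above; the function and variable names are our own, not the client's.

```python
# Prioritization sketch: rank data types by the average of three assessments.
def priority_score(business_readiness, business_impact, data_readiness):
    """Unweighted mean of the three dimensions, rounded as in the table."""
    return round((business_readiness + business_impact + data_readiness) / 3, 2)

scores = {
    "Seismic Navigation":      priority_score(3.0, 2.8, 2.9),
    "Seismic Interpretations": priority_score(4.6, 4.0, 2.6),
    "Well Header":             priority_score(4.4, 4.0, 3.4),
    "Directional Survey":      priority_score(3.6, 4.5, 3.0),
    "Core Analysis":           priority_score(3.8, 4.3, 3.3),
}

# Highest first: Well Header ranks at the top, which is why it was pursued.
ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```

Run against the table's inputs, Well Header (3.93) comes out first and Seismic Navigation (2.90) last.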
Prioritization
A couple of observations:
- It is difficult to establish a common understanding of each data type
  - Remember the data type definition? Clearly defining what the data type is and isn't (i.e., scoping it) is key
- There was also a knee-jerk reaction to data problems
  - Is an Oracle database running out of space a data problem or an operations problem?
- Many times we heard that "the data is a mess"
  - We're finding through a Data Quality Engine pilot that, actually, the data is in exceptional shape
- For meaningful results from prioritization, you need the right people making the assessments
  - Focus on those few people with broad knowledge across a data area
Prioritization
Next steps:
- Prepare roadmap
- Approve roadmap**

** Low-readiness projects: preparing data types for Data Governance which scored low on the data readiness scale. It may be necessary to create a business case and present it to the impacted business units for approval and implementation.
Data Readiness

(The same prioritization table as above, reviewed with a focus on the Data Readiness scores: Well Header 3.4, Core Analysis 3.3, Directional Survey 3.0, Seismic Navigation 2.9, Seismic Interpretations 2.6.)
Standards Development
Standards Development
Assess
- Assess data readiness; assemble a Working Group (made up of DG Council members or other SMEs)
Discover
- Harvest any work completed before
  - Could be related, ongoing projects in other areas
- Document a high-level process flow
- Investigate what standards bodies or other parts of the business may have done:
  - Can we leverage PPDM business/data rules?
  - Did another office take on this data type?

** The result is a clearer scope and a more accurate readiness assessment
Standards Development
Standards Development
Stage Gate
- DG Council evaluates whether to proceed
Define
- Continue refining the process flow documentation
  - People, processes, technology
  - Generally more complex than anyone believes
  - Several examples of this at the client: individuals just know their piece of the process; seeing the full picture is an eye-opener
- Determine logical checkpoints in the process
  - Checkpoint: a logical point in a process flow where the business has determined that Data Governance effectiveness (i.e., data quality metrics) must be checked before proceeding
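The checkpoint idea can be sketched as a guard in a simple workflow runner. The step names and the 0.95 quality threshold below are hypothetical illustrations, not the client's actual process.

```python
# A process flow with checkpoints: at each checkpoint, the data quality
# metric must pass before the flow is allowed to proceed.
PROCESS_FLOW = [
    ("load_survey", False),                 # (step name, is_checkpoint)
    ("validate_header", True),              # checkpoint: quality checked here
    ("compute_positions", False),
    ("publish_to_system_of_record", True),  # checkpoint before publishing
]

def run_flow(quality_metric_at, threshold=0.95):
    """Walk the flow; halt at the first checkpoint whose metric fails.

    quality_metric_at(step) returns a 0..1 quality score for that step.
    Returns (completed_steps, halted_at_checkpoint_or_None).
    """
    completed = []
    for step, is_checkpoint in PROCESS_FLOW:
        if is_checkpoint and quality_metric_at(step) < threshold:
            return completed, step  # stop: quality gate not met
        completed.append(step)
    return completed, None
```

With a failing metric the flow stops at the first checkpoint rather than discovering the problem after publication, which is the point of placing checkpoints inside the process instead of only at the end.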
Attribute versus Process Orientation
Attribute-Based Data Types
- Some data types involve checking the quality of the data in the system of record (or across systems of record)
  - Ex.: Well Header, Seismic Header
- Generally can be implemented with automated rules
Process-Based Data Types
- Other data types follow a workflow or process to enforce rigor at each step along the way
  - Ex.: Directional Survey, Seismic Navigation
- Will require a mix of manual and automated rules
Standards Development
Business Rules
- At each checkpoint, what does the business want to know?
- Use business terminology
- Declarative wording ("must", not "should")
- Application- and repository-agnostic
- Borrow shamelessly
Data Rules
- How do you validate compliance with business rules, or identify exceptions?
- Define specific tests, validations, or constraints
- Always resolve to true/false
- Use workshops to refine and approve these
- Don't start with a blank slate (you'll get blank stares)
- Use sparingly
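As a rough illustration of the split described above, a business rule stated in declarative business terms maps to a data rule that always resolves to true/false. The attribute names and the rule itself are illustrative assumptions, not the client's standards.

```python
# Business rule: declarative, in business terminology, repository-agnostic.
BUSINESS_RULE = "Every well must have a surface latitude between -90 and 90 degrees."

def data_rule_surface_latitude(well: dict) -> bool:
    """Data rule: a concrete test for the business rule; always True/False."""
    lat = well.get("surface_latitude")
    return lat is not None and -90.0 <= lat <= 90.0

def exceptions(wells):
    """Records where the data rule resolves to False are the exceptions."""
    return [w["uwi"] for w in wells if not data_rule_surface_latitude(w)]
```

A record with latitude 51.0 passes; a record with 120.0, or with the attribute missing entirely, is reported as an exception.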
Standards Development
Configure and Implement
Stage Gate
- DG Council approves business and data rules
  - Usually agrees with the recommendation of the Working Group
- Project team is authorized to proceed to Implement
Implement
- Develop the implementation plan
- Schedule the configuration
- Build the SQL for automated tests (or configure the data quality engine)
- Gather the reporting requirements for the dashboard / metrics / reports
- (standard implementation steps)
Sustain and Monitor
- Transition to the DG Office
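As a sketch of what "build the SQL for automated tests" might look like, the following uses an in-memory SQLite database as a stand-in for the real repository. The table name, columns, and the rule being tested are hypothetical, not the client's schema.

```python
# Automated data-rule test expressed as SQL: count exception records.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE well_header (
        uwi TEXT PRIMARY KEY,
        spud_date TEXT,
        surface_latitude REAL
    );
    INSERT INTO well_header VALUES ('100/01-01', '2013-05-01', 51.2);
    INSERT INTO well_header VALUES ('100/02-01', NULL, 52.7);
""")

# Data rule: "spud_date must be populated"; exceptions are NULL rows.
exception_count = conn.execute(
    "SELECT COUNT(*) FROM well_header WHERE spud_date IS NULL"
).fetchone()[0]
```

The exception count is the kind of metric that would feed the dashboard and reports at a checkpoint; a data quality engine would run many such queries on a schedule.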
Why was the client successful?
- Now more of a partnership with IT
- Proceed at a pace the business can support
  - Recognize that participants have many work commitments in addition to the Data Governance process
- The DG Council includes people with first-hand experience of what actually needs to get done
- A "see it / own it" culture
  - The actual end users are involved
  - Accountability resides within the business units
- They know the flow of information and the timing differences of when the system of record gets updated
  - You can't just check the end result at the start of the process
Why was the client successful?
- Iterative development
  - Need to deliver something regularly
  - Incrementally add a little bit of value
- Cooperation
  - Each business unit has its own priorities, but there's an atmosphere of give and take
  - A level of trust that your BU's priorities will be actioned
- The right people
  - Stay positive; work through the conflicts
  - Not everyone is going to see the value immediately
  - See the big picture: "This is the right thing to do, let's get it done"
Why was the client successful?
- Acknowledge differences
  - In a bank, DG enforcement isn't negotiable
  - In an oil company, business units may have different processes and tools
- You can't get away with "thou shalt" across the board
  - There are areas where that's possible
  - In other areas, you just have to accept the differences
- Generally, the business rules apply across business units; they just may not be implemented as data rules everywhere
  - Western Canada scenario: teams focus on an operating area and continually refine the model of the field
  - International scenario: generally one pass through the available data for the area; may not have the original data to reprocess
What's Next?
- Focus on implementation
- Data Governance repository
- Data Quality Engine
- Workflow engine
- Use of collaborative tools