Integrating Data Governance into Your Operational Processes
TDWI Checklist Report
By David Loshin
Sponsored by Trillium Software
tdwi.org
August 2011

TABLE OF CONTENTS

FOREWORD
NUMBER ONE: Define and approve a data governance charter.
NUMBER TWO: Clarify accountability for the operational responsibilities of data governance roles.
NUMBER THREE: Connect observance of business policies to information requirements.
NUMBER FOUR: Assess the cross-functional uses of data.
NUMBER FIVE: Formalize information requirements by defining data rules.
NUMBER SIX: Standardize services for inspecting and monitoring conformance to data rules.
NUMBER SEVEN: Design and implement data remediation workflows.
NUMBER EIGHT: Institute notifications, incident reporting, and resolution tracking.
NUMBER NINE: Drive technology acquisition based on operational governance needs.
ABOUT OUR SPONSOR
ABOUT THE AUTHOR
ABOUT TDWI RESEARCH

1201 Monster Road SW, Suite 250, Renton, WA | [email protected] | tdwi.org

© 2011 by TDWI (The Data Warehousing Institute), a division of 1105 Media, Inc. All rights reserved. Reproductions in whole or in part are prohibited except by written permission. Send requests or feedback to [email protected]. Product and company names mentioned herein may be trademarks and/or registered trademarks of their respective companies.
FOREWORD

Although much effort has been spent on describing organizational structures for data governance committees, their effectiveness is limited by the absence of well-defined policies and operational procedures that guide the integration of data governance and control within day-to-day processes. Individuals tasked with data stewardship responsibilities may hesitate to take on specific actions if those actions begin to interfere with their existing responsibilities. Periodic meetings are planned, but agendas are unstructured in the absence of clearly defined performance objectives for operationalization. Data governance meetings eventually peter out, leaving data stewards to operate in silos, making assumptions and guessing at data policies and processes.

Attaining a sustainable data governance capability relies on defining cross-organizational processes for setting data oversight policies, monitoring compliance with those policies, and enforcing them. This TDWI Checklist Report details recommendations that connect high-level policies to specific actions and shows how data governance tasks can be integrated into existing operational processes. Following the items on this checklist will energize your data governance program and ensure its integration within operational workflows.

NUMBER ONE: Define and approve a data governance charter.

To be effective, a data governance program should be a business initiative. A data governance charter provides the program's business justification. It anchors the program within the organization by formalizing the value proposition and specifying the program objectives, desired results, and groundwork for collaboration and oversight. The charter describes the logical organization for oversight, such as a data governance council, as well as the council's authority to define, review, implement, and enforce information policies. The charter should clearly state:

- Business problems to be addressed
- Program goals, metrics, and success criteria
- General approaches for institutionalization
- Definitions of specific roles and responsibilities

Only after the data governance charter is approved can the organizational structure be fleshed out and the presiding data governance roles filled. At this point, the structure can be designed, key data policies can be mapped to processes, and roles such as data governors, data custodians, data stewards, and subject matter experts can be assigned.
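One way to keep the charter actionable is to capture its required elements in a structured, machine-readable form so that roles and success metrics can later be referenced by operational tooling. The following is a minimal sketch, not something prescribed by this report; the field names and example values are illustrative assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class GovernanceCharter:
    """Illustrative structure for the charter elements named in this checklist item."""
    business_problems: list[str]          # business problems to be addressed
    goals: list[str]                      # program goals
    success_metrics: dict[str, str]       # metric name -> target / success criterion
    institutionalization_approach: str    # general approach for institutionalization
    roles: dict[str, str] = field(default_factory=dict)  # role -> accountable person or group

# Hypothetical example values, for illustration only.
charter = GovernanceCharter(
    business_problems=["Chronic customer-address quality issues delay fulfillment"],
    goals=["Reduce order returns caused by bad address data"],
    success_metrics={"invalid_address_rate": "below 0.5% within 12 months"},
    institutionalization_approach="Phased rollout, starting with order-to-cash data",
    roles={
        "data governance council chair": "VP of Operations",
        "data steward (customer domain)": "Customer operations lead",
    },
)
print(charter.success_metrics)
```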
NUMBER TWO: Clarify accountability for the operational responsibilities of data governance roles.

The first meetings to initiate data governance are marked by great enthusiasm, especially when there is a perception that concrete steps are being taken to address chronic data quality problems. With agenda items focusing on data issues, data standards, and clarifying business terms to create a business glossary, these meetings have no shortage of topics. The challenge is not in setting the agenda for what happens during the meeting, but in creating an operational framework for next steps and action items. Such a framework provides the ability to manage and take effective action between meetings.

Participating in a weekly discussion about data standards and data models is one thing, but being assigned accountability for ensuring high-quality data can lead to inaction or ineffectiveness among newly designated data stewards if they don't know the operational processes for governing data. This is particularly evident in executing the day-to-day operational tasks of monitoring data rule conformance, identifying data errors or process issues, and remediating data failures. Accountability is more clearly defined when a data governance program incorporates:

- A framework for defining, proposing, and agreeing to data policies
- Defined processes for inspecting data and monitoring adherence to those policies
- A set of operational workflows for investigating the root causes of a problem
- Processes for identifying, prioritizing, and recommending remediation alternatives

Ensuring a comprehensive understanding of the value of data sharing and repurposing means defining the processes and procedures required for operationalization. One of the most important aspects is assessing data use. Data governance team members will need to follow defined processes to consider potential use cases. This will help define the workflow processes so that when data stewards are assigned their roles, they have the proper training and methods to evaluate and remediate emerging data problems. Providing designated stewards with a data governance operational run book detailing roles and responsibilities will delineate accountability, guide specific activities, and alleviate uncertainty regarding the operational aspects of data governance.
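A lightweight way to make such a run book concrete is to record, for each recurring operational task, who is accountable, how often it runs, and where it escalates. This sketch assumes the run book is kept as simple configuration; the task names, roles, and escalation targets are hypothetical.

```python
# Hypothetical run book entries: task -> accountable role, cadence, and escalation path.
RUN_BOOK = [
    {
        "task": "Review data rule conformance dashboard",
        "accountable_role": "data steward (customer domain)",
        "cadence": "daily",
        "escalate_to": "data governance council",
        "escalate_after_hours": 48,
    },
    {
        "task": "Triage newly reported data incidents",
        "accountable_role": "data steward (finance domain)",
        "cadence": "as incidents arrive",
        "escalate_to": "data custodian",
        "escalate_after_hours": 24,
    },
]

def who_handles(task_name: str) -> str:
    """Look up the accountable role for an operational task."""
    for entry in RUN_BOOK:
        if entry["task"] == task_name:
            return entry["accountable_role"]
    raise KeyError(f"No run book entry for task: {task_name!r}")

print(who_handles("Triage newly reported data incidents"))
```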
NUMBER THREE: Connect observance of business policies to information requirements.

Every organization is both directly and indirectly driven by business policies, which are meant to direct behavior and practices within the business context. Sometimes policies are imposed from outside the organization, such as privacy regulations that direct the protection of personally identifiable information. Other times policies are defined internally, such as meeting projections for quarterly earnings targets or observing directives regarding preferred customer discounts. Either way, operational processes evolve and applications are developed to support the requirements dictated by these business policies.

Yet the data that supports business information is often overlooked when business policies are enforced. Operationalizing data governance requires making the connection between business policies and data rules. To make this connection, business policies should be defined using natural language that indicates specific implications for the use of data. Here are some examples:

- Protection of private information implies the ability to tag data elements as private and apply the proper access controls
- Statutory reporting implies managing data lineage along with the ability to create reports as directed by law to ensure auditability
- Compliance with industry standards implies the use of agreed-to data specifications for data exchange
- Corporate dashboards rely on the ability to collect data for reporting key performance indicators (KPIs) with a predictable level of trust
- The proper application of discounting rules implies capturing and managing preferred customer status

Ensuring observance of a business policy implies ensuring compliance with the data rules derivable from the associated information policies. Conversely, sets of rules regarding the use of accurate, consistent, complete, and timely data are then mapped to the corresponding information and business policies. Operational data governance methods for validating compliance with data rules are then integrated into the business processes and workflows.
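One way to preserve the traceability this item describes is to record each business policy alongside the data rules derived from it, so a rule violation can always be traced back to the policy it supports. The registry below is only a sketch; the policy identifiers and rule wording are illustrative assumptions, not part of the report.

```python
# Hypothetical registry linking business policies to the data rules derived from them.
POLICY_TO_RULES = {
    "privacy-protection": {
        "statement": "Personally identifiable information must be protected.",
        "data_rules": [
            "Data elements tagged 'private' require role-based access controls",
            "Private identifiers must never appear in downstream extracts",
        ],
    },
    "preferred-customer-discounting": {
        "statement": "Discounts are applied only to preferred customers.",
        "data_rules": [
            "Customer records must carry a populated preferred_status flag",
        ],
    },
}

def rules_for(policy_id: str) -> list[str]:
    """Return the data rules whose violations would threaten a given business policy."""
    return POLICY_TO_RULES[policy_id]["data_rules"]

print(rules_for("preferred-customer-discounting"))
```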
NUMBER FOUR: Assess the cross-functional uses of data.

Many, if not all, organizations are structured around functional areas of the business, such as marketing, sales, finance, fulfillment, manufacturing, or human resources. Accordingly, an organic approach to system development has allowed applications to be designed and developed to meet specific functional needs, such as logging sales transactions or managing customer service centers. When processes are self-contained within a single business function, their success can be completely monitored within the context of that function.

Yet the tasks associated with many corporate business processes span more than one functional area. Examples include managing compliance with privacy regulations, ERP processes such as order-to-cash or procure-to-pay, business intelligence, and predictive analytics. The success of these cross-functional processes is measured in relation to corporate KPIs. Communication across the areas of the business is done through data sharing: the results of each phase are stored in preparation for consumption by subsequent process phases. The quality of each data set might be sufficient to meet the needs of the originating business application. However, cross-functional process success depends on shared information; each phase may repurpose data sourced from different areas of the business and may impose new data quality requirements. The absence of oversight over aspects such as collection, enrichment, and modification of that information (often accumulated from applications supporting different areas of the business) can lead to process errors, causing delays that reduce effectiveness and predictability and ultimately have a cumulative negative effect on achieving key performance objectives.

Effective management of operational processes that span the enterprise relies on the availability of high-quality data. Ensuring the success of cross-functional processes requires practical methods for building operational data governance into the application infrastructure. This begins with an understanding of the information flow for cross-functional processes:

- Identify the critical data elements consumed downstream
- Solicit data quality expectations
- Map the information flow
- Ensure that compliance with the collected data requirements is inspected and monitored along the way
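A simple way to begin this information-flow mapping is to catalog each critical data element along with where it is produced, which downstream phases consume it, and the quality expectations those consumers have stated. The structure and values below are illustrative assumptions for an order-to-cash flow, not the report's prescribed model.

```python
# Hypothetical catalog of critical data elements for a cross-functional order-to-cash flow.
CRITICAL_DATA_ELEMENTS = [
    {
        "element": "customer.shipping_address",
        "produced_by": "sales order entry",
        "consumed_by": ["fulfillment", "invoicing"],
        "quality_expectations": {"completeness": ">= 99.5%", "validity": "postal-standard format"},
    },
    {
        "element": "invoice.po_identifier",
        "produced_by": "accounts payable intake",
        "consumed_by": ["payment processing", "statutory reporting"],
        "quality_expectations": {"completeness": "100%", "consistency": "must exist in PO system"},
    },
]

def consumers_of(element_name: str) -> list[str]:
    """List the downstream phases that depend on a critical data element."""
    for cde in CRITICAL_DATA_ELEMENTS:
        if cde["element"] == element_name:
            return cde["consumed_by"]
    return []

print(consumers_of("invoice.po_identifier"))
```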
NUMBER FIVE: Formalize information requirements by defining data rules.

While policies expressed in natural language convey general business expectations, it is necessary to establish a connection between a high-level business policy, the information used to support compliance with the policy, and the specific data rules implied by those information dependencies. When ungoverned approaches to specifying information requirements are introduced, a number of challenges to ensuring business policy observance arise, including:

- Subjectivity. Since policies are interpreted in different ways, each approach to validating policy conformance is inherently subjective.
- Scalability. Without a standard way to validate that information requirements are observed, the typical methods of hard-coded tests and occasional manual checks will not scale. The sets of requirements will expand as data sets are repurposed multiple times, and in each instance the requirements must be mapped to different data models.
- Consistency. With no agreed-to method for specifying information requirements, there cannot be consistency in their validation.

Each information requirement involves reviewing metadata characteristics: identifying business terms that map to known data concepts and specifying facts that relate those data concepts. In turn, defined data validity rules can be monitored to demonstrate that information requirements are met. A framework of data rule categories corresponding to recognized dimensions of data quality (such as completeness, consistency, or uniqueness) helps reduce the severity of these issues. Data rules directly link observance of quality assertions to information compliance, and these rules can be uniformly operationalized using automated inspection and monitoring services.

As an example, consider a typical business policy for supplier payments: each vendor invoice must correspond to an issued purchase order (PO). A corresponding information requirement states that invoices will not be processed if any invoiced line items are not linked to a purchase order identifier issued by the PO system. Derived data rules may include:

1. Invoices are incomplete without a PO identifier for each line item
2. PO identifiers must conform to the PO identifier format
3. PO identifiers must match PO identifiers in the purchase order data system
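The three derived rules above map naturally onto completeness, format-conformance, and consistency checks. The sketch below assumes a simple line-item dictionary, a regular-expression PO format, and an in-memory set standing in for the PO system of record; all of these are illustrative assumptions, not the report's implementation.

```python
import re

# Assumed PO identifier format (illustrative): "PO-" followed by six digits.
PO_FORMAT = re.compile(r"^PO-\d{6}$")

# Stand-in for a lookup against the purchase order system of record.
KNOWN_PO_IDS = {"PO-104233", "PO-104890"}

def validate_invoice_line(line: dict) -> list[str]:
    """Apply the three derived data rules to one invoice line item."""
    violations = []
    po_id = line.get("po_id")
    if not po_id:                                   # Rule 1: completeness
        violations.append("missing PO identifier")
        return violations
    if not PO_FORMAT.match(po_id):                  # Rule 2: format conformance
        violations.append(f"PO identifier {po_id!r} does not match the PO format")
    if po_id not in KNOWN_PO_IDS:                   # Rule 3: consistency with the PO system
        violations.append(f"PO identifier {po_id!r} not found in the purchase order system")
    return violations

print(validate_invoice_line({"sku": "A-17", "po_id": "PO-999999"}))
# -> ["PO identifier 'PO-999999' not found in the purchase order system"]
```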
NUMBER SIX: Standardize services for inspecting and monitoring conformance to data rules.

In the past, it was not uncommon for application programmers to insert validations and controls directly into their programs. However, variations in interpretation existed within application development silos, and management of the data quality rules fell completely into the realm of the technologist; there was little interaction with business consumers. With few standards for data governance, even as rules were integrated into the programs, no process existed for reviewing the rules, verifying their correctness, and, most critically, ensuring rule consistency.

In today's environment, much of the effort for data integration and repurposing has been automated, which not only addresses the need for scalability (both for data volume and for data set repurposing), but also allows compliance monitoring to be streamlined through the use of tools. Instead of pushing the responsibility for data quality compliance to technical programmers, standardize processes for documenting and sharing data rules, and standardize services for automated inspection, monitoring, and notification when data exceptions occur. A simple example is a defined rule for validating data completeness: if someone in shipping needs the sales agents to capture components of the delivery address at the point of sale, a rule verifying that those data values are not missing can invoke a data quality service at transaction processing time. A triggered alert pushes incomplete records back to the sales application for proper data capture.

Some organizations rely on an agreed-to set of query criteria applied to different data sets for data validation. This is satisfactory within a limited context, when contained to a specific data domain and use cases in a stable source system. However, as business policies affect cross-functional uses and processes, continuous monitoring of adherence to commonly recognized assertions can be streamlined by using a services approach. Specific tools such as data profilers can be coupled with reporting systems to use a common set of data rules for inspection and monitoring. Layering data profiling inside a service-oriented architecture allows applications within different business contexts to validate different data instances using a common set of data rules and generate notifications to data stewards when noncompliant data items require remediation.
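The delivery-address example above can be read as a small service contract: at transaction time the capturing application submits the record, a shared completeness rule is evaluated, and failing records are routed back for re-capture. The function and field names below are illustrative assumptions, not a specific vendor API.

```python
# Fields the shipping function needs captured at the point of sale (illustrative).
REQUIRED_ADDRESS_FIELDS = ("street", "city", "postal_code", "country")

def check_completeness(record: dict) -> list[str]:
    """Shared completeness rule: return the required address fields that are missing or blank."""
    return [f for f in REQUIRED_ADDRESS_FIELDS if not str(record.get(f, "")).strip()]

def validate_at_transaction_time(record: dict, notify) -> bool:
    """Invoke the data quality check when the transaction is processed.

    Returns True if the record passes; otherwise raises an alert so the sales
    application can push the record back for proper data capture.
    """
    missing = check_completeness(record)
    if missing:
        notify({"record_id": record.get("id"), "rule": "address-completeness", "missing": missing})
        return False
    return True

# Example: an order captured without a postal code triggers an alert.
ok = validate_at_transaction_time(
    {"id": "ORD-1009", "street": "12 Main St", "city": "Renton", "postal_code": "", "country": "US"},
    notify=print,
)
```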
NUMBER SEVEN: Design and implement data remediation workflows.

As suggested by the directive of accountability within the data governance charter, practical suggestions for operationalizing data governance focus on refining discrete data rules from business policies and standardizing scalable methods for monitoring data rule validation across business process flows. With proper data management policies in place, defined processes ensure that when data rules are violated, there is a clear line of accountability directing the right set of individuals to proactively investigate the root causes of the issue and evaluate alternatives for addressing those root causes. These specific procedures implement agreed-to data governance policies which, when incorporated into a service-level agreement (SLA), establish lines of responsibility and escalation strategies when issues are not resolved in a timely manner.

Specific aspects of the data remediation workflows include triage, assessment, and remediation:

- Triage is performed to understand where and how the data issue manifests itself, the business impact, the size of the problem, and the number of individuals or systems affected. It incorporates the evaluation of issue criticality, issue frequency, the feasibility of correction, and what steps can immediately be taken to prevent the issue from recurring.
- Assessment involves the synthesis of what was discovered during triage. The data steward prioritizes each identified issue and considers alternatives for eliminating the root cause, additional inspections, and whether specific corrective measures are to be taken.
- Remediation can comprise different tactics depending upon the location and root cause of the data issue. Cascaded data errors may require backing out changes, rolling back processing stages, and restarting the process. Failed processes can be adjusted to prevent the introduction of data errors, and human errors can be reduced through user training.

Most critically, the key data governance team members must consider potential use cases to define the workflow processes before data stewards are put in a position to evaluate emerging data problems.
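The triage, assessment, and remediation sequence can be modeled as a small state machine over a data issue record, which also makes SLA-driven escalation straightforward to check. The states, fields, and 48-hour SLA below are illustrative assumptions, not values from the report.

```python
from datetime import datetime, timedelta

# Illustrative workflow states for a data issue, following the triage -> assessment -> remediation flow.
WORKFLOW = ["reported", "triaged", "assessed", "remediating", "resolved"]

def advance(issue: dict) -> dict:
    """Move a data issue to the next workflow stage and timestamp the transition."""
    nxt = WORKFLOW[WORKFLOW.index(issue["state"]) + 1]
    issue["state"] = nxt
    issue["history"].append((nxt, datetime.now()))
    return issue

def breaches_sla(issue: dict, sla: timedelta = timedelta(hours=48)) -> bool:
    """Flag issues still unresolved after the SLA window (hypothetical 48-hour default)."""
    return issue["state"] != "resolved" and datetime.now() - issue["opened"] > sla

issue = {"id": "DQ-0042", "state": "reported", "opened": datetime.now(), "history": []}
advance(issue)   # triage complete
print(issue["state"], breaches_sla(issue))
```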
NUMBER EIGHT: Institute notifications, incident reporting, and resolution tracking.

Standardized inspection and monitoring employ tools to streamline the data validation process. Remediation workflows operationalize accountability for data by describing the methods by which data issues are reviewed and resolved, how individuals investigate root causes, and how they evaluate alternatives for addressing those root causes. A means for triggering the remediation workflow is critical both when an exception is identified and when monitoring the progress of issue resolution. Key components of an incident management framework include:

- Notifying data stewards of emerging issues
- Logging all discovered data issues for the responsible parties
- Providing a collaborative framework for tracking the progress of remediation and resolution activities
- Identifying the point of failure in the process for root cause analysis

With this framework in place, data stewards can create an audit trail for aligning data quality and governance performance with business policy compliance. They also have the foundation to better report on and manage remediation by understanding mean time to resolve issues, frequency of occurrence, types of issues, sources of issues, and common approaches for correcting or eliminating problems. This links operational data governance to business process success.
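A minimal incident log is enough to produce the resolution metrics this item calls out, such as mean time to resolve and frequency by issue type. The sketch below assumes a simple list of incident dictionaries; the field names and sample entries are hypothetical.

```python
from collections import Counter
from datetime import datetime
from statistics import mean

# Hypothetical incident log entries produced by the notification service.
INCIDENTS = [
    {"id": "DQ-0040", "type": "completeness", "source": "sales app",
     "opened": datetime(2011, 8, 1, 9), "resolved": datetime(2011, 8, 2, 9)},
    {"id": "DQ-0041", "type": "consistency", "source": "AP intake",
     "opened": datetime(2011, 8, 3, 14), "resolved": datetime(2011, 8, 3, 20)},
]

def mean_time_to_resolve_hours(incidents) -> float:
    """Mean time to resolve, in hours, over incidents that have been closed."""
    durations = [(i["resolved"] - i["opened"]).total_seconds() / 3600
                 for i in incidents if i.get("resolved")]
    return mean(durations) if durations else 0.0

def frequency_by_type(incidents) -> Counter:
    """How often each type of issue occurs, for steward reporting."""
    return Counter(i["type"] for i in incidents)

print(mean_time_to_resolve_hours(INCIDENTS))   # 15.0 for the sample entries above
print(frequency_by_type(INCIDENTS))            # Counter({'completeness': 1, 'consistency': 1})
```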
NUMBER NINE: Drive technology acquisition based on operational governance needs.

These checklist items suggest that practical approaches to data governance can be integrated directly into operational processes, from the refinement of data rules from business policies, to the solicitation of requirements from the pool of downstream consumers, to standardized services for inspection, monitoring, notification, and resolution of data exceptions. This ultimately suggests two fundamental changes in the organization:

1. Requirements analysis, data quality inspection and monitoring, notification, tracking, and reporting must all be incorporated into the organization's system development lifecycle (SDLC).
2. Much of the complexity of operational data governance can be simplified by employing the right kinds of automated tools and accompanying methodologies.

Data governance is operationalized through the definition of policies and corresponding processes for assuring policy compliance. Although the processes for data governance are the most important components for operationalization, the procedures can be implemented in a number of ways, including the incremental introduction of tools and technology to supplement the defined processes. For example, data management tools such as data profiling, data cleansing, metadata management, data standards management, and incident management can be adapted to streamline operational activities. The results of inspections and measurements can be presented using existing end-user reporting techniques.

As the policies, roles, and responsibilities are formulated and the processes are put in place, institute a technology adoption plan that corresponds to the staged rollout of operational data governance procedures. Data profiling, modeling, and metadata tools help to capture business data quality rules. Data profiling is also helpful for developing data assessment processes, as well as inspection and monitoring services and embedded controls. Any data corrections, standardization, or enhancements performed within a resolution workflow should rely on standardized technologies to ensure consistency. Review the existing uses of reporting frameworks for developing performance dashboards and data quality metrics reporting.

By aligning technology acquisition with the rollout of operational governance processes, the team can more effectively articulate business needs, evaluate training requirements, and speed time to value. The combined set of business needs and requirements will also scope the vendor selection process, and perhaps even point to specific vendors whose capabilities map directly to the organization's expectations.
ABOUT OUR SPONSOR

Be certain that all your information is reliable and available to manage your business. Trillium Software solutions and services bring certainty to your enterprise data for customer, product, financial, and supplier information. The Trillium Software System enables the business teams who understand the data and the IT teams who manage it to discover and quantify data anomalies, put data controls in place, and manage data conditions to optimize information performance. The Trillium Software System is recognized as a global leader in enterprise data quality and critical to the success of data integration, data migration, data stewardship, and data governance initiatives. We deliver solutions for data profiling, cleansing, enhancement, linking, geocoding, and governance for global enterprise applications, business intelligence, and data management platforms. Ensure that your information becomes the trusted asset that your employees rely on to make business-critical decisions every day. Don't just be certain about your data. Be Trillium Certain!

ABOUT THE TDWI CHECKLIST REPORT SERIES

TDWI Checklist Reports provide an overview of success factors for a specific project in business intelligence, data warehousing, or a related data management discipline. Companies may use this overview to get organized before beginning a project or to identify goals and areas of improvement for current projects.

ABOUT THE AUTHOR

David Loshin, president of Knowledge Integrity, Inc., is a recognized thought leader, TDWI instructor, and expert consultant in the areas of data management and business intelligence. David is a prolific author on business intelligence best practices, including numerous books and papers on data management such as The Practitioner's Guide to Data Quality Improvement. David is a frequent speaker at conferences, Web seminars, and sponsored Web sites and channels. His best-selling book, Master Data Management, has been endorsed by data management industry leaders. He can be reached at [email protected].

ABOUT TDWI RESEARCH

TDWI Research provides research and advice for business intelligence and data warehousing professionals worldwide. TDWI Research focuses exclusively on BI/DW issues and teams up with industry thought leaders and practitioners to deliver both broad and deep understanding of the business and technical challenges surrounding the deployment and use of business intelligence and data warehousing solutions. TDWI Research offers in-depth research reports, commentary, inquiry services, and topical conferences as well as strategic planning services to user and vendor organizations.