5 Best Practices for SAP Master Data Governance




By David Loshin, President, Knowledge Integrity, Inc.
Sponsored by Winshuttle, LLC

© 2012 Winshuttle, LLC. All rights reserved. 4/12 www.winshuttle.com

Introduction

In most organizations, business application development is aligned by corporate function, such as sales or finance, or by line of business. However, many key business activities, such as order-to-cash or procure-to-pay, cut across functional and organizational boundaries. To be effective, applications that automate these cross-functional activities must be able to share and exchange data among multiple domains. However, the mismatch between siloed application development and cross-functional business processes often causes the same or similar concepts to be modeled differently, creating differences in structure and meaning that can lead to difficulties in data integration and sharing across functions.

Fortunately, a set of disciplines referred to as master data management (MDM) has been developed for creating shared views of uniquely identifiable data concepts. MDM is particularly useful for sharing data across cross-functional processes such as those implemented within ERP applications. But there are potentially many touch points for shared master objects (customer master and material master are both good examples), and coordinating among the different participants across the different steps of workflow processes requires some oversight. That oversight is delivered in the form of a Master Data Governance (MDG) program that is centralized within the organization. In this paper we look at some best practices that effectively facilitate:

- Collecting business requirements for master data oversight
- Capturing and managing master data business rules
- Managing the processes for creating and updating master data entities

Motivating Factors

Master data management disciplines are intended to support the sharing of high-quality master data repositories. However, solving the infrastructure problems without addressing policies and processes can often lead to delays or even project failures.
Introducing Master Data Governance can address challenges such as:

- Collective data expectations: In a siloed environment, each data system is engineered to meet the data quality and usability needs of the business function it supports. However, when a master data system is deployed, the shared records must meet the needs of all the consumers of the data asset. That means there must be well-defined processes in place for identifying who the data consumers are, how they plan to use the data, and what their data quality requirements are. In turn, those business rules must be centrally managed to ensure that everyone's needs are met.

- Consolidation vs. shared use: There is a big difference between using a master repository as a channel for consolidating data dumped from multiple sources and managing it as a channel for effective data sharing to support end-to-end processes. The former approach is prone to inconsistency and errors, whereas focusing on the usage characteristics of a shared data asset will improve productivity, reduce errors, and speed time to value. By necessity, though, the shared-use perspective requires oversight and governance.

- Employing a collaborative operational model: In usage scenarios that have many data touch points performed by different individuals along the way, there is a need to oversee the sharing of the data associated with workflow processes that cross functional lines.
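The idea of centrally managing every consumer's business rules can be made concrete with a small sketch. The registry, rule names, and record fields below are all hypothetical, invented for illustration: each business function registers its own data quality rules in one place, and a shared master record is validated against the union of every consumer's expectations rather than only its creator's.

```python
# Illustrative sketch (all names hypothetical): a central registry where each
# consuming business function registers its data quality rules, so a shared
# master record must satisfy the collective expectations of all consumers.

class RuleRegistry:
    def __init__(self):
        # rule name -> (owning function, predicate over a record dict)
        self.rules = {}

    def register(self, owner, name, predicate):
        self.rules[name] = (owner, predicate)

    def validate(self, record):
        """Return the (owner, rule) pairs this record violates."""
        return [(owner, name)
                for name, (owner, check) in self.rules.items()
                if not check(record)]

registry = RuleRegistry()
# Sales needs a contact email; Finance needs payment terms on every customer.
registry.register("sales", "email_present",
                  lambda r: bool(r.get("email")))
registry.register("finance", "payment_terms_set",
                  lambda r: r.get("payment_terms") in {"NET30", "NET60"})

# A record that satisfies Sales but not Finance fails collective validation.
violations = registry.validate({"email": "buyer@example.com"})
```

The point of the sketch is the ownership metadata: when a record fails, the governance team can see whose requirement was violated, which is what makes cross-functional notification possible.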

Five Best Practices for Master Data Governance

Master Data Governance establishes policies and processes for the creation and maintenance of master data. Observing these policies establishes a level of trust in the quality and usability of master data. To institute policies governing the creation and maintenance of master data, we recommend the following best practices; each is intended to address the challenges in centrally managing shared master data.

1: Establish a Master Data Governance Team and Operating Model

Recognize that most business processes cross functional boundaries. Multiple individuals are involved in the collaborative workflow processes for creating master data as well as in executing the business processes. But because organizations are typically organized around function, a question arises: who owns cross-functional master data processes? To ensure that the processes are not impeded by data errors or process failures, the first best practice that we recommend is that executive management mandate the creation of a central data governance team. This data governance team should be composed of representatives from each area of the business and be vested with the authority to:

- Define and approve policies governing the lifecycle for master data to ensure data quality and usability.
- Oversee the process workflows that touch master data.
- Define and manage business rules for master data.
- Inspect and monitor compliance with defined master data policies.
- Notify individuals when data errors or process faults are putting the quality and usability of the data at risk.

2: Identify and Map the Production and Consumption of Master Data

By virtue of the process of sharing data managed within a master data repository, we benefit from reuse of the data even though it may be stored and managed in a single central location.
The quality and usability characteristics of the master data must be suitable for all of its multiple purposes. This means you will need to ensure that cross-functional requirements are identified from all master data consumers and that those requirements can be observed throughout the data lifecycles associated with the cross-functional processes. To properly assess consumers' data requirements and expectations for availability, usability, and quality, you must first have a horizontal view of the workflow processes and understand who the data producers and data consumers are. This is the basis of our second recommended best practice: documenting the business process workflows and determining the production and consumption touch points for master data.

Whether you are looking at the workflow processes that create shared master data or reviewing the activities that consume it, you can link the assurance of consistency, accuracy, and usability to specific master data touch points. Documenting process flow maps is a practical exercise that lets you consider all the touch points for master data (for creation, use, and updates) along each workflow activity. Any point of data creation or updating provides an opportunity to insert inspection and monitoring of observance of data policies and data quality rules. Reviewing the usage scenarios provides traceability for the lineage of shared master values and exposes the data dependencies that must be carefully monitored to ensure consistency in definition, aggregation, and reporting. The result is an information production map that identifies the opportunities for instituting the governance directives that ensure compliance with enterprise data policies.
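An information production map of the kind described above can be represented as plain data. The workflow steps, roles, and master objects below are invented examples, not taken from any real SAP process: each touch records who creates, updates, or reads a master object, and the map is then queried to find where inspection belongs and who the downstream consumers are.

```python
# Hypothetical sketch of an information production map: workflow touch points
# on master data are recorded as data, then queried to locate inspection
# opportunities (create/update touches) and downstream consumers (read touches).

from collections import namedtuple

Touch = namedtuple("Touch", "step role obj action")  # action: create|update|read

workflow = [
    Touch("enter_order",  "sales_rep", "customer_master", "create"),
    Touch("credit_check", "finance",   "customer_master", "read"),
    Touch("set_terms",    "finance",   "customer_master", "update"),
    Touch("fulfill",      "logistics", "material_master", "read"),
]

def inspection_points(touches):
    """Every create/update touch is a candidate for rule inspection and monitoring."""
    return [t.step for t in touches if t.action in ("create", "update")]

def consumers_of(touches, obj):
    """Which roles depend on a master object downstream."""
    return sorted({t.role for t in touches if t.obj == obj and t.action == "read"})
```

Here `inspection_points(workflow)` yields the two steps where governance controls should be inserted, and `consumers_of(workflow, "customer_master")` exposes who is affected when that object's quality slips.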

3: Govern Shared Metadata: Concepts, Data Elements, Reference Data, and Rules

One of the main drivers for Master Data Governance is the discrepancies and inconsistencies associated with shared data use, particularly in terms of the characteristics and attributes of master data concepts such as customer or product. Incredible as it may seem, inconsistencies in reference data sets (such as product codes or customer category codes) are rampant. The duplication and overlap of user-defined code sets causes particular problems within an ERP environment when there is no centralized authority to define and manage standards for reference data. Incorrect use of overlapping codes can lead to incorrect behavior within hard-coded applications; these bugs are not only insidious, they can be extremely hard to track down.

To reduce the potential negative impacts of inconsistencies, our third best practice for Master Data Governance is centralizing the oversight and management of shared metadata, focusing on entity concepts (such as customer or supplier) and reference data with their corresponding code sets, along with the data quality standards and rules used for validation at the different touch points in the data lifecycle. Because each business function may have its own understanding of what is implied by the terms used to refer to master data concepts, use a collaborative process to compare definitions and resolve differences into a standard set of master concept definitions. Governance and management of shared metadata involves defining and observing data policies for resolving variance across similarly named data elements, along with procedures for normalizing reference data code sets, values, and associated mappings. Normalizing shared master reference data can alleviate a significant amount of the pain of failed processes and the repeated reconciliations required when reporting does not match the operational systems.
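Normalizing overlapping code sets is easiest to see in miniature. The systems, codes, and categories below are invented for illustration: each source system keeps its local codes, but a governance-owned crosswalk translates them into one canonical set, so the same real-world category is never encoded two different ways in shared master data.

```python
# A minimal sketch (all codes and systems invented): per-system crosswalks map
# local reference codes to one governed canonical code set, instead of letting
# each hard-coded application interpret overlapping codes on its own.

CANONICAL_CATEGORIES = {"RETAIL", "WHOLESALE", "DISTRIBUTOR"}

# Crosswalks maintained centrally by the governance team.
CROSSWALK = {
    "erp": {"01": "RETAIL", "02": "WHOLESALE", "03": "DISTRIBUTOR"},
    "crm": {"R":  "RETAIL", "W":  "WHOLESALE", "D":  "DISTRIBUTOR"},
}

def normalize(system, local_code):
    """Translate a system-local code to its canonical value; fail loudly on gaps."""
    try:
        canonical = CROSSWALK[system][local_code]
    except KeyError:
        raise ValueError(f"unmapped code {local_code!r} from system {system!r}")
    assert canonical in CANONICAL_CATEGORIES  # crosswalk itself stays governed
    return canonical
```

The design choice worth noting is the loud failure on an unmapped code: silently passing unknown codes through is exactly how the insidious, hard-to-track-down bugs described above enter hard-coded applications.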
Lastly, if (as our second best practice suggests) we are creating a collected set of data quality requirements, the data quality rules used for validation can be centralized and governed to ensure consistency in applying those rules for inspection, monitoring, and notification of errors or invalid data.

4: Institute Policies for Ensuring Quality at Stages of the Data Lifecycle

The fundamental value of instituting master data management is the reduction in data variance, inconsistency, and incompleteness. Cleansing or correcting data downstream is a reactive measure, and as long as the corrections are shared among all the data creators and consumers, it may be an acceptable last resort. But realize that these types of data issues are symptoms of the absence of data quality control processes intended to prevent errors from occurring in the first place. Instead, this fourth best practice seeks to ensure that the data meets the needs specified by the collected data requirements. This suggests a more directed approach of defining data policies and instituting data governance processes for data discovery and quality assessment. By analyzing potential anomalies within the master data set, this practice helps to identify structural inconsistencies, code set issues, and semantic differences inherent in the data. Once the potential issues are identified, many can be resolved with stop-gap controls architected within the common master data models used for information sharing. At the same time, define policies for instituting controls within the process workflows and application programs, inserted into the workflows based on the analysis of the information production maps documented using our second best practice. This will help prevent the introduction of errors into the environment in the first place.
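A preventive control at a creation touch point can be sketched as a gate in front of the master store. The required fields, record shape, and steward notification below are assumptions made for illustration: instead of cleansing bad records downstream, the gate rejects an incomplete or duplicate record before it ever enters the shared repository and notifies a steward.

```python
# Illustrative preventive control (field names and notification hypothetical):
# a gate at the creation touch point admits a record into the master store only
# if it passes governed quality checks, rather than correcting it downstream.

REQUIRED_FIELDS = ["material_id", "description", "base_unit"]

def creation_gate(record, store, notify):
    """Admit a record only if complete and not a duplicate; otherwise notify."""
    missing = [f for f in REQUIRED_FIELDS if not record.get(f)]
    if missing:
        notify(f"rejected {record.get('material_id', '?')}: missing {missing}")
        return False
    if record["material_id"] in store:   # duplicate prevention at entry
        notify(f"rejected duplicate {record['material_id']}")
        return False
    store[record["material_id"]] = record
    return True
```

In practice the checks would be the centrally governed rules from the third best practice; the sketch only shows where in the lifecycle they are enforced.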
5: Implement Discrete Approval and Separation of Duties in Workflows

Because cross-functional processes (such as order-to-cash) are not owned by any particular line of business or functional area, the question of workflow process ownership becomes key to governing their successful execution. Drilling down on this question exposes different facets of the challenge.

One facet involves navigating the relationship between the business team members who define the workflow process and the IT teams who develop and implement the applications for the workflow processes. Another facet is that there are different types of processes: some require complex IT solutions, while others can be easily deployed by business people without IT involvement. A third facet involves the operational aspects of overseeing the decisions that are necessary within process workflows, such as reviewing newly entered items, approving new records, or signing off on the completion of a workflow, to name a few.

Because process workflow ownership is effectively distributed across the areas of the business, there is a need for a set of policies and methods to centrally oversee the successful execution of all stages of the workflow. This leads to our fifth best practice: integrating discrete approval of tasks into the operational aspects of Master Data Governance. Separation of duties can be discretely integrated into the application environment through the proper delineation of oversight as part of an approval process that focuses on the business use of the data and does not require IT intervention. Implementing this practice on a small scale might be managed manually. Yet as the number of processes grows, the number of touch points increases, and the need for discrete separation of duties expands, a manual solution becomes insufficient to provide the enterprise-wide effectiveness that is required, while relying on IT as the conduit for application development introduces a bottleneck in getting processes into production. To alleviate both of these pressures, as part of this best practice we suggest exploring how different tools can enable some degree of self-service for business people to develop and deploy workflow processes without the need for IT intervention.
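The discrete approval and separation-of-duties idea can be sketched as a change request that must pass named approval stages, where the workflow itself refuses to let the requester approve their own change. The stage names and roles below are invented for illustration, not a real SAP MDG workflow:

```python
# Sketch of discrete approval with separation of duties (stages and roles
# hypothetical): each master data change passes explicit approval stages, and
# the workflow enforces that no requester approves their own request.

class ApprovalError(Exception):
    pass

class ChangeRequest:
    STAGES = ["data_review", "business_signoff"]

    def __init__(self, requester, payload):
        self.requester = requester
        self.payload = payload
        self.approvals = {}  # stage -> approver

    def approve(self, stage, approver):
        if approver == self.requester:
            raise ApprovalError("separation of duties: requester cannot self-approve")
        if stage in self.approvals:
            raise ApprovalError(f"stage {stage!r} already approved")
        self.approvals[stage] = approver

    @property
    def complete(self):
        return all(s in self.approvals for s in self.STAGES)
```

Encoding the constraint in the workflow object rather than in policy documents is what makes the control enforceable at scale, where manual oversight breaks down.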
The best types of technologies would automate the management of these processes along with their approval procedures, and this master governance could then be directly integrated into the execution of the processes.

Summary

The successful deployment of ERP solutions can revolutionize the way a company works in terms of increased opportunities, decreased operational costs, and an improved level of trust in generated reports. Yet the transition from a vertical, function-based mode of operations to one that is integrated horizontally across the different areas of the business introduces some challenges that are often ignored. Introducing Master Data Governance in alignment with the use of ERP frameworks for cross-functional workflow processes can help you anticipate potential failures and lead to rapid time-to-value.

But don't let governance strangle flexibility and adaptability. Governance is an evolving and ongoing process. Policies and practices must be reviewed and updated to ensure that business performance metrics are properly measured and business objectives are met. Introducing practical master data governance practices such as those described in this paper can help your organization deploy the proper oversight over cross-functional activities. Automating these Master Data Governance practices relies on a variety of core competencies, so when evaluating solutions, look for these types of capabilities:

- Broad access to shared master data (such as data integrated within ERP systems)
- Centralized management of data element definitions, structures, reference data sets, and other relevant metadata
- Shared data quality business rules
- Flexibility in developing and implementing workflow processes that reduces the IT bottleneck
- Inspection and monitoring of compliance with data quality and validation rules
- Inspection and monitoring of compliance with discrete approval and decisioning rules

About the Author

David Loshin, president of Knowledge Integrity, Inc. (www.knowledge-integrity.com), is a recognized thought leader and expert consultant in the areas of data quality, master data management, and business intelligence. David is a prolific author on BI best practices, via the expert channel at www.b-eye-network.com and numerous books and papers on BI and data quality. His book Business Intelligence: The Savvy Manager's Guide (June 2003) has been hailed as a resource that allows readers to gain an understanding of business intelligence, business management disciplines, data warehousing, and how all of the pieces work together. He is the author of Master Data Management, which has been endorsed by data management industry leaders, and the recently released The Practitioner's Guide to Data Quality Improvement, focusing on practical processes for improving information utility. Visit http://dataqualitybook.com for more insights on data quality, and http://mdmbook.com for thoughts on master data management. David can be reached at loshin@knowledge-integrity.com.

About Winshuttle

The best practices outlined in this paper offer broad guidelines for conceptualizing and planning a Master Data Governance program that safeguards the quality and integrity of data assets. The preceding pages have made clear that MDG is not solely or even primarily a technical process: it is a system of roles, rules, and rights that determine how people interact with data. Nonetheless, most organizations look to technology to implement and manage the policies and processes necessary for successful MDG. For users of SAP, Winshuttle provides tools for creating and implementing a best-practice-based MDG program.
The Winshuttle MDG solution focuses on solving the practical master data problems that most directly hinder organizational performance, including:

- Poor data quality: Winshuttle helps companies gain control over the extraction, creation, update, and maintenance of master data, both at the user level, with desktop tools that reduce errors and standardize execution of SAP data operations, and at the process level, with tools and services for creating workflows that ensure compliance with MDG policies.

- Tension between governance and flexibility: MDG is necessarily restrictive. But an MDG regime that can't adapt to the demands of a changing business environment can end up harming rather than aiding performance. Winshuttle's approach to MDG promotes rapid innovation and problem solving within a controlled framework. The Winshuttle Platform is a codeless development environment that allows analysts and non-programmers to safely author and modify solutions, ensuring that technology never becomes a barrier to MDG effectiveness.

- Incomplete process tracking: Winshuttle's logging and auditing capabilities ensure that data stewards and shared-service groups can document performance against SLAs and create benchmarks for continuous improvement.

For more information on Winshuttle's approach to MDG in an SAP environment, visit www.winshuttle.com or email info@winshuttle.com.

Winshuttle provides software products that improve how business users work with SAP. For customers who struggle with rigid, expensive, and inefficient processes that limit their ability to adapt to changing business conditions, Winshuttle has the solution. The Winshuttle Platform enables customers to build and adapt Excel- and SharePoint-based interactive forms and workflows for SAP without programming. Thousands of Winshuttle customers have radically accelerated SAP processes, saving and redirecting millions of dollars every day. Winshuttle supports customers worldwide from offices in the United States, United Kingdom, France, Germany, and India.

Corporate Headquarters
Bothell, WA
Tel +1 (800) 711-9798
Fax +1 (425) 527-6666
www.winshuttle.com

France
Maisons-Alfort, France
Tel +33 (0) 148 937 171
Fax +33 (0) 143 683 768
www.winshuttle.fr

United Kingdom
London, U.K.
Tel +44 (0) 208 704 4170
Fax +44 (0) 208 711 2665
www.winshuttle.co.uk

India Research & Development
Chandigarh, India
Tel +91 (0) 172 663 9800

Germany
Bremerhaven, Germany
Tel +49 (0) 471 140840
Fax +49 (0) 471 140849
www.winshuttle-software.de