Data Quality and Cost Reduction
A White Paper by David Loshin
Table of Contents

Introduction: Data Quality as a Cost-Reduction Technique
Understanding Expenses
Establishing the Connection: Data Flaws and Increased Costs
    Bids and Proposals
    Product Research and Development
    Human Resources
    Customer Relationship Management
    Overhead
Considerations: Common Data Failures
Data Quality Services to Reduce Costs
    Empirical Data Quality Assessment Using Data Profiling
    Entity Name Harmonization Using Parsing and Standardization
    Entity Record Consolidation Using Identity Resolution, Matching, and Linkage
    Address Standardization and Correction
    Establishing a Unified View Using Master Data Management
Summary
About the Author
Introduction: Data Quality as a Cost-Reduction Technique

There is no arguing that information technology (IT) is typically a cost center in which many different types of operating costs are incurred, accumulated and offset against organizational revenues and profits. As a core component of IT, the staff, hardware, software and support dedicated to data management are often directly accountable for many of those costs. Although data management contributes to corporate expenses, there are many opportunities to apply best practices in data quality management to reduce those expenses.

There are different aspects to the meaning of cost reduction. To some, it focuses solely on eliminating or reducing operational expenses. But reduced costs are also often linked to increased efficiencies in day-to-day operations, as well as improved performance for revenue-generating activities. In other words, within the scope of a performance-oriented organization, data quality management can be used to seek operational efficiencies for cost reduction, leading to increased margins and, consequently, increased profits.

This paper reviews aspects of cost reduction by examining some typical financial accounting expense categories. It then selects specific examples, assesses their reliance on high-quality data, and looks at how data quality services can be applied in those examples to reduce expenses. Lastly, it reiterates the potential for managing data quality as a way to control and reduce organizational expenses.

Understanding Expenses

The objective of a program to identify operational efficiency may be to take a slash-and-burn approach to reducing expenses. However, eliminating staff or services necessary to keep the business running will increase the workload for the remaining staff members, while reducing their effectiveness and, ultimately, detracting from the employee experience.
This cycle often leads to increased turnover and an overall reduction in organizational knowledge. Reducing expenses is more about being smart: understanding where costs have exceeded reasonable levels and determining ways to identify and eliminate the excess. And this is where good data quality comes in: if the source of waste can be attributed to successful business operations that are nonetheless negatively affected by poor data quality, then improving data quality will help identify ways to streamline processes and reduce costs. More importantly, applying these techniques during weaker economic times will train people to work smarter in the future, helping the organization improve competitiveness and agility during economic recovery and expansion.

The challenge for the data professional lies not in the knowledge of good data management techniques, but in understanding the financial language that describes how money is spent to run a business, usually encapsulated within the finance department's chart of accounts. This chart lists the channels through which money flows into and out of the organization.
In most cases, a business spends money as a prerequisite to making money. That spending is broken into standard categories, such as:

- Direct costs: labor, materials or subcontractor costs associated with fulfilling contractual obligations.
- General overhead and administrative (GOA) costs: rent, maintenance, asset purchase, asset utility, licensing, utilities, administrative staff and general procurement.
- Staff overhead, including the staff necessary to run the business: clerical, sales management, field supervision, bids and proposals (B&P), recruiting and training.
- Business overhead: bank fees, service charges, commissions, legal fees, accounting, penalties and fines, bad debt, merger and acquisition costs.
- Cost of goods sold (COGS), the costs associated with creating and selling a product: design of the products, raw materials, production, cost of inventory, inventory planning, marketing, sales, customer management, advertising, lead generation, promotional events, samples, customer acquisition, customer retention, order replacement, order fulfillment and shipping, among others.

With a better understanding of the accounting concepts associated with how the organization spends money, it's easier to reframe cost reduction away from simply slashing the budget toward actually creating opportunities. For example, cutting the costs associated with selling the product may result in selling fewer items, so that might not be the wisest choice. But examining the supply chain to reduce held inventory may free up cash so that money becomes available for other activities. This is a good example for two reasons. First, it provides a reasonable context for cost reduction as a way of improving a business process. Second, it is a situation that can be affected by data flaws (e.g., actual held inventory may be much less than what the inventory systems tell you).
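As a minimal illustration of how such an inventory data flaw might be surfaced, the sketch below compares system-of-record quantities against physical cycle counts and flags items whose records have drifted beyond a tolerance. The SKUs, quantities and tolerance are hypothetical, chosen only to make the idea concrete:

```python
# Hypothetical inventory figures: system-of-record quantity vs. physical cycle count.
system_inventory = {"SKU-1001": 480, "SKU-1002": 120, "SKU-1003": 75}
cycle_counts     = {"SKU-1001": 450, "SKU-1002": 64,  "SKU-1003": 75}

def inventory_discrepancies(system, counted, tolerance=0.05):
    """Return SKUs whose recorded quantity differs from the counted
    quantity by more than the given relative tolerance."""
    flagged = {}
    for sku, recorded in system.items():
        actual = counted.get(sku, 0)
        if recorded and abs(recorded - actual) / recorded > tolerance:
            flagged[sku] = (recorded, actual)
    return flagged

print(inventory_discrepancies(system_inventory, cycle_counts))
```

Items flagged this way are candidates for root-cause analysis: each one represents carrying costs, reorder decisions and proposals being computed from numbers that do not reflect reality.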
Establishing good data management and data governance practices, such as identifying and eliminating data errors, will ultimately help prevent inefficiencies from creeping into the system in the first place.

Establishing the Connection: Data Flaws and Increased Costs

This last statement establishes the role data management plays in many cost-saving initiatives. Despite the perceived tenuous connection between data quality management and the bottom line, business processes may fail to operate at peak efficiency due to poor data quality. Assessing the quality of data is a critical first step. If opportunities for improvement depend on access to high-quality data, then improving data quality management becomes a necessity for analyzing and optimizing operational activities. To illustrate, let's consider some common scenarios in which data issues negatively affect business expenses.
Bids and Proposals

Competitive sourcing requires that the actual costs of delivering a manufactured product, a service or a combination of the two be determined. Once those costs are evaluated, the proposal cost is determined by adding an acceptable margin on top of the cost. The bids and proposals (B&P) team then considers the context: whether the proposal is within a range that would be acceptable to the customer, and any potential aggravating circumstances that could put the project at risk.

The B&P process needs reliable data to understand the operational costs for delivery, such as the cost of raw materials, manufacturing and just-in-time inventory requirements. In turn, analyzing the actual costs attributable to the customer (from estimates of direct costs and materials to the logistics of delivering materials to the right places at the right times) depends on a level of precision and detail that can be derailed if the right data is missing. Better data means better predictions for proposed projects, which means lower direct costs. Reduced costs and more efficient project planning increase the likelihood of delivering results early and under budget, thereby increasing the margin.

Product Research and Development

Businesses often depend on a cyclical understanding of the relationships between their customers and the products those customers buy (or don't buy) in order to plan, design and market the next iteration of products and services. Knowing which customers buy which products greatly influences the next design cycle. High-quality transaction data about customers (and customer profiles) and their product purchases must be available, because invalid or incorrect data will skew the perception of customer-product affinity. Designers must also consider which raw materials and components are necessary for any proposed product design, since any new ones require additional sourcing and procurement.
However, large organizations with different design teams may not be aware of each other's plans, which can lead to inefficiencies in the design process:

- Different teams may attempt to purchase the same components from different suppliers and end up paying different prices.
- Different teams may attempt to purchase the same components from the same suppliers and end up paying different prices (i.e., out-of-contract or maverick spending).
- Some teams may end up building their own components while others acquire them from suppliers.

High-quality data will narrow the focus of new product design and, consequently, reduce the costs of developing, marketing and managing product lines that are not successful. In addition, sharing master data about available materials and components will reduce the costs and time associated with new product design. The result is a more focused line of goods with lower operational costs for design.
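A first pass at detecting this kind of maverick spending can work directly from purchasing records: group purchases by component and flag any component bought at more than one unit price. The sketch below uses hypothetical component names, suppliers and prices purely for illustration:

```python
from collections import defaultdict

# Hypothetical purchase records: (component, supplier, unit_price).
purchases = [
    ("resistor-10k", "Acme Components", 0.04),
    ("resistor-10k", "Acme Components", 0.07),  # same supplier, different price
    ("resistor-10k", "Beta Supply",     0.05),  # same part, different supplier
    ("enclosure-a3", "Gamma Plastics",  2.10),
]

def price_variances(records):
    """Group purchases by component and report those bought at
    more than one distinct unit price."""
    by_component = defaultdict(set)
    for component, supplier, price in records:
        by_component[component].add((supplier, price))
    return {c: sorted(v) for c, v in by_component.items()
            if len({price for _, price in v}) > 1}

for component, offers in price_variances(purchases).items():
    print(component, offers)
```

In practice this only works once supplier and component records have been consolidated; if "Acme Components" appears under three spellings, the grouping silently fails, which is exactly the identity resolution problem discussed later in this paper.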
Human Resources

Larger companies often grow by acquisition, but each acquired company may have its own organizational resource and staffing systems for managing employees. Inconsistency in capturing an enterprisewide view of employees exposes certain risks and issues. One significant issue is managing talent and skills so that employees gain the additional skills they need during their careers. But incomplete or incorrect information about employees may lead to increased hiring costs and turnover. For example, the need for an employee with a specific skill set may arise. If the staffing personnel don't know that there are people within the organization who possess that skill set, they may recruit a new employee for that role. As a result, existing staff members may not be used to their full potential, while recruitment and hiring costs increase. Additionally, employees passed over for a plum spot may become disenfranchised, leading to increased turnover, which again increases recruiting and hiring costs. These increased costs are attributable to the lack of high-quality information about employees, but they can be reduced if employee information is managed better.

Customer Relationship Management

Poor customer data quality can wreak havoc with any kind of customer relationship management program. A common issue, such as inadvertent duplication of customer records, can obscure the full view of any one customer's interactions with the organization. Once a customer loses trust in the company's ability to serve its customer base, the costs for customer retention, the requirements for sales and marketing, and customer attrition rates all go up. If the costs for new customer acquisition outweigh those of customer retention, then failing to stem customer churn means that the average cost of doing business with each customer increases, with a proportional increase in spending.
For instance, a simple exercise of identifying and eliminating duplicate customer data can help maintain a complete view of the customer and potentially increase retention. In this way, improved data quality leads to reduced COGS.

Overhead

All organizations must incur overhead: the indirect expenses that are necessary to run the business but don't contribute directly to its profitability. While the costs of employees working on client projects are direct costs, other costs that are necessary so those employees can do their jobs (office rent, telephone services, electricity and liability insurance) are overhead. Other examples of overhead include non-reimbursable travel expenses, postage, office supplies, professional services (such as accounting), office equipment leases and taxes.
As the organization grows, so does the need to manage overhead, especially when there are few controls over spending. Often people in one part of the company are unaware of what others are doing when it comes to engaging vendors, negotiating prices and managing the relationships with different suppliers. Multiple, unsynchronized supplier systems compound the problem. When the same suppliers appear multiple times across different systems, they benefit from establishing many relationships, contracts, prices and so forth. In addition, the same types of products and services may be purchased at different prices from different vendors.

Spending analysis is a process that helps evaluate these potential sources of overhead leakage. Yet the inability to determine that two different records refer to the same vendor makes it very difficult to get a comprehensive view of all supplier relationships. Once supplier records have been consolidated into a unified view, the company can better analyze the ways it interacts with those suppliers.

Considerations: Common Data Failures

While the reasons for increased costs in each of these examples differ, there are some commonalities: generic data issues that occur with a degree of regularity and affect the organization's expenses. Essentially, each issue involves common data failures associated with defined dimensions of data quality:

- Completeness: Information necessary to complete transactions, to perform operations, or to make the proper decisions within the business context is missing or defaulted to an unusable value.
- Accuracy: Information that is inaccurate or is not at the proper level of precision leads to increased reconciliation and manual intervention, thereby reducing throughput and increasing the completion time.
- Uniqueness: Variance in party or entity data (particularly customer, supplier, employee, product and materials data) affects inventory, staffing and effective customer management, leading to increased effort and leakage.

These types of data failures magnify gaps in business processes, but assessing those failures and eliminating their root causes will remove the barriers to analyzing corporate spending and lead to greater efficiencies.

Data Quality Services to Reduce Costs

From an operational standpoint, costs can be reduced using the following data quality management processes and techniques:

- Data quality assessment using data profiling.
- Entity name harmonization and standardization.
- Entity record consolidation using identity resolution, matching and linkage.
- Address standardization and correction.
- Establishing a unified view using master data management.
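The first of these techniques is surprisingly lightweight to prototype. The sketch below (plain Python; the records and field names are hypothetical) computes, for one column, a completeness ratio and a value-frequency distribution, which is the kind of empirical evidence a profiling review starts from:

```python
from collections import Counter

# Hypothetical supplier records exhibiting typical quality problems.
records = [
    {"name": "Acme Corp",  "state": "MD", "phone": "301-555-0101"},
    {"name": "ACME Corp.", "state": "MD", "phone": None},
    {"name": "Beta LLC",   "state": "md", "phone": "410-555-0199"},
]

def profile(rows, column):
    """Return (completeness ratio, value-frequency distribution) for one column."""
    values = [row.get(column) for row in rows]
    present = [v for v in values if v not in (None, "")]
    return len(present) / len(values), Counter(present)

completeness, frequencies = profile(records, "phone")
print(f"phone completeness: {completeness:.2f}")  # 2 of 3 values populated
print(profile(records, "state")[1])               # 'MD' vs. 'md' variants surface here
```

Even this toy profile exposes a completeness gap (the missing phone number) and a uniqueness problem (case variants of the same state code); commercial profiling tools extend the same idea with cross-column and inter-table analysis.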
Empirical Data Quality Assessment Using Data Profiling

If poor data quality creates barriers to comprehensive spending analysis, then the first steps to breaking through those barriers are identifying, evaluating and prioritizing potential data anomalies. Empirical analysis of the data provides the first insights that can motivate cost reduction, and that analysis is typically performed using data profiling.

Assessing data quality is a process of analysis and discovery requiring an objective review of the actual data values. Those values are assessed using statistical analysis of data value frequencies, followed by analyst review. This review helps pinpoint instances of flawed data and identifies potential anomalies that are impeding optimal business processes and increasing expenses.

Data profiling uses a set of algorithms for statistical analysis and assessment of the quality of data values within a data set, and helps explore relationships that exist within and across data sets. For each column in a table, a data profiling tool will provide a frequency distribution of the different values, shedding light on the type and use of each column. Cross-column analysis can expose embedded value dependencies, while inter-table analysis explores overlapping value sets that may represent foreign key relationships between entities. In this way, profiling can analyze and assess anomalies and build the groundwork for data improvement efforts.

Entity Name Harmonization Using Parsing and Standardization

When analysts are able to describe the different component and format patterns used to represent a data object (person name, supplier name, raw material, employee name, product description, etc.), data quality tools can parse data values that conform to any of those patterns and even transform them into a single, normalized format.
Parsing uses defined patterns managed within a rules engine to distinguish valid from invalid data values. When patterns are recognized, other rules and actions can be triggered, either to standardize the representation (presuming a valid representation) or to correct the values (if known errors are identified). Automated pattern-based parsing will recognize variant string tokens and subsequently reorder those tokens into a standardized format.

Entity Record Consolidation Using Identity Resolution, Matching, and Linkage

The challenge of entity identification and resolution is twofold. Sometimes multiple data instances actually refer to the same real-world entity, while other times it may appear that a record does not exist for an entity when, in fact, it does. Both conditions stem from the same fundamental issue. In the first situation, similar, yet slightly variant, representations of data values have been inadvertently introduced into the system. In the second, a slight variation in representation prevents the identification of an exact match for the existing record in the data set.
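Both situations come down to scoring similarity between records rather than demanding exact equality. The sketch below uses the Python standard library's difflib for string similarity; the attribute weights and the two thresholds are illustrative assumptions, not prescribed values, and real matching engines use far more sophisticated comparators:

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Approximate string similarity in [0, 1], case-insensitive."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def match_score(rec1, rec2, weights):
    """Weighted similarity across a set of mapped attributes."""
    total = sum(weights.values())
    return sum(w * similarity(rec1[f], rec2[f]) for f, w in weights.items()) / total

def classify(score, match_at=0.90, nonmatch_at=0.60):
    """Above the upper threshold: match. Below the lower: non-match.
    In between: route to a subject-matter expert for review."""
    if score >= match_at:
        return "match"
    if score < nonmatch_at:
        return "non-match"
    return "review"

weights = {"name": 0.6, "city": 0.4}  # illustrative attribute weights
a = {"name": "Acme Corporation", "city": "Baltimore"}
b = {"name": "ACME Corp",        "city": "Baltimore"}
score = match_score(a, b, weights)
print(round(score, 2), classify(score))
```

The two-threshold design is the important part: it keeps clear matches and clear non-matches automatic while reserving scarce human review for genuinely ambiguous pairs.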
Both of these issues are addressed through identity resolution, another name for approximate matching and record linkage. In this process, the degree of similarity between any two records is scored using weighted approximate matching across a corresponding set of mapped attribute values. When the match score exceeds a specified upper threshold, the two records are assumed to be a match; if the score falls below a lower established threshold, the two records are assumed not to match. Scores that fall between the two thresholds are sent to subject-matter experts for review.

Identity resolution is used to recognize when slight variations suggest that different records are connected and where values may be cleansed. There are different approaches to matching: a deterministic approach relies on a broad knowledge base for matching, while a probabilistic approach uses statistical techniques that contribute to the weighting and similarity scoring process. Identifying similar records within the same data set usually means the records are duplicates that may be cleansed or removed. Identifying similar records in different sets may help resolve duplicated data across applications, thus eliminating variant representations of products, customers, employees or any other set of entities.

Address Standardization and Correction

Address data quality is critical for businesses that depend on the delivery of products to specific locations or on traditional marketing outreach to customers and prospects via mailings. Many countries have postal standards, and increasingly these standards are being used both for typical mailings and for cleansing and identity resolution. The United States Postal Service's standards are well-defined and provide an easy template for address correction. An address standardization process must incorporate:

- A method to validate whether an address is already in standard form.
- Parsing and standardization tools to identify each address token.
- A method to disambiguate data elements that might cause a conflict (e.g., directional words such as "West" or "NW" used as part of a street name, or "ST" as an abbreviation for either "street" or "saint").
- A method to map strings to standard abbreviations.
- Data normalization to reformulate the address tokens into a standard form.

This process builds on the earlier capabilities and integrates corrective actions to resolve inconsistent or extraneous address tokens. The result: a clean address and a lower likelihood of incurring repeat delivery costs after attempts to deliver to the wrong location.
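A minimal version of the mapping step can be sketched as a token-by-token pass over the address. The abbreviation tables below are a tiny illustrative subset of USPS-style standard abbreviations, and the disambiguation rule (only abbreviate a directional that follows a house number) is a deliberately crude stand-in for the positional logic a real standardizer needs:

```python
# Illustrative subset of USPS-style standard abbreviations.
SUFFIXES     = {"STREET": "ST", "AVENUE": "AVE", "BOULEVARD": "BLVD", "ROAD": "RD"}
DIRECTIONALS = {"NORTH": "N", "SOUTH": "S", "EAST": "E", "WEST": "W", "NORTHWEST": "NW"}

def standardize(address):
    """Uppercase, tokenize, and map each token to its standard abbreviation.
    A real process must also disambiguate tokens ('ST' as street vs. saint,
    'West' as directional vs. street name), which needs richer rules."""
    tokens = address.upper().replace(",", " ").split()
    out = []
    for i, tok in enumerate(tokens):
        if tok in SUFFIXES:
            out.append(SUFFIXES[tok])
        elif tok in DIRECTIONALS and i > 0 and tokens[i - 1].isdigit():
            # Crude disambiguation: abbreviate only pre-directionals that
            # immediately follow a house number.
            out.append(DIRECTIONALS[tok])
        else:
            out.append(tok)
    return " ".join(out)

print(standardize("123 North Main Street"))
```

Note how the positional guard leaves "Northwest" alone in a name like "Northwest Plaza" while still abbreviating it after a house number; that distinction is exactly the conflict the bullet list above calls out.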
Establishing a Unified View Using Master Data Management

Not only are identity resolution failures among the most common data issues and pervasive across all industries, they also carry the greatest business impact, especially when customer data is used both operationally and analytically. By applying these data quality techniques together, you can configure an environment with a unified view of the customer data (or product data, employee data, or any master data object) that is currently managed by different applications using different data stores across the enterprise.

Master data objects are the core business objects used in different applications across the organization, along with their associated metadata, attributes, definitions, roles, connections and taxonomies. They are the things upon which successful business operations rely: the things that are logged in our transaction systems, drive operations, are measured and reported on in our reporting systems, and are analyzed in our analytical systems.

Master data management (MDM) is essentially a data quality management program that incorporates the business applications, methods and tools to implement the policies, procedures and infrastructure supporting the capture, integration and subsequent shared use of accurate, timely, consistent and complete master data. While the scope of the infrastructure to implement MDM may be relatively broad, it cannot be executed without data quality techniques. For example, to create a unified view of the customer, it is necessary to find all records associated with a unique entity across the different systems in which they reside. Variance in representations, spelling and formats means that the unification process cannot be limited to exact duplicate matching.
Instead, it relies on parsing, standardization and normalization of name strings, as well as approximate matching and linkage, to collate identifying attributes and to connect similar records. The MDM architecture then allows similar records to be mapped into a single virtual representation. Extending the process allows you to map customers to products, or employees to skills, which is the first step in building the ever-elusive 360-degree view. This unified view can help reduce costs for sales and marketing, improve customer self-service, and reduce recruitment and hiring costs.

Summary

Organizations that rely on information to successfully run their businesses should be aware of the ways that flawed data can increase costs. When times are tough and the boss is looking to reduce expenses, or when times are good and management is seeking greater margins and increased profits, having a process to establish reliable data will reveal opportunities where low investments can lead to high returns.
Whether you seek to reduce COGS, overhead, general and administrative costs, or even direct costs, be prepared to see how data errors wreak havoc with the desired results. Data errors are not uncommon, and there are well-defined approaches to analyzing their root causes and eliminating their sources. Applying data quality best practices to address commonly occurring data issues will ultimately alleviate the pain those errors cause.

About the Author

David Loshin, president of Knowledge Integrity Inc., is a recognized thought leader and expert consultant in the areas of data quality, master data management and business intelligence. A prolific author, Loshin has written numerous books, white papers and Web seminars on data management best practices. His book Business Intelligence: The Savvy Manager's Guide has been hailed as a resource for understanding business intelligence, business management disciplines, data warehousing and how all of the pieces work together. His book Master Data Management has been endorsed by data management industry leaders, and his MDM insights can be reviewed at mdmbook.com. Loshin is also the author of The Practitioner's Guide to Data Quality Improvement. He can be reached at loshin@knowledge-integrity.com.
About SAS

SAS is the leader in business analytics software and services, and the largest independent vendor in the business intelligence market. Through innovative solutions, SAS helps customers at more than 65,000 sites improve performance and deliver value by making better decisions faster. Since 1976, SAS has been giving customers around the world THE POWER TO KNOW.

SAS Institute Inc. World Headquarters. To contact your local SAS office, please visit: sas.com/offices

SAS and all other SAS Institute Inc. product or service names are registered trademarks or trademarks of SAS Institute Inc. in the USA and other countries. ® indicates USA registration. Other brand and product names are trademarks of their respective companies. Copyright 2013, SAS Institute Inc. All rights reserved.
i I I I THE PRACTITIONER'S GUIDE TO DATA QUALITY IMPROVEMENT DAVID LOSHIN ELSEVIER AMSTERDAM BOSTON HEIDELBERG LONDON NEW YORK OXFORD PARIS SAN DIEGO SAN FRANCISCO SINGAPORE SYDNEY TOKYO Morgan Kaufmann
More informationThe Data Quality Business Case: Projecting Return on Investment
WHITE PAPER The Data Quality Business Case: Projecting Return on Investment with David Loshin This document contains Confidential, Proprietary and Trade Secret Information ( Confidential Information )
More informationChallenges in the Effective Use of Master Data Management Techniques WHITE PAPER
Challenges in the Effective Use of Master Management Techniques WHITE PAPER SAS White Paper Table of Contents Introduction.... 1 Consolidation: The Typical Approach to Master Management. 2 Why Consolidation
More informationData Governance. Data Governance, Data Architecture, and Metadata Essentials Enabling Data Reuse Across the Enterprise
Data Governance Data Governance, Data Architecture, and Metadata Essentials Enabling Data Reuse Across the Enterprise 2 Table of Contents 4 Why Business Success Requires Data Governance Data Repurposing
More informationEMC PERSPECTIVE Enterprise Data Management
EMC PERSPECTIVE Enterprise Data Management Breaking the bad-data bottleneck on profits and efficiency Executive overview Why are data integrity and integration issues so bad for your business? Many companies
More informationSupercharge Salesforce.com initiatives with a 360-degree view of the customer
IBM Software Thought Leadership White Paper Information Management Supercharge Salesforce.com initiatives with a 360-degree view of the customer 2 Supercharge Salesforce.com initiatives with a 360-degree
More information10426: Large Scale Project Accounting Data Migration in E-Business Suite
10426: Large Scale Project Accounting Data Migration in E-Business Suite Objective of this Paper Large engineering, procurement and construction firms leveraging Oracle Project Accounting cannot withstand
More informationNext-Generation IT Asset Management: Transform IT with Data-Driven ITAM
Sponsored by Next-Generation IT Asset Management: In This Paper IT Asset Management, one of the key pillars of IT, is currently highly siloed from related and dependent functions Next-generation ITAM provides
More informationMake your CRM work harder so you don t have to
September 2012 White paper Make your CRM work harder so you don t have to 1 With your CRM working harder to deliver a unified, current view of your clients and prospects, you can concentrate on retaining
More informationHow To Improve Product Data Quality
Three Critical Steps to Improving Product Data Quality A DataFlux White Paper Prepared by Jim Harris Introduction Convincing your organization to view data as a strategic corporate asset and, by extension,
More informationAddressing the Challenges of Data Governance
Debbie Schmidt FIS Consulting Services www.fisglobal.com Executive Summary Addressing the Challenges of Sound bank management ceases to exist without reliable, accurate information. This paper will explore
More informationThe Business Case for Information Management An Oracle Thought Leadership White Paper December 2008
The Business Case for Information Management An Oracle Thought Leadership White Paper December 2008 NOTE: The following is intended to outline our general product direction. It is intended for information
More informationCHAPTER SIX DATA. Business Intelligence. 2011 The McGraw-Hill Companies, All Rights Reserved
CHAPTER SIX DATA Business Intelligence 2011 The McGraw-Hill Companies, All Rights Reserved 2 CHAPTER OVERVIEW SECTION 6.1 Data, Information, Databases The Business Benefits of High-Quality Information
More informationContent is essential to commerce
Content is essential to commerce IBM ECM helps organizations improve the efficiency of buy, market, sell and service processes Highlights: Analyze customer and operational data and build business processes
More informationInformatica Master Data Management
Informatica Master Data Management Improve Operations and Decision Making with Consolidated and Reliable Business-Critical Data brochure The Costs of Inconsistency Today, businesses are handling more data,
More informationGlobal Headquarters: 5 Speen Street Framingham, MA 01701 USA P.508.872.8200 F.508.935.4015 www.idc.com
WHITE PAPER Channel Sales Management: Beyond CRM Sponsored by: BlueRoads Mary Wardley October 2006 INTRODUCTION Global Headquarters: 5 Speen Street Framingham, MA 01701 USA P.508.872.8200 F.508.935.4015
More informationORACLE SERVICES PROCUREMENT
ORACLE SERVICES PROCUREMENT KEY FEATURES Create contracts for services with complex payment terms Incorporate progress payment schedule into services contracts Track and report progress based on schedule
More informationThree proven methods to achieve a higher ROI from data mining
IBM SPSS Modeler Three proven methods to achieve a higher ROI from data mining Take your business results to the next level Highlights: Incorporate additional types of data in your predictive models By
More informationSupply Chain Management 100 Success Secrets
Supply Chain Management 100 Success Secrets Supply Chain Management 100 Success Secrets - 100 Most Asked Questions: The Missing SCM Software, Logistics, Solution, System and Process Guide Lance Batten
More informationA Simple Guide to Material Master Data Governance. By Keith Boardman, Strategy Principal
A Simple Guide to Material Master Data Governance By Keith Boardman, Strategy Principal DATUM is an Information Management solutions company focused on driving greater business value through data. We provide
More informationThe Evolving Role of Process Automation and the Customer Service Experience
The Evolving Role of Process Automation and the Customer Service Experience Kyle Lyons Managing Director Ponvia Technology Gina Clarkin Product Manager Interactive Intelligence Table of Contents Executive
More informationB2B E-Commerce Solutions Empower Wholesale Distributors
SAP Thought Leadership Paper Wholesale Distribution B2B E-Commerce Solutions Empower Wholesale Distributors Achieve Interaction Excellence with Outstanding Online Experiences and High-Quality Digital Content
More informationMaking Business Intelligence Easy. Whitepaper Measuring data quality for successful Master Data Management
Making Business Intelligence Easy Whitepaper Measuring data quality for successful Master Data Management Contents Overview... 3 What is Master Data Management?... 3 Master Data Modeling Approaches...
More informationVehicle Sales Management
Solution in Detail Automotive Executive Summary Contact Us Vehicle Sales Optimizing Your Wholesale Business Efficient Sales Collaborative Operation Faced with declining margins, automotive sales organizations
More informationSALES AND OPERATIONS PLANNING BLUEPRINT BUSINESS VALUE GUIDE
Business Value Guide SALES AND OPERATIONS PLANNING BLUEPRINT BUSINESS VALUE GUIDE INTRODUCTION What if it were possible to tightly link sales, marketing, supply chain, manufacturing and finance, so that
More informationData Governance. David Loshin Knowledge Integrity, inc. www.knowledge-integrity.com (301) 754-6350
Data Governance David Loshin Knowledge Integrity, inc. www.knowledge-integrity.com (301) 754-6350 Risk and Governance Objectives of Governance: Identify explicit and hidden risks associated with data expectations
More informationData Governance: A Business Value-Driven Approach
Data Governance: A Business Value-Driven Approach A White Paper by Dr. Walid el Abed CEO January 2011 Copyright Global Data Excellence 2011 Contents Executive Summary......................................................3
More informationEnterprise Data Quality
Enterprise Data Quality An Approach to Improve the Trust Factor of Operational Data Sivaprakasam S.R. Given the poor quality of data, Communication Service Providers (CSPs) face challenges of order fallout,
More informationRole of Analytics in Infrastructure Management
Role of Analytics in Infrastructure Management Contents Overview...3 Consolidation versus Rationalization...5 Charting a Course for Gaining an Understanding...6 Visibility into Your Storage Infrastructure...7
More informationAccenture Federal Services. Federal Solutions for Asset Lifecycle Management
Accenture Federal Services Federal Solutions for Asset Lifecycle Management Assessing Internal Controls 32 Material Weaknesses: identified in FY12 with deficiencies noted in the management of nearly 75%
More informationMANAGING THE REVENUE CYCLE WITH BUSINESS INTELLIGENCE: June 30, 2006 BUSINESS INTELLIGENCE FOR HEALTHCARE
MANAGING THE REVENUE CYCLE WITH BUSINESS INTELLIGENCE: June 30, 2006 BUSINESS INTELLIGENCE FOR HEALTHCARE Hospital manager and leadership positions face many challenges in today s healthcare environment
More informationOracle Business Intelligence Applications Overview. An Oracle White Paper March 2007
Oracle Business Intelligence Applications Overview An Oracle White Paper March 2007 Note: The following is intended to outline our general product direction. It is intended for information purposes only,
More informationDATA QUALITY MATURITY
3 DATA QUALITY MATURITY CHAPTER OUTLINE 3.1 The Data Quality Strategy 35 3.2 A Data Quality Framework 38 3.3 A Data Quality Capability/Maturity Model 42 3.4 Mapping Framework Components to the Maturity
More informationCustomer Master Data: Common Challenges and Solutions
Customer Master Data: Common Challenges and Solutions By Will Crump President, DATUM LLC Executive Summary Master data within an enterprise is typically segmented by domain, or a category of related data
More informationGovernment Insights: Possible IT Budget Cuts
Tactical Guidelines, J. Kost Research Note 6 January 2003 Government Insights: Possible IT Budget Cuts In tough economic times, government leaders pressure CIOs to reduce IT budgets. Although IT investments
More informationTapping the benefits of business analytics and optimization
IBM Sales and Distribution Chemicals and Petroleum White Paper Tapping the benefits of business analytics and optimization A rich source of intelligence for the chemicals and petroleum industries 2 Tapping
More informationConnecting data initiatives with business drivers
Connecting data initiatives with business drivers TABLE OF CONTENTS: Introduction...1 Understanding business drivers...2 Information requirements and data dependencies...3 Costs, benefits, and low-hanging
More informationData Quality Assurance
CHAPTER 4 Data Quality Assurance The previous chapters define accurate data. They talk about the importance of data and in particular the importance of accurate data. They describe how complex the topic
More informationIncrease Business Intelligence Infrastructure Responsiveness and Reliability Using IT Automation
White Paper Increase Business Intelligence Infrastructure Responsiveness and Reliability Using IT Automation What You Will Learn That business intelligence (BI) is at a critical crossroads and attentive
More informationThe Butterfly Effect on Data Quality How small data quality issues can lead to big consequences
How small data quality issues can lead to big consequences White Paper Table of Contents How a Small Data Error Becomes a Big Problem... 3 The Pervasiveness of Data... 4 Customer Relationship Management
More information10 Fundamental Strategies and Best Practices of Supply Chain Organizations
10 Fundamental Strategies and Best Practices of Supply Chain Organizations Robert J. Engel, C.P.M. National Director of Client Service Resources Global Professionals - SCM Practice 713-403-1979: Bob.Engel@Resources-us.com
More informationThe SAS Transformation Project Deploying SAS Customer Intelligence for a Single View of the Customer
Paper 3353-2015 The SAS Transformation Project Deploying SAS Customer Intelligence for a Single View of the Customer ABSTRACT Pallavi Tyagi, Jack Miller and Navneet Tuteja, Slalom Consulting. Building
More informationIntegrating Data Governance into Your Operational Processes
TDWI rese a rch TDWI Checklist Report Integrating Data Governance into Your Operational Processes By David Loshin Sponsored by tdwi.org August 2011 TDWI Checklist Report Integrating Data Governance into
More informationHow To Audit A Company
INTERNATIONAL STANDARD ON AUDITING 315 IDENTIFYING AND ASSESSING THE RISKS OF MATERIAL MISSTATEMENT THROUGH UNDERSTANDING THE ENTITY AND ITS ENVIRONMENT (Effective for audits of financial statements for
More informationInfoGlobalData specialise in B2B Email Lists and Email Appending Services.
InfoGlobalData specialise in B2B Email Lists and Email Appending Services. We provide high quality mailing lists for your email marketing needs. Our data intelligence service can provide valuable insight
More informationMETA DATA QUALITY CONTROL ARCHITECTURE IN DATA WAREHOUSING
META DATA QUALITY CONTROL ARCHITECTURE IN DATA WAREHOUSING Ramesh Babu Palepu 1, Dr K V Sambasiva Rao 2 Dept of IT, Amrita Sai Institute of Science & Technology 1 MVR College of Engineering 2 asistithod@gmail.com
More informationENTERPRISE MANAGEMENT AND SUPPORT IN THE TELECOMMUNICATIONS INDUSTRY
ENTERPRISE MANAGEMENT AND SUPPORT IN THE TELECOMMUNICATIONS INDUSTRY The Telecommunications Industry Companies in the telecommunications industry face a number of challenges as market saturation, slow
More informationMDM Components and the Maturity Model
A DataFlux White Paper Prepared by: David Loshin MDM Components and the Maturity Model Leader in Data Quality and Data Integration www.dataflux.com 877 846 FLUX International +44 (0) 1753 272 020 One common
More informationThe Total Economic Impact Of SAS Customer Intelligence Solutions Intelligent Advertising For Publishers
A Forrester Total Economic Impact Study Commissioned By SAS Project Director: Dean Davison February 2014 The Total Economic Impact Of SAS Customer Intelligence Solutions Intelligent Advertising For Publishers
More informationAgile Master Data Management A Better Approach than Trial and Error
Agile Master Data Management A Better Approach than Trial and Error A whitepaper by First San Francisco Partners First San Francisco Partners Whitepaper Executive Summary Market leading corporations are
More informationData Governance: A Business Value-Driven Approach
Global Excellence Governance: A Business Value-Driven Approach A White Paper by Dr Walid el Abed CEO Trusted Intelligence Contents Executive Summary......................................................3
More informationValuable Metrics for Successful Supplier Management
Valuable Metrics for Successful Supplier Management Stringent global trade and product safety regulations have been increasingly enforced, driving organizations to seek absolute visibility across the supply
More informationBusiness Drivers for Data Quality in the Utilities Industry
Solutions for Enabling Lifetime Customer Relationships. Business Drivers for Data Quality in the Utilities Industry Xxxxx W HITE PAPER: UTILITIES WHITE PAPER: UTILITIES Business Drivers for Data Quality
More informationThis paper looks at current order-to-pay challenges. ECM for Order-to-Pay. Maximize Operational Excellence
J A N U A R Y 2 0 1 4 ECM for Order-to-Pay Maximize Operational Excellence This paper looks at current order-to-pay challenges and trends; what organizations should consider in an order-to-pay solution;
More informationThe Cost of Duplicate Data in Enterprise Content Management
The Cost of Duplicate Data in Enterprise Content Management Sheryl Arnold Partner/CTO Introduction Duplicate records in databases and product information are a serious issue for data and content management
More informationIBM Tivoli Netcool network management solutions for enterprise
IBM Netcool network management solutions for enterprise The big picture view that focuses on optimizing complex enterprise environments Highlights Enhance network functions in support of business goals
More informationDataFlux Data Management Studio
DataFlux Data Management Studio DataFlux Data Management Studio provides the key for true business and IT collaboration a single interface for data management tasks. A Single Point of Control for Enterprise
More informationHow CRM Software Benefits Insurance Companies
How CRM Software Benefits Insurance Companies Salesboom.com Currently, the Insurance Industry is in a state of change where today's insurance field is becoming extremely complex and more competitive. As
More informationhow can you stop sprawl in your IT infrastructure?
SOLUTION BRIEF Software Rationalization Services August 2010 how can you stop sprawl in your IT infrastructure? we can You can optimize your software portfolio with Software Rationalization Services from
More informationInformation Systems in the Enterprise
Chapter 2 Information Systems in the Enterprise 2.1 2006 by Prentice Hall OBJECTIVES Evaluate the role played by the major types of systems in a business and their relationship to each other Describe the
More informationWhat to Look for When Selecting a Master Data Management Solution
What to Look for When Selecting a Master Data Management Solution What to Look for When Selecting a Master Data Management Solution Table of Contents Business Drivers of MDM... 3 Next-Generation MDM...
More informationMeasure Your Data and Achieve Information Governance Excellence
SAP Brief SAP s for Enterprise Information Management SAP Information Steward Objectives Measure Your Data and Achieve Information Governance Excellence A single solution for managing enterprise data quality
More informationAn Enterprise Resource Planning Solution for Mill Products Companies
SAP Thought Leadership Paper Mill Products An Enterprise Resource Planning Solution for Mill Products Companies Driving Operational Excellence and Profitable Growth Table of Contents 4 What It Takes to
More informationEnterprise Content Management for Procurement
Enterprise Content Management for Procurement Extending SAP capabilities is a key aspect of advanced Enterprise Content Management Today s procurement departments need extended content management solutions,
More informationGetting a head start in Software Asset Management
Getting a head start in Software Asset Management Managing software for improved cost control, better security and reduced risk A guide from Centennial Software September 2007 Abstract Software Asset Management
More informationThe Total Economic Impact Of SAS Customer Intelligence Solutions Real-Time Decision Manager
A Forrester Total Economic Impact Study Commissioned By SAS Project Director: Dean Davison May 2014 The Total Economic Impact Of SAS Customer Intelligence Solutions Real-Time Decision Manager Table Of
More informationThought Leadership White Paper. Consolidate Job Schedulers to Save Money
Thought Leadership White Paper Consolidate Job Schedulers to Save Money Table of Contents 1 EXECUTIVE SUMMARY 2 INCREASED BUSINESS FOCUS 2 LOWER TCO 3 GREATER AGILITY 3 IMPROVED COMPLIANCE 3 REAL-WORLD
More informationHIGH PRECISION MATCHING AT THE HEART OF MASTER DATA MANAGEMENT
HIGH PRECISION MATCHING AT THE HEART OF MASTER DATA MANAGEMENT Author: Holger Wandt Management Summary This whitepaper explains why the need for High Precision Matching should be at the heart of every
More informationIn this chapter, we build on the basic knowledge of how businesses
03-Seidman.qxd 5/15/04 11:52 AM Page 41 3 An Introduction to Business Financial Statements In this chapter, we build on the basic knowledge of how businesses are financed by looking at how firms organize
More information