Data Quality Assessment Approach
Prepared By: Sanjay Seth
Data Quality Assessment Approach-Review.doc Page 1 of 15
Introduction

Data quality is crucial to the success of Business Intelligence initiatives. Unless the data in source systems is accurate and reliable, BI users spend more effort on manual activities and rework than on business-related activities. To improve the quality of data, many companies initiate data quality assessment programs and form data stewardship groups. Yet in the absence of a comprehensive methodology, measuring data quality remains an elusive concept: it proves easier to produce hundreds or thousands of data error reports than to make any sense of them.

The purpose of this document is to provide a methodology for managing data quality issues. The process is described in the order the events should occur, from the initial capture of data quality issues to presenting the findings to the data owners for further action.

Data quality can be defined as the state of completeness, consistency, timeliness and accuracy that makes data appropriate for a specific use. The dimensions of data quality are:

- Accuracy: facts/dimensions are loaded correctly
- Completeness: all relevant data is stored
- Consistency: data is stored in a uniform format
- Timeliness: data is stored within the required time frame

Thus data quality should ensure that the data loaded into the target destination is timely, accurate, complete and consistent.

Data quality issues result from:

- Incorrect manual entry of data in the source system (the main cause)
- Lack of common data standards across business divisions when integrating data from multiple data sources while loading a data warehouse
- Lack of a proper business process. In some cases, root cause analysis of a data quality issue may point to a business process that must be re-designed to mitigate the issue.
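The dimensions above can be expressed as simple record-level checks. The sketch below is a minimal illustration of the completeness, consistency and timeliness dimensions, assuming a hypothetical customer record layout (field names are invented for the example):

```python
from datetime import date, timedelta

# Hypothetical customer record; field names are illustrative only.
record = {"customer_id": "C001", "zip_code": "30339",
          "email": "a.smith@example.com", "load_date": date.today()}

def is_complete(rec, required):
    # Completeness: all relevant attributes are present and non-empty.
    return all(rec.get(f) not in (None, "") for f in required)

def is_consistent(rec):
    # Consistency: values follow the agreed storage format (5-digit zip).
    return rec["zip_code"].isdigit() and len(rec["zip_code"]) == 5

def is_timely(rec, max_age_days=1):
    # Timeliness: the record was loaded within the required time frame.
    return (date.today() - rec["load_date"]) <= timedelta(days=max_age_days)

print(is_complete(record, ["customer_id", "zip_code", "email"]))  # True
print(is_consistent(record))                                      # True
print(is_timely(record))                                          # True
```

Accuracy is deliberately omitted here: verifying that a value matches reality generally requires an external reference source rather than a rule on the record itself.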
The main sources of data quality issues, as reported in a TDWI survey, are categorized and ranked in Figure 1.

Figure 1: Sources of Data Quality Problems

Benefits of Improved Data Quality

Improved information quality yields benefits that can be categorized as either hard benefits or soft benefits.

Soft benefits are those that are evident and clearly affect productivity, yet are difficult to measure. These include:

- Building user confidence and trust in the data disseminated by the Data Warehouse solution: good data quality promotes use of the Data Warehouse.
- Improved throughput for volume processing: by reducing the delays associated with detecting and correcting data errors, and the rework associated with that correction, more transactions can be processed, resulting in greater volume processing and lower cost per transaction.
- Improved customer profiling: more compliant customer information allows the business intelligence process to provide more accurate customer profiling, which in turn can lead to increased sales, better customer service, and increased retention of valued customers.
- Decreased resource requirements: redundant data, correction, and rework put an unnecessary strain on an organization's resource pool. Eliminating redundant data and reducing the amount of rework relieves that strain and provides better resource allocation and utilization.

Hard benefits are those that can be estimated and/or measured. These include:

- Customer attrition, which occurs when a customer's reaction to poor data quality results in the customer's complete cessation of business
- Costs of error detection and correction: detection costs are incurred when a system error or processing failure occurs and a process is invoked to track down the problem; correction costs are associated with the actual correction of the problem as well as the restarting of any failed processes or activities. The time spent on the failed activity, along with extraneous employee activity, is rolled up into correction costs.
- Costs of data maintenance, i.e. maintaining spreadsheets to meet information requirements
- Extra resources needed to correct data problems
- Time and effort required to re-run jobs that abend
- Time wasted arguing over inconsistent reports
- Lost business opportunities due to unavailable data
- Fines paid for noncompliance with government regulations
- Shipping products to the wrong customers
- Bad public relations with customers, leading to alienated and lost customers

The Data Warehousing Institute (TDWI) estimates that poor quality customer data costs U.S. businesses a staggering $611 billion a year in postage, printing, and staff overhead. Organizations can frustrate and alienate loyal customers by incorrectly addressing letters or failing to recognize them when they call or visit a store or Web site. Once a company loses its loyal customers, it loses its base of sales and referrals, and future revenue potential.

These benefits can be realized by following the Data Quality Assessment Methodology described in the following section.

Data Quality Assessment Methodology

The Data Quality Assessment methodology consists of a five-stage process for assessing and improving the data quality of the solution being addressed:

1. Extract - Identify and analyze the source data being cleansed
2. Discover - Identify and understand the current data issues
3. Cleanse - Implement checks to rectify data errors
4. Monitor - Establish processes for regularly validating the data
5. Prevention - Fix the processes by which these data errors are introduced
Step 1, Extract: deploy automated profiling tools for accelerated data analysis. Step 2, Discover: understand current data issues; investigate data errors and identify the root cause of failure; quantify the gap between current and desired quality levels; prioritize data quality levels to meet objectives. Step 3, Cleanse: implement validation checks and business rules to detect data errors and ensure logical consistency of data; rectify erroneous data using ETL rules; manage data exceptions manually. Step 4, Monitor: monitor and identify data gaps and plan for maintaining and enhancing the quality of data. Step 5, Prevention: communicate errors and their impact to data providers.

Figure 2: Data Quality Methodology

1. Extract

Data profiling is the first step towards ensuring data quality. Data profiling is the assessment of data to understand its content, structure, quality and dependencies. It covers standard column analysis such as frequency, NULL checks, cardinality, etc., and this process will usually expose some of the more offending data quality problems. Some of the common methods in data profiling are listed below:

- Structure discovery: check whether the different patterns of data are valid, e.g. the format of zip codes, phone numbers, addresses, etc.
- Data discovery: is the data value correct, error-free and valid? E.g. check whether mandatory attributes contain incorrect data.
- Relationship discovery: verify that all key relationships are maintained and that an end-to-end linkage can be obtained, e.g. the link between a child and a parent table.
- Data redundancy: check whether the same data is represented multiple times.

The results of the profiling should be discussed with the client to determine which of the issues are related to business problems.
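The column-level analyses named above (frequency, NULL checks, cardinality, structure discovery) can be sketched in a few lines. This is a minimal illustration, not a profiling tool; the sample rows and the zip-code pattern are assumptions for the example:

```python
from collections import Counter
import re

# Illustrative sample rows; in practice these come from the source extract.
rows = [
    {"zip": "30339", "phone": "404-555-0100"},
    {"zip": "3033",  "phone": "404-555-0101"},
    {"zip": None,    "phone": "404.555.0102"},
    {"zip": "30339", "phone": "404-555-0100"},
]

def profile_column(rows, col, pattern):
    values = [r[col] for r in rows]
    non_null = [v for v in values if v is not None]
    return {
        "null_count": values.count(None),            # NULL check
        "cardinality": len(set(non_null)),           # distinct values
        "frequency": Counter(non_null),              # value frequencies
        "pattern_violations": [v for v in non_null   # structure discovery
                               if not re.fullmatch(pattern, v)],
    }

zip_profile = profile_column(rows, "zip", r"\d{5}")
print(zip_profile["null_count"])          # 1
print(zip_profile["pattern_violations"])  # ['3033']
```

The violation list, rather than just a count, is what gets discussed with the client: it shows which concrete values break the expected structure.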
Some examples of data profiling analysis:

- Completeness - What data is missing or unusable? (Null checks, uniqueness checks)
- Conformity - What data is stored in a non-standard format? (Standard code sets, rules)
- Consistency - What data values give conflicting information? (Relationship analysis)
- Accuracy - What data is incorrect or out of date? (Domain validation, range validation)
- Duplicates - What data records or attributes are repeated? (Redundancy evaluation)
- Integrity - What data is missing or not referenced? (Referential integrity, cardinality analysis)

Figure 3: Data Profiling Analysis

2. Discover

After the problems have been identified, the various errors are addressed:

- Data standardization: the same data is represented in different formats; identify these cases and use common standard values.
- Pattern standardization: a particular attribute may have data following different patterns; generate a common pattern for the data, e.g. standardize phone numbers stored in several different patterns into one format.
- Data verification: verify the correctness of data and reduce ambiguities, e.g. customer address data can be checked for correctness.

3. Cleanse

Through cleansing, data quality defects are corrected using an appropriate data cleansing action. The common actions that can be taken during data cleansing are:

- Filter data to remove rule violations
- Correct data to repair rule violations
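Pattern standardization, as described above, is typically a normalize-then-reformat operation. A minimal sketch for phone numbers, assuming 10-digit numbers and an illustrative target pattern:

```python
import re

def standardize_phone(raw):
    # Normalize: strip punctuation and spaces down to digits only.
    digits = re.sub(r"\D", "", raw)
    if len(digits) != 10:
        return None  # cannot standardize; flag for manual review
    # Reformat into the single agreed pattern (illustrative choice).
    return "({}) {}-{}".format(digits[:3], digits[3:6], digits[6:])

for raw in ["404-555-0100", "(404) 555 0100", "404.555.0100"]:
    print(standardize_phone(raw))
# each prints: (404) 555-0100
```

Returning None rather than guessing keeps unstandardizable values visible as data exceptions to be managed manually, as the Cleanse step below requires.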
Filter data: the purpose of filtering is to remove problematic data; this action is typically applied when data is considered defective to a degree that makes it unusable.

Correct data: the purpose of correction is to fix defective data by altering the values of individual fields. The replacement value may be determined using the following techniques:

1) Identifying errors while integrating data, e.g. inserting a default value that indicates the absence of reliable data, or removing redundant information (the same data available in two different source systems, or the same data represented in two different formats, such as address information).
2) Searching alternative sources to find a replacement value, e.g. incorporating additional external data to add value to existing records; when customer data is appended with more business details, a better understanding of the customer can be obtained.

Data quality rules are implemented in the following order: assess data quality objectives; identify and define data cleansing rules; implement data validation routines in the source feed file; implement data cleansing rules (missing data values or data elements, etc.). The solution identifies cleansing rules for the various subject areas (customer, products, etc.), defines functional/program specifications for the cleansing rules, and implements the validation and cleansing rules.

Figure 4: Data Quality Rules Implementation
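The filter-versus-correct decision can be sketched as a small rule set. The rows, rules and default value below are assumptions for illustration: an out-of-range discount is treated as unusable (filter), while a missing country is repairable with a default that indicates the absence of reliable data (correct):

```python
# Illustrative rule set: filter removes unusable rows, correct repairs fields.
rows = [
    {"id": 1, "country": "USA", "discount": 0.10},
    {"id": 2, "country": "",    "discount": 0.15},   # repairable: default value
    {"id": 3, "country": "USA", "discount": -0.50},  # defective: filter out
]

DEFAULT_COUNTRY = "UNKNOWN"  # default indicating absence of reliable data

def cleanse(rows):
    cleansed = []
    for row in rows:
        if not (0 <= row["discount"] <= 1):  # violation beyond repair
            continue                         # filter: drop the row
        if row["country"] == "":             # repairable violation
            row = {**row, "country": DEFAULT_COUNTRY}  # correct the field
        cleansed.append(row)
    return cleansed

result = cleanse(rows)
print([r["id"] for r in result])  # [1, 2]
print(result[1]["country"])       # UNKNOWN
```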
Data Warehouse solutions have a role to play in ensuring the quality of data. They ensure that only data that is fit for use is loaded and made available to consumers. These solutions can also use dedicated data cleansing tools to conduct activities such as address cleansing, standardization and de-duplication. They have various ways of validating that the source data is of good quality and of conditioning the data where appropriate, prior to loading it into the final target tables, and they can also provide automated proactive alerting.

4. Monitor

The monitoring step involves measuring data quality and tracking errors. It ensures that all reported anomalies are corrected and monitored. Data quality monitoring is a process focused on improving the quality of data: it ensures that the data is valid, that proper standardization is attained, and that redundant data is identified and eliminated. Once data is corrected, regular monitoring is necessary to avoid errors and ambiguities. It can be done by:

- Creating reports on a regular basis
- Creating rules to validate the data
- Generating events to correct the data

Data monitoring includes creating a list of critical data quality problems as well as the business problems to which they relate. It should not only measure information compliance with defined business rules, but also measure the actual costs associated with noncompliance.

Data quality monitoring can be implemented either as an interim solution or as a long-term solution. In the interim solution, data from various sources is profiled and analyzed for anomalies; once the data quality rules are determined from the profiling analysis, they are deployed to overcome these anomalies. For a long-term solution, data quality metrics and audit reports can be created to measure the overall data quality improvement.
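A scheduled monitoring run boils down to applying the validation rules and rolling the per-rule failure counts up into an aggregate metric. The rules, sample rows and metric name below are illustrative assumptions:

```python
# Illustrative monitoring run: apply validation rules on a schedule and
# roll individual measurements up into one metric per subject area.
rules = {
    "phone_format": lambda r: r["phone"].isdigit() and len(r["phone"]) == 10,
    "zip_format":   lambda r: r["zip"].isdigit() and len(r["zip"]) == 5,
}

rows = [
    {"phone": "4045550100", "zip": "30339"},
    {"phone": "bad-phone",  "zip": "30339"},
    {"phone": "4045550101", "zip": "303"},
]

def quality_report(rows, rules):
    failures = {name: sum(not rule(r) for r in rows)
                for name, rule in rules.items()}
    checks = len(rows) * len(rules)
    # Aggregate metric: share of passed checks for the contact data.
    failures["contact_quality_pct"] = round(
        100 * (checks - sum(failures.values())) / checks, 1)
    return failures

report = quality_report(rows, rules)
print(report["phone_format"])         # 1
print(report["contact_quality_pct"])  # 66.7
```

Tracking the aggregate percentage over successive runs is what turns ad hoc error reports into a measure of data quality improvement.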
Audit report structure details are provided in the Appendix.

Quality metrics provide the means to quantify data quality. Measures are needed to assess the current state of data quality and to evaluate the progress made towards improving it. Various attributes (format, range, domain, etc.) of the data elements can be measured, and the measurements can be rolled up or aggregated into metrics; for example, the numbers of defective addresses, invalid phone numbers, and incorrectly formatted addresses can all be measured and rolled up into one metric that represents the quality of just the contact data.

Figure 5: Data Quality Monitoring

5. Prevention

The purpose of prevention is to remove the causes of defective data by fixing the processes through which defects are introduced. Prevention determines the root causes of defective data and takes steps to eliminate them. By providing error reports, audit reports and reconciliation reports to the source system providers for correction of data, data quality issues can be reduced and eventually prevented over a period of time. These reports give the data owners visibility into the errors, their causes, and the corrective action that needs to be taken.

Reconciliation is a process through which data from the source and target systems is compared and analyzed; validation scripts are run on both source and target data for comparison. Data quality issues should ideally be addressed and resolved in the source systems themselves. This helps ensure that data in the data warehouse is always in sync with data in the source systems.
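Reconciliation between source and target typically compares row counts, control totals, and key coverage. A minimal sketch, using invented account records as the source feed and warehouse target:

```python
# Illustrative reconciliation: compare row counts and control totals
# between a source feed and the warehouse target after a load.
source = [{"acct": "A1", "amount": 100.0},
          {"acct": "A2", "amount": 250.0},
          {"acct": "A3", "amount": 75.0}]
target = [{"acct": "A1", "amount": 100.0},
          {"acct": "A2", "amount": 250.0}]

def reconcile(source, target):
    return {
        "row_count_diff": len(source) - len(target),
        "amount_diff": round(sum(r["amount"] for r in source)
                             - sum(r["amount"] for r in target), 2),
        # Keys present in the source but missing from the target.
        "missing_accts": sorted({r["acct"] for r in source}
                                - {r["acct"] for r in target}),
    }

print(reconcile(source, target))
# {'row_count_diff': 1, 'amount_diff': 75.0, 'missing_accts': ['A3']}
```

The missing-key list is what gets sent back to the source system providers in the error and reconciliation reports described above.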
Conclusion

Strong frameworks and processes are required for controlling data quality and for managing data. Additional validation procedures, such as data-level reconciliation, greatly improve the chances of delivering a high data quality solution.
Case Study

For a Life Insurance client, the objective of the data quality assessment was to provide an approach for increasing the accuracy of the data warehouse and thus building business confidence in using it. The client had multiple source systems loading customer and product data into the data warehouse, so the requirement was to provide an approach by which data quality within the warehouse could be improved, addressing business issues that were having an impact on business and overall cost.

Figure 6: Data Quality Relationship between Impact and Cost

The data quality assessment was conducted using the Data Quality methodology, and a solution was provided for improving data quality for data access. The solution provided the following benefits:

- A data warehouse environment that enabled:
  o A 360-degree view of the customer
  o Integrated product information from various source systems
  o The ability to satisfy the reporting needs of users
- Improved operational efficiencies, by making data available to users when they need it via a single standard framework, so that they can effectively make informed decisions
- An opportunity for analysts to spend more quality time on analysis and less time on data quality issue resolution

The solution improvements provided recommendations in the following domains:
Data Architecture

- Created source-to-target mappings
- Enabled a single version of entities and metrics
- Structured standards (e.g. naming standards)

ETL Processes

Helped define robust data validation, rejection and reconciliation mechanisms built into the ETL processes. Processes that need to be defined in the ETL:

- Data integration rules
- Data standardization
- Data rejection
- Data reconciliation

Recommended Data Steward Participation during Functional Testing

The Data Steward, along with the test team, needs to be involved in the following data quality aspects of functional testing of the ETL and reporting applications:

- Ensuring that the sample data used for testing represents all kinds of irregularities and peculiarities of the source data
- Ensuring that the ETL is able to identify, handle and notify all types of defined data issues
- Ensuring that the reports and queries used for testing cover the required data samples

The activities carried out by a Data Steward are provided in the Appendix.

Data Quality Monitoring

The recommended data quality monitoring program checks data purity levels and involves:

- A data quality scorecard to measure purity levels of the data warehouse, identify issues proactively, and plan projects to address them
- Proactively analyzing the quality of source system data to identify new corruption issues and modifying the ETL to handle them
- Periodically assessing the effectiveness of the ETL error notification process to the source system and how effectively these issues are resolved
Appendix

Data Quality Audit

The purpose of auditing is to understand the degree to which data quality problems exist, i.e. the extent and severity of data defects. Audit procedures examine content, structure, completeness and other factors to detect and report violations of integrity and correctness rules. Data auditing is a process for identifying errors and checking the health of the overall system with regard to the quality of its data. As the amount of data and the number of processes escalate over time, the amount of inaccurate data also increases and data quality decreases. Audit reports can be created to measure progress in achieving data quality goals and complying with service level agreements.

It is very important to understand the error-prone areas in the BI solution; a data quality scorecard can show the overall quality status. A template for an audit summary is shown below. The audit summary shows the number of occurrences of different types of errors, as per the error stack, at various stages in the system: one axis shows the type of error, and the other shows the stage where the error occurred, giving an overall picture of error occurrence. For example, 500 data entry errors occurred in the source system. This gives a fair idea of the problematic areas where data quality is poor or strong.
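The audit summary described above is a cross-tabulation of error type against processing stage. A minimal sketch, using an invented error log and stage names:

```python
from collections import Counter

# Illustrative error log: each entry records the error type and the
# stage of the system where it occurred, as in the audit summary.
errors = [
    ("data entry", "source"), ("data entry", "source"),
    ("format", "staging"), ("referential", "warehouse"),
    ("data entry", "source"), ("format", "source"),
]

# Audit summary: occurrences of each error type at each stage.
summary = Counter(errors)
print(summary[("data entry", "source")])  # 3
print(summary[("format", "staging")])     # 1
```

Laid out as a matrix (error types as rows, stages as columns), these counts give exactly the overall picture of error occurrence the template describes.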
Data Steward

The data steward is responsible for driving organizational agreement on data definitions, business rules and domain values for the data warehouse data, and for publishing and reinforcing these definitions and rules. The data steward:

- Is the primary guardian of data while it is being created or maintained
- Is responsible for creating standards and procedures to ensure that policies and business rules are known and followed
- Should enforce adherence to the policies and business rules that govern the data while it is in their custody
- Should periodically monitor (audit) the quality of the data in their custody

Creating a data governance body, with processes and tools for managing data quality, will help establish a robust framework for development.

Figure 7: Data Governance Framework
Data Quality and Data Profiling Tools

Some of the leading data quality vendors/products are provided below.

Sanjay Seth

Sanjay Seth, a Senior Architect with the Business Intelligence Practice of a leading IT consulting firm, has 8 years of extensive experience in the data warehousing/business intelligence space.
More informationDATA GOVERNANCE AND DATA QUALITY
DATA GOVERNANCE AND DATA QUALITY Kevin Lewis Partner Enterprise Management COE Barb Swartz Account Manager Teradata Government Systems Objectives of the Presentation Show that Governance and Quality are
More informationCoverity White Paper. Effective Management of Static Analysis Vulnerabilities and Defects
Effective Management of Static Analysis Vulnerabilities and Defects Introduction According to a recent industry study, companies are increasingly expanding their development testing efforts to lower their
More informationFor more information about UC4 products please visit www.uc4.com. Automation Within, Around, and Beyond Oracle E-Business Suite
For more information about UC4 products please visit www.uc4.com Automation Within, Around, and Beyond Oracle E-Business Suite Content Executive Summary...3 Opportunities for Enhancement: Automation Within,
More informationThe Requirements Compliance Matrix columns are defined as follows:
1 DETAILED REQUIREMENTS AND REQUIREMENTS COMPLIANCE The following s Compliance Matrices present the detailed requirements for the P&I System. Completion of all matrices is required; proposals submitted
More informationWhite Paper. Thirsting for Insight? Quench It With 5 Data Management for Analytics Best Practices.
White Paper Thirsting for Insight? Quench It With 5 Data Management for Analytics Best Practices. Contents Data Management: Why It s So Essential... 1 The Basics of Data Preparation... 1 1: Simplify Access
More informationEffecting Data Quality Improvement through Data Virtualization
Effecting Data Quality Improvement through Data Virtualization Prepared for Composite Software by: David Loshin Knowledge Integrity, Inc. June, 2010 2010 Knowledge Integrity, Inc. Page 1 Introduction The
More informationBusiness Usage Monitoring for Teradata
Managing Big Analytic Data Business Usage Monitoring for Teradata Increasing Operational Efficiency and Reducing Data Management Costs How to Increase Operational Efficiency and Reduce Data Management
More informationCSPP 53017: Data Warehousing Winter 2013" Lecture 6" Svetlozar Nestorov" " Class News
CSPP 53017: Data Warehousing Winter 2013 Lecture 6 Svetlozar Nestorov Class News Homework 4 is online Due by Tuesday, Feb 26. Second 15 minute in-class quiz today at 6:30pm Open book/notes Last 15 minute
More informationThe ROI of Data Governance: Seven Ways Your Data Governance Program Can Help You Save Money
A DataFlux White Paper Prepared by: Gwen Thomas The ROI of Data Governance: Seven Ways Your Data Governance Program Can Help You Save Money Leader in Data Quality and Data Integration www.dataflux.com
More informationExplore the Possibilities
Explore the Possibilities 2013 HR Service Delivery Forum Best Practices in Data Management: Creating a Sustainable and Robust Repository for Reporting and Insights 2013 Towers Watson. All rights reserved.
More information5 Best Practices for SAP Master Data Governance
5 Best Practices for SAP Master Data Governance By David Loshin President, Knowledge Integrity, Inc. Sponsored by Winshuttle, LLC 2012 Winshuttle, LLC. All rights reserved. 4/12 www.winshuttle.com Introduction
More informationMaking Business Intelligence Easy. Whitepaper Measuring data quality for successful Master Data Management
Making Business Intelligence Easy Whitepaper Measuring data quality for successful Master Data Management Contents Overview... 3 What is Master Data Management?... 3 Master Data Modeling Approaches...
More informationCourse Outline. Module 1: Introduction to Data Warehousing
Course Outline Module 1: Introduction to Data Warehousing This module provides an introduction to the key components of a data warehousing solution and the highlevel considerations you must take into account
More information7 Directorate Performance Managers. 7 Performance Reporting and Data Quality Officer. 8 Responsible Officers
Contents Page 1 Introduction 2 2 Objectives of the Strategy 2 3 Data Quality Standards 3 4 The National Indicator Set 3 5 Structure of this Strategy 3 5.1 Awareness 4 5.2 Definitions 4 5.3 Recording 4
More informationData Quality Where did it all go wrong? Ed Wrazen, Trillium Software
Data Quality Where did it all go wrong? Ed Wrazen, Trillium Software Agenda Examples of data quality problems Why do data quality problems occur? The impact of poor data Why data quality is an enterprise
More informationDATA QUALITY IN BUSINESS INTELLIGENCE APPLICATIONS
DATA QUALITY IN BUSINESS INTELLIGENCE APPLICATIONS Gorgan Vasile Academy of Economic Studies Bucharest, Faculty of Accounting and Management Information Systems, Academia de Studii Economice, Catedra de
More informationData Governance Best Practices
Data Governance Best Practices Rebecca Bolnick Chief Data Officer Maya Vidhyadharan Data Governance Manager Arizona Department of Education Key Issues 1. What is Data Governance and why is it important?
More informationRequirements-Based Testing: Encourage Collaboration Through Traceability
White Paper Requirements-Based Testing: Encourage Collaboration Through Traceability Executive Summary It is a well-documented fact that incomplete, poorly written or poorly communicated requirements are
More informationData Integrity and Integration: How it can compliment your WebFOCUS project. Vincent Deeney Solutions Architect
Data Integrity and Integration: How it can compliment your WebFOCUS project Vincent Deeney Solutions Architect 1 After Lunch Brain Teaser This is a Data Quality Problem! 2 Problem defining a Member How
More informationData Governance for Financial Institutions
Financial Services the way we see it Data Governance for Financial Institutions Drivers and metrics to help banks, insurance companies and investment firms build and sustain data governance Table of Contents
More informationAgile Master Data Management A Better Approach than Trial and Error
Agile Master Data Management A Better Approach than Trial and Error A whitepaper by First San Francisco Partners First San Francisco Partners Whitepaper Executive Summary Market leading corporations are
More informationSeven Ways To Help ERP IN 2014 AND BEYOND
Seven Ways To Help Data Migration During Implementation SPECial REPORT SERIES ERP IN 2014 AND BEYOND CONTENTS INTRODUCTION 3 Develop a Data MigraTION Plan 4 PerfORM a ThOROUgh Gap Analysis 6 COMMIT ResOURCes
More informationDon t simply manage work in your Professional Services business. Manage dollars and profits.
INCREASE PROFITABILITY THROUGH END-TO-END INTEGRATION OF CRM, PROJECT MANAGEMENT, AND BILLING SYSTEMS Don t simply manage work in your Professional Services business. Manage dollars and profits. Through
More informationThree Fundamental Techniques To Maximize the Value of Your Enterprise Data
Three Fundamental Techniques To Maximize the Value of Your Enterprise Data Prepared for Talend by: David Loshin Knowledge Integrity, Inc. October, 2010 2010 Knowledge Integrity, Inc. 1 Introduction Organizations
More informationData Governance: A Business Value-Driven Approach
Data Governance: A Business Value-Driven Approach A White Paper by Dr. Walid el Abed CEO January 2011 Copyright Global Data Excellence 2011 Contents Executive Summary......................................................3
More informationETL-EXTRACT, TRANSFORM & LOAD TESTING
ETL-EXTRACT, TRANSFORM & LOAD TESTING Rajesh Popli Manager (Quality), Nagarro Software Pvt. Ltd., Gurgaon, INDIA rajesh.popli@nagarro.com ABSTRACT Data is most important part in any organization. Data
More informationPart A OVERVIEW...1. 1. Introduction...1. 2. Applicability...2. 3. Legal Provision...2. Part B SOUND DATA MANAGEMENT AND MIS PRACTICES...
Part A OVERVIEW...1 1. Introduction...1 2. Applicability...2 3. Legal Provision...2 Part B SOUND DATA MANAGEMENT AND MIS PRACTICES...3 4. Guiding Principles...3 Part C IMPLEMENTATION...13 5. Implementation
More informationData Governance: A Business Value-Driven Approach
Global Excellence Governance: A Business Value-Driven Approach A White Paper by Dr Walid el Abed CEO Trusted Intelligence Contents Executive Summary......................................................3
More informationBuild an effective data integration strategy to drive innovation
IBM Software Thought Leadership White Paper September 2010 Build an effective data integration strategy to drive innovation Five questions business leaders must ask 2 Build an effective data integration
More informationReduce and manage operating costs and improve efficiency. Support better business decisions based on availability of real-time information
Data Management Solutions Horizon Software Solution s Data Management Solutions provide organisations with confidence in control of their data as they change systems and implement new solutions. Data is
More informationAccelerate BI Initiatives With Self-Service Data Discovery And Integration
A Custom Technology Adoption Profile Commissioned By Attivio June 2015 Accelerate BI Initiatives With Self-Service Data Discovery And Integration Introduction The rapid advancement of technology has ushered
More informationSQL Server 2012 Business Intelligence Boot Camp
SQL Server 2012 Business Intelligence Boot Camp Length: 5 Days Technology: Microsoft SQL Server 2012 Delivery Method: Instructor-led (classroom) About this Course Data warehousing is a solution organizations
More informationAligning Quality Management Processes to Compliance Goals
Aligning Quality Management Processes to Compliance Goals MetricStream.com Smart Consulting Group Joint Webinar February 23 rd 2012 Nigel J. Smart, Ph.D. Smart Consulting Group 20 E. Market Street West
More informationImproving Operations Through Agent Portal Data Quality
Improving Operations Through Agent Portal Data Quality An Experian QAS white paper Executive Summary With a record number of insurance carriers using agent portals, the strategic focus at most organizations
More informationThe Butterfly Effect on Data Quality How small data quality issues can lead to big consequences
How small data quality issues can lead to big consequences White Paper Table of Contents How a Small Data Error Becomes a Big Problem... 3 The Pervasiveness of Data... 4 Customer Relationship Management
More informationDATA CONSISTENCY, COMPLETENESS AND CLEANING. By B.K. Tyagi and P.Philip Samuel CRME, Madurai
DATA CONSISTENCY, COMPLETENESS AND CLEANING By B.K. Tyagi and P.Philip Samuel CRME, Madurai DATA QUALITY (DATA CONSISTENCY, COMPLETENESS ) High-quality data needs to pass a set of quality criteria. Those
More informationImplementing a Data Warehouse with Microsoft SQL Server 2012 MOC 10777
Implementing a Data Warehouse with Microsoft SQL Server 2012 MOC 10777 Course Outline Module 1: Introduction to Data Warehousing This module provides an introduction to the key components of a data warehousing
More informationResponding to Regulatory Activity: 6 Vital Areas to Gauge the Effectiveness of your Regulatory Change Management Process
Whitepaper Responding to Regulatory Activity: 6 Vital Areas to Gauge the Effectiveness of your Regulatory Change Management Process By Amy Downey, U.S. Banking and Regulatory Expert, Risk & Compliance,
More informationData Governance 8 Steps to Success
Data Governance 8 Steps to Success Anne Marie Smith, Ph.D. Principal Consultant Asmith @ alabamayankeesystems.com http://www.alabamayankeesystems.com 1 Instructor Background Internationally recognized
More informationFeature. A Framework for Estimating ROI of Automated Internal Controls. Do you have something to say about this article?
Feature A Framework for Estimating ROI of Automated Internal Controls Angsuman Dutta is the unit leader of the marketing and customer acquisition support teams at Infogix. Since 2001, he has assisted numerous
More informationTransparent Government Demands Robust Data Quality
Transparent Government Demands Robust Data Quality Federal initiatives to strengthen transparency and accountability require agencies to improve data quality practices W H I T E P A P E R Table of Contents
More informationTHE QUALITY OF DATA AND METADATA IN A DATAWAREHOUSE
THE QUALITY OF DATA AND METADATA IN A DATAWAREHOUSE Carmen Răduţ 1 Summary: Data quality is an important concept for the economic applications used in the process of analysis. Databases were revolutionized
More informationUS Department of Education Federal Student Aid Integration Leadership Support Contractor January 25, 2007
US Department of Education Federal Student Aid Integration Leadership Support Contractor January 25, 2007 Task 18 - Enterprise Data Management 18.002 Enterprise Data Management Concept of Operations i
More informationEXPLORING THE CAVERN OF DATA GOVERNANCE
EXPLORING THE CAVERN OF DATA GOVERNANCE AUGUST 2013 Darren Dadley Business Intelligence, Program Director Planning and Information Office SIBI Overview SIBI Program Methodology 2 Definitions: & Governance
More informationWhy Data Governance - 1 -
Data Governance Why Data Governance - 1 - Industry: Lack of Data Governance is a Key Issue Faced During Projects As projects address process improvements, they encounter unidentified data processes that
More informationA TECHNICAL WHITE PAPER ATTUNITY VISIBILITY
A TECHNICAL WHITE PAPER ATTUNITY VISIBILITY Analytics for Enterprise Data Warehouse Management and Optimization Executive Summary Successful enterprise data management is an important initiative for growing
More informationIntegration Maturity Model Capability #5: Infrastructure and Operations
Integration Maturity Model Capability #5: Infrastructure and Operations How improving integration supplies greater agility, cost savings, and revenue opportunity TAKE THE INTEGRATION MATURITY SELFASSESSMENT
More informationData Governance. David Loshin Knowledge Integrity, inc. www.knowledge-integrity.com (301) 754-6350
Data Governance David Loshin Knowledge Integrity, inc. www.knowledge-integrity.com (301) 754-6350 Risk and Governance Objectives of Governance: Identify explicit and hidden risks associated with data expectations
More informationAutomated IT Asset Management Maximize organizational value using BMC Track-It! WHITE PAPER
Automated IT Asset Management Maximize organizational value using BMC Track-It! WHITE PAPER CONTENTS ADAPTING TO THE CONSTANTLY CHANGING ENVIRONMENT....................... 1 THE FOUR KEY BENEFITS OF AUTOMATION..................................
More informationControlling Costs with Managed Mobility Services
WHITE PAPER Controlling Costs with Managed Mobility Services Learn about how Managed Mobility Services can produce cost savings and help improve the productivity of your mobile workforce. There are many
More informationData Quality Management The Most Critical Initiative You Can Implement
Data Quality Management The Most Critical Initiative You Can Implement SUGI 29 Montreal May 2004 Claudia Imhoff President Intelligent Solutions, Inc. CImhoff@Intelsols.com www.intelsols.com Jonathan G.
More informationSession M6. Andrea Matulick, Acting Manager, Business Intelligence, Robert Davies, Technical Team Leader, Enterprise Data
Strategic Data Management Conforming the Data Warehouse Session M6 September 24, 2007 Andrea Matulick, Acting Manager, Business Intelligence, Planning and Assurance Services, UniSA Robert Davies, Technical
More information