The Data Quality Business Case: Projecting Return on Investment
WHITE PAPER

The Data Quality Business Case: Projecting Return on Investment

with David Loshin
This document contains Confidential, Proprietary and Trade Secret Information ("Confidential Information") of Informatica Corporation and may not be copied, distributed, duplicated, or otherwise reproduced in any manner without the prior written consent of Informatica. While every attempt has been made to ensure that the information in this document is accurate and complete, some typographical errors or technical inaccuracies may exist. Informatica does not accept responsibility for any kind of loss resulting from the use of information contained in this document. The information contained in this document is subject to change without notice. The incorporation of the product attributes discussed in these materials into any release or upgrade of any Informatica software product, as well as the timing of any such release or upgrade, is at the sole discretion of Informatica. Protected by one or more of the following U.S. Patents: 6,032,158; 5,794,246; 6,014,670; 6,339,775; 6,044,374; 6,208,990; 6,850,947; 6,895,471; or by the following pending U.S. Patents: 09/644,280; 10/966,046; 10/727,700. This edition published June 2006.
Table of Contents

Executive Summary
Establishing the Value of Information Quality
  Business Expectations and Data Quality
  Key Data Quality Dimensions
Using Data Quality Technology to Improve Your Data
  Data Quality Techniques
    Anomaly Analysis and Assessment
    Data Standardization
    Similarity Analysis
    Data Quality Auditing and Reporting
  Data Quality Tools
    Data Profiling
    Parsing and Standardization
    Similarity and Linkage
    Auditing and Monitoring
Identifying Impacts of Poor Information Quality
  Increased Costs
  Decreased Revenues
  Decreased Confidence
  Increased Risk
Developing the Business Case
  Identifying Impacts
  Researching Financial Impacts
  The Data Quality Impact Matrix
  Correlating Impacts to Root Causes
  Costs to Remediate
  Projecting Return on Investment
Case Studies
  Pharmaceutical Company
  Health Insurance Company
  Government: Department of Defense
  Telecommunications Company
Summary: How to Get Started
ABOUT THE AUTHOR

David Loshin is the president of Knowledge Integrity, Inc., a consulting and development company focusing on customized information management solutions, including information quality consulting, information quality training, and business rules solutions. Loshin is the author of Enterprise Knowledge Management - The Data Quality Approach (Morgan Kaufmann, 2001) and Business Intelligence - The Savvy Manager's Guide, and is a frequent speaker on maximizing the value of information. [email protected]

Executive Summary

The purpose of this white paper is to outline the importance of data quality in today's business environment. It describes how an organization should tackle a data quality improvement process and where Informatica data quality software solutions fit into that process.

There is little doubt that the need for high-quality data permeates most information-centric programs, whether they depend on transactional, operational, or analytical applications. In addition, new events take place and evolving techniques are introduced that affect the way our systems and businesses operate, all of which depend on the best use of quality data, such as:

- Regulatory compliance, in which organizations are required to account for the quality of the data they use and the information they disseminate
- Reengineering, migration, and modernization projects, in which older applications are updated and legacy system data is migrated into new applications
- Mergers and acquisitions, in which multiple source data systems are merged into a single operational framework
- Data integration programs, such as Customer Data Integration, Product Data Integration, or any other Master Data Management program

Even though everyone fundamentally understands the need for high-quality data, technologists are often left to their own devices when it comes to ensuring high levels of data quality. However, at some point an investment must be made in the infrastructure necessary to provide measurably acceptable levels of data quality. To justify that investment, we must be able to articulate the business value of data quality in a way that shows a return on the investment made.

Often, developing a business case for the prevention of impacts may appear to be a challenge. Yet, even in the absence of critical events that necessitate senior management action, there is usually more than enough evidence available within an organization to develop a business case justifying the costs of data quality improvement. This white paper presents a process for:

- Identifying key business dimensions impacted by poor data quality
- Reviewing approaches that can be used to improve data quality
- Assessing the actual historical costs related to data flaws
- Determining the costs to improve data quality
- Assembling this material into a business case justifying an investment in data quality tools and methodologies

We will look at case studies where these approaches are put into practice, and then conclude by summarizing the approach to developing a successful business case for investing in data quality improvement.
Establishing the Value of Information Quality

Business Expectations and Data Quality

There is a common notion that objective data quality improvement necessarily implies business value, and this notion often drives "golden copy," "single source of truth," or master data projects. This approach, though, does not take into account the fact that data quality is subjective, and relies on how data flaws are related to negative business impacts. Objective data quality metrics may not necessarily be tied to your business's performance, which raises some interesting questions:

- How do you distinguish high-impact from low-impact data integrity issues?
- How do you isolate the source of the introduction of data flaws so you can fix the process instead of correcting the data?
- How do you correlate business value with source data integrity?
- What is the best way to employ data integration best practices to address these questions?

This challenge can be characterized by a fundamental distinction between data quality expectations and business expectations. Data quality expectations are expressed as rules measuring aspects of the validity of data values:

- What data is missing or unusable?
- Which data values are in conflict?
- Which records are duplicated?
- What linkages are missing?

Business expectations, alternatively, are expressed as rules measuring performance, productivity, and efficiency of processes, asking questions like:

- How has throughput decreased due to errors?
- What percentage of time is spent in scrap and rework?
- What is the loss in value of transactions that failed due to missing data?
- How quickly can we respond to business opportunities?

To determine the true value added by data quality programs, conformance to business expectations (and the corresponding business value) should be measured in relation to its component data quality rules. We do this by identifying how the business impacts of poor data quality can be measured and how they relate to their root causes, and then assessing the costs to eliminate those root causes. Characterizing both our business impacts and our data quality problems provides a framework for developing our business case.
Key Data Quality Dimensions

To correlate data quality issues to business impacts, we must be able to classify both our data quality expectations and our business impact criteria. For the analyst to determine the scope of the underlying root causes and to plan the ways that tools can be used to address data quality issues, it is valuable to understand these common data quality dimensions:

- Completeness: Is all the requisite information available? Are data values missing, or in an unusable state? In some cases, missing data is irrelevant, but when the information that is missing is critical to a specific business process, completeness becomes an issue.
- Conformity: Are there expectations that data values conform to specified formats? If so, do all the values conform to those formats? Maintaining conformance to specific formats is important in data representation, presentation, aggregate reporting, search, and establishing key relationships.
- Consistency: Do distinct data instances provide conflicting information about the same underlying data object? Are values consistent across data sets? Do interdependent attributes always appropriately reflect their expected consistency? Inconsistency between data values plagues organizations attempting to reconcile different systems and applications.
- Accuracy: Do data objects accurately represent the real-world values they are expected to model? Incorrect spellings of product or person names, addresses, and even untimely or out-of-date data can impact operational and analytical applications.
- Duplication: Are there multiple, unnecessary representations of the same data objects within your data set? The inability to maintain a single representation for each entity across your systems poses numerous vulnerabilities and risks.
- Integrity: What data is missing important relationship linkages? The inability to link related records together may actually introduce duplication across your systems. Moreover, as more value is derived from analyzing connectivity and relationships, the inability to link related data instances together impedes this valuable analysis.
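To make a few of these dimensions concrete, the short sketch below shows how completeness, conformity, and duplication might be expressed as measurable rules over a handful of hypothetical supplier records. The field names, sample values, and expected phone format are assumptions introduced for illustration only, not part of the original paper.

```python
import re
from collections import Counter

# Hypothetical supplier records used only for illustration.
records = [
    {"supplier_id": "S001", "name": "Acme Corp",  "phone": "(301) 555-0142"},
    {"supplier_id": "S002", "name": "Acme Corp.", "phone": "301.555.0142"},
    {"supplier_id": "S003", "name": "Bolt Supply", "phone": ""},
]

PHONE_PATTERN = re.compile(r"^\(\d{3}\) \d{3}-\d{4}$")  # assumed standard format

def completeness(recs, field):
    """Fraction of records with a non-empty value for the field."""
    filled = sum(1 for r in recs if r.get(field, "").strip())
    return filled / len(recs)

def conformity(recs, field, pattern):
    """Fraction of non-empty values matching the expected format."""
    values = [r[field] for r in recs if r.get(field, "").strip()]
    return sum(1 for v in values if pattern.match(v)) / len(values) if values else 1.0

def duplication(recs, field):
    """Values that appear more than once (naive exact matching)."""
    counts = Counter(r[field] for r in recs)
    return {v: n for v, n in counts.items() if n > 1}

print("phone completeness:", completeness(records, "phone"))              # 2 of 3 filled
print("phone conformity:  ", conformity(records, "phone", PHONE_PATTERN)) # 1 of 2 conform
print("duplicate names:   ", duplication(records, "name"))  # exact match misses 'Acme Corp.'
```

Note that the exact-match duplication check misses the near-duplicate "Acme Corp." entirely, which is precisely the gap that the standardization and similarity analysis techniques described later are meant to close.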
Using Data Quality Technology to Improve Your Data

Understanding the key data quality dimensions is the first step to data quality improvement. Being able to segregate data flaws by dimension or classification allows analysts and developers to apply improvement techniques using data quality tools to improve both your information and the processes that create and manipulate that information. Let's briefly examine some data quality techniques, then review the kinds of tools employed for these techniques.

Data Quality Techniques

There are many policies and procedures that can be employed for ongoing, proactive data quality improvement. However, most successful programs make use of some simple techniques that enable the discovery, assessment, remediation, and reporting of baseline measurements and ongoing improvement.

Anomaly Analysis and Assessment

Before any improvements can be made to information, one must first be able to distinguish between good and bad data. The attempt to qualify data quality is a process of analysis and discovery. The analysis involves an objective review of the data values populating data sets through quantitative measures and analyst review. While a data analyst may not necessarily be able to pinpoint all instances of flawed data, the ability to document situations where data values look like they don't belong provides a means to communicate these instances to subject matter experts whose business knowledge can confirm the existence of data problems.

Data Standardization

Many data issues are attributable to situations where slight variance in the representation of data values introduces confusion or ambiguity. For example, consider the different ways telephone numbers are formatted in Figure 1. While some have digits, some have alphabetic characters, and all use different special characters for separation, we all recognize each one as being a telephone number.

Figure 1: Variant formats for representing telephone numbers, such as (301) BIZRULE

But in order to determine whether these numbers are accurate (perhaps by comparing them to a master customer directory), or to investigate whether duplicate numbers exist when there should be only one for each supplier, the values must be parsed into their component segments (area code, exchange, and line) and then transformed into a standard format.
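As an illustration of this parsing-and-standardization step, the sketch below normalizes a few variant telephone formats into area code, exchange, and line, then renders a single canonical form. The sample inputs and the chosen canonical format are assumptions for illustration; a commercial data quality tool would handle far more patterns and edge cases.

```python
import re

# Map letters on a phone keypad to digits so vanity numbers can be normalized.
KEYPAD = {c: d for d, letters in {
    "2": "ABC", "3": "DEF", "4": "GHI", "5": "JKL",
    "6": "MNO", "7": "PQRS", "8": "TUV", "9": "WXYZ"}.items() for c in letters}

def parse_phone(raw):
    """Parse a variant US phone representation into (area code, exchange, line)."""
    # Translate keypad letters, then strip everything that is not a digit.
    digits = "".join(KEYPAD.get(c, c) for c in raw.upper())
    digits = re.sub(r"\D", "", digits)
    if len(digits) == 11 and digits.startswith("1"):
        digits = digits[1:]                      # drop a leading country code
    if len(digits) != 10:
        return None                              # flag as non-conforming for review
    return digits[:3], digits[3:6], digits[6:]

def standardize_phone(raw):
    """Render the parsed components in one canonical format."""
    parts = parse_phone(raw)
    return "({}) {}-{}".format(*parts) if parts else None

for value in ["301.555.0142", "1-301-555-0142", "(301) 555 0142", "(301) BIZ-RULE"]:
    print(f"{value:>18} -> {standardize_phone(value)}")
```

Once values share one standard form, accuracy checks against a master directory and duplicate detection both reduce to straightforward comparisons.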
Similarity Analysis (Matching)

A common data quality problem involves two sides of the same coin: either there are multiple data instances that actually refer to the same real-world entity, or a knowledge worker or application perceives that no record exists for a real-world entity when in fact one does. Both problems are a result of approximate duplication. In the first situation, similar yet slightly variant representations of data values may have been inadvertently introduced into the system; in the second, a slight variation in representation prevents the identification of an exact match to the existing record in the data set. Both of these issues are addressed through a process called similarity analysis, or matching, in which the degree of similarity between any two records is scored, most often based on weighted approximate matching between a set of attribute values in the two records. If the score is above a specified threshold, the two records are deemed a match and are presented to the end client as most likely representing the same entity. It is through similarity analysis that slight variations are recognized, and data values are connected and subsequently cleansed.

Data Quality Auditing and Reporting

It is difficult to improve a process without having a means to measure that process. In addition, it is difficult to gauge continuous improvement without being able to track performance on a regular basis. To this end, it is necessary to define relevant data quality metrics that can be measured through an auditing process, with the results captured and reported to the relevant stakeholders.

Data Quality Tools

To address these remediation needs, we can employ the following data quality tools and technologies to achieve our quality objectives.

Data Profiling

Data profiling is a set of algorithms for statistical analysis and assessment of the quality of data values within a data set, as well as for exploring relationships that exist between value collections within and across data sets. For each column in a table, a data profiling tool will provide a frequency distribution of the different values, providing insight into the type and use of each column. Cross-column analysis can expose embedded value dependencies, while inter-table analysis explores overlapping value sets that may represent foreign key relationships between entities; it is in this way that profiling can be used for anomaly analysis and assessment. Data profiling can also be used to proactively test data against a set of defined (or discovered) business rules. In this way, we can distinguish the records that conform to our defined data quality expectations from those that don't, which in turn contributes to baseline measurements and ongoing auditing for data quality reporting.
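The sketch below illustrates the two profiling activities just described: a per-column frequency distribution that surfaces unexpected values, and a proactive test of rows against a defined business rule. The column names, rows, and the rule itself are hypothetical examples, not features of any particular profiling product.

```python
from collections import Counter

# Hypothetical order rows; column names and values are illustrative only.
rows = [
    {"order_id": "1001", "status": "SHIPPED", "country": "US"},
    {"order_id": "1002", "status": "shipped", "country": "US"},
    {"order_id": "1003", "status": "SHIPPED", "country": "UK"},
    {"order_id": "1004", "status": "???",     "country": ""},
]

def column_profile(data, column):
    """Frequency distribution of values in one column, highest counts first."""
    return Counter(r.get(column, "") for r in data).most_common()

def rule_conformance(data, rule, description):
    """Share of rows satisfying a defined business rule, plus the failing rows."""
    failures = [r for r in data if not rule(r)]
    return {"rule": description,
            "conformance": 1 - len(failures) / len(data),
            "failures": failures}

# Column profiles expose unexpected values such as 'shipped' vs 'SHIPPED' and '???'.
for col in ("status", "country"):
    print(col, column_profile(rows, col))

# Proactive test against a defined rule: status must come from the approved domain.
print(rule_conformance(rows,
                       lambda r: r["status"] in {"OPEN", "SHIPPED", "CANCELLED"},
                       "status within approved domain"))
```

The conformance figure produced by such a rule check is exactly the kind of measurement that can feed the baseline and ongoing auditing described above.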
Parsing and Standardization

Our innate ability to recognize familiar patterns contributes to our ability to characterize variant data values as belonging to the same abstract class of values. Continuing our example in Figure 1, people recognize these all as telephone numbers because they are all frequently used patterns. Luckily, if we can describe the format patterns that can be used to represent other data objects (e.g., Person Name, Product Description, etc.), we can use a data quality tool to parse data values that conform to any of those patterns and even transform them into a single, standardized form that will simplify the assessment, similarity analysis, and cleansing processes. Pattern-based parsing can automate the recognition and subsequent standardization of meaningful value components.

Similarity and Linkage

Attempting to compare each record against all the others to produce similarity scores is not only ambitious, but also time-consuming and computationally intensive. Most data quality tool suites use advanced algorithms for blocking the records that are most likely to contain matches into smaller sets, whereupon different approaches are taken to measure similarity. Identifying similar records within the same data set probably means that the records are duplicated, and they may be subjected to cleansing and/or elimination. Identifying similar records in different data sets may indicate a link across those data sets, which helps facilitate cleansing, knowledge discovery, reverse engineering, and master data aggregation.
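A minimal sketch of the weighted scoring and blocking just described follows. The attribute weights, the match threshold, the choice of ZIP code as a blocking key, and the sample records are all assumptions made for illustration; production matching engines use far more sophisticated comparators and blocking schemes.

```python
from difflib import SequenceMatcher
from itertools import combinations
from collections import defaultdict

# Hypothetical customer records; names, weights, and the threshold are illustrative.
customers = [
    {"id": 1, "name": "Jonathan Smith", "zip": "20850", "phone": "3015550142"},
    {"id": 2, "name": "Jon Smith",      "zip": "20850", "phone": "3015550142"},
    {"id": 3, "name": "Maria Garcia",   "zip": "10001", "phone": "2125550188"},
]

WEIGHTS = {"name": 0.5, "zip": 0.2, "phone": 0.3}
MATCH_THRESHOLD = 0.85

def field_similarity(a, b):
    """Approximate string similarity between two field values (0.0 to 1.0)."""
    return SequenceMatcher(None, a, b).ratio()

def record_score(r1, r2):
    """Weighted similarity score across the compared attributes."""
    return sum(w * field_similarity(r1[f], r2[f]) for f, w in WEIGHTS.items())

# Blocking: only compare records sharing a block key (here, ZIP code),
# which avoids a full pairwise comparison across the whole data set.
blocks = defaultdict(list)
for rec in customers:
    blocks[rec["zip"]].append(rec)

for block in blocks.values():
    for r1, r2 in combinations(block, 2):
        score = record_score(r1, r2)
        if score >= MATCH_THRESHOLD:
            print(f"records {r1['id']} and {r2['id']} likely match (score {score:.2f})")
```

In this toy example, records 1 and 2 score above the threshold despite the name variation, while record 3 is never even compared against the others because it falls in a different block.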
Auditing and Monitoring

The value of using these tools to quantify the existence of data flaws is increased when there is a well-defined framework for collecting and reporting the resulting data quality statistics. Some tools interface well with standard query and reporting tools to populate a data quality dashboard that can be drilled through by data quality analysts for root cause analysis and remediation.

Identifying Impacts of Poor Information Quality

Fundamentally, the return on your data quality investment is based on the real pains incurred by data flaws in running your business. If the goal of your business is to optimize productivity and profitability while minimizing costs and risks, then we can characterize business impacts across these four dimensions:

- Increased costs
- Decreased revenues
- Decreased confidence
- Increased risk

Our goal is to maximize the value of the information based on the impacts associated with each dimension, and our task in developing the business case is to determine when and where poor information quality affects one or more of these variables. These impacts are summarized in Figure 2. This characterization allows one to classify impacts and subsequently determine a formula for assessing actual costs.

Figure 2: Impacts of poor data quality
- Increased Costs: detection and correction, prevention, spin control, scrap and rework, penalties, overpayments, increased resource costs, system delays, increased workloads, increased process times
- Decreased Revenue: delayed/lost collections, customer attrition, lost opportunities, increased cost/volume
- Low Confidence: organizational trust issues, impaired decision-making, lowered predictability, impaired forecasting, inconsistent management reporting
- Increased Risk: regulatory or legislative risk, system development risk, information integration risk, investment risk, health risk, privacy risk, competitive risk, fraud detection
Increased Costs

Costs may be incurred when addressing information quality issues or by ignoring them. For example, detection and correction costs are incurred when a problem has been identified, and these may be relatively large but infrequent. Alternatively, prevention costs may be incremental costs that are ongoing, and may diminish in expense as time goes on. Spin control costs are associated with mitigating data quality impacts that are exposed outside the organization, such as the discovery (by an external party like a newspaper) that a health insurer's decisions about which medical procedures to approve are based on faulty data, indicating that the needs of members are not always being met properly. The cost of spin control includes the costs of publicity to address the discovery, plus any acute costs incurred to immediately modify procedures in place to close the perceived gap exposed by the discovery. Scrap and rework refers to costs associated with rolling back computations, undoing what had been done, and starting again. Information quality problems can impact levels of application service; if there are well-defined service-level agreements that are not being met, penalties for missing objective targets may be incurred. The inability to properly track all representational and contact information related to financial agreements with business partners can result in accounting failures, leading to potential overpayments or duplicated invoice payments. Ultimately, many of these issues roll up into increased workloads on systems as well as human resources, leading to system delays and increased process times.

Decreased Revenues

The inability to resolve uniquely identifiable parties within a financial system can result in accounting entries that cannot be found without searching for the exact variation, potentially leading to delayed or lost collections. At the same time, the inability to resolve uniquely identifiable customer/client records ultimately reduces the effectiveness of any party relationship management system, which in turn may lead to customer attrition. Bad data may result in a delay in exploiting information at the proper time, leading to lost opportunities such as not being able to execute transactions in a timely manner, inaccurately deactivating resources, or missing up-sell or cross-sell opportunities. Lastly, data quality problems interrupt streamlined workflow, reducing throughput and volume and thereby increasing cost per volume. In environments that rely on high volume, predictable volumes drive the average cost per transaction, which contributes to predictability across resource planning, volume pricing, and other ways to increase profit margin.

Decreased Confidence

Organizational management is a world highly influenced by information: its use, its guardianship, and its ownership model. Flawed data introduces organizational trust issues, leading to suspicion and anxiety. As more data is propagated into business intelligence platforms for decision support services, invalid or incorrect data leads to impaired decision-making and forecasting.
Impaired forecasting can reverberate across the organization: improper staffing, reduced investments in capital expenditures for hardware and software, incorrectly set service rates and product prices, and so on. However, the impact of poor decision-making is related to the actions taken based on that bad decision. When assessing the impact, it is valuable to:

- Identify specific business decisions that are impaired (suggestion: come up with a word better suited to your organization)
- Determine whether those decisions are directly attributable to bad data (and which data!)
- Only once you can directly attribute bad decisions to bad data, have your business partner explain the actual cost impact of the decision

Increased Risk

Regulatory risks (e.g., HIPAA, Sarbanes-Oxley) introduce constraints associated with information quality, with well-defined penalties for noncompliance. System development risk refers to the investment in building systems that cannot be rolled out until the end clients are satisfied with the levels of information quality. When attempting to assemble a master data management program, it is important to be able to integrate data from across many different systems; flawed data can cause an integration impact when it is left unaddressed. In the health care world, there can be serious health risks associated with incorrect data. An obviously severe example is the case of heart transplant patient Jesica Santillan, where inaccurate blood-typing information resulted in a botched heart-lung transplant, which not only led to the girl's death, but also prevented other critical patients from receiving needed donated organs. As an example of privacy risk, HIPAA's regulatory aspect underscores health insurers' ethical responsibility to ensure patient privacy. Properly capturing and managing data about with whom an individual's health information may be shared impacts many aspects of that person's life, ranging from protection against physical abuse to protection against employment discrimination, among others. Fraud risks may be masked throughout your applications when fraudulent behavior exploits information failures within the system.
Developing the Business Case

A data quality improvement program is a serious commitment on behalf of an organization. Its importance deserves to be effectively communicated to the business managers who will sponsor both the technology and the organizational infrastructure needed to ensure a successful program. And since we have already identified key impact dimensions and corresponding impact categories associated with poor data quality, some additional research can document:

- The quantification of the identified financial impacts,
- The actual root causes in the information processing that are correlated to those impacts,
- The costs to remediate those process failures, and
- A way to prioritize and plan the solutions to those problems.

We accumulate this information into an Impact Template, which documents the problems, the issues, the business impacts, and the quantifiers, all of which enable the determination of a yearly incurred impact.

Identifying Impacts

Most likely, there will already be some awareness of the existence of impacts of poor data quality. Using our impact taxonomy, we can begin to determine how the results of different data quality events can be grouped together, which simplifies the research necessary to determine financial impact.

Researching Financial Impacts

The next step in the process is to get a high-level view of the actual financial impacts associated with the problem. This step combines subject matter expertise with some old-fashioned detective work. Because we are trying to get a high-level impact assessment, we have some flexibility in exactness, and in fact much of the relevant information can be collected in a relatively short time. Anecdotes are good starting places, since they are indicative of high-impact, acute issues with high management visibility. Historical data associated with work and process flows during critical data events can provide cost and impact information. To understand the actual cost impact, delve deeper into the core of the story; ask these kinds of questions:

- What was it about the data that caused the problem?
- How big is the problem?
- Has this happened before? How many times?
- When this happened in the past, what was the remediation process? What was done to prevent it from happening again?

At the same time, consult issue-tracking system event logs and management reports on staff allocation for problem resolution, and review external impacts (e.g., stock price, customer satisfaction, management spin) to identify key quantifiers for business impact. The answers to these questions, combined with the research, will provide insight into quantifiable costs, which flow into the Impact Template (see Figure 3).
Figure 3: An example impact template. The template traces a single problem (missing product IDs and inaccurate product descriptions at the data entry point) through the issues it creates (an inability to clearly identify known products leading to inaccurate forecasts, out-of-stocks at customers, an inability to deliver orders, inefficiencies in sales promotions, distribution errors and rework, and unnecessary deliveries) to quantified business impacts such as slower turnover of stock, stock write-downs, lost revenue, slower speed to market, staff time, and increased shipping costs, each expressed as a yearly incurred impact.

The Data Quality Impact Matrix

The template in Figure 3 reflects an example of how invalid data entry at one point in the supply chain management process results in three impacts incurred at each of three different client applications: Inventory Management, Fulfillment, and Logistics. For each of these business areas, the corresponding impact quantifiers are identified, and then their associated costs are projected and expressed as yearly incurred impacts. In our impact matrix, the intention is to document the critical data quality problems, review the specific issues that occur within the enterprise, and then enumerate all the business impacts incurred by each of those issues. Once the impacts are specified, we simplify the process of assessing the actual costs, which we also incorporate in the matrix. The resulting matrix reveals the summed costs that can be attributed to poor data quality.

Correlating Impacts to Root Causes

The next step in developing the business case involves tracking the data flaws backward through the information processing flow to determine at which point in the process each data flaw was introduced. Since many data quality issues are very likely the result of process failures, eliminating the source of the introduction of bad data upstream will provide a much greater return on investment than just correcting bad data downstream. In our supply chain example, the interesting thing to note is that each of the client application users would assume that their issues were separate ones, yet they all stem from the same root cause. The value in assessing where the flaw was introduced into the process is that when we can show that one core problem has multiple impacts, the return on our investment in remediating the problem will be much greater.
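To show how the impact matrix supports this root-cause argument, the sketch below rolls hypothetical yearly incurred impacts up by problem and by business area, so the total cost attributable to one upstream flaw becomes visible. Every figure and label in it is invented for illustration and does not correspond to the amounts in Figure 3.

```python
from collections import defaultdict

# Hypothetical impact matrix rows; all amounts are illustrative only.
impact_matrix = [
    {"problem": "missing product id", "area": "Inventory Management",
     "impact": "Slower turnover of stock",    "yearly_impact": 25_000},
    {"problem": "missing product id", "area": "Inventory Management",
     "impact": "Stock write-downs",           "yearly_impact": 18_000},
    {"problem": "missing product id", "area": "Fulfillment",
     "impact": "Inability to deliver orders", "yearly_impact": 210_000},
    {"problem": "missing product id", "area": "Logistics",
     "impact": "Increased shipping costs",    "yearly_impact": 60_000},
]

# Roll up yearly incurred impacts by root-cause problem and by business area.
by_problem = defaultdict(int)
by_area = defaultdict(int)
for row in impact_matrix:
    by_problem[row["problem"]] += row["yearly_impact"]
    by_area[row["area"]] += row["yearly_impact"]

print(dict(by_problem))  # total yearly impact attributable to each root cause
print(dict(by_area))     # how that impact spreads across client applications
```

Seen this way, impacts that the three application teams would treat as separate problems sum to a single, much larger figure against the one upstream root cause.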
Costs to Remediate

An ROI calculation doesn't just take the benefits into account; it also must factor in the costs associated with the improvements. Therefore, we need to look at the specific problems that are the root causes and what it would cost to fix those problems. In this step, we evaluate the specific issues and develop a set of high-level improvement plans, including analyst and developer staff time along with the costs of acquiring data quality tools.

Figure 4: Example solution investment assessment
Problem: Missing product ID, inaccurate product description at data entry
Issue: Inability to clearly identify known products leads to inaccurate forecasts
Solution: Parsing and standardization; record linkage and monitoring tools for cleansing
Software costs: $150, for license, plus 15% annual maintenance
Staffing: 0.75 FTE for 1 year, 0.15 FTE for annual maintenance

Figure 4 shows an example solution investment assessment, documenting the cost of each solution, which also allows us to allocate the improvement to the documented problem (and its associated impacts). Because multiple problems across the enterprise may require the same solution, this opens up the possibility of economies of scale. It also allows us to amortize both the staff and technology investment across multiple problem areas, thereby further diluting the actual investment attributable to each area of business impact.

Projecting Return on Investment

We now have two artifacts that can be used to project the return on investment: the impact matrix and the solution assessment. Deploying a proposed solution will eliminate some number of yearly incurred impacts, and therefore the return on investment can be calculated as the difference between the sum of those yearly incurred impacts and the yearly resource and staffing requirements. In turn, we can use these results to prioritize our investment and program growth. By reviewing the criteria for providing value (e.g., biggest bang for the buck, fastest results, lowest up-front costs), management can select and plan the project investments that grow the data quality program in the most strategic way.
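The ROI arithmetic described above is simple enough to sketch. The calculation below follows the paper's description (benefit equals the yearly incurred impacts a solution eliminates; cost equals the yearly software and staffing spend), but the license amortization period, staff rate, and every dollar figure are hypothetical assumptions.

```python
def annualized_cost(license_fee, maintenance_rate, fte_count, fte_cost):
    """Yearly cost of a solution: amortized license, maintenance, and staffing.

    All inputs are hypothetical; a real assessment would amortize the license
    over its expected useful life and use fully loaded staff rates.
    """
    amortized_license = license_fee / 3            # assume a three-year useful life
    maintenance = license_fee * maintenance_rate
    staffing = fte_count * fte_cost
    return amortized_license + maintenance + staffing

def projected_roi(eliminated_yearly_impacts, yearly_cost):
    """Net yearly return and ROI ratio for a proposed data quality solution."""
    benefit = sum(eliminated_yearly_impacts)
    return benefit - yearly_cost, benefit / yearly_cost

# Hypothetical figures only.
impacts_eliminated = [25_000, 18_000, 210_000, 60_000]   # yearly incurred impacts removed
cost = annualized_cost(license_fee=150_000, maintenance_rate=0.15,
                       fte_count=0.75, fte_cost=120_000)

net, ratio = projected_roi(impacts_eliminated, cost)
print(f"yearly cost: ${cost:,.0f}, net return: ${net:,.0f}, ROI ratio: {ratio:.1f}x")
```

Running the same calculation for each candidate solution gives management a comparable set of net returns and ratios to use when choosing among "biggest bang for the buck," "fastest results," or "lowest up-front cost" strategies.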
Case Studies

In each of these case studies, individuals were aware of the existence of data quality problems, but were challenged to understand how those problems impacted the achievement of their business objectives. The process of identifying data flaws, correlating them to business impacts, and establishing a priority for remediation contributed to the organizations' allocating budget for data quality tools and, more importantly, data quality management.

Pharmaceutical Company

A subsidiary of a pharmaceutical company was developing a new application to be rolled out to the sales staff to assist in the sales and marketing process. However, the sales representatives hesitated to accept the new sales application, and while they contended that the quality of the data was insufficient to meet their needs, no one was able to effectively communicate the basis of their hesitance to senior management. A high-level assessment of the data revealed a number of potential anomalies, including:

- Duplicated entries for customers, suppliers, and research grantees
- Missing telephone and address contact information
- Significant use of represented null values (e.g., "N/A", "none", "????")

The assessment revealed a number of impacts related to these deficiencies, including:

- Investment risk: The sales staff's suspicions regarding the quality of the data were confirmed, in that duplicate data and missing contact information would not improve the sales and marketing process. Their reluctance to use the new application therefore put the significant investment in its development at risk.
- Regulatory compliance: The assessment revealed that a number of customers were also research grantees, which exposed a potential inadvertent risk of violating the federal Anti-Kickback Statute.
- Decreased service capability: Duplicated and/or missing data contributed to a decrease in the ability to satisfy negotiated client-side service-level agreements.

As a result of the data quality assessment and subsequent impact analysis, senior management recognized that the financial impacts and compliance risks warranted an investment in data quality improvement. The organization has defined a staff position whose role is to oversee a data quality program, and has purchased data quality tools to be used in data quality analysis, improvement, and monitoring.

Health Insurance Company

Ongoing data flaws that had propagated into this health insurance company's data warehouse had resulted in flawed decision making related to premiums, provider rates, and financial forecasting, as well as member and group management issues such as underwriting, sales and marketing, and member attrition. In addition, inaccurate and untimely data impacted regulatory reporting, exposing the organization to compliance risk. As part of the data quality program, a straightforward system was developed to proactively test a number of data quality rules and assertions before the data was loaded into the data warehouse. The approach described in this paper was used to assess the cost impacts of the data quality issues that had occurred, providing a baseline for determining return on investment.
Over a one-year period, this proactive validation captured a relatively large number of data flaws whose impacts were prevented through early intervention. Subsequent analysis revealed that the return on the proactive validation was 6-10 times the investment in the development and maintenance of the validation application.

Government: Department of Defense

The Department of Defense Guidelines on Data Quality Management describe an assessment of the cost impacts of data quality, with expectations set by specifying that resultant costs should be quantified wherever possible. As described in the report, the inability to match payroll records to the official employment record can cost millions in payroll overpayments to deserters, prisoners, and "ghost" soldiers. In addition, the inability to correlate purchase orders to invoices is a major problem in unmatched disbursements. In the DoD, resultant costs, such as payroll overpayments and unmatched disbursements, may be significant enough to warrant extensive changes in processes, systems, policy and procedure, and AIS data designs.

In one specific instance, the assessment approach identified specific issues with Bill of Materials (BoM) data and identified approaches for solutions, with the recommendation of purchasing particular data quality tools. The benefits of the solution included the ability to quickly produce meaningful results that were easily understood by functional users and management, as well as projected cost savings as high as 40% in both the time dedicated to reacting to and diagnosing data quality problems and the time spent re-entering incorrect data. In one specific case, the projected net savings exceeded $3.7 million. Just as importantly, the automated data quality analysis tool and the structured levels of analysis enabled the project team to quantify the benefits of the project in a format suitable for high-level discussions.

Telecommunications Company

Inaccurate and invalid data can plague service-oriented companies, such as those in the telecommunications industry. For one telecommunications company, an assessment of the quality of its component inventory, billing records, and customer records revealed a number of financial impacts:

- Numerous high-bandwidth components had been misconfigured, resulting in decreased service bandwidth
- Discrepancies existed between services provided and services actually billed
- Inconsistent and invalid information about the system created excess capacity in some areas while capacity was simultaneously strained in others

As a result of this analysis, near-term funding for data quality efforts was inspired by the more traditional approach of revenue assurance through the detection of underbilling. The telecommunications company was able to derive the following benefits as a result of data quality improvement:

- Revenue assurance/underbilling analysis indicated revenue leakage of just over 3 percent of revenue attributable to poor data quality
- 49 misconfigured (but assumed to be unusable) high-bandwidth circuits were returned to productive use, thereby increasing bandwidth
- Cleansing of customer data revealed more than 200,000 previously unknown potential customers
Summary: How to Get Started

As we can see from these case studies, successful business cases can be developed for investing in data quality. By investing a small amount of analyst time, and with senior management sponsorship, here is an outline for developing your data quality business case:

1) Identify 5 business objectives impacted by the quality of data
2) For each of those business objectives:
   a. Determine cost/impact areas for each data flaw
   b. Identify key quantifiers for those impacts
   c. At a high level, assess the actual costs associated with that problem
3) For each data quality problem:
   a. Review solution options for that problem
   b. Determine costs to implement
4) Seek economies of scale by exploiting the same solution multiple times

At the conclusion of this exercise, you should have the right information to assemble a business case that not only justifies the investment in the staff and data quality technology used in developing an information quality program, but also provides baseline measurements and business-directed metrics that can be used to plan and measure ongoing program performance.
SOLUTION BRIEF Service Assurance Solutions from CA Technologies for VCE Vblock Systems can you improve service quality and availability while optimizing operations on VCE Vblock Systems? agility made possible
SAP ERP FINANCIALS ENABLING FINANCIAL EXCELLENCE. SAP Solution Overview SAP Business Suite
SAP Solution Overview SAP Business Suite SAP ERP FINANCIALS ENABLING FINANCIAL EXCELLENCE ESSENTIAL ENTERPRISE BUSINESS STRATEGY PROVIDING A SOLID FOUNDATION FOR ENTERPRISE FINANCIAL MANAGEMENT 2 Even
8 Key Requirements of an IT Governance, Risk and Compliance Solution
8 Key Requirements of an IT Governance, Risk and Compliance Solution White Paper: IT Compliance 8 Key Requirements of an IT Governance, Risk and Compliance Solution Contents Introduction............................................................................................
Copyright 2000-2007, Pricedex Software Inc. All Rights Reserved
The Four Pillars of PIM: A white paper on Product Information Management (PIM) for the Automotive Aftermarket, and the 4 critical categories of process management which comprise a complete and comprehensive
Data Governance Implementation
Service Offering Implementation Leveraging Data to Transform the Enterprise Benefits Use existing data to enable new business initiatives Reduce costs of maintaining data by increasing compliance, quality
White Paper. Successful Legacy Systems Modernization for the Insurance Industry
White Paper Successful Legacy Systems Modernization for the Insurance Industry This document contains Confidential, Proprietary and Trade Secret Information ( Confidential Information ) of Informatica
Business Drivers for Data Quality in the Utilities Industry
Solutions for Enabling Lifetime Customer Relationships. Business Drivers for Data Quality in the Utilities Industry Xxxxx W HITE PAPER: UTILITIES WHITE PAPER: UTILITIES Business Drivers for Data Quality
Continuous Monitoring and Auditing: What is the difference? By John Verver, ACL Services Ltd.
Continuous Monitoring and Auditing: What is the difference? By John Verver, ACL Services Ltd. Call them the twin peaks of continuity continuous auditing and continuous monitoring. There are certainly similarities
5 Best Practices for SAP Master Data Governance
5 Best Practices for SAP Master Data Governance By David Loshin President, Knowledge Integrity, Inc. Sponsored by Winshuttle, LLC Executive Summary Successful deployment of ERP solutions can revolutionize
The Informatica Platform for Data Driven Healthcare
Solutions Brochure The Informatica Platform for Data Driven Healthcare Solutions for Providers This document contains Confidential, Proprietary and Trade Secret Information ( Confidential Information )
White Paper March 2009. Government performance management Set goals, drive accountability and improve outcomes
White Paper March 2009 Government performance management Set goals, drive accountability and improve outcomes 2 Contents 3 Business problems Why performance management? 4 Business drivers 6 The solution
Transforming life sciences contract management operations into sustainable profit centers
Point of View Generating life sciences Impact Transforming life sciences contract management operations into sustainable profit centers Globally, life sciences companies spend anywhere from $700 million
building a business case for governance, risk and compliance
building a business case for governance, risk and compliance contents introduction...3 assurance: THe last major business function To be integrated...3 current state of grc: THe challenges... 4 building
Finding, Fixing and Preventing Data Quality Issues in Financial Institutions Today
Finding, Fixing and Preventing Data Quality Issues in Financial Institutions Today FIS Consulting Services 800.822.6758 Introduction Without consistent and reliable data, accurate reporting and sound decision-making
DataFlux Data Management Studio
DataFlux Data Management Studio DataFlux Data Management Studio provides the key for true business and IT collaboration a single interface for data management tasks. A Single Point of Control for Enterprise
