Item Master Data Rationalization: Laying the Foundation for Continuous Business Process Improvement
metagroup.com
META [6382] October 2004

Bad master data (master data that is inaccurate, duplicated, incomplete, or out-of-date) hampers the accuracy of analysis, causes expensive exceptions that must be resolved, and prevents refinement of processes. Moreover, when bad data or flawed analysis is shared with partners, not only are the associated processes affected, but the level of trust is also undermined. Under these conditions, frustrated employees tend to continue their manual processes, and future efforts in collaboration, integration, and automation become more difficult due to employee resistance. In short, bad master data will destroy the best-designed business processes.

A META Group White Paper Sponsored by Zycus
Contents

Executive Summary
  Master Data Has a Material Impact on the Financial and Operational Health of an Organization
  Item Master Data Requires Specialized Attention
  Clean Item Master Data Enables a Wide Range of Business Initiatives
  Master Data Rationalization Is the Foundation for Leveraging Existing ERP Investment
  Master Data Rationalization Protects the SAP Master Data Management Investment
  Successful Sourcing and Procurement Initiatives Depend on Clean, Reliable Master Data
  Optimum Master Data Maturity Enables Real-Time Analysis and Control of Business Processes
Introduction
  ERP Systems Are Indispensable to the Business Operations of Large Organizations
  Business Process Configuration in ERP Is Important, But Master Data Quality Affects the Accuracy, Efficiency, and Reliability of the Process
  Keeping Enterprise Applications in Shape Requires Constant Master Data Maintenance
  Successful Business Initiatives Depend on Clean, Organized, and Reliable Master Data
  CEOs and CFOs Who Are Accountable Under Sarbanes-Oxley Need Good Data
The Role of Master Data in the Enterprise
  Master Data Quality Issues Ripple Across the Enterprise
  The Difference Between Primary and Derived Master Data Records
  A Disorganized Approach Toward Maintaining Master Data Is Common
  Item Master Records Present Particular Challenges
  Item Master Record Quality Problems Have Numerous Root Causes
  The Effect of Bad Item Master Data on Business Initiatives Is Profound
  Master Data Rationalization Is a Prerequisite for Successful Business Initiatives
Understanding the Process of Master Data Rationalization
  Step 1: Extraction and Aggregation
  Step 2: Cleansing
  Step 3: Classification
  Step 4: Attribute Extraction and Enrichment
  Step 5: Final Duplicate Record Identification
  Automation Is Not Optional
Integrating Master Data Rationalization Into ERP Consolidation or Upgrade Planning
Moving Your Organization Through the Data Quality Maturity Model
  Level 1: Aware
  Level 2: Reactive
  Level 3: Proactive
  Level 4: Managed
  Level 5: Optimized
Bottom Line
  Clean, Reliable Master Data Enables Successful Enterprise Initiatives
  Master Data Rationalization Is Required to Ensure Master Data Quality
  Building Master Data Rationalization Into ERP Consolidation Planning
  Master Data Rationalization Is a Key Component in Achieving Data Quality Maturity
Executive Summary

Master Data Has a Material Impact on the Financial and Operational Health of an Organization

Business executives depend on reliable reporting of operational and financial activities to guide their decisions. The US government even mandates reliable and accurate reporting under the Sarbanes-Oxley Act (SOX). The underlying enabler to meet the demands of business executives and the government is the master data found in enterprise software systems. Master data represents the items a company buys, the products it sells, the suppliers it manages, and the customers it has. When the master data is inaccurate, out-of-date, or duplicated, business processes magnify and propagate these errors, and the company's financial and operational results are affected. The consequences are profound. Shareholders lose their confidence and market capitalization falls. Executives begin to manage by instinct rather than from facts, and results suffer. Suppliers lose faith in the collaborative processes and build in safety stock. All these scenarios are likely and have a direct effect on the financial and operational health of the enterprise.

Item Master Data Requires Specialized Attention

Customer relationship management (CRM) projects have long focused on the quality of customer master records managed by CRM systems. Item master records, on the other hand, often have no clear owner to champion the cause of clean, reliable item master data, because the data often resides in various systems and is used by different departments. However, these records require special attention, because they contain the most pervasive master data in the enterprise and form the basis for many other dependent master records and business objects, such as purchase orders and pricing records. Moreover, item master records often have hundreds of attributes that are used by various systems and business processes. It is critical that item master records be properly classified and have complete and accurate attributes, because they form the foundation for accuracy and efficiency in enterprise software systems.

Clean Item Master Data Enables a Wide Range of Business Initiatives

There are numerous business initiatives underway in an organization at any given time that are focused on cost reductions, operational efficiencies, or strategic synergies. A company's supply organization may engage in strategic sourcing or enterprise spend management, while the product management group may focus on part reuse. The merger-and-acquisition team may be evaluating potential targets based partially on synergies to be won in the consolidation of operations, supply chains, or product lines.
The successful ongoing operation of such initiatives rests on reliable reporting: What do we spend? What do we buy, and from whom? What parts do products have in common? What can be substituted? When item master data is not clean, managers do not have reliable data for the reporting needed to drive these initiatives forward.

Master Data Rationalization Is the Foundation for Leveraging Existing ERP Investment

Most IT organizations are challenged in driving continuing positive return on investment from their ERP systems. Many are consolidating their various ERP and other enterprise software systems to meet that challenge. In particular, many SAP customers facing the need to upgrade as SAP ends support of R/3 4.6c in 2006 in favor of R/3 Enterprise or mySAP ERP are using this opportunity to consolidate and upgrade. This is the ideal time to launch a master data rationalization initiative. Indeed, the item master record format and classification scheme in SAP system #1 is typically not the same as in SAP system #2. Before the systems can be consolidated, the master data must be rationalized according to an agreed-upon format, classification scheme, and set of attribute definitions. Otherwise, companies risk contaminating their upgraded and consolidated ERP systems with even more bad data.

Master Data Rationalization Protects the SAP Master Data Management Investment

We also note that a large number of SAP customers are preparing to implement SAP's Master Data Management (MDM) functionality found in the NetWeaver platform. Implementing SAP MDM does not eliminate the need for master data rationalization. To the contrary, it emphasizes the need for master data rationalization, because its function is the syndication and management of the various master data objects in enterprise software systems. SAP customers should protect their investment and undertake master data rationalization before implementing MDM, to ensure that only clean master data is managed by SAP MDM.

Successful Sourcing and Procurement Initiatives Depend on Clean, Reliable Master Data

Companies implementing enterprise spend management learn very quickly that the quality of their master data holds the key to unlocking the promised value. Master data such as vendor and item master records forms the basis for all other associated spend data and business objects, such as purchase orders and goods receipts. The ugly reality is that this master data exists in many systems and is often incomplete, duplicated, and wrongly classified or unclassified.
Extracting, organizing, enriching, and analyzing this data potpourri is a major challenge for any organization, but it must be done. Without clean, reliable master data, a spend management initiative will fail. Master data rationalization (that is, the process of extracting, normalizing, classifying, enriching, and staging data for analysis) is fundamental to the spend management process. Organizations should invest in processes and tools that automate the master data rationalization process to the greatest extent possible. The goal is to establish a repeatable, reliable process that enables confident spend data analysis on an ongoing basis.

Optimum Master Data Maturity Enables Real-Time Analysis and Control of Business Processes

Our research shows that the maturity of organizational master data quality practices varies greatly, from the most basic but not uncommon state of master data chaos, to the rare case of pervasive, real-time, high-quality master data. Organizations should understand where they are in the master data maturity model and chart a path to achieving an optimized level of master data quality maturity: a level where they will be able to exploit spend data on a real-time basis to drive continual improvements in supply-side processes. Key to this evolution is the implementation of automated processes for the cleansing, enrichment, and maintenance of master data.

Introduction

ERP Systems Are Indispensable to the Business Operations of Large Organizations

Enterprise software applications have become so indispensable that they have a material effect on company valuations. Over the years, we have seen companies incur charges totaling hundreds of millions of dollars because of ERP problems, companies miss the market with their products because of ERP problems, and mergers fail to deliver intended results because of ERP problems. The health and continuing welfare of a company's ERP system is clearly an issue for the CEO. ERP systems, once a transformational investment where companies invested enormous sums without a clear understanding of the outcome, have dropped down the stack to become a true backbone of the organization. Accordingly, the focus surrounding their maintenance and economic performance has shifted from a mindset of "I'll pay whatever it takes to get it in and beat my competition" to one of "I want Six Sigma quality, and I want to minimize my operational costs," as described by META Group's IT Application Portfolio Management theory. Chief information officers not only are tasked with the responsibility for improving the performance of their ERP systems, but they also face the challenge of continuing to mine return from their ERP investment.
Business Process Configuration in ERP Is Important, But Master Data Quality Affects the Accuracy, Efficiency, and Reliability of the Process

Organizations dedicate much attention and many resources to improving their business processes. The focus of many ERP efforts revolves around process optimization and process extension to other enterprise systems such as CRM or supplier relationship management (SRM). As the process broadens to involve other organizational units or enterprise applications, many organizations discover that process efficiency and reliability suffer. Accurate reporting is no longer possible, and confidence in the systems drops. Investigation into these problems reveals that bad master data is often the root cause of these process degradations.

Entropy: The Cause of Diminishing Returns

Entropy (noun): a process of degradation or running down, or a trend to disorder. (Source: Merriam-Webster)

Entropy affects spend data as well as all other elements in the universe. Cleaning and organizing spend data once is not sufficient to win continued savings and efficiencies. Organizations must implement an automated, repeatable, scalable process to ensure the completeness, accuracy, and integrity of spend data.

Bad master data (master data that is inaccurate, duplicated, incomplete, or out-of-date) hampers the accuracy of analysis, causes expensive exceptions that must be resolved, and prevents refinement of processes. Moreover, when bad data or flawed analysis is shared with partners, not only are the associated processes affected, but the level of trust is also undermined. Under these conditions, frustrated employees tend to continue their manual processes, and future efforts in collaboration, integration, and automation become more difficult due to employee resistance. In short, bad master data will destroy the best-designed business processes.

Keeping Enterprise Applications in Shape Requires Constant Master Data Maintenance

Master data in enterprise applications such as ERP, SRM, or CRM is subjected to data entropy from the first moment after go-live. Entropy, the universal trend toward disorder, takes many forms. In the application itself, incomplete validation routines, poor master data maintenance policies, or subsequent master data loads can contaminate the system. Across a business process that spans more than one application, master data record formats and contents can vary, leading to inaccurate transactions and analysis. In the fight against master data disorder, organizations must institute master data quality tools, policies, and procedures.
Master data requires continuous maintenance, from the time it is created or loaded to the time it is archived, or business results will suffer. Essential to master data quality is the process of master data rationalization. A typical enterprise IT architecture comprises several enterprise applications and many sources of master data. Integrated business processes that tap these sources as they wind their way through the various systems suffer when there is no agreement among systems on something as fundamental as an item master record. Master data rationalization is the process that ensures that master data is properly classified, with complete and normalized attributes, and that it is fully suitable for use throughout the enterprise IT landscape.

Successful Business Initiatives Depend on Clean, Organized, and Reliable Master Data

Business initiatives such as ERP system consolidation, enterprise spend management, total inventory visibility, or component reuse promise high returns, whether from reduced IT expenditures, as in the case of an ERP consolidation, or from more cost-effective designs and faster time to market, as in the case of component reuse in the product design cycle. All of these business initiatives have one thing in common, though: a dependency on clean, organized, and reliable master data. Master data that is correctly classified with a common taxonomy and that has normalized and enriched attributes yields a granular level of visibility that is critical to search and reporting functions. Before undertaking any of these or similar business initiatives, organizations must ensure that they have instituted the policies, procedures, and tools to ensure master data quality.

CEOs and CFOs Who Are Accountable Under Sarbanes-Oxley Need Good Data

The Sarbanes-Oxley Act, passed in 2002, underscores the importance of master data quality for the CEO and CFO. This broad act addresses financial reporting and business processes that have an effect on financial reporting. Under Sarbanes-Oxley, company officers must certify compliance of their financial reports with the act. As companies work toward compliance, many discover that the quality of their master data has a direct and material impact on their financial reporting, making the state of master data a Sarbanes-Oxley issue (see Figure 1). Accordingly, CEOs and CFOs are using the Sarbanes-Oxley Act as the impetus for consolidating ERP systems and for driving visibility into corporate spending and inventories. Surveys within our client base confirm an increase in all these activities.
Figure 1: SOX Sections Impacted by Master Data

Organizations must also assess readiness, requirements, and controls across individual sections of the Sarbanes-Oxley Act:

Section 404: Internal Controls
- Capability to comprehensively aggregate financial data
- Accessibility of financial reporting details to executives
- Support for frequent flash reporting
- Availability to management of tools for drill-down analysis of accounting reports
- Capability to routinely highlight key analysis areas based on tolerances and financial metrics
- Capability to segment reporting into material or significant elements
- Adequacy of visibility into any outsourced processes that impact SOX compliance

Sections 302 and 906: CEO/CFO Sign-Off
- Degree and efficiency of financial/ERP consolidation and integration
- Availability and quality of financial data marts/data warehouses
- Quality of financial reporting/OLAP capabilities
- Consistency of defined financial and related metadata
- Availability to management of compliance dashboards and related tools
- Support for frequent flash reporting
- Quality of ERP, best-of-breed, and legacy system controls

Source: META Group

The Role of Master Data in the Enterprise

Master Data Quality Issues Ripple Across the Enterprise

Master data represents the fundamental building blocks of operational enterprise software systems and the key components of the company, including:
- The items it makes
- The items it buys
- The employees who work there
- The customers to whom it sells
- The suppliers it buys from
When any of these records becomes inaccurate, other dependent master records also become corrupt. The ripple effect is pronounced as these records feed transactions and business processes. Reporting becomes inaccurate and suspect, managers lose visibility of actual operational results, and the company and its shareholders suffer.

The Difference Between Primary and Derived Master Data Records

Master data records can be classified into two main categories:

Primary master data records: These records are like prime numbers; they cannot be reduced further. Employee, customer, vendor, and item master records are all examples of primary master data records.

Derived master data records: Derived master data records are created by linking primary master data records together. Linking a customer record with an item record, for example, creates the basis for a specific pricing record that is used in sales and trade management applications. The number of derived master data records is an order of magnitude greater than the number of primary master data records, and managing them is a challenge in itself. However, if the primary master data records are bad, the challenge becomes insurmountable.

A Disorganized Approach Toward Maintaining Master Data Is Common

Organizations rarely have a unified approach toward managing primary master data. Customer records typically fall under the purview of the CRM team, and customer data is maintained as part of that initiative. Vendor master records normally belong to procurement or accounts payable, and their maintenance is administered by these departments. Item master data records, on the other hand, often have no clear owner.

Item Master Records Present Particular Challenges

Item master records have numerous sources. Engineers and designers can create parts, procurement can source new parts, and suppliers can load their part masters into the organization's systems. Compounding the complexity surrounding the item master record is the number of systems in which such records reside. Consider the simple example of a product as it moves from design to manufacturing: The design engineer creates a product using prototype parts that are supplied by a prototype supplier. These parts have unique part numbers and often are procured by the engineer. This normally takes place in the engineer's own product life-cycle management software application. After winning approval, the design is released to manufacturing.
There, the manufacturing bill of materials calls out series production parts that must be sourced by procurement. This takes place in another application, typically an ERP system. The service parts management group creates item master records for its service parts and works through an after-market sales organization in yet another system. Item master records abound, yet rarely will they have complete and accurate information, since they are remade in independent applications as new part records. The organization has no single version of the truth and has lost its ability to effectively manage its resources.

Item Master Record Quality Problems Have Numerous Root Causes

This simple scenario highlights a few of the root causes of item master data quality problems, which include:

Various master record formats: As a rule, no two software systems share the same master record format. Therefore, a one-to-one correspondence between fields is not possible, and any data migration between systems will result in incomplete and inaccurate records.

Various systems of record: The vast majority of organizations use more than one software application. Product organizations may have many applications, including computer-aided design (CAD), sourcing, manufacturing execution, ERP, warehousing and logistics, and CRM applications. Integration of all of these applications is not a guarantee of data integrity.

Incongruent naming standards or no naming standards: Item codes and descriptions are too often a window into the creativity of those who created the master record. Consequently, abbreviations proliferate. Deciphering units of measure or part names becomes an IQ test, and search engines fail to find the right part. Duplicate records are created when existing parts cannot be found.

Lack of a standardized classification convention: As an aid to finding items and for reporting purposes, item master records are classified, but all too often organizations use proprietary or incomplete classification systems, which leads to items being wrongly classified or not classified at all. Consequently, reports are incomplete and inaccurate, which has an impact on decision making.

Incomplete fields: This is a simple yet effective way that master record quality is reduced. Inadequate validation routines often are the cause of incomplete fields being passed on. Imprecise validation routines also affect a related issue: incorrect entries. (A minimal validation sketch follows this list.)
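To illustrate the last two root causes, the sketch below shows what a basic field-level validation routine for item master records might look like. It is a minimal, hypothetical Python example; the field names, required-field list, and unit-of-measure table are assumptions made for illustration and are not taken from any particular ERP system.

```python
# Minimal sketch of a field-level validation routine for item master records.
# Field names, required fields, and the unit-of-measure table are illustrative
# assumptions, not the schema of any specific ERP system.

REQUIRED_FIELDS = ["part_number", "description", "unit_of_measure", "supplier"]

# Normalize the creative abbreviations that tend to appear in free-text fields.
UOM_SYNONYMS = {
    "ea": "EACH", "each": "EACH", "pc": "EACH", "pcs": "EACH",
    "bx": "BOX", "box": "BOX",
    "rm": "REAM", "ream": "REAM",
}

def validate_item_record(record: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the record passes."""
    errors = []

    # Root cause: incomplete fields passed on by inadequate validation.
    for field in REQUIRED_FIELDS:
        value = str(record.get(field, "") or "").strip()
        if not value:
            errors.append(f"missing required field: {field}")

    # Root cause: incongruent naming standards for units of measure.
    uom = str(record.get("unit_of_measure", "")).strip().lower()
    if uom and uom not in UOM_SYNONYMS:
        errors.append(f"unrecognized unit of measure: {record['unit_of_measure']}")

    return errors

if __name__ == "__main__":
    sample = {"part_number": "75A01", "description": "PRNTR PPR 8.5X11", "unit_of_measure": "rm"}
    print(validate_item_record(sample))  # -> ['missing required field: supplier']
```

Running such a check at the point of entry, rather than after the fact, is what keeps incomplete or cryptic records from being passed downstream in the first place.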
The Effect of Bad Item Master Data on Business Initiatives Is Profound

Bad master data is not an IT problem, though the IT organization is often called upon to solve it. The success and measurable impact of business initiatives depend on consistent, high-quality master data.

Merger and Acquisition Activities

The synergies driving M&A activity often depend on consolidating operations and inventory as well as on sharing and integrating designs and leveraging common parts. Realizing these synergies depends on the ability to merge item master data files and to accurately report on the status of these initiatives. Failure to gain a common view of the item master data of both companies not only diminishes the synergies and drags out the integration process, but also threatens the success of the merger or acquisition itself (a business event typically far more expensive than the cost of the required data maintenance).

ERP System Consolidation

More and more organizations are consolidating their ERP instances, targeting savings and efficiencies. Business drivers for these consolidations include SOX compliance pressures, the end of SAP R/3 version support, system harmonization across business units or geographies, and architectural upgrades that allow companies to leverage service-oriented architectures. However, attempting consolidation before the master data is rationalized will lead to a contaminated single instance. Cleansing the data once it lands in the new system is enormously expensive and time consuming.

Enterprise Spend Management

The initial business benefits from enterprise spend management are substantial. Organizations routinely report cost savings of 5%-30% after aggregating spending and reducing the number of suppliers for a given commodity. However, many companies find that they hit a wall after a first round of spend management and that incremental gains afterward are small to nonexistent. These organizations are discovering that the familiar 80/20 rule has been turned on its head: the first 20% of savings and efficiency gains is the easy part, while the remaining 80% presents a formidable challenge that most organizations and software solutions are not currently equipped to tackle. Bad master data is a major culprit.

Sourcing

Sourcing projects, and the make-versus-buy decision process in general, require a view of what already exists in the approved parts lists and approved vendor lists. Bad item master data can result in supplier proliferation, part proliferation, and a failure to leverage existing contracts.
Inventory Visibility

Warehouse management systems, ERP systems, and third-party logistics service providers manage aspects of parts and finished goods inventories. This fragmented system landscape clouds inventory visibility and leads to overpurchasing, stock-outs, inventory write-offs, and disruptions of manufacturing operations. The impact can be measured in lost customers, missed deadlines, and financial losses.

Part Reuse in Design

An engineer's design decisions can have lasting financial impacts on product margin as well as on the organization. Part reuse depends on the engineer's ability to find the right part based on attributes. When existing parts are incompletely or wrongly classified and attributes are missing, frustrated engineers find it easier to create a new part than to perform an extended manual search. This undermines sourcing strategies and merger-and-acquisition synergies, and further bloats inventories.

Master Data Rationalization Is a Prerequisite for Successful Business Initiatives

The pervasive nature of item master data affects the success, efficiency, and material impact of many business processes and initiatives, as we have described above. Organizations must establish a strategy for item master data that addresses data quality across the master data life cycle, from inception or introduction to archiving. The first step in this process is master data rationalization.

Understanding the Process of Master Data Rationalization

The case for clean, reliable master data is clear, and we have seen that it is essential for master data to be clean from its inception or its introduction into a business application. Master data rationalization is the first step that organizations should undertake in their drive for master data quality. Master data rationalization is a multistep, iterative process that involves the extraction, aggregation, cleansing, classification, and attribute enrichment of item master data. Key to this process is the proper classification and attribute enrichment of the item master record. Most systems use some sort of taxonomy to classify items. However, for use throughout the enterprise and with external partners, organizations should select a taxonomy that delivers depth and breadth and allows granular visibility of the item, such as UNSPSC (the United Nations Standard Products and Services Code).
Rarely do organizations have the in-house resources to evaluate and select the proper taxonomies. Accordingly, organizations should ensure that their consulting partners can demonstrate experience with taxonomy selection and deployment. Item record attributes play a similarly important role. Attributes define the item and are important for successful parametric searches. Incomplete or incorrect attributes prevent items from being found in the systems, resulting in proliferation of parts and bloated inventories. Before the development of sophisticated automated tools to perform these functions, master data rationalization was an expensive and cumbersome process, and rarely a successful undertaking.

Step 1: Extraction and Aggregation

The master data rationalization process begins with extraction of the master data from the various systems of record, whether they are internal systems such as ERP, SRM, or legacy applications, or external systems such as purchasing card suppliers. These records are aggregated in a database that serves as the source for the follow-on processing. Initial validation can take place at this point to send bad records back for repair (see Figure 2).

Step 2: Cleansing

Once aggregated, the data is subjected to an initial screening to identify duplicate records (see Figure 3). Part numbers, descriptions, and attributes (e.g., supplier names) are parsed using predefined rules. Exact matches and probable matches are identified and published. Weeding out duplicate records is an iterative process that requires subject-matter experts to identify those records that cannot be culled in the first round. In this process, rule-based processing alone is inadequate to manage the volume of data; statistical processing and artificial intelligence are needed to ensure the maximum level of automation and accuracy. (A minimal matching sketch follows Figure 2.)

Figure 2: Extraction and Aggregation Prior to Duplicate Identification

Data sources (ERP, AP, T&E, and PO systems) are extracted into templates and pass through initial validation into the master data rationalization environment and its data warehouse/consolidated database; corrupt records are returned to the source for repair. (Source: META Group)
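The sketch below illustrates the kind of rule-based normalization and initial duplicate screening described in Step 2, using the part-number and supplier-name variants shown in Figure 3. It is a simplified, hypothetical Python example; the normalization rules are assumptions made for illustration only, and real rationalization tools layer statistical and AI-based matching on top of rules like these.

```python
# Simplified sketch of Step 2 (initial duplicate identification).
# Normalization rules are illustrative assumptions; production tools combine
# such rules with statistical and AI-based matching.
import re
from collections import defaultdict
from difflib import SequenceMatcher

ABBREVIATIONS = {"GE": "GENERAL ELECTRIC", "GEN": "GENERAL", "ELEC": "ELECTRIC"}

def normalize_part_number(raw: str) -> str:
    # Strip separators and map the letter O to zero, so 75A01, 75AO1, and 75A-01 collide.
    cleaned = re.sub(r"[\s\-_/\.]", "", raw.upper())
    return cleaned.replace("O", "0")

def normalize_supplier(raw: str) -> str:
    # Drop punctuation and legal suffixes, then expand a few known abbreviations.
    name = re.sub(r"[\.,]", " ", raw.upper())
    name = re.sub(r"\b(INC|CORP|CO|LLC|LTD)\b", " ", name)
    expanded = " ".join(ABBREVIATIONS.get(t, t) for t in name.split())
    # Collapse repeated words introduced by expansion (e.g., "GENERAL ELECTRIC ELECTRIC").
    deduped = []
    for word in expanded.split():
        if not deduped or deduped[-1] != word:
            deduped.append(word)
    return " ".join(deduped)

def find_duplicate_groups(records: list[dict]) -> list[list[dict]]:
    """Group records whose normalized part number and supplier name match exactly."""
    groups = defaultdict(list)
    for rec in records:
        key = (normalize_part_number(rec["part_number"]), normalize_supplier(rec["supplier"]))
        groups[key].append(rec)
    return [group for group in groups.values() if len(group) > 1]

def probable_match(a: str, b: str, threshold: float = 0.85) -> bool:
    """Flag near-matches (probable duplicates) for subject-matter-expert review."""
    return SequenceMatcher(None, a, b).ratio() >= threshold

if __name__ == "__main__":
    records = [
        {"part_number": "75A01", "supplier": "General Electric"},
        {"part_number": "75AO1", "supplier": "GE Electric Inc."},
        {"part_number": "75A-01", "supplier": "Gen. Elec."},
    ]
    for group in find_duplicate_groups(records):
        print([r["part_number"] for r in group])  # -> ['75A01', '75AO1', '75A-01']
```

Exact key collisions correspond to the "exact matches" the text describes; looser comparisons such as probable_match correspond to the "probable matches" that are published for expert review.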
Figure 3: Initial Duplicate Identification Based on Part Number and Supplier Name

Part number variants 75A01, 75AO1, and 75A-01 are recognized as the same part; supplier name variants General Electric, GE Electric Inc., and Gen. Elec. are recognized as the same supplier, General Electric. (Source: META Group)

Step 3: Classification

Classification is a critical step. The master records must be classified correctly, completely, and to a level of detail that makes the record easy to identify for search and reporting functions. Organizations often have multiple classification schemas. Although it is not necessary to choose one particular taxonomy, since taxonomies can coexist, it is necessary to have a taxonomy that supports the enterprise's business initiatives. Our research confirms that the use of widely adopted taxonomies such as UNSPSC, NATO, or eCl@ss improves the performance of enterprise spend management strategies significantly over legacy taxonomies. This step is best executed with the help of a partner that has deep experience in taxonomy deployment (see Figure 4 and the classification sketch that follows it).

Figure 4: An Example of a Hierarchical Taxonomy (UNSPSC)

Power generation and distribution machinery and accessories > Power motors > Motors > Induction motors; alternating current (AC) motors; synchronous motors; single-phase motors; multi-phase motors. (Source: META Group)
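As a companion to Step 3, the sketch below shows one simple way automated classification can work: matching keywords in a normalized item description against a small hierarchical taxonomy and reporting a rough confidence score. The mini-taxonomy, keywords, and scoring are hypothetical illustrations; they are not real UNSPSC codes, and commercial classifiers use far richer statistical and learning-based models.

```python
# Minimal sketch of keyword-based auto-classification against a hierarchical taxonomy.
# The taxonomy paths and keywords below are illustrative assumptions only; they are
# not real UNSPSC entries, and commercial classifiers use much richer models.
import re

TAXONOMY = {
    ("Office supplies", "Paper products", "Printer or copier paper"):
        {"paper", "ppr", "ream", "letter", "a4"},
    ("Power machinery", "Motors", "Induction motors"):
        {"motor", "induction", "hp", "rpm"},
    ("Hardware", "Fasteners", "Nuts"):
        {"nut", "hex", "thread", "stainless"},
}

def tokenize(description: str) -> set[str]:
    """Lowercase the description and split it into rough word tokens."""
    return set(re.findall(r"[a-z0-9]+", description.lower()))

def classify(description: str):
    """Return the best-matching taxonomy path and a naive confidence in [0, 1]."""
    tokens = tokenize(description)
    best_path, best_score = None, 0.0
    for path, keywords in TAXONOMY.items():
        score = len(tokens & keywords) / len(keywords)  # fraction of category keywords present
        if score > best_score:
            best_path, best_score = path, score
    return best_path, round(best_score, 2)

if __name__ == "__main__":
    path, confidence = classify("Printer paper 8.5 x 11, 24lb, 500ct ream")
    print(" > ".join(path), confidence)
    # -> Office supplies > Paper products > Printer or copier paper 0.4
```

A confidence score of this kind is also what supports the evaluation criterion discussed later: measuring the share of records that clear an agreed confidence threshold without human review.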
Step 4: Attribute Extraction and Enrichment

Although classification helps determine what an item is and how it relates to other items, attributes define the characteristics of the item and can run into the hundreds per item. Unfortunately, attributes in the item record may be left blank, cryptic, or inaccurate. In particular, ERP master records are full of cryptic attributes, due to poor validations and limited text-field lengths. In this step, attributes are extracted, normalized, and completed as part of record enrichment (see Figure 5 and the extraction sketch that follows it). This establishes the difference between the discovery of a metal nut and the discovery of a ¼-20 hex nut made of 316 stainless steel. Because of the sheer volume of attributes to be extracted and enriched, an automated approach is the only practical way to execute this step.

Figure 5: Attribute Extraction and Enrichment

Item record after initial normalization and classification: UNSPSC description "Printer or copier paper"; item description "Printer paper 8 1/2 x 11, 24 lb., 500 ct."; supplier Office Depot. The record is passed through an attribute extraction and enrichment engine with web cross-referencing. Item record after attribute extraction and enrichment: UNSPSC description "Printer or copier paper"; item description "Inkjet printer paper"; size US letter; weight 24 lb.; brightness 104; unit of sale ream; quantity 500; supplier Office Depot. (Source: META Group)
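The sketch below gives a flavor of the attribute extraction step: pulling structured attributes such as size, weight, and pack quantity out of a cryptic free-text description like the one in Figure 5. The regular expressions and attribute names are hypothetical simplifications; production enrichment engines combine pattern libraries, reference content, and web cross-referencing.

```python
# Simplified sketch of attribute extraction from a cryptic item description.
# Patterns and attribute names are illustrative assumptions, not a vendor's rule set.
import re

ATTRIBUTE_PATTERNS = {
    # "81/2 x 11" or "8.5 x 11" -> a paper size attribute
    "size":     re.compile(r"\b8\s?1/2\s?x\s?11\b|\b8\.5\s?x\s?11\b", re.IGNORECASE),
    # "24lb." or "24 lb" -> basis weight
    "weight":   re.compile(r"\b(\d+)\s?lbs?\.?", re.IGNORECASE),
    # "500ct." or "500 ct" -> pack quantity
    "quantity": re.compile(r"\b(\d+)\s?ct\.?", re.IGNORECASE),
}

def extract_attributes(description: str) -> dict:
    """Extract and normalize a few attributes from free text."""
    attributes = {}
    if ATTRIBUTE_PATTERNS["size"].search(description):
        attributes["size"] = "US letter"          # normalize 8 1/2 x 11 to the standard name
    if m := ATTRIBUTE_PATTERNS["weight"].search(description):
        attributes["weight"] = f"{m.group(1)} lb."
    if m := ATTRIBUTE_PATTERNS["quantity"].search(description):
        attributes["quantity"] = int(m.group(1))
    return attributes

if __name__ == "__main__":
    raw = "Printer paper 81/2 x 11, 24lb., 500ct."
    print(extract_attributes(raw))
    # -> {'size': 'US letter', 'weight': '24 lb.', 'quantity': 500}
```

Once attributes are structured in this way, parametric search and the final duplicate comparison in Step 5 become tractable.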
Step 5: Final Duplicate Record Identification

Once the records have been classified and their attributes enriched, the records undergo a second round of duplicate identification (see Figure 6). With much more of the record information normalized, enriched, and complete, most of the duplicates are automatically identified during this step. Although this may vary by category, there is usually a small number of records that still must be evaluated by subject-matter experts to determine their status. (A minimal comparison sketch follows Figure 6.)

Figure 6: Final Duplicate Record Identification

Item records #1 and #2 after attribute enrichment carry identical values: UNSPSC description "Printer or copier paper"; item description "Inkjet printer paper"; size US letter; weight 24 lb.; brightness 104; unit of sale ream; quantity 500; supplier Office Depot. Such records are routed to the subject-matter expert for duplicate identification. (Source: META Group)
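To make Step 5 concrete, the sketch below compares two enriched records attribute by attribute and computes a simple agreement score: exact matches are flagged as duplicates, near-matches are routed to a subject-matter expert as in Figure 6, and everything else is kept distinct. The field list and thresholds are illustrative assumptions, not a standard algorithm.

```python
# Simplified sketch of Step 5: comparing enriched records to find duplicates.
# The compared fields and the review threshold are illustrative assumptions.

COMPARE_FIELDS = ["unspsc_description", "item_description", "size",
                  "weight", "brightness", "unit_of_sale", "quantity", "supplier"]

def agreement(rec_a: dict, rec_b: dict) -> float:
    """Fraction of compared attributes on which the two enriched records agree."""
    matches = sum(1 for f in COMPARE_FIELDS if rec_a.get(f) == rec_b.get(f))
    return matches / len(COMPARE_FIELDS)

def dispose(rec_a: dict, rec_b: dict, duplicate_at: float = 1.0, review_at: float = 0.8) -> str:
    score = agreement(rec_a, rec_b)
    if score >= duplicate_at:
        return "flag as duplicate"
    if score >= review_at:
        return "route to subject-matter expert"
    return "keep as distinct records"

if __name__ == "__main__":
    record_1 = {"unspsc_description": "Printer or copier paper", "item_description": "Inkjet printer paper",
                "size": "US letter", "weight": "24 lb.", "brightness": 104,
                "unit_of_sale": "Ream", "quantity": 500, "supplier": "Office Depot"}
    record_2 = {**record_1, "supplier": "Office Depot Inc."}  # a near match after enrichment
    print(dispose(record_1, record_2))  # -> route to subject-matter expert
```

Because most fields are now populated and normalized, the bulk of duplicates can be resolved automatically, leaving only the borderline cases for expert review.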
Automation Is Not Optional

Applying this master data rationalization methodology requires deployment of an automated solution. Without automation, it will be impossible to process the volume of records required to make an impact on the overall performance of the enterprise initiatives that depend on item master data. In particular, automating the classification and attribute enrichment steps is crucial to the overall process. Organizations should examine available solutions based on a number of criteria, including:

- How repeatable is the process? Investing in a process that is not repeatable is a waste of money and resources.
- How strong are the algorithms used for automated classification? Organizations should note the percentage of records that make it through screening with an 80% confidence level that the classification is correct.
- Can the system learn? The strength of artificial intelligence is that self-learning systems require less support over time, saving users money and resources.
- What is the throughput of the system? Data loads must be accomplished in short order: business will not wait. High throughput with high accuracy is a sign of a strong system.
- How effective is the human support? Service providers offer expertise in setting up taxonomies and classifying materials. Users should look for experience with their particular industry as well as with the toolset they have chosen to use. Systems integrators should have experience with both the master data rationalization tools and the ERP systems.
- Can the process be integrated into daily operations? Users should look for tools that support the classification of master data at the source. An automated classification tool that is integrated into the business application ensures that any new part is automatically classified with the correct codes before that part record is used. Data quality is thereby maintained.

Currently, there are several avenues that organizations can take to attain master data quality (see Figure 7). One approach is to limit the scope of the data to those records used by the asset management system. This typically involves a large amount of manual intervention, because asset management solution vendors believe that the low volume of data does not require significant automation. Needless to say, this approach fails because of its narrow focus and lack of scalability. Catalog content management providers also offer aspects of master data rationalization, though their focus remains primarily on the commerce side rather than on the procurement and supply sides of the organization. Finally, there are service providers that offer one-time cleansings using manual content factories to screen the data.
Again, this approach is neither scalable nor repeatable.

Figure 7: Incomplete Approaches to Item Master Rationalization

ETL Solutions (Extract, Transform, Load): These solutions are too generic in functionality to deal with the complexities of item master records. ETL solutions do not perform classification and attribute enrichment. Moreover, there is considerable effort and expense in setting up these solutions for repeated use.

Asset Management Solutions: Asset management solutions typically target only a subset of item master data, namely MRO (maintenance, repair, and operations) items. This is not sufficient for ERP consolidation or for comprehensive spend analysis. In addition, there is significant manual effort involved.

Commerce Catalog Solutions: Commerce catalog solutions tend to focus only on the items sold, rather than those procured. These solutions are less experienced in tapping the various internal and external sources of item data and fall short in the subject-matter expert department. Furthermore, they do not automate attribute enrichment, automating instead only the workflow.

Manual Content Factories: Manual content factories, or manual approaches in general, were common before the advent of artificial intelligence tools for master data rationalization. The manual approach cannot scale, nor can it meet the throughput demands of large projects.

Source: META Group

Organizations should instead evaluate their prospective solution providers on their ability to deliver an approach toward master data rationalization that automates as much of the classification, cleansing, attribute extraction, and attribute enrichment as possible on a repeatable basis. In addition, the solution provider should bring to the table experience in taxonomies and specific industry verticals, along with the automated solution.
Integrating Master Data Rationalization Into ERP Consolidation or Upgrade Planning

An organization should not consider consolidating its enterprise business systems without building master data rationalization into the project. To do otherwise is to destroy the opportunity to leverage a single instance of clean data for business improvement. Users should ensure that their systems integrators understand the value and power of master data rationalization and that they have experience in laying the foundation for a successful ERP consolidation. Master data rationalization is a significant step on the path toward achieving data quality maturity. Without this first step, further activities are like trying to plug holes in the dike with one's fingers.

Moving Your Organization Through the Data Quality Maturity Model

We have seen the extent to which bad data limits the success of enterprise initiatives, and we have examined the strong business case in support of a systematic approach to master data quality. The process of master data rationalization is straightforward. The next logical question is where to start. Determining where to start a master data management project begins with identifying where the organization is in the data quality maturity model. With spend data proving to be a true corporate asset, enterprises must adopt a method for gauging their information maturity, that is, how well they manage and leverage information to achieve corporate goals. Only by measuring information maturity can organizations hope to put in place the appropriate programs, policies, architecture, and infrastructure to manage and apply information better.

Figure 8: The Data Quality Maturity Pyramid

Level 5 (Optimized): Operate real-time data monitoring and enrichment to enable real-time business reporting.
Level 4 (Managed): Measure data quality continually and analyze it for impact on business operations.
Level 3 (Proactive): Institute upstream data quality processes, such as auto-classification at the point of data entry.
Level 2 (Reactive): Conduct a targeted data and process audit, avoiding one-time fixes, and begin master data rationalization.
Level 1 (Aware): Create awareness, linking data quality to business initiatives, and get the CEO/CIO involved.

Source: META Group
Our data quality maturity model comprises five levels of maturity, from awareness to optimization. Advancing from one level to the next delivers real value to the organization and its partners. This model should serve as a guide to aid organizations in understanding the necessary changes and the associated impact on the organization, its business processes, its information technology infrastructure, and its applications (see Figure 8).

Level 1: Aware

These organizations live in master data chaos. They generally have some awareness that data quality problems are affecting business execution and decision making, but they have no formal initiatives to cleanse data. Individuals typically initiate data quality processes on an ad hoc basis as needs arise. A common example is that of suppliers needing to be identified for a particular commodity and efforts being focused on weeding out duplicate entries. We find that approximately 30% of Global 2000 enterprises currently fit this profile. To move to the next level, these organizations should strive to improve internal awareness of and communication about the impact of data quality, and should link data quality to specific business initiatives and performance indicators. Chief financial officers and chief procurement officers are key players in driving the organization to understand that it is suffering because of bad data. This should set the stage for action.

Level 2: Reactive

Suspicion and mistrust abound at this level. Decisions and transactions are often questioned, due to suspicion or knowledge of data quality problems, and managers revert to instinct-driven decision making rather than relying on reports. Some manual or homegrown batch cleansing is performed at a departmental or application level within the application database. At this level, data quality issues tend to most affect field or service personnel, who rely on access to correct operational data to perform their roles effectively. About 45% of enterprises fit this profile. To avoid the organizational paralysis that accompanies thoughts of a sweeping overhaul of the company's master data, targeted data audits and process assessments should be the first order of business for these organizations. Spend data should be audited by experts who can identify remediation strategies, and business processes such as item master record maintenance should be assessed for their impact on data quality. Limited-scope initiatives leveraging hosted data management solutions often deliver a quick return on investment and prove the business case for wider deployment. To exit this level permanently requires some investment and a commitment from line-of-business managers to improve data quality.
Level 3: Proactive

Moderate master data maturity can be ascribed to organizations that perceive master data as genuine fuel for improved business performance. These organizations have incorporated data quality into the IT charter, and data cleansing is typically performed downstream by department-level IT shops or in a data warehouse by commercial data quality software. Processes include:
- Record-based batch cleansing (e.g., name/address)
- Identification
- Matching
- Weeding out duplicates
- Standardization

These processes mend data sufficiently for strategic and tactical decision making. Our research indicates that 15% to 20% of enterprises fit this profile. To reach the next data quality echelon, these organizations should implement forms of data management policy enforcement to stem data quality problems at a business process level. In addition, they should concentrate on moving beyond the one-time repair of glaring data quality problems and simple edits to continuous monitoring and remediation of data closer to the source of input. For example, leading spend management organizations deploy automated solutions that automatically classify spend data as it is put into the system.

Level 4: Managed

Organizations in this penultimate data quality maturity level view data as a critical component of the IT portfolio. They consider data quality to be a principal IT function and one of their major responsibilities. Accordingly, data quality is regularly measured and monitored for accuracy, completeness, and integrity at an enterprise level, across systems. Data quality is concretely linked to business issues and process performance. Most cleansing and standardization functions are performed at the source (i.e., where data is generated, captured, or received), and item master record data quality monitoring is performed on an international level. These organizations now have rigorous, yet flexible, data quality processes that make incorporating new data sources and snaring and repairing unforeseen errors straightforward, if not seamless. Data quality functions are built into major business applications, enabling confident operational decision making. Only 5% of enterprises have achieved this level of data quality-related information maturity. Evolving to the pinnacle of data quality excellence demands continued institutionalization of data quality practices.
Figure 9: Key Data Quality Characteristics

Accuracy: A measure of information correctness
Consistency: A measure of semantic standards being applied
Completeness: A measure of gaps within a record
Entirety: A measure of the quantity of entities or events captured versus those universally available
Breadth: A measure of the amount of information captured about an entity or event
Depth: A measure of the amount of entity or event history/versioning
Precision: A measure of exactness
Latency: A measure of how current a record is
Scarcity: A measure of how rare an item of information is
Redundancy: A measure of unnecessary information repetition

Source: META Group
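To connect the characteristics in Figure 9 to the Level 4 practice of continually measuring data quality, the sketch below computes two of the simpler measures, completeness (gaps within records) and redundancy (unnecessary repetition), over a set of item master records. The field list and scoring are hypothetical illustrations rather than standard formulas.

```python
# Minimal sketch of measuring two Figure 9 characteristics over item master records.
# Field names and scoring are illustrative assumptions, not standard metric definitions.
from collections import Counter

TRACKED_FIELDS = ["part_number", "description", "unspsc", "unit_of_measure", "supplier"]

def completeness(records: list[dict]) -> float:
    """Share of tracked fields that are populated across all records (1.0 = no gaps)."""
    total = len(records) * len(TRACKED_FIELDS)
    filled = sum(1 for r in records for f in TRACKED_FIELDS if str(r.get(f, "") or "").strip())
    return filled / total if total else 1.0

def redundancy(records: list[dict]) -> float:
    """Share of records that repeat an already-seen part number (0.0 = no duplicates)."""
    counts = Counter(r.get("part_number") for r in records)
    duplicates = sum(n - 1 for n in counts.values())
    return duplicates / len(records) if records else 0.0

if __name__ == "__main__":
    items = [
        {"part_number": "75A01", "description": "Inkjet printer paper", "unspsc": "",
         "unit_of_measure": "REAM", "supplier": "Office Depot"},
        {"part_number": "75A01", "description": "Printer paper", "unspsc": "",
         "unit_of_measure": "", "supplier": ""},
    ]
    print(f"completeness: {completeness(items):.2f}, redundancy: {redundancy(items):.2f}")
    # -> completeness: 0.60, redundancy: 0.50
```

Tracking such scores over time, rather than computing them once, is what distinguishes the managed and optimized maturity levels from one-time cleanup efforts.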
Bottom Line

Clean, Reliable Master Data Enables Successful Enterprise Initiatives

As organizations consolidate ERP systems, engage in strategic sourcing, or launch enterprise spend management initiatives, they find that the efficiency and accuracy of their business processes and reporting depend on the item master data. More than just good housekeeping, a methodical and automated approach to cleansing, classifying, and enriching item master data lays the foundation for the continuing success of many enterprise initiatives.

Master Data Rationalization Is Required to Ensure Master Data Quality

There are many approaches to attaining master data quality. Some systems rely on field-level validations and some use workflow for review and approval, while others combine techniques in an ad hoc fashion. However, without the consistent, systematic approach of master data rationalization, our research shows that these techniques fail to deliver the level of consistency and quality needed for ongoing operations.

Building Master Data Rationalization Into ERP Consolidation Planning

Few organizations and systems integrators dedicate enough attention and resources to master data rationalization in their ERP consolidation planning. Successful organizations will plan far ahead of the small window in the schedule allotted to the master data load and will plan for master data rationalization with an experienced service provider. Once the data is loaded and go-live is reached, it is too late to rethink the impact of poor master data quality.

Master Data Rationalization Is a Key Component in Achieving Data Quality Maturity

Our research shows that the maturity of organizational master data quality practices varies greatly, from the most basic but not uncommon state of master data chaos, to the rare case of pervasive, real-time, high-quality master data. Organizations should understand where they are in the master data maturity model and chart a path to achieving an optimized level of master data quality maturity: a level where they will be able to exploit spend data on a real-time basis to drive continual improvements in supply-side processes. Key to this evolution is the implementation of automated processes for the cleansing, enrichment, and maintenance of master data.

Bruce Hudson is a program director, Barry Wilderman is a senior vice president, and Carl Lehmann is a vice president with Enterprise Application Strategies, a META Group advisory service. For additional information on this topic or other META Group offerings, contact [email protected].
About META Group

Return On Intelligence (SM)

META Group is a leading provider of information technology research, advisory services, and strategic consulting. Delivering objective and actionable guidance, META Group's experienced analysts and consultants are trusted advisors to IT and business executives around the world. Our unique collaborative models and dedicated customer service help clients be more efficient, effective, and timely in their use of IT to achieve their business goals. Visit metagroup.com for more details on our high-value approach.

208 Harbor Drive, Stamford, CT
Phone (203) / Fax (203)
metagroup.com

Copyright 2004 META Group, Inc. All rights reserved.
Supply Chain Optimization
White Paper Supply Chain Optimization Maintenance Repair and Operations (MRO) supply chain, when given serious consideration at all, is frequently seen as a challenge a high-cost, but necessary evil to
What to Look for When Selecting a Master Data Management Solution
What to Look for When Selecting a Master Data Management Solution What to Look for When Selecting a Master Data Management Solution Table of Contents Business Drivers of MDM... 3 Next-Generation MDM...
SAP Thought Leadership Business Intelligence IMPLEMENTING BUSINESS INTELLIGENCE STANDARDS SAVE MONEY AND IMPROVE BUSINESS INSIGHT
SAP Thought Leadership Business Intelligence IMPLEMENTING BUSINESS INTELLIGENCE STANDARDS SAVE MONEY AND IMPROVE BUSINESS INSIGHT Your business intelligence strategy should take into account all sources
Strategic Data Governance
Authoring Success into Your Data Strategic Data Governance Strategic Data Advisory Data Cleansing, Enrichment, Migration Data Governance Advisory About Strategic Data Governance LLP SDG is a global data
BRIDGE. the gaps between IT, cloud service providers, and the business. IT service management for the cloud. Business white paper
BRIDGE the gaps between IT, cloud service providers, and the business. IT service management for the cloud Business white paper Executive summary Today, with more and more cloud services materializing,
Top 10 Root Causes of Data Quality Problems. White Paper
Top 10 Root Causes of Data Quality Problems White Paper Table of Contents #1 - Typographical Errors and Non-Conforming Data... 3 #2 - Information Obfuscation... 4 #3 - Renegade IT and Spreadmarts... 5
Enterprise Data Management for SAP. Gaining competitive advantage with holistic enterprise data management across the data lifecycle
Enterprise Data Management for SAP Gaining competitive advantage with holistic enterprise data management across the data lifecycle By having industry data management best practices, from strategy through
MDM and Data Warehousing Complement Each Other
Master Management MDM and Warehousing Complement Each Other Greater business value from both 2011 IBM Corporation Executive Summary Master Management (MDM) and Warehousing (DW) complement each other There
A discussion of information integration solutions November 2005. Deploying a Center of Excellence for data integration.
A discussion of information integration solutions November 2005 Deploying a Center of Excellence for data integration. Page 1 Contents Summary This paper describes: 1 Summary 1 Introduction 2 Mastering
Masterminding Data Governance
Why Data Governance Matters The Five Critical Steps for Data Governance Data Governance and BackOffice Associates Masterminding Data Governance 1 of 11 A 5-step strategic roadmap to sustainable data quality
E N T E R P R I S E D A T A M A N A G E M E N T & LEVERAGING SAP S EIM SOLUTION
E N T E R P R I S E D A T A M A N A G E M E N T & LEVERAGING SAP S EIM SOLUTION Preparing for ERP, Customer Insight, and Merger & Acquisition Activity AN EXECUTIVE SUMMARY WHITE PAPER Authored by: John
WHITEPAPER. Creating and Deploying Predictive Strategies that Drive Customer Value in Marketing, Sales and Risk
WHITEPAPER Creating and Deploying Predictive Strategies that Drive Customer Value in Marketing, Sales and Risk Overview Angoss is helping its clients achieve significant revenue growth and measurable return
Realizing the True Power of Insurance Data: An Integrated Approach to Legacy Replacement and Business Intelligence
Realizing the True Power of Insurance Data: An Integrated Approach to Legacy Replacement and Business Intelligence Featuring as an example: Guidewire DataHub TM and Guidewire InfoCenter TM An Author: Mark
Introduction to Business Intelligence
IBM Software Group Introduction to Business Intelligence Vince Leat ASEAN SW Group 2007 IBM Corporation Discussion IBM Software Group What is Business Intelligence BI Vision Evolution Business Intelligence
IBM Software A Journey to Adaptive MDM
IBM Software A Journey to Adaptive MDM What is Master Data? Why is it Important? A Journey to Adaptive MDM Contents 2 MDM Business Drivers and Business Value 4 MDM is a Journey 7 IBM MDM Portfolio An Adaptive
Data Virtualization A Potential Antidote for Big Data Growing Pains
perspective Data Virtualization A Potential Antidote for Big Data Growing Pains Atul Shrivastava Abstract Enterprises are already facing challenges around data consolidation, heterogeneity, quality, and
Master data deployment and management in a global ERP implementation
Master data deployment and management in a global ERP implementation Contents Master data management overview Master data maturity and ERP Master data governance Information management (IM) Business processes
ebook 4 Steps to Leveraging Supply Chain Data Integration for Actionable Business Intelligence
ebook 4 Steps to Leveraging Supply Chain Data Integration for Actionable Business Intelligence Content Introduction 3 Leverage a Metadata Layer to Serve as a Standard Template for Integrating Data from
PLM and ERP Integration: Business Efficiency and Value A CIMdata Report
PLM and ERP Integration: Business Efficiency and Value A CIMdata Report Mechatronics A CI PLM and ERP Integration: Business Efficiency and Value 1. Introduction The integration of Product Lifecycle Management
Supply Chains: From Inside-Out to Outside-In
Supply Chains: From Inside-Out to Outside-In Table of Contents Big Data and the Supply Chains of the Process Industries The Inter-Enterprise System of Record Inside-Out vs. Outside-In Supply Chain How
Fortune 500 Medical Devices Company Addresses Unique Device Identification
Fortune 500 Medical Devices Company Addresses Unique Device Identification New FDA regulation was driver for new data governance and technology strategies that could be leveraged for enterprise-wide benefit
Management Update: The Cornerstones of Business Intelligence Excellence
G00120819 T. Friedman, B. Hostmann Article 5 May 2004 Management Update: The Cornerstones of Business Intelligence Excellence Business value is the measure of success of a business intelligence (BI) initiative.
WHITE PAPER. The 7 Deadly Sins of. Dashboard Design
WHITE PAPER The 7 Deadly Sins of Dashboard Design Overview In the new world of business intelligence (BI), the front end of an executive management platform, or dashboard, is one of several critical elements
Tapping the benefits of business analytics and optimization
IBM Sales and Distribution Chemicals and Petroleum White Paper Tapping the benefits of business analytics and optimization A rich source of intelligence for the chemicals and petroleum industries 2 Tapping
CIC Audit Review: Experian Data Quality Enterprise Integrations. Guidance for maximising your investment in enterprise applications
CIC Audit Review: Experian Data Quality Enterprise Integrations Guidance for maximising your investment in enterprise applications February 2014 Table of contents 1. Challenge Overview 03 1.1 Experian
An Enterprise Resource Planning Solution (ERP) for Mining Companies Driving Operational Excellence and Sustainable Growth
SAP for Mining Solutions An Enterprise Resource Planning Solution (ERP) for Mining Companies Driving Operational Excellence and Sustainable Growth 2013 SAP AG or an SAP affi iate company. All rights reserved.
The Kroger Company: Transforming the Product Data Management Landscape
CASE STUDY The Kroger Company: Transforming the Product Data Management Landscape Executive Summary Challenge Evolving consumer expectations, e-commerce and regulatory requirements are driving the demand
DATA QUALITY MATURITY
3 DATA QUALITY MATURITY CHAPTER OUTLINE 3.1 The Data Quality Strategy 35 3.2 A Data Quality Framework 38 3.3 A Data Quality Capability/Maturity Model 42 3.4 Mapping Framework Components to the Maturity
DESIGNED FOR YOUR INDUSTRY. SCALED TO YOUR BUSINESS. READY FOR YOUR FUTURE. SAP INDUSTRY BRIEFING FOR HEATING, VENTILATION, AIR CONDITIONING, AND
DESIGNED FOR YOUR INDUSTRY. SCALED TO YOUR BUSINESS. READY FOR YOUR FUTURE. SAP INDUSTRY BRIEFING FOR HEATING, VENTILATION, AIR CONDITIONING, AND PLUMBING EQUIPMENT MANUFACTURERS BEST-RUN HVAC AND PLUMBING
Supply Chain Management Build Connections
Build Connections Enabling a business in manufacturing Building High-Value Connections with Partners and Suppliers Build Connections Is your supply chain responsive, adaptive, agile, and efficient? How
Introduction to Strategic Supply Chain Network Design Perspectives and Methodologies to Tackle the Most Challenging Supply Chain Network Dilemmas
Introduction to Strategic Supply Chain Network Design Perspectives and Methodologies to Tackle the Most Challenging Supply Chain Network Dilemmas D E L I V E R I N G S U P P L Y C H A I N E X C E L L E
PROCUREMENT: A Strategic Lever for Bottom Line Improvement
PROCUREMENT: A Strategic Lever for Bottom Line Improvement Headquarters 610 Old York Road Jenkintown, PA 19046 tel 877.935.ICGC fax 877.ICGC.339 www.icgcommerce.com white paper A Time for Impact With continued
Product Lifecycle Management in the Food and Beverage Industry. An Oracle White Paper Updated February 2008
Product Lifecycle Management in the Food and Beverage Industry An Oracle White Paper Updated February 2008 Product Lifecycle Management in the Food and Beverage Industry EXECUTIVE OVERVIEW Companies in
Cost-effective supply chains: Optimizing product development through integrated design and sourcing
Cost-effective supply chains: Optimizing product development through integrated design and sourcing White Paper Robert McCarthy, Jr., associate partner, Supply Chain Strategy Page 2 Page 3 Contents 3 Business
White Paper February 2009. IBM Cognos Supply Chain Analytics
White Paper February 2009 IBM Cognos Supply Chain Analytics 2 Contents 5 Business problems Perform cross-functional analysis of key supply chain processes 5 Business drivers Supplier Relationship Management
Successful Outsourcing of Data Warehouse Support
Experience the commitment viewpoint Successful Outsourcing of Data Warehouse Support Focus IT management on the big picture, improve business value and reduce the cost of data Data warehouses can help
How To Design An Invoice Processing And Document Management System
WIPRO CONSULTING SERVICES Business Methods Series Source to Pay: Transforming Processing and Document Management Paulo Jose Freixa Calhau Preto Senior Manager, Finance & Accounting Transformation Practice,
ROUTES TO VALUE. Business Service Management: How fast can you get there?
ROUTES TO VALUE Business Service : How fast can you get there? BMC Software helps you achieve business value quickly Each Route to Value offers a straightforward entry point to BSM; a way to quickly synchronize
Accenture Federal Services. Federal Solutions for Asset Lifecycle Management
Accenture Federal Services Federal Solutions for Asset Lifecycle Management Assessing Internal Controls 32 Material Weaknesses: identified in FY12 with deficiencies noted in the management of nearly 75%
Streamlining the Order-to-Cash process
Streamlining the Order-to-Cash process Realizing the potential of the Demand Driven Supply Chain through Order-to-Cash Optimization Introduction Consumer products companies face increasing challenges around
BEST PRACTICES IN DEMAND AND INVENTORY PLANNING
WHITEPAPER BEST PRACTICES IN DEMAND AND INVENTORY PLANNING for Food & Beverage Companies WHITEPAPER BEST PRACTICES IN DEMAND AND INVENTORY PLANNING 2 ABOUT In support of its present and future customers,
GxP Process Management Software. White Paper: Software Automation Trends in the Medical Device Industry
GxP Process Management Software : Software Automation Trends in the Medical Device Industry Introduction The development and manufacturing of a medical device is an increasingly difficult endeavor as competition
Mergers and Acquisitions: The Data Dimension
Global Excellence Mergers and Acquisitions: The Dimension A White Paper by Dr Walid el Abed CEO Trusted Intelligence Contents Preamble...............................................................3 The
Generating analytics impact for a leading aircraft component manufacturer
Case Study Generating ANALYTICS Impact Generating analytics impact for a leading aircraft component manufacturer Client Genpact solution Business impact A global aviation OEM and services major with a
Enterprise Resource Planning Analysis of Business Intelligence & Emergence of Mining Objects
Enterprise Resource Planning Analysis of Business Intelligence & Emergence of Mining Objects Abstract: Build a model to investigate system and discovering relations that connect variables in a database
Best Practices in Contract Migration
ebook Best Practices in Contract Migration Why You Should & How to Do It Introducing Contract Migration Organizations have as many as 10,000-200,000 contracts, perhaps more, yet very few organizations
Data Quality Assessment. Approach
Approach Prepared By: Sanjay Seth Data Quality Assessment Approach-Review.doc Page 1 of 15 Introduction Data quality is crucial to the success of Business Intelligence initiatives. Unless data in source
The Ten How Factors That Can Affect ERP TCO
The Ten How Factors That Can Affect ERP TCO Gartner RAS Core Research Note G00172356, Denise Ganly, 1 February 2010, V1RA9 04082011 Organizations tend to focus on the what that is, the vendor or the product
Why enterprise data archiving is critical in a changing landscape
Why enterprise data archiving is critical in a changing landscape Ovum white paper for Informatica SUMMARY Catalyst Ovum view The most successful enterprises manage data as strategic asset. They have complete
The Advantages of a Golden Record in Customer Master Data Management. January 2015
The Advantages of a Golden Record in Customer Master Data Management January 2015 Anchor Software White Paper The Advantages of a Golden Record in Customer Master Data Management The term master data describes
Global Sourcing. Conquer the Supply Chain with PLM and Global Sourcing Solutions. Visibility Planning Collaboration Control
ENOVIA Global Sourcing Conquer the Supply Chain with PLM and Global Sourcing Solutions Visibility Planning Collaboration Control Direct materials sourcing (DMS) is seen as not just an important technology
Lean in Product Development
Lean in Product Development Key Strategies to Successfully Implement Lean Development and the Synergies with Advanced Product Quality Planning by Marc Lind 2008 Aras Corporation. All Rights Reserved. Contents
The Butterfly Effect on Data Quality How small data quality issues can lead to big consequences
How small data quality issues can lead to big consequences White Paper Table of Contents How a Small Data Error Becomes a Big Problem... 3 The Pervasiveness of Data... 4 Customer Relationship Management
Point of View: FINANCIAL SERVICES DELIVERING BUSINESS VALUE THROUGH ENTERPRISE DATA MANAGEMENT
Point of View: FINANCIAL SERVICES DELIVERING BUSINESS VALUE THROUGH ENTERPRISE DATA MANAGEMENT THROUGH ENTERPRISE DATA MANAGEMENT IN THIS POINT OF VIEW: PAGE INTRODUCTION: A NEW PATH TO DATA ACCURACY AND
Operational Excellence. Integrity management. Cost management. Profitability management. Knowledge management. Plan & Strategy EAM LIFE CYCLE
Industry specific EAM problem Asset intensive Companies, whether in the Downstream or Upstream business, are under ever increasing pressure to optimize the Life Cycle Performance of their asset base in
BUSINESS INTELLIGENCE. Keywords: business intelligence, architecture, concepts, dashboards, ETL, data mining
BUSINESS INTELLIGENCE Bogdan Mohor Dumitrita 1 Abstract A Business Intelligence (BI)-driven approach can be very effective in implementing business transformation programs within an enterprise framework.
