DATA MIGRATION MANAGEMENT
A Methodology: Sustaining data integrity after the go-live and beyond
WHITE PAPER
ENTERPRISE DATA LIFECYCLE MANAGEMENT

Data migration is a core practice within Utopia's Enterprise Data Lifecycle Management (EDLM) framework. EDLM is a comprehensive data architecture strategy that combines business processes, data processes, governance practices, and applied technologies. The result is a framework of principles and practices for managing and extending the lifecycle of data across an enterprise, from creation through archiving. By improving the quality, accessibility, and usability of data, the breadth of its lifecycle is widened, increasing the financial returns on the data.

The Data Migration Management methodology is an EDLM best practice that extends the lifecycle by moving and transforming data from aging legacy repositories to new leading-edge systems. The same migration methodology is used to archive the data when it truly has reached its end of life, completing the cycle. The stages in the EDLM lifecycle are reflected in Figure 1.

FIGURE 1: EDLM Stages

Data migration projects can occur anywhere in the lifecycle, but the Creation and Integration stages are where the data is first captured and stored, and then integrated into the organization's information environment. Integrating the data entails moving it from temporary capture storage to a system repository, or migrating it from a source system to a new target.

There are four dimensions to EDLM, and they form the framework upon which organizational demands are placed and solution practices are devised and employed. The four dimensions are:

- LIFECYCLE: time and management of data as it ages
- PEOPLE: data governance, data stewards, business metrics and value, and usage
- PROCESSES: data flow processes, business workflow processes, standard operating procedures
- TECHNOLOGY: systems architecture, applications, repositories, integration layers, IT governance

It is from those four dimensions that the Data Migration Management methodology was first conceived, then tested, and then tempered through one customer migration after another. When an EDLM framework is established within an organization, the objective of the program is to transcend geographies, businesses, and functions to support enterprise-common data processes (such as migration) wherever they are needed.
THE CHALLENGES OF DATA MIGRATION PROJECTS

Historically, data migration projects at many companies have been plagued with risk and have delayed go-live events, resulting in costly budget overruns and acrimonious recriminations. A frequently overlooked aspect of any new ERP system deployment, for example, is the integrity of the data against which the new business application executes. Too often system owners forget the root reason for deploying the system: to house and manage the data. The existence of the system is, after all, justified by the data.

Data Migration Should Therefore Be a High Priority in New System Deployments

Traditionally, many systems integrators implementing a new system will prioritize their focus in this order of importance:

- Application design
- Implementation planning
- Integration
- Change management
- Data migration

How would you prioritize your company's last project and its associated success? For many projects, data migration is one of the last priorities, and systems integrators will often defer to the client on the completeness and integrity of the data file to be loaded into the system. Indeed, data migration is often viewed as simply transferring data between systems. Yet the business impact can be significant and detrimental to business continuity when proper data management is not applied during the migration. Without a thorough review and understanding of the business processes, and consideration of how the data will be consumed, the suboptimal performance of the old legacy system will be repeated in the new one.

Data Migration Scope Should Be Inclusive

The data migration process should encompass all business functions that consume data, to ensure it is properly populated in the new system. In essence, data must be orchestrated with business processes; table and field mappings must be validated, and data quality must be checked against business rules. The combination of risk ignorance and a lack of skilled and experienced IT resources is why data migration defaults to a file-loading exercise for many teams. Suffice it to say, the effort associated with data migration is often underestimated.

Data Quality Should Be an Inherent Part of Data Migration

When implementing a new ERP system, legacy data must be migrated to ensure continuity in ongoing business operations. During a data migration, effort should be devoted to a data quality assessment and subsequent remediation, or the quality problems in the legacy systems will be propagated to the new one. These processes include profiling, data standardization, de-duplication, and enrichment of the data to provide the most accurate content for continued business operations (a minimal standardization sketch appears at the end of this section).

Planning for Long-Term Data Quality Is Critical to Realizing Your ROI

The long-term view looks beyond the data migration event and seeks to govern and manage data long past the go-live date. Sustainability of data quality is critical for ongoing process optimization as well as reporting integrity. It is important not only to get the data clean, but to keep it clean. The data migration event can therefore be viewed as a convenient genesis for an EDLM strategy. Maintaining the quality of data input into the system after your go-live is even more critical than at the point of initial data loading.
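To make those cleansing processes concrete, here is a minimal, illustrative sketch of data standardization and exact de-duplication in Python. The field names (MATERIAL_DESC, UOM) and the unit-of-measure mapping are hypothetical examples rather than fields from any specific legacy system, and the sketch only stands in for what would normally be configured in a dedicated data quality tool such as SAP BusinessObjects Data Services.

```python
import re

# Hypothetical mapping of legacy unit-of-measure spellings to a target standard.
UOM_STANDARD = {"LTR": "L", "LITER": "L", "LITRE": "L", "ML": "ML", "KGS": "KG", "KILOGRAM": "KG"}

def standardize(record: dict) -> dict:
    """Return a copy of the record with trimmed, case-normalized, standardized fields."""
    out = dict(record)
    # Collapse repeated whitespace and normalize case on the description field.
    out["MATERIAL_DESC"] = re.sub(r"\s+", " ", out.get("MATERIAL_DESC", "")).strip().upper()
    # Map legacy unit-of-measure codes onto the target system's allowed values.
    uom = out.get("UOM", "").strip().upper()
    out["UOM"] = UOM_STANDARD.get(uom, uom)
    return out

def deduplicate(records: list[dict]) -> list[dict]:
    """Drop exact (complete) duplicates after standardization."""
    seen, unique = set(), []
    for rec in records:
        key = tuple(sorted(rec.items()))
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

if __name__ == "__main__":
    legacy = [
        {"MATERIAL_DESC": " Hydraulic  oil ", "UOM": "Ltr"},
        {"MATERIAL_DESC": "HYDRAULIC OIL", "UOM": "LITER"},
    ]
    print(deduplicate([standardize(r) for r in legacy]))
    # one record remains: {'MATERIAL_DESC': 'HYDRAULIC OIL', 'UOM': 'L'}
```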
TOP ISSUES IDENTIFIED FOR DATA MIGRATION PROJECTS

From our experience, the top issues faced during a data migration project typically include:

- Poor quality data: sometimes the defects are known in the source systems, but new and unknown deficiencies are always uncovered after extraction.
- Missing data: it can be surprising how many mandatory fields in the source systems actually have high percentages of blanks, NULLs, or defaults.
- Mismatched data: field overuse is a classic problem. Sometimes two, three, or four different domains of data can be found in one field that was repurposed after its original use became obsolete.
- Data not available in time for go-live: operational commitments and cycle times are misaligned with the system implementation project, delaying the entire deployment.
- Data requirements not considered or communicated: business rules, data quality rules, and transformation rules are not researched or documented to the breadth or depth necessary for moving and consolidating multiple systems into one target.
- Lack of legacy system knowledge and understanding: this can also apply to the new target system. The client has insufficient staff to provide system guidance.

These issues and others are the drivers for the structured approach outlined in this methodology. By following the five phases of the Data Migration Management methodology, the systems implementer will gain these benefits:

- Reduced migration cycle time. Following a proven, repeatable process avoids dead ends, project rework, and inefficient resource utilization.
- Reduced risk of system deployment. By knowing the steps to follow and adhering to sound data quality procedures, trial data conversions, and iterative validations, the system implementer can accurately predict development periods and build a project plan with a high degree of certainty. The unknowns are drawn to the light and worked out of the system.
- Improved data quality. Not only is the data migrated in a predictable manner, it is loaded into the target system with a substantially higher degree of quality than when it was extracted from the sources. The methodology aims for zero percent defects by the completion of the Final Preparation phase.
- Enhanced replication of source data in the target system. Through multiple stages of interrogative workshops with system stakeholders, hidden details of additional data, obsolete data, or new reference data are mined from the organization and built into the migration plan. This ensures the original goal of the migration, loading all applicable source data, is achieved. The data is found, extracted, and transformed.

METHODOLOGY OVERVIEW

Utopia's Data Migration Management methodology incorporates the four dimensions of EDLM (lifecycle, people, processes, and technology) to ensure the data is provisioned as required by the business. The methodology is aligned, in part, with SAP's ASAP implementation methodology. We do this for a number of reasons, one of which is to adopt standard nomenclature and phrases that are familiar to SAP ecosystem partners and clients. The methodology works equally well for non-SAP environments, as the fundamentals and processes for moving and managing data are universal. Data is data, whether you are moving it into an ERP system or a data warehouse.
The Data Migration Management methodology is based upon proven best practices and extensive experience in data analysis, harmonization, and migration. The methodology utilizes the SAP BusinessObjects Data Services platform and our own techniques to assess and improve the quality of source data before it is loaded into the target repository. Detecting and cleansing data up front reduces the total cycle time of the data conversion, thereby reducing the total cost while enhancing the quality of the data being migrated into the target system.

Methodology Phases

There are five phases in the methodology, and they span an entire data migration project from the initial project scoping task, through trial data loads, to the final support period. Those phases are depicted in Figure 2.

FIGURE 2: Five Phases of the Data Migration Management Methodology

Figure 3 below depicts the major stages in each of the five methodology phases. Those phases and stages are discussed in detail in the next sections of this white paper.

FIGURE 3: The Major Stages within Each of the Five Methodology Phases (Project Preparation, Business Blueprint, Realization, Final Preparation, Go Live and Support; stages include Scoping, Identify & Analyze Data, Assess Source Data, Data Quality Remediation, Prepare Data Load, Trial Data Loads, Validate Results, Load Target System, and Go-Live)
GOVERNANCE AND ISSUES COMMUNICATION

As the project moves from the Business Blueprint phase through the Go Live phase, significant metadata and documentation are generated by the SAP BusinessObjects Data Services (DS) platform. These reports provide a visualization capability for the migration process. They aid in identifying potential data errors and in developing actionable outputs to improve data quality, monitor transformation trends, and enable data governance.

As legacy source records are processed through the validation transforms to prepare the output files, data quality errors can be captured and reported. These field-level errors can be rolled up along a metrics hierarchy, which provides for analysis along quality vectors such as completeness, accuracy, and formatting. The extract and transformation jobs can also be analyzed by data objects (business information domains), such as customer, vendor, and materials. These metrics can be aggregated upwards to a top-level score, which provides a multilevel understanding of what data is ready to be loaded into the system.

This beginning-to-end insight into the performance of the extract, transform, and data quality routines helps focus the migration process on problematic source feeds. By focusing process governance, and by concentrating the data governance effort on critical areas, the duration and cost of the migration effort are reduced. Fewer governance and project resources will be inspecting high-quality data feeds; their energy and time will be spent on the feeds that need attention instead.

The Data Migration Management methodology, its inherent process monitoring, and its defect-measuring operations lay the data governance foundation for when the system passes Go Live. Data governance, like data migration, is a critical EDLM practice. The EDLM framework that supports those and other practices identifies and builds capabilities that carry the entire data environment long after the initial migration project is complete.
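As an illustration of the roll-up described above, the sketch below aggregates field-level validation errors into per-object scores along a few quality vectors and then into a single top-level readiness score. The error records, object names, and equal weighting are hypothetical assumptions; in practice these figures come from the Data Services validation transform reports rather than hand-built structures.

```python
from collections import defaultdict

# Hypothetical field-level error log: (data object, quality vector, failed records, records checked)
error_log = [
    ("customer", "completeness", 120, 10_000),
    ("customer", "formatting",    45, 10_000),
    ("vendor",   "completeness",  10,  2_500),
    ("material", "accuracy",     300, 40_000),
]

def rollup(log):
    """Aggregate field-level failures into per-object, per-vector pass rates."""
    totals = defaultdict(lambda: [0, 0])          # (object, vector) -> [failed, checked]
    for obj, vector, failed, checked in log:
        totals[(obj, vector)][0] += failed
        totals[(obj, vector)][1] += checked
    scores = {}
    for (obj, vector), (failed, checked) in totals.items():
        scores.setdefault(obj, {})[vector] = 1.0 - failed / checked
    return scores

def readiness(scores):
    """Average each object's vector scores, then average objects into a top-level score."""
    object_scores = {obj: sum(v.values()) / len(v) for obj, v in scores.items()}
    return object_scores, sum(object_scores.values()) / len(object_scores)

if __name__ == "__main__":
    per_object, overall = readiness(rollup(error_log))
    for obj, score in per_object.items():
        print(f"{obj:10s} {score:6.2%}")
    print(f"{'OVERALL':10s} {overall:6.2%}")
```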
PROJECT PREPARATION PHASE

The Project Preparation phase involves researching, scoping, and planning the early tasks that must be completed before the design and implementation activities can commence. A number of documents are generated during this phase, notably the project management plan, quality management plan, and resource management plan.

Each phase in the Data Migration Management methodology starts and ends with a stage-gate. In order for the project to move into or out of a phase, specific criteria must be met. For the Project Preparation phase, the starting stage-gate occurs when the organization managing the source systems determines a replacement is needed. The exiting stage-gate occurs when all of the applicable documentation and planning has been completed to the degree that allows the Business Blueprint phase to begin. The Project Preparation phase is unique in that a pseudo stage-gate exists within its task series. Figure 4 below depicts the major activities in this phase.

FIGURE 4: Project Preparation Phase (Scoping Study; Data Health Assessment; Infrastructure & Target System Setup; Migration Framework Setup; High-Level Source to Target Mapping; Formulate Data Conversion Approach; Formulate Data Cleansing Strategy; Formulate Cut-Over Strategy; stage-gate to the Blueprint phase when ready for data extraction and ETL design)

The Infrastructure & Target System Setup task acts as a stage-gate. While usually considered its own major project, building the new target system (procuring the licenses, establishing the repository database, developing the data structure and model, and so on) can run in parallel to the migration project. Building the target system places dependencies on the migration effort; one dependency is that the target system must be at least partially built before data can be loaded into it. The Infrastructure & Target System Setup task effectively represents a link between the parallel projects and methodologies: one project creates and installs the new system infrastructure and the other gathers and loads the data. In a best-practices approach the two are inextricably connected, but each requires its own specialization.

SCOPING STUDY

Utopia's approach to data migration begins with a scoping study to assess the existing system environment and the level of effort to complete a successful project. The results of the study provide a detailed data migration strategy that includes:

- The data objects in scope for the data warehouse, ERP, or MDM system
- The requirements for how the data will be used in the target system
- The criteria to be evaluated to ensure the quality of the data is fit for its intended purpose
- How the data will be migrated
- The timelines for the migration
- An estimate of the effort required
- Initiation of a Data Health Assessment to capture data quality statistics
- Determination of what activities are necessary to prepare the data
- Data architecture and structures information: where the data is currently used, how it is used, by whom, and for what purpose across the data lifecycle
- Collection of customer artifacts on source data structures
- Collection of target system artifacts, such as ERP or MDM system data structures
- Identification of target system downstream interfaces and who uses them

DATA PROFILING: THE DATA HEALTH ASSESSMENT

Data quality has a significant impact on the performance and capabilities of the new system. Utopia assesses the data from the source systems as well as the target application. Assessments are performed using Utopia's Data Health Assessment (DHA) process. The DHA measures data quality and identifies critical parameters, including those shown in Figure 5.

FIGURE 5: The Critical Parameters of a DHA
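To illustrate the kind of statistics a DHA captures, the sketch below computes basic profiling metrics (completeness, uniqueness, value lengths, and the most frequent domain values) for each column of an extracted flat file. The CSV file name and columns are hypothetical, and the sketch is only a stand-in for the profiling performed with SAP BusinessObjects Data Services.

```python
import csv
from collections import Counter

def profile_column(values):
    """Return simple data-health statistics for a single column of extracted data."""
    non_blank = [v for v in values if v.strip()]
    lengths = [len(v) for v in non_blank] or [0]
    return {
        "records":      len(values),
        "completeness": len(non_blank) / len(values) if values else 0.0,
        "unique":       len(set(non_blank)),
        "min_len":      min(lengths),
        "max_len":      max(lengths),
        "top_values":   Counter(non_blank).most_common(3),
    }

def profile_extract(path):
    """Profile every column of a legacy extract staged as a CSV file."""
    with open(path, newline="", encoding="utf-8") as handle:
        rows = list(csv.DictReader(handle))
    columns = rows[0].keys() if rows else []
    return {col: profile_column([row[col] or "" for row in rows]) for col in columns}

if __name__ == "__main__":
    # "vendor_extract.csv" is a placeholder name for a staged legacy extract.
    for column, stats in profile_extract("vendor_extract.csv").items():
        print(column, stats)
```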
Using the results from the data discovery workshops and the DHA, migration developers prepare a data migration strategy. A migration strategy takes into account not only the existing legacy data structures and the volumes of data, it also factors in the data requirements of the system being implemented. The discovery workshops and the DHA examine the structure and content of the legacy data systems to provide input to the Business Blueprint phase, where high-level data requirements are compared with the discovery findings (metadata).

BUSINESS BLUEPRINT PHASE

The Business Blueprint phase focuses on charting the business processes surrounding the target system, interviewing the system and data stakeholders, and evaluating both the legacy system and the target system for the items reflected in Figure 6. The data migration consultants interact with business process owners, system integrators, data owners, and the legacy technical team to achieve a detailed understanding of the data subject areas (objects) addressed by the migration project. Two types of interrogative discovery workshops are held with data stakeholders: legacy data investigation, and target system transformations and upload logic.

The high-level outputs of the Business Blueprint phase are:

- Data quality rules for legacy and target data
- Detailed data mappings
- Transformation rules
- Traceability requirements
- System cut-over plan
- Staged legacy data

FIGURE 6: Business Blueprint Items of Focus

These deliverables form the backbone of the data migration documentation. The Business Blueprint phase essentially collects and documents the metadata for the project and data objects. From these artifacts and discovery workshops, a set of strategy and approach documents is created to guide the actual migration implementation performed in the Realization, Final Preparation, and Go Live phases. A second major output of this phase is the staging of extracted data from the legacy source systems. Figure 7 depicts the key activities conducted during the Business Blueprint phase.

FIGURE 7: Business Blueprint Phase (a Strategy and Requirements track for each object: Legacy Data Workshop, Create Conversion List, Create Data Definitions Document, Transformation Rules and Upload Logic Workshop; and a Data Extraction track for each object: Create Source to Target Mappings, Extract Legacy Data, Profile Legacy Data in Landing Area, Filter Obsolete Data, Define Reconciliation Guide, Define Output File or ETL Load Specs, Transfer Data to Staging Area; stage-gate to the Realization phase once all objects are mapped, extracted, and profiled)
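The detailed data mappings and transformation rules listed among the Blueprint outputs can be pictured with the minimal sketch below. It assumes SAP-style vendor master target fields and hypothetical legacy field names purely for illustration; in an actual project the mapping specification lives in the Blueprint documentation and is implemented in the Data Services jobs built from it.

```python
# Hypothetical source-to-target mapping for a vendor object.
# Each entry: target field -> (legacy field, transformation rule).
VENDOR_MAPPING = {
    "LIFNR": ("VEND_NO",   lambda v: v.strip().zfill(10)),      # pad vendor number to 10 digits
    "NAME1": ("VEND_NAME", lambda v: v.strip().upper()[:35]),   # trim, upper-case, truncate to target length
    "LAND1": ("COUNTRY",   lambda v: {"USA": "US", "GER": "DE"}.get(v.strip().upper(), v.strip().upper())),
}

def transform(legacy_record: dict, mapping: dict) -> dict:
    """Apply the source-to-target mapping to one extracted legacy record."""
    return {target: rule(legacy_record.get(source, ""))
            for target, (source, rule) in mapping.items()}

if __name__ == "__main__":
    legacy = {"VEND_NO": "4711", "VEND_NAME": "  Acme Industrial supplies ", "COUNTRY": "USA"}
    print(transform(legacy, VENDOR_MAPPING))
    # {'LIFNR': '0000004711', 'NAME1': 'ACME INDUSTRIAL SUPPLIES', 'LAND1': 'US'}
```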
The Business Blueprint phase loops through each data object or domain destined to be migrated to the target system. The first iterative process focuses on strategy and on collecting legacy system information and metadata through workshops with client stakeholders. A list of attributes to be converted to the new system, along with a corresponding data definitions document, is created for each object. Once a strategy/investigation workshop for an object is completed, the object can move to the Data Extraction track, where the source-to-target mappings are created and the data is extracted from the legacy system.

Once the data is extracted to a landing area, typically in flat file format, the critical task of data profiling is performed to verify the data is as expected. Data defects are cataloged, patterns are identified, and the true field formats are discovered. The data profile is a follow-on to the original DHA conducted during the early Project Preparation phase. The results of the data profile are key to constructing accurate transformation rules and filter logic to eliminate obsolete data. As a foundation for those transforms, two separate data quality reports are generated:

PROFILING REPORT: A report for each object is created using the results of analysis and output from SAP BusinessObjects Data Services. The report metrics include, for the extracted legacy data, the percentage and number of records regarding uniqueness, completeness, cardinality, minimum and maximum lengths, domain values, and so on. This report produces an initial assessment of the legacy data against the identified target structure before the transformation work starts.

CLEANSING CERTIFICATE: Reflects all the records reported as failing to meet the quality criteria established in the profiling report. It is considered a detailed data analysis for each of the target objects. Cleansing Certificates are generated in multiple iterations and are submitted to the project stakeholders for review. After reviewing the reports, it is the responsibility of the legacy data owners to remediate the defective records in the legacy systems or provide business rules to rectify the problems so the data meets the target system requirements. Once the defective records are fixed, the data is extracted again and profiled for a second iteration of the Cleansing Certificate. This exercise continues until zero defective records remain for loading into the target system.

An area of potential deviation in a migration project is the Define Output File or ETL Load Specs task. Some target systems, such as SAP ECC, have their own specialized load programs, while others, like an Oracle data warehouse, are loaded directly by a commercial extract, transform, and load (ETL) solution. For targets such as SAP ECC, the defined output file means the creation of Legacy System Migration Workbench (LSMW) or intermediate document (IDoc) files that are recognized and consumed by the ECC loader. For a generic data warehouse, this same task specifies the ETL load parameters for directly loading the data via a commercial ETL solution.

The final task in the Data Extraction track is the definition of the reconciliation guide. The data reconciliation plan and process is an important check-and-balance step when migrating data: it verifies that the data is moved correctly. This is where quantities, totals, and other numeric sums are confirmed to match between the source system and the target.
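A minimal sketch of such a reconciliation check is shown below. It compares record counts and per-account totals between a source extract and the staged output, assuming hypothetical record layouts; an actual reconciliation guide would enumerate the specific quantities and totals agreed for each object.

```python
from collections import defaultdict
from decimal import Decimal

def totals_by_account(records):
    """Sum amounts per account and count records (Decimal avoids floating-point drift)."""
    sums, count = defaultdict(Decimal), 0
    for rec in records:
        sums[rec["account"]] += Decimal(rec["amount"])
        count += 1
    return count, dict(sums)

def reconcile(source_records, staged_records):
    """Report count and per-account total mismatches between source and staged output."""
    src_count, src_sums = totals_by_account(source_records)
    tgt_count, tgt_sums = totals_by_account(staged_records)
    issues = []
    if src_count != tgt_count:
        issues.append(f"record count mismatch: source={src_count} staged={tgt_count}")
    for account in sorted(set(src_sums) | set(tgt_sums)):
        if src_sums.get(account, Decimal(0)) != tgt_sums.get(account, Decimal(0)):
            issues.append(f"total mismatch for account {account}: "
                          f"source={src_sums.get(account, 0)} staged={tgt_sums.get(account, 0)}")
    return issues

if __name__ == "__main__":
    source = [{"account": "100200", "amount": "52000.00"}, {"account": "100300", "amount": "15.50"}]
    staged = [{"account": "100200", "amount": "52000.00"}, {"account": "100300", "amount": "15.05"}]
    for line in reconcile(source, staged) or ["reconciliation passed"]:
        print(line)
```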
A sophisticated testing process is also planned, in which a suite of test records is built to exercise every business rule and confirm that a check exists to cause a failure. If defect records can be created to cause the failure of every business rule, the test suite is certified to be 100% complete. This level of testing is particularly important for highly regulated industries such as banking and finance, which need stringent documentation on their data test sets.
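The sketch below illustrates that idea: for each business rule there is a deliberately defective test record, and the suite is certified only if every rule flags its defect record. The rules and records are hypothetical stand-ins for the project's documented business rules.

```python
# Hypothetical business rules: each returns True when the record violates the rule.
BUSINESS_RULES = {
    "mandatory_material_desc": lambda r: not r.get("MATERIAL_DESC", "").strip(),
    "uom_in_allowed_set":      lambda r: r.get("UOM") not in {"L", "ML", "KG"},
    "net_weight_positive":     lambda r: float(r.get("NET_WEIGHT", 0)) <= 0,
}

# One deliberately defective record per rule; each must trip that rule's check.
DEFECT_RECORDS = {
    "mandatory_material_desc": {"MATERIAL_DESC": "   ", "UOM": "L",  "NET_WEIGHT": "1.0"},
    "uom_in_allowed_set":      {"MATERIAL_DESC": "Oil", "UOM": "M3", "NET_WEIGHT": "1.0"},
    "net_weight_positive":     {"MATERIAL_DESC": "Oil", "UOM": "L",  "NET_WEIGHT": "0"},
}

def certify(rules, defect_records):
    """Return the rules whose defect record failed to trigger them (empty means 100% complete)."""
    return [name for name, rule in rules.items() if not rule(defect_records[name])]

if __name__ == "__main__":
    missed = certify(BUSINESS_RULES, DEFECT_RECORDS)
    print("test suite 100% complete" if not missed else f"rules not exercised: {missed}")
```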
REALIZATION PHASE

The Realization phase is where data migration developers build ETL jobs per the requirements and specifications created in the Business Blueprint phase. Business rules, data quality rules, and field validations are built into the ETL jobs. Configuration data, reference tables, and system data are pulled from the target or other master data systems for incorporation into the transformations and validations. ETL unit and integration testing is carried out by the migration team to ensure that the specified integration routines perform and interact as designed.

The entire Realization phase is depicted in Figure 8, which shows an ETL job, potentially multiple jobs, created for each of the target system objects. Within each job, data quality and transformation rules are collected and coded. Unit tests are defined and the entire job or suite of jobs is run. The sole purpose of some of these jobs is to cleanse and transform the extracted data, while other jobs run as load routines in the trial conversion stages of the Final Preparation phase. The purpose of running the data quality and transformation jobs is to achieve a zero defect rate on the data by the time it leaves the Realization phase. This means that the ETL job loop may be executed six, seven, or eight times in order to drive the error rate down to zero. The class of defects corrected in this loop is at the field level; additional defects relating to cross-object testing will be uncovered during the trial data conversion stages.

FIGURE 8: Realization Phase (for each object: Collect Data Quality Criteria and Rules; Develop DQ Transforms; Design and Code ETL Routines; Define Unit Tests; Test ET and DQ Routines; Analyze Test Results and rerun cleansing and transformation as needed; Records Requiring Manual Intervention Delivered to SMEs; ETL Routine Ready for Trial; Load Data into Staging Area; stage-gate to the Final Preparation phase)

The stage-gate for leaving the Realization phase is having implemented all transformation and data quality fixes that can be made programmatically. Records and fields that require manual intervention are delivered to system subject matter experts (SMEs) for remediation; a goal is to deliver the records requiring manual fixes to the SMEs as early in the process as possible. Once the ETL jobs for all objects have been created, the jobs destined to build the output file or load set are tested on the development version of the target system. The testing is conducted through a series of pre-planned trial data conversions (TDCs) in the Final Preparation phase.

A typical ETL job developed in the Realization phase will have some, if not all, of the following features (a minimal sketch of the field-level validation checks appears at the end of this phase discussion):

1. A configuration file, used to declare global variables and set rules applied throughout the job. It controls job flow and improves developer productivity by reducing duplication of code in a job.
2. Movement of extract data to a staging server.
3. Mappings of the source file to the target structure.
4. Data validation transforms to convert legacy data to the target data format. These validation routines include:
   i) Mandatory Checks: confirm critical data is not blank, and that mandatory fields are populated per the specific business rules governing each field.
   ii) Data Length Check: checks the source data length against the target structure.
   iii) Data Format Check: checks the format of the data, including numeric checks, date format checks, pattern checks, and so on.
   iv) Reference Table Check: input record fields are compared to specified reference tables, and defective records are logged as deficient.
   v) Duplicate Check: identifies duplicates at the record level as well as the column level. There are two levels of duplicates: complete and potential. Complete duplicates are records that are exactly identical. Potential duplicates are records that are similar, based on crossing a matching threshold specified per the business requirements. These are implemented using the Match Transform in SAP BusinessObjects Data Services.
5. Removal of records from the source that fail the validation rules. These failed records are provided to the data and system stakeholders for review and remediation before the next load iteration.
6. Application of all business and enrichment rules to the source data.
7. Generation of the output files per the source-to-target mapping specification.
8. Exportation of the output files from the staging server to the target system through DS or a specialized application load facility (LSMW or IDocs in the case of SAP ECC).

DATA CLEANSING: ENSURING BUSINESS READY DATA

Achieving a sufficient level of quality in your business ready data is the most crucial part of a successful migration project and is a key aspect of the Realization phase. Bypassing data cleansing and directly loading the master data objects, whatever their quality, perpetuates legacy system issues and shifts the quality problems from pre-production to post-production. This raises the cost to correct the data and introduces new data issues into downstream applications. In order to pass from the Realization phase to the Final Preparation phase, the master data must adhere to the following:

- Each master data record is itself error free
- Master data values are in the expected range of allowed entries (e.g., units of measure can be liter and ml, but not cubic meter)
- Inconsistencies in the data are cleansed (e.g., data duplicates are removed)
- Master data attributes needed to execute a business process are accurate within the context of the process

Efficient and effective data cleansing can only be implemented when close cooperation exists among the business units responsible for the maintenance of the master data. As part of the EDLM framework, data standards are applied so that consistent transformation can occur. This iterative process is where IT and the business become partners in solving systemic legacy data quality, normalization, and duplication issues. In the Business Blueprint phase, workshops are conducted to collect and validate existing business rules and to establish additional business rules to ensure long-term data quality in the target system.
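As referenced above, here is a minimal sketch of the field-level validation checks an ETL job applies before records are allowed into the load set. The field names, lengths, formats, and reference values are SAP-style material master fields used purely for illustration, and the fuzzy duplicate matching performed by the Data Services Match Transform is omitted here.

```python
import re

# Hypothetical target structure rules for a material object.
MANDATORY = ["MATNR", "MAKTX", "MEINS"]
MAX_LENGTHS = {"MATNR": 18, "MAKTX": 40, "MEINS": 3}
FORMATS = {"MATNR": re.compile(r"^[A-Z0-9\-]+$"), "ERSDA": re.compile(r"^\d{8}$")}  # e.g. YYYYMMDD
REFERENCE = {"MEINS": {"L", "ML", "KG", "EA"}}

def validate(record: dict) -> list[str]:
    """Return the list of field-level defects found in a single record."""
    defects = []
    for field in MANDATORY:                                   # mandatory check
        if not record.get(field, "").strip():
            defects.append(f"{field}: mandatory field is blank")
    for field, max_len in MAX_LENGTHS.items():                # data length check
        if len(record.get(field, "")) > max_len:
            defects.append(f"{field}: exceeds target length {max_len}")
    for field, pattern in FORMATS.items():                    # data format check
        value = record.get(field, "")
        if value and not pattern.match(value):
            defects.append(f"{field}: format check failed")
    for field, allowed in REFERENCE.items():                  # reference table check
        if record.get(field) and record[field] not in allowed:
            defects.append(f"{field}: value not in reference table")
    return defects

def split_load_set(records: list[dict]):
    """Separate clean records from failed ones; failed records go back to the stakeholders."""
    clean, failed = [], []
    for rec in records:
        (failed if validate(rec) else clean).append(rec)
    return clean, failed

if __name__ == "__main__":
    extract = [
        {"MATNR": "HYD-OIL-01", "MAKTX": "Hydraulic oil", "MEINS": "L", "ERSDA": "20090131"},
        {"MATNR": "hyd oil 02", "MAKTX": "", "MEINS": "M3", "ERSDA": "31-01-09"},
    ]
    clean, failed = split_load_set(extract)
    print(len(clean), "ready for load;", len(failed), "returned for remediation")
```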
FINAL PREPARATION PHASE

The goal of the Final Preparation phase is to achieve business ready data for production loading into the target system. The data will be in the defined output file format, ready for direct loading via DS, or as LSMW or IDoc files. The methodology accomplishes data readiness by executing the ETL jobs created in the Realization phase in a series of three TDCs. The TDCs are performed on all three types of data (master, transactional, and historical) and are loaded into dev/test or pre-production versions of the target system. Upon successful completion of the TDCs, the data migration jobs are approved and the project enters the Go Live phase.
FIGURE 9: Final Preparation Phase (TDC1: preparation, execution run practicing with master data, load validation, ETL adjustment; TDC2: execution run with tested master data, execution run practicing with transactional data, load validation, ETL adjustment; Go-Live Simulation, TDC3: execution runs with tested master and transactional data plus a practice run with historical data, load validation, ETL adjustment; stage-gate to the Go Live phase)

As Figure 9 portrays, the focus of the first TDC (TDC1) is to test-execute the ETL jobs (Master Data Uploads) used for loading just the master data. The second TDC (TDC2) builds upon the results of the first by executing the tested Master Data Upload routines and then running the Transactional Data Upload routines. Testing and load validation is performed between the Master and Transactional Uploads to verify the transactional data did not disrupt or corrupt the master data. When the two upload routines have been successfully load tested together, the Final Preparation phase moves into the Go-Live Simulation (TDC3), where the tested Master and Transactional Uploads are run in addition to the Historical Data Uploads. Besides running and testing all three uploads together against a pre-production copy of the target system, the Go Live (cut-over) strategy is also tested. The goal of this testing is to prove the cut-over can be performed in the time frame allotted and that all steps are documented and agreed upon.

As part of each Load Validation, ETL jobs developed for upload reconciliation are executed. These jobs consist of reconciliation rules for a particular target data object and its logical relationships to other objects. Apart from the intra-object reconciliation, basic reconciliation tests are performed, such as:

- Verifying that the number of actual records in the output is equal to the expected number of records
- Verifying that calculations generated from the legacy system match those generated from the target system; for example, if the total in three financial accounts is $52,000 in the source files, the same value should be derived from the output files staged and ready for loading

A successful Final Preparation phase reduces the risk of a delayed target system startup and seeks zero interruptions in business operations.

GO LIVE & SUPPORT

During the Go Live phase the TDC-tested ETL jobs are executed on the production target and the cut-over process commences. First the master data is loaded and validated, followed by the transactional data. As can be seen in Figure 10, the historical data is then loaded. These three loads were tested and effectively rehearsed in the Final Preparation phase through TDC1, TDC2, and TDC3, with the goal of making the Go Live phase a non-event with no surprises. Following the production loads, the reconciliation tests first developed in the Business Blueprint phase are run and their results analyzed. Reconciliation tests check, amongst other things, that totals, calculations, and other business operations conducted in the legacy systems produce the same results when run in the new production system.
The target system stakeholders then sign off on the results once any post-test adjustments are made. Depending on the transactional volumes in the legacy systems and any latency in the cut-over period, there may be a requirement for delta uploads. These uploads are accomplished as a second Transactional Data Upload.

FIGURE 10: Go Live (Cut-Over) Phase (Cut-Over Preparation; Master Data Upload; Transactional Data Upload; Historical Data Upload; Run Reconciliation Tests; Adjustment and Closure; Post Go-Live Support; Migration Complete)

POST GO LIVE SUPPORT

In the post go-live support period, training, editing of operational guides, and ETL-code consulting are provided. Planning toward the next migration project is also discussed. This is important because, as a risk mitigation tactic for large system deployments, only a subset of the total number of data objects may have been migrated. Indeed, the future of further data domain roll-outs may depend on the success of the first migration/roll-out. In these subsequent migration projects, substantial benefit can be gained by leveraging the processes established by the Data Migration Management methodology during the first Blueprinting, Realization, and other phases. While much of the source extract code and source mappings may not be reusable, knowledge of the target system, business rules, data validation rules, environment metadata, and so on will be invaluable. The existing unit test procedures, data profiling processes, SAP BusinessObjects Data Services infrastructure, and knowledge of how to conduct Final Preparation TDCs will substantially accelerate the migration of subsequent objects and reduce implementation risk.

What is said for migrating subsequent objects holds true for future mergers and acquisitions (M&A) projects, but on a larger scale. M&A is a leading driver for many large system migrations. In one M&A project this author worked on, the acquired firm had 115 different systems and applications to be replaced, hosted, or migrated to the buyer's systems. Moreover, an enterprise that has chosen to grow via acquisitions is likely to conduct multiple M&A operations in a year. This level of data-environment churn places extreme stress on the host organization's data integration capabilities. That stress often drives the creation of a data integration/migration competency center, and those centers use the Data Migration Management methodology as the foundation for their operations.

DATA HEALTH ASSESSMENT POST GO LIVE

Sustaining data integrity throughout the lifecycle of both the data and the enterprise system is a key factor in maintaining and improving a firm's cost structure. If the data degrades, suboptimal decisions are made, such as erroneous inventory restocking causing undue expenses. The data profiling tests and infrastructure established in the Business Blueprint phase can be repurposed to monitor the health of the target system. This is of particular importance because data inflows to the ERP, CRM, or other targets are constant and pose a continual challenge of defect leakage. Even a relatively low percentage (five percent or less) of defective records in a non-monitored data stream can cause significant operational headaches for the managers of the target system.
Identifying, isolating, and remediating those defects once they are in the system is two to three times more expensive than intercepting them at the point of input. The processes employed by the DHA can be implemented on the input data streams, leveraging the business rules, validation checks, and metadata captured during the Blueprint phase. In this way, the investment in the data migration framework pays dividends well beyond the completion of the original project.
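A minimal sketch of that reuse is shown below: the same validation rules built for the migration are applied to each incoming interface batch, and an alert is raised when the defect rate crosses a threshold. The threshold, rule set, and batch structure are hypothetical assumptions; a production monitor would feed these results into the governance reporting described earlier.

```python
# Hypothetical reuse of the migration-era validation rules as an ongoing input-stream monitor.
DEFECT_RATE_THRESHOLD = 0.05   # alert when more than 5% of a batch fails validation

VALIDATION_RULES = [
    ("mandatory customer name", lambda r: bool(r.get("NAME", "").strip())),
    ("country in reference set", lambda r: r.get("COUNTRY") in {"US", "DE", "SG", "AE"}),
]

def monitor_batch(batch: list[dict]) -> dict:
    """Validate an incoming batch with the migration rules and summarize its health."""
    failed = []
    for rec in batch:
        broken = [name for name, rule in VALIDATION_RULES if not rule(rec)]
        if broken:
            failed.append((rec, broken))
    rate = len(failed) / len(batch) if batch else 0.0
    return {"records": len(batch), "defects": len(failed), "defect_rate": rate,
            "alert": rate > DEFECT_RATE_THRESHOLD, "details": failed}

if __name__ == "__main__":
    incoming = [
        {"NAME": "Acme GmbH", "COUNTRY": "DE"},
        {"NAME": "",          "COUNTRY": "US"},   # defect: blank mandatory field
        {"NAME": "Orbit Pte", "COUNTRY": "SG"},
    ]
    summary = monitor_batch(incoming)
    print(f"defect rate {summary['defect_rate']:.1%}; alert={summary['alert']}")
```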
SUMMARY

Data migration is a key event for your enterprise. By recognizing and preparing for the critical role that data plays during a large system roll-out, significant risk and costly overruns can be mitigated or eliminated. Additionally, the sustainable management of data via an EDLM strategy will not only support your data migration event, but also yield significant process improvement and financial benefits over the long term. Embracing the five phases of the Data Migration Management methodology is simple. Deploying the methodology with rigor demands an organizational commitment to operational excellence, and with that commitment the methodology delivers a new system with quality data.

UTOPIA CORPORATE PROFILE

Utopia, Inc. is a global services and strategy consulting firm focused exclusively on Enterprise Data Lifecycle Management (EDLM). EDLM is an umbrella strategy that combines business processes, people, and applied technologies to manage and improve the lifecycle of data across an enterprise, from creation through archiving. Net outcomes of EDLM are business process optimization, hard dollar savings, and reporting integrity.

Utopia's offerings range from enterprise data strategy and systems integration to data migration, data quality, and data governance services. We help our clients reduce overall costs and improve efficiencies by enabling them to manage and sustain structured and unstructured data as a key asset. We serve customers in a variety of industries, including oil & gas, utilities, process and discrete manufacturing, consumer packaged goods, transportation, engineering and construction, distribution, telecom, healthcare, and financial services, where large volumes of data often exist in disparate systems, multiple locations, and inconsistent formats. We are an approved SAP, SAP BusinessObjects, SAP Consulting, and Open Text partner with satisfied customers worldwide. The company is headquartered in the Chicagoland area with offices in Dubai, Singapore, and Bangalore.

White paper authored by: Frank Dravis, Solutions Consultant, Utopia, Inc.

Headquarters: 405 Washington Boulevard, Suite 203, Mundelein, Illinois, USA
Phone / Web / [email protected]

2009 Utopia, Inc. All Rights Reserved. 07/
Paper DM10 SAS & Clinical Data Repository Karthikeyan Chidambaram
Paper DM10 SAS & Clinical Data Repository Karthikeyan Chidambaram Cognizant Technology Solutions, Newbury Park, CA Clinical Data Repository (CDR) Drug development lifecycle consumes a lot of time, money
MANAGING USER DATA IN A DIGITAL WORLD
MANAGING USER DATA IN A DIGITAL WORLD AIRLINE INDUSTRY CHALLENGES AND SOLUTIONS WHITE PAPER OVERVIEW AND DRIVERS In today's digital economy, enterprises are exploring ways to differentiate themselves from
The following is intended to outline our general product direction. It is intended for information purposes only, and may not be incorporated into
The following is intended to outline our general product direction. It is intended for information purposes only, and may not be incorporated into any contract. It is not a commitment to deliver any material,
Real World Strategies for Migrating and Decommissioning Legacy Applications
Real World Strategies for Migrating and Decommissioning Legacy Applications Final Draft 2014 Sponsored by: Copyright 2014 Contoural, Inc. Introduction Historically, companies have invested millions of
Data Virtualization and ETL. Denodo Technologies Architecture Brief
Data Virtualization and ETL Denodo Technologies Architecture Brief Contents Data Virtualization and ETL... 3 Summary... 3 Data Virtualization... 7 What is Data Virtualization good for?... 8 Applications
Capitalizing on Change
White paper Capitalizing on Change Capitalizing on Change One Network Enterprises www.onenetwork.com White paper Capitalizing on Change These big bang implementations take months and years to complete,
CIC Audit Review: Experian Data Quality Enterprise Integrations. Guidance for maximising your investment in enterprise applications
CIC Audit Review: Experian Data Quality Enterprise Integrations Guidance for maximising your investment in enterprise applications February 2014 Table of contents 1. Challenge Overview 03 1.1 Experian
Bid/Proposal No. P15/9888 Business Intelligence Management
Answers to Vendor Questions Questions are in black, Answers are in red 1. Who is the Executive Sponsor(s) for this project Information not available at this time 2. Will PCC provide the selected consultant
Data Conversion Best Practices
WHITE PAPER Data Conversion Best Practices Thomas Maher and Laura Bloemer Senior Healthcare Consultants Hayes Management WHITE PAPER: Data Conversion Best Practices Background Many healthcare organizations
Informatica Best Practice Guide for Salesforce Wave Integration: Building a Global View of Top Customers
Informatica Best Practice Guide for Salesforce Wave Integration: Building a Global View of Top Customers Company Background Many companies are investing in top customer programs to accelerate revenue and
Custom Development Methodology Appendix
1 Overview Custom Development Methodology Appendix Blackboard s custom software development methodology incorporates standard software development lifecycles in a way that allows for rapid development
Improving Service Asset and Configuration Management with CA Process Maps
TECHNOLOGY BRIEF: SERVICE ASSET AND CONFIGURATION MANAGEMENT MAPS Improving Service Asset and Configuration with CA Process Maps Peter Doherty CA TECHNICAL SALES Table of Contents Executive Summary SECTION
10426: Large Scale Project Accounting Data Migration in E-Business Suite
10426: Large Scale Project Accounting Data Migration in E-Business Suite Objective of this Paper Large engineering, procurement and construction firms leveraging Oracle Project Accounting cannot withstand
META DATA QUALITY CONTROL ARCHITECTURE IN DATA WAREHOUSING
META DATA QUALITY CONTROL ARCHITECTURE IN DATA WAREHOUSING Ramesh Babu Palepu 1, Dr K V Sambasiva Rao 2 Dept of IT, Amrita Sai Institute of Science & Technology 1 MVR College of Engineering 2 [email protected]
HANDLING MASTER DATA OBJECTS DURING AN ERP DATA MIGRATION
HANDLING MASTER DATA OBJECTS DURING AN ERP DATA MIGRATION Table Of Contents 1. ERP initiatives, the importance of data migration & the emergence of Master Data Management (MDM)...3 2. 3. 4. 5. During Data
Trends In Data Quality And Business Process Alignment
A Custom Technology Adoption Profile Commissioned by Trillium Software November, 2011 Introduction Enterprise organizations indicate that they place significant importance on data quality and make a strong
How To Improve Your Business
IT Risk Management Life Cycle and enabling it with GRC Technology 21 March 2013 Overview IT Risk management lifecycle What does technology enablement mean? Industry perspective Business drivers Trends
