Data Migration Management White Paper
METHODOLOGY: SUSTAINING DATA QUALITY AFTER THE GO-LIVE AND BEYOND




Introduction

Data Migration is a core practice under the NebuLogic Data Management Services (NDMS) Framework. NDMS is a holistic data architecture strategy that aligns business and data processes using applied technologies. The NDMS Framework manages and extends the lifecycle of data across an enterprise, from Source System Exploration through Production. The stages of NDMS are shown in Figure 1.

Figure 1: The NDMS stages

In any CRM implementation, data migration is a critical task. If the data are not migrated correctly, business processes will experience setbacks, and if data quality is poor, users may resist adopting the new CRM system. Because data migrations touch critical business processes, the migration must be executed well and on time.

NDMS Stages

Source System Exploration
The first phase of the data migration is to identify the data file formats. The most practical approach is to group related data, for example the name, address, and description fields, so the data can be organized in a synchronized manner. Although the source system may contain numerous fields, some may be duplicates or not applicable to the target system. In this stage it is critical to determine the relevance of the data, classifying it as required data, redundant data, or junk data (data of no use for the migration). Drawing on multiple data sources adds another element of data validation and raises confidence in the data gathered.
Output: Required data are identified and segregated into categories, which allows the work to proceed as manageable and potentially parallel tasks.

Data Assessment
In this phase, we assess the quality of the source data. Where the data are inconsistent, incorrect, or duplicated, there is limited value in migrating them to the target system. To assess the data, initial data profiling is highly recommended. Data profiling is the process of systematically scanning and analyzing the contents of all the fields. It is an integral part of evaluating the conformity of the data and safeguarding compliance with the requirements of the target system. The profiling functions examine both the actual record values and their metadata. Including data profiling early in the migration process reduces the risk of project overruns, delays, and failures.

Data profiling can:
- Immediately identify whether the data will fit business purposes
- Support accurate planning of the integration strategy by identifying data inconsistencies early
- Enable successful integration of the source data using automated data quality processes

Output: A thorough understanding of the data quality in the source systems, along with the identification of data issues and a list of defined rules that can be used to correct them.
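The profiling step described above can be sketched in a few lines of Python. This is a minimal illustration, assuming tabular source data loaded as a list of row dictionaries; the function and field names are illustrative, not part of the NDMS tooling:

```python
from collections import Counter

def profile(records):
    """Profile a list of row dicts: per-field fill rate, distinct values, types."""
    stats = {}
    total = len(records)
    for row in records:
        for field, value in row.items():
            s = stats.setdefault(field, {"nulls": 0, "values": Counter(), "types": Counter()})
            if value in (None, ""):
                s["nulls"] += 1            # empty or missing value
            else:
                s["values"][value] += 1    # track distinct values and repeats
                s["types"][type(value).__name__] += 1
    report = {}
    for field, s in stats.items():
        report[field] = {
            "fill_rate": 1 - s["nulls"] / total,
            "distinct": len(s["values"]),
            "types": dict(s["types"]),
            "duplicated": any(c > 1 for c in s["values"].values()),
        }
    return report

rows = [
    {"name": "Ann", "phone": "555-0100"},
    {"name": "Bob", "phone": ""},
    {"name": "Ann", "phone": "555-0100"},
]
print(profile(rows)["phone"]["fill_rate"])  # 2 of 3 rows populated
```

A report like this makes it easy to spot fields that will violate target-system constraints (low fill rates, unexpected types, duplicate key values) before any migration code is written.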

Migration Design and Build

Design: In this phase, the main tasks are defining the technical architecture and the design of the migration processes. It is critical to define the testing processes and how they will transition to the production system. The migration pattern is also identified: whether there will be a parallel run, a zero-downtime migration, or a straightforward cutover in which the migration completes and the old system is simply decommissioned.
Output: The next steps in the process are laid out. Timelines for the target system, technical requirements, and other related concerns are documented.

Build: This is the most crucial phase of the entire process. The migration is executed only once, and there will be limited reuse of the code. An effective and transparent migration starts by migrating a subset of the data and testing one category of data at a time; on larger projects, the categories can be tested in parallel. Testing the migration solution is usually an iterative process. For example, we start by checking the components individually on small subsets to ensure the mappings, transformations, and data cleansing routines are working. We then increase the data volumes and eventually link all of the components together into a single migration job.
Output: A fully tested data migration process that is scalable and reliable and can deliver the migration within the planned time.

Execution

After exhaustive testing, the execution phase begins. In most cases, the source systems are shut down while the migration executes; to mitigate disruption, we prefer to conduct this phase over a weekend or a public holiday. Where the source applications must remain up and running, a zero-downtime migration approach is used instead. With zero-downtime migration technology, data from paused or running source systems can be migrated to a parallel server with no interruption.

This technology has the following advantages:
- The migration time is greatly reduced; the migration eliminates the service outage or interruption for end users.
- The process of migrating the data from the source systems to a parallel server is transparent, meaning no modifications to system characteristics or operational procedures need to be performed on the source or destination servers.
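The subset-first, iterative testing approach described in the Build phase can be sketched as follows. This is a minimal Python illustration under assumed data shapes; `transform`, `migrate`, and `validate` are hypothetical names, and a real build would reconcile against the actual target system rather than an in-memory list:

```python
def transform(row):
    """Illustrative mapping/cleansing step: normalize names, strip phone formatting."""
    return {"full_name": row["name"].strip().title(),
            "phone": row["phone"].replace("-", "")}

def migrate(source_rows, target):
    """Load transformed rows into the target store; return the number loaded."""
    loaded = 0
    for row in source_rows:
        target.append(transform(row))
        loaded += 1
    return loaded

def validate(source_rows, target):
    """Minimal reconciliation: row counts match and no target field came out empty."""
    assert len(target) == len(source_rows), "row count mismatch"
    assert all(v for row in target for v in row.values()), "empty field after transform"

source = [{"name": " ann lee ", "phone": "555-0100"},
          {"name": "bob may", "phone": "555-0101"}]

# Iterative ramp-up: prove the mappings on a small subset, then run the full set.
for size in (1, len(source)):
    target = []
    migrate(source[:size], target)
    validate(source[:size], target)
print(target[0]["full_name"])  # prints Ann Lee
```

The same pattern scales up: each data category gets its own transform and validation rules, the subsets grow with each iteration, and the per-category jobs are finally chained into the single migration job run at cutover.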

About NebuLogic Technologies

NebuLogic Technologies, LLC is a leading system implementation and integration information technology company that specializes in delivering comprehensive service and sales automation solutions. NebuLogic provides solutions and services using SaaS/cloud-based as well as enterprise-class applications. Our definition-to-deployment services include, but are not limited to:

1) Conducting requirements discovery and analysis (RFI and RFP review and response);
2) Conducting master requirements workshops;
3) Developing Business Requirements Documents (BRD) and Technical Requirements Documents (TRD);
4) Building prototypes, mockups, and pilot solutions;
5) Configuring core Customer Relationship Management (CRM) applications;
6) Enhancing and/or developing integration components;
7) Developing use cases based on client requirements;
8) Implementing data and application security requirements;
9) Conducting Quality Assurance and User Acceptance Testing (QA and UAT);
10) Developing user and administrator training guides and related documents;
11) Delivering hands-on user and administrator training;
12) Analyzing risk factors and implementing risk mitigation processes;
13) Conducting system stress testing and developing pre-production checklists;
14) Enabling successful go-live/production rollouts;
15) Providing post-production support and maintenance services;
16) Providing hosted services, and more.

To find out more about NebuLogic's services and products, please reach us at: www.nebulogic.com, 1-(972)-335-0699, contact@nebulogic.com