Data Migration Service: An Overview
Metalogic Systems Pvt Ltd
J 1/1, Block EP & GP, Sector V, Salt Lake Electronic Complex, Calcutta
Phones: to 8994 Fax:
Metalogic Systems: Data Migration Services

1. Introduction

Data migration is an extremely important but often neglected or wrongly estimated phase in any software implementation or migration activity. On the surface it sounds rather easy to move the data from the source platform to the target platform, but in real life this can easily become an extremely daunting task. The objective of this document is to give a general idea of the activities related to migrating or transforming data, and a technical overview of the processes within the data migration service provided by MLS.

2. Scope of Data Migration

One may need to transform or migrate data from one application to another for various reasons, including:

- Technical obsolescence of the old operating environment (hardware, o/s, application s/w) and a compulsion to migrate the application, along with the data, to newer platforms
- Implementation of new custom-built applications or packages requiring enterprise data from the legacy systems
- A requirement to provide web-enabled services that must have access to enterprise data online, or updated up to a pre-determined interval
- The need to provide a reliable and efficient electronic mechanism to make archived data available

The actual scope of data migration needs to be determined after analyzing the various requirements of the customer. We need to answer some basic questions before finalizing the scope. Some of these questions are given below, but the list is not exhaustive:

a) Is it a 1:1 data migration (i.e., the attributes of the entities remain mostly unchanged between the source and target platforms), or is there a need for data transformation (i.e., the source and target data models differ by design)? For application migration, the functionalities offered by the original application generally remain the same, and it is only the source and target environments (i.e., hardware, o/s, database products) that change.
But in cases where the migrated data is going to be used by a new custom application or a package, there is a substantial change in the entities and relationships between the source and target databases. Naturally, in the latter case the effort is much greater, and one needs to be thorough about the business rules used to transform the data.

Metalogic Systems Confidential Page 2 of 10
b) Does the original application have to continue, or will it be retired once the target application goes into production? In the latter case the migration effort is, in all likelihood, a one-time exercise, whereas in the former case we may need to put in place a process by which the two applications can exchange data in both directions at a regular interval. Often, due to modular replacement of legacy systems with new or migrated systems, exchange of data between the two environments becomes a critical requirement.

c) In some cases the original application has already been retired and the new application is operational with current data, but there is still a requirement to transfer the older/archived data. The source environment itself may no longer be available, and in such cases one will need to devise special handling for correct interpretation of this kind of data (e.g., computational/embedded sign fields, data residing in machines with 6/9 bit architecture, etc.).

Apart from the issues outlined above, we need to consider various operational aspects of the specific application and its environment, as listed below. Some of these may not apply, depending on the responses to the above questions.

d) Which data is static and which is transactional in nature, and what are the respective volumes? How frequently do the various source database entities undergo changes?

e) How much disk space is available in the source environment for downloading the data? If the download process needs to be split into several phases, how do we manage the changes in the transactional data that happen during these phases?

f) Can the source application be shut down during the data download and upload phases? What is the estimated process run-time, and is it lower than the maximum time allowed for such a shutdown?
Answers to the above questions will naturally lead to the decision on whether the data needs to be downloaded and uploaded incrementally. The above analysis will also determine the periodicity of the data exchanges. It is apparent that an incremental download and upload process is going to be far more complicated than developing a set of simpler extraction and loading programs that run as a single one-time batch job.

Lastly, if the source and target applications have to run side by side and are dependent on each other, the actual requirements of any data exchange process between them must be evaluated carefully. We need to determine the mode of data transfer in such cases: should the data travel at a pre-determined interval by batch (e.g., nightly transfer by ftp or similar means), or do we need inter-process communication services between the two applications (through rpc or similar mechanisms)?
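The incremental alternative discussed above can be sketched briefly. The following is a minimal, hypothetical illustration of a watermark-based incremental download, assuming the source tables carry a last-modified timestamp column; the table, column names, and sample data are invented for the example and do not represent the actual MLS-generated programs:

```python
# Minimal sketch of a watermark-based incremental download. Assumption:
# each source table has an updated_at column recording its last change.
import sqlite3

def extract_increment(conn, table, watermark):
    """Return rows changed since the last run, plus the new watermark."""
    cur = conn.execute(
        f"SELECT id, payload, updated_at FROM {table} "
        f"WHERE updated_at > ? ORDER BY updated_at", (watermark,))
    rows = cur.fetchall()
    new_watermark = rows[-1][2] if rows else watermark
    return rows, new_watermark

# Demonstration with an in-memory database standing in for the source.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER, payload TEXT, updated_at TEXT)")
conn.executemany("INSERT INTO accounts VALUES (?, ?, ?)",
                 [(1, "a", "2024-01-01"), (2, "b", "2024-01-05")])

# Only rows updated after the stored watermark are pulled this cycle.
rows, wm = extract_increment(conn, "accounts", "2024-01-02")
```

Persisting the watermark between runs is what allows the two applications to exchange only the delta at each pre-determined interval instead of re-transferring the full data set.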
3. Overview of the Data Migration Services Provided by MLS

MLS has acquired specialized expertise in carrying out data migration projects successfully across the globe. While executing data migration projects for our customers, we have implemented certain processes to streamline all the activities involved. This process is based around a set of software tools developed in-house that generate some of the essential components, such as the data download and upload programs. The tool-based approach automates the migration to a great extent, thereby ensuring a faster completion time and reducing the chances of error.

[Figure: Source Platform (H/W & S/W) -> Source Data -> Transformation -> Target Data -> Target Platform (H/W & S/W)]

The above picture is the uppermost-level depiction of the most common type of data transformation activity. The source data are transformed and transported to the target platform after a series of operations are applied to them in various stages. The scope generally ends with uploading the data into the target platform. Simple though it sounds, all aspects of the hardware, application software, database design (e.g., table structures, relationships, other objects like DB procedures, etc.) and the installation/deployment details of the source and target environments must be considered in order to complete a successful data migration. While designing the service at MLS, we have tried our best to cover all these aspects.
4. Process Overview

The process of transformation involves:

4.1 Study and analysis of the source and target data models, and extraction of the mapping rules for transforming the business entities in the source application into the target platform.
4.2 Mapping existing data types available in the source platform to equivalent data types in the target database.
4.3 Translation/re-coding and manipulation of the source data as per the requirements of the target application.
4.4 Scrapping and cleansing of data that is invalid in the new environment.

The following figure represents the major processes involved in a data migration activity and the sequence in which they are performed. Each square box represents a separate process, elaborated further in the subsequent sections of this document.

[Figure: process flow - Source Data Model Analysis and Target Data Model Analysis (driven by application programs / database definition scripts) feed Attribute Mapping and Mapper Creation; the tool processes populate the Repository; Data Validation Routine Generation (with error reports feeding back into the mapping rules) and Data Download and Upload Generation produce the utilities, which are tested before the tested Download and Upload utilities carry out the Live Data Migration of the Source Data into the Target Database.]
Process Inputs and Outputs

Process: Source Data Model Analysis
  Input: Source file layouts; source database scripts (DDL/DDS/SDDL/DBD/PSB/SQL etc.) and application programs
  Output: COBOL layout finalization for flat and ISAM files; source field/record rules; source database entities; discrimination rules for each file/record/segment

Process: Attribute Mapping (creation of data mapping rules and validations)
  Input: Source and target data models
  Output: Data Mapping and Validation Rules (1)

Process: Repository Population
  Input: Source file layouts; Data Mapping and Validation Rules
  Output: Populated Repository (2)

Process: Data Validation Program Generation
  Input: 1. Populated Repository 2. Data Mapping and Validation Rules
  Output: 1. Generated data validation programs 2. Sample error reports on invalid data (3)

Process: Data Download & Upload Program Generation
  Input: 1. Populated Repository 2. Data Mapping and Validation Rules
  Output: Generated data download & upload programs

Process: Data Download Testing
  Input: 1. Test database in the source environment 2. Generated data download programs
  Output: Sample source data in plain ASCII format

Process: Data Upload Testing
  Input: 1. Sample source data in plain ASCII format 2. Sample error reports on invalid data
  Output: 1. Populated test database in the target environment 2. Test results on the target platform

Process: Tested Data Download and Upload Programs
  Input: Tested programs, compilation and installation scripts for both platforms
  Output: Data download and upload programs installed on the respective platforms
Process: Data Migration Dry Run
  Input: Sample data for all sources (related and complete)
  Output: Migrated data in the target environment corresponding to the provided sample; rough estimate of the actual time needed in the final run

Process: Data Migration Plan
  Output: Plan document

Process: Test Data Migration
  Input: Full data for all sources for the identified phase(s)
  Output: Migrated data in the staging environment; actual estimate of the time required for the final run; revised plan document

Process: Live Data Migration
  Input: Source data; storage on the target platform
  Output: Transformed data migrated to the target platform; control report to ensure complete migration

(1) Validation rules may be defined for implementation during data transformation on source fields/records. Applying expressions or functions on source field(s) may generate target data elements.

(2) The repository is a complex set of data structures that stores all information related to the source data models. It can be used to produce a variety of reports about the source data models and to generate the download programs.

(3) This will be a repetitive task: inspection of the error reports coming out of this step will gradually refine the data mapping rules. Only after a couple of iterations will it be possible to extract all the prevalent rules existing in the source entities.
5. A Brief Look Inside the Processes

5.1 Source Data Model Analysis

i. Identify all types of storage (network/hierarchical/relational databases, ISAM files, sequential files, etc.) and the respective data definition scripts (schema/sub-schema/DBD/PSB/SQL scripts, etc.).
ii. Identify all data storage units (records/tables) requiring transformation.
iii. Identify all possible layouts for each data storage unit.
iv. Determine the layouts of individual data storage units, with a break-up of data elements to the lowest possible levels.
v. Identify rules to validate records and/or fields in each storage unit. For example, whether a field is a date field and, if so, what the format of that date field is; or, if a field contains a set of valid codes, what those codes are (e.g., M for Male, F for Female), and so on.
vi. Identify records with multiple layouts and the rules to distinguish the different layouts. For example, if multiple record types are all put together in a single file, which field identifies the record type?

5.2 Target Data Model Analysis

i. Determine the target data model.
ii. Determine the significance of each data element in the target data model with respect to the data migration requirement.
iii. Identify the data elements to be populated by migrated data.

5.3 Attribute Mapping

i. Identify rules to transform source data types to target data types.
ii. Correspond each target data element (column) to source data.
iii. Identify rules (expressions, functions, etc.) to be applied on source data element(s) in order to populate target data elements.
iv. Identify rules to validate transformed data elements.
v. Identify rules to transform source records/files into target records/tables (viz., merge, split, etc.).
vi. Identify discrepancies related to data types, sizes, and formats of data elements.
vii. Resolve discrepancies related to data types, sizes, and formats of data elements.
viii. Identify and resolve gaps between source and target data elements.
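To make the kinds of rules discussed above concrete, here is a hedged sketch of how field-validation and attribute-mapping rules might be expressed in code. The field names, date format, and code set are invented examples for illustration; the actual MLS repository stores such rules in its own pre-defined formats:

```python
# Illustrative sketch of field-level validation and mapping rules.
# Assumption: source records carry a YYYYMMDD date field "dob" and a
# one-letter code field "sex" (hypothetical names, not from the paper).
from datetime import datetime

VALID_SEX_CODES = {"M", "F"}   # code-set rule, e.g. M for Male, F for Female

def validate_record(rec):
    """Apply field-level rules; return a list of error messages."""
    errors = []
    try:
        datetime.strptime(rec["dob"], "%Y%m%d")       # date-format rule
    except ValueError:
        errors.append(f"dob is not a valid YYYYMMDD date: {rec['dob']}")
    if rec["sex"] not in VALID_SEX_CODES:              # code-set rule
        errors.append(f"sex code invalid: {rec['sex']}")
    return errors

def transform_record(rec):
    """Map source fields to target columns, applying expressions."""
    return {
        "date_of_birth": f"{rec['dob'][:4]}-{rec['dob'][4:6]}-{rec['dob'][6:]}",
        "gender": "MALE" if rec["sex"] == "M" else "FEMALE",
    }

rec = {"dob": "19751231", "sex": "M"}
errors = validate_record(rec)      # empty list: the record is clean
target = transform_record(rec)
```

In practice such rules are extracted iteratively: error reports from validation runs against real source data gradually refine the rule set, as described for the repository-driven process above.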
5.4 Mapper Creation

Map Information files are created on the basis of the source and target data models and the rules identified above, to aid population of the repository. Map Information files store the data mapping rules and validations in pre-defined formats that the transformation tool recognizes.

5.5 Repository Population

The outputs of all preceding processes are utilized to populate the repository. The repository is a complex data structure that stores all source and target data definitions and the rules for transformation.

5.6 Data Validation Routine Generation

These programs validate the supplied mapping rules. All the rules may not be readily available to the customer on day one; they have to evolve over a period of time. These programs help validate the rules by sampling the actual data from the source databases and generating error reports. Appropriate inclusions and modifications are then applied to the supplied set of rules to bring the error report contents down to an acceptable limit and determine the actual rules.

5.7 Data Download / Upload Program Generation

Data Download

Data download programs are generated by the tool and run on the source platforms. There is one program for every file/record/table in the respective source databases, which dumps the contents of the respective data store into a flat file with all fields converted to plain ASCII text, removing all platform dependencies (e.g., embedded sign fields, computational fields, etc.). The download programs may also generate a control file in order to preserve the existing set relationships of the current record with other records, ordering, and other information as per requirement, so that no information is lost while pulling the data out of the existing environments.

Data Upload

Data upload programs generated or developed during this stage take as input the downloaded ASCII data extracted in the previous step. The Data Mapping document, containing all mapping rules between the source and target databases, supplies the specifications for this task.
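As an illustration of the platform-dependency removal the download programs perform, here is a simplified sketch of decoding an EBCDIC zoned-decimal field with an embedded sign into plain ASCII text. This is a generic example of the technique, not the code the MLS tool generates:

```python
# Sketch: decode an EBCDIC zoned-decimal (embedded sign) field to ASCII.
# In zoned decimal, each byte's low nibble is a digit and the final
# byte's zone nibble carries the sign: 0xC or 0xF positive, 0xD negative.
def decode_zoned(raw: bytes) -> str:
    digits = "".join(str(b & 0x0F) for b in raw)   # low nibbles are digits
    sign_zone = raw[-1] >> 4                        # zone of the last byte
    return ("-" if sign_zone == 0xD else "") + digits

# 0xF1 0xF2 0xD3 holds digits 1, 2, 3 with a negative sign zone: "-123"
value = decode_zoned(bytes([0xF1, 0xF2, 0xD3]))
```

Converting such fields (along with computational/packed fields and byte-order differences) into plain ASCII at download time is what makes the extracted flat files portable across the source and target platforms.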
Data Download Testing

The generated download programs are tested on a test database in the source database environment. Testing is an iterative process. The test results confirm the correctness of the download process.

Data Upload Testing

Data upload programs are run on the development environment after the target databases have been set up there. The test results at this stage confirm the correctness of the entire migration process.

Tested Data Download and Upload Programs

The tested data download and upload programs are then delivered and installed on the respective platforms, with appropriate scripts to compile and execute them.

Data Migration Dry Run

A test run on all source data units with sample data, to ensure success for the live run. This stage also provides a rough estimate of the time required for the final data migration.

Data Migration Plan

A plan for the live data migration is produced. The plan takes all logistics and contingencies into account.

Test Data Migration

A test run of the data migration with the full set of operational data for the identified phase(s), carried out on the staging environment. This provides an estimate of the actual time required for the final data migration. This stage also enables us to determine the most suitable phases in the entire migration process and produces a final data migration plan.

Live Data Migration

The approved migration plan is followed to undertake the live data migration. The correctness of the transformation is confirmed by comparing control reports generated for both the source and target data.
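A control-report comparison of the kind described above can be sketched as a count plus a hash total computed independently on both sides. This is a minimal, hypothetical illustration (the row layout and reconciliation fields are invented), not the actual report format:

```python
# Sketch: reconcile record counts and an order-insensitive hash total
# between source and target to confirm the migration is complete.
import hashlib

def control_report(rows):
    """Return (record count, order-insensitive digest) for a data set."""
    digest = 0
    for row in rows:
        h = hashlib.sha256("|".join(map(str, row)).encode()).hexdigest()
        digest ^= int(h[:16], 16)   # XOR makes row order irrelevant
    return len(rows), digest

source = [(1, "Smith", 100.0), (2, "Jones", 250.5)]
target = [(2, "Jones", 250.5), (1, "Smith", 100.0)]   # loaded in another order

# Matching reports indicate no records were lost or altered in transit.
match = control_report(source) == control_report(target)
```

Row counts alone catch missing records; the hash total additionally catches records whose field values were corrupted during transformation or load.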
For each requirement, the Bidder should indicate which level of support pertains to the requirement by entering 1, 2, or 3 in the appropriate box.
Annex Functional Requirements for: The integrated reconciliation system of Back-Office and Cash Accounts operations: Instructions: The Required or Desired column represents whether a feature is a business
How to address top problems in test data management
How to address top problems in test data management Data reuse, sub-setting and masking Business white paper Table of contents Why you need test data management... 3 The challenges of preparing and managing
An Oracle White Paper March 2014. Best Practices for Real-Time Data Warehousing
An Oracle White Paper March 2014 Best Practices for Real-Time Data Warehousing Executive Overview Today s integration project teams face the daunting challenge that, while data volumes are exponentially
The Role of Automation Systems in Management of Change
The Role of Automation Systems in Management of Change Similar to changing lanes in an automobile in a winter storm, with change enters risk. Everyone has most likely experienced that feeling of changing
Siebel Business Process Framework: Workflow Guide. Siebel Innovation Pack 2013 Version 8.1/8.2 September 2013
Siebel Business Process Framework: Workflow Guide Siebel Innovation Pack 2013 Version 8.1/8.2 September 2013 Copyright 2005, 2013 Oracle and/or its affiliates. All rights reserved. This software and related
Mapping the Technical Dependencies of Information Assets
Mapping the Technical Dependencies of Information Assets This guidance relates to: Stage 1: Plan for action Stage 2: Define your digital continuity requirements Stage 3: Assess and manage risks to digital
Green Migration from Oracle
Green Migration from Oracle Greenplum Migration Approach Strong Experiences on Oracle Migration Automate all tasks DDL Migration Data Migration PL-SQL and SQL Scripts Migration Data Quality Tests ETL and
Audit Trail Administration
Audit Trail Administration 0890431-030 August 2003 Copyright 2003 by Concurrent Computer Corporation. All rights reserved. This publication or any part thereof is intended for use with Concurrent Computer
2.2 INFORMATION SERVICES Documentation of computer services, computer system management, and computer network management.
3 Audit Trail Files Data generated during the creation of a master file or database, used to validate a master file or database during a processing cycle. GS 14020 Retain for 3 backup cycles Computer Run
Continuous integration for databases using Redgate tools
Continuous integration for databases using Redgate tools Wie Sie die Microsoft SQL Server Data Tools mit den Tools von Redgate ergänzen und kombinieren können An overview 1 Continuous integration for
Position Classification Standard for Management and Program Clerical and Assistance Series, GS-0344
Position Classification Standard for Management and Program Clerical and Assistance Series, GS-0344 Table of Contents SERIES DEFINITION... 2 EXCLUSIONS... 2 OCCUPATIONAL INFORMATION... 3 TITLES... 6 EVALUATING
POLAR IT SERVICES. Business Intelligence Project Methodology
POLAR IT SERVICES Business Intelligence Project Methodology Table of Contents 1. Overview... 2 2. Visualize... 3 3. Planning and Architecture... 4 3.1 Define Requirements... 4 3.1.1 Define Attributes...
Automation can dramatically increase product quality, leading to lower field service, product support and
QA Automation for Testing Medical Device Software Benefits, Myths and Requirements Automation can dramatically increase product quality, leading to lower field service, product support and liability cost.
Global Software Change Management for PVCS Version Manager
Global Software Change Management for PVCS Version Manager... www.ikanalm.com Summary PVCS Version Manager is considered as one of the leading versioning tools that offers complete versioning control.
Decomposition into Parts. Software Engineering, Lecture 4. Data and Function Cohesion. Allocation of Functions and Data. Component Interfaces
Software Engineering, Lecture 4 Decomposition into suitable parts Cross cutting concerns Design patterns I will also give an example scenario that you are supposed to analyse and make synthesis from The
BIRT Document Transform
BIRT Document Transform BIRT Document Transform is the industry leader in enterprise-class, high-volume document transformation. It transforms and repurposes high-volume documents and print streams such
EMC DOCUMENT SCIENCES XPRESSION ENTERPRISE INTEGRATION
White Paper EMC DOCUMENT SCIENCES XPRESSION ENTERPRISE INTEGRATION How xpression integrates with applications, content, data, web, and distribution systems Abstract This white paper describes the EMC Document
Unicenter Desktop DNA r11
Data Sheet Unicenter Desktop DNA r11 Unicenter Desktop DNA is a scalable migration solution for the management, movement and maintenance of a PC s DNA (including user settings, preferences and data.) A
BUSINESS SYSTEMS ANALYST I BUSINESS SYSTEMS ANALYST II
CITY OF ROSEVILLE BUSINESS SYSTEMS ANALYST I BUSINESS SYSTEMS ANALYST II DEFINITION To perform professional level work in the analysis, design, programming, testing, installation and maintenance of business
GENPACT partners with Automation Anywhere to deliver Robotic Process Automation (RPA) An Automation Anywhere Case Study
GENPACT partners with Automation Anywhere to deliver Robotic Process Automation (RPA) An Automation Anywhere Case Study Automating F&A processes takes a trucking company the extra mile Industry: Trucking
Meister Going Beyond Maven
Meister Going Beyond Maven A technical whitepaper comparing OpenMake Meister and Apache Maven OpenMake Software 312.440.9545 800.359.8049 Winners of the 2009 Jolt Award Introduction There are many similarities
Migration Strategies & Methodologies
Migration Strategies & Methodologies White Paper April 2011 Table of Contents About this Document... 3 Migration Overview... 4 Infrastructure Migration... 4 Application Migration... 5 Migration Strategies...
Performance Test Process
A white Success The performance testing helped the client identify and resolve performance bottlenecks which otherwise crippled the business. The ability to support 500 concurrent users was a performance
