Legacy Data Migration: DIY Might Leave You DOA
By Wayne Lashley, Chief Business Development Officer

White Paper
In any application migration/renewal project, data migration is usually a relatively minor component in terms of overall project effort. Yet failure of the data migration process can and does cause failure of the entire project. So why do project managers fail to take essential steps to mitigate this risk?

Often, data migration is perceived as the customer's problem, or a problem to be considered later, the expectation being that someone will write a suite of extraction programs that will dump the legacy data into flat files to be loaded into the new relational database management system (RDBMS). However, anytime a particular implementation is expressed in a foreign representation, there is great opportunity for mistranslation, so such a simplistic point of view opens the door to potential catastrophe.

A comprehensive legacy data migration strategy must promote a number of important tactics and features:

1. Discovery and analysis of legacy source data structures, including both logical and physical definitions;
2. Data modeling and design of the RDBMS tables, often with the goal of not only reflecting the structure and content of the legacy source, but also its behavior;
3. Specification and implementation of appropriate mappings and transformations from source to target;
4. Capture of all source and target metadata into a repository;
5. Accommodation of source data structure changes over time;
6. Native interfaces to both the source database and the target database, to maximize throughput and ensure correct interpretation and formatting;
7. An evolutionary methodology for refining the mappings and process, allowing for repeatable and comparative testing;
8. Minimized extraction impact on production systems;
9. Support for phased project implementation, including Change Data Capture (CDC) and possibly bidirectional data replication capabilities.

It would be a daunting challenge for a migration project team to design, develop, test and deploy tools to support these tactics and features. Furthermore, to reinvent the wheel for each new project is simply a waste of resources. Fortunately, this is unnecessary, since mature, proven, feature-rich and cost-effective products that contribute substantially to project success are commercially available.

As a leading provider of legacy data migration, replication and integration products and services, Treehouse Software leverages production experience gained in the field since the mid-1990s. Treehouse data replication solutions have been used to support many data transfer and migration projects. Each customer's project has led to enhancements and refinements such that today it is possible to handle virtually any conceivable legacy data migration requirement.
As an illustration of the potential pitfalls of legacy data migration, consider some issues that Treehouse has successfully addressed with respect to Software AG's ADABAS DBMS.

ADABAS has no concept of transaction isolation, in that a program may read a record that another program has updated, in its updated state, even though the update has not been committed. This means that programmatically reading a live ADABAS database (one that is available to update users) will almost inevitably lead to erroneous extraction of data. Record modifications (updates, inserts and deletes) that are extracted, and subsequently backed out, will be represented incorrectly or not at all in the target. Because of this, at Treehouse we say the only safe data source is a static data source, not the live database.

Many legacy ADABAS applications make use of record typing, i.e., multiple logical tables stored in a single ADABAS file. Often, each must be extracted to a separate table in the target RDBMS. The classic example is that of the code-lookup file: most shops have a single file containing state codes, employee codes, product-type codes, etc. Records belonging to a given code table may be distinguished by the presence of a value in a particular index (descriptor or superdescriptor in ADABAS parlance), or by a range of specific values. Thus, the extraction process must be able to dynamically assign data content from a given record to different target tables depending on the data content itself.

ADABAS is most often used in conjunction with Software AG's NATURAL 4GL, and conveniently provides for unique datatypes ("D" and "T") that appear to be merely packed-decimal integers on the surface, but that represent date or date-time values when interpreted using Software AG's proprietary NATURAL-oriented algorithm. The most appropriate way to migrate such datatypes is to recognize them and map them to the corresponding native RDBMS datatype (e.g., Oracle DATE), in conjunction with a transformation that decodes the NATURAL value and formats it to match the target datatype.

tcVISION, a comprehensive data migration and replication solution offered by Treehouse, provides high-productivity, high-efficiency automated capabilities for dealing with these and other issues (Figure 1). While the discussion below will focus only on ADABAS, in fact tcVISION provides similar capabilities for a wide range of legacy sources, including CA-IDMS, CA-Datacom, IMS/DB, DL/I, DB2, SQL/DS, VSAM and even sequential files, as well as non-legacy sources such as Oracle Database, Microsoft SQL Server, DB2 LUW and ODBC data sources. Moreover, any of the aforementioned sources can also be a target, making tcVISION the most comprehensive and flexible data migration and replication solution anywhere.

Figure 1: The tcVISION data migration and replication solution

For discovery and analysis of legacy source data structures, tcVISION includes modeling and mapping facilities to view and capture logical ADABAS structures, as documented in Software AG's PREDICT data dictionary, as well as physical structures as described in ADABAS Field Definition Tables (FDTs). Note that PREDICT is a passive data dictionary: there is neither requirement nor enforcement that the logical and physical representations agree, so it is necessary to scrutinize both to ensure that the source structures are accurately modeled.
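Once a "D" or "T" field is recognized, the migration of it amounts to a small arithmetic transformation. Software AG's actual decoding algorithm is proprietary, so the epoch and decode below are illustrative assumptions only; the point is the shape of the transformation: recognize the field, decode the integer, and emit a native DATE value.

```python
from datetime import date, timedelta

# Illustrative stand-in for the proprietary NATURAL decode: we simply
# assume the packed-decimal value is a 1-based day count from an assumed
# epoch. The real epoch and algorithm belong to Software AG.
ASSUMED_EPOCH = date(1, 1, 1)  # assumption for illustration only

def decode_d_value(day_count: int) -> date:
    """Map a NATURAL-style 'D' integer to a native RDBMS-ready DATE."""
    return ASSUMED_EPOCH + timedelta(days=day_count - 1)

# A naive flat-file dump would carry only the raw integer; the mapping
# layer must recognize the field as a date and apply the decode:
print(decode_d_value(730000).isoformat())
```

A "T" (date-time) value would get the same treatment with a finer-grained unit, landing in a TIMESTAMP column rather than a DATE.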
Figure 2: tcVISION capture of source structure metadata

Productivity in the data modeling and design of the RDBMS tables is greatly enhanced with tcVISION. In seconds, the product automatically generates a complete, native, fully-functional yet completely-customizable RDBMS (IBM DB2, Oracle, Microsoft SQL Server, etc.) schema based on a specified ADABAS file (Figure 3). Furthermore, tcVISION generates the specification and implementation of appropriate mappings and transformations for converting ADABAS datatypes and structures to corresponding RDBMS datatypes and structures, including automatic handling of the proprietary "D" and "T" source datatypes.

It is important to note that the schema, mappings and transformations that result from auto-generation can be tailored to any specific requirements after the fact. It is even possible to import an existing RDBMS schema and retrofit it, via drag-and-drop, to the source ADABAS elements.

Using the tcVISION GUI, the most complex transformations can be specified. Source fields can be combined into a single column, decomposed into separate columns, and be subject to calculations, database lookups, string operations and even programmatic manipulation. Furthermore, mapping rules can be implemented to specify that data content from a source ADABAS record be mapped to one or more target RDBMS tables, each with its own different structure, as desired, based on the data content itself. Target tables can even be populated from more than one source file.

Figure 3: tcVISION target schema auto-generation and mapping

tcVISION even generates the RDBMS Data Definition Language (DDL) to instantiate the schema (Figure 4).

Figure 4: tcVISION target RDBMS DDL generation

Over the course of a migration project, it is nearly inevitable that source data structure changes will be required and must be accommodated, which almost certainly will affect the target and the data migration process.
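The kind of schema and DDL generation described above is, at its core, a mapping from source field definitions to target column types. The sketch below illustrates the idea; the field layout, type codes and mapping table are assumptions for illustration, not tcVISION's actual rules.

```python
# Hypothetical ADABAS-to-RDBMS type mapping: each entry turns a format
# code and length into a target column type. Real products carry far
# richer rules (occurrences, group fields, null handling, etc.).
TYPE_MAP = {
    "A": lambda length: f"VARCHAR({length})",  # alphanumeric
    "P": lambda length: f"NUMERIC({length})",  # packed decimal
    "D": lambda length: "DATE",                # proprietary date datatype
    "T": lambda length: "TIMESTAMP",           # proprietary date-time datatype
}

def generate_ddl(table, fields):
    """Emit CREATE TABLE DDL from (name, format_code, length) tuples."""
    cols = ",\n  ".join(
        f"{name} {TYPE_MAP[code](length)}" for name, code, length in fields
    )
    return f"CREATE TABLE {table} (\n  {cols}\n);"

print(generate_ddl("EMPLOYEE", [
    ("EMP_ID", "P", 8),
    ("NAME", "A", 40),
    ("HIRE_DATE", "D", 4),
]))
```

The generated text is then just a starting point: as the white paper notes, the auto-generated schema and mappings remain fully customizable after the fact.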
tcVISION source-to-target mappings can be specified with a "Valid From" date/time and a "Valid To" date/time, so that the structure change can be anticipated, scheduled and implemented smoothly (Figure 5).
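Validity-dated mappings mean the migration process can pick the correct mapping version for any point in time. A minimal sketch of the idea (the record layout and field names here are assumptions, not tcVISION's actual repository schema):

```python
from datetime import datetime

# Two versions of a mapping with non-overlapping validity windows, in the
# spirit of "Valid From"/"Valid To" dates on source-to-target mappings.
mappings = [
    {"version": 1, "valid_from": datetime(2015, 1, 1), "valid_to": datetime(2015, 6, 1)},
    {"version": 2, "valid_from": datetime(2015, 6, 1), "valid_to": datetime.max},
]

def active_mapping(at):
    """Return the mapping version in force at the given instant."""
    for m in mappings:
        if m["valid_from"] <= at < m["valid_to"]:
            return m
    raise LookupError("no mapping valid at " + at.isoformat())
```

With windows defined in advance, a planned source structure change simply takes effect when its "Valid From" instant arrives, with no manual cut-over of the mapping rules.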
Figure 5: tcVISION source-to-target mapping

tcVISION captures all source and target metadata into its own RDBMS-based repository, the schema of which is documented to facilitate customized querying, reporting or even integration with other applications. At any time, all metadata and specifications for source-to-target data migration are at the project team's fingertips, and where applicable can be exposed to or integrated with application-migration toolsets.

tcVISION offers highly-efficient batch Extract-Transform-Load (ETL) processing, known as "Bulk Loading" in tcVISION terminology. Bulk Loading is optimized by use of native interfaces to both the source database and the target database: a static native ADABAS data source (a utility unload of the pertinent database files) can be used instead of accessing the live database itself, thus avoiding contention and transaction-isolation issues, and the transformed data is natively formatted to be RDBMS-loader-ready, including automatic generation of the loader utility control instructions, with a high-throughput transformation engine in the middle. tcVISION ETL executes as a single continuous process from source to target, without the need for intermediate storage or middleware. The process can be scheduled and monitored from the tcVISION Control Board (Figure 6).

Figure 6: tcVISION Control Board

With these productivity aids and an optimized ETL process, an evolutionary methodology for refining the mappings and process is enabled. Treehouse advises data migration customers to "migrate early and often." The sooner that data can be populated into the prospective RDBMS, the sooner that data quality and assumption issues can be discovered, confronted and resolved, and capacity and performance planning commenced. Moreover, very early on in the process a test bed can be generated for modeling and prototyping the new application.
Using a static data source allows ETL processing to be repeated and compared between various mapping scenarios and data samples. Thus, the RDBMS schema and mappings can be iteratively refined over time as requirements are developed and discoveries made. 5
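Because a static unload never changes between runs, two ETL passes over it are directly comparable, which is what makes iterative refinement safe. A sketch of the comparative-testing idea (the extract contents and mapping functions are hypothetical):

```python
# The same static extract run through two candidate mappings; because the
# source is frozen, any difference in output is attributable to the
# mapping change alone.
static_extract = [{"id": 1, "amt": "000123"}, {"id": 2, "amt": "004500"}]

def mapping_v1(rec):
    return {"ID": rec["id"], "AMT": int(rec["amt"])}

def mapping_v2(rec):
    # Revised assumption: the source field carries two implied decimals.
    return {"ID": rec["id"], "AMT": int(rec["amt"]) / 100}

def run_etl(mapping, source):
    return [mapping(rec) for rec in source]

diff = [
    (a, b)
    for a, b in zip(run_etl(mapping_v1, static_extract),
                    run_etl(mapping_v2, static_extract))
    if a != b
]
```

Against a live database the two runs would see different data, and such a diff would be meaningless; against a static source it pinpoints exactly what the mapping revision changed.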
Customers also benefit from minimized extraction impact on the operational ADABAS database and applications. ETL can be resource-intensive and costly, particularly in scenarios where database calls are charged back to the user and batch windows are tight. When configured to use a static data source, tcVISION places no workload whatsoever on ADABAS itself. Moreover, it is possible to simply transmit the unloaded source data in its raw format off the mainframe and have tcVISION process it on a platform with available low-TCO capacity (Windows, Linux, Unix), incurring no mainframe workload at all.

Due to pure logistics, most migration projects require support for phased project implementation; only the smallest projects can risk a "big bang" approach. tcVISION users benefit from Replication, tcVISION's term for CDC, in that it enables implementation of the new RDBMS for certain tables, to be used in a read-only manner (generally speaking) in the new system, while maintenance of the source ADABAS data (the "system of record") continues in the legacy system for a period of time. tcVISION's Log Processing mode enables changes made to ADABAS (reflected in the ADABAS protection log, or PLOG) to be periodically extracted and transformed into appropriate SQL statements, according to the established mapping rules (the same ones as are used for ETL), to update the target RDBMS tables. This can even be done in real time using tcVISION's DBMS Extension mode, where changes are captured live from the operational ADABAS database. And for very complex project implementations, tcVISION can be configured to capture changes from both ADABAS and the RDBMS for batch or real-time bidirectional Replication.
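The essence of log-based CDC is turning captured change records into equivalent SQL against the target, reusing the same mapping rules as the bulk load. A sketch of that translation step (the change-record layout below is an assumption for illustration, not the actual PLOG format):

```python
# Hypothetical change records as a CDC stage might surface them after
# mapping: operation, target table, key, and the mapped after-image.
def change_to_sql(change):
    """Translate one captured change into a SQL statement for the target."""
    table, key = change["table"], change["key"]
    if change["op"] == "insert":
        cols = ", ".join(change["after"])
        vals = ", ".join(repr(v) for v in change["after"].values())
        return f"INSERT INTO {table} ({cols}) VALUES ({vals})"
    if change["op"] == "update":
        sets = ", ".join(f"{c} = {v!r}" for c, v in change["after"].items())
        return f"UPDATE {table} SET {sets} WHERE ID = {key}"
    return f"DELETE FROM {table} WHERE ID = {key}"

print(change_to_sql({"table": "EMP", "key": 7, "op": "update",
                     "after": {"NAME": "SMITH"}}))
```

Applied periodically this gives batch Replication; applied as changes occur (as in the DBMS Extension mode described above) it gives real-time Replication.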
In projects where phased implementation may or may not be contemplated, but where data volumes to be migrated are large, tcVISION Replication enables the data migration to be staged over a period of time, as long a period as is needed to accomplish the migration within the available processing capacity or other constraints. ADABAS files can be sequenced and grouped as desired, and each file or group of files separately Bulk Loaded to the RDBMS. Thereafter, Replication can be activated for each Bulk Loaded file. This process can continue until all files have been migrated through Bulk Loading and then maintained in sync via Replication, so that all migrated data arrives at the goal line at the same time. In fact, the go-live can be scheduled whenever convenient by simply continuing Replication until the appropriate time. This capability offers enormous planning and scheduling flexibility for the migration team.

It is impractical to discuss all the features and capabilities of tcVISION within an overview discussion, so suffice it to say that we have only touched on the most critical ones here. Given the maturity, wealth of functionality and relatively low cost of tools like tcVISION, as compared to the effort, complexity and risk entailed in a Do-It-Yourself solution, there is no reason why a legacy renewal project should run aground on data migration.

About the Author

J. Wayne Lashley is Chief Business Development Officer for Treehouse Software, based in Sewickley, PA. Wayne oversees Treehouse product development; sales and marketing; technical support, documentation and quality assurance; partner relations; pre- and post-sales service delivery; and corporate technology. Prior to joining Treehouse in 2000, Wayne held IT technical and management positions in the software development, academic, financial services, government and primary industry sectors through a career that began in the late 1970s.
His positions involved responsibility for programming, analysis, testing, standards and quality assurance, documentation, configuration management, emerging technologies, business intelligence systems, database administration, development support, software procurement, product management and service delivery.
About Treehouse Software

Since 1982, Treehouse Software has been serving enterprises worldwide with industry-leading software products and outstanding technical support. Today, Treehouse is a global leader in providing data migration, replication and integration solutions for the most complex and demanding heterogeneous environments, as well as feature-rich, accelerated-ROI offerings for information delivery, business intelligence and analytics, and application modernization. Treehouse Software customers are able to:

REPLICATE Data Anywhere
INTEGRATE Data Everywhere
MODERNIZE Data from Legacy Applications
ANALYZE Data for Business Advantage

With unmatched comprehensiveness of tools and depth of experience, Treehouse Software applies proven approaches to help customers and partners mitigate risk and profit sooner from modernization benefits.

For more information, contact us: [email protected] | Nicholson Road, Suite 1230, Sewickley, PA
Cloud Ready Data: Speeding Your Journey to the Cloud
Cloud Ready Data: Speeding Your Journey to the Cloud Hybrid Cloud first Born to the cloud 3 Am I part of a Cloud First organization? Am I part of a Cloud First agency? The cloud applications questions
Oracle Data Integrator 11g New Features & OBIEE Integration. Presented by: Arun K. Chaturvedi Business Intelligence Consultant/Architect
Oracle Data Integrator 11g New Features & OBIEE Integration Presented by: Arun K. Chaturvedi Business Intelligence Consultant/Architect Agenda 01. Overview & The Architecture 02. New Features Productivity,
Designing Business Intelligence Solutions with Microsoft SQL Server 2012 Course 20467A; 5 Days
Lincoln Land Community College Capital City Training Center 130 West Mason Springfield, IL 62702 217-782-7436 www.llcc.edu/cctc Designing Business Intelligence Solutions with Microsoft SQL Server 2012
Real Time Data Integration
Using Change Data Capture Technology with Microsoft SSIS An Attunity White Paper Change data capture (CDC) technology has become a strategic component of many data warehouse and business intelligence (BI)
Module 5 Introduction to Processes and Controls
IT Terminology 1. General IT Environment The general IT environment is the umbrella over the following IT processes: 1. Operating Systems 2. Physical and Logical Security 3. Program Changes 4. System Development
How to Implement Multi-way Active/Active Replication SIMPLY
How to Implement Multi-way Active/Active Replication SIMPLY The easiest way to ensure data is always up to date in a 24x7 environment is to use a single global database. This approach works well if your
IBM INFORMATION MANAGEMENT SYSTEMS (IMS ) MIGRATION AND MODERNIZATION - CONVERSION OF HIERARCHICAL DL/1 STRUCTURES TO RDBMS
IBM INFORMATION MANAGEMENT SYSTEMS (IMS ) MIGRATION AND MODERNIZATION - CONVERSION OF HIERARCHICAL DL/1 STRUCTURES TO RDBMS Leverage the technology and operational advantages inherent within the modern
Test Data Management Concepts
Test Data Management Concepts BIZDATAX IS AN EKOBIT BRAND Executive Summary Test Data Management (TDM), as a part of the quality assurance (QA) process is more than ever in the focus among IT organizations
Course Outline. Module 1: Introduction to Data Warehousing
Course Outline Module 1: Introduction to Data Warehousing This module provides an introduction to the key components of a data warehousing solution and the highlevel considerations you must take into account
Integrating data in the Information System An Open Source approach
WHITE PAPER Integrating data in the Information System An Open Source approach Table of Contents Most IT Deployments Require Integration... 3 Scenario 1: Data Migration... 4 Scenario 2: e-business Application
Foundations of Business Intelligence: Databases and Information Management
Chapter 5 Foundations of Business Intelligence: Databases and Information Management 5.1 See Markers-ORDER-DB Logically Related Tables Relational Approach: Physically Related Tables: The Relationship Screen
ISM 318: Database Systems. Objectives. Database. Dr. Hamid R. Nemati
ISM 318: Database Systems Dr. Hamid R. Nemati Department of Information Systems Operations Management Bryan School of Business Economics Objectives Underst the basics of data databases Underst characteristics
EAI vs. ETL: Drawing Boundaries for Data Integration
A P P L I C A T I O N S A W h i t e P a p e r S e r i e s EAI and ETL technology have strengths and weaknesses alike. There are clear boundaries around the types of application integration projects most
SQL Server Master Data Services A Point of View
SQL Server Master Data Services A Point of View SUBRAHMANYA V SENIOR CONSULTANT [email protected] Abstract Is Microsoft s Master Data Services an answer for low cost MDM solution? Will
Business Intelligence Tutorial
IBM DB2 Universal Database Business Intelligence Tutorial Version 7 IBM DB2 Universal Database Business Intelligence Tutorial Version 7 Before using this information and the product it supports, be sure
IBM TSM DISASTER RECOVERY BEST PRACTICES WITH EMC DATA DOMAIN DEDUPLICATION STORAGE
White Paper IBM TSM DISASTER RECOVERY BEST PRACTICES WITH EMC DATA DOMAIN DEDUPLICATION STORAGE Abstract This white paper focuses on recovery of an IBM Tivoli Storage Manager (TSM) server and explores
Enterprise Information Integration (EII) A Technical Ally of EAI and ETL Author Bipin Chandra Joshi Integration Architect Infosys Technologies Ltd
Enterprise Information Integration (EII) A Technical Ally of EAI and ETL Author Bipin Chandra Joshi Integration Architect Infosys Technologies Ltd Page 1 of 8 TU1UT TUENTERPRISE TU2UT TUREFERENCESUT TABLE
Oracle Data Integration: CON7926 Oracle Data Integration: A Crucial Ingredient for Cloud Integration
Oracle Data Integration: CON7926 Oracle Data Integration: A Crucial Ingredient for Cloud Integration Julien Testut Principal Product Manager, Oracle Data Integration Sumit Sarkar Principal Systems Engineer,
Addressing the SAP Data Migration Challenges with SAP Netweaver XI
Addressing the SAP Data Migration Challenges with SAP Netweaver XI Executive Summary: Whether it is during the final phases of a new SAP implementation, during SAP upgrades and updates, during corporate
Best Practices for Implementing Oracle Data Integrator (ODI) July 21, 2011
July 21, 2011 Lee Anne Spencer Founder & CEO Global View Analytics Cheryl McCormick Chief Architect Global View Analytics Agenda Introduction Oracle Data Integrator ODI Components Best Practices Implementation
Oracle SQL Developer Migration
An Oracle White Paper May 2010 Oracle SQL Developer Migration Overview... 3 Introduction... 3 Oracle SQL Developer: Architecture and Supported Platforms... 3 Supported Platforms... 4 Supported Databases...
Implementing a Data Warehouse with Microsoft SQL Server 2012
Implementing a Data Warehouse with Microsoft SQL Server 2012 Module 1: Introduction to Data Warehousing Describe data warehouse concepts and architecture considerations Considerations for a Data Warehouse
MODULE FRAMEWORK : Dip: Information Technology Network Integration Specialist (ITNIS) (Articulate to Edexcel: Adv. Dip Network Information Specialist)
Year: 2011 (2 nd year) Term: 3 Class group: Blue Facilitator: C. Du Toit Description: Learn how to install, configure and administer a Microsoft SQL Server. Learn how to manage a SQL database hosted on
Contents. visualintegrator The Data Creator for Analytical Applications. www.visualmetrics.co.uk. Executive Summary. Operational Scenario
About visualmetrics visualmetrics is a Business Intelligence (BI) solutions provider that develops and delivers best of breed Analytical Applications, utilising BI tools, to its focus markets. Based in
Course Outline: Course: Implementing a Data Warehouse with Microsoft SQL Server 2012 Learning Method: Instructor-led Classroom Learning
Course Outline: Course: Implementing a Data with Microsoft SQL Server 2012 Learning Method: Instructor-led Classroom Learning Duration: 5.00 Day(s)/ 40 hrs Overview: This 5-day instructor-led course describes
ETL-EXTRACT, TRANSFORM & LOAD TESTING
ETL-EXTRACT, TRANSFORM & LOAD TESTING Rajesh Popli Manager (Quality), Nagarro Software Pvt. Ltd., Gurgaon, INDIA [email protected] ABSTRACT Data is most important part in any organization. Data
