Key Data Replication Criteria for Enabling Operational Reporting and Analytics
A Technical Whitepaper
Rick F. van der Lans
Independent Business Intelligence Analyst
R20/Consultancy
May 2013
Sponsored by Informatica
Copyright © 2013 R20/Consultancy. All rights reserved.
Informatica Data Replication, Informatica Fast Clone, Informatica Data Services, and Informatica PowerCenter are registered trademarks or trademarks of Informatica Corporation. Trademarks of other companies referenced in this document are the sole property of their respective owners.
Table of Contents

1 Management Summary
2 Operational Business Intelligence: Easier Said Than Done
   Business Intelligence and Car Windows and Mirrors
   What is Operational BI?
   The Business Value of Operational BI
   Operational BI: Easier Said Than Done
3 The Need for Data Replication
   Data Replication in a Nutshell
   Data Replication in Business Intelligence Architectures
   Data Replication Criteria
4 Heterogeneous Data Source and Data Target Access
5 Non-Intrusive Data Extraction
6 Scalability and High-Performance
7 Easy to Use and Manage
8 End-to-End Data Integration
About the Author Rick F. van der Lans
About Informatica Corporation
1 Management Summary

Operational business intelligence (OBI) is about supporting decision-making processes that require access to zero-latency data. Because these decisions are often urgent and time-sensitive, access to old data is worthless. A key technology for implementing operational BI is data replication. With this technology, new data entered in production systems can be made available for reporting and analytics much more quickly than with ETL. It allows decision makers to see what's happening right there and right then within the organization.

Organizations have to study whether they need data replication technology, and if so, which product to select. But what are the critical criteria for selecting tools? What are, for example, important performance-related capabilities when a tool is used in a BI environment? Are transformation features important? To help with evaluating data replication technology for developing OBI systems, this whitepaper lists and explains important criteria, classified in five groups:

- Heterogeneous data source and data target access
- Non-intrusive data extraction
- Scalability and high performance
- Easy to use and manage
- End-to-end data integration

In addition, this whitepaper describes which criteria are met by Informatica Data Replication (IDR).

2 Operational Business Intelligence: Easier Said Than Done

Business Intelligence and Car Windows and Mirrors

Business intelligence is the domain that focuses on supporting and improving the decision-making processes of organizations. This is done by supplying the organization with reporting and analytical capabilities. Reporting is about presenting to users what has happened, and analytics is about showing what may happen in the future.

A popular analogy used to illustrate the difference between reporting and analytics is driving a car. Reporting can be compared to looking through a car's rearview mirror, whereas analytics can be associated with looking through the front window. Reporting allows you to see what has happened, and analytics helps you to predict what may happen.

Unfortunately, the analogy is only partially correct. If you look through the rearview mirror of your car, you see what's happening right at that moment. This is not true for most reports. Most of them show what happened a few hours ago, the previous day, or maybe even the previous month. In a way, the same is true for analytics: it should show what may happen. If you look through your car's front window, you see what's happening in front of you right then and there. You won't see what will happen after you have rounded the corner.
Bottom line: looking through car windows and mirrors shows what's happening right there and then. This is difficult to relate to classic reporting and analytics. This is the domain of operational BI!

What is Operational BI?

In a nutshell, operational BI is about supporting decision-making processes with reporting and analytics that operate on zero-latency data (or operational data). Popular synonyms for OBI are operational intelligence, real-time analytics, and real-time business intelligence. Whatever name is used, as Wikipedia indicates, the purpose of OBI is to monitor business activities, to identify and detect situations relating to inefficiencies, opportunities, and threats, and to provide operational solutions.

In many organizations, OBI is already being used. For example, credit card companies have long tried to detect fraudulent payments immediately after they are made. The sooner a false payment is detected, the smaller the impact on the customer and the company itself. Another example is traffic control at airports. Traffic controllers, sitting in their high towers and managing incoming and outgoing airplanes, can only operate successfully when they are supplied with zero-latency data. Showing on their monitors where flights were 30 seconds ago would not be very helpful.

To return to the car analogy: as drivers of cars, we are users of OBI. For example, the fuel gauge shows how much fuel is left in the tank. It would not be useful if it indicated how much fuel was available 30 minutes ago. The same applies to the speedometer. If a speedometer showed what our speed was 1 minute ago, we could end up with a mountain of speeding tickets. When you drive a car, all your instruments should show the situation as it is.

The Aberdeen Group 1 identifies the following six "flavors" of OBI:

- transactional BI with analysis and reporting
- real-time analytics with business rules applied to data as it is captured
- near real-time analytics with business rules
- operational reporting
- business activity monitoring or business process monitoring
- decision management based on business rules with integrated reporting and analytic applications

The Business Value of Operational BI

Many business processes and user groups can benefit dramatically from zero-latency data that shows what's happening right there and then. Operational management, the workforce, and external business parties, such as customers and suppliers, are all examples of user groups that can benefit from OBI. Organizations usually identify the following benefits of implementing operational BI: improved efficiency of individual business processes and the organization as a whole, reduction of operating costs, and increased customer satisfaction.

1 The Aberdeen Group, Operational BI: Getting Real Time About Performance, December 2007, see ibm.com/jct03004c/tools/cpeportal/fileserve/download0/140686/AberdeenOperationBIperf.pdf?contentid=140686
For example, in a credit card company, the cost of fixing a fraudulent payment the minute it happens is many times lower than when the false payment is discovered a day later; by then, the thief may already have made hundreds of payments. Another example is when visitors are browsing a website. Based on the style of browsing, certain advertisements should be presented. Again, which advertisement to show depends on what the visitor is doing at that particular moment.

Operational BI: Easier Said Than Done

Most current business intelligence systems have an architecture consisting of a network of databases; see Figure 1. In this architecture, data is entered in the operational databases on the left-hand side of the diagram. ETL solutions are used to copy data from these operational databases to a staging area. Next, data is copied to a data warehouse. It's very likely that ETL technology is used to transform, standardize, integrate, and cleanse data during this process. Then, the data is copied to one or more data marts. In most of these data marts, data is organized in tables with a star schema or snowflake schema. Finally, data is copied to personal data stores (though not in all cases). With every step, the form and quality level of the data comes more in line with the requirements of the reporting and analytical tools.

Figure 1: Most business intelligence systems form a network of databases in which ETL processes are responsible for copying data between data stores.

This step-wise approach takes time. It takes time before data entered in an operational database arrives in a data mart or personal data store and becomes available for reporting and analytics. Clearly, if users need access to zero-latency data, that data has to be moved through the network of databases in microseconds, which is a technological challenge for the following two reasons:

First, to extract the new and changed data from a source system, an ETL tool requests the database server to retrieve that data from the database. The effect is that the database server has to spend time and resources on these requests, database files are accessed, and extra workload is created on the operational system. The performance experienced by the production systems may go down temporarily, performance instabilities may arise, and so on. For this reason, in many organizations ETL tools are only allowed to retrieve data when the workload on the operational systems is minimal or when no users are active at all. Basically, this means that data is extracted once per 24 hours: at night. Some organizations decide to postpone the ETL work until the weekend. The consequence is that the data accessed by the reports has a latency of at minimum a day or a week. All this is not in the interest of the OBI users.

Second, to avoid the interference on operational systems described above, ETL processes are scheduled. Periodically, data is extracted from the source system and pumped over in bulk to a target system. So, there is no continuous copying of new data.

To summarize, implementing OBI in classic BI systems, where scheduled ETL processes copy data from one database to another, is a challenge. A technology is required that is capable of copying new data right when it has been inserted, deleted, or updated, and that does not cause any interference on the operational systems. This is where data replication comes in.

3 The Need for Data Replication

Data Replication in a Nutshell

Data replication is an alternative technology for copying data from source to target systems. The main difference from a more classic ETL solution is that with data replication new data is continuously copied to the target system. When a new record is inserted, or an old one deleted or updated, that change is copied to the target system right away. So, there is no scheduling and no, or only a minimal, delay. In microseconds, new data can become available in the target system.

To minimize interference on the operational systems, most data replication tools do not access the database files or request the database server for data; instead, they retrieve the changed data from the so-called log files. Right before database changes are committed, they are also written by the database server to these log files. In other words, these log files contain all the data changes. Because data replication tools read data from these log files, there is almost no interference on the operational systems.

Data Replication in Business Intelligence Architectures

When data replication is deployed in classic BI systems, it can be used to make new data available more quickly, making it possible to deliver zero-latency data and thus creating a foundation for OBI. Figure 2 shows an example of how data replication can be embedded in a BI system. In this example, data replication is used to copy changed data to a replicated database as quickly as possible. This replicated database can be used by reports that need zero-latency data. In addition, the data can also be copied from the replicated database to the staging area and the data warehouse to support the non-OBI reports.

Figure 2: Data replication can be used to make zero-latency data available in a BI environment.
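The log-based mechanism described above is easy to picture in code. The following Python sketch shows the general shape of a log-based replication loop under simplifying assumptions: the log reader, the record format, and the target interface are all hypothetical, and transaction handling is ignored. A real product reads proprietary log formats and deals with data types, transactions, and failures far more carefully.

```python
from dataclasses import dataclass
from typing import Iterator, Optional

@dataclass
class LogRecord:
    """One entry from the source database's transaction log (hypothetical format)."""
    change_id: int
    operation: str            # "INSERT", "UPDATE", or "DELETE"
    table: str
    key: dict                 # primary-key columns identifying the row
    row: Optional[dict]       # full row image for INSERT/UPDATE, None for DELETE

def tail_log(log) -> Iterator[LogRecord]:
    """Yield records from the log as they appear; reading the log, not the
    tables, is what keeps extraction non-intrusive for the source system."""
    while True:
        yield log.next_record()   # blocks until the database writes more

def replicate(log, target) -> None:
    """Forward every change to the target the moment it is read from the log."""
    for rec in tail_log(log):
        if rec.operation == "INSERT":
            target.insert(rec.table, rec.row)
        elif rec.operation == "UPDATE":
            target.update(rec.table, rec.key, rec.row)
        elif rec.operation == "DELETE":
            target.delete(rec.table, rec.key)
```

The essential contrast with scheduled ETL sits in the loop: there is no batch window, and each change moves to the target as soon as it is read from the log.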
Evidently, this is only one way of embedding data replication in a BI system. An alternative would be to merge the replicated operational database and the staging area, or to replicate data straight into the data warehouse. All these alternatives have their advantages and disadvantages, and they all need data replication as an enabler. For a more in-depth explanation of data replication, we refer to the whitepaper Data Replication for Enabling Operational BI. 2

In a nutshell, the business value of deploying data replication is to let business users operate with, and make decisions on, zero-latency data. Replication makes it possible for BI to follow the speed of today's businesses. The business will no longer be constrained because only yesterday's data can be obtained. Users can monitor and analyze business processes live.

Data Replication Criteria

Data replication tools have been around since the 1990s. Until recently, they were primarily used in operational environments to increase availability or scalability. Now, with the increasing need for OBI, data replication is entering a new application domain. In general, when a technology is used in a new application domain, some of the traditional capabilities are still needed, but other, new ones can become more important. To help with evaluating data replication technology for developing OBI systems, the following sections list and explain important criteria, classified in five groups:

- Heterogeneous data source and data target access
- Non-intrusive data extraction
- Scalability and high performance
- Easy to use and manage
- End-to-end data integration

In addition, these sections describe which criteria are met by Informatica Data Replication (IDR).

2 Rick F. van der Lans, Data Replication for Enabling Operational BI, June 2012, see Data_Replication_for_Enabling_Operational_BI_WP_www
4 Heterogeneous Data Source and Data Target Access

It's very common for enterprise data to be stored in a heterogeneous set of databases. For example, different database servers may be in use for different operational systems, and yet other database servers may have been selected for the data warehouse and the data marts. Therefore, it's important that data replication technology supports a large set of database servers as source and as target. In addition, the extraction, transportation, and loading of data should be as efficient as possible. The reason is straightforward: data replication is deployed when time matters, when it's crucial that data is made available to reports as soon as possible. It's all about zero-latency data. When evaluating a data replication technology for heterogeneous data source and data target access, the criteria highlighted below should be considered.

Criteria 1: What is the list of data sources from which the data replication technology can extract data?

Especially in the world of business intelligence, more and more different database servers are being deployed. Therefore, it's important that the list of database servers the data replication technology supports as sources is extensive. IDR can extract data from the following SQL database servers:

- Oracle
- IBM DB2 for Linux, Unix, and Windows
- Microsoft SQL Server
- Sybase

Criteria 2: What is the list of data targets into which the data replication technology can load data?

What applies to the source systems applies to the target systems as well: it's important that the list of database servers supported as targets is extensive. IDR can load data into the following SQL database servers:

- Oracle
- Microsoft SQL Server
- MySQL
- IBM DB2 for Linux, Unix, and Windows
- Sybase
- Teradata
- IBM/Netezza
- HP/Vertica
- EMC/Greenplum
- PostgreSQL

In addition, IDR can efficiently copy data from the data sources mentioned above, via intermediate files, to Hadoop. This can be useful for various reasons. The first reason is to copy structured data from production systems to a Hadoop environment, which enriches the data stored there with more context. For example, if sensor data is stored in Hadoop, it's probably highly coded data. By adding more contextual data to those codes, the data becomes more valuable for reporting and analytics. Another reason applies when large amounts of data have to be processed before they are stored in a classic database. In this case, they can first be loaded into Hadoop (IDR uses intermediate files), then processed, and finally copied from Hadoop to a SQL database server.

Also, IDR can send data to message queue technology. The message bus can then pass that data to any type of application interested in it. In a way, by combining IDR with message queues, data is pushed from the source system to the target system; see the sketch below. Operational dashboards, for example, can really benefit from this.
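To illustrate the push model, the sketch below serializes each captured change as a JSON message and publishes it on a queue from which a dashboard could consume. The queue and the event structure are hypothetical stand-ins, not IDR's actual interface to a message bus.

```python
import json
import queue

# Stand-in for a real message broker; in practice this would be a JMS/AMQP bus.
bus: "queue.Queue[str]" = queue.Queue()

def publish_change(table: str, operation: str, row: dict) -> None:
    """Serialize one captured change and push it onto the message bus."""
    event = {"table": table, "op": operation, "row": row}
    bus.put(json.dumps(event))

# A dashboard (or any subscriber) receives changes as they happen,
# instead of having to poll the database.
publish_change("payments", "INSERT", {"card": "1234", "amount": 99.95})
print(json.loads(bus.get()))
```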
Criteria 3: Besides replicating data, can the data replication technology also replicate database schemas and database schema changes?

IDR notices when the schema of the database to be replicated has changed. Examples of such database changes are new columns, dropped columns, and data type changes. These changes have to be replicated as well, to keep the replicated database in sync. IDR can also replicate schemas in a heterogeneous environment.

Criteria 4: Is the data replication technology database agnostic where possible, and are cross-platform mappings supported?

Developers use so-called mappings to indicate how data from a source system should be replicated to a target system. Some of those specifications are product-specific, such as the specifications that link the data replication technology to a source or target system. The rest of the mapping specifications, such as which source column should be mapped to which target column, which columns to skip, which records to filter, and so on, should be as database agnostic as possible. In other words, the mappings should be cross-platform. This allows for cross-platform development: the source and target systems don't have to be identical. IDR can be classified as database agnostic and supports cross-platform mappings; most of the specifications to be entered do not depend on the source or target system used.
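To show what database-agnostic might look like in practice, here is a hypothetical mapping specification expressed as plain data (not IDR syntax): only the connection entries are product-specific, while the column mappings, skip list, and row filter would carry over unchanged if the source or target were swapped for another database server.

```python
# Hypothetical cross-platform mapping specification; names are illustrative.
mapping = {
    # Product-specific parts: only these change when systems are swapped.
    "source": {"server": "oracle-prod-01", "type": "oracle", "table": "CUST"},
    "target": {"server": "vertica-dw-01", "type": "vertica", "table": "DIM_CUSTOMER"},

    # Database-agnostic parts: reusable across platforms.
    "columns": {
        "CUST_ID": "customer_id",
        "CUST_NAME": "customer_name",
    },
    "skip_columns": ["INTERNAL_FLAG"],       # columns not replicated at all
    "row_filter": "COUNTRY = 'NL'",          # only replicate matching records
}
```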
5 Non-Intrusive Data Extraction

Technologically, there are different styles of extracting data from source systems. Some of these styles require substantial processing by the source systems and some require almost none. In BI environments, most of the source systems are production systems. Because data replication continuously extracts data, it's important that the replication process doesn't interfere with the production processing itself. In other words, the replication process should be as non-intrusive as possible.

Criteria 1: Is log-based CDC used by the data replication technology for non-intrusive extraction of data from the data sources mentioned in Criteria 1 of the previous section?

To be as non-intrusive as possible, IDR uses log-based CDC for all four supported SQL database servers: Oracle, DB2 LUW, Microsoft SQL Server, and Sybase. For Oracle, it extracts transactional changes from the online redo logs and/or the archive logs; for Microsoft SQL Server, changes are extracted from the online or backup log files; for DB2 LUW, the changes come from the UDB database log files; and for Sybase, changes are extracted from the transaction log files. In all cases, the extraction process is non-intrusive and does not interfere with the production systems.

Criteria 2: Does the component of the data replication technology responsible for extracting data from the log files do most of its processing on the same machine on which the source system itself runs, or can some or all of the extraction work be offloaded to another machine?

IDR offers two styles of log-based CDC extraction. First, the Extractor component of IDR (responsible for capturing and extracting data) transfers the changed, transactional data to intermediate files. Next, these files are shipped over to the server on which the target system resides, or to any other system. In this solution, the extraction process runs on the target system or on a third system dedicated to this purpose. Second, IDR also supports an option (called the Off-box option) where the archive logs from the source system are shipped over to another machine, where they are processed by IDR. With this option, there is almost no work done on the source system side; the level of interference on the production systems is therefore very close to zero. The sketch below pictures this division of labor.
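A minimal sketch of the off-box style, assuming hypothetical file paths and a caller-supplied log parser: the only work on the production machine is copying finished log files, while all CPU-intensive parsing happens on the replication machine.

```python
import shutil
from pathlib import Path

ARCHIVE_DIR = Path("/u01/archivelogs")      # on the source machine (hypothetical)
OFFBOX_DIR = Path("/replication/inbox")     # on the replication machine (hypothetical)

def ship_archived_logs() -> None:
    """Runs on the source host: copy finished archive logs to the off-box host.
    Copying files is essentially the only load placed on the source."""
    for log_file in sorted(ARCHIVE_DIR.glob("*.arc")):
        shutil.copy(log_file, OFFBOX_DIR / log_file.name)

def process_shipped_logs(parse) -> None:
    """Runs on the replication host: parse the shipped logs into changes."""
    for log_file in sorted(OFFBOX_DIR.glob("*.arc")):
        for change in parse(log_file):
            ...  # hand each change to the applier
```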
6 Scalability and High-Performance

Operational business intelligence requires zero-latency data. This implies that it's important that data is extracted, transformed, transported, and loaded fast. Performance is key here! In addition, the entire process should be robust, because decisions are made on the data in the reports. Serious business problems may occur if a user is not aware that the data is not correct.

Criteria 1: Which load mechanisms are used by the data replication technology to load new data efficiently into the data targets?

IDR provides three different forms of loading data:

- With SQL Apply, SQL statements are applied to the target system that correspond to the SQL statements executed by the source system. For example, if a transaction on the source system consisted of some INSERT, UPDATE, and DELETE statements, then a set of INSERTs, UPDATEs, and DELETEs is executed on the target system that has the same effect on the data (this may not be the exact same set of SQL statements). To improve load performance, and where possible, multiple transactions coming from the source system are bundled into one transaction on the target system.

- With Merge Apply, data coming from the source system is first copied into staging tables in the target system. Periodically, the data in these staging tables is copied into the real target tables using a bulk load approach. This is the preferred approach when data is loaded into data warehouse appliances, such as EMC/Greenplum, HP/Vertica, IBM/Netezza, and Teradata. The bulk load approach makes it possible to deploy the native loading utility of these systems. This form of loading is much more efficient than using individual SQL statements; see the sketch after this list.

- With Audit Apply, each source table has a corresponding audit log table in the target system. These audit log tables record a row for every SQL update, insert, or delete on the source table. Each row contains a before- and an after-image of the data for each column in the table, plus the columns that IDR adds to store metadata about the operation. This style of loading is very useful for auditors: it gives them full insight into what has been replicated and when.
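A rough sketch of the Merge Apply idea, assuming hypothetical table names and a generic MERGE dialect (the exact SQL differs per target system): changes accumulate in a staging table, and a periodic bulk merge folds them into the real target table, which is far cheaper on an analytical database than applying changes row by row.

```python
STAGING_MERGE = """
MERGE INTO dim_customer AS t
USING stage_customer AS s
      ON t.customer_id = s.customer_id
WHEN MATCHED AND s.op = 'DELETE' THEN DELETE
WHEN MATCHED THEN UPDATE SET t.customer_name = s.customer_name
WHEN NOT MATCHED AND s.op <> 'DELETE' THEN
     INSERT (customer_id, customer_name)
     VALUES (s.customer_id, s.customer_name);
"""

def merge_apply(connection) -> None:
    """Periodically fold the staged changes into the target table in bulk."""
    with connection.cursor() as cur:           # any DB-API style connection
        cur.execute(STAGING_MERGE)
        cur.execute("TRUNCATE TABLE stage_customer")  # staged rows are applied
    connection.commit()
```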
Criteria 2: For which target systems does the data replication technology use a high-end load utility or feature of the target system?

As indicated under Criteria 1, IDR has a loading option called Merge Apply that optimizes loading data into data warehouse appliances and analytical database servers by using bulk loads. For example, for Teradata the utility called TPT (Teradata Parallel Transporter) is used. Most of the data warehouse appliances allow for parallel loading of data. IDR provides the ability to configure the number of parallel threads used during the loading process. This allows for parallel loading of data if the target system supports it, thus reducing the elapsed time necessary for loading; see Figure 3.

Figure 3: IDR supports loading of data into data warehouse appliances and analytical database servers by deploying the parallel loading features of those products.

Criteria 3: The internal architecture of data replication technology is crucial for its potential scalability, performance, and robustness. What does the internal architecture look like?

Figure 4 shows the high-level architecture of IDR. IDR ships the files with data changes before everything has been committed. In other words, the changes are streamed to the target system on a continuous basis, regardless of whether the transaction has been committed on the source. It's the Applier component that determines whether changes have really been committed; if not, they are not processed. This approach is sometimes referred to as optimistic replication. The advantage of this approach is that no additional memory or disk space is utilized to store the intermediate files on the source. In addition, since the intermediate files are continuously streamed, any latency associated with moving the intermediate files over to the target is eliminated.

Figure 4: High-level architecture of IDR, showing the Extractor, the intermediate files, and the Applier (with SQL Apply, Merge Apply, and Audit Apply), plus the Server Managers and their checkpoints on both sides.

Criteria 4: How extensive are the data transformation capabilities of the data replication technology? For example, can new fields be added, can fields be dropped, can multiple fields be concatenated, and can string, mathematical, and date-time operations be applied?

IDR supports basic transformations. For example, string and mathematical manipulations can be applied to columns, columns can be concatenated, virtual columns can be added, and existing ones can be skipped. IDR can also use any built-in database function to transform data as it is applied to the target system.
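The flavor of such basic transformations can be sketched as a per-row function with hypothetical column names: concatenate two source columns, derive a virtual column with a mathematical expression, and skip a column by not mapping it.

```python
def transform_row(source_row: dict) -> dict:
    """Basic per-row transformations of the kind a replication tool applies in flight."""
    return {
        # string manipulation: concatenate two source columns
        "full_name": source_row["first_name"] + " " + source_row["last_name"],
        # mathematical manipulation: add a derived, virtual column
        "price_incl_vat": round(source_row["price"] * 1.21, 2),
        # existing columns can simply be passed through...
        "customer_id": source_row["customer_id"],
        # ...and "internal_flag" is skipped by not mapping it at all
    }

print(transform_row({"first_name": "Rick", "last_name": "van der Lans",
                     "price": 100.0, "customer_id": 42, "internal_flag": 1}))
```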
Criteria 5: When data is transported from a source to a target system, can it be compressed by the data replication technology?

The Extractor component of IDR stages all data changes in compressed files, reducing network overhead.

Criteria 6: When data is transported from a source to a target system, can it be encrypted by the data replication technology?

In IDR, the intermediate files can be encrypted while in transit from the source to the target system.

Criteria 7: Can the data replication technology join data from multiple sources/tables into one table in the target system?

In data warehouse environments it's not uncommon that data from different systems has to be merged into one table. For example, customer records from two production systems may have to be brought together in one customer table in, say, a data warehouse. In relational terminology, this is a union of rows. In IDR, for consolidation purposes, data can be extracted from multiple data sources and loaded into one and the same table. For example, if multiple production systems contain customer data, and there is a need to create one table that consolidates all that customer data in a real-time fashion, this feature is strongly needed.

Criteria 8: What happens when a target system becomes unavailable?

To recover from the situation where a target system becomes unavailable, IDR provides a checkpoint mechanism. If, for any reason, the target system becomes unavailable, the Apply process stops. However, the checkpoint mechanism marks the exact recovery point, and the other components continue to function unabated. Once the target system is back up again, the Apply process continues seamlessly, picking up from where it left off; the sketch below illustrates the principle.
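A minimal sketch of checkpoint-based recovery, assuming a hypothetical checkpoint file and change format: the applier records the identifier of the last change it applied, and after an outage it resumes from exactly that point instead of starting over or losing changes.

```python
import json
from pathlib import Path

CHECKPOINT = Path("applier.checkpoint")   # hypothetical checkpoint location

def load_checkpoint() -> int:
    """Return the id of the last successfully applied change (0 if none yet)."""
    if CHECKPOINT.exists():
        return json.loads(CHECKPOINT.read_text())["last_applied"]
    return 0

def apply_changes(changes, target) -> None:
    """Skip everything up to the checkpoint, then apply and advance it."""
    last = load_checkpoint()
    for change in changes:
        if change["id"] <= last:
            continue                       # already applied before the outage
        target.apply(change)
        CHECKPOINT.write_text(json.dumps({"last_applied": change["id"]}))
```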
7 Easy to Use and Manage

When data replication technology is used to support business intelligence, it's very likely that specialists from the BI department or BI competence center (BICC) have to work with it. Because BI specialists don't always have detailed technical knowledge of the operational systems, and are not always skilled in the technologies deployed in operational systems, it's important that the data replication technology is easy to use and manage.

Criteria 1: Does the data replication technology support features for initially loading the target system with the data that's already in the source system?

IDR comes with a module called InitialSync for the initial synchronization of source and target systems. In addition, Informatica offers a separate product called Informatica Fast Clone, which performs high-speed synchronization of databases. Even when the source database is in use, the entire database can be cloned. Fast Clone does not access the database server itself, but accesses the database files directly. Currently, Fast Clone only supports Oracle databases as source; it supports several targets.

Criteria 2: Does the data replication technology support a graphical, user-friendly interface for entering and maintaining extraction, transformation, transportation, and loading specifications? Or do developers need to use a technical scripting language?

IDR offers a graphical, user-friendly interface for configuration, management, and administration. This eliminates the need for manual scripting. Figure 5 shows an example of the interface of IDR.

Figure 5: Screenshot showing the graphical, user-friendly interface of IDR.

Criteria 3: Can data replication specifications of source and target systems be re-used?

In IDR, all these specifications can be re-used. There is no need to define a source system again if the data in that system is used a second time to replicate to another target system.

Criteria 4: Does the data replication technology support a centralized monitor that shows the entire data replication environment?

IDR offers a centralized, GUI-based console. Figure 6 contains a screenshot of the monitor in action.

Figure 6: Screenshot showing the monitor of IDR.
8 End-to-End Data Integration

The previous sections show that data replication technology is an enabler for building OBI systems. Without it, access to zero-latency data for reporting and analytics is hard to implement. But it's not the only technology needed to build OBI systems. Data replication is responsible for making new data entered in the operational systems available for reporting, analytics, and other tasks as quickly as possible. Next, the data may have to be transformed, cleansed, and integrated before it can be used in an operational reporting environment. It may also be necessary for the data to be processed by a complex event processing (CEP) engine, which applies rules to the data for making embedded business decisions.

In other words, to implement operational analytics and reporting, data replication alone is not sufficient. Besides data replication, a data integration platform should also support ETL functionality for advanced batch-oriented transformation, integration, and cleansing; it should support on-demand transformation, integration, and cleansing (data virtualization); and it should offer CEP functionality to implement embedded BI. Together, these functionalities form an end-to-end data integration platform.

Criteria 1: Is ETL functionality supported for advanced data transformation, integration, and cleansing in a batch-oriented and scheduled style?

IDR complements the Informatica Platform, which includes a popular and powerful ETL product called PowerCenter. IDR easily interoperates with PowerCenter; the latter can be used to transform, integrate, aggregate, and cleanse the data to bring it into the form the BI reports need.
Criteria 2: Is data virtualization functionality supported for advanced data transformation, integration, and cleansing in an on-demand style?

Informatica offers a data virtualization product called Informatica Data Services (IDS), which uses the same repository and development language as Informatica PowerCenter. Like IDR, IDS is a component of the Informatica Platform. These two tools can cooperate in different ways. For example, when IDR has pushed data to a replicated database, IDS can be used to transform, aggregate, and integrate the data in an on-demand style, delivering zero-latency data in the right form to the reports. Another example: when data is cached in IDS, that data can be replicated to other environments.

Criteria 3: Is complex event processing (CEP) functionality supported for embedded analytics?

The goal of CEP is to identify meaningful events (such as opportunities or threats) and respond to them instantaneously. With CEP, data is processed continuously. As data is processed, it's held against relevant rules, and if a rule fires, an action is taken. Examples of such rules are: Is this the second payment with a particular card within 1 minute? Did the visitor jump from webpage X to webpage Y directly? When data flows in and such a rule fires, an application or user can decide what to do with it. In the case of the credit card payments, the second one may be fraudulent and must be stopped.

Informatica's CEP product is called RulePoint, with which CEP rules can be specified. IDR can act as a source that continuously streams data to the RulePoint CEP engine. Together, IDR and RulePoint form a foundation for developing embedded OBI systems.
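To make the first rule concrete, here is a tiny stream-processing sketch of "a second payment with the same card within one minute". The event format and handler are hypothetical; a product such as RulePoint expresses rules like this declaratively rather than in Python.

```python
from collections import defaultdict

# Per card, remember when the previous payment arrived (hypothetical event feed).
last_payment_at: dict = defaultdict(lambda: float("-inf"))

def block_payment(card: str) -> None:
    print(f"ALERT: possible fraudulent payment on card {card}")

def on_payment(card: str, timestamp: float) -> None:
    """Evaluate the rule on every incoming payment event."""
    if timestamp - last_payment_at[card] <= 60:   # second payment within 1 minute
        block_payment(card)                       # act the moment the rule fires
    last_payment_at[card] = timestamp

# Replicated payment events stream in continuously:
on_payment("1234", 10.0)
on_payment("1234", 45.0)   # second payment within 60 seconds, so an alert fires
```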
About the Author Rick F. van der Lans

Rick F. van der Lans is an independent analyst, consultant, author, and lecturer specializing in data warehousing, business intelligence, service-oriented architectures, and database technology. He works for R20/Consultancy, a consultancy company he founded. Rick is chairman of the annual European Data Warehouse and Business Intelligence Conference (organized in London). He writes for the eminent B-eye-Network 3 and other websites. He introduced the business intelligence architecture called the Data Delivery Platform in 2009 in a number of articles 4, all published at BeyeNetwork.com.

He has written several books on SQL. His popular Introduction to SQL 5 was the first English book on the market, in 1987, devoted entirely to SQL. After more than twenty years, this book is still being sold, and it has been translated into several languages, including Chinese, German, and Italian.

You can get in touch with him at [email protected] or via LinkedIn.

About Informatica Corporation

Informatica Corporation is the world's number one independent provider of data integration software. Organizations around the world gain a competitive advantage in today's global information economy with timely, relevant, and trustworthy data for their top business imperatives. More than 4,100 enterprises worldwide rely on Informatica to access, integrate, and trust their information assets held in the traditional enterprise, off premise, and in the cloud.

3 See BeyeNetwork.com.
4 See BeyeNetwork.com.
5 R.F. van der Lans, Introduction to SQL; Mastering the Relational Database Language, fourth edition, Addison-Wesley, 2007.