Optimizing the Performance of the Oracle BI Applications using Oracle Data Warehousing Features and Oracle DAC

Mark Rittman, Director, Rittman Mead Consulting, for Collaborate 09, Florida, USA, May 2009

The Oracle BI Applications consist of a pre-defined dimensional data warehouse, ETL routines, an Oracle BI Enterprise Edition repository and example dashboards and reports. The ETL routines are built using Informatica PowerCenter and are scheduled and orchestrated using the Oracle Data Warehouse Administration Console (DAC). The data warehouse provided with the Oracle BI Applications is designed to be deployed on Oracle Database Enterprise Edition, Microsoft SQL Server or IBM DB2. As such, whilst it uses common data warehousing features such as bitmap indexes, it does not make use of any Oracle-specific features such as segment compression, partitioning or materialized views. It is possible, however, to make use of these features, and this paper sets out a methodology for their use with Oracle BI Applications 7.9.5 and the Oracle Data Warehouse Administration Console.

An Oracle Business Analytics Warehouse Overview

The Oracle Business Analytics Warehouse consists of a number of staging and presentation tables that together support the loading and querying of enterprise data via a conformed dimensional model. Tables are created as regular heap tables, with a minimal number of NOT NULL check constraints but no primary keys, foreign keys, partitions or other additional items of metadata. Tables are loaded via Informatica PowerCenter 8.1.1, using the PowerCenter Integration Service and row-by-row data loading. Aggregate tables are created and populated to support key fact tables, using separate ETL processes after the main fact table loads that truncate, and then rebuild, the aggregates.
It is, however, possible to customize the Oracle Business Analytics Warehouse to take advantage of features such as segment compression, partitioning, materialized views and other Oracle data warehouse features. It is also possible to add additional metadata such as primary key and foreign key constraints, dimensions and other features to support more efficient querying of detail-level and summarized data. To illustrate how these Oracle features can be used to optimize the loading and querying of the Oracle Business Analytics Warehouse, this paper will take one of the fact tables within the data warehouse and apply these techniques to it.

Performance Optimization Scenario

The Oracle Business Analytics Warehouse contains a table called W_SALES_INVOICE_LINE_F that contains fact data on the sales invoices generated by the business. It is supported by an aggregate table, W_SALES_INVOICE_LINE_A, that takes data from the original table and summarizes it to improve query performance. In the sample data set used in this paper, these two tables had the following row counts and sizes:
select count(*) from w_sales_invoice_line_a;

select count(*) from w_sales_invoice_line_f;

select segment_name, bytes/1024/1024 "Size in MB"
from   user_segments
where  segment_name in ('W_SALES_INVOICE_LINE_F', 'W_SALES_INVOICE_LINE_A');

SEGMENT_NAME               Size in MB
W_SALES_INVOICE_LINE_A              9
W_SALES_INVOICE_LINE_F            189

These tables are loaded by two DAC tasks and a DAC task group:

1. TASK_GROUP_Load_SalesFact calls the following SIL and PLP tasks and, when they have completed, recreates any indexes required for supporting queries.
2. SIL_SalesInvoiceLinesFact, which initially drops the indexes on the fact table, then calls either the SIL_SalesInvoiceLinesFact or SIL_SalesInvoiceLinesFact_Full Informatica workflow for incremental and full loads respectively, and then recreates just those indexes required for the rest of the ETL process.
3. PLP_SalesInvoiceLineAggregate, which again drops indexes, this time on the aggregate table, then calls either the PLP_SalesInvoiceLinesAggregate_Load or PLP_SalesInvoiceLinesAggregate_Load_Full Informatica workflow for incremental and full loads of the aggregate table, then recreates the indexes required for the rest of the ETL process.

Initial Benchmarks

To create baseline figures to compare your optimizations to, start the Oracle Data Warehouse Administration Console and create a new subject area within the DAC Repository that uses the two tasks and one task group listed above.
Now switch to the Execute view in the DAC Console and create a new execution plan that deploys this new subject area. Create the parameter list for the execution plan and build the set of ordered tasks, so that you are ready to run the execution plan and generate some baseline ETL timings. Now run the execution plan twice, firstly in FULL mode and then in INCREMENTAL mode, so that you can compare these timings against subsequent ones to establish the benefit that each feature provides.

Adding Compression to the Fact Table

The first optimization task is to add the COMPRESS clause to the W_SALES_INVOICE_LINE_F fact table, so that rows that are inserted using direct path operations are compressed. Compression is an Oracle Database Enterprise Edition feature, and stores more rows of data in each individual data block to provide two main benefits: less space is taken up by data warehouse data, and full table scans can be performed faster as fewer blocks are required to retrieve all the table's data. To test the benefits of compressing the W_SALES_INVOICE_LINE_F fact table, first truncate it and then alter the table to add compression:

SQL> truncate table w_sales_invoice_line_f;

Table truncated.

SQL> alter table w_sales_invoice_line_f compress;

Table altered.

You can then restore the ETL source tables back to their original state and run the full, and then incremental, loads into the fact and aggregate tables in order to test that compression is working as expected. Note that tables will only be compressed when data is inserted, as Informatica PowerCenter by default uses bulk load functionality to perform table inserts. Updates, or mixed insert/update loads, will not benefit from compression, as Informatica will revert to conventional path inserts, and of course updates remove compression from Oracle tables unless you are using the Advanced Compression Option for the database.

Partitioning the Fact Table

Partitioning is an option for the Enterprise Edition of the Oracle Database that allows you to split one large physical table into several smaller physical ones, with users still seeing it as one big table but giving you far more flexibility in how you can store and manage data within it. Partitioning is typically used with large fact tables and allows the DBA to assign each partition to a separate tablespace, which can then be stored on different physical disk units and backed up independently. As with table compression, though, the Data Warehouse Administration Console unfortunately does not have any concept of partitioning, and you will therefore have to carry out some additional steps to use this feature. Tables such as W_SALES_INVOICE_LINE_F are normally created by the DAC administrator when initially installing the Oracle Business Analytics Warehouse, by selecting Tools > ETL Management > Configure from the DAC menu.
However, there is no provision to create tables using the PARTITION BY (or COMPRESS) clauses, and so we can either create the table outside of the DAC, as we did in the previous step for the COMPRESS clause, or we can use the Actions feature in the DAC to create the table for us, using the requisite clause, before we try and do a full load into the table. Actions are a new feature of this version of the DAC, and allow us to create table, index and task actions:

- Table actions allow us to override the Truncate and Analyze steps carried out on tables during an ETL process.
- Index actions allow us to override the creation and dropping actions associated with indexes.
- Task actions allow us to execute SQL and PL/SQL steps before or after a task executes.
Note that this version of the DAC has a bug in it that corrupts the SQL text for an action. You will need to apply a patch over this release to be able to carry out the actions in this paper (available from Oracle Metalink; the patch number was TBA at the time of writing), or install the follow-up release when it becomes available. As DAC actions cannot override the creation step for a table, only the truncate and analyze steps, you will add a new task action for the SIL_SalesInvoiceLinesFact task that will run when the task is run in FULL mode, and that will drop the existing, non-partitioned version of the table and recreate it using the required partitioning clause. To start this process, select Tools > Seed Data > Actions > Task Actions from the DAC application menu. At the Task Action dialog, press New to create a new action, call the action Create W_SALES_INVOICE_LINE_F Partitioned, save the action and click in the Value text box to set the table creation scripts.
Using the action Value dialog, create two steps, one to drop the table and the other to create it. Make sure the drop step is listed above the create step and is set to Continue on Fail, and enter the following SQL statement to drop the table:

drop table w_sales_invoice_line_f

For the create table step, do not check the Continue on Fail checkbox, then enter the appropriate table creation command into the SQL Statement text box, remembering to add the PARTITION BY clause to the script, and the COMPRESS clause if you would like the table to be compressed as well:

create table w_sales_invoice_line_f
( sales_ordln_id varchar2(80 char),
  ...
  x_custom varchar2(80 char) )
compress
partition by range (cost_center_wid)
( partition cc_1 values less than (4000),
  ... )
Do not place any semicolons at the end of the SQL script, as this will cause it to fail when run. Even though this new task action will drop the W_SALES_INVOICE_LINE_F table and then recreate it using partitioning, the DAC still holds details of the table in its repository, together with details of the indexes that are associated with it. As things stand, the DAC would drop these indexes as part of the SIL task and recreate them using the task group task; however, it does not normally understand the concept of local indexes and will try and create them without any local or global clause, which has the effect of creating them as global indexes. To instruct the DAC to create these indexes as local indexes, you now need to create an index action to override the normal index creation process for them. The first step in this process is to define the index action; then you will associate it with the relevant indexes. To create the index action, select Tools > Seed Data > Actions > Index Action, name the index action, press Save and then enter the Value editor. This index action will execute for every index that we associate it with. It consists of an SQL statement that uses built-in DAC functions to return the name of the index in question, the table it belongs to and the list of columns that it indexes.
The SQL statement to be used is shown below, with getindexname(), gettablename() and getindexcolumns() being the built-in DAC functions:

create bitmap index getindexname()
on gettablename() ( getindexcolumns() )
local nologging parallel

When the action is used, the DAC will substitute the index name, table name and index columns into the SQL statement, and thereby create the index in question as a local index. Next, save the action and return to the main DAC console. Now you need to associate the index action with the bitmap indexes that need to be created as local indexes. To do this, navigate to the Indices tab in the Design view of the DAC, and query the repository to return just the bitmap indexes associated with the W_SALES_INVOICE_LINE_F table. When the list of indexes is displayed, right-click anywhere on the list and select Add Actions.
Using the Add Actions dialog, select Create Index as the Action Type, Both as the Load Type, and then select the index action that you created in the previous step as the Action.
Now you can associate the task action with the SIL DAC task that populates the W_SALES_INVOICE_LINE_F table, so that it drops and recreates the table using partitioning when it runs in full load mode. To do this, locate the SIL_SalesInvoiceLinesFact task using the Tasks tab in the Design view, and select the Actions tab when the task is displayed. Then, select Preceding Action as the Action Type, Full as the Load Type, and then select the task action that you created earlier. Now you can re-run your execution plan, which will include these index and task actions in the steps that are carried out. After the execution plan completes, you can check the list of steps carried out by the SIL_SalesInvoiceLinesFact task to see your task action being carried out.
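As a sketch of the result, the index action described above would cause the DAC to generate a statement along the following lines for each associated index; the index and column names here are illustrative, not taken from an actual repository:

```sql
-- Illustrative statement produced once the DAC substitutes
-- getindexname(), gettablename() and getindexcolumns():
create bitmap index W_SLS_INV_LN_F_F1
on W_SALES_INVOICE_LINE_F ( CHNL_TYPE_WID )
local nologging parallel
```

Because the LOCAL clause makes each bitmap index equipartitioned with the fact table, partition maintenance operations on the table do not leave the index globally unusable, which is the main reason for overriding the DAC's default global index creation.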
Using Materialized Views for Table Aggregation

The Oracle BI Applications use aggregate tables to improve the performance of queries that request aggregated data. These aggregate tables are then mapped into the Oracle Business Intelligence Enterprise Edition metadata layer, so that the BI Server can internally rewrite queries to use these aggregates. Post Load Processing (PLP) Informatica workflows load data into these aggregate tables, either as a complete refresh or incrementally, as part of the subject area load. The Enterprise Edition of the Oracle Database has similar functionality in the form of materialized views. These allow database administrators to define aggregates in the database, which are then used at query runtime to satisfy queries that require aggregated data. Materialized views can be fast refreshable and can be designed to satisfy a single aggregation or a range of aggregations, or can even be created using an OLAP analytic workspace to meet the aggregation needs of an entire star schema. Like partitioned tables and local indexes, the DAC does not contain out-of-the-box functionality to create and maintain materialized views. To add this functionality to your ETL process, you therefore need to add two new actions to the DAC repository:

1. An action to create the W_SALES_INVOICE_LINE_A object as a materialized view rather than a regular table, and to create the required materialized view logs to support fast refresh of this object.
2. An action to perform the refresh of the materialized view, which you will use in place of the regular PLP_SalesInvoiceLinesAggregate_Load and PLP_SalesInvoiceLinesAggregate_Load_Full Informatica workflows.

To see how the existing aggregate table is populated, and to extract the base SQL statement that you will need to create the materialized view, open the Informatica Designer application and locate the PLP_SalesInvoiceLinesAggregate_Load_Full mapping.
When you view the mapping logic, you will see that the W_SALES_INVOICE_LINE_F table is joined to the W_DAY_D table through a Source Qualifier mapping, which is then supplemented with a sequence number that is used to populate the ROW_WID column. Whilst we cannot reproduce the sequence functionality with a materialized view, you will be able to take the data from these two tables and use it to initially populate, and then refresh, the materialized view. As in the previous example, where you created a partitioned table, creation of the materialized view will be performed by a new task action that you will associate with the PLP_SalesInvoiceLinesAggregate_Load DAC task when run in Full mode. To create the action, select Tools > Seed Data > Actions > Task Action, and create and save a new task action. At the Value dialog, create individual steps to drop and recreate the required materialized view logs, then drop and recreate the materialized view, using the following SQL statements:

1. Drop Materialized View Log on W_SALES_INVOICE_LINE_F

drop materialized view log on W_SALES_INVOICE_LINE_F

2. Drop Materialized View Log on W_DAY_D

drop materialized view log on W_DAY_D
3. Create Materialized View Log on W_SALES_INVOICE_LINE_F

create materialized view log on W_SALES_INVOICE_LINE_F
with sequence, rowid
( sales_ordln_id, sales_pckln_id... discount_line_flg )
including new values

4. Create Materialized View Log on W_DAY_D

create materialized view log on W_DAY_D
with sequence, rowid
( row_wid, calendar_date... x_custom )
including new values

5. Drop Materialized View

drop materialized view W_SALES_INVOICE_LINE_A

6. Create Materialized View

create materialized view W_SALES_INVOICE_LINE_A
pctfree 0
build immediate
refresh fast
as
select 1 as row_wid,
       w_sales_invoice_line_f.chnl_type_wid,
       ...
from   w_sales_invoice_line_f,
       w_day_d
where  ...
group by ...

Note that with the materialized view definition script, you will need to populate the ROW_WID column with a constant, as it is not possible to create a materialized view that uses an Oracle sequence to populate a column. You will also need to include COUNT(*) and COUNT(column) columns for all aggregated columns in order for the materialized view to be fast refreshable. See the Oracle Data Warehousing Guide for full details on creating materialized views. Make sure that you mark all drop steps as Continue on Fail, so that the whole ETL process doesn't stop because the object did not exist in the first place, something that will happen when you first make use of the action.
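As an additional check, not part of the DAC actions themselves, you can ask the database whether the materialized view you have defined really is fast refreshable, using the DBMS_MVIEW.EXPLAIN_MVIEW procedure. The MV_CAPABILITIES_TABLE that it populates is created by running the utlxmv.sql script supplied with the database:

```sql
-- Populate MV_CAPABILITIES_TABLE with the capabilities of this MV
exec dbms_mview.explain_mview('W_SALES_INVOICE_LINE_A')

-- The REFRESH_FAST rows show whether fast refresh is possible,
-- and, if it is not, why not
select capability_name, possible, msgtxt
from   mv_capabilities_table
where  capability_name like 'REFRESH_FAST%';
```

If POSSIBLE shows N for a REFRESH_FAST capability, the MSGTXT column usually points at the cause, such as a missing COUNT(*) column or a column absent from one of the materialized view logs.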
Now that you have the task action in place to create the materialized view and its associated logs, you can create another task action to refresh the materialized view. To do this, select Tools > Seed Data > Actions > Task Action again, and this time create a new action to perform the refresh. Enter the following anonymous PL/SQL block into the SQL Statement text box to refresh the materialized view:

begin
  dbms_mview.refresh('OBAW.W_SALES_INVOICE_LINE_A');
end;

Remember to replace the name of the schema with the one appropriate for your database. In addition, create another task action called Dummy Refresh or similar, which you will associate with the PLP task when run in full mode; create a new step within it but do not enter any SQL text. This is required because running the task in full mode will create and refresh the materialized view automatically, but we need an action to associate with the task to make it valid. Once all of your task actions are created, including the ones used in the previous example, your list of task actions should look like this:
Next, locate the PLP_SalesInvoiceLinesAggregate_Load task in the DAC Design view and change the Execution Type to SQL File, then replace the Command for Incremental Load command with a call to the fast refresh task action created previously, and the Command for Full Load command with a call to the dummy action you created at the same time.
Then, switch to the Target Tables tab and uncheck the Truncate for Full Load checkbox; otherwise the DAC will automatically truncate the materialized view just after you have created and refreshed it, and subsequent fast refreshes will fail with an ORA error, because the truncation counts as a partition maintenance operation (PMOP).
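One way to check after a load that the materialized view has not been silently broken in this manner is to query its refresh state from the USER_MVIEWS data dictionary view:

```sql
-- LAST_REFRESH_TYPE should show FAST for incremental runs; a
-- STALENESS of UNUSABLE suggests the MV was truncated and will
-- need a complete refresh before fast refresh works again
select mview_name, last_refresh_type, last_refresh_date, staleness
from   user_mviews
where  mview_name = 'W_SALES_INVOICE_LINE_A';
```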
Finally, to get the DAC to create the materialized view for you when the task is first run, add a new Preceding Task action to the task to run the Create Materialized View task action you created previously. Be sure to drop the existing W_SALES_INVOICE_LINE_A aggregate table using SQL*Plus before you re-run your ETL, as trying to drop it whilst referring to it as a materialized view will cause an error and fail to drop the table. Delete and recreate the execution plan for your subject area, to pick up the changes to the PLP mapping. Once you have done this, you are ready to re-run your DAC execution plan, to assess what improvement to processing time these changes have made.

Quantifying the Improvements

When run against the author's installation of Oracle Business Intelligence Applications 7.9.5, using a subset of the Oracle E-Business Suite Vision dataset, the following timings were recorded using these scenarios:

1. Baseline run of the standard out-of-the-box ETL routines
2. Addition of the COMPRESS clause to the W_SALES_INVOICE_LINE_F table
3. Addition of partitioning to the W_SALES_INVOICE_LINE_F table, keeping compression
4. All of the above, plus replacement of the W_SALES_INVOICE_LINE_A table with a fast refresh materialized view

The results of these scenarios are shown in the table below.

Scenario                                                 Load Type    Elapsed  Time Improvement  Fact Table  Size Improvement
                                                                      Time     vs. Baseline      Size        vs. Baseline
1  Baseline                                              Full         secs     n/a               189MB       n/a
1  Baseline                                              Incremental  secs     n/a               189MB       n/a
2  With Compression                                      Full         secs     2%                43MB        77%
2  With Compression                                      Incremental  secs     3%                43MB        77%
3  With Partitioning & Compression                       Full         secs     7%                44MB        77%
3  With Partitioning & Compression                       Incremental  secs     6%                44MB        77%
4  With Partitioning, Compression and Materialized View  Full         secs     51%               44MB        77%
4  With Partitioning, Compression and Materialized View  Incremental  secs     33%               44MB        77%

Overall, using table compression on the main fact table reduced its storage requirement by 77%, from 189MB to 44MB. Using a fast refreshable materialized view, along with partitioning and compression, reduced the ETL time for the main fact table and its associated aggregate by 51% for a full load and 33% for an incremental load. In addition, queries against the partitioned version of the fact table that can benefit from partition elimination can experience significantly lower execution plan costs. The query and execution plan were first examined against the original, non-partitioned version of the W_SALES_INVOICE_LINE_F table.
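As an illustration of the kind of query involved, the following sketch assumes a hypothetical NET_DOC_AMT measure column; the filter on the partition key is what allows the optimizer to prune partitions once the table is range partitioned on COST_CENTER_WID:

```sql
-- With the table range partitioned on COST_CENTER_WID, this predicate
-- lets the optimizer prune to the partitions covering 1000-1999
-- rather than scanning every partition of the fact table
select cost_center_wid, sum(net_doc_amt)
from   w_sales_invoice_line_f
where  cost_center_wid between 1000 and 1999
group by cost_center_wid;
```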
When the table is subsequently partitioned, though, queries that benefit from partition elimination show a significant drop in their cost. In this example the table is partitioned on COST_CENTER_WID, whereas in reality it is more likely to be partitioned on a date column, so that queries that only require data for a particular range of months or years can avoid scanning the entire table.

Further Opportunities for Optimization

As well as the actions outlined in this paper, there are further opportunities for optimizing the ETL and query processing carried out by the Oracle BI Applications when working with the Enterprise Edition of the Oracle Database.
For example, the DAC does not create primary key or foreign key constraints on the tables that it creates, which, together with the default setting of the STAR_TRANSFORMATION_ENABLED parameter for newly created databases, in most cases means that star transformations are not used when the database handles queries against fact tables that involve filtering against two or more dimensions. You could therefore add additional task actions to SIL and PLP tasks to create and drop these constraints, possibly using the RELY NOVALIDATE clauses to minimize unnecessary redo, set the STAR_TRANSFORMATION_ENABLED parameter appropriately, and take advantage of this key Oracle data warehousing feature. Another optimization possibility is to use the OLAP Option to Oracle Database 11g to replace the materialized view outlined in this paper with a cube organized materialized view, which could provide aggregations for an entire star schema at multiple levels of aggregation. You would need to use Oracle Analytic Workspace Manager (a free download from the Oracle Technology Network) to create the cube organized materialized view, but once it is created it could be refreshed in the same manner as the materialized view that this paper describes.

Conclusions

The ETL and query optimization techniques provided out of the box with the Oracle BI Applications are appropriate for generic databases, but can be improved upon if you make use of the specific data warehouse optimizations available on your actual target database. The Enterprise Edition of the Oracle Database provides many such features, including segment compression, materialized views and partitioning, and this paper sets out how they can be used in conjunction with the new Actions feature of the Oracle Data Warehouse Administration Console.
About the Author

Mark Rittman is an Oracle ACE Director and co-founder of Rittman Mead Consulting, a specialist Oracle partner delivering Oracle data warehousing, business intelligence and performance management solutions. Mark is co-chair of the ODTUG BI&DW SIG, is editor of Oracle Scene, the magazine of the UK Oracle User Group, writes regularly for Oracle Magazine, Oracle Technology Network and the ODTUG Technical Journal, and runs a regularly-updated blog. Mark can be contacted if you would like to discuss the contents of this white paper.
HP Vertica Analytic Database Software Version: 7.0.x Document Release Date: 2/20/2015 Legal Notices Warranty The only warranties for HP products and services are set forth in the express warranty statements
Yes-M Systems offers the unique opportunity to aspiring fresher s and experienced professionals to get real time experience in ETL Data warehouse tool IBM DataStage. Course Description With this training
Moving the Web Security Log Database Topic 50530 Web Security Solutions Version 7.7.x, 7.8.x Updated 22-Oct-2013 Version 7.8 introduces support for the Web Security Log Database on Microsoft SQL Server
CUBE ORGANIZED MATERIALIZED VIEWS, DO THEY DELIVER? Peter Scott, Rittman Mead Consulting OVERVIEW The recent 11g release of the Oracle relational database included many new or enhanced features that may
Paper BB-01 Lost in Space? Methodology for a Guided Drill-Through Analysis Out of the Wormhole ABSTRACT Stephen Overton, Overton Technologies, LLC, Raleigh, NC Business information can be consumed many
An Oracle White Paper October 2013 Oracle Data Miner (Extension of SQL Developer 4.0) Generate a PL/SQL script for workflow deployment Denny Wong Oracle Data Mining Technologies 10 Van de Graff Drive Burlington,
Informatica Data Replication 9.1.1 FAQs 2012 Informatica Corporation. No part of this document may be reproduced or transmitted in any form, by any means (electronic, photocopying, recording or otherwise)
Understanding Oracle BI Applications Oracle BI Applications are a complete, end-to-end BI environment covering the Oracle BI EE platform and the prepackaged analytic applications. The Oracle BI Applications
Oracle9i Data Warehouse Review Robert F. Edwards Dulcian, Inc. Agenda Oracle9i Server OLAP Server Analytical SQL Data Mining ETL Warehouse Builder 3i Oracle 9i Server Overview 9i Server = Data Warehouse
Oracle OLAP Data Validation Plug-in for Analytic Workspace Manager User s Guide E18663-01 January 2011 Data Validation Plug-in for Analytic Workspace Manager provides tests to quickly find conditions in
StreamServe Persuasion SP5 Oracle Database Database Guidelines Rev A StreamServe Persuasion SP5 Oracle Database Database Guidelines Rev A 2001-2011 STREAMSERVE, INC. ALL RIGHTS RESERVED United States patent
Jet Data Manager 2012 User Guide Welcome This documentation provides descriptions of the concepts and features of the Jet Data Manager and how to use with them. With the Jet Data Manager you can transform
A Unit of Sequelgate Innovative Technologies Pvt. Ltd. ISO Certified Training Institute Microsoft Certified Partner SQL Server Analysis Services Complete Practical & Real-time Training Mode: Practical,
Oracle EXAM - 1Z0-117 Oracle Database 11g Release 2: SQL Tuning Buy Full Product http://www.examskey.com/1z0-117.html Examskey Oracle 1Z0-117 exam demo product is here for you to test the quality of the
July 21, 2011 Lee Anne Spencer Founder & CEO Global View Analytics Cheryl McCormick Chief Architect Global View Analytics Agenda Introduction Oracle Data Integrator ODI Components Best Practices Implementation
COURSE CODE: COURSE TITLE: CURRENCY: AUDIENCE: ORAACF Oracle Architecture, Concepts & Facilities 10g & 11g Database administrators, system administrators and developers PREREQUISITES: At least 1 year of
Course Outline: Course: Implementing a Data with Microsoft SQL Server 2012 Learning Method: Instructor-led Classroom Learning Duration: 5.00 Day(s)/ 40 hrs Overview: This 5-day instructor-led course describes
Implementing a Data Warehouse with Microsoft SQL Server 2012 MOC 10777 Course Outline Module 1: Introduction to Data Warehousing This module provides an introduction to the key components of a data warehousing
Data warehousing in Oracle Materialized views and SQL extensions to analyze data in Oracle data warehouses SQL extensions for data warehouse analysis Available OLAP functions Computation windows window
IBM Smart Analytics System Best Practices Using IBM InfoSphere Optim High Performance Unload as part of a Recovery Strategy Garrett Fitzsimons IBM Data Warehouse Best Practices Specialist Konrad Emanowicz
Exadata in the Retail Sector Jon Mead Managing Director - Rittman Mead Consulting Agenda Introduction Business Problem Approach Design Considerations Observations Wins Summary Q&A What it is not... Introductions
Monitor and Manage Your MicroStrategy BI Environment Using Enterprise Manager and Health Center Presented by: Dennis Liao Sales Engineer Zach Rea Sales Engineer January 27 th, 2015 Session 4 This Session
Analytics: Pharma Analytics (Siebel 7.8) Student Guide D44606GC11 Edition 1.1 March 2008 D54241 Copyright 2008, Oracle. All rights reserved. Disclaimer This document contains proprietary information and
Oracle Data Integrator 11g New Features & OBIEE Integration Presented by: Arun K. Chaturvedi Business Intelligence Consultant/Architect Agenda 01. Overview & The Architecture 02. New Features Productivity,
QAD Business Intelligence Data Warehouse Demonstration Guide May 2015 BI 3.11 Overview This demonstration focuses on the foundation of QAD Business Intelligence the Data Warehouse and shows how this functionality
Course 6234A: Implementing and Maintaining Microsoft SQL Server 2008 Integration Services Length: 3 Days Language(s): English Audience(s): IT Professionals Level: 200 Technology: Microsoft SQL Server 2008
s@lm@n Oracle Exam 1z0-591 Oracle Business Intelligence Foundation Suite 11g Essentials Version: 6.6 [ Total Questions: 120 ] Question No : 1 A customer would like to create a change and a % Change for
Exploring Oracle BI Apps: How it Works and What I Get NZOUG March 2013 Copyright This document is the property of James & Monroe Pty Ltd. Distribution of this document is limited to authorised personnel.
Oracle Database Concepts Database Structure The database has logical structures and physical structures. Because the physical and logical structures are separate, the physical storage of data can be managed
SAP BusinessObjects Business Intelligence platform Document Version: 4.1 Support Package 5-2014-11-06 Table of Contents 1 What's new in the....14 2 Getting started with the information design tool....18
ECRIC news from Tom Bacon about Monday's lecture I won't be at the lecture on Monday due to the work swamp. The plan is still to try and get into the data centre in two weeks time and do the next migration,
SAP Data Services Document Version: 4.2 Support Package 6 (220.127.116.11) 2015-11-20 PUBLIC Content 1 Welcome to SAP Data Services....6 1.1 Welcome.... 6 1.2 Documentation set for SAP Data Services....6 1.3
Implementing a Data Warehouse with Microsoft SQL Server 2012 Module 1: Introduction to Data Warehousing Describe data warehouse concepts and architecture considerations Considerations for a Data Warehouse
Performance Tuning Guidelines for PowerExchange for Microsoft Dynamics CRM 1993-2016 Informatica LLC. No part of this document may be reproduced or transmitted in any form, by any means (electronic, photocopying,
Using Oracle Data Integrator with Essbase, Planning and the Rest of the Oracle EPM Products Edward Roske firstname.lastname@example.org BLOG: LookSmarter.blogspot.com WEBSITE: www.interrel.com TWITTER: ERoske 2 4
Report and Dashboard Template 9.5.1 User Guide Introduction The Informatica Data Quality Reporting and Dashboard Template for Informatica Data Quality 9.5.1, is designed to provide you a framework to capture
Extensibility of Oracle BI Applications The Value of Oracle s BI Analytic Applications with Non-ERP Sources A White Paper by Guident Written - April 2009 Revised - February 2010 Guident Technologies, Inc.