Reporting MDM Data Attribute Inconsistencies for the Enterprise Using DataFlux


Ernesto Roco, Hyundai Capital America (HCA), Irvine, CA

ABSTRACT

The purpose of this paper is to demonstrate how we use the DataFlux Data Management Studio platform to provide our organization with a tool that identifies, at the entity record and attribute level, inconsistencies between the MDM (Master Data Manager) master data and the same entity record in other contributing source systems. For this conference, I want to share several key aspects of our design. With it, we made a weekly data inconsistency report possible for millions of MDM cluster records within an enterprise MDM implementation, without significant overhead or development effort. First, we believe that our design, with its metadata-driven approach, can serve as a template for other organizations looking to implement the same capability for their enterprise MDM. Second, by using parallel processing nodes to maximize the utilization of available processing power, we have significantly reduced overall processing time. Lastly, the design can output in different formats, which contributes to the overall usability of the tool. With the report output we have been able to quickly identify areas where data inconsistencies exist and implement processes to minimize future occurrences.

INTRODUCTION

As you may already know, one of the main functions of an MDM solution is to provide the single best version of an entity record that is consistent and shared across the entire enterprise. At HCA, we believe that achieving a very high level of data consistency across all source systems is a high-priority task in our efforts to maximize the ROI of our MDM solution. As the MDM administrator for our organization, one of the ways I support this task is by providing the enterprise a tool that reports back to the business the current state of MDM master data inconsistency across all source systems. Data Stewards can then use this report to take the appropriate action to resolve data inconsistencies. By working together with Data Stewards and infrastructure teams, we can identify any needed improvements within the system in order to minimize the recurrence of data inconsistencies. This paper assumes that the reader has a basic understanding of MDM principles and their purpose for the enterprise, along with some familiarity with the DataFlux Data Management Studio development platform.

THE DATA INCONSISTENCY PROBLEM

A data inconsistency exists for an entity record when one or more attributes of a master record do not match the same entity record and/or attribute in another source system.

SOURCE | FIRST_NM | LAST_NM | EMAIL          | PHONE | SSN | ADDR    | CITY   | ST | ZIP
MDM    | JOHN     | MILLER  | jm@yahoo.com   | ...   | ... | BAY DR. | HEMET  | CA | ...
CRM    | ED       | MILLER  | jm@hotmail.com | ...   | ... | BAY DR. | HEMET  | CA | ...
LEASE  | JOHN     | MILLER  | jm@yahoo.com   | ...   | ... | BAY DR. | HEMET  | CA | ...
RETAIL | JOHN     | MILLER  | jm@yahoo.com   | ...   | ... | ELM ST. | IRVINE | CA | ...

Table 1. A typical data inconsistency example for an HCA customer (Note: All data used in this presentation is for example purposes only and does not reflect actual HCA customer information)

In the example above, we have a customer that exists across all three of our source systems (CRM, LEASE, and RETAIL). The first name and email attributes from CRM are inconsistent with the MDM values, and the same customer also has inconsistent phone and address values in our RETAIL system. Inconsistencies are highlighted in red.

With almost 4 million master data records, each having 65 attributes, we need an up-to-date and reliable way to identify each instance where a data inconsistency exists and to minimize problems going forward.

Admittedly, having data inconsistency problems is the reality of having an MDM solution, particularly for hybrid MDM implementations with complex real-time integration services and multiple source systems. Even with the best intentions and procedures in place, network, hardware, code errors, and other system failures can occur, causing outages and service interruptions. Because of these outages, transactions may be processed successfully in the source systems and yet never reach MDM. As a result, records are not mastered and/or are not propagated back to the source systems. In addition, updates to entity records outside of the normal accepted processes can and will occur, bypassing MDM processes and creating another source of data inconsistency.

The challenge is to keep the number of data inconsistencies as low as possible while taking both proactive and reactive action. This is an ongoing data maintenance task that requires continual effort across multifunctional teams: technical MDM support, infrastructure, source system administrators, and Data Governance. If left unchecked, these inconsistencies remain in an incorrect state, which has a negative impact on our customer service, billing, correspondence, collections, and, more importantly, state and federal compliance and regulatory functions.

MAIN CHALLENGES

Here are some of the main challenges in developing an attribute inconsistency tool:

1. Programming/Coding - Comparing 65 attributes across multiple source systems can be a very tedious and cumbersome task. Typing up hundreds of these attribute comparisons also increases the likelihood of developers introducing a coding error into the tool.
2. Maintenance - Hard-coded comparison logic is problematic to maintain when attribute changes occur between MDM and/or any of the three source systems, which they occasionally do.
3. Processing Limitations - Comparing 65 attributes across 3 different source systems with 4 million records could take several days to process rather than hours. Our requirement is to have the processing completed within 24 hours.

OUR SOLUTION OVERVIEW

Display 1. Main Process Job for Attribute Inconsistency Report

1. Staging - The de-normalized data from the source systems is extracted and then staged into the MDM staging area.
2. Parallel Compare - Parallel processing is used to compare attributes from all source systems against MDM attributes, improving performance and greatly reducing total processing time.

3. Attribute Exceptions - This process inserts and formats the result of all the comparisons, moving from a record level to an attribute level format. This greatly enhances our ability to quickly spot problem areas at an attribute level, much like a spreadsheet pivot table function would, but with additional features specific to our particular needs.

The focus of this paper will be on items 2 and 3 only, since this is where our solution can be used as a template for other enterprises to follow.

METADATA APPROACH

DataFlux provides four out-of-the-box Expression Engine Language (EEL) scripting functions that we used to implement this approach: for(), fieldcount(), fieldname(), and fieldvalue(). Used in combination, these four functions give us the ability to loop through each column in the data stream while bypassing the need to hard-code each column comparison. This metadata approach solves the first two main challenges identified above. Please note that each record in the data stream is a join between a single source system entity record and an MDM master record, based on our specific MDM clustering criteria and the source system record's unique identifier. With 65 attributes for the MDM master record and an additional 65 of the same attributes coming from the same entity record in the source system, we have a total of approximately 130 columns in the data stream.

Display 2. Metadata Approach in Action (step 1)

Step 1: (Loop through each master record attribute)

In the example above, we demonstrate how we utilized a for function to loop as many times as there are columns in the data stream, a count returned by one of the aforementioned functions, fieldcount. The fieldname function, which returns the column name, is then used to filter out certain column names that are not required for our attribute comparison. The fieldvalue function is then used to extract the master record attribute value of the column for comparison against the same source system record attribute value. This first step populates a variable we named gr_value with the value of the master record attribute we are comparing against the same attribute of a source record. This step also populates another variable, named gr_name, with the master record column name. For this example, we will use CUSTOMER_FIRST_NAME_GR as the column name of the master record attribute; the same attribute from the source record appears in the data stream as the column CUSTOMER_FIRST_NAME, which is essentially the same column name minus the trailing characters _GR. This was done so we can programmatically differentiate between the master record and source record versions of the same column. In doing so, we can then compare the two as a single attribute of the master entity record.

Step 2: (For this example, loop until the source column CUSTOMER_FIRST_NAME is found)

Display 3. Metadata Approach in Action (step 2)

In step 2, we are in a sub loop from step 1, where for the purposes of this example we loop through each column until we match the column name CUSTOMER_FIRST_NAME. Once we get a match on the column name, we use the function fieldvalue again to extract the value of the column just before comparison, populating the variable non_gr_value. The comparison to determine whether an inconsistency exists is now simply the condition gr_value = non_gr_value. If the values match, the actual field value for CUSTOMER_FIRST_NAME is replaced with the string <match>; if they do not, the inconsistent attribute value is left intact and the user variable errors is incremented by 1. This variable is initialized to 0 at the beginning of each record's processing in the data stream. The errors variable is used by the program to determine whether the current data stream record (again, a single entity record joining the MDM master entity with the same record from a source system) contains any attribute inconsistencies. If errors > 0, the record is inserted into the inconsistencies table; if not, processing moves on to the next record.

As you can see, by using the metadata approach described above, we are able to perform every single attribute comparison in the data stream without having to hard-code a single column-to-column comparison. This has been an instrumental DataFlux feature for us in meeting our business and technical requirements for this tool, while addressing the first two main challenges listed above.
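To make the looping construct concrete, here is a minimal sketch of the same logic in Python. The production implementation lives in a DataFlux Expression node built on for(), fieldcount(), fieldname(), and fieldvalue(); the dictionary-based record, the skip list, and the helper name below are illustrative assumptions, not the actual EEL code.

```python
MATCH_MARKER = "<match>"
SKIP_COLUMNS = {"MD_CLUSTER_ID", "SRC_REC_ID"}  # bookkeeping columns (illustrative)

def compare_record(record):
    """Compare every *_GR (master) column against its source-system twin.

    Returns the annotated record and the number of inconsistencies found.
    """
    errors = 0  # initialized to 0 for each record, as in the paper
    for gr_name in list(record):
        if not gr_name.endswith("_GR") or gr_name in SKIP_COLUMNS:
            continue  # only master-record columns drive the comparison
        src_name = gr_name[:-3]  # CUSTOMER_FIRST_NAME_GR -> CUSTOMER_FIRST_NAME
        if src_name not in record:
            continue  # column was filtered out of the comparison
        if record[gr_name] == record[src_name]:
            record[src_name] = MATCH_MARKER  # matching values are masked out
        else:
            errors += 1  # the inconsistent source value is left intact

    return record, errors

# Only records with at least one inconsistency reach the exceptions table.
row = {"CUSTOMER_FIRST_NAME_GR": "JOHN", "CUSTOMER_FIRST_NAME": "ED"}
annotated, errors = compare_record(row)
if errors > 0:
    print("insert into record-level exceptions table:", annotated)
```

The key design point is that nothing in the loop names a specific attribute; adding or removing an attribute in the data stream requires no code change, which is what solves the first two challenges.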

PARALLEL PROCESSING

Display 4. DataFlux parallel processing node in action

Another out-of-the-box DataFlux tool is the Fork node. To meet our processing time requirements, we had to find a way to utilize more of the available processing power and reduce the overall processing time. By using the Fork node, we were able to split the work into nine separate parallel processes, enabling us to complete the processing in a matter of hours rather than days. In the future, we may re-evaluate the number of processes, depending on the amount of data we have to process, which grows larger every day.

Display 5. DataFlux parallel processing node splits into 9 parallel processes above

Drilling down on the Parallel_Compare fork node (Display 4) takes us to what you see in Display 5. This sub process contains the nine processes that we developed to execute attribute comparisons in parallel across our three source systems. At this time, our best available option for splitting the data into nine similarly sized batches is to use the first digit of the customer ID (a numeric, non-sequential string whose first digit ranges from 0-9). Statistically, this gives us a very reliable way of building batches without losing any records, while maintaining roughly the same distribution ratio of records today and into the future. A sketch of the batching idea follows.
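The sketch below illustrates the nine-way split by leading digit and the per-node filter it produces. The staging table and column names are hypothetical; in the actual job, each node receives its digit list through the %%CUST_ID%% macro parameter described in the next section.

```python
# A sketch of splitting ten possible leading digits across nine parallel nodes.
# STG_SOURCE_ENTITY and CUST_ID are assumed names, not the production schema.

DIGITS = list("0123456789")

# Nine batches from ten digits: the first batch carries two digits, the
# remaining eight batches carry one digit each.
batches = [DIGITS[:2]] + [[d] for d in DIGITS[2:]]

for node, batch in enumerate(batches, start=1):
    in_list = ", ".join(f"'{d}'" for d in batch)
    sql = (f"SELECT * FROM STG_SOURCE_ENTITY "
           f"WHERE SUBSTR(CUST_ID, 1, 1) IN ({in_list})")
    print(f"parallel node {node}: {sql}")
```

In practice the digit-to-node assignment would be tuned to each source system's volume, as noted below for CRM.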

As you can also see from our screen capture above, our three source systems have different record volumes to process. Our CRM source system has significantly more records, thus requiring more parallel processes to complete in the same amount of time as the other two source systems. Another out-of-the-box node that we used to facilitate parallel processing is the Global Get/Set node, which is used as a placeholder to pass the hard-coded first digit of the Customer ID to the SQL used to retrieve the source entity records from the staging tables. Each process operating in parallel uses a custom SQL query to retrieve its records from the source system, as shown in the example below:

Display 6. Sample SQL used to retrieve source entity records from the staging area

Each of the nine nodes running in parallel uses a SQL query similar to the one shown in Display 6 to extract its data from the staging tables. As demonstrated above, the hard-coded first digit of the Customer ID is passed as a parameter value via %%CUST_ID%% inside an IN clause. The value contained in the parameter CUST_ID comes from the Get/Set node, giving us as close to an evenly sized distribution of records as possible. By using the Fork node to process our comparisons in parallel across our three source systems, we successfully meet the time requirements for completing our attribute inconsistency report every week. This functionality addresses the third main challenge identified above.

There are many ways an organization can get creative in choosing how to implement its own parallel processing strategy. The goal should be to keep the distribution of records across parallel nodes as even as possible, while meeting the processing time requirements and remaining scalable for the future. As a cautionary step, an enterprise preparing to implement a similar parallel processing approach should perform benchmarking tests to determine the optimum number of processes its MDM infrastructure can support. At some point there may be too many parallel processes running at one time, which can overwhelm the servers and cause diminishing returns.

There are further options for exploiting parallel processing here at HCA. With our two-server DM Server configuration, we have the option to double the number of processes running at one time from nine to eighteen, simply by utilizing both servers to process the data. It is good to know that there is still room for growth as we scale up to meet our business requirements now and into the future.

REPORTING OUTPUT

At HCA, we report inconsistencies found for each source entity record in two stages: first at the record level, and then, as if drilling down from that record, at the attribute level. Immediately after each comparison between a source entity record and an MDM master data record in which an inconsistency is found for one or more attributes, we insert the entire data stream contents into a record exception table, as shown in the example below.

SOURCE | FIRST_NM_MD | FIRST_NM | LAST_NM_MD | LAST_NM | EMAIL_MD        | EMAIL             | PHONE_MD | PHONE | EXC_ID
CRM    | MARY        | MARY     | JANE       | SMITH   | mjane@abc.com   | msmith@abc.com    | ...      | ...   | 111
CRM    | BOB         | ROBERT   | DOE        | DOE     | ...             | ...               | ...      | ...   | ...
RETAIL | JOE         | JOE      | KERR       | KERR    | jkerr@yahoo.com | joker@hotmail.com | ...      | ...   | ...

Table 2. Record level exceptions table example

Record Level Exceptions Table Details

Much like the data stream naming convention, the column naming convention we used for the record level exceptions table gives the attribute name from the MDM record a trailing _MD, to denote that the value comes from the master data, while the source column of the same attribute has no suffix. For example, FIRST_NM_MD is the MDM attribute name, while FIRST_NM without the _MD is the source entity value of the same attribute. Also included in the record level exceptions table are the EXC_ID (Exception ID) column, which is a unique key for each record in the table; SRC_REC_ID (not shown in the example), which is the primary key of the entity record in the source system; and a flag column, X_PASSED, which is the status of the row with the following four possible values:

1. Missing Account Cluster Record - an account record exists in the source system but not in MDM
2. Missing Customer Cluster Record - a customer record exists in the source system but not in MDM
3. Missing both Account and Customer Cluster records - both records are NOT in MDM
4. Inconsistent attribute exists - at least one attribute of the master data is inconsistent with the source system

If none of the above conditions apply for the current record, then no insert into the exceptions table takes place, as no data issue was found for that record.

Record Level Attribute Inconsistency Report

In the first example record above, you will see that HCA customer MARY JANE may have had a name change submitted in CRM, where customer information such as the email and phone was also updated along with the name columns. For some reason, this successful update from the CRM source system never reached MDM and thus was also never propagated back to the other two source systems. The result is a customer record that is updated only in CRM, while MDM and the two other source systems now hold outdated customer information. In the second example, we have a CRM entity record where only the MDM FIRST_NM value is inconsistent and everything else is a match. In the last example, the RETAIL source system email attribute is the only inconsistency with MDM. This could be an instance where the update succeeded in CRM and reached MDM; however, the update did not propagate back to the other source systems from MDM as it was supposed to. This is also a typical scenario for us and does occasionally happen due to various network issues or data anomalies.

We have also used this reporting table to obtain various statistics, such as the number of records that have inconsistent attributes with MDM versus the entire data set, or counts for any of the four possible outcomes listed above. We routinely deliver to the business the top 10 inconsistent attributes by source system to highlight any emerging patterns. As you can see, we can also tell when an entity record exists in the source system but has no MDM master record. These capabilities bring additional value by helping identify data issues where MDM is missing entire master records.
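The four-way X_PASSED status assignment described above can be restated compactly as a decision function. The sketch below is an illustrative Python restatement; the boolean inputs are assumed to come from the cluster-record joins and are not the production logic.

```python
from typing import Optional

def x_passed_status(has_account_cluster: bool,
                    has_customer_cluster: bool,
                    errors: int) -> Optional[str]:
    """Return the X_PASSED status for a source record, or None when clean."""
    if not has_account_cluster and not has_customer_cluster:
        return "Missing both Account and Customer Cluster records"
    if not has_account_cluster:
        return "Missing Account Cluster Record"
    if not has_customer_cluster:
        return "Missing Customer Cluster Record"
    if errors > 0:
        return "Inconsistent attribute exists"
    return None  # clean record: nothing is inserted into the exceptions table
```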
EXC_ID | MD_CLUSTER_ID | ATTR_NAME  | CRM_VALUE       | RETAIL_VALUE      | LEASE_VALUE   | MDM_VALUE       | SRC_REC_ID
111    | ...           | LAST_NAME  | SMITH           | JANE              | JANE          | JANE            | ...
111    | ...           | EMAIL      | msmith@abc.com  | mjane@abc.com     | mjane@abc.com | mjane@abc.com   | ...
111    | ...           | PHONE      | ...             | ...               | ...           | ...             | ...
...    | ...           | FIRST_NAME | ROBERT          | <null>            | BOB           | BOB             | ...
...    | ...           | EMAIL      | jkerr@yahoo.com | joker@hotmail.com | <null>        | jkerr@yahoo.com | 3763

Table 3. Attribute level exceptions table example
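To make the fan-out from Table 2 to Table 3 concrete, here is a simplified sketch of the record level to attribute level pivot. The dictionary layout and value handling are assumptions; in the real job this is done with standard DataFlux nodes, and the RETAIL and LEASE values are fetched by the additional query described below.

```python
record_exception = {            # one row of the record level exceptions table
    "EXC_ID": 111,
    "SOURCE": "CRM",
    "LAST_NAME_MD": "JANE", "LAST_NAME": "SMITH",
    "EMAIL_MD": "mjane@abc.com", "EMAIL": "msmith@abc.com",
}

def pivot(row):
    """Emit one attribute level row per inconsistent attribute."""
    for md_col in [c for c in row if c.endswith("_MD")]:
        src_col = md_col[:-3]             # LAST_NAME_MD -> LAST_NAME
        if row[src_col] != row[md_col]:   # matched values were masked earlier
            yield {
                "EXC_ID": row["EXC_ID"],
                "ATTR_NAME": src_col,
                f"{row['SOURCE']}_VALUE": row[src_col],
                "MDM_VALUE": row[md_col],
                # RETAIL_VALUE / LEASE_VALUE come from the follow-up query
            }

for attr_row in pivot(record_exception):
    print(attr_row)
```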

Attribute Level Exceptions Table Details

Included in the attribute level report are columns showing the entity attribute value from each source system alongside the MDM master data value. Since each record level exceptions table record is only a direct comparison between a single MDM value and a single source entity value, an additional query must be run for each record during attribute level exception reporting, in order to fetch the attribute values from the other one or two source systems. The record level table primary key (EXC_ID) is also included, so we can link each attribute level record back to the record level exceptions table.

Attribute Level Inconsistency Report

Immediately after the record level exceptions table is populated, the next step processes its contents, breaking them down into an attribute level reporting output format. We use standard DataFlux functionality to accomplish this task, which at this time does not utilize parallel processing as the attribute comparisons do. However, this may become a consideration in the near future, as the number of master records continues to grow by almost 50% per year.

Table 3 drills down from the record level exceptions table to the attribute level exceptions table. Table 2 contains three distinct record level sample exceptions, which break down into five distinct records in the attribute level exceptions table. Each record in the attribute exceptions table represents a single attribute from a record level exception. For example, record level exception EXC_ID = 111 contains three CRM attributes inconsistent with MDM: LAST_NAME, EMAIL, and PHONE. Each of these becomes its own record in the attribute level exceptions output. Each record in the attribute level report carries the CRM, RETAIL, LEASE, and MDM values for the inconsistent attribute. This format greatly enhances our Data Stewards' ability to quickly identify which source system(s) are inconsistent with MDM.

Main Steps for Attribute Level Exception Reporting

1. Use a SQL input node to extract data from the record level exceptions table, as shown below.

Display 7. Input SQL node to extract data from record exceptions table

A simple SQL query is used to extract each record for attribute level processing.

2. Using an Expression node, extract attribute values by source system and MDM value.

Display 8. Populate variables

The Expression node script on the left is used to populate three variables, preparing the data from MDM and the source systems for each attribute. Note that the crm_value variable on the left is still null; it will be populated later in step 3.

3. Use the Expression node again to run a dynamic SQL statement that extracts the CRM attribute value and populates the variable crm_value. Since the attribute name is held in a variable, a dynamic SQL statement is used, as shown in Display 9.

Display 9. Extract CRM value of the attribute using dynamic SQL

4. Use Clustering and Survivorship nodes to guarantee the uniqueness of records inserted into the attribute level reporting table.

Display 10. Clustering and Survivorship nodes ensure only unique records are inserted

The combination of the attribute ID and the master data cluster ID should be unique prior to inserting the data stream into the attribute level reporting table. A duplicate insert is quite possible when an attribute is inconsistent in more than one source system. Since we already report the attribute value for all source systems along with the MDM value, there is no need for more than a single occurrence of the attribute ID and master data cluster ID combination in the table.

5. Insert records into the attribute level exceptions table using a Data Insert output node.

CONCLUSION

DataFlux Data Management Studio has given us the necessary tools to implement a very reliable and accurate attribute inconsistency reporting tool, though it was up to us to come up with creative ways to harness the power of the tool and overcome our technical challenges. While the techniques demonstrated here were built to meet our own business requirements, they can easily be retrofitted to other MDM solutions as well. Additionally, the MDM solution does not have to be DataFlux qMDM for these techniques to work. Implementing MDM was a significant expense and undertaking for our organization, and quite possibly for others as well. As the MDM administrator, I am committed to doing whatever I can to get the most value from our investment. This is just one way we as MDM administrators have contributed toward that goal.

CONTACT INFORMATION

Your comments and questions are valued and encouraged. You may contact the author at:

Name: Ernesto Roco
Enterprise: Hyundai Capital America
Address: 3161 Michelson Drive, Suite 1900
City, State, ZIP: Irvine, CA
Work Phone:
Fax:
E-mail: eroco@hcamerica.com
Web:

SAS and all other SAS Institute Inc. product or service names are registered trademarks or trademarks of SAS Institute Inc. in the USA and other countries. ® indicates USA registration. Other brand and product names are trademarks of their respective companies.
