Process-Driven and Context-Powered Data Quality Methodology


A MIOsoft White Paper
MIOvantage


TABLE OF CONTENTS

CONTEXT DATA QUALITY PROVIDES DRAMATIC COST REDUCTION
BUSINESS BENEFITS OF CONTEXT
BUSINESS PROCESSES DRIVE EFFECTIVE DATA QUALITY
KEY POINTS
PROCESSES DETERMINE EXPECTATIONS
CONTEXT ALLOWS COMPLEX DATA QUALITY MEASURES
INTEGRATION CHECKPOINTS
FORMALIZATION
EXAMPLE
CONTEXT CLUSTERING
FORMALIZATION
EXAMPLE
EXPECTED VALUES
FORMALIZATION
EXAMPLE
COMPARISON OF ACTUAL AND EXPECTED DATA
CATEGORIZATION OF FINDINGS
FORMALIZATION
ROOT CAUSE ANALYSIS
BIG DATA QUALITY REQUIRES CONTEXT

MIOsoft Corporation. All Rights Reserved.

CONTEXT DATA QUALITY PROVIDES DRAMATIC COST REDUCTION

Companies know that business success requires high-quality data. MIOsoft's data quality assurance method uses context to leverage the insights of a process's information by clustering all related data. Context provides a direct way to discover the root causes of issues and perform real-time transformation, directly reducing data-quality-related support and cleansing costs.

Business Benefits of Context

- Reduces support costs and increases customer satisfaction by preventing problems proactively, in many cases before they are encountered by the customer.
- Finds all problems in the process, even unanticipated ones.
- Reduces analysis and cleansing costs by finding the root causes of data quality problems.
- Monitors and controls side effects of data cleansing.
- Offers real-time KPIs, through continuous data quality analysis, for monitoring and analyzing customer behavior.

BUSINESS PROCESSES DRIVE EFFECTIVE DATA QUALITY

"Fit for use" is the most common definition of data quality. In this paper, we discuss a methodology that allows us to define the actual use and value of our data, then measure whether the data is fit for this use.

First, the context is chosen according to the process to be analyzed: for example, an order or a customer. We define checkpoints along the process's data flow, especially at critical points of the process and after data transactions. Checkpoints can be between, across, and within the systems that belong to the process. At all checkpoints, we can leverage context to quickly determine the expected data values and compare them to the actual data values found in or between the systems.

The methodology proceeds in four steps:

1. Choice of processes to be analyzed (defines the context)
2. Definition of checkpoints and captures in the process data flow
3. Computation of expected data for all checkpoints and captures
4. Comparison to actual data

Key Points

- Traditional approaches to data quality only discover and treat the symptoms of underlying problems. Finding inaccurate data across or between systems using the process helps identify the root cause of the problem so that it can be eliminated.
- Context allows the data to be easily understood with respect to differing systems and processes.
- Appropriate context technology for data quality provides three main features: checkpoints that can be set throughout the process when data is at rest, captures that can be set between checkpoints to record data in motion, and advanced relationship discovery techniques to combine all related data in one context.

PROCESSES DETERMINE EXPECTATIONS

We use the process to compute the expected data so that we can later measure whether or not the actual data meets these criteria. Computing the expected values is critical, since this governs the output. Although this may seem straightforward, in reality companies often do not know their expected data values in detail. By analyzing a company's implemented business processes, we can understand the data's purpose, expected use, and expected values.

It is important to be aware that in many companies, especially large enterprises, some processes do not work as described in their documentation. Workarounds are frequently introduced and are usually not accurately or fully documented.
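The four steps listed above can be sketched as a toy pipeline. This is an illustrative sketch, not MIOsoft's implementation; all function names, checkpoints, and data values are invented:

```python
# Toy sketch of the four methodology steps; names and data are illustrative.

def compute_expected(master, rules):
    # Step 3: business rules transform master data into expected checkpoint values.
    return {checkpoint: rule(master) for checkpoint, rule in rules.items()}

def compare(expected, actual):
    # Step 4: a checkpoint is OK iff its actual value equals the expected value.
    return {cp: actual.get(cp) == value for cp, value in expected.items()}

# Step 1: the chosen process (an order) defines the context.
# Step 2: checkpoints sit at the billing and rating systems.
rules = {
    "billing": lambda m: m["price"],
    "rating": lambda m: m["rating_key"],
}
master = {"price": 12.99, "rating_key": "R1"}
actual = {"billing": 12.99}  # the rating system never received its data

result = compare(compute_expected(master, rules), actual)
# result == {"billing": True, "rating": False}
```

Even in this toy form, the comparison exposes the checkpoint where expected and actual data diverge, which is the starting point for root cause analysis.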

[Figure: MIOsoft's Context-Powered and Process-Driven Data Quality Methodology]

CONTEXT ALLOWS COMPLEX DATA QUALITY MEASURES

We use context to examine complex processes. A context is a group of related data. For example, a purchase context could consist of the purchased product's information, the customer's information, and payment information.

Using contexts in data quality has several advantages. Having all process-related data in one context makes finding the expected data and comparing it to the actual data simpler: all needed information is easily available. By using relationship discovery techniques, we can also find data that we did not expect to be related, or data that we did not expect to exist in the system. The process of forming contexts by collecting and visualizing all related data creates a data pool where potential data problems can be exposed. This allows us to quickly see data quality issues, even unanticipated ones.

To demonstrate how this works, we will discuss a revenue assurance project we implemented for one of the world's largest telecommunications companies.

[Figure: Abstracted model of the example business process. Depending on which product is ordered, data flows from the CRM and order system to the Billing System Internet Legacy (products with an Internet part), to the Rating system (products that need to be rated), and to the Billing System Line or Billing System Line Legacy (products with a line part, depending on whether the customer is migrated).]

The basis of any revenue assurance program is a single question: did we bill an order correctly? Managers require a quick answer to this question, but the complexity of the answer can vary depending on the complexity of the business processes. An abstracted model of our example business process is shown above.

INTEGRATION CHECKPOINTS

We define checkpoints at critical data locations along the business process. Checkpoints are places where we check whether the data flow contains the correct data at a given time. To do

this, we unload one or more snapshots of the checkpoint, which consist of the relevant attributes and the snapshot time, and compare them. To generate a snapshot, and to make two snapshots comparable, we need specialized data integration techniques.

In our revenue assurance example, one of the sources, an Oracle database system, was very large. We decided to unload the system's CRM data with a delta process and deliver the data without any table relations. Realities and compromises like these are common, as is an initially incomplete understanding of the data's relationships. Therefore, one benefit of appropriate context technology is the ability to discover the relationships in the data: for example, connecting contracts to customer data.

In the figure below, we can see an example of the complexity of key connections for our revenue assurance example. Note that the product system is used to transform keys in order to make two systems compatible and comparable.

[Figure: Key connections between the CRM and order system, the Billing System Internet Legacy, the Billing System Line, the Billing System Line Legacy, the Rating system, and the product system, linked through keys such as the Internet key, asset key, billing price key, price key, and rating key. The data stored in each system is shown in green.]

To compare the CRM and the Rating system, we need to transfer the CRM product data into rating data. We can do this using the product system.

Data integration with MIOsoft's tools can be developed very quickly and easily. In the user interface, we see the data we need to integrate on the right-hand side

and our data quality model on the left-hand side. We can connect the data to the model's attributes with two clicks.

[Figure: The three data flows of the example process and the times at which snapshots are taken. Data Flow 1 runs from the CRM and order system through the Billing System Internet Legacy to the Rating system; Data Flow 2 runs through the Billing System Line Legacy to the Rating system; Data Flow 3 runs to the Billing System Line. Snapshot times are offsets from the starting time, for example starting time + the time Data Flow 1 needs to reach the Rating system.]

In the previous figure, we see our example process's data flows and the times at which the snapshots are taken. We need to be careful to choose the right time to unload, since we have to maximize the synchronization of the compared data and minimize the use of data that is out of sync.

For the definition of a snapshot, we assume that, at every checkpoint, it is possible to unload the relevant data in one table. To implement this, we must use strong data integration techniques. The data integration technologies must handle joins, data transformation and standardization, and all kinds of standard and non-standard data formats.

Checkpoints are used as master checkpoints or data checkpoints. Master checkpoints are located at the start of a data flow. Usually, there will only be one logical master checkpoint in a business process, but this is not required. Data checkpoints are located at any non-starting point of a data flow.

In addition to the checkpoints, we also perform captures. In general, captures are unloads of a bus or a queue. Captures allow investigation of the data flow between checkpoints over a period of time. Because of this, captures are helpful when finding root causes. We will discuss this concept in more detail later.

Formalization

A formalization of our integration definition allows us to be more explicit. We write a set I of elements x1, x2, ..., xn as I = {x1, x2, ..., xn}.

Let A be a table of data with rows (records) and columns (attributes), unloaded or created at time t. Let C = {column1, column2, ..., columnn} be the set of columns of A and R = {row1, row2, ..., rowm} the set of rows of A, such that A(t) = [C][R]. A is therefore a set of elements where each element (cell) knows its position (column, row). Elements can be of various types, such as bool, string, and date.

We define the subset a of A with selected columns and all rows to be a(t) = [Ca], where Ca is a subset of C, and the subset b of A with selected columns and selected rows to be b(t) = [Cb][Rb], where Cb is a subset of C and Rb is a subset of R.

Let A be the table of system A with A(time of unload) = [CsystemA][RsystemA]. Then we can define the snapshot of system A as

Snapshot system A (starting time of data flow + time of data flow arriving at system A) = [attribute1, attribute2, ..., attributen]

where {attribute1, attribute2, ..., attributen} is the subset of relevant columns of CsystemA.

We define a fragment of a checkpoint such that looking at one row of a snapshot gives us a fragment.

Fragment x of Snapshot system A = [attribute1, attribute2, ..., attributen][row x]

We define a capture K between snapshots A and B to be

K (starting time + time of data flow arriving at system A; starting time + time of data flow arriving at system B) = [attribute1, attribute2, ..., attributen]

where {attribute1, attribute2, ..., attributen} is a subset of CBus. Again, we get a fragment by looking at just one row:

Fragment x of Capture K = [attribute1, attribute2, ..., attributen][row x]

Example

In our revenue assurance example, we can choose the CRM system as the master checkpoint, since all data flows begin in the CRM system. Data checkpoints are located at every system with data at rest (Billing Internet, Billing Line, Billing Line Legacy, and Rating).

Snapshot CRM (starting time) = [customer_nb, asset_nb, internet_product_key, product_name, internet_key, billing_nb, price_key, billing_price_key, rating_key]

Snapshot Billing Internet (starting time + Billing Internet Time 1) = [customer_nb, internet_product_key, product_name, internet_key, rating_key]

Snapshot Billing Line (starting time + Billing Line Time 3) = [asset_nb]

Snapshot Billing Line Legacy (starting time + Billing Line Legacy Time 2) = [billing_price_key, billing_nb, price_key, rating_key]

Snapshot Rating (starting time + Rating Time 1) = [rating_key]

Looking at our example, we would place a capture on the data flow between the CRM system and the Billing Internet system. In the case of an inconsistency, we can locate exactly where the error happened by looking at the messages sent from the CRM system to the Billing Internet system.
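As a hypothetical encoding of these definitions (a table A(t) with columns and rows, a snapshot as the relevant columns, and a fragment as one row of a snapshot), one might write the following sketch; the class and the example data are invented for illustration:

```python
# A(t) = [C][R]: a table with columns C, rows R, and an unload time t.
from dataclasses import dataclass

@dataclass
class Table:
    columns: list       # C = {column1, ..., columnn}
    rows: list          # R = {row1, ..., rowm}; each row is a list of cells
    unload_time: float  # t, the time of unload or creation

    def snapshot(self, relevant_columns):
        # A snapshot keeps only the relevant subset of columns, for all rows.
        idx = [self.columns.index(c) for c in relevant_columns]
        return [[row[i] for i in idx] for row in self.rows]

crm = Table(columns=["customer_nb", "asset_nb", "rating_key"],
            rows=[["C1", "A1", "R1"], ["C2", "A2", "R2"]],
            unload_time=0.0)

snap = crm.snapshot(["customer_nb", "rating_key"])
fragment_0 = snap[0]  # a fragment: one row of a snapshot
# snap == [["C1", "R1"], ["C2", "R2"]]
```

A capture would have the same shape, except that its rows are messages recorded from a bus or queue over an interval rather than data at rest.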

Capture of Snapshots CRM and Billing Internet (starting time; starting time + Billing Internet Time 1) = [customer_nb, asset_nb, internet_product_key]

CONTEXT CLUSTERING

For our methodology, we have to carefully choose which contexts we are interested in, keeping in mind our desired result and the business process we are analyzing. To get the correct context, we have to discover all relationships between fragments. In particular, we are interested in clustering related fragments to form contextually related groupings. For that we have several techniques:

- Key Connection (equal): The easiest is a simple key connection, which means that two fragments have one key in common.
- Transitive Matching: Also relatively straightforward is using transitive closure to match related fragments. With transitive closure, two fragments can match without being a direct match. For example, if FragmentA matches FragmentB, FragmentB matches FragmentC, and FragmentA does not match FragmentC, transitive closure brings FragmentA, FragmentB, and FragmentC together in one context.
- Distance-Based Clustering: For non-key data we need to be more tolerant than byte-by-byte comparison allows. Instead, we can use distance-based clustering. For instance, for names there are several mathematical algorithms, such as q-gram and phonetic matching, that can assign a closeness comparison value between two names.

In order to produce a context, we define rules for fragment clustering. The table above shows an example rule set: two data records (fragments) are clustered in the same context when Attribute2 is equal and Attribute3 is equal after applying the q-gram algorithm, or when Attribute6 is equal, or when Attribute1 is equal and Attribute2 is equal after applying a two-typo algorithm.

If we were using relational data integration techniques, this would be much more time-consuming and complex. We would have to use the right joins to match all data sources, and would need to decide ahead of time which data we need later. With MIOsoft's method we can include all data in the context, since it is not much more work and performance is mostly unaffected.

A context also serves as a kind of data pool. We can use it to connect all data we are interested in with all the necessary relationships. With all related fragments available at one glance, we can recognize patterns across complex data types with good performance.

In our revenue assurance example, using context was particularly powerful. Architectural changes to the source systems happened frequently and quickly; data quality rules and calculations changed almost every month. These changes did not necessarily reach the DQ department quickly enough, and many workarounds were also discovered. Context made it easy to discover and communicate these changes between departments.

Context can also be leveraged to determine whether the defined business process is reflected in the data. We do this by predicting how data should look at our checkpoints and checking to see whether the actual data matches our prediction.

Formalization

Using the clustering rule table, we get a context which consists of all fragments of all snapshots that meet the rule table criteria:

Context CO (Ruletable) = {fragment1, fragment2, ..., fragmentn}

Example

In our revenue assurance example, we used the customer as our context. The customer clustering rule is shown below.
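As a hypothetical illustration (not the project's actual customer rule), the three clustering techniques can be combined in a short sketch: an exact key match, a q-gram similarity for fuzzy name matching, and transitive closure implemented with union-find. The threshold and data are invented:

```python
# Sketch of the clustering techniques: key match, q-gram similarity,
# and transitive closure via union-find. All values are illustrative.

def qgrams(s, q=2):
    return {s[i:i + q] for i in range(len(s) - q + 1)}

def qgram_similarity(a, b, q=2):
    # Jaccard overlap of the q-gram sets: 1.0 = identical, 0.0 = disjoint.
    ga, gb = qgrams(a.lower(), q), qgrams(b.lower(), q)
    return len(ga & gb) / max(len(ga | gb), 1)

def cluster(fragments, match):
    # Union-find computes the transitive closure: if 0~1 and 1~2, then {0,1,2}.
    parent = list(range(len(fragments)))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for i in range(len(fragments)):
        for j in range(i + 1, len(fragments)):
            if match(fragments[i], fragments[j]):
                parent[find(i)] = find(j)
    groups = {}
    for i in range(len(fragments)):
        groups.setdefault(find(i), []).append(i)
    return sorted(groups.values())

# Rule: same key, or names closer than an invented q-gram threshold.
frags = [{"key": "K1", "name": "Meyer"},
         {"key": "K2", "name": "Meyers"},
         {"key": "K2", "name": "Smith"}]

def match(f, g):
    return f["key"] == g["key"] or qgram_similarity(f["name"], g["name"]) > 0.5

contexts = cluster(frags, match)
# contexts == [[0, 1, 2]]: fragment 0 matches 1 by name, 1 matches 2 by key,
# and transitive closure pulls all three into one context.
```

Note that fragments 0 and 2 never match directly; they end up in the same context only through transitive matching, exactly as described above.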

EXPECTED VALUES

In order to know whether our integrated data is correctly stored, we have to define what should be in our context. This definition is derived from the implemented process; however, this step can be very complex, because business rules for every detail of the data flows in the process must be specified. Expected values can be computed for snapshots and for captures.

The master snapshot is the basis of the computation that determines the expected data values of all snapshots. The expected data values are themselves formed as snapshots. For every data snapshot, we want to convert the master snapshot into the expected data snapshot using the business rules.
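As a toy illustration of this conversion (the lookup table, rules, and data are invented, not taken from the project):

```python
# Sketch: each business rule transforms a master-snapshot attribute into the
# value expected at a data checkpoint. All names and values are illustrative.

product_lookup = {"P-100": "RK-7"}  # e.g. a product-system table mapping
                                    # product keys to rating keys

rules = {
    "customer_nb": lambda m: m["customer_nb"],                 # copied unchanged
    "rating_key": lambda m: product_lookup[m["product_key"]],  # table lookup
}

def expected_snapshot(master_fragment, rules):
    # Apply every business rule to the master fragment.
    return {attr: f(master_fragment) for attr, f in rules.items()}

master = {"customer_nb": "C42", "product_key": "P-100"}
exp = expected_snapshot(master, rules)
# exp == {"customer_nb": "C42", "rating_key": "RK-7"}
```

The rules here mirror the two kinds of business rules named below: simple lookups in other tables and more complex functions of the master data.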

In the end, we have a virtual construct describing which data values we expect to find at each data checkpoint. The capture itself is the basis for the computation of our expected data values for captures.

[Figure: A business rule in the CRM and order system processes message data from Data Flow 1, which is recorded by the capture between CRM and Billing Internet.]

Formalization

Let f be an algorithm f(attribute1) which describes the business rule transforming the value of attribute1 in the master snapshot into the value of attribute1 in the expected snapshot. Business rules can be lookups in other tables or more complex functions.

Then we can define the expected snapshot as

Expected Snapshot system A = [f(attribute1), f(attribute2), ..., f(attributen)]

where {attribute1, attribute2, ..., attributen} is the set of relevant columns of the data snapshot.

Let fc be an algorithm fc(attributex) which describes the business rule transforming attributex of the source snapshot into attributey of the capture. Then we can define the expected capture as

Expected Capture of systems A and B = [fc(attribute1), fc(attribute2), ..., fc(attributen)]

where {attribute1, attribute2, ..., attributen} is the set of relevant columns of the source snapshot.

Example

In our example we get the following expected snapshots:

Expected Snapshot Billing Internet DF1 = [BRinternet1(customer_nb), BRinternet1(internet_product_key), BRinternet1(product_name), BRinternet1(internet_key), BRinternet1(rating_key)]

Expected Snapshot Rating DF1 = [BRrating1(rating_key)]

Expected Snapshot Billing Line Legacy = [BRbillingline2(billing_price_key), BRbillingline2(billing_nb), BRbillingline2(price_key), BRbillingline2(rating_key)]

Expected Snapshot Rating DF2 = [BRrating2(rating_key)]

Expected Snapshot Billing Line = [BRbillingline3(asset_nb)]

And the following expected capture:

Expected Capture of CRM and Billing Internet = [BRcrm(customer_nb), BRcrm(asset_nb), BRcrm(internet_product_key),

BRcrm(product_name), BRcrm(internet_key), BRcrm(billing_nb), BRcrm(price_key), BRcrm(billing_price_key), BRcrm(rating_key)]

COMPARISON OF ACTUAL AND EXPECTED DATA

Once we have the expected data values, we have to check whether all data snapshots are the same as the expected snapshots. To do this, we have to define which columns are relevant for connecting the expected data to the actual data. Then, we can easily compare the expected snapshots with the data snapshots. We may also find actual snapshots that do not match any expected snapshot, and vice versa. We saw this in our revenue assurance example: the actual and expected values were compared and categorized. In our example, we found two different problems. In data flow 1, we found that rating data was missing. We additionally found actual data from unexpected sources.

After the categorization of the found data quality problems, we can analyze our results. This analysis is saved so it can be compared to future results. This is important for finding the source of the problem and can be done by monitoring in a dashboard, or by other reporting and analysis functions. Moreover, we can generate and unload data and use it to perform manual or automatic data cleansing in the original systems.
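The comparison step, including the unmatched fragments on either side, can be sketched as follows; the key column and values are invented for illustration:

```python
# Sketch: join expected and actual fragments on a key column, then report
# matches, mismatches, and fragments present on one side only.

def compare_snapshots(expected, actual, key):
    exp = {f[key]: f for f in expected}
    act = {f[key]: f for f in actual}
    ok = [k for k in exp if k in act and exp[k] == act[k]]
    differing = [k for k in exp if k in act and exp[k] != act[k]]
    missing = [k for k in exp if k not in act]     # expected but not found
    unexpected = [k for k in act if k not in exp]  # found but not expected
    return ok, differing, missing, unexpected

expected = [{"rating_key": "R1", "price": 4.67},
            {"rating_key": "R2", "price": 3.80}]
actual = [{"rating_key": "R2", "price": 3.80},
          {"rating_key": "R9", "price": 1.00}]

result = compare_snapshots(expected, actual, "rating_key")
# result == (["R2"], [], ["R1"], ["R9"]):
# R2 matches, R1 is missing (like the lost rating data in data flow 1),
# and R9 is actual data from an unexpected source.
```

The `missing` and `unexpected` buckets correspond directly to the two problems found in the revenue assurance example.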

CATEGORIZATION OF FINDINGS

For categorization, we distinguish between contexts that are OK and contexts that have a data quality problem. OK means that all equations in our previous formalization are true; contexts with a data quality problem have at least one equation with a false result. Since we know exactly what value is expected, we can even say how far apart the actual and expected values are and categorize using this distance.

In the revenue assurance case, we assigned a cost to every snapshot and its expected snapshots. By comparing the expected and actual cost and computing the delta, we could calculate missed or overcharged revenue. This was the most important outcome of our revenue assurance project and drove the whole effort. We can easily categorize overbilling (negative delta values) and underbilling (positive delta values), and can prioritize and organize cleansing according to the problem's category.

Formalization

We can define our comparison as a simple Boolean test for some context x:

Context x is OK IFF Expected Snapshot system A = Snapshot system A for every system A in the context

ROOT CAUSE ANALYSIS

Sometimes just finding the data problem reveals its root cause, but often this is not the case, and we need to investigate inside the context. Because a context has all related data at one glance, we have a platform from which to find the root problem. We can measure whether the expected message was sent from the source system, whether the message reached the next system, and how the system used the message to update its data.
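This message-level check can be sketched as a search over the capture: for each message the process should have produced, test whether it actually appears; the first missing hop localizes the fault. The message fields and system names below are invented for illustration:

```python
# Sketch: find the first expected message that never appeared in the capture.

def find_missing_hop(expected_messages, captured_messages):
    captured = {(m["src"], m["dst"], m["key"]) for m in captured_messages}
    for m in expected_messages:
        if (m["src"], m["dst"], m["key"]) not in captured:
            return m  # first expected message that was never sent
    return None       # every expected message was captured

expected = [{"src": "CRM", "dst": "BillingInternet", "key": "O1"},
            {"src": "BillingInternet", "dst": "Rating", "key": "O1"}]
captured = [{"src": "CRM", "dst": "BillingInternet", "key": "O1"}]

root_cause = find_missing_hop(expected, captured)
# root_cause == {"src": "BillingInternet", "dst": "Rating", "key": "O1"}
```

In this toy version of data flow 1, the search reports the hop from the Billing Internet system to the Rating system as the point where the message went missing, which is exactly the pattern discussed in the example below.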

[Figure: Comparison of expected and actual data for three orders. Order 1 (Data Flow 1): expected Billing Internet Legacy $12.99 with a DF1 message to Rating $4.67; actual shows Billing Internet Legacy $12.99 but missing rating data. Order 2 (Data Flow 2): expected Billing Line Legacy $23.50 and Rating $3.80, with both DF2 messages; actual shows Billing Line Legacy $22.00 and Rating $3.80. Order 3 (Data Flow 3): expected Billing Line $4.67; actual shows Billing Line $4.67, with no actual data for the expected Billing Line Legacy entry of $23.50.]

We can see this clearly by looking at our revenue assurance example. In data flow 1, we expect a message going from the Billing Internet system to the Rating system, but this message is not in the actual capture. This is why the data is missing in the Rating snapshot. In data flow 2, we see that all messages we expect are actually captured, so the Rating snapshot is correct.

To find the root cause of missing or additional snapshot data, we search for the message we expected to build the snapshot data and identify where something went wrong. Here, we found that the root cause was that the Billing Internet system did not send the expected message to the Rating system.

BIG DATA QUALITY REQUIRES CONTEXT

Big data is often understood to be high-volume data with no structure. In reality, the variety and complexity of data require the use of big data methods to deliver more business benefit. MIOsoft's context-powered and process-driven data quality method provides the opportunity to show data quality failures for multiple, very complex data structures. With this method, even otherwise unreachable data, like data in motion or unforeseen special business cases, can be considered and can play an important role in decision-making. In particular, the contextual consolidation of data at rest and data in motion enables a better understanding of the data flow and process optimization.

Big data also must be cost-effective, using approaches such as highly distributable technologies and cost-effective commodity hardware. Some contextual technologies, such as those from MIOsoft, are particularly well designed to scale up.

Furthermore, leveraging contextual systems allows reduction of time-to-market. Time-consuming tasks such as data integration with extra ETL tools or manual matching are often not needed when a contextual system allows and understands contextual modeling in a first-order way.


More information

Maximizing the ROI Of Visual Rules

Maximizing the ROI Of Visual Rules Table of Contents Introduction... 3 Decision Management... 3 Decision Discovery... 4 Decision Services... 6 Decision Analysis... 11 Conclusion... 12 About Decision Management Solutions... 12 Acknowledgements

More information

CAPTURING UNTAPPED REVENUE: How Customer Experience Insights Improve Remarketing and Customer Recovery Efforts

CAPTURING UNTAPPED REVENUE: How Customer Experience Insights Improve Remarketing and Customer Recovery Efforts CAPTURING UNTAPPED REVENUE: How Customer Experience Insights Improve Remarketing and Customer Recovery Efforts Hilary Salazar, Product Marketing Manager, Tealeaf TABLE OF CONTENTS Executive Summary...1

More information

High Precision Matching at the heart of Customer Data Integration

High Precision Matching at the heart of Customer Data Integration High Precision Matching at the heart of Customer Data Integration The quality of your CDI Solution is only as powerful as the quality of your matching engine White paper Release date: September 2008 Version

More information

Lavastorm Resolution Center 2.2 Release Frequently Asked Questions

Lavastorm Resolution Center 2.2 Release Frequently Asked Questions Lavastorm Resolution Center 2.2 Release Frequently Asked Questions Software Description What is Lavastorm Resolution Center 2.2? Lavastorm Resolution Center (LRC) is a flexible business improvement management

More information

Before getting started, we need to make sure we. Business Intelligence Project Management 101: Managing BI Projects Within the PMI Process Group

Before getting started, we need to make sure we. Business Intelligence Project Management 101: Managing BI Projects Within the PMI Process Group PMI Virtual Library 2010 Carole Wittemann Business Intelligence Project Management 101: Managing BI Projects Within the PMI Process Group By Carole Wittemann, PMP Abstract Too many times, business intelligence

More information

TechTips. Connecting Xcelsius Dashboards to External Data Sources using: Web Services (Dynamic Web Query)

TechTips. Connecting Xcelsius Dashboards to External Data Sources using: Web Services (Dynamic Web Query) TechTips Connecting Xcelsius Dashboards to External Data Sources using: Web Services (Dynamic Web Query) A step-by-step guide to connecting Xcelsius Enterprise XE dashboards to company databases using

More information

Procurement Programmes & Projects P3M3 v2.1 Self-Assessment Instructions and Questionnaire. P3M3 Project Management Self-Assessment

Procurement Programmes & Projects P3M3 v2.1 Self-Assessment Instructions and Questionnaire. P3M3 Project Management Self-Assessment Procurement Programmes & Projects P3M3 v2.1 Self-Assessment Instructions and Questionnaire P3M3 Project Management Self-Assessment Contents Introduction 3 User Guidance 4 P3M3 Self-Assessment Questionnaire

More information

The New Rules of Telecom Expense Management Effectively Managing Telecom Spend in Today s World

The New Rules of Telecom Expense Management Effectively Managing Telecom Spend in Today s World Over the past decade, the way organizations communicate has radically changed. As data and wireless communications have grown, telecom expenses have become a significant expense equating to 3-4% of revenue

More information

IN SEARCH OF BUSINESS VALUE: HOW TO ACHIEVE THE BENEFITS OF ERP TECHNOLOGY

IN SEARCH OF BUSINESS VALUE: HOW TO ACHIEVE THE BENEFITS OF ERP TECHNOLOGY Introduction IN SEARCH OF BUSINESS VALUE: HOW TO ACHIEVE THE BENEFITS OF ERP TECHNOLOGY In today s increasingly competitive business environment, significant investment in Information Technology (IT) is

More information

Table of contents. Matching server virtualization with advanced storage virtualization

Table of contents. Matching server virtualization with advanced storage virtualization Matching server virtualization with advanced storage virtualization Using HP LeftHand SAN and VMware Infrastructure 3 for improved ease of use, reduced cost and complexity, increased availability, and

More information

IT Management On Demand

IT Management On Demand IT Management On Demand FUJITSU Cloud IT Management as a Service: Delivering Simple, Powerful and Unified IT Management Capabilities shaping tomorrow with you The Challenges of Managing a Dynamic IT Environment

More information

Availability Management: A CA Service Management Process Map

Availability Management: A CA Service Management Process Map TECHNOLOGY brief: AVAILABILITY MANAGEMENT Availability : A CA Process Map Malcolm Ryder ARCHITECT CA SERVICES Table of Contents Executive Summary 1 SECTION 1: CHALLENGE 2 Simplifying ITIL How to Use the

More information

BENEFITS OF AUTOMATING DATA WAREHOUSING

BENEFITS OF AUTOMATING DATA WAREHOUSING BENEFITS OF AUTOMATING DATA WAREHOUSING Introduction...2 The Process...2 The Problem...2 The Solution...2 Benefits...2 Background...3 Automating the Data Warehouse with UC4 Workload Automation Suite...3

More information

Master Data Management and Data Warehousing. Zahra Mansoori

Master Data Management and Data Warehousing. Zahra Mansoori Master Data Management and Data Warehousing Zahra Mansoori 1 1. Preference 2 IT landscape growth IT landscapes have grown into complex arrays of different systems, applications, and technologies over the

More information

Technology in Action. Alan Evans Kendall Martin Mary Anne Poatsy. Eleventh Edition. Copyright 2015 Pearson Education, Inc.

Technology in Action. Alan Evans Kendall Martin Mary Anne Poatsy. Eleventh Edition. Copyright 2015 Pearson Education, Inc. Copyright 2015 Pearson Education, Inc. Technology in Action Alan Evans Kendall Martin Mary Anne Poatsy Eleventh Edition Copyright 2015 Pearson Education, Inc. Technology in Action Chapter 9 Behind the

More information

Best Practices in Contract Migration

Best Practices in Contract Migration ebook Best Practices in Contract Migration Why You Should & How to Do It Introducing Contract Migration Organizations have as many as 10,000-200,000 contracts, perhaps more, yet very few organizations

More information

See your customer from every angle and you ll see all the opportunities you ve been missing. Customer Information Management Analyze Optimize Manage

See your customer from every angle and you ll see all the opportunities you ve been missing. Customer Information Management Analyze Optimize Manage Customer Information Management Analyze Optimize Manage Customer Integration Manager D&B s software to Manage customer information helps you maintain high-quality information over time and create an integrated,enterprise-wide

More information

Extraction Transformation Loading ETL Get data out of sources and load into the DW

Extraction Transformation Loading ETL Get data out of sources and load into the DW Lection 5 ETL Definition Extraction Transformation Loading ETL Get data out of sources and load into the DW Data is extracted from OLTP database, transformed to match the DW schema and loaded into the

More information

Red Hat Enterprise Linux: The ideal platform for running your Oracle database

Red Hat Enterprise Linux: The ideal platform for running your Oracle database Red Hat Enterprise Linux: The ideal platform for running your Oracle database 2 Introduction 2 Scalability 2 Availability 3 Reliability 4 Manageability 5 Red Hat subscriptions 6 Conclusion www.redhat.com

More information

hybris Solution Brief Hybris Marketing Market to an Audience of One

hybris Solution Brief Hybris Marketing Market to an Audience of One hybris Solution Brief Hybris Marketing Market to an Audience of One People are intuitive. A shop owner can meet a customer and immediately pick up on explicit and implicit cues that signal intent: What

More information

Performance Prediction, Sizing and Capacity Planning for Distributed E-Commerce Applications

Performance Prediction, Sizing and Capacity Planning for Distributed E-Commerce Applications Performance Prediction, Sizing and Capacity Planning for Distributed E-Commerce Applications by Samuel D. Kounev (skounev@ito.tu-darmstadt.de) Information Technology Transfer Office Abstract Modern e-commerce

More information

ORACLE DATA QUALITY ORACLE DATA SHEET KEY BENEFITS

ORACLE DATA QUALITY ORACLE DATA SHEET KEY BENEFITS ORACLE DATA QUALITY KEY BENEFITS Oracle Data Quality offers, A complete solution for all customer data quality needs covering the full spectrum of data quality functionalities Proven scalability and high

More information

How To Manage Event Data With Rocano Ops

How To Manage Event Data With Rocano Ops ROCANA WHITEPAPER Improving Event Data Management and Legacy Systems INTRODUCTION STATE OF AFFAIRS WHAT IS EVENT DATA? There are a myriad of terms and definitions related to data that is the by-product

More information

A Capability Model for Business Analytics: Part 2 Assessing Analytic Capabilities

A Capability Model for Business Analytics: Part 2 Assessing Analytic Capabilities A Capability Model for Business Analytics: Part 2 Assessing Analytic Capabilities The first article of this series presented the capability model for business analytics that is illustrated in Figure One.

More information

Whitepaper. Data Warehouse/BI Testing Offering YOUR SUCCESS IS OUR FOCUS. Published on: January 2009 Author: BIBA PRACTICE

Whitepaper. Data Warehouse/BI Testing Offering YOUR SUCCESS IS OUR FOCUS. Published on: January 2009 Author: BIBA PRACTICE YOUR SUCCESS IS OUR FOCUS Whitepaper Published on: January 2009 Author: BIBA PRACTICE 2009 Hexaware Technologies. All rights reserved. Table of Contents 1. 2. Data Warehouse - Typical pain points 3. Hexaware

More information

Advantages of Implementing a Data Warehouse During an ERP Upgrade

Advantages of Implementing a Data Warehouse During an ERP Upgrade Advantages of Implementing a Data Warehouse During an ERP Upgrade Advantages of Implementing a Data Warehouse During an ERP Upgrade Introduction Upgrading an ERP system represents a number of challenges

More information

Quantrix & Excel: 3 Key Differences A QUANTRIX WHITE PAPER

Quantrix & Excel: 3 Key Differences A QUANTRIX WHITE PAPER Quantrix & Excel: 3 Key Differences A QUANTRIX WHITE PAPER Abstract This whitepaper is designed to educate spreadsheet users about three key conceptual and practical differences between Quantrix Modeler

More information

Five Ways to Manage Your IT Dashboard

Five Ways to Manage Your IT Dashboard uptime s IT Operations IT Systems Management Management Series: For IT Managers Large Financial Company Decreases IT Costs While Using Historical Trends to Plan for the Future Harnessing the Power of an

More information

ADVANTAGES OF IMPLEMENTING A DATA WAREHOUSE DURING AN ERP UPGRADE

ADVANTAGES OF IMPLEMENTING A DATA WAREHOUSE DURING AN ERP UPGRADE ADVANTAGES OF IMPLEMENTING A DATA WAREHOUSE DURING AN ERP UPGRADE Advantages of Implementing a Data Warehouse During an ERP Upgrade Upgrading an ERP system presents a number of challenges to many organizations.

More information

INFORMATION CONNECTED

INFORMATION CONNECTED INFORMATION CONNECTED Business Solutions for the Utilities Industry Primavera Project Portfolio Management Solutions Achieve Operational Excellence with Robust Project Portfolio Management Solutions The

More information

Branch-and-Price Approach to the Vehicle Routing Problem with Time Windows

Branch-and-Price Approach to the Vehicle Routing Problem with Time Windows TECHNISCHE UNIVERSITEIT EINDHOVEN Branch-and-Price Approach to the Vehicle Routing Problem with Time Windows Lloyd A. Fasting May 2014 Supervisors: dr. M. Firat dr.ir. M.A.A. Boon J. van Twist MSc. Contents

More information

Eliminating Complexity to Ensure Fastest Time to Big Data Value

Eliminating Complexity to Ensure Fastest Time to Big Data Value Eliminating Complexity to Ensure Fastest Time to Big Data Value Copyright 2015 Pentaho Corporation. Redistribution permitted. All trademarks are the property of their respective owners. For the latest

More information

Data Quality Assessment. Approach

Data Quality Assessment. Approach Approach Prepared By: Sanjay Seth Data Quality Assessment Approach-Review.doc Page 1 of 15 Introduction Data quality is crucial to the success of Business Intelligence initiatives. Unless data in source

More information

Microsoft Dynamics NAV 2015 What s new?

Microsoft Dynamics NAV 2015 What s new? What s new? RapidStart Upgrade includes several enhancements for upgrading solutions. RapidStart Upgrade - Code New application merge utilities help partners upgrade their solution - application code and

More information

IMPROVING DATA INTEGRATION FOR DATA WAREHOUSE: A DATA MINING APPROACH

IMPROVING DATA INTEGRATION FOR DATA WAREHOUSE: A DATA MINING APPROACH IMPROVING DATA INTEGRATION FOR DATA WAREHOUSE: A DATA MINING APPROACH Kalinka Mihaylova Kaloyanova St. Kliment Ohridski University of Sofia, Faculty of Mathematics and Informatics Sofia 1164, Bulgaria

More information

Data Migration in SAP environments

Data Migration in SAP environments Framework for Data Migration in SAP environments Does this scenario seem familiar? Want to save 50% in migration costs? Data migration is about far more than just moving data into a new application or

More information

A financial software company

A financial software company A financial software company Projecting USD10 million revenue lift with the IBM Netezza data warehouse appliance Overview The need A financial software company sought to analyze customer engagements to

More information

Measurement Information Model

Measurement Information Model mcgarry02.qxd 9/7/01 1:27 PM Page 13 2 Information Model This chapter describes one of the fundamental measurement concepts of Practical Software, the Information Model. The Information Model provides

More information

META DATA QUALITY CONTROL ARCHITECTURE IN DATA WAREHOUSING

META DATA QUALITY CONTROL ARCHITECTURE IN DATA WAREHOUSING META DATA QUALITY CONTROL ARCHITECTURE IN DATA WAREHOUSING Ramesh Babu Palepu 1, Dr K V Sambasiva Rao 2 Dept of IT, Amrita Sai Institute of Science & Technology 1 MVR College of Engineering 2 asistithod@gmail.com

More information

SQL Server Query Tuning

SQL Server Query Tuning SQL Server Query Tuning A 12-Step Program By Thomas LaRock, Technical Evangelist and Head Geek Confio Software 4772 Walnut Street, Suite 100 Boulder, CO 80301 www.confio.com Introduction Query tuning is

More information

Documentation for data centre migrations

Documentation for data centre migrations Documentation for data centre migrations Data centre migrations are part of the normal life cycle of a typical enterprise. As organisations expand, many reach a point where maintaining multiple, distributed

More information

Achieving High Oracle Performance

Achieving High Oracle Performance Achieving High Oracle Performance Advanced Performance Management for Today s Complex, Critical Databases Abstract DBAs today need better tools than ever, because they are being asked to manage increasingly

More information

Four Steps to Faster, Better Application Dependency Mapping

Four Steps to Faster, Better Application Dependency Mapping THOUGHT LEADERSHIP WHITE PAPER Four Steps to Faster, Better Application Dependency Mapping Laying the Foundation for Effective Business Service Models By Adam Kerrison, Principal Product Developer, BMC

More information

The Top Challenges in Big Data and Analytics

The Top Challenges in Big Data and Analytics Big Data Leads to Insights, Improvements & Automation Over the past few years, there has been a tremendous amount of hype around Big Data data that doesn t work well in traditional BI systems and warehouses

More information

Corporate Performance Management (CPM) - An Introduction

Corporate Performance Management (CPM) - An Introduction CPM Overview Corporate Performance Management (CPM) - An Introduction The Business Drivers Common challenges corporations face today... Increasing demand for corporate accountability from stockholders

More information

So we thought we knew money

So we thought we knew money So we thought we knew money Ying Hu Sam Peng Here at Custom House Global Foreign Exchange, we have built and are continuing to evolve a system that processes online foreign exchange transactions. Currency,

More information

Cisco Info Center Business Service Manager

Cisco Info Center Business Service Manager Cisco Info Center Business Service Manager Cisco Info Center Business Service Manager Today's business services are more complex than ever, composed of an ever-changing mix of legacy and nextgeneration

More information

Knowledgent White Paper Series. Developing an MDM Strategy WHITE PAPER. Key Components for Success

Knowledgent White Paper Series. Developing an MDM Strategy WHITE PAPER. Key Components for Success Developing an MDM Strategy Key Components for Success WHITE PAPER Table of Contents Introduction... 2 Process Considerations... 3 Architecture Considerations... 5 Conclusion... 9 About Knowledgent... 10

More information

Product Lifecycle Management in the Medical Device Industry. An Oracle White Paper Updated January 2008

Product Lifecycle Management in the Medical Device Industry. An Oracle White Paper Updated January 2008 Product Lifecycle Management in the Medical Device Industry An Oracle White Paper Updated January 2008 Product Lifecycle Management in the Medical Device Industry PLM technology ensures FDA compliance

More information

Migrate, Manage, Monitor SQL Server 2005: How Idera s Tools for SQL Server Can Help

Migrate, Manage, Monitor SQL Server 2005: How Idera s Tools for SQL Server Can Help Migrate, Manage, Monitor SQL Server 2005: How Idera s Tools for SQL Server Can Help White Paper January 2007 Abstract If you haven't already made the move to SQL Server 2005, most likely it is on your

More information

JOURNAL OF OBJECT TECHNOLOGY

JOURNAL OF OBJECT TECHNOLOGY JOURNAL OF OBJECT TECHNOLOGY Online at www.jot.fm. Published by ETH Zurich, Chair of Software Engineering JOT, 2008 Vol. 7, No. 8, November-December 2008 What s Your Information Agenda? Mahesh H. Dodani,

More information

SAP Solution Brief SAP Technology SAP IT Infrastructure Management. Unify Infrastructure and Application Lifecycle Management

SAP Solution Brief SAP Technology SAP IT Infrastructure Management. Unify Infrastructure and Application Lifecycle Management SAP Brief SAP Technology SAP IT Infrastructure Management Objectives Unify Infrastructure and Application Lifecycle Management Supercharge your IT infrastructure Supercharge your IT infrastructure What

More information

release 240 Exact Synergy Enterprise CRM Implementation Manual

release 240 Exact Synergy Enterprise CRM Implementation Manual release 240 Exact Synergy Enterprise CRM Implementation Manual EXACT SYNERGY ENTERPRISE CRM IMPLEMENTATION MANUAL The information provided in this manual is intended for internal use by or within the organization

More information

A Symptom Extraction and Classification Method for Self-Management

A Symptom Extraction and Classification Method for Self-Management LANOMS 2005-4th Latin American Network Operations and Management Symposium 201 A Symptom Extraction and Classification Method for Self-Management Marcelo Perazolo Autonomic Computing Architecture IBM Corporation

More information

P3M3 Portfolio Management Self-Assessment

P3M3 Portfolio Management Self-Assessment Procurement Programmes & Projects P3M3 v2.1 Self-Assessment Instructions and Questionnaire P3M3 Portfolio Management Self-Assessment P3M3 is a registered trade mark of AXELOS Limited Contents Introduction

More information

Next Generation Business Performance Management Solution

Next Generation Business Performance Management Solution Next Generation Business Performance Management Solution Why Existing Business Intelligence (BI) Products are Inadequate Changing Business Environment In the face of increased competition, complex customer

More information

Speeding ETL Processing in Data Warehouses White Paper

Speeding ETL Processing in Data Warehouses White Paper Speeding ETL Processing in Data Warehouses White Paper 020607dmxwpADM High-Performance Aggregations and Joins for Faster Data Warehouse Processing Data Processing Challenges... 1 Joins and Aggregates are

More information

2003-2007, Aplicor, Inc., All Rights Reserved

2003-2007, Aplicor, Inc., All Rights Reserved I N T E G R A T I O N S E R V I C E S W H I T E P A P E R Copyright 2003-2007, Aplicor, Inc., All Rights Reserved Introduction to Integration Services Due to Aplicor s focus on mid-market and enterprise

More information

TECHNOLOGY BRIEF: CA ERWIN SAPHIR OPTION. CA ERwin Saphir Option

TECHNOLOGY BRIEF: CA ERWIN SAPHIR OPTION. CA ERwin Saphir Option TECHNOLOGY BRIEF: CA ERWIN SAPHIR OPTION CA ERwin Saphir Option Table of Contents Executive Summary SECTION 1 2 Introduction SECTION 2: OPPORTUNITY 2 Modeling ERP Systems ERP Systems and Data Warehouses

More information

Linear Programming Notes VII Sensitivity Analysis

Linear Programming Notes VII Sensitivity Analysis Linear Programming Notes VII Sensitivity Analysis 1 Introduction When you use a mathematical model to describe reality you must make approximations. The world is more complicated than the kinds of optimization

More information

Coverity White Paper. Effective Management of Static Analysis Vulnerabilities and Defects

Coverity White Paper. Effective Management of Static Analysis Vulnerabilities and Defects Effective Management of Static Analysis Vulnerabilities and Defects Introduction According to a recent industry study, companies are increasingly expanding their development testing efforts to lower their

More information

Build an effective data integration strategy to drive innovation

Build an effective data integration strategy to drive innovation IBM Software Thought Leadership White Paper September 2010 Build an effective data integration strategy to drive innovation Five questions business leaders must ask 2 Build an effective data integration

More information

A Near Real-Time Personalization for ecommerce Platform Amit Rustagi arustagi@ebay.com

A Near Real-Time Personalization for ecommerce Platform Amit Rustagi arustagi@ebay.com A Near Real-Time Personalization for ecommerce Platform Amit Rustagi arustagi@ebay.com Abstract. In today's competitive environment, you only have a few seconds to help site visitors understand that you

More information

1 Using a SQL Filter in Outlook 2002/2003 Views. 2 Defining the Problem The Task at Hand

1 Using a SQL Filter in Outlook 2002/2003 Views. 2 Defining the Problem The Task at Hand 1 Using a SQL Filter in Outlook 2002/2003 Views Those of you who have used Outlook for a while may have discovered the power of Outlook Views and use these on every folder to group, sort and filter your

More information

are you helping your customers achieve their expectations for IT based service quality and availability?

are you helping your customers achieve their expectations for IT based service quality and availability? PARTNER BRIEF Service Operations Management from CA Technologies are you helping your customers achieve their expectations for IT based service quality and availability? FOR PARTNER USE ONLY DO NOT DISTRIBUTE

More information

Are You Ready for Big Data?

Are You Ready for Big Data? Are You Ready for Big Data? Jim Gallo National Director, Business Analytics February 11, 2013 Agenda What is Big Data? How do you leverage Big Data in your company? How do you prepare for a Big Data initiative?

More information

GROW YOUR ANALYTICS MATURITY

GROW YOUR ANALYTICS MATURITY GROW YOUR ANALYTICS MATURITY Gain and Sustain a Competitive Edge FROM DATA TO ACTION YOU VE HEARD THE BIG DATA BUZZ. WE RE SWIMMING IN MORE DATA THAN EVER. But it s not about the amount of data, the different

More information

Response Time Analysis

Response Time Analysis Response Time Analysis A Pragmatic Approach for Tuning and Optimizing Database Performance By Dean Richards Confio Software 4772 Walnut Street, Suite 100 Boulder, CO 80301 866.CONFIO.1 www.confio.com Introduction

More information

Informatica Application Information Lifecycle Management

Informatica Application Information Lifecycle Management Informatica Application Information Lifecycle Management Cost-Effectively Manage Every Phase of the Information Lifecycle brochure Controlling Explosive Data Growth The era of big data presents today s

More information