Enterprise Intelligence - Enabling High Quality in the Data Warehouse/DSS Environment
by Bill Inmon
Vality Technology Incorporated



Introduction

In a few short years, data warehousing has passed from theory to conventional wisdom. In the explosive growth that has transpired, a body of thought has developed around it. From the beginning, data warehousing was never a theoretical exercise; it has always been rooted in pragmatism. But as is inevitable given such breathtaking growth, an organized, thorough intellectual framework has begun to grow around both its infrastructure and its rationale.

There are many aspects to this intellectual framework. One of the important considerations, critical to the infrastructure, is the quality of the data that courses through the veins of the warehouse's components. Indeed, quality in many different forms is one of the cornerstones of data warehousing. If the data warehouse is ever to achieve the lofty goal of becoming a foundation for enterprise intelligence, data quality must become a reality. It is simply unthinkable that analysis for important corporate decisions should proceed on the basis of incorrect and incomplete data. Therefore, a de facto prerequisite for enterprise intelligence is quality throughout the data warehouse environment.

The corporate information factory

Before there can be a discussion of the quality of data in the data warehouse/DSS environment, there needs to be a discussion of the structure of the data warehouse environment and of its infrastructure. The data warehouse has grown from a database separate and apart from transaction processing into a sophisticated structure known as the "corporate information factory." Figure 1 depicts the corporate information factory.

The genesis of data in the corporate information factory is the application environment. Here, detailed data is gathered, audited, transacted, and stored. The application is written for specific requirements.
The essence of the application environment is transactions, which typically execute very quickly, operating on small amounts of data. Once data is gathered into the application environment, it is passed through a layer of programs called the "integration and transformation" layer. These programs integrate and convert the application data into a corporate format. The integration and transformation programs that a corporation writes usually represent its largest expense and effort in developing the data warehouse.

Once data passes through the integration and transformation layer, it heads in one of two directions: to the data warehouse or to the ODS (operational data store). When the data heads to the ODS, it goes to an environment that is a hybrid DSS/operational structure. The ODS is a place where it is possible to achieve high-performance OLTP response time. At the same time, it is possible to access and analyze integrated data there and, on occasion, to do DSS processing. Not all companies have a need for an ODS. But where there is a need for one, a business is served well by having it. Eventually, data that passes into the ODS also passes into the data warehouse.
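The flow described above - application data passing through the integration and transformation layer, then routed to the ODS or the warehouse, with ODS data eventually reaching the warehouse as well - can be pictured as a small sketch. All names here (the functions, the field names, the routing criterion) are hypothetical illustrations, not part of any product the paper describes.

```python
# Illustrative sketch of the corporate-information-factory flow:
# application -> integration/transformation -> ODS and/or warehouse.
# Names, fields, and the routing rule are hypothetical.

def integrate_and_transform(app_record: dict) -> dict:
    """Convert an application-format record into the corporate format."""
    return {
        "customer_key": app_record["cust_id"].strip().upper(),  # unify key format
        "amount": round(float(app_record["amt"]), 2),           # unify precision
        "source_system": app_record["system"],                  # keep lineage
    }

def load(app_records, ods, warehouse):
    """Transform each record; route operational records to the ODS,
    and land every record in the warehouse, since ODS data eventually
    flows into the warehouse as well."""
    for rec in app_records:
        corp = integrate_and_transform(rec)
        if rec.get("operational"):   # hypothetical routing criterion
            ods.append(corp)
        warehouse.append(corp)

ods, warehouse = [], []
load([{"cust_id": " c1 ", "amt": "10.5", "system": "billing", "operational": True},
      {"cust_id": "C2",   "amt": "7",    "system": "orders"}],
     ods, warehouse)
```

The point of the sketch is only the topology: every record is integrated once, and the warehouse receives everything, directly or via the ODS.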

Figure 1: The Corporate Information Factory (components: Applications, Integration/Transformation, ODS, Enterprise Data Warehouse, Data Marts, Exploration Warehouse, Near Line Storage)

The data warehouse is then fed integrated data from either the integration and transformation layer or the ODS. The data warehouse is the heart of the DSS infrastructure. It is the place where the integrated granular data of the corporation resides. It contains historical data, sometimes up to ten years of it, depending on the business of the corporation. It represents the single "source of truth" for the data of the corporation, and the ultimate basis for reconciling any discrepancies the corporation might have. There is almost always a large volume of data residing in the data warehouse, and that volume grows at a breathtaking rate.

Data emanates from the data warehouse in many directions. Data marts are created from the granular data found in the data warehouse. Data marts reflect departmental views of the corporation: each data mart selects and shapes the granular data to its own needs. Consequently, the data marts are significantly smaller than the data warehouse. As such, they can take advantage of specialized technology such as multi-dimensional and cube technology.

Another extension of the data warehouse is the exploration warehouse. It is built for the explorers of the corporation. By creating a separate facility for explorers, companies avoid disrupting the regular work of the data warehouse. The exploration warehouse environment is best served by technology unique to it.

There is one other important component of the corporate information factory: near line storage. Near line storage exists to house bulk and infrequently used data. It allows the cost of warehousing to be driven down to a relatively small expenditure.
By introducing near line storage into the corporate information factory, the designer is free to take data down to the lowest level of granularity desired.

Copyright, Vality Technology Inc. All rights reserved.

Issues of quality

What, then, are the issues of quality that arise in creating and operating a corporate information factory? The heart of the corporate information factory is the data warehouse. The first major issue of data quality in the corporate information factory is how to ensure the data arrives in the data warehouse with the highest degree of quality. Figure 2 shows that there are three opportunities for ensuring data quality as data is prepared for loading into the data warehouse. Each of these opportunities has its own considerations. In fact, it is recommended that all three be used together for maximum effectiveness.

Figure 2: The Three Opportunities for Quality in the Data Warehouse Environment

These three opportunities are:
1. Cleansing data at the source, the application environment,
2. Cleansing data as it is integrated upon leaving the applications and entering the integration and transformation programs, and
3. Cleansing and auditing data after it has been loaded into the warehouse.

Data cleansing at the application level

At first glance, it appears that the most natural place for assuring data quality is in the application. Data first enters the corporate information factory and is captured in the application. Indeed, the cleaner the data at the point of entry, the better off the corporate information factory. One theory says that if the data is perfectly cleaned at the application level, it need not be cleaned elsewhere. Unfortunately, this is not the case at all.

Several mitigating factors prevent the application from being the panacea for data quality. The first difficulty is the state of the application itself. In many cases, the application is old and undocumented. Applications programmers are legitimately scared to go back into old application code and alter it in any significant way. The fear is that one problem may be fixed, but two others may arise. Fixing one problem might then set off a cascade of other problems, leaving the application worse off than it was before it was maintained.

The second reason why application developers are loath to go back into old code is that they see no benefit in doing so. Application developers focus on immediate requirements, and they see no urgency, or for that matter any motivation, in going back into old code and modifying it to solve someone else's problems. Politics then enters the picture of what is and is not a priority. There is thus both a motivational and an organizational problem in trying to get changes made at the application level.

But even if you could magically and easily do anything you wished at the application level, you would still need to cleanse data elsewhere in the corporate information factory. The reason why cleansing is still necessary elsewhere, even when application data is perfect, is that application data is not integrated. Data may be just fine in the eyes of a single application developer or user. But the data residing in the application still needs to be integrated across the corporate information factory. There is a big difference between cleaning application data and integrating application data. Only AFTER the data comes out of the application is there a need and an opportunity for integrating it. The first opportunity for integration arises as data passes into the integration and transformation layer.
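The gap between cleaning application data and integrating it can be made concrete with a small sketch. Suppose three applications each encode the same attribute under a different convention; each application's data may be perfectly clean in isolation, yet useless for corporate analysis until a layer outside the applications maps every convention to one standard. The application names and encodings below are hypothetical, chosen only for illustration.

```python
# Hypothetical example of integrating divergent encoding conventions:
# three source applications encode the same 'gender' attribute three
# different ways; only a layer outside the applications can unify them.

SOURCE_ENCODINGS = {
    "app_billing": {"m": "M", "f": "F"},
    "app_orders":  {"1": "M", "0": "F"},
    "app_legacy":  {"male": "M", "female": "F"},
}

def standardize_gender(source: str, raw_value: str) -> str:
    """Map a source-specific encoding to the corporate 'M'/'F' standard.
    Unknown values are flagged rather than silently passed through."""
    mapping = SOURCE_ENCODINGS.get(source, {})
    return mapping.get(str(raw_value).strip().lower(), "UNKNOWN")
```

For example, `standardize_gender("app_orders", "1")` and `standardize_gender("app_legacy", "Male")` both yield the corporate value "M", even though each source record was already "clean" by its own application's rules.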
Data cleansing in the integration and transformation layer

Multiple applications pass data into the integration and transformation layer. Each application has its own interpretation of data, as originally specified by the application designer. Keys, attributes, structures, and encoding conventions all differ across the many applications. But in order for the data warehouse to contain integrated data, the many application structures and conventions must be integrated into a single, cohesive set of structures and conventions.

There is thus a complex task in store for the integration and transformation processing. Not only are keys, structures, and encoding conventions different across the many applications, but, in many cases, relationships between data within systems, as well as across systems, are undetected. Legacy information is often buried and floating within free-form text fields such as name and address lines, comment fields, and other fields that have become a storage closet for meanings and relationships not accounted for in the original system. Data relationships may be hidden because the initial systems did not provide a key structure that linked all relevant records; e.g., multiple account numbers might obscure the fact that all the records belong to subsidiaries of the same company. Data anomalies in names, addresses, part descriptions, and account codes are another area to rectify. And inconsistencies between the metadata field definitions and the applications tend to surface over time as the application systems become part of the operational fabric of an organization: e.g., commercial names mixed with personal names, addresses with missing information, truncated information, use of special characters as separators, missing values, abbreviations,

etc. These quality issues can be found in a single set of application data, are multiplied when data is integrated from multiple applications, and can put the effectiveness of the resulting data warehouse at risk for delivering enterprise intelligence.

The result of the tedious and difficult integration and transformation processing is integrated data. And the process of integrating the many applications together is certainly one form of cleansing data. It is noteworthy that this form of cleansing is not possible until the data has passed out of the application. Therefore, there is a second, separate opportunity for data quality beyond cleansing data in the application. But there is also a third place where data quality needs to be addressed: after the data has been loaded into the data warehouse.

Data quality inside the data warehouse

Suppose that you could create perfect applications and perfect integration and transformation programs. Would you still need a data quality facility within the data warehouse itself? The answer is "yes." First of all, as new application data is added to the data warehouse environment, all the integration and transformation layer issues will be re-addressed, and the new data may also uncover more hidden anomalies and relationships even in the warehouse itself. But another key reason is that the data warehouse contains data collected over a spectrum of time. In some cases, the spectrum is as long as ten years. The problem with data collected over time is that the data itself changes over time. In some cases, the changes are slow and subtle. In other cases, they are fast and radical. In any case, it is simply a fact of life that data changes over time. And with these changes comes the need to integrate data over time within the data warehouse after it has already been loaded.
Even if data is entered perfectly from the applications and the integration and transformation programs, there will still be a need to examine data quality inside the data warehouse over time. But has the data remained constant over those years? Hardly.

Figure 3: In a data warehouse, data is loaded into the warehouse over time
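One common remedy for codes whose meaning drifts over the years is an effective-dated mapping: every translation carries the period during which it was valid, so historical warehouse data can be interpreted under the convention in force when it was recorded. The sketch below is an illustration of that idea, not anything prescribed by the paper; the codes and dates are invented.

```python
from datetime import date

# Sketch of an effective-dated code mapping. Each entry translates an
# old account code to its corporate-standard equivalent only within the
# period when that translation was valid. Codes and dates are hypothetical.
CODE_HISTORY = [
    # (old_code, standard_code, valid_from, valid_to)
    ("4100", "REVENUE-DOMESTIC", date(1990, 1, 1), date(1997, 12, 31)),
    ("4100", "REVENUE-EXPORT",   date(1998, 1, 1), date(9999, 12, 31)),
]

def translate(code: str, as_of: date) -> str:
    """Interpret a code under the convention in force on a given date."""
    for old, std, start, stop in CODE_HISTORY:
        if old == code and start <= as_of <= stop:
            return std
    raise KeyError(f"no mapping for {code} as of {as_of}")
```

The same code thus yields different corporate meanings depending on when the underlying record was captured, which is exactly the kind of time-variance the warehouse must account for after loading.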

Figure 4 shows some common changes that have occurred over the years.

Figure 4: There are plenty of examples where data undergoes a fundamental change over time (a standard chart of accounts gives way to SAP; the Franc, Pound, and Peseta give way to the Eurodollar)

In the case shown in Figure 4, there was a standard chart of accounts until 1998, when SAP was brought into the corporation and a new chart of accounts was created. Trying to use the chart of accounts codes from 1996 to 1999 based on data in the warehouse produces very misleading and inaccurate results. As another example, money is measured in the local currency prior to 1999. But in 1999, money is measured in eurodollars. Trying to perform a cash analysis from 1996 to 1999 will be very difficult because the underlying meaning of the data has changed. Therefore, even if data quality has been perfected elsewhere, it remains to be perfected one more time after the data enters the warehouse, simply because data ages inside the warehouse.

Referential integrity in the data warehouse environment

There is another form of data quality that deserves mention: the quality of the relationships among types of data inside the data warehouse. This type of data quality has long been known as "referential integrity." As a simple example of referential integrity in the classical operational environment, consider a common relationship between two elements of data, A and B: the parent/child relationship. In this relationship, when data element A exists and data element B relates to it in a parent/child manner, if A is deleted, then data element B is also deleted by the referential integrity facility. Conversely, data element B cannot be inserted unless the data element A that it relates to also exists.
The facility for referential integrity exists in order to ensure that the relationships defined are held intact by the database management system.
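The parent/child rule just described can be sketched as a pair of checks: the classical operational rule (a child cannot be inserted unless its parent exists), and the time-bounded form the warehouse environment uses, in which the relationship is only enforced within a START TIME / STOP TIME window. The data structures here are deliberately minimal illustrations, not a database implementation.

```python
from datetime import date

# Minimal sketch of referential-integrity checks.
# Classical rule: child B may only be inserted if parent A exists.
# Warehouse variant: the A/B relationship is enforced only between
# a START TIME and a STOP TIME. Structures are illustrative.

parents = {"A"}  # existing parent keys

def can_insert_child(parent_key: str) -> bool:
    """Classical operational check: B cannot be inserted unless A exists."""
    return parent_key in parents

def relationship_enforced(start: date, stop: date, as_of: date) -> bool:
    """Warehouse check: the relationship applies only inside
    the [START TIME, STOP TIME] window."""
    return start <= as_of <= stop

# Suppose the relationship between A and B is valid January 1 to February 15.
START, STOP = date(1999, 1, 1), date(1999, 2, 15)
```

With these definitions, a date such as July 20 falls outside the window and implies no relationship, while a date such as January 18 falls inside it and the relationship is enforced.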

Referential integrity applies to the world of data warehouses just as it applies to operational systems. However, referential integrity is implemented quite differently in the data warehouse environment. There are several reasons why:

1. The volumes of data in the data warehouse are significantly larger than the volumes of data found in the operational environment.
2. Snapshots are created in the data warehouse, whereas updates of data are done in the operational environment.
3. Data in the warehouse represents a spectrum of time, while data in the operational environment is usually taken to be current valued data - that is, data that is current as of the moment of online access.

For these and other reasons, referential integrity in the data warehouse environment is implemented quite differently than it is in the operational environment. As a simple example of the difference, consider the parent/child relationship again. In the data warehouse environment, this relationship would be framed by some parameters of time: a START TIME and a STOP TIME. The relationship between A and B would be valid, say, from January 1 to February 15. The data warehouse referential integrity facility would first check the moment in time being considered. If this moment lies outside of the dates defined for the relationship, say July 20, then there would be no implication of a relationship between A and B. But if the dates in question lie between the START TIME and the STOP TIME, say between January 18 and February 2, then the relationship between A and B would be enforced.

Three places for data quality

It is interesting to compare the three places where the quality of data needs to be addressed in the data warehouse environment. In the application arena, there is the need to see that data is entered and recorded correctly.
Data quality standards for applications include ensuring that data is entered correctly and that information is not buried and floating within free-form fields. Clear routines for data defect detection are critical to ensure that misspellings do not result in duplicate customer or product entries, and that relationships between entities, such as subsidiaries or multiple accounts for a single client, are maintained.

In the integration and transformation layer, it is necessary to see that data has been integrated. In most environments this is the most difficult of all data quality audits. Integrating data involves determining relationships across disparate data files where there are multiple formats, as well as complex matching and consolidation, particularly where there are relationships among non-keyed fields.

And once inside the data warehouse, it is necessary to examine the integration of data over time. In many cases there will be no differences. But where there are differences, there is the question of what to do about the discrepancies.
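The matching-and-consolidation problem above can be illustrated with a deliberately simplified normalized match key: free-form names are reduced to a canonical form so that records that share no key can still be grouped. Real data-quality tools use far more sophisticated parsing and probabilistic matching; this sketch, with invented sample data, only shows the idea.

```python
import re
from collections import defaultdict

# Simplified sketch of matching records on non-keyed, free-form fields.
# A crude normalized key groups likely duplicates; production tools use
# probabilistic parsing and matching. Sample data is hypothetical.

NOISE_WORDS = {"inc", "corp", "co", "ltd"}  # legal-form tokens to drop

def match_key(name: str) -> str:
    """Reduce a free-form company name to a canonical matching key."""
    tokens = re.findall(r"[a-z0-9]+", name.lower())
    return " ".join(t for t in tokens if t not in NOISE_WORDS)

def consolidate(records):
    """Group records whose names normalize to the same key."""
    groups = defaultdict(list)
    for rec in records:
        groups[match_key(rec["name"])].append(rec)
    return groups

groups = consolidate([
    {"name": "Acme Corp.", "account": "1001"},
    {"name": "ACME, Inc",  "account": "2002"},
    {"name": "Beta Ltd",   "account": "3003"},
])
```

Here "Acme Corp." and "ACME, Inc" consolidate under one key even though their account numbers differ, which is exactly the subsidiaries-and-multiple-accounts situation described above.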

As the volume and capacity of the data warehouse grow, massive conversions are not a viable option. Data quality tools such as Vality's INTEGRITY (TM) Data Reengineering Environment use parsing and matching technology to automatically unify and correct disparate formats, creative data entry, spelling and keying anomalies, and undiscovered data inconsistencies. Regular batch runs for cleansing, together with real-time defect detection and correction, can provide a continuing high degree of data integrity.

Figure 5: Quality of data becomes a different issue after it is acquired.

Analytical data quality

It is one thing to ensure that the data residing in the data warehouse is of the highest quality. It is another thing to say that the data used for analysis is also of the highest quality. There is an important split in the corporate information factory that delimits the difference in approach to data quality. Figure 5 shows this line of demarcation: there are two divisions in the corporate information factory in relation to data quality. To the left are the application arena, the integration and transformation arena, and the data warehouse itself. In this arena the objective is to cleanse and purify the data as much as possible. To the right is the data mart and exploration warehouse arena. In this arena there is a choice as to what data is best used for analysis.

In the data mart and exploration warehouse arena, the issue of ensuring quality becomes an issue of ensuring that the right data is being used for analysis. The strongest guarantee that the best data is indeed being used is the analyst himself or herself. The analyst needs to be sure of what the data means, where it came from, and how fresh it is. He or she needs to understand the data intimately in order to use it most effectively, and he or she is responsible for the interpretation of the data.

The best aid the analyst can have is accurate and robust metadata. Figure 6 shows the metadata that can be very useful to the analyst. The metadata that describes the data residing in the different components of the corporate information factory is varied in content. Typically, it contains descriptors for:

Table descriptions
Attribute descriptions
Sources of data
Definitions of data
Relationships of data, and so forth

Figure 6: Upon analysis of the data, metadata becomes the central issue.
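The descriptors listed above can be pictured as a simple metadata record per attribute: the kind of entry an analyst would consult to learn what a field means, where it came from, and what it relates to. The structure and sample values below are a sketch, not any particular repository's format.

```python
from dataclasses import dataclass, field

# Sketch of a metadata entry covering the descriptors listed above:
# table and attribute, source, definition, and relationships.
# The structure and sample values are illustrative only.

@dataclass
class AttributeMetadata:
    table: str
    attribute: str
    source: str                                      # where the data came from
    definition: str                                  # what the data means
    related_to: list = field(default_factory=list)   # relationships to other data

entry = AttributeMetadata(
    table="customer",
    attribute="customer_key",
    source="billing application, via the integration/transformation layer",
    definition="Corporate-standard customer identifier",
    related_to=["order.customer_key"],
)
```

An analyst consulting such an entry can see at a glance the lineage and meaning of a field before building an analysis on it.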

Metadata mining provides an automated means to surface essential business information buried within legacy systems or the data warehouse. This information, unreachable by metadata movement tools, is needed both by data warehouse data modelers and by business users of information systems. Metadata mining is a low-level investigation of operational data. It analyzes each and every data value within each record occurrence in order to assign a data type to each value and to perform entity identification. The ability to process data at the value/instance level is the fundamental prerequisite for solving the type identification, entity identification, and quality problems at the heart of the enterprise information architecture.

Anything that can make the job of the analyst easier and more organized is welcome. Once the analyst has a clear idea of what is available and how one set of data differs from another, he or she is prepared to make the most concise and incisive analysis.

Summary

In order to achieve enterprise intelligence, data quality must be achieved at the data level and at the metadata level. There are three different places for ensuring quality in a data warehouse environment: in the source or application environment; during the integration and transformation stage, when data is moving into the data warehouse; and routinely within the data warehouse itself, in order to address changes in data values over time.

About Bill Inmon

Bill Inmon is widely recognized as the father of the data warehouse concept. He has more than 26 years of database technology management experience and data warehouse design expertise. He has published 36 books and more than 350 articles in the major computer journals. Bill is also the author of DM Review magazine's "Information Management: Charting the Course" column. Before founding Pine Cone Systems, Bill was a co-founder of Prism Solutions, Inc. Mr.
Inmon is responsible for the high-level design of Pine Cone products, as well as for the architecture of planned and future products. Mr. Inmon has consulted with a large number of Fortune 1000 clients, offering data warehouse design and database management services.

About Vality

Vality Technology is the industry's leading supplier of data standardization and matching software and consulting services for enterprise intelligence. Our customers are Global 5000 corporations in finance, healthcare, insurance, manufacturing, retail, telecommunications, energy, and utilities. These companies rely on our flagship product, the INTEGRITY Data Re-engineering Environment, to help them uncover patterns and relationships within their data and to optimize their strategic information assets in areas such as data quality, data warehouse and business intelligence systems, ERP conversions, enterprise relationship management systems, and electronic commerce.

For more information

For more information about Vality Technology Inc. and the INTEGRITY Data Re-engineering Environment, please call or visit the Vality Web site.


More information

The Quality Data Warehouse: Solving Problems for the Enterprise

The Quality Data Warehouse: Solving Problems for the Enterprise The Quality Data Warehouse: Solving Problems for the Enterprise Bradley W. Klenz, SAS Institute Inc., Cary NC Donna O. Fulenwider, SAS Institute Inc., Cary NC ABSTRACT Enterprise quality improvement is

More information

HYPERION MASTER DATA MANAGEMENT SOLUTIONS FOR IT

HYPERION MASTER DATA MANAGEMENT SOLUTIONS FOR IT HYPERION MASTER DATA MANAGEMENT SOLUTIONS FOR IT POINT-AND-SYNC MASTER DATA MANAGEMENT 04.2005 Hyperion s new master data management solution provides a centralized, transparent process for managing critical

More information

Data Mining for Successful Healthcare Organizations

Data Mining for Successful Healthcare Organizations Data Mining for Successful Healthcare Organizations For successful healthcare organizations, it is important to empower the management and staff with data warehousing-based critical thinking and knowledge

More information

DATA MINING AND WAREHOUSING CONCEPTS

DATA MINING AND WAREHOUSING CONCEPTS CHAPTER 1 DATA MINING AND WAREHOUSING CONCEPTS 1.1 INTRODUCTION The past couple of decades have seen a dramatic increase in the amount of information or data being stored in electronic format. This accumulation

More information

Integrating IBM Cognos TM1 with Oracle General Ledger

Integrating IBM Cognos TM1 with Oracle General Ledger Integrating IBM Cognos TM1 with Oracle General Ledger Highlights Streamlines the data integration process for fast and precise data loads. Enables planners to drill back into transactional data for the

More information

Master Data Management and Data Warehousing. Zahra Mansoori

Master Data Management and Data Warehousing. Zahra Mansoori Master Data Management and Data Warehousing Zahra Mansoori 1 1. Preference 2 IT landscape growth IT landscapes have grown into complex arrays of different systems, applications, and technologies over the

More information

MDM and Data Warehousing Complement Each Other

MDM and Data Warehousing Complement Each Other Master Management MDM and Warehousing Complement Each Other Greater business value from both 2011 IBM Corporation Executive Summary Master Management (MDM) and Warehousing (DW) complement each other There

More information

Data Warehouse Overview. Srini Rengarajan

Data Warehouse Overview. Srini Rengarajan Data Warehouse Overview Srini Rengarajan Please mute Your cell! Agenda Data Warehouse Architecture Approaches to build a Data Warehouse Top Down Approach Bottom Up Approach Best Practices Case Example

More information

A Knowledge Management Framework Using Business Intelligence Solutions

A Knowledge Management Framework Using Business Intelligence Solutions www.ijcsi.org 102 A Knowledge Management Framework Using Business Intelligence Solutions Marwa Gadu 1 and Prof. Dr. Nashaat El-Khameesy 2 1 Computer and Information Systems Department, Sadat Academy For

More information

Information Systems and Technologies in Organizations

Information Systems and Technologies in Organizations Information Systems and Technologies in Organizations Information System One that collects, processes, stores, analyzes, and disseminates information for a specific purpose Is school register an information

More information

Five Technology Trends for Improved Business Intelligence Performance

Five Technology Trends for Improved Business Intelligence Performance TechTarget Enterprise Applications Media E-Book Five Technology Trends for Improved Business Intelligence Performance The demand for business intelligence data only continues to increase, putting BI vendors

More information

Introduction to Datawarehousing

Introduction to Datawarehousing DIPARTIMENTO DI INGEGNERIA INFORMATICA AUTOMATICA E GESTIONALE ANTONIO RUBERTI Master of Science in Engineering in Computer Science (MSE-CS) Seminars in Software and Services for the Information Society

More information

Dimensional Modeling and E-R Modeling In. Joseph M. Firestone, Ph.D. White Paper No. Eight. June 22, 1998

Dimensional Modeling and E-R Modeling In. Joseph M. Firestone, Ph.D. White Paper No. Eight. June 22, 1998 1 of 9 5/24/02 3:47 PM Dimensional Modeling and E-R Modeling In The Data Warehouse By Joseph M. Firestone, Ph.D. White Paper No. Eight June 22, 1998 Introduction Dimensional Modeling (DM) is a favorite

More information

Databases and Information Management

Databases and Information Management Databases and Information Management Reading: Laudon & Laudon chapter 5 Additional Reading: Brien & Marakas chapter 3-4 COMP 5131 1 Outline Database Approach to Data Management Database Management Systems

More information

INFO 1400. Koffka Khan. Tutorial 6

INFO 1400. Koffka Khan. Tutorial 6 INFO 1400 Koffka Khan Tutorial 6 Running Case Assignment: Improving Decision Making: Redesigning the Customer Database Dirt Bikes U.S.A. sells primarily through its distributors. It maintains a small customer

More information

What You Don t Know Does Hurt You: Five Critical Risk Factors in Data Warehouse Quality. An Infogix White Paper

What You Don t Know Does Hurt You: Five Critical Risk Factors in Data Warehouse Quality. An Infogix White Paper What You Don t Know Does Hurt You: Five Critical Risk Factors in Data Warehouse Quality Executive Summary Data warehouses are becoming increasingly large, increasingly complex and increasingly important

More information

Foundations of Business Intelligence: Databases and Information Management

Foundations of Business Intelligence: Databases and Information Management Foundations of Business Intelligence: Databases and Information Management Wienand Omta Fabiano Dalpiaz 1 drs. ing. Wienand Omta Learning Objectives Describe how the problems of managing data resources

More information

www.ducenit.com Analance Data Integration Technical Whitepaper

www.ducenit.com Analance Data Integration Technical Whitepaper Analance Data Integration Technical Whitepaper Executive Summary Business Intelligence is a thriving discipline in the marvelous era of computing in which we live. It s the process of analyzing and exploring

More information

INFORMATION MANAGEMENT. Transform Your Information into a Strategic Asset

INFORMATION MANAGEMENT. Transform Your Information into a Strategic Asset INFORMATION MANAGEMENT Transform Your Information into a Strategic Asset The information explosion in all organizations has created a challenge and opportunity for enterprises. When properly managed, information

More information

Data Warehousing and Data Mining in Business Applications

Data Warehousing and Data Mining in Business Applications 133 Data Warehousing and Data Mining in Business Applications Eesha Goel CSE Deptt. GZS-PTU Campus, Bathinda. Abstract Information technology is now required in all aspect of our lives that helps in business

More information

Data Warehouse Testing

Data Warehouse Testing Data Warehouse Testing Manoj Philip Mathen Abstract Exhaustive testing of a Data warehouse during its design and on an ongoing basis (for the incremental activities) comprises Data warehouse testing. This

More information

Data Quality Assessment. Approach

Data Quality Assessment. Approach Approach Prepared By: Sanjay Seth Data Quality Assessment Approach-Review.doc Page 1 of 15 Introduction Data quality is crucial to the success of Business Intelligence initiatives. Unless data in source

More information

Build an effective data integration strategy to drive innovation

Build an effective data integration strategy to drive innovation IBM Software Thought Leadership White Paper September 2010 Build an effective data integration strategy to drive innovation Five questions business leaders must ask 2 Build an effective data integration

More information

Speeding ETL Processing in Data Warehouses White Paper

Speeding ETL Processing in Data Warehouses White Paper Speeding ETL Processing in Data Warehouses White Paper 020607dmxwpADM High-Performance Aggregations and Joins for Faster Data Warehouse Processing Data Processing Challenges... 1 Joins and Aggregates are

More information

Accenture Human Capital Management Solutions. Transforming people and process to achieve high performance

Accenture Human Capital Management Solutions. Transforming people and process to achieve high performance Accenture Human Capital Management Solutions Transforming people and process to achieve high performance The sophistication of our products and services requires the expertise of a special and talented

More information

Data Profiling and Mapping The Essential First Step in Data Migration and Integration Projects

Data Profiling and Mapping The Essential First Step in Data Migration and Integration Projects Data Profiling and Mapping The Essential First Step in Data Migration and Integration Projects An Evoke Software White Paper Summary At any given time, according to industry analyst estimates, roughly

More information

Data Management Roadmap

Data Management Roadmap Data Management Roadmap A progressive approach towards building an Information Architecture strategy 1 Business and IT Drivers q Support for business agility and innovation q Faster time to market Improve

More information

ORACLE ENTERPRISE DATA QUALITY PRODUCT FAMILY

ORACLE ENTERPRISE DATA QUALITY PRODUCT FAMILY ORACLE ENTERPRISE DATA QUALITY PRODUCT FAMILY The Oracle Enterprise Data Quality family of products helps organizations achieve maximum value from their business critical applications by delivering fit

More information

Realizing the True Power of Insurance Data: An Integrated Approach to Legacy Replacement and Business Intelligence

Realizing the True Power of Insurance Data: An Integrated Approach to Legacy Replacement and Business Intelligence Realizing the True Power of Insurance Data: An Integrated Approach to Legacy Replacement and Business Intelligence Featuring as an example: Guidewire DataHub TM and Guidewire InfoCenter TM An Author: Mark

More information

Integrating SAP and non-sap data for comprehensive Business Intelligence

Integrating SAP and non-sap data for comprehensive Business Intelligence WHITE PAPER Integrating SAP and non-sap data for comprehensive Business Intelligence www.barc.de/en Business Application Research Center 2 Integrating SAP and non-sap data Authors Timm Grosser Senior Analyst

More information

Scalable Enterprise Data Integration Your business agility depends on how fast you can access your complex data

Scalable Enterprise Data Integration Your business agility depends on how fast you can access your complex data Transforming Data into Intelligence Scalable Enterprise Data Integration Your business agility depends on how fast you can access your complex data Big Data Data Warehousing Data Governance and Quality

More information

Technology in Action. Alan Evans Kendall Martin Mary Anne Poatsy. Eleventh Edition. Copyright 2015 Pearson Education, Inc.

Technology in Action. Alan Evans Kendall Martin Mary Anne Poatsy. Eleventh Edition. Copyright 2015 Pearson Education, Inc. Copyright 2015 Pearson Education, Inc. Technology in Action Alan Evans Kendall Martin Mary Anne Poatsy Eleventh Edition Copyright 2015 Pearson Education, Inc. Technology in Action Chapter 9 Behind the

More information

An Introduction to Data Warehousing. An organization manages information in two dominant forms: operational systems of

An Introduction to Data Warehousing. An organization manages information in two dominant forms: operational systems of An Introduction to Data Warehousing An organization manages information in two dominant forms: operational systems of record and data warehouses. Operational systems are designed to support online transaction

More information

Data Quality Assurance

Data Quality Assurance CHAPTER 4 Data Quality Assurance The previous chapters define accurate data. They talk about the importance of data and in particular the importance of accurate data. They describe how complex the topic

More information

While people are often a corporation s true intellectual property, data is what

While people are often a corporation s true intellectual property, data is what While people are often a corporation s true intellectual property, data is what feeds the people, enabling employees to see where the company stands and where it will go. Quick access to quality data helps

More information

Implementing Oracle BI Applications during an ERP Upgrade

Implementing Oracle BI Applications during an ERP Upgrade 1 Implementing Oracle BI Applications during an ERP Upgrade Jamal Syed Table of Contents TABLE OF CONTENTS... 2 Executive Summary... 3 Planning an ERP Upgrade?... 4 A Need for Speed... 6 Impact of data

More information

White Paper. An Overview of the Kalido Data Governance Director Operationalizing Data Governance Programs Through Data Policy Management

White Paper. An Overview of the Kalido Data Governance Director Operationalizing Data Governance Programs Through Data Policy Management White Paper An Overview of the Kalido Data Governance Director Operationalizing Data Governance Programs Through Data Policy Management Managing Data as an Enterprise Asset By setting up a structure of

More information

Making Business Intelligence Easy. White Paper Spreadsheet reporting within a BI framework

Making Business Intelligence Easy. White Paper Spreadsheet reporting within a BI framework Making Business Intelligence Easy White Paper Spreadsheet reporting within a BI framework Contents Overview...4 What is spreadsheet reporting and why does it exist?...5 Risks and issues with spreadsheets

More information

www.sryas.com Analance Data Integration Technical Whitepaper

www.sryas.com Analance Data Integration Technical Whitepaper Analance Data Integration Technical Whitepaper Executive Summary Business Intelligence is a thriving discipline in the marvelous era of computing in which we live. It s the process of analyzing and exploring

More information

Issues in Information Systems Volume 15, Issue II, pp. 133-140, 2014

Issues in Information Systems Volume 15, Issue II, pp. 133-140, 2014 MOVING FROM TRADITIONAL DATA WAREHOUSE TO ENTERPRISE DATA MANAGEMENT: A CASE STUDY Amit Pandey, Robert Morris University, axpst29@mail.rmu.edu Sushma Mishra, Robert Morris University, mishra@rmu.edu ABSTRACT

More information

Service Oriented Data Management

Service Oriented Data Management Service Oriented Management Nabin Bilas Integration Architect Integration & SOA: Agenda Integration Overview 5 Reasons Why Is Critical to SOA Oracle Integration Solution Integration

More information

dxhub Denologix MDM Solution Page 1

dxhub Denologix MDM Solution Page 1 Most successful large organizations are organized by lines of business (LOB). This has been a very successful way to organize for the accountability of profit and loss. It gives LOB leaders autonomy to

More information

Making Business Intelligence Easy. Whitepaper Measuring data quality for successful Master Data Management

Making Business Intelligence Easy. Whitepaper Measuring data quality for successful Master Data Management Making Business Intelligence Easy Whitepaper Measuring data quality for successful Master Data Management Contents Overview... 3 What is Master Data Management?... 3 Master Data Modeling Approaches...

More information

C A S E S T UDY The Path Toward Pervasive Business Intelligence at an International Financial Institution

C A S E S T UDY The Path Toward Pervasive Business Intelligence at an International Financial Institution C A S E S T UDY The Path Toward Pervasive Business Intelligence at an International Financial Institution Sponsored by: Tata Consultancy Services October 2008 SUMMARY Global Headquarters: 5 Speen Street

More information

Next Generation Business Performance Management Solution

Next Generation Business Performance Management Solution Next Generation Business Performance Management Solution Why Existing Business Intelligence (BI) Products are Inadequate Changing Business Environment In the face of increased competition, complex customer

More information

www.ijreat.org Published by: PIONEER RESEARCH & DEVELOPMENT GROUP (www.prdg.org) 28

www.ijreat.org Published by: PIONEER RESEARCH & DEVELOPMENT GROUP (www.prdg.org) 28 Data Warehousing - Essential Element To Support Decision- Making Process In Industries Ashima Bhasin 1, Mr Manoj Kumar 2 1 Computer Science Engineering Department, 2 Associate Professor, CSE Abstract SGT

More information

[callout: no organization can afford to deny itself the power of business intelligence ]

[callout: no organization can afford to deny itself the power of business intelligence ] Publication: Telephony Author: Douglas Hackney Headline: Applied Business Intelligence [callout: no organization can afford to deny itself the power of business intelligence ] [begin copy] 1 Business Intelligence

More information

An Enterprise Data Hub, the Next Gen Operational Data Store

An Enterprise Data Hub, the Next Gen Operational Data Store An Enterprise Data Hub, the Next Gen Operational Data Store Version: 101 Table of Contents Summary 3 The ODS in Practice 4 Drawbacks of the ODS Today 5 The Case for ODS on an EDH 5 Conclusion 6 About the

More information

Architecting an Industrial Sensor Data Platform for Big Data Analytics

Architecting an Industrial Sensor Data Platform for Big Data Analytics Architecting an Industrial Sensor Data Platform for Big Data Analytics 1 Welcome For decades, organizations have been evolving best practices for IT (Information Technology) and OT (Operation Technology).

More information

Whitepaper. Data Warehouse/BI Testing Offering YOUR SUCCESS IS OUR FOCUS. Published on: January 2009 Author: BIBA PRACTICE

Whitepaper. Data Warehouse/BI Testing Offering YOUR SUCCESS IS OUR FOCUS. Published on: January 2009 Author: BIBA PRACTICE YOUR SUCCESS IS OUR FOCUS Whitepaper Published on: January 2009 Author: BIBA PRACTICE 2009 Hexaware Technologies. All rights reserved. Table of Contents 1. 2. Data Warehouse - Typical pain points 3. Hexaware

More information

PARALLEL PROCESSING AND THE DATA WAREHOUSE

PARALLEL PROCESSING AND THE DATA WAREHOUSE PARALLEL PROCESSING AND THE DATA WAREHOUSE BY W. H. Inmon One of the essences of the data warehouse environment is the accumulation of and the management of large amounts of data. Indeed, it is said that

More information

CIC Audit Review: Experian Data Quality Enterprise Integrations. Guidance for maximising your investment in enterprise applications

CIC Audit Review: Experian Data Quality Enterprise Integrations. Guidance for maximising your investment in enterprise applications CIC Audit Review: Experian Data Quality Enterprise Integrations Guidance for maximising your investment in enterprise applications February 2014 Table of contents 1. Challenge Overview 03 1.1 Experian

More information

Three Fundamental Techniques To Maximize the Value of Your Enterprise Data

Three Fundamental Techniques To Maximize the Value of Your Enterprise Data Three Fundamental Techniques To Maximize the Value of Your Enterprise Data Prepared for Talend by: David Loshin Knowledge Integrity, Inc. October, 2010 2010 Knowledge Integrity, Inc. 1 Introduction Organizations

More information

Life Cycle of a Data Warehousing Project in Healthcare

Life Cycle of a Data Warehousing Project in Healthcare Life Cycle of a Data Warehousing Project in Healthcare Ravi Verma, Jeannette Harper ABSTRACT Hill Physicians Medical Group (and its medical management firm, PriMed Management) early on recognized the need

More information

Original Research Articles

Original Research Articles Original Research Articles Researchers Sweety Patel Department of Computer Science, Fairleigh Dickinson University, USA Email- sweetu83patel@yahoo.com Different Data Warehouse Architecture Creation Criteria

More information

BUILDING BLOCKS OF DATAWAREHOUSE. G.Lakshmi Priya & Razia Sultana.A Assistant Professor/IT

BUILDING BLOCKS OF DATAWAREHOUSE. G.Lakshmi Priya & Razia Sultana.A Assistant Professor/IT BUILDING BLOCKS OF DATAWAREHOUSE G.Lakshmi Priya & Razia Sultana.A Assistant Professor/IT 1 Data Warehouse Subject Oriented Organized around major subjects, such as customer, product, sales. Focusing on

More information

Getting started with a data quality program

Getting started with a data quality program IBM Software White Paper Information Management Getting started with a data quality program 2 Getting started with a data quality program The data quality challenge Organizations depend on quality data

More information

Implementing Oracle BI Applications during an ERP Upgrade

Implementing Oracle BI Applications during an ERP Upgrade Implementing Oracle BI Applications during an ERP Upgrade Summary Jamal Syed BI Practice Lead Emerging solutions 20 N. Wacker Drive Suite 1870 Chicago, IL 60606 Emerging Solutions, a professional services

More information

A Design Technique: Data Integration Modeling

A Design Technique: Data Integration Modeling C H A P T E R 3 A Design Technique: Integration ing This chapter focuses on a new design technique for the analysis and design of data integration processes. This technique uses a graphical process modeling

More information

Business Intelligence Solutions. Cognos BI 8. by Adis Terzić

Business Intelligence Solutions. Cognos BI 8. by Adis Terzić Business Intelligence Solutions Cognos BI 8 by Adis Terzić Fairfax, Virginia August, 2008 Table of Content Table of Content... 2 Introduction... 3 Cognos BI 8 Solutions... 3 Cognos 8 Components... 3 Cognos

More information

Master Data Management. Zahra Mansoori

Master Data Management. Zahra Mansoori Master Data Management Zahra Mansoori 1 1. Preference 2 A critical question arises How do you get from a thousand points of data entry to a single view of the business? We are going to answer this question

More information

IBM Software A Journey to Adaptive MDM

IBM Software A Journey to Adaptive MDM IBM Software A Journey to Adaptive MDM What is Master Data? Why is it Important? A Journey to Adaptive MDM Contents 2 MDM Business Drivers and Business Value 4 MDM is a Journey 7 IBM MDM Portfolio An Adaptive

More information

AMSTERDAM BOSTON HEIDELBERG LONDON NEW YORK OXFORD PARIS SAN DIEGO SAN FRANCISCO SINGAPORE SYDNEY TOKYO

AMSTERDAM BOSTON HEIDELBERG LONDON NEW YORK OXFORD PARIS SAN DIEGO SAN FRANCISCO SINGAPORE SYDNEY TOKYO DW2.0 The Architecture for the Next Generation of Data Warehousing W. H. Inmon Forest Rim Technology Derek Strauss Gavroshe Genia Neushloss Gavroshe AMSTERDAM BOSTON HEIDELBERG LONDON NEW YORK OXFORD PARIS

More information

JOURNAL OF OBJECT TECHNOLOGY

JOURNAL OF OBJECT TECHNOLOGY JOURNAL OF OBJECT TECHNOLOGY Online at www.jot.fm. Published by ETH Zurich, Chair of Software Engineering JOT, 2008 Vol. 7, No. 8, November-December 2008 What s Your Information Agenda? Mahesh H. Dodani,

More information

Business Intelligence Engineer Position Description

Business Intelligence Engineer Position Description Business Intelligence Position Description February 9, 2015 Position Description February 9, 2015 Page i Table of Contents General Characteristics... 1 Career Path... 2 Explanation of Proficiency Level

More information

!"#$%&&'(#)*+,+#*-#./(0/1#2'3*4,%(5#%(#678'1# /(&#9:/,#;*0#)/(#<*#/=*0,#>:'?# !"#$%&'()%*&

!#$%&&'(#)*+,+#*-#./(0/1#2'3*4,%(5#%(#678'1# /(&#9:/,#;*0#)/(#<*#/=*0,#>:'?# !#$%&'()%*& !"#$%&&'(#)*+,+#*-#./(0/1#2'3*4,%(5#%(#678'1# /(&#9:/,#;*0#)/(#:'?#!"#$%&'()%*& @0??/4A# Since it was released in 1985, Excel has been the standard for business reporting and analysis. With each

More information

Quality Data for Your Information Infrastructure

Quality Data for Your Information Infrastructure SAP Product Brief SAP s for Small Businesses and Midsize Companies SAP Data Quality Management, Edge Edition Objectives Quality Data for Your Information Infrastructure Data quality management for confident

More information

Making confident decisions with the full spectrum of analysis capabilities

Making confident decisions with the full spectrum of analysis capabilities IBM Software Business Analytics Analysis Making confident decisions with the full spectrum of analysis capabilities Making confident decisions with the full spectrum of analysis capabilities Contents 2

More information

Enable Business Agility and Speed Empower your business with proven multidomain master data management (MDM)

Enable Business Agility and Speed Empower your business with proven multidomain master data management (MDM) Enable Business Agility and Speed Empower your business with proven multidomain master data management (MDM) Customer Viewpoint By leveraging a well-thoughtout MDM strategy, we have been able to strengthen

More information

Open Source on the Trading Desk

Open Source on the Trading Desk Open Source on the Trading Desk Stephen Ferrando dbconcert, Inc. January 21, 2009 1 Open Source on the Trading Desk Table of Contents Table of Contents... 2 Introduction... 3 Business Intelligence and

More information

POLAR IT SERVICES. Business Intelligence Project Methodology

POLAR IT SERVICES. Business Intelligence Project Methodology POLAR IT SERVICES Business Intelligence Project Methodology Table of Contents 1. Overview... 2 2. Visualize... 3 3. Planning and Architecture... 4 3.1 Define Requirements... 4 3.1.1 Define Attributes...

More information