DATA QUALITY DATA BASE QUALITY INFORMATION SYSTEM QUALITY


The content of this document is the exclusive property of REVER. It is provided for information only and should in no case be considered a commitment of REVER. Any use, including referencing part or all of the document, requires prior written agreement from REVER. REVER SA, Belgique.

Table of contents

1 Introduction
2 Facts
3 Encountered difficulties
4 REVER's solutions
4.1 The REVER approach
4.2 Major results provided by REVER
4.2.1 Data quality
4.2.2 Database quality
Measures on the structures
Measures on the database, as used by the programs
4.2.3 Information systems quality
5 The added values of REVER
6 To know more

Service Marketing 11/06/2008 REVER-SO10 Page 2 / 17

1 Introduction

To assist the reading of this document, it is worth recalling some definitions:

- A data item is a succession of characters representing an elementary object of an information system; as such, it has no particular meaning. In computerized information systems, data are stored in electronic files.
- An information is a data item, or a set of data, endowed with semantics.
- A database is a set of files allowing the storage of a great quantity of data in order to manage them (addition, update, query). This set of files is structured and organized to provide the users with the information needed to cover their domain of activity. To reach this objective, the availability of a database is not enough: programs are needed to process the data and to provide the interface with the users. This whole (database and programs) is called an application.
- An information system is an organized set of resources (personnel, databases, procedures, hardware, software...) allowing an organization to acquire, store, structure and communicate information (as text, images, sounds and coded company data). The goal is to coordinate these activities so that the organization can reach its objectives. The information system is the vehicle of communication in the organization.

The quality of its information system is a major challenge for an organization: without credible, relevant and coherent information, it is difficult to act daily and, a fortiori, to make the important decisions that are vital for the organization. Given this strategic stake, data and database quality measures are essential elements in evaluating information system quality.

These definitions illustrate the important differences between the concepts: the quality of a piece of information and/or of an information system is a concern for the organization and its users, while the quality of data and databases is a concern for the IT department.
2 Facts

Quality problems are, in fact, daily matters:

- Users face wrong or contradictory data (typically, two screens giving two different addresses for the same customer);
- Programs abort due to incoherent data;
- A data migration is blocked at load time because the source data have formats or types incompatible with the structure and rules of the target base;
- Wrong results are communicated to customers and have to be corrected later by sending them mail.

Looking at these examples, the diversity of situations, multiplied by the frequency of "anomalies" and/or technical crashes, leads to undesired consequences:

- Unforeseen workload leading to budget overruns (sometimes important ones);
- Delays in application delivery, with direct and/or indirect consequences on the activities of the organization;
- Loss of user confidence in the IT system.

Organizations long considered that this type of trouble had a minor impact on their operations and, as a consequence, tolerated it. The situation has, however, drastically evolved in recent years: automated information systems have become, over time, the central tool for operating an organization. Any stoppage, delay or error in the data leads to damages that can be important, in particular when these dysfunctions have a direct impact on the customer base. In this context, and being conscious of the importance of the challenge, strong actions are needed.

3 Encountered difficulties

While the dysfunctions are easy to report, carrying out solutions is unfortunately not simple. The reasons below are not exhaustive:

- Organizational reasons: Which department (or persons) is responsible for quality? Which objectives were assigned to it? What quality level does the Management expect (no stoppage, no incoherence...)? What budgets are available to achieve quality? What are the mission and the authority of the quality team? Is it only a technical one?
- Operational reasons: Where to start? With the information system? With the technical aspects? Is there a data "dictionary" of the organization? If yes, is it up to date? If not, how to build it? Considering that data are in a permanent and continuous flow, entering and leaving the organization, how can the obtained results be perpetuated? What are the technical solutions and/or available products? Are they compatible with the technological platforms?
Anyone will understand that this is not simple. It is even more complex when analyzing the different solutions on the market. Regarding data quality, two main trends can currently be distinguished, depending on the angle of approach (through the solutions, or through the results). The first trend is expressed in a Gartner Group document ("Magic Quadrant for Data Quality Tools", 29 June 2007) and is summarized in the following table:

Table 1: Gartner Group

- Parsing and standardization: decomposition of text fields into component parts and formatting of values into consistent layouts, based on industry standards, local standards (for example, postal authority standards for address data), user-defined business rules, and knowledge bases of values and patterns.
- Generalized cleansing: modification of data values to meet domain restrictions, integrity constraints or other business rules that define sufficient data quality for the organization.
- Matching: identification, linking or merging of related entries within or across sets of data.
- Profiling: analysis of data to capture statistics (metadata) that provide insight into the quality of the data and aid in the identification of data quality issues.
- Monitoring: deployment of controls to ensure ongoing conformance of data to the business rules that define data quality for the organization.
- Enrichment: enhancing the value of internally held data by appending related attributes from external sources (for example, consumer demographic attributes or geographic descriptors).

The second trend is expressed by the ISO standardization institute (ISO/IEC JTC1/SC7 3792, 28 June 2007). In its approach, ISO provides 16 measured characteristics and distinguishes, for each of them, between inherent and extended data quality. More precisely:

IN = Inherent data quality: the capability of data to satisfy stated and implied needs when data are used under specified conditions, independently of other system components. Inherent data quality refers to the data itself; it provides the criteria to ensure and verify the quality of data taking into account:
- data domain values and possible restrictions (e.g. business rules governing the currentness, accuracy, precision, etc. required for a given application);
- metadata;
- relationships between data (e.g. integrity constraints).

EX = Extended data quality: the extent to which data can satisfy stated and implied needs when data are used under specified conditions, using some capabilities of computer system components. Extended data quality refers to properties inherited from the technological environment through the implementation of system capabilities; it provides criteria to ensure and verify the quality of data for:
- data acquisition (e.g. data entry, loading and updating in accordance with the parsing, enriching and transforming rules of the data management);
- data management (e.g. backup/restore capabilities in terms of capacity and store/retrieve times);
- data access (e.g. obscuring, for security purposes, the first twelve digits of a credit card number on a purchase receipt).
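The "data access" criterion just described (obscuring most digits of a card number on a receipt) can be sketched in a few lines. This is an illustration only; the function name and format are invented, not part of any standard or REVER tool:

```python
def mask_card_number(card_number: str, visible: int = 4) -> str:
    """Obscure all but the last `visible` digits, as on a purchase receipt."""
    digits = card_number.replace(" ", "")
    return "*" * (len(digits) - visible) + digits[-visible:]

print(mask_card_number("4532 0151 1283 0366"))  # ************0366
```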

Table 2: ISO norm characteristics

- Consistency (IN): the extent to which data are coherent with other data in the same context of use. Inconsistency can be verified on the same or on different entities.
- Currentness (IN): the extent to which the data are of the right age. Currentness is critical for volatile data.
- Completeness (IN): the extent to which the subject data associated with an entity have values for all expected attributes and related entity occurrences in a specific context of use. Completeness also includes the capability of data to represent the context observed by users.
- Precision (IN, EX): the extent to which the data provide the depth of information needed.
- Accuracy (IN): the extent to which the data correctly represent an attribute of a real-world object, concept or event. Accuracy has two main aspects: syntactic accuracy, the closeness of the data values to a set of values defined in a domain considered syntactically correct; and semantic accuracy, the closeness of the data values to a set of values defined in a domain considered semantically correct.
- Confidentiality (IN, EX): the extent to which data can be accessed and interpreted only by authorized users.
- Availability (EX): the extent to which data are retrievable by authorized users.
- Recoverability (EX): the extent to which data maintain and preserve a specified level of operations and their physical and logical integrity, even in the event of failure.
- Understandability (IN, EX): the extent to which the data, stored in their native format, can be read and easily interpreted by users, and are expressed in appropriate languages, symbols and units.
- Efficiency (IN, EX): the extent to which data can be processed (accessed, acquired, updated, managed, used, etc.) and provide the expected levels of performance using the appropriate amounts and types of resources under stated conditions.
- Changeability (EX): the extent to which data can be modified, for instance modification of their type, length or assigned value.
- Portability (EX): the extent to which data can be moved from one platform to another; this also includes the possibility to install and replace data on the destination platform.
- Traceability (IN, EX): the extent to which data provide an audit trail of the origin of the data and of any changes made to them.
- Credibility (IN): the extent to which data are regarded as true and believable by users.
- Accessibility (IN, EX): the extent to which data can be reached, particularly by people who need supporting technology or a special configuration because of some disability.
- Compliance (IN, EX): the extent to which the data adhere to standards, conventions or regulations in force and similar rules relating to data quality.

4 REVER's solutions

Recall that REVER's activities are based on a Model Driven Data Engineering (MDDE) approach. In particular, the methods and technologies of REVER allow an

application (or set of applications) data model to be reconstructed (physical, logical and semantic). For REVER, a data model contains:

- the data structures (entities and attributes);
- the relations linking the entities with each other;
- the data rules, i.e. the business rules whose violation creates incoherence in persistent objects.

The modeling tools allow, at any time, elements of the model to be added or modified, in particular the data rules. In the same way, note that, from a modeling point of view, it is possible to construct a complete model of the organization's information system by placing side by side, and linking, the data models of each application that is part of the information system.

4.1 The REVER approach

In an MDDE approach, as proposed by REVER, it is natural to approach data quality problems in two phases:

- first, measure the data quality against the data model which structured the application(s);
- second, evaluate the capacity of the data model to meet the users' and/or organization's requirements.

More precisely, as for any qualitative measure, measuring the quality of an information system, of a database, or of data requires one or more reference elements against which the measure can be appreciated. In this context, and as shown in the schema below:

- the information system quality has to be measured against the users' and organization's needs;
- the database quality has to be measured against its use by the programs, according to criteria of complexity, performance, capacity to evolve, etc.;
- the data quality has to be measured against the application data models and, in particular, the data rules of the IT application.

From this point of view, the solutions and technologies proposed by REVER complement the existing solutions rather than substituting for them. This allows the production of qualitatively better results, faster and at lower cost. To illustrate this complementarity, let us take the two following examples:

- The splitting of a data zone into elementary fields. It is frequent (mainly in applications using non-relational databases) that one or more zones are defined globally rather than as elementary attributes (typically, an address field described in the database as ADDRESS, alphanumeric, length 163). To know whether a more precise structure exists (e.g. NUMBER, numeric, length 5; STREET, alphanumeric, length 100; COUNTRY, alphanumeric, length 3; POSTAL CODE, numeric, length 5; CITY, alphanumeric, length 50), "data profiling" tools perform a statistical analysis of the data and report the different formats found during the analysis. Obviously, in such an approach, if an important proportion (say more than 30%) of the data does not respect these formats, it will be difficult to appreciate the data quality. In an MDDE approach, as proposed by REVER, the principle is radically different: the exhaustive analysis of all program source code will reveal the existence of at least one program (for example, one containing an address data-entry screen) which takes the splitting into account. If so, the splitting used by the program becomes part of the model. Later on, the data validation programs generated from the model make it possible to know, in detail, which "addresses" do not respect the splitting rules. Of course, if no program reveals the splitting rules, then the data profiling

technique, by its capacity to analyze character strings, brings an added value which cannot be provided by the models.

- Many programming languages allow "redefinitions" of fields (REDEFINES in COBOL, for example). This mechanism aims at saving room in the database by authorizing the use of the same physical location for different, mutually exclusive concepts. For example, in a record concerning "persons", the field "name" will be used to store company names when the record describes a legal entity; it will be redefined as two fields, "name" and "birth name", for physical persons (with a rule saying that the "birth name" zone is only used for married women). Clearly, in these circumstances, an indicator necessarily exists in the record (in our example, a "type of person" zone) which lets the programs know which structure has to be taken into consideration. In an MDDE approach, all these redefinitions are recovered during the analysis of the program source code and integrated into the model (the schema below shows the modeling of an application in which a single field represents 77 different concepts). As a consequence, the data analysis takes into account, for each record of the structure, the concept which is stored. Without precise indication of the existing concepts, a data profiling tool, in these circumstances, cannot deliver relevant results regarding data quality.

This example clarifies the main added value of MDDE in assessing data quality, database quality and, more generally, the quality of information systems. The tables below take the two previously mentioned approaches into account by indicating, for each characteristic and/or function, whether MDDE, as applied by REVER, is suited to complement existing solutions (marked cells).
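The two complementarity examples above — validating a split address layout, and selecting a REDEFINE-style structure from a type indicator — can be sketched as follows. Field names, formats and discriminator values are illustrative assumptions, not REVER's actual model:

```python
import re

# Hypothetical rules recovered from program source code: the field layout of
# an address zone, plus a REDEFINE-style discriminator on a "person" record.
ADDRESS_LAYOUT = [
    ("number",      re.compile(r"\d{1,5}$")),
    ("street",      re.compile(r".{1,100}$")),
    ("postal_code", re.compile(r"\d{5}$")),
    ("city",        re.compile(r".{1,50}$")),
]

def validate_address(parts: dict) -> list:
    """Return the list of fields that violate the recovered layout."""
    return [name for name, pattern in ADDRESS_LAYOUT
            if not pattern.match(str(parts.get(name, "")))]

def select_structure(record: dict) -> list:
    """Pick the field structure implied by the type indicator, as a program
    reading a REDEFINEd zone would."""
    if record["person_type"] == "LEGAL":
        return ["company_name"]
    return ["name", "birth_name"]

bad = validate_address({"number": "12", "street": "Main St",
                        "postal_code": "75O09", "city": "Paris"})
print(bad)  # ['postal_code']  (letter O instead of zero)
```

A generated validation program would simply run such checks over every record and report the identifiers of the offenders.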

Table 1: Gartner Group functions

Aspects concerned (application: database and programs; information system) for: standardization, cleansing, matching, profiling, monitoring, enrichment.

Table 2: ISO norm characteristics

Aspects concerned (application: database and programs; information system) for: consistency (IN), currentness (IN), completeness (IN), precision (IN, EX), accuracy (IN), confidentiality (IN, EX), availability (EX), recoverability (EX), understandability (IN, EX), efficiency (IN, EX), changeability (EX), portability (EX), traceability (IN, EX), credibility (IN), accessibility (IN, EX), compliance (IN, EX).

4.2 Major results provided by REVER

Through the MDDE approach, REVER's solutions have an impact at each of the three levels of quality measurement: data quality, database quality and information system quality.

4.2.1 Data quality

In concrete terms, REVER's tools allow, for an application and from the existing technical elements:

- reconstructing the data model, including the detailed structures of the database (entities; attributes with type, length and position), the relations between entities, and the other data rules;
- identifying the redundant attributes and checking their values;
- automatically generating the controls which verify that the data respect the model's rules;
- locating the modules and programs using the data, and those updating them.

As an example, the screens below show the results of a database content control. The controls were generated automatically from the data model. The first screen shows the global results of the database analysis, such as the total count of requests (1), the number of tables containing errors (2) and the list of entities (3). The second screen provides the list of attributes per entity (1), the number of analyzed records (2) and the number of inconsistent data.

The third screen indicates the data (1) which are not consistent with the rules (e.g. the 31st of September is not a valid date) as well as the record identifiers (2).
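The content controls shown in these screens can be approximated by a small sketch: one generated rule (date validity), run over a table, reporting the count of checked records and the identifiers of the inconsistent ones. Table name, rule and identifiers are invented for illustration:

```python
from datetime import date

def is_valid_date(y, m, d):
    """A generated control for a 'date' attribute: 31 September is rejected."""
    try:
        date(y, m, d)
        return True
    except ValueError:
        return False

def control_table(records, rule):
    """Run one generated control over a table; return (checked, bad_ids)."""
    bad = [rec["id"] for rec in records if not rule(rec)]
    return len(records), bad

orders = [
    {"id": 1, "y": 2008, "m": 6, "d": 11},
    {"id": 2, "y": 2007, "m": 9, "d": 31},   # 31 September: inconsistent
]
checked, bad_ids = control_table(orders, lambda r: is_valid_date(r["y"], r["m"], r["d"]))
print(checked, bad_ids)  # 2 [2]
```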

It should also be mentioned that, beyond the simple control of conformity between data and data models, the solutions proposed by REVER, starting from the list of modules (programs) updating the data, in particular allow:

- identifying the reasons for divergence between the data model and the programmed rules, leading to appropriate solutions and corrections;
- correcting the possible errors in the source modules;
- putting in place control and alert mechanisms when data that do not respect the model rules are introduced;
- modifying the model by adding data rules, which allows simple validation of a rule which is not yet clearly set.

4.2.2 Database quality

The database quality measures, besides the data quality discussed above, are based on two elements:

- the database structures (entities, attributes, relations...);
- the database as used by the programs.

Measures on the structures

Measures on the database, as used by the programs

These measures aim at providing elements to characterize how the programs use the database. Mainly, two measures are performed:

- Dependency measures: the objective is to know the degree of dependency between data and programs. In concrete terms, the REVER tools identify, for each "procedural object", the list of accessed entities and the access types (read, write, etc.). These elements are grouped in a table where each point shows that program X uses entity Y. Of course, this graph can be produced for reads only, for writes only, or for both, depending on the needs. Such a table is presented below.
- Criticality measures: the objective is to determine the risks induced by the programs when using the data. In concrete terms: in the model, a weight is given to each entity depending on its number of parents and children; in the programs, for each database access verb, a weight is attributed depending on the action type (read, write, delete...). These parameters being fixed, the following is obtained by a simple calculation: the weight of a specific access, given by a function of the weight of the verb and that of the entity; the weight of a module, as the sum of the weights of all accesses performed in the module; the weight of a program, as the sum of the weights of the modules composing it. An example of this measure is illustrated in the graph below.
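The criticality calculation just described can be sketched as follows. The particular weighting functions (verb weights, entity weight from parent/child counts) are illustrative assumptions, not values fixed by REVER's tools:

```python
# Illustrative weights; real projects would calibrate these.
VERB_WEIGHT = {"read": 1, "write": 3, "delete": 5}

def entity_weight(parents: int, children: int) -> int:
    """Weight of an entity from its number of parent and child entities."""
    return 1 + parents + children

def access_weight(verb: str, parents: int, children: int) -> int:
    """Weight of one database access: verb weight times entity weight."""
    return VERB_WEIGHT[verb] * entity_weight(parents, children)

def module_weight(accesses) -> int:
    """Sum of the weights of every access performed in the module."""
    return sum(access_weight(v, p, c) for v, p, c in accesses)

def program_weight(modules) -> int:
    """Sum of the weights of the modules composing the program."""
    return sum(module_weight(m) for m in modules)

# A program made of two modules: one accessing a CUSTOMER entity
# (0 parents, 2 children), one deleting from an ORDER entity (1 parent).
prog = [
    [("read", 0, 2), ("write", 0, 2)],   # module 1: CUSTOMER accesses
    [("delete", 1, 0)],                  # module 2: ORDER access
]
print(program_weight(prog))  # (1*3 + 3*3) + (5*2) = 22
```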

Moreover, with the help of the utilization ratio of each program, it is possible to draw a graph of program risk related to data, as illustrated below. In this graph, the upper right quadrant isolates the high-risk programs (the most frequently used and the "heaviest"), while the lower left quadrant isolates the least risky ones. This type of result is widely used for scheduling the programs to be tested.
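The quadrant reading of that graph amounts to a two-threshold classification; the threshold values below are invented medians, not values prescribed by the method:

```python
def risk_quadrant(usage_ratio, weight, usage_threshold=0.5, weight_threshold=10):
    """Place a program on the usage/weight risk graph.

    Programs that are both frequently used and heavy land in the upper
    right (high-risk) quadrant and should be tested first.
    """
    frequent = usage_ratio >= usage_threshold
    heavy = weight >= weight_threshold
    if frequent and heavy:
        return "high risk"      # upper right quadrant
    if not frequent and not heavy:
        return "low risk"       # lower left quadrant
    return "intermediate"

print(risk_quadrant(0.9, 22))   # high risk
print(risk_quadrant(0.1, 3))    # low risk
```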

4.2.3 Information systems quality

Regarding information system quality, REVER's tools provide the users with a complete and detailed description of the information system managed by an application. From the model, one can then determine the evolutions to be made in order to meet the needs. Moreover, the REVER tools allow the results produced for one given application to be generalized to all of them. They give the following possibilities:

- comparing and isolating the data descriptions which are identical and/or look alike;
- comparing the data values when they concern identical pieces of information;
- rationalizing the data descriptions of the organization;
- putting in place an architecture based on "data services", keeping identical data simultaneously updated in several databases.

As an illustration, the screen below shows an example of a multi-base dictionary allowing identification of identical, or look-alike, data in order to connect them.
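Identifying identical or look-alike data descriptions across databases can be sketched with simple string similarity; the attribute names, the similarity measure and the threshold are illustrative assumptions, not REVER's actual matching logic:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Ratio in [0, 1] between two attribute names, case-insensitive."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def match_attributes(dict_a, dict_b, threshold=0.7):
    """Pair up identical or look-alike attribute names across two databases."""
    return [(a, b, round(similarity(a, b), 2))
            for a in dict_a for b in dict_b
            if similarity(a, b) >= threshold]

# Hypothetical data dictionaries from two applications.
crm = ["CUST_NAME", "CUST_ADDR", "ORDER_DATE"]
billing = ["CUSTOMER_NAME", "INVOICE_DATE", "CUST_ADDRESS"]
print(match_attributes(crm, billing))
```

Each pair found this way is a candidate for connection in the multi-base dictionary; a human still has to confirm that look-alike names carry the same semantics.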

5 The added values of REVER

The approach proposed by REVER provides the following benefits:

- It separates the technical aspects from the organizational ones, making the elements to measure clearer for each;
- It allows a controlled approach to the identification of problems and of their solutions, significantly simplifying the global process;
- It can be applied to one application or to all the applications of the organization, offering the possibility to progressively extend the approach according to the results obtained;
- Last but not least, and it is not its least attractive aspect, it provides concrete solutions for measuring the quality of the different elements while supporting and integrating the two main market approaches.

6 To know more

Additional information regarding the methods and tools used by REVER is provided in the following documents:

- Reverse engineering of databases
- DB-MAIN


More information

Introduction to Database Systems

Introduction to Database Systems Introduction to Database Systems A database is a collection of related data. It is a collection of information that exists over a long period of time, often many years. The common use of the term database

More information

Analysis of Object Oriented Software by Using Software Modularization Matrix

Analysis of Object Oriented Software by Using Software Modularization Matrix Analysis of Object Oriented Software by Using Software Modularization Matrix Anup 1, Mahesh Kumar 2 1 M.Tech Student, 2 Assistant Professor, Department of Computer Science and Application, RPS College,

More information

Data Integration and ETL Process

Data Integration and ETL Process Data Integration and ETL Process Krzysztof Dembczyński Intelligent Decision Support Systems Laboratory (IDSS) Poznań University of Technology, Poland Software Development Technologies Master studies, second

More information

MEHARI 2010. Overview. April 2010. Methods working group. Please post your questions and comments on the forum: http://mehari.

MEHARI 2010. Overview. April 2010. Methods working group. Please post your questions and comments on the forum: http://mehari. MEHARI 2010 Overview April 2010 Methods working group Please post your questions and comments on the forum: http://mehari.info/ CLUB DE LA SECURITE DE L INFORMATION FRANÇAIS 30 rue Pierre Sémard, 75009

More information

Three Fundamental Techniques To Maximize the Value of Your Enterprise Data

Three Fundamental Techniques To Maximize the Value of Your Enterprise Data Three Fundamental Techniques To Maximize the Value of Your Enterprise Data Prepared for Talend by: David Loshin Knowledge Integrity, Inc. October, 2010 2010 Knowledge Integrity, Inc. 1 Introduction Organizations

More information

Patterns of Information Management

Patterns of Information Management PATTERNS OF MANAGEMENT Patterns of Information Management Making the right choices for your organization s information Summary of Patterns Mandy Chessell and Harald Smith Copyright 2011, 2012 by Mandy

More information

Build an effective data integration strategy to drive innovation

Build an effective data integration strategy to drive innovation IBM Software Thought Leadership White Paper September 2010 Build an effective data integration strategy to drive innovation Five questions business leaders must ask 2 Build an effective data integration

More information

Business Performance & Data Quality Metrics. David Loshin Knowledge Integrity, Inc. loshin@knowledge-integrity.com (301) 754-6350

Business Performance & Data Quality Metrics. David Loshin Knowledge Integrity, Inc. loshin@knowledge-integrity.com (301) 754-6350 Business Performance & Data Quality Metrics David Loshin Knowledge Integrity, Inc. loshin@knowledge-integrity.com (301) 754-6350 1 Does Data Integrity Imply Business Value? Assumption: improved data quality,

More information

HIPAA CRITICAL AREAS TECHNICAL SECURITY FOCUS FOR CLOUD DEPLOYMENT

HIPAA CRITICAL AREAS TECHNICAL SECURITY FOCUS FOR CLOUD DEPLOYMENT HIPAA CRITICAL AREAS TECHNICAL SECURITY FOCUS FOR CLOUD DEPLOYMENT A Review List This paper was put together with Security in mind, ISO, and HIPAA, for guidance as you move into a cloud deployment Dr.

More information

The Evolution of PACS Data Migration

The Evolution of PACS Data Migration Clinical Content Interoperability White Paper The Evolution of PACS Data Migration A Discussion of Current Migration Strategies Shannon Werb Chief Strategy Officer Chief Operating Officer Contents Introduction...

More information

Using Electronic Systems for Document Management in Economic Entities

Using Electronic Systems for Document Management in Economic Entities Informatica Economică, nr. 1 (41)/2007 27 Using Electronic Systems for Document Management in Economic Entities Anca MEHEDINŢU, Cerasela PÎRVU, Ion BULIGIU Faculty of Economic and Business Administration,

More information

WHY SHOULD YOUR COMPANY IMPLEMENT ADEMPIERE?

WHY SHOULD YOUR COMPANY IMPLEMENT ADEMPIERE? WHY SHOULD YOUR COMPANY IMPLEMENT ADEMPIERE? Authors Contribution Date Martine Lemillour (representing Posterita) Alexandre Tsang Mang Kin (representing Posterita) Joseph Brower (representing Nexus Computers)

More information

B.Com(Computers) II Year RELATIONAL DATABASE MANAGEMENT SYSTEM Unit- I

B.Com(Computers) II Year RELATIONAL DATABASE MANAGEMENT SYSTEM Unit- I B.Com(Computers) II Year RELATIONAL DATABASE MANAGEMENT SYSTEM Unit- I 1 1. What is Data? A. Data is a collection of raw information. 2. What is Information? A. Information is a collection of processed

More information

GAO LAND MANAGEMENT SYSTEMS. Major Software Development Does Not Meet BLM s Business Needs

GAO LAND MANAGEMENT SYSTEMS. Major Software Development Does Not Meet BLM s Business Needs GAO United States General Accounting Office Testimony Before the Subcommittee on Interior and Related Agencies, Committee on Appropriations, House of Representatives For Release on Delivery Expected at

More information

Requirements engineering

Requirements engineering Learning Unit 2 Requirements engineering Contents Introduction............................................... 21 2.1 Important concepts........................................ 21 2.1.1 Stakeholders and

More information

QUALITY MANUAL ISO 9001. Quality Management System

QUALITY MANUAL ISO 9001. Quality Management System Page 1 of 20 QUALITY MANUAL ISO 9001 Quality Management System Printed copies are not controlled unless marked "CONTROLLED". Upon receipt of this document, discard all previous copies. Page 2 of 20 Approval

More information

Data Model s Role in DaaS...3. The SQL Azure Use Case...4. Best Practices and Guidelines...5

Data Model s Role in DaaS...3. The SQL Azure Use Case...4. Best Practices and Guidelines...5 Introduction..... 3 Data Model s Role in DaaS....3 The SQL Azure Use Case.......4 Best Practices and Guidelines.......5 Conclusion...... 9 Glossary.....9 References.....10 About the Author.....10 PAGE

More information

POLAR IT SERVICES. Business Intelligence Project Methodology

POLAR IT SERVICES. Business Intelligence Project Methodology POLAR IT SERVICES Business Intelligence Project Methodology Table of Contents 1. Overview... 2 2. Visualize... 3 3. Planning and Architecture... 4 3.1 Define Requirements... 4 3.1.1 Define Attributes...

More information

Database Management. Chapter Objectives

Database Management. Chapter Objectives 3 Database Management Chapter Objectives When actually using a database, administrative processes maintaining data integrity and security, recovery from failures, etc. are required. A database management

More information

Keywords: inspection; nuclear material control and accountancy (NMC&A); quality management system

Keywords: inspection; nuclear material control and accountancy (NMC&A); quality management system French Domestic Safeguards inspections with regard to quality management system of the operators, in the field of nuclear material control and accountancy Julie LASNEL-PAYAN, Flavien LEMOINE, Bruno AUTRUSSON,

More information

Five Tips to Ensure Data Loss Prevention Success

Five Tips to Ensure Data Loss Prevention Success Five Tips to Ensure Data Loss Prevention Success A DLP Experts White Paper January, 2013 Author s Note The content of this white paper was developed independently of any vendor sponsors and is the sole

More information

SOFTWARE CONFIGURATION MANAGEMENT GUIDEBOOK

SOFTWARE CONFIGURATION MANAGEMENT GUIDEBOOK Office of Safety and Mission Assurance NASA-GB-9503 SOFTWARE CONFIGURATION MANAGEMENT GUIDEBOOK AUGUST 1995 National Aeronautics and Space Administration Washington, D.C. 20546 PREFACE The growth in cost

More information

Basic Securities Reconciliation for the Buy Side

Basic Securities Reconciliation for the Buy Side Basic Securities Reconciliation for the Buy Side INTRODUCTION This paper focuses on the operational control requirements of a buy-side securities trading firm with particular reference to post trade reconciliation.

More information

Why enterprise data archiving is critical in a changing landscape

Why enterprise data archiving is critical in a changing landscape Why enterprise data archiving is critical in a changing landscape Ovum white paper for Informatica SUMMARY Catalyst Ovum view The most successful enterprises manage data as strategic asset. They have complete

More information

Cisco Change Management: Best Practices White Paper

Cisco Change Management: Best Practices White Paper Table of Contents Change Management: Best Practices White Paper...1 Introduction...1 Critical Steps for Creating a Change Management Process...1 Planning for Change...1 Managing Change...1 High Level Process

More information

Measurement with Ratios

Measurement with Ratios Grade 6 Mathematics, Quarter 2, Unit 2.1 Measurement with Ratios Overview Number of instructional days: 15 (1 day = 45 minutes) Content to be learned Use ratio reasoning to solve real-world and mathematical

More information

NEOXEN MODUS METHODOLOGY

NEOXEN MODUS METHODOLOGY NEOXEN MODUS METHODOLOGY RELEASE 5.0.0.1 INTRODUCTION TO QA & SOFTWARE TESTING GUIDE D O C U M E N T A T I O N L I C E N S E This documentation, as well as the software described in it, is furnished under

More information

Data Management Roadmap

Data Management Roadmap Data Management Roadmap A progressive approach towards building an Information Architecture strategy 1 Business and IT Drivers q Support for business agility and innovation q Faster time to market Improve

More information

Managing & Validating Research Data

Managing & Validating Research Data Research Management Standard Operating Procedure ISOP-H02 VERSION / REVISION: 2.0 EFFECTIVE DATE: 01 03 12 REVIEW DATE: 01 03 14 AUTHOR(S): CONTROLLER(S): APPROVED BY: Information Officer; NBT Clinical

More information

Information Technology Engineers Examination. Network Specialist Examination. (Level 4) Syllabus. Details of Knowledge and Skills Required for

Information Technology Engineers Examination. Network Specialist Examination. (Level 4) Syllabus. Details of Knowledge and Skills Required for Information Technology Engineers Examination Network Specialist Examination (Level 4) Syllabus Details of Knowledge and Skills Required for the Information Technology Engineers Examination Version 2.0

More information

Business Object Document (BOD) Message Architecture for OAGIS Release 9.+

Business Object Document (BOD) Message Architecture for OAGIS Release 9.+ Business Object Document (BOD) Message Architecture for OAGIS Release 9.+ an OAGi White Paper Document #20110408V1.0 Open standards that open markets TM Open Applications Group, Incorporated OAGi A consortium

More information

Database and Data Mining Security

Database and Data Mining Security Database and Data Mining Security 1 Threats/Protections to the System 1. External procedures security clearance of personnel password protection controlling application programs Audit 2. Physical environment

More information

2. MOTIVATING SCENARIOS 1. INTRODUCTION

2. MOTIVATING SCENARIOS 1. INTRODUCTION Multiple Dimensions of Concern in Software Testing Stanley M. Sutton, Jr. EC Cubed, Inc. 15 River Road, Suite 310 Wilton, Connecticut 06897 ssutton@eccubed.com 1. INTRODUCTION Software testing is an area

More information

Reaching CMM Levels 2 and 3 with the Rational Unified Process

Reaching CMM Levels 2 and 3 with the Rational Unified Process Reaching CMM Levels 2 and 3 with the Rational Unified Process Rational Software White Paper TP174 Table of Contents INTRODUCTION... 1 LEVEL-2, REPEATABLE... 3 Requirements Management... 3 Software Project

More information

Physical Design. Meeting the needs of the users is the gold standard against which we measure our success in creating a database.

Physical Design. Meeting the needs of the users is the gold standard against which we measure our success in creating a database. Physical Design Physical Database Design (Defined): Process of producing a description of the implementation of the database on secondary storage; it describes the base relations, file organizations, and

More information

Data Integration Alternatives Managing Value and Quality

Data Integration Alternatives Managing Value and Quality Solutions for Customer Intelligence, Communications and Care. Data Integration Alternatives Managing Value and Quality Using a Governed Approach to Incorporating Data Quality Services Within the Data Integration

More information

TERRITORY RECORDS OFFICE BUSINESS SYSTEMS AND DIGITAL RECORDKEEPING FUNCTIONALITY ASSESSMENT TOOL

TERRITORY RECORDS OFFICE BUSINESS SYSTEMS AND DIGITAL RECORDKEEPING FUNCTIONALITY ASSESSMENT TOOL TERRITORY RECORDS OFFICE BUSINESS SYSTEMS AND DIGITAL RECORDKEEPING FUNCTIONALITY ASSESSMENT TOOL INTRODUCTION WHAT IS A RECORD? AS ISO 15489-2002 Records Management defines a record as information created,

More information

QUALITY MANUAL ISO 9001:2015

QUALITY MANUAL ISO 9001:2015 Page 1 of 22 QUALITY MANUAL ISO 9001:2015 Quality Management System Page 1 of 22 Page 2 of 22 Sean Duclos Owner Revision History Date Change Notice Change Description 11/02/2015 1001 Original Release to

More information

Data quality and metadata

Data quality and metadata Chapter IX. Data quality and metadata This draft is based on the text adopted by the UN Statistical Commission for purposes of international recommendations for industrial and distributive trade statistics.

More information

How To Manage Risk With Sas

How To Manage Risk With Sas SOLUTION OVERVIEW SAS Solutions for Enterprise Risk Management A holistic view of risk of risk and exposures for better risk management Overview The principal goal of any financial institution is to generate

More information

DATA MANAGEMENT STANDARDS IN COMPUTER-AIDED ACQUISITION AND LOGISTIC SUPPORT (CALS)

DATA MANAGEMENT STANDARDS IN COMPUTER-AIDED ACQUISITION AND LOGISTIC SUPPORT (CALS) i....,.,... 1 ") I 1 1 1 1! I I 1 '1 1 1 I ) _ '' 1 " I / / DATA MANAGEMENT STANDARDS IN COMPUTER-AIDED ACQUISITION AND LOGISTIC SUPPORT (CALS) David K. Jefferson National Institute of Standards and Technology

More information

EM-SOS! from Sandhill Consultants

EM-SOS! from Sandhill Consultants Taming the Chaos of Uncontrolled Data Design: EM-SOS! from Sandhill Consultants Powered by Axis Software Designs Get the most from your CA ERwin data modeling investment with world-class professional services,

More information

ISSA Guidelines on Master Data Management in Social Security

ISSA Guidelines on Master Data Management in Social Security ISSA GUIDELINES ON INFORMATION AND COMMUNICATION TECHNOLOGY ISSA Guidelines on Master Data Management in Social Security Dr af t ve rsi on v1 Draft version v1 The ISSA Guidelines for Social Security Administration

More information

NIST Special Publication (SP) 800-64, Revision 2, Security Considerations in the System Development Life Cycle

NIST Special Publication (SP) 800-64, Revision 2, Security Considerations in the System Development Life Cycle THE SYSTEM DEVELOPMENT LIFE CYCLE (SDLC) Shirley Radack, Editor Computer Security Division Information Technology Laboratory National Institute of Standards and Technology The most effective way to protect

More information

2.1 The RAD life cycle composes of four stages:

2.1 The RAD life cycle composes of four stages: 2.1 The RAD life cycle composes of four stages: A typical RAD life cycle is composed of the following Stages 2.1.1. Requirements Planning; 2.1.2 User Design; 2.1.3 Rapid Construction; 2.1.4 Transition.

More information

Transforming Field Service Operations w ith Microsoft Dynamics NAV

Transforming Field Service Operations w ith Microsoft Dynamics NAV Transforming Field Service Operations w ith Microsoft Dynamics NAV Open Door Technology Inc. Date: May 2010 www.opendoor.ca 8 77.777.776 Contents Introduction... 3 Mobile Technology Needs for Field Services

More information

Measurement Information Model

Measurement Information Model mcgarry02.qxd 9/7/01 1:27 PM Page 13 2 Information Model This chapter describes one of the fundamental measurement concepts of Practical Software, the Information Model. The Information Model provides

More information

A Brief Analysis on Architecture and Reliability of Cloud Based Data Storage

A Brief Analysis on Architecture and Reliability of Cloud Based Data Storage Volume 2, No.4, July August 2013 International Journal of Information Systems and Computer Sciences ISSN 2319 7595 Tejaswini S L Jayanthy et al., Available International Online Journal at http://warse.org/pdfs/ijiscs03242013.pdf

More information

The CMDB at the Center of the Universe

The CMDB at the Center of the Universe The CMDB at the Center of the Universe Reg Harbeck CA Wednesday, February 27 Session 5331 Purpose Clarify origin of CMDB concept and what it is Understand difference and equivalence between CMDB and Asset

More information

AD Management Survey: Reveals Security as Key Challenge

AD Management Survey: Reveals Security as Key Challenge Contents How This Paper Is Organized... 1 Survey Respondent Demographics... 2 AD Management Survey: Reveals Security as Key Challenge White Paper August 2009 Survey Results and Observations... 3 Active

More information

Effective Release Management for HPOM Monitoring

Effective Release Management for HPOM Monitoring Whitepaper Effective Release Management for HPOM Monitoring Implementing high-quality ITIL-compliant release management processes for HPOM-based monitoring Content Overview... 3 Release Management... 4

More information

Regulatory Asset Management: Harmonizing Calibration, Maintenance & Validation Systems

Regulatory Asset Management: Harmonizing Calibration, Maintenance & Validation Systems Regulatory Asset Management: Harmonizing Calibration, Maintenance & Validation Systems 800.982.2388 1 Introduction Calibration, maintenance and validation activity, despite operating within the same department

More information

Data Integration and ETL Process

Data Integration and ETL Process Data Integration and ETL Process Krzysztof Dembczyński Institute of Computing Science Laboratory of Intelligent Decision Support Systems Politechnika Poznańska (Poznań University of Technology) Software

More information

4-06-55 Controlling Data Resources in Distributed Environments Barbara Grant

4-06-55 Controlling Data Resources in Distributed Environments Barbara Grant 4-06-55 Controlling Data Resources in Distributed Environments Barbara Grant Payoff As the role of data in an organization expands and data becomes increasingly related to profitability, the impact of

More information

KNOWLEDGE FACTORING USING NORMALIZATION THEORY

KNOWLEDGE FACTORING USING NORMALIZATION THEORY KNOWLEDGE FACTORING USING NORMALIZATION THEORY J. VANTHIENEN M. SNOECK Katholieke Universiteit Leuven Department of Applied Economic Sciences Dekenstraat 2, 3000 Leuven (Belgium) tel. (+32) 16 28 58 09

More information

Module 13. Software Reliability and Quality Management. Version 2 CSE IIT, Kharagpur

Module 13. Software Reliability and Quality Management. Version 2 CSE IIT, Kharagpur Module 13 Software Reliability and Quality Management Lesson 34 ISO 9000 Specific Instructional Objectives At the end of this lesson the student would be able to: State what is meant by ISO 9000 certification.

More information

PROCESSING & MANAGEMENT OF INBOUND TRANSACTIONAL CONTENT

PROCESSING & MANAGEMENT OF INBOUND TRANSACTIONAL CONTENT PROCESSING & MANAGEMENT OF INBOUND TRANSACTIONAL CONTENT IN THE GLOBAL ENTERPRISE A BancTec White Paper SUMMARY Reducing the cost of processing transactions, while meeting clients expectations, protecting

More information

Ex Libris Rosetta: A Digital Preservation System Product Description

Ex Libris Rosetta: A Digital Preservation System Product Description Ex Libris Rosetta: A Digital Preservation System Product Description CONFIDENTIAL INFORMATION The information herein is the property of Ex Libris Ltd. or its affiliates and any misuse or abuse will result

More information

Three Asset Lifecycle Management Fundamentals for Optimizing Cloud and Hybrid Environments

Three Asset Lifecycle Management Fundamentals for Optimizing Cloud and Hybrid Environments Three Asset Lifecycle Management Fundamentals for Optimizing Cloud and Hybrid Environments An ENTERPRISE MANAGEMENT ASSOCIATES (EMA ) White Paper Prepared for BMC April 2011 IT & DATA MANAGEMENT RESEARCH,

More information

Data Integration Alternatives Managing Value and Quality

Data Integration Alternatives Managing Value and Quality Solutions for Enabling Lifetime Customer Relationships Data Integration Alternatives Managing Value and Quality Using a Governed Approach to Incorporating Data Quality Services Within the Data Integration

More information

By Ramon Smitherman, Dream Catchers, Inc. Executive Summary Many companies today are in the mode of buying technology and then figuring out how to adopt their operations to fit its processing requirements.

More information

Data Quality Assessment. Approach

Data Quality Assessment. Approach Approach Prepared By: Sanjay Seth Data Quality Assessment Approach-Review.doc Page 1 of 15 Introduction Data quality is crucial to the success of Business Intelligence initiatives. Unless data in source

More information

Security metrics to improve information security management

Security metrics to improve information security management Security metrics to improve information security management Igli TASHI, Solange GHERNAOUTIHÉLIE HEC Business School University of Lausanne Switzerland Abstract The concept of security metrics is a very

More information

Improving Service Asset and Configuration Management with CA Process Maps

Improving Service Asset and Configuration Management with CA Process Maps TECHNOLOGY BRIEF: SERVICE ASSET AND CONFIGURATION MANAGEMENT MAPS Improving Service Asset and Configuration with CA Process Maps Peter Doherty CA TECHNICAL SALES Table of Contents Executive Summary SECTION

More information

Chapter 1: Introduction

Chapter 1: Introduction Chapter 1: Introduction Database System Concepts, 5th Ed. See www.db book.com for conditions on re use Chapter 1: Introduction Purpose of Database Systems View of Data Database Languages Relational Databases

More information

How To Understand The Role Of Enterprise Architecture In The Context Of Organizational Strategy

How To Understand The Role Of Enterprise Architecture In The Context Of Organizational Strategy Enterprise Architecture in the Context of Organizational Strategy Sundararajan Vaidyanathan Senior Enterprise Architect, Unisys Introduction The Presidential Management Agenda (PMA) 1 is geared towards

More information

TKM COLLEGE OF ENGINEERING LIBRARY AUTOMATION SYSTEM

TKM COLLEGE OF ENGINEERING LIBRARY AUTOMATION SYSTEM Annals of Library and Information Studies 51, 2; 2004; 52-57 TKM COLLEGE OF ENGINEERING LIBRARY AUTOMATION SYSTEM Abdul Azeez T A TKM.Callege of Engineering Kallam Kerala - 691 005. The TKMCE Library Automation

More information

Fixed Scope Offering Fusion Financial Implementation

Fixed Scope Offering Fusion Financial Implementation Fixed Scope Offering Fusion Financial Implementation Mindtree limited 2015 Agenda Introduction Business Objectives Product Overview Key Implementation Features Implementation Packages & Timelines Cloud

More information

Real-Time Security for Active Directory

Real-Time Security for Active Directory Real-Time Security for Active Directory Contents The Need to Monitor and Control Change... 3 Reducing Risk and Standardizing Controls... 3 Integrating Change Monitoring... 4 Policy Compliance... 4 The

More information

Storage Guardian Remote Backup Restore and Archive Services

Storage Guardian Remote Backup Restore and Archive Services Storage Guardian Remote Backup Restore and Archive Services Storage Guardian is the unique alternative to traditional backup methods, replacing conventional tapebased backup systems with a fully automated,

More information

HELP DESK SYSTEMS. Using CaseBased Reasoning

HELP DESK SYSTEMS. Using CaseBased Reasoning HELP DESK SYSTEMS Using CaseBased Reasoning Topics Covered Today What is Help-Desk? Components of HelpDesk Systems Types Of HelpDesk Systems Used Need for CBR in HelpDesk Systems GE Helpdesk using ReMind

More information

Efficient database auditing

Efficient database auditing Topicus Fincare Efficient database auditing And entity reversion Dennis Windhouwer Supervised by: Pim van den Broek, Jasper Laagland and Johan te Winkel 9 April 2014 SUMMARY Topicus wants their current

More information