Enhancing Data Quality in Data Warehouse Environments


Nothing is more likely to undermine the performance and business value of a data warehouse than inappropriate, misunderstood, or ignored data quality.

Decisions by senior management lay the groundwork for lower corporate levels to develop policies and procedures for various corporate activities. However, the potential business contribution of these activities depends on the quality of the decisions and, in turn, on the quality of the data used to make them. Some inputs are judgmental, others come from transactional systems, and still others come from external sources, but all must have a level of quality appropriate for the decisions they will be part of. Although concern about the quality of one's data is not new, what is fairly recent is using the same data for multiple purposes, which can be quite different from the original ones. Users working with a particular data set come to know and internalize its deficiencies and idiosyncrasies. This knowledge is lost when data is made available to other parties, as when data needed for decision making is collected in repositories called data warehouses. Here, we offer a conceptual framework for enhancing data quality in data warehouse environments. We explore the factors that should be considered, such as the current level of data quality, the levels of quality needed by the relevant decision processes, and the potential benefits of projects designed to enhance data quality. Those responsible for data quality have to understand the importance of such factors, as well as the interactions among them. This understanding is mandatory in data warehousing environments characterized by multiple users with differing needs for data quality. For warehouses supporting a limited number of decision processes, awareness of these issues coupled with good judgment should suffice.
Donald P. Ballou and Giri Kumar Tayi (COMMUNICATIONS OF THE ACM, January 1999/Vol. 42, No. 1, p. 73)

For more complex situations, however, the number and diversity of trade-offs make reliance on judgment alone problematic. For such situations, we offer a methodology that
systematically incorporates various factors, including trade-offs, and guides the data warehousing manager in allocating resources by identifying the data quality enhancement projects that maximize value to the users of the data. Data warehousing efforts may not succeed for various reasons, but nothing is more certain to yield failure than lack of concern for the quality of the data: "Ignore or trivialize problems with the existing data at the start of the project, and that oversight will brutally assert itself..." [4]. Another reason warehousing efforts may not succeed is failure to store the appropriate data. Although this fact seems obvious, availability is still sometimes the sole criterion for storage. Data from outside sources and soft data may be ignored completely, even though such data is critical for many decision purposes. Although it might seem that data availability is not a data quality issue, users who regularly work with data include availability as an attribute of data quality [12]. Data supporting organizational activities in a meaningful way should be warehoused. However, a particular data set may support several low-level organizational activities, whereas another supports only one activity but with higher priority. If a choice has to be made as to which data should be warehoused, how should a warehouse manager decide? Moreover, it may be relatively inexpensive to clean up a data set that is seldom used, but expensive to improve the quality of a frequently used data set. Again, if a choice has to be made, which data set should be worked on? There are other kinds of trade-offs. For example, it may be possible to improve the timeliness of certain data at the expense of completeness. Or it may be inexpensive to include some relatively unimportant data but costly to obtain unavailable but important data.
All these trade-offs arise in the context of limited resources available for improving the data's quality. A distinguishing characteristic of warehoused data is that it is used for decision making rather than for operations. A particular data set may well support several decision processes. Support for several processes complicates data management, because these uses are likely to require different degrees of data quality. Research on enhancing data quality has for the most part been in the context of data supporting individual activities, usually of an operational nature [3]. Data warehousing efforts have to address several potential problems. For example, data from different sources may exhibit serious semantic differences. A classic case is the varying definitions of sales employed by different stakeholders in a corporation. Furthermore, data from various sources is likely to contain syntactic inconsistencies, which also have to be addressed. For example, there may well be discrepancies in the time periods for activity reports (such as bimonthly vs. every two weeks). Moreover, the desired data may simply not have been gathered. A significant characteristic of data warehouses is the prominent role of soft data and historical data. Operational systems exclude soft data, yet such data is often critical for decision making. By soft data, we mean data whose quality is inherently uncertain; an example is human resources evaluations related to future task assignments involving subjective rankings. For a data warehouse to support decision making, soft data, though imperfect, has to be made available. Whereas operational systems focus on current data, decision making often involves temporal comparisons. Thus, historical data often has to be included in data warehouses. It is not sufficient to state that the data is wrong or not useful; such evaluations offer no guidance as to how warehouse managers should go about improving the data.
To report that the data is inconsistent indicates a problem rather different from saying that the data is out of date. Part of the enhancement effort is to elicit from data users as precisely as possible what it is about the data they consider unsatisfactory. Determining what's wrong with the data is facilitated by being aware of the dimensions of data quality.

It has long been recognized that data is best described or characterized via multiple attributes, or dimensions. For example, in 1985, Ballou and Pazer identified four dimensions of data quality: accuracy, completeness, consistency, and timeliness [1]. More recently, Wang and Strong [12] analyzed the various attributes of data quality from the perspective of the people using the data. They identified a full set of data quality dimensions, adding believability, value added, interpretability, accessibility, and others to the earlier four. These dimensions were then grouped into four broad categories: intrinsic, contextual, representational, and accessibility. For example, accuracy belongs to intrinsic; completeness and timeliness to contextual; consistency to representational; and availability to accessibility. The earliest work in data quality was carried out by the accounting profession (such as [5]). Several studies have looked at the social and economic effects of inadequate data quality [6, 9]. The consequences of poor data quality have been explored using modeling approaches [1]. Various studies have sought to determine how to improve the quality of data [7, 8]. And other work has examined the nature of data quality [10-12].

Think Systematically

For enhancement efforts to be worthwhile, users and data warehouse managers alike have to think systematically about what is required. And organizations have to distinguish between what is vital and what is merely desirable. Data warehouses are developed to support some subset of the organization's activities. For example, in the marketing realm, success depends on such organizational activities as market segmentation, brand and sales management, and marketing strategy. These activities are fundamental to business profitability and success. To make decisions, managers need access to both historical and soft data.
Historical data can include sales by product, region, and time period; soft data can include customer preference surveys, fashion trends, and sales forecasts. For data quality enhancement efforts to succeed, the data warehouse manager should first ascertain from decision makers what organizational activities the data warehouse has to support. Some mechanism has to be used to determine the priority of these activities. All other factors being equal, data quality enhancement efforts should be directed first to the activities with the highest priority. Identifying data sets required to support the targeted organizational activities is also needed. By data set, we mean some clearly distinguishable collection of data, such as a traditional file or some aggregation of external data. Some data sets may not currently exist but can be created at reasonable cost and effort. It is important to note that each data set potentially supports several organizational activities simultaneously. Finally, existing and potential problems with the data set have to be identified. The dimensions of data quality guide this determination. For example, if the required data set does not exist, then its data quality is deficient on the availability dimension. If the data exists but for whatever reason cannot be obtained, then accessibility is a problem. If there are semantic discrepancies within the data set, then interpretability is problematical. A data set can be deficient on more than one dimension. For example, a data set may not be accessible and may also be incomplete. It is assumed that various projects can be undertaken to improve the quality of data. One might be to resolve syntactic differences among customer data records. Another might be to obtain regional sales data on a more timely basis. Another could be to identify and enforce a common definition of sales (at least as far as the warehouse is concerned). 
Another could involve obtaining external data regarding competitors' activities, a form of soft data not currently available. And yet another might be to extract a relevant subset of some transaction file. Each one influences the quality of one or more data sets, which in turn influences one or more of the organization's activities. To make these ideas more precise, we introduce the following index notation:

I: Index for organizational activities supported by a data warehouse
J: Index for data sets
K: Index for data quality attributes or dimensions
L: Index for possible data quality projects

Consider the following scenario. The data warehouse supports three organizational activities: production planning (I=1), sales tracking (I=2), and promotion effectiveness (I=3). These activities are supported by three data sets: an inventory file (J=1), a historical sales file (J=2), and a promotional activity file (J=3). In terms of the quality dimensions, the inventory file is inaccurate (K=1, accuracy) and somewhat out of date (K=2, timeliness). The sales file is accurate (K=1) but has a monthly lag (K=2). The promotional activity file is incomplete (K=3, completeness), containing only aggregate data. To enhance the quality of these files, we undertake several projects: eliminate about 50% of the errors in the inventory file (L=1); make the sales file more timely, even if doing so sacrifices some accuracy (L=2); and tie the promotional file more closely to
the actual sales (L=3). In practice, limited resources preclude undertaking all proposed projects. A number of factors influence data quality enhancement projects. In practice, however, determining them precisely is quite difficult.

Current quality CQ(J,K). The current quality of each data set J is evaluated on each relevant data quality dimension K. For this scenario, the current accuracy CQ(J=1,K=1) and timeliness CQ(J=1,K=2) of the inventory file, among others, have to be evaluated.

Required quality RQ(I,J,K). This factor represents the level of data quality required by activity I, which uses data set J, on dimension K. Data warehousing implicitly assumes that the stored data is used for more than one purpose, and the quality requirements of those purposes can vary dramatically. For one use (such as aggregate production planning) of a particular data set (sales forecasts), order-of-magnitude correctness may suffice; for another (such as item-level planning), each value must be precise. The required quality also depends on the data quality dimension. In this scenario, the sales data set (J=2) supports the organizational activities production planning (I=1) and sales tracking (I=2). The current timeliness (K=2) of the sales file is more than adequate for sales tracking but inadequate for production planning. For data sets used by more than one application, the current quality on a particular dimension may be insufficient for some of the applications and more than sufficient for others.

Anticipated quality AQ(J,K;L). This factor represents the quality of data set J on dimension K resulting from undertaking project L. The people responsible for data quality can undertake projects affecting the quality of various data sets, but a given project may well affect the various dimensions of data quality in different ways.
It is quite possible that efforts to improve quality on a particular dimension diminish the quality on another dimension; an example is the trade-off between accuracy and timeliness [2]. So, if project L=2 (to make sales file J=2 more timely) is undertaken, the timeliness dimension (K=2) would improve, but accuracy (K=1) would suffer. Although data quality projects usually seek to improve data quality, such a goal is not always on the agenda. For example, suppose that for a certain class of data much more data is gathered and stored than is needed. It might well make sense to actually lower the level of completeness for the data set in order to save resources.

The metric used to measure each of the three factors CQ, RQ, and AQ has to be the same. One could use the domain [0,1], with 0 representing the worst case and 1 the best [2]. For example, in regard to the inventory file, suppose CQ(1,1) = 0.6 (60% of the items are accurate). For project L=1, which seeks to eliminate 50% of the errors, AQ(1,1;1) = 0.8. Other possible metrics could be based on ordinal scales.

Priority of organizational activity, Weight(I). Some organizational activities are clearly more important than others. Weight(I) represents the priority of activity I. All factors being equal, common sense dictates that data sets supporting high-priority activities receive preference over those involved in low-priority activities. Weight(I) should satisfy 0 < Weight(I) < 1, and the weights should add up to 1.

Cost of data quality enhancement, Cost(L). The cost of undertaking project L is represented by Cost(L). The various data quality enhancement projects involve commitment of such resources as funds, personnel, and time, all of which are inherently limited.
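A compact way to see how these factors interact is to sketch them in code. The CQ(1,1) = 0.6 reading and the 50%-error-elimination project are from the scenario above; every other number, and all the function and variable names, are invented placeholders for illustration:

```python
# Current quality CQ(J, K) on the [0, 1] scale; only (1, 1) is from the
# article's scenario, the rest are invented placeholder readings.
CQ = {
    (1, 1): 0.6,   # inventory file, accuracy: 60% of items accurate
    (2, 1): 0.95,  # sales file, accuracy
    (2, 2): 0.4,   # sales file, timeliness: monthly lag
    (3, 3): 0.3,   # promotional file, completeness: aggregates only
}

# Required quality RQ(I, J, K): placeholder requirements per activity.
RQ = {
    (1, 2, 2): 0.8,  # production planning needs timely sales data
    (2, 2, 2): 0.3,  # sales tracking tolerates the monthly lag
}

def anticipated_quality(cq: float, error_reduction: float) -> float:
    """AQ after a project removes a fraction of the remaining errors."""
    return cq + error_reduction * (1.0 - cq)

# Scenario example: project L=1 eliminates 50% of the inventory errors,
# so AQ(1,1;1) = 0.6 + 0.5 * (1 - 0.6) = 0.8.
aq_inventory = anticipated_quality(CQ[(1, 1)], 0.5)

# Quality gaps: places where a requirement exceeds current quality.
gaps = [(i, j, k) for (i, j, k), rq in RQ.items() if rq > CQ[(j, k)]]
```

With these placeholder numbers, the only gap is production planning's need for timelier sales data, which is exactly what project L=2 targets.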
There are also various crosscurrents between the Cost(L) and the Weight(I), including the question: Is it better to undertake an expensive project that supports an activity whose weight is large, or to undertake a project that supports several activities, each of moderate importance? The only restriction on the measurement of the Cost(L) values is that the units are the same in all cases. Cost(L) values should be viewed as the total cost associated with project L over some period of time, such as a fiscal year or budget cycle. The Cost(L) values include not only the cost of actual project work but also the costs of ongoing data quality related activities necessitated by the project. This interpretation means it is possible for Cost(L) to be negative: should the current quality of a particular data set be better than required, reducing the quality saves resources, resulting in a negative Cost(L); an example is storing summary rather than unneeded detail data.

Value added, Utility(I,J,K;L). This factor represents the change in value, or utility, for organizational activity I should project L be undertaken. Since project L could affect different dimensions of different data sets required by activity I, the utility depends explicitly on each relevant data set J and dimension K. Several considerations regarding the utility values have to be addressed. For example, a particular project L may leave a given data set totally unaffected; for such a data set, Utility(I,J,K;L) = 0. The utility can be either positive (the project enhances data quality) or negative (the project diminishes it). There is no value in improving quality more than required. And finally, a
particular project need not, and usually will not, completely remove all data quality deficiencies. A logical consequence of these observations is that for each organizational activity I, there is a conceptual relationship between the magnitude of the change in data quality and the value of Utility(I,J,K;L). In practice, determining all the functional relationships between the change in data quality and Utility(I,J,K;L) would be daunting. The simplest way to make that determination is to use whatever knowledge the organization has to estimate all the non-zero Utility(I,J,K;L) values.

Value of project L = Σ_I Σ_J Σ_K Weight(I) × Utility(I,J,K;L)

In our scenario, there are 3 × 3 × 3 × 3 = 81 potential Utility(I,J,K;L) values, since I has three values, as do J, K, and L. However, only eight such values have to be estimated, as the others are zero (see Table 1).

Table 1. Required U(I,J,K;L) values for the illustrative scenario, in the format U(I: organizational activity, J: data set, K: DQ dimension; L: DQ project):

U(1,1,1;1) = U(Production planning, Inventory file, Accuracy; Inventory file project)
U(1,2,1;2) = U(Production planning, Sales file, Accuracy; Sales file project)
U(1,2,2;2) = U(Production planning, Sales file, Timeliness; Sales file project)
U(2,2,1;2) = U(Sales tracking, Sales file, Accuracy; Sales file project)
U(2,2,2;2) = U(Sales tracking, Sales file, Timeliness; Sales file project)
U(3,3,3;3) = U(Promotion effectiveness, Promotional file, Completeness; Promotional file project)
U(3,2,1;2) = U(Promotion effectiveness, Sales file, Accuracy; Sales file project)
U(3,2,2;2) = U(Promotion effectiveness, Sales file, Timeliness; Sales file project)

Obtaining the Utility(I,J,K;L) estimates may be difficult and imprecise but nevertheless has to be done. After all, warehouse managers can't make an intelligent decision about which projects should be undertaken unless they are aware of which data sets and data quality dimensions are affected by a particular project, as well as the relative magnitude of the benefit of the effect.
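The weighted-sum computation of a project's value can be sketched as follows; the weights and the non-zero utilities below are invented placeholders, not estimates from the article:

```python
# Priorities of the three activities, summing to 1 (invented figures).
weight = {1: 0.5, 2: 0.3, 3: 0.2}

# A few non-zero utilities, keyed (I, J, K, L); values are placeholders.
utility = {
    (1, 1, 1, 1): 0.40,   # inventory project: accuracy gain for planning
    (1, 2, 2, 2): 0.30,   # sales file project: timelier data for planning
    (2, 2, 1, 2): -0.10,  # ...at a small accuracy cost to sales tracking
}

def project_value(L: int) -> float:
    """Value(L) = sum over I, J, K of Weight(I) * Utility(I, J, K; L)."""
    return sum(weight[i] * u for (i, j, k, l), u in utility.items() if l == L)
```

With these placeholders, project_value(2) combines a gain for production planning with a smaller weighted loss for sales tracking, illustrating how one project's trade-offs net out into a single worth.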
Even more important, this process forces warehouse managers to think systematically about data quality enhancement.

Which Projects First?

For relatively simple situations, these factors and issues can guide data quality enhancement of warehoused data. However, as the number of organizational activities supported, data sets involved, and potential enhancement projects increases, it becomes difficult to cope with the various trade-offs and impossible to determine an optimal project mix. For such situations, an integer programming model can help identify the subset of projects that best enhances the quality of the warehoused data. See Table 2 for a mathematical formulation based on this model's description. Each project is associated with a weight Weight(I) and a set of utilities Utility(I,J,K;L). The weighted sum of these utilities produces an overall worth for project L, call it Value(L). Note that Value(L) captures the fact that a particular data quality project can influence several dimensions of the same data set, as well as multiple data sets involved in various organizational activities. Optimization models include an objective function that, in this case, is the sum of the Value(L) of all selected projects. There are several constraints in addition to the objective function. For example, the sum of the costs of all projects selected cannot exceed the budget (resource constraint). Suppose that projects P(1), P(2), ..., P(S) are mutually exclusive, so only one can be carried out; a constraint is needed to select at most one such project (exclusiveness constraint). Also suppose that, for example, projects P(1) and P(2) overlap in the sense that they share various aspects. Then develop a new project P(3) representing both P(1) and P(2), and select at most one of the three (interaction constraint). Such constraints are not necessarily exhaustive, and as in all modeling, the formulation can be refined.
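The objective and constraints just described can be sketched as a brute-force search over X(L) ∈ {0, 1}; the values, costs, budget, and exclusive pair are invented placeholders, and a real instance would use an integer programming solver rather than enumeration:

```python
from itertools import product

# Placeholder Value(L) and Cost(L) figures for three candidate projects.
value = {1: 0.20, 2: 0.12, 3: 0.25}
cost = {1: 40, 2: 55, 3: 70}
budget = 110
exclusive = [(2, 3)]  # at most one of projects 2 and 3 may be chosen

best, best_value = set(), 0.0
for flags in product([0, 1], repeat=len(value)):  # X(L) in {0, 1}
    chosen = {l for l, x in zip(value, flags) if x}
    if sum(cost[l] for l in chosen) > budget:  # resource constraint
        continue
    if any(len(chosen & set(pair)) > 1 for pair in exclusive):  # exclusiveness
        continue
    total = sum(value[l] for l in chosen)
    if total > best_value:
        best, best_value = chosen, total
```

With these numbers the search selects projects 1 and 3: project 2 is crowded out both by the budget and by its exclusiveness with project 3.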
For example, if appropriate, Cost(L) can be organized into fixed and variable costs.

Table 2. Integer programming model formulation:

Maximize total value from all projects: Σ_L X(L) × Value(L)
Resource constraint: Σ_L X(L) × Cost(L) ≤ Budget
Exclusiveness constraint: X(P(1)) + X(P(2)) + ... + X(P(S)) ≤ 1
Interaction constraint: X(P(1)) + X(P(2)) + X(P(3)) ≤ 1
Integer constraints: X(L) = 1 if project L is selected, 0 otherwise

Implementation of this integer-programming
model identifies the data quality projects that would increase to the greatest possible extent the utility of the warehoused data, systematically incorporating various trade-offs, including the value of obtaining data sets not currently available and the gains from reducing the quantity of stored data in certain cases. However, the difficulty in implementing any model is in obtaining values for the model parameters. Therefore, for our model, the data warehouse manager has to:

- Determine the organizational activities the data warehouse will support;
- Identify all sets of data needed to support the organizational activities;
- Estimate the quality of each data set on each relevant data quality dimension;
- Identify a set of potential projects (and their costs) that could be undertaken to enhance or otherwise affect data quality;
- Estimate for each project its likely effect on the quality of the various data sets, by data quality dimension; and
- Determine for each project, data set, and relevant data quality dimension the change in utility should a particular project be undertaken.

The first two items apply even if data quality is not addressed. The third and fourth are required if a data warehouse manager wants to improve data quality. A problem arises with the third item in that the task of measuring data quality is not yet fully solved [11]. The difficulty in implementing the model resides in the fifth and sixth items. However, a model can always be simplified at the expense of its ability to capture reality. One of the model's complexities arises from the need to evaluate the quality of the data sets on the relevant dimensions. This evaluation is necessary if trade-offs regarding the dimensions of data quality represent a significant issue. If the trade-offs are not important, then an overall quality measure for each data set is adequate.
Thus, the current quality CQ would depend on the data set only; the required quality RQ on the activity and data set only; the anticipated quality AQ on the data set and project only; and the change in utility on the activity, data set, and project only. Ignoring the trade-offs would substantially reduce the effort of estimating parameters. The model can be formulated using a continuous measurement scale, implying CQ is some value between 0 and 1 and U(I,J,K;L) is a continuous function of the change in data quality. Implementation of the model can be simplified considerably by converting it to a discrete version. For example, CQ could be qualitatively evaluated as low, medium, or high, and the change in utility of a data set on a certain dimension resulting from a specific project could be measured the same way. Finally, these qualitative estimates (low, medium, high) would be converted into corresponding numerical values, such as 1, 2, 3.

Conclusion

Data warehousing depends on integrating data quality assurance into all warehousing phases: planning, implementation, and maintenance. Since warehoused data is accessed by users with varying needs, data quality activities should be balanced to best support organizational activities. Therefore, the warehouse manager has to be aware of the various issues involved in enhancing data quality. That's why we presented ideas and described a model to support quality enhancement while highlighting relevant issues and factors. Combined, they are a framework for thinking comprehensively and systematically about data quality enhancement in data warehouse environments.

References

1. Ballou, D., and Pazer, H. Modeling data and process quality in multi-input, multi-output information systems. Management Science 31, 2 (Feb. 1985).
2. Ballou, D., and Pazer, H. Designing information systems to optimize the accuracy-timeliness trade-off. Information Systems Research 6, 1 (Mar. 1995).
3. Ballou, D., and Tayi, G. Methodology for allocating resources for data quality enhancement. Commun. ACM 32, 3 (Mar. 1989).
4. Celko, J., and McDonald, J. Don't warehouse dirty data. Datamation 41, 19 (Oct. 1995).
5. Cushing, B. A mathematical approach to the analysis and design of internal control systems. Accounting Review 49, 1 (Jan. 1974).
6. Laudon, K. Data quality and due process in large interorganizational record systems. Commun. ACM 29, 1 (Jan. 1986).
7. Morey, R.C. Estimating and improving the quality of information in an MIS. Commun. ACM 25, 5 (May 1982).
8. Redman, T. Data Quality: Management and Technology. Bantam Books, New York.
9. Strong, D., and Miller, S. Exceptions and exception handling in computerized information processes. ACM Transactions on Information Systems 13, 2 (Apr. 1995).
10. Wand, Y., and Wang, R. Anchoring data quality dimensions in ontological foundations. Commun. ACM 39, 11 (Nov. 1996).
11. Wang, R., Storey, V., and Firth, C. A framework for analysis of data quality research. IEEE Transactions on Knowledge and Data Engineering 7, 4 (Aug. 1995).
12. Wang, R.Y., and Strong, D.M. Beyond accuracy: What data quality means to data consumers. Journal of Management Information Systems 12, 4 (Spring 1996).

Donald P. Ballou (d.ballou@cnsibm.albany.edu) is an associate professor of management science and information systems in the School of Business at the State University of New York, Albany.

Giri Kumar Tayi (g.tayi@cnsibm.albany.edu) is an associate professor of management science and information systems in the School of Business at the State University of New York, Albany.


More information

IS YOUR DATA WAREHOUSE SUCCESSFUL? Developing a Data Warehouse Process that responds to the needs of the Enterprise.

IS YOUR DATA WAREHOUSE SUCCESSFUL? Developing a Data Warehouse Process that responds to the needs of the Enterprise. IS YOUR DATA WAREHOUSE SUCCESSFUL? Developing a Data Warehouse Process that responds to the needs of the Enterprise. Peter R. Welbrock Smith-Hanley Consulting Group Philadelphia, PA ABSTRACT Developing

More information

Data Warehouse Quality Management Model

Data Warehouse Quality Management Model ABSTRACT A SIMPLIFIED APPROACH FOR QUALITY MANAGEMENT IN DATA WAREHOUSE Vinay Kumar 1 and Reema Thareja 2* 1 Professor, Department of IT, VIPS, GGSIPU, New Delhi 110 088, India 2 Assistant Professor SPM

More information

Modern Systems Analysis and Design

Modern Systems Analysis and Design Modern Systems Analysis and Design Prof. David Gadish Structuring System Data Requirements Learning Objectives Concisely define each of the following key data modeling terms: entity type, attribute, multivalued

More information

An Exploratory Study of Data Quality Management Practices in the ERP Software Systems Context

An Exploratory Study of Data Quality Management Practices in the ERP Software Systems Context An Exploratory Study of Data Quality Management Practices in the ERP Software Systems Context Michael Röthlin michael.roethlin@iwi.unibe.ch Abstract: Quality data are not only relevant for successful Data

More information

[Refer Slide Time: 05:10]

[Refer Slide Time: 05:10] Principles of Programming Languages Prof: S. Arun Kumar Department of Computer Science and Engineering Indian Institute of Technology Delhi Lecture no 7 Lecture Title: Syntactic Classes Welcome to lecture

More information

BEST PRACTICES IN DEMAND AND INVENTORY PLANNING

BEST PRACTICES IN DEMAND AND INVENTORY PLANNING WHITEPAPER BEST PRACTICES IN DEMAND AND INVENTORY PLANNING for Food & Beverage Companies WHITEPAPER BEST PRACTICES IN DEMAND AND INVENTORY PLANNING 2 ABOUT In support of its present and future customers,

More information

Assessing Your Business Analytics Initiatives

Assessing Your Business Analytics Initiatives Assessing Your Business Analytics Initiatives Eight Metrics That Matter WHITE PAPER SAS White Paper Table of Contents Introduction.... 1 The Metrics... 1 Business Analytics Benchmark Study.... 3 Overall

More information

virtual class local mappings semantically equivalent local classes ... Schema Integration

virtual class local mappings semantically equivalent local classes ... Schema Integration Data Integration Techniques based on Data Quality Aspects Michael Gertz Department of Computer Science University of California, Davis One Shields Avenue Davis, CA 95616, USA gertz@cs.ucdavis.edu Ingo

More information

Introduction to Strategic Supply Chain Network Design Perspectives and Methodologies to Tackle the Most Challenging Supply Chain Network Dilemmas

Introduction to Strategic Supply Chain Network Design Perspectives and Methodologies to Tackle the Most Challenging Supply Chain Network Dilemmas Introduction to Strategic Supply Chain Network Design Perspectives and Methodologies to Tackle the Most Challenging Supply Chain Network Dilemmas D E L I V E R I N G S U P P L Y C H A I N E X C E L L E

More information

Capgemini Financial Services. 29 July 2010

Capgemini Financial Services. 29 July 2010 Regulatory Compliance: The critical importance of data quality Capgemini Financial Services ACORD IT Club Presentation 29 July 2010 Confidentiality Agreement Notice to the Recipient of this Document The

More information

Chapter 4 SUPPLY CHAIN PERFORMANCE MEASUREMENT USING ANALYTIC HIERARCHY PROCESS METHODOLOGY

Chapter 4 SUPPLY CHAIN PERFORMANCE MEASUREMENT USING ANALYTIC HIERARCHY PROCESS METHODOLOGY Chapter 4 SUPPLY CHAIN PERFORMANCE MEASUREMENT USING ANALYTIC HIERARCHY PROCESS METHODOLOGY This chapter highlights on supply chain performance measurement using one of the renowned modelling technique

More information

Assessing the Appropriate Level of Project, Program, and PMO Structure

Assessing the Appropriate Level of Project, Program, and PMO Structure PMI Virtual Library 2011 Daniel D. Magruder Assessing the Appropriate Level of Project, Program, and PMO Structure By Daniel D. Magruder, PMP Executive Summary Does your organization have in-flight projects

More information

Healthcare Measurement Analysis Using Data mining Techniques

Healthcare Measurement Analysis Using Data mining Techniques www.ijecs.in International Journal Of Engineering And Computer Science ISSN:2319-7242 Volume 03 Issue 07 July, 2014 Page No. 7058-7064 Healthcare Measurement Analysis Using Data mining Techniques 1 Dr.A.Shaik

More information

Internal Audit. Audit of HRIS: A Human Resources Management Enabler

Internal Audit. Audit of HRIS: A Human Resources Management Enabler Internal Audit Audit of HRIS: A Human Resources Management Enabler November 2010 Table of Contents EXECUTIVE SUMMARY... 5 1. INTRODUCTION... 8 1.1 BACKGROUND... 8 1.2 OBJECTIVES... 9 1.3 SCOPE... 9 1.4

More information

On the Setting of the Standards and Practice Standards for. Management Assessment and Audit concerning Internal

On the Setting of the Standards and Practice Standards for. Management Assessment and Audit concerning Internal (Provisional translation) On the Setting of the Standards and Practice Standards for Management Assessment and Audit concerning Internal Control Over Financial Reporting (Council Opinions) Released on

More information

Measuring the Performance of an Agent

Measuring the Performance of an Agent 25 Measuring the Performance of an Agent The rational agent that we are aiming at should be successful in the task it is performing To assess the success we need to have a performance measure What is rational

More information

Key performance indicators

Key performance indicators Key performance indicators Winning tips and common challenges Having an effective key performance indicator (KPI) selection and monitoring process is becoming increasingly critical in today s competitive

More information

- Answers to Review Questions, Tutorial 2: The Database Design Life Cycle (Rob & Coronel, Database Systems: Design, Implementation & Management)
- Busting 7 Myths about Master Data Management (David Loshin, Knowledge Integrity, Inc., 2011)
- Achieving ITSM Excellence Through Availability Management
- 5 Best Practices for SAP Master Data Governance (David Loshin, sponsored by Winshuttle, 2012)
- Reusable Conceptual Models as a Support for the Higher Information Quality (Tatjana Welzer, Bruno Stiglic, Ivan Rozman, Marjan Družovec, University of Maribor)
- Demand Forecasting & Aggregate Planning in a Supply Chain (Prof. P. S. Satish)
- Data Analytics in Organisations and Business (Isabelle Flückiger, ETH Zürich)
- Comparative Analysis of ROTC, OCS and Service Academies as Commissioning Sources (Advanced Management Program, November 2004)
- Product Data Quality Control for Collaborative Product Development (Hsien-Jung Wu)
- Responsibility I: Assessing Individual and Community Needs for Health Education
- Understanding Financial Management: A Practical Guide, Guideline Answers to the Concept Check Questions (Chapter 8, Capital Budgeting)
- A Software Engineering View of Data Quality (Mónica Bobrowski, Martina Marré, Daniel Yankelevich, Universidad de Buenos Aires)
- Actuarial Standard of Practice No. 23: Data Quality, Revised Edition (Actuarial Standards Board)
- Oversight of Information Technology Projects (Office of the Legislative Auditor, State of Minnesota, Report 09-19, May 2009)
- A Metadata-based Approach to Leveraging the Information Supply of Business Intelligence Systems (University of Augsburg, Discussion Paper WI-325)
- Project Time Management
- Management and Optimization of Multiple Supply Chains (J. Dorn, Technische Universität Wien)
- The Benefits of Data Modeling in Data Warehousing (white paper, November 2008)
- Operations and Supply Chain Management, Lecture 36: Location Problems (Prof. G. Srinivasan, IIT Madras)
- Formulation of a Decision Support Model Using Quality Attributes (Michael S. Gendron and Marianne J. D'Onofrio, Central Connecticut State University)
- Conclusions and Suggestions for Further Research
- Spatial Data Classification and Data Mining (Rathi J.B. and Patil A.D.)
- Toward Quantitative Process Management with Exploratory Data Analysis (Mark C. Paulk, Software Engineering Institute, Carnegie Mellon University)
- Methodological Approaches to Evaluation of Information System Functionality Performances and Importance of Successfulness Factors Analysis (Gordana Platiša and Neđo Balaban)
- Mergers and Acquisitions: The Data Dimension (Walid el Abed, Global Excellence)
- Risk Knowledge Capture in the Riskit Method (Jyrki Kontio and Victor R. Basili, University of Maryland)
- Requirements Engineering (Learning Unit 2)
- The Case for Strategic Alignment
- Pharmaceutical Sales Certificate
- Part II: Management Accounting Decision-Making Tools
- The Series of Discussion Papers: Conceptual Framework of Financial Accounting (Working Group on Fundamental Concepts, September 2004)
- Chapter 6: Data Warehouse
- Performance Management Best Practices: Part One, Program Design (Stephen C. Schoonover, Schoonover Associates)
- Data Quality in Information Integration and Business Intelligence (Leopoldo Bertossi, Carleton University)
- Making a Case for Project Management (Interthink Consulting white paper)
- Errors in Operational Spreadsheets: A Review of the State of the Art (Stephen G. Powell and Kenneth R. Baker, Tuck School of Business, Dartmouth College)
- The Quality of Data and Metadata in a Data Warehouse (Carmen Răduţ)
- Business Intelligence: Recent Experiences in Canada (Leopoldo Bertossi, Carleton University)
- Guidelines to Application of the Antimonopoly Act Concerning Review of Business Combination (Japan Fair Trade Commission, tentative translation, May 2004)
- Analysis of Load Frequency Control Performance Assessment Criteria (George Gross and Jeong Woo Lee, IEEE Transactions on Power Systems, August 2001)
- A Framework for Identifying and Managing Information Quality Metrics of Corporate Performance Management System (Journal of Modern Accounting and Auditing, February 2012)
- Developing Requirements for Data Warehouse Systems with Use Cases (Robert M. Bruckner and Beate List, Vienna University of Technology)
- ACH 1.1: A Tool for Analyzing Competing Hypotheses, Technical Description for Version 1.1 (PARC, with Richards Heuer)
- Faculty Productivity and Costs at The University of Texas at Austin: A Preliminary Analysis (Richard Vedder, Christopher Matgouranis, Jonathan Robe)
- Do Programming Languages Affect Productivity? A Case Study Using Data from Open Source Projects (Daniel P. Delorey, Charles D. Knutson, Scott Chun)
- A Comparison of the OpenGIS Abstract Specification with the CIDOC CRM 3.2 (Martin Doerr, ICS-FORTH, draft, October 2001)
- A Review of Contemporary Data Quality Issues in Data Warehouse ETL Environment (Rupali Gill and Jaiteg Singh)
- An Overview of Quality of Service Computer Network (Amandeep Kaur)
- The Business Case for Compensation Technology
- The Path to Agile Localization (Richard Sikes, Lingoport white paper)
- Software Metrics & Software Metrology, Chapter 4: Quantification and Measurement Are Not the Same! (Alain Abran)
- Predictive and Prescriptive Analytics, An Example: Advanced Sales & Operations Planning (Arnold Mark Wells, April 2015)
- Inventory Management (Neelu Tiwari, Jaipuria Institute of Management)
- ECON 312: Oligopolistic Competition
- Online Project Diagnostics: A Tool for Management of Project Complexities (Prof. A. Jaafari)
- A Foolish Consistency: Technical Challenges in Consistency Management (Anthony Finkelstein, University College London)
- Classification of Fuzzy Data in Database Management System (Deval Popat, Hema Sharda, David Taniar)
- A Production Planning Problem
- Integrated Risk Management: A Framework for Fraser Health
- Patterns for Data Migration Projects (Martin Wagner and Tim Wellhausen, March 2011)
- Business Intelligence as Support to Knowledge Management (Jelica Trninić, Jovica Đurković, Lazar Raković, Faculty of Economics)