Reusable Conceptual Models as a Support for the Higher Information Quality

Tatjana Welzer, Bruno Stiglic, Ivan Rozman, Marjan Družovec
University of Maribor, Maribor, Slovenia

ABSTRACT

Today we are faced with an increasing demand for more and more complex database applications. This rapid growth has stimulated the need for high-level concepts, tools and techniques for database design, development and retrieval, with a final goal: better information quality. One of the new possibilities is using a meta data model repository with reusable components (MetaBase). Moreover, just as important as reusability in such a concept is the quality dimension of the reusable components, which enables a more expedient and efficient design of high-quality data models. As a consequence, the data quality as well as the information quality of any information system improves. In this paper the influence of reusability on information quality, the quality dimensions of the meta data model repository, and the influence of TQM as well as some of Deming's fourteen points are presented and discussed in more detail.

1 INTRODUCTION

Over the years many approaches to improving information quality have been developed and employed in various situations. It is important to recognize that most of these approaches are only vaguely aware that the prerequisite for information quality is data quality, despite the fact that in information technology aggressive steps are being taken to improve precisely the data quality.
In the last few years information technology has been spectacularly successful in automating many operations and making data available to more people. The advances of information technology have also had an impact on poor data quality. Unfortunately, just as it is natural to assume that computerized data are correct, so too it is natural to blame information technology when the data are incorrect. These problems can grow out of proportion, especially in data warehouse environments as well as on the Internet. Further, data (information) users usually use data in their applications without giving the conceptual view much thought. On the contrary, a high-quality conceptual view is of great importance for avoiding the above-mentioned problems and improving the data as well as the information quality. Conceptual modelling, as a process of producing a conceptual view, is a complex and demanding job. A conceptual model, as a result of this process, is used only once, i.e. within a single project. This is not very practical, because a model (or parts of a model) derived for one enterprise could also be used in similar projects in the future for more or less similar enterprises. This argument results in the meta data model repository MetaBase, which enables the use of models or submodels of previous projects in the actual design. One of the most important characteristics of the MetaBase is the high quality level of the saved models. Components (models and submodels) saved in the repository are of at least basic quality. This is confirmed by D.C. Rine (Rine, 1997): through object technology (the technology of the MetaBase) as well as reuse methodology, high-quality tested models (software) are already developed (of the 15 quality characteristics, the basic quality of the MetaBase components already fulfils 10). But for an additional increase of data quality, the information (data) should be managed too, just as products are managed.
Total Data Quality Management (TDQM) and Deming's Fourteen Points are used for this purpose and result in high-quality information (data) products. The MetaBase also already fulfils some of these points. After a short presentation of the terms data quality and information quality in chapter 2, an overview of reusability in conceptual modelling with a short presentation of the MetaBase is given in chapter 3. The quality of reusable components and the influence of Deming's Fourteen Points as well as Total Data Quality Management on higher information quality in the frame of the MetaBase are the main goal of chapter 4. We finally conclude with a summary of the proposed concepts and future research.

2 DATA AND INFORMATION QUALITY

The quality paradigm is difficult to describe because of its amorphous nature; therefore, different authors tend to emphasize different aspects (Fox, 1997). When the quality paradigm was formed, emphasis was given only to inspection as the means to achieve quality - conformance to a standard or a specification. Rapid changes in recent years have led to new definitions of quality. One of the most important is the IEEE standard definition (IEEE, 1998), in which quality is defined as the totality of features and characteristics of a product or service that bears on its ability to satisfy given needs. Between the above mentioned different aspects of quality are for our further
discussion most important data quality and information quality, again as presented by different authors. Thomas C. Redman, in Data Quality for the Information Age (Redman, 1996), defines data quality in its broadest sense. This refers to data that are relevant to their intended uses and are of sufficient detail and quality, with a high degree of accuracy and completeness, consistent with other sources and presented in appropriate ways. Giri Kumar Tayi and Donald P. Ballou, as guest editors of Examining Data Quality in the Communications of the ACM, have defined the term data quality as fitness for use, which implies that the concept of data quality is relative (Tayi, 1998). Data appropriate for one use may not possess sufficient quality for another use. Or, conversely, data already in use conform to some level of quality. A related problem with multiple users of data is that of semantics. The data designer and/or gatherer as well as the initial user may fully agree on the definitions regarding the meaning of the various data items, but the other users will probably not share this view. Such problems are becoming increasingly critical as organizations implement data warehouses. At the same time the conceptual view on data is becoming more and more important as a possible solution to the mentioned problems. Ken Orr's definition of data quality (Orr, 1998) introduces a measurement view of the term. It is defined as a measure of the agreement between the data views presented by an information system and that same data in the real world. Of course no serious information system has data quality of 100%, but it tries to ensure that the data are accurate enough, timely and consistent for the enterprise to survive and make reasonable decisions. Actually, the real problem with data quality is change. Data in any IS database are static, but in the real world they are changing. One more reason to have a conceptual view.
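Orr's measurement view lends itself to a small illustration. The following sketch (the data and the function name are invented for illustration; the paper prescribes no implementation) scores data quality as the fraction of real-world facts with which the system's stored view agrees:

```python
def agreement(system_view: dict, real_world: dict) -> float:
    """Orr-style data quality: fraction of real-world facts the stored data agree with."""
    if not real_world:
        return 1.0
    matches = sum(1 for key, value in real_world.items()
                  if system_view.get(key) == value)
    return matches / len(real_world)

# Hypothetical customer addresses: the IS database vs. the changed real world.
db    = {"alice": "12 Oak St", "bob": "9 Elm St",   "carol": "3 Main St"}
world = {"alice": "12 Oak St", "bob": "44 Pine Rd", "carol": "3 Main St"}

print(f"data quality: {agreement(db, world):.0%}")  # bob has moved; the score drops
```

Because the real world keeps changing while stored data stay static, such a score decays over time unless the database is updated, which is exactly the change problem described above.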
If defining and understanding data and data quality is difficult and differs from source to source, then defining and understanding information is a hornet's nest. In some environments the term information refers to both data and information (Strong, 1997). Data usually refer to information at the early stages of processing, and information to the product at a later stage. Rather than switching between the terms, information is used here to refer to data or information values at any point in the process. But we must still bear in mind that different definitions of information depend upon different points of view. For example: from the information management point of view, information is processed data (Redman, 1996); from the information theory point of view, information is the non-redundant part of a message (Redman, 1996); from the information technology for management point of view, information is data that have been organized so that they have a meaning to the user (Turban, 1996). However, once a point of view is fixed, no conflict should arise, and once again it is important to recognize that the prerequisite for information quality is data quality.
3 REUSABILITY IN CONCEPTUAL MODELING

The database is the main building block of any modern information system. As an essential component, the database must be designed very carefully to contain all the information required by the user. To achieve greater efficiency, the database is designed in a progressive process decomposed into conceptual, logical and physical design. The aim of the conceptual design, which results in a conceptual model, is to describe the information content of the database in an abstract and general way, independent of a particular logical model or DBMS. For these reasons the conceptual model is an essential, important component that is in principle used only once, i.e. within a single database design. This is at the very least time consuming, because the design process for each different database requires a lot of time. Countless conceptual models have been designed once and never used again, not even parts of them (Welzer, 1997). Why? Is each database really so unique that no common and hence reusable data components of conceptual models are to be found? In software engineering reusable components have been well known and practiced almost since the first programs were written. Reuse has been considered a way of overcoming the software crisis. It was never restricted only to programs (Krueger, 1992) and was also applied to the results of different phases of development as well as to human resources. In essence, reuse means to use previously acquired concepts and objects in a new situation. Reuse is such a simple activity that it is not even listed in most dictionaries, or its explanation is very brief (Merriam-Webster's Collegiate Dictionary: reuse means to use again, especially after reclaiming or reprocessing, or further repeated use). In the software environment, reuse is defined as a process of implementing or updating software systems using existing software assets (Reifer, 1997).
Reuse can occur within a system, across similar systems or in widely different systems. Reusable software assets include more than just code: requirements, design models, algorithms, tests, documents and many other products of the software process can be reused. Subsequently, according to the above definitions, we will try to give a detailed explanation of the terms database reuse and reusable database components (Welzer, 1995): Database reuse is a process of using existing database conceptual models, or parts of them, rather than building them from scratch. Typically, reuse involves abstraction, selection, specialization and integration of reusable parts, although different techniques (once they are defined) may emphasize or de-emphasize some of these steps. The primary motivation to reuse database components (conceptual models or parts of them) is to reduce the time and effort required when building a conceptual model, and hence a database. Because the quality of software systems is enhanced by reusing quality software artifacts, which also reduces the time and effort required to maintain software, reusable database components can similarly influence, first of all, logical and physical database design and, not least, database maintenance. Further, considering the above definitions, we should also try to answer the following question: is software reuse easy and database reuse difficult? Actually, every kind of reuse in the field of computer science is difficult, since usually the useful abstractions for
large complex reusable software components or reusable database components will typically be complex themselves (Welzer, 1995). For this reason an appropriate way of presenting reusable concepts - software artifacts, conceptual models or parts of them - must be found. Our suggestion is named MetaBase and represents a meta model repository.

3.1 MetaBase

Our approach applies the concepts of reuse to database design as well and is based on an object-oriented meta data model. We decided on the object-oriented paradigm in order to take advantage of its concepts (reuse, inheritance, methods) for representing our meta data model. The MetaBase model (Figure 1) is introduced as a three-level model distinguishing an object level, an enterprise level and a function level. The enterprise level is central in the MetaBase model. It contains the conceptual models and external models (submodels) that describe a particular enterprise. This central block of the MetaBase is topped with a function level into which business and functional domains are integrated. An application domain links the function and the enterprise level. On the other hand, the enterprise level is also related to a subordinate object level through objects. The object level contains the representation of the semantic objects which constitute the conceptual and external models. The enterprise level connects to the function level through the application domain and the enterprise. The application domain is a very important part of our structure, because reuse of conceptual models (database components) is more promising if their application domains are the same or similar (Freeman, 1987), since a domain model can serve as a terminological framework for discussing commonalities and differences between the conceptual models of an enterprise within the application domain.
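As a rough illustration of the three-level structure and of domain-driven retrieval, consider the following sketch. The class names, fields and search method are hypothetical; the paper describes the MetaBase conceptually and does not give an implementation:

```python
from dataclasses import dataclass, field

@dataclass
class SemanticObject:            # object level: building blocks of the models
    name: str
    properties: list[str] = field(default_factory=list)

@dataclass
class ConceptualModel:           # enterprise level: (sub)models of one enterprise
    enterprise: str
    application_domain: str      # the link up to the function level
    objects: list[SemanticObject] = field(default_factory=list)

class MetaBase:                  # the repository spanning all three levels
    def __init__(self) -> None:
        self.models: list[ConceptualModel] = []

    def save(self, model: ConceptualModel) -> None:
        self.models.append(model)

    def find_reusable(self, application_domain: str) -> list[ConceptualModel]:
        """Reuse is most promising within the same application domain."""
        return [m for m in self.models
                if m.application_domain == application_domain]

repo = MetaBase()
repo.save(ConceptualModel("Hospital A", "health care",
                          [SemanticObject("Patient", ["name", "dob"])]))
repo.save(ConceptualModel("Bank B", "finance",
                          [SemanticObject("Account", ["iban"])]))

hits = repo.find_reusable("health care")
print([m.enterprise for m in hits])   # candidate models for the new design
```

The application domain acts here, as in the text, as the terminological bridge: a new project first names its domain, then retrieves only those saved (sub)models whose domain matches.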
According to the MetaBase structure, the conceptual models or parts of them are saved in the repository and reused in database design. They should be reused when designing more or less similar databases. To enable this process, a successful search for the appropriate components as well as a modified database design process is defined (Welzer, 1997).

Figure 1. MetaBase concept (FUNCTION level: business domain, functional domain; ENTERPRISE level: enterprise, application domain, external model, conceptual model; OBJECT level: object, method, property)

4 QUALITY OF REUSABLE COMPONENTS

It is obvious already from the presentation of the MetaBase repository (meta data model) that the components saved in the repository are of at least basic quality (Welzer, 1998). The intention is to reach an "ideal" conceptual model. For this purpose 15 characteristics should be fulfilled: relevance, obtainability, clarity of definition, comprehensiveness, essentialness, attribute granularity, domain precision, naturalness, occurrence identifiability, homogeneity, minimum redundancy, semantic consistency, structural consistency, robustness and flexibility (Redman, 1996); according to the user's needs, some characteristics of his or her own can be added. Almost all the characteristics are subjective, but of great importance for the conceptual data model and the process of modelling. Additionally, we should mention that the characteristics are not independent of each other; some of them are even in direct conflict. To confirm the aforementioned basic quality of the MetaBase components, their features are checked against the list:

Relevance - objects needed by the applications are included in the conceptual models. The relevance is therefore user-driven or application-related: a new application may require objects which are alike objects in an existing database, which additionally supports the reusability of the MetaBase.

Clarity of definition - all the terms used in the conceptual model are clearly defined. Such definitions are needed by existing and potential users.

Comprehensiveness - each attribute that is needed is included. Ideally, a view should be broad enough to satisfy the needs of all applications, but nothing more.

Occurrence identifiability - identification of the individual objects is made easy.

Homogeneity, structural consistency - the object level enables the minimization of unnecessary attributes.

Minimum redundancy - only checked models are included.
Semantic consistency - models are clear and organized according to the application domains.

Robustness, flexibility - through reuse both characteristics are fulfilled. They work together to increase the useful lifetime of components. Robustness refers to the ability of the view to accommodate changes in the enterprise without changing the basic structure of the component; flexibility refers to the capacity to change components to accommodate new demands.
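Checking a component against the characteristic list can be pictured as a simple tally. The sketch below is only an illustration (the component data and the function name are invented), mirroring the observation that MetaBase components already fulfil 10 of the 15 characteristics:

```python
# Redman's 15 characteristics of an "ideal" conceptual view, as listed in the text.
CHARACTERISTICS = [
    "relevance", "obtainability", "clarity of definition", "comprehensiveness",
    "essentialness", "attribute granularity", "domain precision", "naturalness",
    "occurrence identifiability", "homogeneity", "minimum redundancy",
    "semantic consistency", "structural consistency", "robustness", "flexibility",
]

def fulfilled(component: dict[str, bool]) -> list[str]:
    """Characteristics from the list that the checked component fulfils."""
    return [c for c in CHARACTERISTICS if component.get(c, False)]

# A hypothetical repository component checked against the list:
submodel = {c: True for c in CHARACTERISTICS[:10]}

score = fulfilled(submodel)
print(f"{len(score)} of {len(CHARACTERISTICS)} characteristics fulfilled")
```

Since most characteristics are subjective and partly in conflict, the boolean marks would in practice come from a reviewer's judgment rather than from any automatic measurement.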
But for an additional increase of data quality, the information (data) should be managed too, just as products are managed. Total Data Quality Management (TDQM) is used for this purpose and results in high-quality information (data) products that satisfy internal and external customers (Turban, 1996). Namely, TQM is a focused management philosophy for providing the leadership, training and motivation to continuously improve an organization's management and product (information) oriented processes. Further, TQM is a seven-step process (Turban, 1996) which has some similarities to the MetaBase approach:

Step 1 - Establish the management and cultural environment. With the MetaBase repository the environment is established.

Step 2 - Define the mission for each component. The roles of reusable MetaBase components are well known.

Step 6, Step 7 - Evaluate performance, review and repeat. The MetaBase ensures high-quality components for building new conceptual models.

An additional influence on the methodology comes from Deming's Fourteen Points for quality management, adapted for information (data) (Redman, 1996). Some of these points are already fulfilled by the MetaBase:

Point 1 - Recognize the importance of data and information to the enterprise. MetaBase components support this recognition and already enable some solutions.

Point 2 - Adopt the new philosophy. The enterprise can no longer live with the currently accepted levels of information (data) quality. The MetaBase ensures at least the basic quality of its components.

Point 6 - Institute job training. Through the existing models, individuals and organizations learn how to solve similar problems.

Point 9 - Break down barriers between organizations. The application, functional and business domains ensure a free flow of high-quality models across organizational boundaries.

Point 11 - Eliminate production quotas and management by objectives. Reusable models help in learning how to manage and improve the processes that create and use data and information.
Point 12 - Remove the barriers that stand between those who work on data products and their right to pride in their work. The MetaBase motivates designers to quickly develop new data models of high quality.

Point 13 - Institute training on data and information, their roles in the enterprise and how they may be improved. Reusable models and submodels support this training.

Point 14 - Create a structure in top management that recognizes the importance of data and information and their relationship to the rest of the business. The business domain supports this recognition, and top management can always find support for understanding data in the existing models.
5 CONCLUSION

Conceptual models of different databases are neither right nor wrong; they are more or less useful. Similarly, reusable database components from the MetaBase repository are more or less useful: they present a starting point for conceptual database design supported by database reuse and, according to the MetaBase view, they are more or less reusable. Just as important as reusability, or maybe even more so, is the quality dimension of the reusable components. Of the 15 quality characteristics, the basic quality of the MetaBase components already fulfils 10. But the final goal on the way to information quality is not the 15 characteristics but Deming's fourteen points focused on data and information (Redman, 1996); at present 7 of these points are already fulfilled. In further research on the meta data model repository, they should be considered in a way that assures a better quality of conceptual models and reusable components. Great support on this way is also given by TQM and its seven steps, four of which have a lot in common with the MetaBase research. Further research is planned on the problems of TQM and the remaining Deming's points in connection with the meta data model repository MetaBase.

REFERENCES

Reifer, D.J. (1997). Practical Software Reuse. Toronto: John Wiley & Sons.
Ermer, S.D. (1991). An analytical analysis of Pre-Control. ASQC Quality Congress Transactions, Milwaukee.
Grant, E.L. and Leavenworth, R.S. (1996). Statistical Quality Control. New York: McGraw-Hill.
Ledolter, J. and Swersey, A. (1997). An Evaluation of Pre-Control. Journal of Quality Technology, vol. 29, no. 2.
Mackertich, N.A. (1990). Precontrol vs. control charting: a critical comparison. Quality Engineering, vol. 2, no. 3.
Masing, W. (1994). Handbuch Qualitätsmanagement. München, Wien: Carl Hanser Verlag.
Montgomery, D.C. (1996). Introduction to Statistical Quality Control. New York: John Wiley & Sons.
Pfeifer, T. (1996). Praxishandbuch Qualitätsmanagement.
München, Wien: Carl Hanser Verlag.
Reinhart, G., Lindemann, U. and Heinzl, J. (1996). Qualitätsmanagement. Berlin, Heidelberg: Springer-Verlag.
Salvia, A.A. (1988). AOQ and AOQL for Stoplight Acceptance Sampling. Journal of Quality Technology, vol. 20, no. 3.
Shainin, D. and Shainin, P. (1989). Pre-Control versus x&R charting: continuous or immediate quality improvement? Quality Engineering, vol. 1, no. 4.
Steiner, S.H. (1997). Pre-Control and some simple alternatives. Quality Engineering, vol. 10, no. 1.
Performance Evaluation of Reusable Software Components Anupama Kaur 1, Himanshu Monga 2, Mnupreet Kaur 3 1 M.Tech Scholar, CSE Dept., Swami Vivekanand Institute of Engineering and Technology, Punjab, India
More informationDatabase Schema Management
Whitemarsh Information Systems Corporation 2008 Althea Lane Bowie, Maryland 20716 Tele: 301-249-1142 Email: Whitemarsh@wiscorp.com Web: www.wiscorp.com Table of Contents 1. Objective...1 2. Topics Covered...2
More informationWhat is Wrong with EMR?
What is Wrong with EMR? James J. Cimino, M.D. Associate Professor, Department of Medical Informatics, Columbia University 161 Fort Washington Avenue, New York, New York 10032 USA Phone: (212) 305-8127
More informationWhat is Business Process Design and Why Should I Care?
What is Business Process Design and Why Should I Care? by Jay Cousins and Tony Stewart, RivCom Ltd Introduction No matter how hard individuals work, they cannot overcome a flawed process design, much less
More informationCreating the Golden Record
Creating the Golden Record Better Data through Chemistry Donald J. Soulsby metawright.com Agenda The Golden Record Master Data Discovery Integration Quality Master Data Strategy DAMA LinkedIn Group C.
More informationQuality Management of Software and Systems: Continuous Improvement Approaches
Quality Management of Software and Systems: Continuous Improvement Approaches Contents Quality Improvement Paradigm (QIP) Experience Factory (EF) Goal Question Metric (GQM) GQM + Strategies TQM Definition
More informationC. Wohlin, "Managing Software Quality through Incremental Development and Certification", In Building Quality into Software, pp. 187-202, edited by
C. Wohlin, "Managing Software Quality through Incremental Development and Certification", In Building Quality into Software, pp. 187-202, edited by M. Ross, C. A. Brebbia, G. Staples and J. Stapleton,
More informationIntroduction. 1.1 Motivation. Chapter 1
Chapter 1 Introduction The automotive, aerospace and building sectors have traditionally used simulation programs to improve their products or services, focusing their computations in a few major physical
More informationBCS Professional Examination 2015 Professional Graduate Diploma. April 2015. Examiners Report. System Design Methods
BCS Professional Examination 2015 Professional Graduate Diploma April 2015 Examiners Report System Design Methods Question 1 1.a) Discuss why prototyping and agile approaches to systems design are increasingly
More informationAuthoring Within a Content Management System. The Content Management Story
Authoring Within a Content Management System The Content Management Story Learning Goals Understand the roots of content management Define the concept of content Describe what a content management system
More informationClass Objectives. Total Quality Management. TQM Definitions. TQM Definitions. TQM Definitions. TQM Definitions. Basic concepts on TQM
Class Objectives Total Quality Management FScN 4131 Food Quality Basic concepts on TQM Compare TQM philosophies Describe the TQM process Total: Everyone should be involved Quality: customers should be
More informationApplied Analytics in a World of Big Data. Business Intelligence and Analytics (BI&A) Course #: BIA 686. Catalog Description:
Course Title: Program: Applied Analytics in a World of Big Data Business Intelligence and Analytics (BI&A) Course #: BIA 686 Instructor: Dr. Chris Asakiewicz Catalog Description: Business intelligence
More informationTotal Quality Management
Total Quality Management 1 Chapter 12: Total Employee involvement 2 Human factor is very important in implementation of any process or principle. It is all the more important in Quality Management. Organization
More informationBusiness Modeling with UML
Business Modeling with UML Hans-Erik Eriksson and Magnus Penker, Open Training Hans-Erik In order to keep up and be competitive, all companies Ericsson is and enterprises must assess the quality of their
More informationExtracting Business. Value From CAD. Model Data. Transformation. Sreeram Bhaskara The Boeing Company. Sridhar Natarajan Tata Consultancy Services Ltd.
Extracting Business Value From CAD Model Data Transformation Sreeram Bhaskara The Boeing Company Sridhar Natarajan Tata Consultancy Services Ltd. GPDIS_2014.ppt 1 Contents Data in CAD Models Data Structures
More informationBusiness Process Change and the Role of the Management Accountant
Butler University Digital Commons @ Butler University Scholarship and Professional Work - Business College of Business 1998 Business Process Change and the Role of the Management Accountant Sakthi Mahenthiran
More informationMeasurement Information Model
mcgarry02.qxd 9/7/01 1:27 PM Page 13 2 Information Model This chapter describes one of the fundamental measurement concepts of Practical Software, the Information Model. The Information Model provides
More informationData Modeling Basics
Information Technology Standard Commonwealth of Pennsylvania Governor's Office of Administration/Office for Information Technology STD Number: STD-INF003B STD Title: Data Modeling Basics Issued by: Deputy
More informationComparing Different Approaches to Two-Level Modelling of Electronic Health Records
113 Comparing Different Approaches to Two-Level Modelling of Electronic Health Records Line Michelsen, Signe S. Pedersen, Helene B. Tilma, Stig K. Andersen Abstract Department of Health Science and Technology
More informationBachelor's Degree in Management Information Systems
Description for all courses in BIS for Bachelor s Degree in Management Information Systems and Master s Degree in Management Information Systems Bachelor's Degree in Management Information Systems Department
More informationTalend Metadata Manager. Reduce Risk and Friction in your Information Supply Chain
Talend Metadata Manager Reduce Risk and Friction in your Information Supply Chain Talend Metadata Manager Talend Metadata Manager provides a comprehensive set of capabilities for all facets of metadata
More informationA Knowledge Management Framework Using Business Intelligence Solutions
www.ijcsi.org 102 A Knowledge Management Framework Using Business Intelligence Solutions Marwa Gadu 1 and Prof. Dr. Nashaat El-Khameesy 2 1 Computer and Information Systems Department, Sadat Academy For
More informationAUTOMATING DISCRETE EVENT SIMULATION OUTPUT ANALYSIS AUTOMATIC ESTIMATION OF NUMBER OF REPLICATIONS, WARM-UP PERIOD AND RUN LENGTH.
Proceedings of the 2009 INFORMS Simulation Society Research Workshop L.H. Lee, M. E. Kuhl, J. W. Fowler and S.Robinson, eds. AUTOMATING DISCRETE EVENT SIMULATION OUTPUT ANALYSIS AUTOMATIC ESTIMATION OF
More informationINFORMATION FOR OBSERVERS. Joint International Working Group on Leasing 15 February 2007, London
International Accounting Standards Board This document is provided as a convenience to observers at IASB/FASB joint international working group meeting on leasing, to assist them in following the working
More informationAnalyzing and Improving Data Quality
Analyzing and Improving Data Quality Agustina Buccella and Alejandra Cechich GIISCO Research Group Departamento de Ciencias de la Computación Universidad Nacional del Comahue Neuquen, Argentina {abuccel,acechich}@uncoma.edu.ar
More informationAn Object-Oriented Analysis Method for Customer Relationship Management Information Systems. Abstract
75 Electronic Commerce Studies Vol. 2, No.1, Spring 2004 Page 75-94 An Object-Oriented Analysis Method for Customer Relationship Management Information Systems Jyh-Jong Lin Chaoyang University of Technology
More informationDATA MINING TECHNOLOGY. Keywords: data mining, data warehouse, knowledge discovery, OLAP, OLAM.
DATA MINING TECHNOLOGY Georgiana Marin 1 Abstract In terms of data processing, classical statistical models are restrictive; it requires hypotheses, the knowledge and experience of specialists, equations,
More informationSoftware Engineering. System Models. Based on Software Engineering, 7 th Edition by Ian Sommerville
Software Engineering System Models Based on Software Engineering, 7 th Edition by Ian Sommerville Objectives To explain why the context of a system should be modeled as part of the RE process To describe
More informationModeling the User Interface of Web Applications with UML
Modeling the User Interface of Web Applications with UML Rolf Hennicker,Nora Koch,2 Institute of Computer Science Ludwig-Maximilians-University Munich Oettingenstr. 67 80538 München, Germany {kochn,hennicke}@informatik.uni-muenchen.de
More informationestatistik.core: COLLECTING RAW DATA FROM ERP SYSTEMS
WP. 2 ENGLISH ONLY UNITED NATIONS STATISTICAL COMMISSION and ECONOMIC COMMISSION FOR EUROPE CONFERENCE OF EUROPEAN STATISTICIANS Work Session on Statistical Data Editing (Bonn, Germany, 25-27 September
More informationImproving Traceability of Requirements Through Qualitative Data Analysis
Improving Traceability of Requirements Through Qualitative Data Analysis Andreas Kaufmann, Dirk Riehle Open Source Research Group, Computer Science Department Friedrich-Alexander University Erlangen Nürnberg
More informationDATA QUALITY DATA BASE QUALITY INFORMATION SYSTEM QUALITY
DATA QUALITY DATA BASE QUALITY INFORMATION SYSTEM QUALITY The content of those documents are the exclusive property of REVER. The aim of those documents is to provide information and should, in no case,
More informationHow To Develop An Enterprise Architecture
OSI Solution Architecture Framework Enterprise Service Center April 2008 California Health and Human Services Agency Revision History REVISION HISTORY REVISION/WORKSITE # DATE OF RELEASE OWNER SUMMARY
More informationMeasurement and Metrics Fundamentals. SE 350 Software Process & Product Quality
Measurement and Metrics Fundamentals Lecture Objectives Provide some basic concepts of metrics Quality attribute metrics and measurements Reliability, validity, error Correlation and causation Discuss
More informationA Fragmented Approach To CRM: An Oxymoron? By Glen S. Petersen
A Fragmented Approach To CRM: An Oxymoron? By Glen S. Petersen All rights reserved GSP & Associates, LLC 2000 Introduction Many organizations find themselves in the position of having an obvious problem
More informationThe Phases of an Object-Oriented Application
The Phases of an Object-Oriented Application Reprinted from the Feb 1992 issue of The Smalltalk Report Vol. 1, No. 5 By: Rebecca J. Wirfs-Brock There is never enough time to get it absolutely, perfectly
More informationEnterprise Resource Planning Analysis of Business Intelligence & Emergence of Mining Objects
Enterprise Resource Planning Analysis of Business Intelligence & Emergence of Mining Objects Abstract: Build a model to investigate system and discovering relations that connect variables in a database
More informationCONTEMPORARY SEMANTIC WEB SERVICE FRAMEWORKS: AN OVERVIEW AND COMPARISONS
CONTEMPORARY SEMANTIC WEB SERVICE FRAMEWORKS: AN OVERVIEW AND COMPARISONS Keyvan Mohebbi 1, Suhaimi Ibrahim 2, Norbik Bashah Idris 3 1 Faculty of Computer Science and Information Systems, Universiti Teknologi
More informationAn Automatic Tool for Checking Consistency between Data Flow Diagrams (DFDs)
An Automatic Tool for Checking Consistency between Data Flow Diagrams (DFDs) Rosziati Ibrahim, Siow Yen Yen Abstract System development life cycle (SDLC) is a process uses during the development of any
More informationManaging and Tracing the Traversal of Process Clouds with Templates, Agendas and Artifacts
Managing and Tracing the Traversal of Process Clouds with Templates, Agendas and Artifacts Marian Benner, Matthias Book, Tobias Brückmann, Volker Gruhn, Thomas Richter, Sema Seyhan paluno The Ruhr Institute
More informationDesign of a Multi Dimensional Database for the Archimed DataWarehouse
169 Design of a Multi Dimensional Database for the Archimed DataWarehouse Claudine Bréant, Gérald Thurler, François Borst, Antoine Geissbuhler Service of Medical Informatics University Hospital of Geneva,
More informationA Service Modeling Approach with Business-Level Reusability and Extensibility
A Service Modeling Approach with Business-Level Reusability and Extensibility Jianwu Wang 1,2, Jian Yu 1, Yanbo Han 1 1 Institute of Computing Technology, Chinese Academy of Sciences, 100080, Beijing,
More informationComparing Methods to Identify Defect Reports in a Change Management Database
Comparing Methods to Identify Defect Reports in a Change Management Database Elaine J. Weyuker, Thomas J. Ostrand AT&T Labs - Research 180 Park Avenue Florham Park, NJ 07932 (weyuker,ostrand)@research.att.com
More informationProcessing Requirements by Software Configuration Management
Processing Requirements by Software Configuration Management Ivica Crnkovic 1, Peter Funk 1, Magnus Larsson 2 1 Mälardalen University, Department of Computer Engineering, S-721 23 Västerås, Sweden {ivica.crnkovic,
More informationData Warehouses in the Path from Databases to Archives
Data Warehouses in the Path from Databases to Archives Gabriel David FEUP / INESC-Porto This position paper describes a research idea submitted for funding at the Portuguese Research Agency. Introduction
More informationComponent visualization methods for large legacy software in C/C++
Annales Mathematicae et Informaticae 44 (2015) pp. 23 33 http://ami.ektf.hu Component visualization methods for large legacy software in C/C++ Máté Cserép a, Dániel Krupp b a Eötvös Loránd University mcserep@caesar.elte.hu
More informationMining the Software Change Repository of a Legacy Telephony System
Mining the Software Change Repository of a Legacy Telephony System Jelber Sayyad Shirabad, Timothy C. Lethbridge, Stan Matwin School of Information Technology and Engineering University of Ottawa, Ottawa,
More informationONTOLOGY FOR MOBILE PHONE OPERATING SYSTEMS
ONTOLOGY FOR MOBILE PHONE OPERATING SYSTEMS Hasni Neji and Ridha Bouallegue Innov COM Lab, Higher School of Communications of Tunis, Sup Com University of Carthage, Tunis, Tunisia. Email: hasni.neji63@laposte.net;
More informationArchitecture Artifacts Vs Application Development Artifacts
Architecture Artifacts Vs Application Development Artifacts By John A. Zachman Copyright 2000 Zachman International All of a sudden, I have been encountering a lot of confusion between Enterprise Architecture
More informationData Quality Assessment. Approach
Approach Prepared By: Sanjay Seth Data Quality Assessment Approach-Review.doc Page 1 of 15 Introduction Data quality is crucial to the success of Business Intelligence initiatives. Unless data in source
More informationRoot causes affecting data quality in CRM
MKWI 2010 Business Intelligence 1125 Root causes affecting data quality in CRM Chair of Business Informatics, Catholic University of Eichstaett-Ingolstadt 1 Introduction An important field of application
More informationSoftware Engineering Introduction & Background. Complaints. General Problems. Department of Computer Science Kent State University
Software Engineering Introduction & Background Department of Computer Science Kent State University Complaints Software production is often done by amateurs Software development is done by tinkering or
More informationEffecting Data Quality Improvement through Data Virtualization
Effecting Data Quality Improvement through Data Virtualization Prepared for Composite Software by: David Loshin Knowledge Integrity, Inc. June, 2010 2010 Knowledge Integrity, Inc. Page 1 Introduction The
More informationInvestigating Role of Service Knowledge Management System in Integration of ITIL V3 and EA
Investigating Role of Service Knowledge Management System in Integration of ITIL V3 and EA Akbar Nabiollahi, Rose Alinda Alias, Shamsul Sahibuddin Faculty of Computer Science and Information System Universiti
More information