TOTAL DATA QUALITY MANAGEMENT: A STUDY OF BRIDGING RIGOR AND RELEVANCE
Fons Wijnhoven, University of Twente, Enschede, Netherlands, [email protected]
Roy Boelens, SG Automatisering, Emmen, Netherlands, [email protected]
Rick Middel, University of Twente, Enschede, Netherlands, [email protected]
Kor Louissen, Honeywell, Emmen, Netherlands, [email protected]

Abstract

Ensuring data quality is of crucial importance to organizations. The Total Data Quality Management (TDQM) theory provides a methodology to ensure data quality. Although well researched, the TDQM methodology is not easy to apply. In the case of Honeywell Emmen, we found that applying the methodology requires considerable contextual redesign, flexibility in use, and the provision of practical tools. We identified team composition, toolsets, the development of seemingly obvious actions, the design of phases, steps, and actions, and session planning as vital elements in making an academically rooted methodology applicable. We call such an applicable methodology well articulated, because it incorporates the existing academic theory and has been made operational. This enables the methodology to be systematically beta tested and made useful for different organizational conditions.

Keywords: Total data quality management, applicability of theory, design science.
1 INTRODUCTION AND RESEARCH DESIGN

Data is computer-stored information [Stamper, 1973], and its quality can substantially influence three strategic vectors of organizations: customer interaction, asset configuration, and knowledge leverage [Venkatraman & Henderson, 1998]. Data is therefore an important resource for organizational competitiveness [Redman, 1998]. Incorrect data at the point of customer interaction may result in significant perceived quality problems for clients, whereas high quality data offers opportunities to serve clients better. For asset configuration, high quality data may improve inventory management and production and resources planning. In business networks, poor data quality may result in errors accumulating through the value chain, while high quality data may reduce human coordination costs. High quality data also enables business analysts to gain proper insight into production and service processes and to propose ways of improving them. The literature presents different views on quality [Garvin, 1988], which are reflected in the concept of data quality. Zahedi [1995] emphasizes two relevant views on data quality: a product-based view, which defines quality as the difference between the desired quantity of some ingredient or material in a product and the actual quantity, and a production-oriented view, which defines data quality as process conformance to requirements. In total quality management (TQM), both views have been integrated by Juran as fitness-for-use. This is probably the most fruitful perspective on data quality, because data is primarily important for users.
The current Total Data Quality Management (TDQM) theory is an implementation of the idea of fitness-for-use [Wang & Strong, 1996]. The main quality dimensions of this approach are summarized in Table 1.

Data quality category           Data quality dimensions
Intrinsic data quality          Believability, Accuracy, Objectivity, Reputation
Contextual data quality         Value-added, Relevancy, Timeliness, Completeness, Appropriate amount of data
Representational data quality   Interpretability, Ease of understanding, Representational consistency, Concise representation
Accessibility data quality      Accessibility, Access security

Table 1. Data quality dimensions [Wang & Strong, 1996: 20]

Data quality is specifically a problem for larger firms, which own large databases. In such organizations, as a consequence of required specialization and differentiation [Mintzberg, 1983], many people have responsibilities for data and many people may have different data use needs. Additionally, the coordination costs [Gurbaxani & Whang, 1991] related to discussing data requirements and quality actions within Honeywell are often perceived as high, and the intangible nature of data management makes an assessment of its benefits difficult [Sveiby, 1997]. Consequently, an organization of this kind needs efficient procedures and methods to coordinate data quality, even without concrete value deliveries to point to. Many companies are aware of data quality issues, and TDQM has become a common theme in organizations. Surprisingly, the literature in the field of Total Data Quality Management is scarce. Only a few papers in academic journals report on data quality [Ballou et al., 1997; Pipino et al., 2002; Redman, 1998; Wang & Strong, 1996], and only one paper includes a full discussion of a TDQM methodology [Wang, 1998]. This TDQM methodology has not come into widespread use, and we assume that considerable problems exist in the applicability of this body of knowledge. Applicability of knowledge can be studied in at least three ways, i.e.
1) by an action research study, which stresses the contextual, local relevance of the knowledge; 2) by an evaluation study, which stresses the academic quality and rigor of the knowledge and its application; and 3) by a design science study, which aims at integrating relevance and rigor in one study [Hevner et al., 2004; Romme, 2003; Van Aken, 2004]. This article chooses the last option. Design science sees the relation between academic insight and practice as running both ways: applying academic insights should deliver concrete value for practice, and applying academic insights should give feedback to
academic work [Hevner et al., 2004; Van Aken, 2004]. To assure utility, both Van Aken [2004] and Hevner et al. [2004] suggest assessing the designed methodology in practice and using the evaluation of these tests to refine the methodology. To assure the utility of the TDQM methodology, this paper follows the logic of the design science framework as given in Figure 1.

Figure 1: The research plan as an example of design research (based on Hevner et al., 2004, p. 80)

The described TDQM methodology (knowledge base) will be applied within the environment of Honeywell Combustion Controls Center Emmen (named Honeywell Emmen in the rest of this paper). A specific method for Honeywell Emmen is first developed (i.e. IS research), based on the existing TDQM knowledge base and the needs of Honeywell Emmen. These needs are based on interviews with key members in the area of the product-attributes: the business improvement and business IT manager, a financial manager responsible for the status of the products in the database, and a financial controller/planner. An assessment within Honeywell Emmen is then executed. The results of the assessment are used to refine the method and to find additions to the TDQM knowledge base with respect to its applicability.

2 THE TDQM KNOWLEDGE BASE

The current literature on data quality reports on concepts to describe the field (e.g. Wang & Strong, 1996) and theories that explain possible impacts and causes of data quality (e.g. Redman, 1998). This study, though, does not focus on this theory but aims at studying a methodology for handling data quality. Such a methodology should help to solve problems (as explained by theory), to avoid problems, or to achieve certain ambitions. An initial step for such a methodology is the design of the team that has to run the activities.

2.1 The TDQM team

Following Wang [1998], the data product quality team should consist of:
- Team leader: a senior executive able to implement chosen solutions.
- Team engineer: one who has knowledge about the method and the techniques used in it; this person can also facilitate the team meetings.
- Data suppliers: those who create or collect data for the data product.
- Data manufacturers: those who design, develop, or maintain the data and systems infrastructure for the data product.
- Data consumers: those who use the data product in their work.
- Data product managers: those who are responsible for managing the entire data product production process throughout the data product life cycle.

This team should be well trained in data quality assessment and management skills.

2.2 The TDQM cycle

TDQM methodologies use an adapted version of the Deming cycle. This cycle has become a key element of the TQM literature, and it identifies four main steps: Plan, Do, Check, Act [Deming, 1986]. The TDQM methodology uses a version directed towards data quality, consisting of the steps Defining, Measuring, Analysing, and Improving [Wang, 1998].

2.2.1 The define phase

The define phase comprises three steps.

Step 1 determines the data product's characteristics. In describing the characteristics of the data product, Wang [1998] uses two levels: the highest level describes the characteristics of the total data product, and the lowest level describes each product-attribute individually. Figure 2 illustrates the difference between data products and product-attributes.

Figure 2: Data products and data product-attributes

Because not all data products and attributes can be assessed in a single effort, it is useful to focus on the most important data product or attribute at a time, which may be the products and attributes whose quality problems have the highest impact. To this end, Redman [1998] created a list of the ten most commonly seen impacts of poor data quality.

Step 2 determines the requirements for the data products. The TDQM methodology proposes to determine which of the quality dimensions for data products (or attributes) [Wang & Strong, 1996] are most important by asking, for each dimension:
- how important each team member thinks the dimension is,
- the perceived level of quality in the dimension, and
- the expected level of quality in the dimension.

Step 3 determines the data manufacturing process. The data manufacturing process consists of data flows from the supplier to the user of the data, including certain processing activities and quality checks.
Wang [1998] explains that knowledge of the manufacturing process serves as a basis for better understanding why certain quality dimensions are important and how the responsibilities for this process are divided. Additionally, redundant processes and storages can be detected, which often easily lead to errors and inconsistencies. Having a clearly defined process also helps data quality management by codifying processes, making them person-independent and reliable. A simplified version of such a data manufacturing process is given in Figure 3.
Figure 3: Illustration of an information manufacturing system (based on Wang, 1998, p. 64)

2.2.2 The measurement phase

The measurement phase determines the quality of the dimensions identified in the define phase. This phase involves two steps.

Step 1 selects proper metrics. In determining a metric for a data quality dimension (see Table 1), the team should keep in mind the underlying business rules, (ISO) norms, and laws that might have contributed to a dimension's importance. Pipino et al. [2002] identified three measurement forms for quality dimensions:
- The Simple Ratio measures the ratio of outcomes of a selected variable to the total outcomes of that variable.
- The Min or Max Operation handles dimensions that require the aggregation of multiple data quality variables.
- The Weighted Average is appropriate when, for example, a company has a good understanding of the importance of each variable to the overall evaluation of a dimension.

Because it can be difficult to translate the dimensions chosen in the previous step into metrics that represent quality, the following guidelines (see Table 2) can be used [Kovac et al., 1997]:

R   Is the metric Reasonable?
U   Is the metric Understandable?
M   Is the metric Measurable?
B   Is the metric Believable?
A   Is the metric Achievable?

Table 2. RUMBA guidelines for metrics [Kovac et al., 1997]

Step 2 measures and presents data. Once the metrics are determined, measurement can be conducted. If there are, for example, many products in the database, a sample size of about 400 products is needed to reach a reliability of 95% with a possible variance of 5% [Moore & McCabe, 1989]. Several types of charts can be used to display the results. To identify which dimensions cause the problems, the results can also be displayed using Pareto diagrams [Zahedi, 1995].
In this chart, the horizontal axis displays the errors in the different dimensions, and the vertical axis displays the percentage of errors in each dimension relative to the total number of errors. This is an easy way of showing which dimensions cause the most problems.

2.2.3 The analysis phase

The goal of this phase is to find the root causes of the problems in the different dimensions [Wang, 1998]. Three methods to find root causes have become popular: the cause and effect diagram (CED), the interrelationship diagram (ID), and the current reality tree (CRT). According to Doggett [2005], the CED method is the easiest tool for identifying root causes. This step should answer the questions of what the problem is, when it happened, why it happened, and how it impacts the overall goals of the firm.
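The three metric forms of Pipino et al. [2002], together with the sample-size rule of thumb mentioned in the measurement step, can be sketched as follows. The function names, example values, and weights are our own illustrative assumptions and are not prescribed by the TDQM literature:

```python
import math

def sample_size(z=1.96, p=0.5, e=0.05):
    """Minimum sample size for a proportion at confidence z and margin e.
    With z=1.96 (95%) and e=0.05 this gives 385, which the paper rounds
    to roughly 400 products."""
    return math.ceil(z * z * p * (1 - p) / (e * e))

def simple_ratio(desirable, total):
    """Simple Ratio: fraction of desirable outcomes, e.g. complete records
    divided by all sampled records."""
    return desirable / total

def min_operation(*ratios):
    """Min Operation: conservative aggregate of several quality variables,
    dominated by the weakest one."""
    return min(ratios)

def weighted_average(ratios_and_weights):
    """Weighted Average: aggregate when the relative importance of each
    variable is known. Expects (ratio, weight) pairs; weights sum to 1."""
    return sum(r * w for r, w in ratios_and_weights)

n = sample_size()                                   # 385
completeness = simple_ratio(316, 400)               # 0.79: 84 of 400 records had gaps
believability = min_operation(0.95, 0.82, 0.90)     # 0.82: weakest source dominates
accuracy = weighted_average([(0.9, 0.5), (0.8, 0.3), (0.7, 0.2)])
print(n, completeness, believability, round(accuracy, 2))
```

The choice between the three forms follows the dimension's nature: a single countable property suits the simple ratio, aggregates of several variables suit the min (or max) operation, and a weighted average only makes sense once the team agrees on the weights.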
2.2.4 The improvement phase

Wang [1998] proposes four steps for the improvement phase.

Step 1 generates solutions. Wang [1998] proposes to use the information manufacturing analysis matrix designed by Ballou et al. [1997] and presented in Figure 3. This matrix may make a lack of quality checks or processes apparent, or identify possibly redundant processes and redundant data sources.

Step 2 selects solutions. When selecting solutions, it is necessary to know what impacts the solutions might have, the cost to design and implement each solution, the resources required, and the reduction of risk. Priorities for the solutions can be set once the importance of each evaluation criterion is known [Zahedi, 1995].

Step 3 develops an action plan. The selected solution has to be translated into specific actions. The team should assign team members or other employees to these actions to make sure they are executed. A Project Action Plan can be set up to keep track of all actions and their statuses.

Step 4 checks progress. An obvious step, but one only marginally discussed in the literature.

3 HONEYWELL EMMEN'S ENVIRONMENT

Honeywell Emmen is a production factory in the Netherlands which mainly produces gas valves. The data of these gas valves and of all their parts is stored in the Honeywell Emmen database system. Honeywell Emmen (and its customers) has experienced problems with the quality of this data in recent months. The main causes of these problems are:
- no structured way to analyze and improve product-attribute data quality,
- no structured way to analyze and improve the division of responsibilities over the product-attributes, and
- the lack of a structured way for stakeholders to communicate about problems in product-attribute data quality.

The problems that Honeywell Emmen experiences are connected to multiple data products which are generated by the database system. The basis of these data products is about a hundred different product-attributes.
4 DESIGN OF THE HONEYWELL EMMEN TDQM METHOD

The TDQM method for Honeywell Emmen was designed on the basis of the TDQM theory of section 2, with modifications where needed to make TDQM feasible for the company.

4.1 The TDQM team

As Wang [1998] prescribes, Honeywell should first establish a data quality team. Because of the difference between the data product handled by the TDQM methodology and the product-attribute handled by the Honeywell method, the following changes to the team described by Wang [1998] have to be made. First of all, the team members may vary. The method has to be able to handle multiple product-attributes instead of one data product. Therefore, multiple data suppliers and consumers can be involved with each product-attribute. This leads to the need to select these team members based on their role in the chosen product-attribute. Data product managers, however, cannot be selected, because the method does not handle a data product but data attributes instead. Instead, a new role has to be assigned. A role with much influence in the data lifecycle of the product-attributes at Honeywell Emmen is the financial team member. The financial member has to assure that all the
necessary product-attributes are filled out before making a product active in the database, which means that the product can be ordered and produced. These changes lead to the following team members:

Permanent members:
o The team champion, i.e. a senior executive, who is able to implement the solutions generated by the team.
o The team engineer, who has knowledge about the method and the techniques used in it, and who can also facilitate the team meetings.
o The data manufacturer, i.e. the person who designs, develops, and maintains the data for the product-attribute.
o The financial team member, who is responsible for an important attribute and who has knowledge about the financial consequences of a proposed solution.

Non-permanent members:
o The data supplier, who creates and collects data for the product-attribute.
o The data consumer/user, who uses the product-attribute in his/her work.

4.2 Honeywell Emmen's TDQM cycle

The TDQM method we designed for Honeywell Emmen is in many ways close to the theory described in section 2, and is summarized in Table 3. The method, though, deviates from the theory on four fundamental issues.

Wang [1998] prescribes three steps in the define phase: determining the data product's characteristics, determining the requirements for the data products, and determining the data manufacturing process. To apply the TDQM method to product-attributes (an essential decision, given the large size of the Honeywell Emmen databases), a step had to be added in which a product-attribute and the required non-permanent team members are selected. Experienced permanent team members should have knowledge of past product-attribute problems; Redman's list of ten commonly seen impacts of poor data can be used here to identify key attributes. The selection of the product-attribute influences who the involved stakeholders are. Therefore, the product-attribute selection should be done before the (non-permanent) team members can be selected.
This means that the team should consist of permanent members, who can then choose non-permanent members depending on their role with the selected product-attribute.
The designed Total Data Quality Management Method:

Step 1: Product-attribute and team member selection. Actions: select the product-attribute to improve; select team members; schedule the first meeting. Tools: Standard Project Description Form.
Step 2: Characteristics of the product-attribute. Actions: describe the main characteristics of the selected product-attribute. Tools: Product-attribute Characteristics Form.
Step 3: Important quality dimensions. Actions: each member scores the importance, perceived quality, and expected quality of each quality dimension; determine which dimensions need improvement. Tools: Quality Dimension Score Sheet; Calculation Sheet.
Step 4: Data manufacturing process. Actions: determine the data manufacturing process and make a flowchart of it. Tools: MS Visio TDQM Template.
Step 5: Measurement metrics. Actions: (reflect on the results of the previous meeting); determine the metrics for the quality dimensions; plan the measurement steps and actions that follow from the first steps. Tools: RUMBA criteria.
Step 6: Measurement. Actions: (reflect on the chosen metrics); measure a sample of products and determine the quality; possibly other data (other than Oracle data) should be collected by other DQ team members; prepare Pareto diagrams and other graphs. Tools: MS Excel.
Step 7: Describe specific problems. Actions: describe the specific problems found by the measurement. Tools: Problem Description Form.
Step 8: Analysis of the problems. Actions: analyse each problem by determining its cause map; draw the cause map. Tools: MS Visio, MS Excel, or a whiteboard.
Step 9: Solution generation. Actions: think of solutions to every cause defined on the cause map. Tools: MS Visio, MS Excel, or a whiteboard.
Step 10: Solution selection. Actions: the team selects possible solutions; the solutions are scored based on their impact, costs, and risks, and the solution that scores best is implemented.
Tools: Solution Score Sheet.
Step 11: Action plan. Actions: identify tasks to implement the solution; the DQ team assigns these tasks to team members or other employees. Tools: Project Action Plan.
Step 12: Check progress. Actions: discuss the progress of implementing the solutions; solve problems. Tools: updated Project Action Plan.

Table 3. The Total Data Quality Management Method for Honeywell. The twelve steps are grouped into the Define, Measure, Analysis, and Improvement phases, and organized into sessions: a startup session, a first group session, a measurement session, two second group sessions, and an extra session.

We also had to develop toolsets to facilitate concrete actions in each step. These toolsets, consisting of forms, Excel sheets, and drawing tools, have been regarded by the Honeywell team as essential for easing discussions in each phase, and for documenting and maintaining consistency during the TDQM cycle. They also made the cycle less sensitive to employee job changes, which might otherwise result in substantial lost memory. We followed the prescriptions of section 2 in many ways, though we had to be much more specific with regard to the last step, i.e. Check Progress, because for Honeywell Emmen real improvements
are aimed at, and a quality cycle was not believed to result in improvements by itself. To make sure that the actions of the action plan are implemented, the team leader can plan a meeting to review the progress of the action plan. Problems can then be dealt with and new actions can be initiated if needed. We also had to plan concrete sessions, each of which made sense in terms of the cycle.

5 ASSESSMENT AND REFINEMENT OF HONEYWELL EMMEN'S TDQM

The designed method has been field tested on two different product-attributes. Before the first product-attribute was selected, the permanent team members were selected based on the descriptions in the previous section. The manager of the product data management department was chosen as the team leader. This manager is closely involved with the product-attributes and is part of the management team of Honeywell Emmen. The team engineer and data manufacturer roles were combined in one person, the database administrator. He has extensive knowledge of the database systems, knows the possibilities of the system, and knows the MS Excel and MS Visio tools used to analyze the data and present it to the team. The cost administrator was chosen as the financial team member. She has knowledge about the possible financial impacts of a chosen solution and is also involved in the product data lifecycle; she has the responsibility to activate new products in the database. Section 5.1 gives the results of the first cycle of the method within Honeywell and section 5.2 gives the results of the second cycle.

5.1 The first TDQM cycle: the Country-of-Origin attribute

In the startup meeting, the permanent team members selected Country-of-Origin as the first product-attribute to consider, based on the problems it had caused in the past. At this meeting, the non-permanent team members from the purchasing and customer logistics departments were also selected, and the first complete data quality team meeting was planned.
This data was filled out in the Standard Project Description Form.

The first team session

In the first data quality team meeting, the team engineer, the data supplier (i.e. the representative from the purchasing department), and a data user (i.e. someone from the customer logistics department) were present. The team leader and the financial member were not present: management had not yet formally approved the team, so these team members gave priority to other tasks. Other members also indicated that they had little time to attend the meetings. The third step of the method was done in a group discussion instead of using the calculation sheets, because a prior discussion could have influenced the scores on the Quality Dimension Score Sheets, and because it saved time. Step five was not completed, because the data supplier and data user had to go to other meetings. These events led to a first conclusion: the method has to be flexible, depending on the complexity of the product-attribute handled. In addition, the team made some adjustments to the standard documents, to make them clearer and easier to interpret.

The measurement session

The measurement phase was conducted by the team engineer. Both steps five and six were conducted without problems. The results of the measurements revealed problems in the completeness and consistency dimensions. The results for the completeness dimension are given in Figure 4. The results are sorted by the main product categories Honeywell Emmen uses, i.e.:
- Make-products, which are made by Honeywell Emmen itself;
- Buy-products, which Honeywell buys from other Honeywell companies and sells on the Dutch market;
- Active buy-products, which are buy-products that can currently be bought and sold.

In addition to this sorting, the results are also broken down by the different fields in which the data for the attribute is stored, in this case six fields. The completeness measurement revealed missing data in 21% of the 400 sampled products, measured on six database levels.

[Figure 4 plots the completeness error rates per storage field (flex 567, flex NL1, flex NLV, cat 567, cat NL1, cat NLV) for each product category (all products, active products, make products, buy products, active make, active buy, active buy finals).]

Figure 4: The measurement results in the completeness dimension

The second team session

The second complete data quality team meeting started without the team leader and the data supplier; both were on vacation. Steps seven through eleven were conducted. The Standard Problem Description Form was filled out, and based on the conclusions of the cause analysis the Project Action Plan was filled out. Since the team already had a clear idea about the causes and the solutions to these causes, the steps did not take much time. The cause map was not drawn, and the team quickly decided on which solutions to implement.

Evaluation

The main conclusion after the evaluation of the first cycle, based on interviews and an evaluation survey, is that time was a limiting factor in the team meetings, because of lacking formal approval. This increases the need to institutionalize the data quality team. The evaluation revealed that the team members' knowledge about the product-attribute improved. The team members indicated that the method provided a structured way to handle the problems with this product-attribute. They liked the fact that it guided the meetings about the analysis of the product-attribute quality, and they were also confident that the method would be able to improve the quality of the product-attributes.
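A completeness measurement of the kind behind Figure 4 can be sketched as below. The field names, category labels, and sample records are invented for illustration; the actual measurement covered six database fields and a sample of 400 products:

```python
# Sketch of a per-field completeness measurement: for each storage field,
# the fraction of sampled records in which that field is empty, optionally
# restricted to one product category (as in Figure 4's breakdown).

def completeness_errors(sample, fields, category=None):
    """Return {field: error rate} for the (optionally filtered) sample."""
    recs = [r for r in sample if category is None or r.get("category") == category]
    n = len(recs)
    return {f: sum(1 for r in recs if not r.get(f)) / n for f in fields}

# Hypothetical sample records; an empty string marks a missing value.
sample = [
    {"category": "Make", "flex_567": "NL", "cat_567": "NL"},
    {"category": "Buy",  "flex_567": "",   "cat_567": "DE"},
    {"category": "Buy",  "flex_567": "CN", "cat_567": ""},
    {"category": "Make", "flex_567": "NL", "cat_567": "NL"},
]

print(completeness_errors(sample, ["flex_567", "cat_567"]))
print(completeness_errors(sample, ["flex_567"], category="Buy"))
```

Sorting the resulting error rates in descending order gives exactly the ordering a Pareto diagram of the fields would show.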
5.2 The second TDQM cycle: the Product-Family-Code attribute

Prior to the second cycle, the method was reviewed and a shorter version of the third step was added. A new Quality Dimension Score Sheet was developed to document the results of the group discussion about the importance of quality dimensions, and the scoring scale was reduced from 7 to 3 levels to improve the clarity of the results. To determine how the method handles a less complicated product-attribute (a product-attribute with only one stakeholder), and to see how much time this takes, the team decided to handle
the Product-Family-Code (PFC) in the second cycle. This product-attribute plays an important role in sales analysis and could therefore influence the decision-making process. The product-attribute is supplied by the financial team member and also used primarily by the financial team member. Therefore, no non-permanent team members had to be invited.

The first team session

The first complete data quality team meeting was planned in the holiday season; therefore, only the financial team member (next to the BIT&BI manager and one of the authors of this paper) was present. During this first session all planned steps (steps two through five) were conducted. In step three the team used the new Quality Dimension Score Sheet, which provided more explanation about the dimensions and used the three-point scale (low, medium, high). Step four was skipped, because the data manufacturing process was too simple to describe; a short description of this process was added to the characteristics of the product-attribute. In step five, three measurement metrics were chosen to measure the completeness, consistency, and accuracy dimensions.

The measurement session

The measurement session was done by the financial team member with the help of one of the authors. The measurement revealed only five missing data fields in a sample of 400 products. These five products were all new products, which led to the conclusion that no further actions were required for this product-attribute. The financial team member had no other remarks about the product-attribute and its process. The documents were stored and the project was completed.

Evaluation

Handling this product-attribute proved that the method is able to come quickly to a conclusion about the quality of a relatively simple product-attribute. The first session took about one and a half hours and the measurement session took about four hours. The evaluation did not show new issues.
Although the method did not increase the knowledge about the product-attribute (only one stakeholder is involved in the process and she already knew the product-attribute), the method did give insight into the quality of the product-attribute.

6 CONCLUSION: CONTRIBUTIONS TO THE KNOWLEDGE BASE AND APPLICABILITY

Equipping the TDQM team with the designed method has provided Honeywell Emmen with a structured way to communicate about product-attribute problems and to analyze and improve the quality of the product-attributes. This resulted, for instance, in the improvements to the completeness quality dimension of the Country-of-Origin product-attribute realized in the first cycle; see Figure 5.
[Figure 5 compares the completeness of the Country-of-Origin product-attribute for Active Buy Finals (567-FGB) before and after applying the method, per storage field (flex 567, flex NL1, flex NLV, cat 567, cat NL1, cat NLV).]

Figure 5: Quality improvements in the completeness of the Country-of-Origin product-attribute after using the TDQM method

It is important to note that the TDQM theory was not able to realize these impacts directly; we had to make the theory applicable by improving it on five points:
- We adjusted the TDQM team composition by splitting it into a permanent and a non-permanent part. The permanent team enables continuity in TDQM, and the non-permanent team members help to focus on specific urgent topics.
- We added toolsets to support the actions required in each phase of the TDQM cycle, to improve the ways the required tasks are done and to develop reference points for the execution of the cycles.
- Some obvious actions were incompletely defined in the theory. Although the missing actions seem rather obvious, they are essential to making TDQM a success; it is surprising how vital and complex actions can be that from an academic perspective seem rather obvious. We refer here particularly to the Check Progress activities.
- We also became aware that a good methodology needs phases, steps, actions, and tools, including a concrete session plan. These topics are frequently mixed up in unclear definitions of methods and techniques.
- In applying the methodology we also encountered the need for flexibility. Situational aspects may require a cycle to be shorter or longer, more detailed or more superficial, or to involve other team members and techniques.

We believe that team composition, toolsets, the development of obvious actions, and the design of phases, steps, actions, and session plans are vital elements of making an academically rooted methodology applicable. We call such a methodology well articulated when it incorporates the existing academic theory and has been made operational.
Such well-articulated methodologies are sources for:
- Beta testing by other companies, so that their development is improved by the practical experiences of others, and
- Development of so-called kernel theories (Walls et al., 1992) at the middle-range level that explain which choices on the methodology design issues are best under which conditions. The experiences of other companies are important here and may be collected by surveys and comparative case studies.

The study presented here is, of course, limited to data in databases. The research may be extended to:
- Information, i.e. data on non-database media as well. This probably requires theory from very different fields, such as communication theory and interpersonal sense-making (Te'eni, 2001).
- Data on the Internet. Knowing the quality here is difficult, and people are easily misled by incorrect data on the Internet (Poston & Speier, 2005).
Literature

Ballou, D.P., Wang, R.Y., Pazer, H., & Tayi, G.K. (1997). Modelling information manufacturing systems to determine information product quality. Management Science, 44 (4).
Deming, E.W. (1986). Out of the Crisis. Center for Advanced Engineering Study, MIT, Cambridge, MA.
Doggett, A.M. (2005). Root cause analysis: A framework for tool selection. Quality Management Journal, 12 (4).
Garvin, D.A. (1988). Managing Quality: The Strategic and Competitive Edge. Free Press, New York.
Gurbaxani, V. & Wang, S. (1991). The impact of information systems on organizations and markets. Communications of the ACM, 34 (1).
Hevner, A.R., March, S.T., Park, J., & Ram, S. (2004). Design science in information systems research. MIS Quarterly, 28 (1).
Kovac, R., Lee, Y.W., & Pipino, L.L. (1997). Total Data Quality Management: The case of IRI. Proceedings of the Conference on Information Quality, Cambridge, MA.
Mintzberg, H. (1983). Structures in Fives: Designing Effective Organizations. Prentice-Hall, Englewood Cliffs, NJ.
Moore, D.S. & McCabe, G.P. (1989). Introduction to the Practice of Statistics. W.H. Freeman & Company, London.
Pipino, L.L., Lee, Y.W., & Wang, R.Y. (2002). Data quality assessment. Communications of the ACM, 45 (4).
Poston, R. & Speier, C. (2005). Effective use of knowledge management systems: A process model of content ratings and credibility indicators. MIS Quarterly, 29 (2).
Redman, T.C. (1998). The impact of poor data quality on the typical enterprise. Communications of the ACM, 41 (2).
Romme, A.G. (2003). Making a difference: Organization as design. Organization Science, 14 (5).
Stamper, R. (1973). Information in Business and Administrative Systems. John Wiley, New York.
Sveiby, K.E. (1997). The New Organizational Wealth: Managing and Measuring Knowledge-Based Assets. Berrett-Koehler, San Francisco, CA.
Te'eni, D. (2001). Review: A cognitive-affective model of organizational communication for designing IT. MIS Quarterly, 25 (2).
Van Aken, J.E. (2004). Management research based on the paradigm of the design sciences: The quest for field-tested and grounded technological rules. Journal of Management Studies, 41 (2).
Venkatraman, N. & Henderson, J.C. (1998). Real strategies for virtual organizing. Sloan Management Review, 41 (1).
Walls, J.G., Widmeyer, G.R., & El Sawy, O.A. (1992). Building an information system design theory for vigilant EIS. Information Systems Research, 3 (1).
Wang, R.Y. (1998). A product perspective on total data quality management. Communications of the ACM, 41 (2).
Wang, R.Y. & Strong, D.M. (1996). Beyond accuracy: What data quality means to data consumers. Journal of Management Information Systems, 12 (4).
Zahedi, F. (1995). Quality Information Systems. Boyd & Fraser, Danvers, MA.
