INDICATORS FOR SELECTING SOFTWARE QUALITY MANAGEMENT TOOLS 1

Luisa A. De Luca
Banco Central de Venezuela
Gerencia de Sistemas e Informática
Caracas - Venezuela
ldeluca@cantv.net

Luis E. Mendoza, María A. Pérez, Teresita Rojas
Laboratorio de Investigación en Sistemas de Información (LISI)
Departamento de Procesos y Sistemas
Universidad Simón Bolívar
Caracas - Venezuela
lmendoza@usb.ve, movalles@usb.ve, trojas@usb.ve

ABSTRACT
The objective of this paper is to propose a set of indicators to support the selection of tools for software quality management. The evaluation method used as a framework is the "Feature Analysis Case Study", applying the DESMET method, specially developed to select methods for evaluating Software Engineering methods and tools. As a result of this research, a set of fifty-nine indicators has been identified to guide the selection of tools that support the software quality management process. The proposed indicators were applied to nine tools available in the market: EssentialSET, Estimate Professional, GQMaspect, IQUAL, MetriFlame, Quality Builder, SQUID, The Defect Detective and THESEUS.

Keywords: Methods, Techniques, Languages and Tools for Software Engineering; Software Quality; Strategic Planning; Quality Management; Quality Indicators.

1. INTRODUCTION
The definition of software quality includes several aspects that make it unique in relation to the quality of other kinds of products. The most relevant is that, for software products, quality must be built in from the beginning; it is not something that can be added later. To obtain a quality software product, the process followed to develop it must also be a quality process. Some international norms or models for software quality evaluation are centered on product quality, while others are centered on process quality. In the first group, among others, there are ISO/IEC 9126 [13] and the model described by Dromey [5]. In the second group there are ISO 9000 [19], the Capability Maturity Model (CMM) [14], ISO/IEC 15504 or SPICE [12] and the IDEAL model [6]. There are tools that support software quality management from different points of view (planning and estimation, processes, documentation, etc.), and these tools can help with some of the tasks and activities of the software development process. Some of these tools are based on international norms and models for software quality evaluation.

1 Co-financed by CONICIT (S1-2000000437 project) and DID-CAI-USB (S1-00094 project).
At present, very few software development organizations have tools to support quality management, mainly due to the lack of information about their availability. It is also clear that no indicators are being used to support software development organizations in this selection. Therefore, the objective of this research is to propose a set of indicators that support the selection of software quality management tools. With these indicators, any organization can draw up a quality assurance plan and support the process of selecting one of these tools.

2. METHOD
According to Kitchenham et al. [9], DESMET is a method to select methods for evaluating Software Engineering methods and tools. This method identifies nine evaluation methods and a set of criteria to help evaluators select an appropriate method according to the characteristics of the work to be carried out. DESMET separates evaluations into two types, quantitative and qualitative, and identifies three different ways of organizing an evaluation exercise: as a formal experiment, as a case study and as a survey. Combining these approaches, the DESMET method identifies nine evaluation methods. DESMET relies on technical and practical criteria to help determine the most appropriate evaluation method in specific circumstances. Based on those technical and practical criteria, the "Feature Analysis Case Study" evaluation method was selected and applied in the Information Systems Management of the Banco Central de Venezuela (BCV). In order to use the "Feature Analysis Case Study" method, DESMET proposes several characteristics that should hold: benefits difficult to quantify, benefits observable in a single project, stable development procedures, a limited user population, and evaluation time scales compatible with project development times. There is a group of activities, specific to the selected evaluation method, that should be performed:
1. To identify the tools to evaluate.
2. To identify a group of characteristics to evaluate.
3. To evaluate the tools against the identified characteristics.
4. To select a pilot project.
5. To test each tool in the pilot project.
6. To assign a value to each characteristic for each tool.
7. To analyze the resulting values and to produce an evaluation report.
The characteristics to evaluate in each tool are reflected in the set of indicators proposed to select tools that support software quality management, which constitutes the objective of this research. The group of tools to be evaluated is described in Section 3; Section 4 covers steps 2, 3, 4, 5 and 6; and lastly, step 7 is described in Section 5 of this paper. According to Kitchenham and Jones [10], certain important considerations must be taken into account when presenting and analyzing the results of an evaluation based on the "Feature Analysis Case Study" method.
When there is an explicit level of acceptance, the analysis should be based on the differences among the values obtained for each evaluated tool.

3. EVALUATED TOOLS
A group of nine tools was selected from those available in the market: EssentialSET, Estimate Professional, GQMaspect, IQUAL, MetriFlame, Quality Builder, SQUID, The Defect Detective and THESEUS. Next, a brief description of each one is presented.

3.1 EssentialSET
According to its developer, the Software Productivity Center (SPC) [16], EssentialSET comprises a group of tools that provide a framework for key software development practices. These tools can be used individually or in combination; they satisfy levels 2 and 3 of the CMM and they conform to ISO 9001. The complete group of tools includes material covering disciplines such as: business planning, project management, software development, software maintenance and operation, quality assurance, requirements management, configuration management, process definition and improvement, change management, project monitoring and control, and project closing and review.

3.2 Estimate Professional
For SPC [17], Estimate Professional eliminates project guesswork and uncertainty. It recognizes the natural volatility of software development and takes this variability into account using Monte Carlo simulation. It is the only management tool that combines estimation models (COCOMO 2.0 and the Putnam Methodology) with statistical methods (Monte Carlo simulation) to show how the probability of success can be maximized.

3.3 GQMaspect
As described by Hoffmann et al. [7], GQMaspect (GQM Abstraction Sheet and GQM Plan Editing and Construction Tool) is a prototype that implements the state of the art of the process and supports the planning phase of measurement programs based on the GQM paradigm. It was developed by the Software Engineering group of the University of Kaiserslautern in Germany. The GQM process is subdivided into four phases: "identify GQM goals", "produce the GQM plan", "produce the measurement plan", and "gather and validate the data". GQMaspect supports the "produce the GQM plan" phase with an iterative process.

3.4 Integrated QUALity (IQUAL)
IQUAL was created by Two Consult to integrate all the critical aspects and activities that modern quality management comprises, together with the handling of the documentation derived from the application of ISO 9000. As Two Consult describes [18], IQUAL comprises eight functional units or modules: Process Management, Documentation Management, Audits, Non-Conformity and Corrective Actions, Calibration, Quality Review Meetings, Quality Results, and Projects and Work Groups.
3.5 MetriFlame
According to VTT Electronics [20], the process of refining goals through questions into metrics is documented in a "Goal Question Metric" (GQM) plan. The way MetriFlame uses the GQM paradigm in defining metrics is that all its constituent parts (goals, questions and metrics) can be fed into the system, and the results of the metrics can be examined question by question or goal by goal. For VTT Electronics [20], MetriFlame is a tool that supports the GQM paradigm to gather measurement data, to define and calculate metrics, and to analyze the results of the metrics, representing them in graphic form.

3.6 Quality Builder: A Quality Management Tool
For MCD [11], Quality Builder is a tool to implement quality programs throughout the whole organization. This management tool helps organizations to introduce quality improvements and to improve quality assurance, besides producing appraisal results. Quality Builder comprises a group of processes that promote quality practices; it relies on internet/intranet-based software and uses workflow and database technology.

3.7 Software QUality In Development (SQUID)
As Bøegh et al. define [3], SQUID is a method and a tool for quality assurance and control that allows a software development organization to plan and control product quality during development. SQUID was developed to satisfy the need to compare projects with similar characteristics. SQUID allows product and process quality to be controlled and assured according to the needs of each organization and based on its own database.

3.8 The Defect Detective
The Defect Detective, marketed by Information Technology Effectiveness, Inc., provides a unique combination of capabilities to manage quality processes throughout the complete development life cycle, from requirements to implementation. According to IT Effectiveness [8], the tool supports important elements of the SEI/CMM, ISO 9000 and SPICE, relating product quality with process quality.

3.9 THESEUS
THESEUS is a management tool created by Ariane II [1] to instill quality into systems in conformity with ISO 9001 and SPICE. This tool is one of the products of the AMPLI Programme (software process improvement), launched by the Centre de Recherche Public Henri Tudor, Luxembourg (CRP-HT). THESEUS is formed by THESEUS*Management and THESEUS*Broadcast. The first allows procedures and guides, work descriptions, document templates, etc. to be codified. The data are stored in a relational database to be used by THESEUS*Broadcast, which is based on internet/extranet technology and ensures that the quality manuals circulate throughout the whole organization.
4. PROPOSED INDICATORS
The set of proposed indicators has been classified into technological and organizational indicators. These indicators support the selection process of a quality management tool. The technological indicators refer directly to the tool: its design, use and context. The organizational indicators are related to the use of this type of tool within organizations. Based on the indicator set of Rojas et al. [15], the indicators for selecting tools that support software quality management have been further classified (both the technological and the organizational ones) into internal and external.

4.1 Technological Indicators
The technological indicators, either internal or external, are classified according to twelve approaches [4]: methodology, phases, functionality, reliability, maintainability, evaluation and certification models, structural forms, on-line help, platform, licenses, costs and support. Table 1 and Table 2 show the internal and external technological indicators, respectively.

Approach: Indicators
Methodology: Supports quality methodologies; supports the methodologies required by the organization; satisfaction rate.
Life cycle phases: Number of life cycle phases supported; number of life cycle phases required by the organization; satisfaction rate.
Functionality: Adaptability; interoperability; security.
Reliability: Maturity; fault tolerance; recoverability.
Maintainability: Stability.
Evaluation and certification models: Promotes the use of software quality evaluation and certification models; promotes the use of the evaluation and certification models used by the organization; satisfaction rate.
Structural forms: Quality planning; quality control; software quality evaluation; processes documentation; development processes quality analysis; software product quality analysis; measuring the product and software development process quality; costs estimate; resources estimate; defect estimate; data analysis; data import/export; reports and graphics.
On-line help: Facility of help in the tool; facility of use; satisfaction rate.
Table 1. Internal technological indicators.
Approach: Indicators
Platform: Availability of hardware to operate the tool; availability of additional software for the tool; satisfaction rate.
Licenses: Server/user licenses; commercialization system.
Costs: Tool and transfer costs; training costs; maintenance costs; technical support costs; additional software costs; additional hardware costs.
Support: Technical support in the country; training in the country; training type; version updates; manuals; installed base; satisfaction rate.
Table 2. External technological indicators.

4.2 Organizational Indicators
The organizational indicators, either internal or external, are classified according to four approaches [4]: project management, personnel development, institutional image and inter-institutional relationship. Table 3 shows the internal and external organizational indicators.

Internal
  Project management: Acceptance; maintenance; standardization; quality plan.
  Personnel development: Training; learning ability.
External
  Institutional image: Vision.
  Inter-institutional relationship: Impact.
Table 3. Internal and external organizational indicators.

4.3 Indicators Calculation
Each of these indicators, whether technological or organizational, has an associated series of questions that allow determining whether or not the indicator is present in the tool being evaluated. There are seven types of possible answers that can be obtained, according to their domain type [4], as shown in Table 4. To carry out mathematical and logical operations on the answers to the questions associated with each indicator, the values of all the answers must be standardized according to their domain types. To this end, all answers were mapped to a scale from 1 to 5 (where 1 is the minimum and 5 the maximum).
Domain: Value
Y/N: Represents the presence (Y) or absence (N) of the characteristic being evaluated.
1-5: Integer between 1 and 5, corresponding to the life cycle phases of systems development: planning, analysis, design, construction and testing.
1-n: n is a positive real number representing the cost of the tool.
0-1: Possible values: 0, 0.5 and 1, meaning: 0: more than two versions in two years; 0.5: one version in two years; 1: two versions in two years.
%: A positive integer representing a percentage rate.
0-2: Possible values: 0, 1 and 2, meaning: 0: negative experiences; 1: no experiences; 2: positive experiences.
1-2: Possible values: 1 and 2, meaning: 1: licenses only for the server or only for the user; 2: licenses for both the server and the user.
Table 4. Classification of the answers according to the domain type.

The evaluator organization assigns a weight to each question. The weight is a real value between 0 and 1, where 0 represents less importance and 1 represents more importance to the evaluator organization. Once all answers are on the 1-5 scale, each one must be multiplied by its associated weight. In this way the final value of the answer is obtained. Each indicator thus has an associated series of questions whose answers have values in one domain. To assign a value to the indicator, an algorithm that takes the final values of the answers into account is applied. Therefore, the value of each indicator is:
- if at least half of the answers have a value higher than or equal to 3 points, then the indicator value is the average of the answers;
- else, the indicator value is 1.
Applying the same algorithm, the value of each approach can be calculated from the values obtained for its indicators, the value of each category (internal or external) from the values obtained for its approaches, and the value of each indicator type (technological or organizational) from the values obtained for its categories. In this way:
The value of each approach is:
- if at least half of the indicators have a value higher than or equal to 3 points, then the approach value is the average of the indicators;
- else, the approach value is 1.
The value of each category is:
- if at least half of the approaches have a value higher than or equal to 3 points, then the category value is the average of the approaches;
- else, the category value is 1.
The value of each indicator type is:
- if at least half of the categories have a value higher than or equal to 3 points, then the indicator type value is the average of the categories;
- else, the indicator type value is 1.
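The following Python sketch illustrates this calculation scheme. It is not the original implementation (which used Microsoft Excel, see Section 4.4); the function names and the exact mappings of each domain type onto the 1-5 scale are illustrative assumptions, since the paper only states that all answers are standardized to that scale. The roll-up rule, however, follows the algorithm described above.

```python
# Illustrative sketch of the indicator calculation (not the original Excel implementation).
# The per-domain mappings onto the 1-5 scale are assumptions; only the roll-up rule
# ("average if at least half of the values are >= 3, else 1") is taken from the paper.

ACCEPTANCE_LEVEL = 3.0  # explicit level of acceptance used throughout the evaluation

def normalize(value, domain):
    """Map a raw answer to the common 1-5 scale according to its domain type (Table 4)."""
    if domain == "Y/N":    # presence or absence of the characteristic
        return 5.0 if value == "Y" else 1.0
    if domain == "1-5":    # already on the target scale (life cycle phases)
        return float(value)
    if domain == "%":      # percentage rate, mapped linearly onto 1-5 (assumed)
        return 1.0 + 4.0 * float(value) / 100.0
    if domain == "0-1":    # version-update frequency: 0, 0.5 or 1 (assumed linear mapping)
        return 1.0 + 4.0 * float(value)
    if domain == "0-2":    # experiences with the installed base: 0, 1 or 2 (assumed linear)
        return 1.0 + 2.0 * float(value)
    if domain == "1-2":    # licensing scheme: 1 or 2 (assumed mapping to the extremes)
        return 1.0 if value == 1 else 5.0
    raise ValueError(f"unhandled domain type: {domain}")  # e.g. the 1-n cost domain

def final_answer(value, domain, weight):
    """Final value of an answer: normalized value times the weight (0-1) set by the evaluator."""
    return normalize(value, domain) * weight

def roll_up(values):
    """Aggregation rule applied at every level: indicator, approach, category and type."""
    if not values:
        return 1.0
    if sum(1 for v in values if v >= ACCEPTANCE_LEVEL) * 2 >= len(values):
        return sum(values) / len(values)
    return 1.0
```

Note that, with this rule, any level at which fewer than half of the lower-level values reach the acceptance threshold collapses to 1, which strongly penalizes the values further up the hierarchy.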
4.4 Tools Evaluation
In order to carry out step 4 of the evaluation method, the Banco Central de Venezuela (BCV) was taken as the case study. According to the Law that rules its operation [2], the BCV is "a public legal entity of unique nature, with the responsibility of creating and maintaining monetary, credit and exchange conditions favorable to the stability of the currency, to the economic balance and the orderly development of the economy, as well as to assure the continuity of the international payments of the country". Based on the concepts presented in Section 2 about the "Feature Analysis Case Study" evaluation method, it must be mentioned that, in the evaluation of the selected tools, the Information Systems Manager of the BCV played the role of sponsor. The authors of this paper played the evaluator and advisor roles. The technological user was the head of one department within the Information Systems Management of the BCV. A questionnaire was prepared containing the questions associated with each indicator. Then, a Microsoft Excel 97 sheet was created for each tool to collect the questions and answers associated with each indicator. In this Excel sheet there is a column for the answers, another to standardize the answer values to the common scale, another for the weight the organization assigned to each question, and another with the final answer value. Each sheet contains the calculations, according to the algorithm presented previously, to obtain the values of the indicators by approach, category and type. In another sheet, the values of the indicators and of the levels above them (approach, category and type) were placed. In this way, the final value associated with the tool could easily be obtained. The answers to the questions were obtained from the documentation provided by the suppliers of each tool. For the questions whose answers were not found in the manuals, the whole questionnaire was sent to the suppliers, who answered the questions about their tools.

5. ANALYSIS OF RESULTS
To make a decision regarding the tool to acquire, the final value of the evaluation (involving both the technological and the organizational indicators) was considered, and the criterion adopted was that a tool with a value smaller than 3 (the explicit level of acceptance, according to the "Feature Analysis Case Study" evaluation method) is not advisable and requires further analysis by the evaluator organization in order to study the convenience of its acquisition. All the studied tools exceeded 3 points in their evaluation (see the Total row in Table 5). The tool that obtained the highest evaluation was SQUID (4.4068), followed by Estimate Professional (4.0373), EssentialSET (4.0088) and The Defect Detective (3.9795). The difference between the first and the second (0.3695) is quite large compared with the difference between the second and the third (0.0285) and between the third and the fourth (0.0293). This suggests that it would not be very difficult for the evaluator organization to select SQUID as the best tool [4]. Figure 1 shows the obtained results graphically.
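As a rough illustration of how such a worksheet rolls answers up to an indicator and then to an approach, the sketch below reuses the final_answer and roll_up functions from the sketch in Section 4.3. The question rows, domains and weights are invented for the example; they are not taken from the BCV questionnaire.

```python
# Hypothetical fragment of a per-tool worksheet: each row is (raw answer, domain, weight).
# The indicators, answers and weights below are invented for illustration only.
answers_by_indicator = {
    "quality planning": [("Y", "Y/N", 1.0), (80, "%", 0.8)],
    "quality control":  [("N", "Y/N", 0.6), (4, "1-5", 1.0)],
    "costs estimate":   [("Y", "Y/N", 0.9)],
}

# One value per indicator: the final answer values rolled up with the same rule...
indicator_values = {
    name: roll_up([final_answer(v, d, w) for v, d, w in rows])
    for name, rows in answers_by_indicator.items()
}

# ...and the rule applied again to obtain the approach value; category and type values
# are obtained by repeating the operation one and two levels higher in the hierarchy.
approach_value = roll_up(list(indicator_values.values()))
print(indicator_values, approach_value)
```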
Type / Category     Essential  Estimate  GQMaspect  IQUAL   MetriF  Qual.Build  SQUID   The D.D.  THESEUS
Total               4.0088     4.0373    3.8780     3.5610  3.8733  3.4395      4.4068  3.9795    3.8783
Technological       3.3926     3.4495    3.0060     2.9554  3.3299  2.5040      4.0635  3.2089    3.0066
  Internal          3.4936     3.6699    2.2500     3.0417  2.9455  2.3413      4.2580  3.2869    2.8109
  External          3.2917     3.2292    3.7619     2.8690  3.7143  2.6667      3.8690  3.1310    3.2024
Organizational      4.6250     4.6250    4.7500     4.1667  4.4167  4.3750      4.7500  4.7500    4.7500
  Internal          4.2500     4.2500    4.5000     3.3333  3.8333  3.7500      4.5000  4.5000    4.5000
  External          5.0000     5.0000    5.0000     5.0000  5.0000  5.0000      5.0000  5.0000    5.0000
Table 5. Results of the application of the indicators, grouped by type and category, for each of the tools.

The evaluation of these tools was carried out according to the needs of the Banco Central de Venezuela; therefore, the results may vary from one organization to another. The remaining five tools obtained good results in aspects that the evaluator organization did not consider important according to its current needs. Although the tools IQUAL, Quality Builder, THESEUS, GQMaspect and MetriFlame were not among the first four positions mentioned previously, it is important to highlight the following:
- The "processes documentation" indicator shows that IQUAL is an excellent tool to manage documentation activities.
- The "quality planning" and "costs estimate" indicators place Quality Builder as a good tool to carry out the activities inherent to the quality planning process and to reduce defect correction costs.
- The "development processes quality analysis", "processes documentation", "quality planning" and "software quality evaluation" indicators demonstrate that THESEUS is a good tool to introduce quality into software processes in conformity with ISO 9001 and SPICE.
- The "quality control" and "software quality evaluation" indicators place GQMaspect as an excellent tool to support the planning phase of measurement programs based on the GQM paradigm.
- The "software product quality analysis", "software quality evaluation" and "measuring the product and the software development process quality" indicators show that MetriFlame is a very good tool to implement the metrics derived from the GQM paradigm.
Figure 1. Results of the evaluation of the tools according to the proposed technological and organizational indicators.

Since the values obtained for the organizational indicators range between 4.166 and 4.750 points (all quite satisfactory) and taking into account what was described in the previous paragraph, the analysis of the results can be based on the technological indicators and, if any doubt arises, the final decision can be made based on the organizational indicators [4]. The results of the technological indicators are shown in Figure 2.

Figure 2. Results of the evaluation of the tools according to the technological indicators.

It is worth observing that only seven tools (of nine) obtained a value greater than 3: SQUID (4.0635), Estimate Professional (3.4495), EssentialSET (3.3926), MetriFlame (3.3299), The Defect Detective (3.2089), THESEUS (3.0066) and GQMaspect (3.0060). The two tools not included in this group have their strengths in aspects that the evaluator organization did not consider high-priority. The first three tools remain the same, but the fourth is now MetriFlame. The difference between the first and the second is even larger (0.614) compared with the difference between the second and the third (0.0569) and the difference between the third and the fourth (0.0627). Again, this suggests that it would not be very difficult for the evaluator organization to select SQUID as the most appropriate tool.
On the other hand, the values obtained for the internal and external technological indicators can also be analyzed; only four tools obtained values above 3 points in both: SQUID (4.2580 and 3.8690, respectively), Estimate Professional (3.6699 and 3.2292, respectively), EssentialSET (3.4936 and 3.2917, respectively) and The Defect Detective (3.2869 and 3.1310, respectively). The differences between the values of the first two tools are again large (0.5881 and 0.6398, respectively) when compared with the differences between the second and the third (0.1763 and 0.0625, respectively) and between the third and the fourth (0.2067 and 0.1607, respectively). These results are shown in Figure 3.

Figure 3. Comparison of internal and external technological indicators.

The reasons why the tools IQUAL, Quality Builder, THESEUS, GQMaspect and MetriFlame do not appear in the first places are similar to those outlined in the analysis of the results according to the proposed technological and organizational indicators. Figure 3 shows that, of the five tools that did not obtain values above 3 points in both the internal and the external technological indicators, four (GQMaspect, MetriFlame, Quality Builder and THESEUS) have higher values for the external technological indicators than for the internal ones. This indicates that the final value of the technological indicators for these tools is being driven by the external technological indicators, while the internal ones, which are really more important than the external ones when making a decision, are considerably lower.

6. CONCLUSIONS
To achieve quality in software products, it is necessary to bear in mind that quality must be built in from the beginning; it is not something that can be added at the end. Hence the importance of quality in the software development process. Tools that support software quality management are of great utility, since they allow planning and following up the quality activities characteristic of each phase of the software development process. In this research, DESMET, a method to select methods for evaluating Software Engineering methods and tools, has been used. The evaluation method followed was the "Feature Analysis Case Study", which made it possible to identify the indicators to evaluate tools that support software quality management.
As a result of this research, a set of fifty-nine indicators has been identified to guide the selection of tools that support the software quality management process. The case study outlined in this paper allowed the proposed indicators to be validated. However, we suggest applying the evaluation to a set of tools in a private organization with external clients. This would allow a more complete validation of the indicators, and from another perspective.

REFERENCES
[1] Ariane II Group. Theseus. 2000. [On-line] Available on: http://www.arianeii.com/uk/methode/produits/theseus.htm
[2] Ley del Banco Central de Venezuela (1992). Gaceta Oficial de la República de Venezuela, No. 35.106, Diciembre 1992.
[3] Bøegh, Jørgen; Depanfilis, Stefano; Kitchenham, Barbara and Pasquini, Alberto. A Method for Software Quality Planning, Control and Evaluation. IEEE Software, March/April 1999, Vol. 16, No. 2, pp. 69-77.
[4] De Luca, Luisa. Indicadores para la Selección de Herramientas que Soportan la Gerencia de la Calidad del Software. Trabajo de Grado presentado y publicado en la Universidad Simón Bolívar. Febrero 2001.
[5] Dromey, Geoff. A Model for Software Product Quality. IEEE Transactions on Software Engineering, February 1995, Vol. 21, No. 2, pp. 146-162.
[6] Gremba, Jennifer and Myers, Chuck. The IDEAL(SM) Model: A Practical Guide for Improvement. Software Engineering Institute (SEI) publication, "Bridge", issue 3, 1997. [On-line] Available on: http://www.sei.cmu.edu/ideal/ideal.bridge.html
[7] Hoffmann, M.; Birk, A.; Els, F. and Kempkens, R. GQMaspect v.1.0. User Manual. November 1996. [On-line] Available on: http://www.iese.fhg.de/perfect/perfect.html
[8] Information Technology Effectiveness, Inc. The Defect Detective. 1998. [On-line] Available on: http://www.iteffectiveness.com/
[9] Kitchenham, Barbara; Linkman, Stephen and Law, David. DESMET: A Method for Evaluating Software Engineering Methods and Tools. Department of Computer Science, Keele University, Technical Report TR96-09 (ISSN 1353-7776), 1996, pp. 1-67.
[10] Kitchenham, Barbara and Jones, Lindsay. Evaluating Software Engineering Methods and Tools. Part 8: Analyzing a Feature Analysis Evaluation. 6th SQUAD Meeting, Chile, December 2000.
[11] MCD, Human and Organizational Development Consultants. Quality Builder. 1997. [On-line] Available on: http://www.mcd.uk.com/pages/qualbuild.htm
[12] ISO/IEC. SPICE. Software Process Assessment - Part 1: Concepts and Introductory Guide. 1997. [On-line] Available on: http://www-sqi.cit.gu.edu.au/spice/
[13] ISO/IEC FCD 9126-1.2: Information Technology - Software Product Quality. June 1998.
[14] Paulk, Mark; Weber, Charles; García, Suzanne; Chrissis, Mary Beth and Bush, Marilyn. Key Practices of the Capability Maturity Model(SM), Version 1.1. Technical Report CMU/SEI-93-TR-025, ESC-TR-93-178, February 1993. [On-line] Available on: http://www.ibp.com/pit/ispi/cmm.html
[15] Rojas, T.; Pérez, M.; Grimán, A.; Ortega, M. and Díaz, A. Modelo de Decisión para Soportar la Selección de Herramientas CASE. Revista de la Facultad de Ingeniería, UCV, Mayo 2000, Vol. 15, No. 2, pp. 117-144.
[16] Software Productivity Center Inc. EssentialSET. Canada, 2000. [On-line] Available on: http://www.spc.ca/products/eset/index.htm
[17] Software Productivity Center Inc. Estimate Professional. Canada, 2000. [On-line] Available on: http://www.spc.ca/products/estimate/index.htm
[18] Two Consult. IQUAL. The Comprehensive Tool for Integrated Quality Management. Snapshot. 1999. [On-line] Available on: http://www.twoconsult.com/twoconsult/website.nsf/v0/snapshot.htm
[19] Vidal, Henry; Wan, Jian and Han, Xuan. Capability Models: ISO and CMM. Department of Computing and Information Sciences, Kansas State University. Summer 1998. [On-line] Available on: http://www.cis.ksu.edu/~vidal/
[20] VTT Electronics. MetriFlame User Guide. June 1999. [On-line] Available on: http://www.vtt.fi/ele/research/soh/products/metriflame/