Statistical Process Control (SPC)


Statistical Process Control (SPC)
A Metrics-Based Point of View of Software Processes Achieving the CMMI Level Four

Reiner Dumke, Isabelle Côté, Olga Andruschak
Otto-von-Guericke-Universität Magdeburg, Institut für Verteilte Systeme
dumke@ivs.cs.uni-magdeburg.de

Contents

1 The CMMI Approach
    Basic Intentions of the CMMI
    The CMMI Levels
    The CMMI Metrication
2 Software Measurement Intentions
    The CAME Measurement Framework
    The CMMI Metrics Set by Kulpa and Johnson
    The CMMI-Based Organization's Measurement Repository
3 The Statistical Software Process (SPC)
    Foundations of the SPC
    Empirical Strategies
    Testing Methods
    Methods of Data Analysis
4 SPC and CMMI
    Basics of Quantified Process Management
    Controlling the Process Improvement
References

Abstract

This preprint presents a new way of integrating the idea of statistically based analysis of the software process (SPC) into the assessment and improvement activities of the Capability Maturity Model Integration initiative. Starting from basic statistical methods and the foundations of software experimentation, we describe a structured approach to the metrication of the different stages of the CMMI. Furthermore, the preprint shows appropriate methods of statistical analysis for improving the software process areas and activities towards a quantitatively managed process level, based on the metrics set defined by Kulpa and Johnson.

1 The CMMI Approach

1.1 Basic Intentions of the CMMI

CMMI stands for Capability Maturity Model Integration and is an initiative for changing the general intention from an assessment view, based on the classical CMM or ISO 9000, to an improvement view integrating the System Engineering CMM (SE-CMM), the Software Acquisition Capability Maturity Model (SA-CMM), the Integrated Product Development Team Model (IPD-CMM), the System Engineering Capability Assessment Model (SECAM), the Systems Engineering Capability Model (SECM), and basic ideas of the new versions of ISO 9001. The following semantic network shows some classical approaches to software process evaluation, without further comment [Ferguson 1998].

[Figure 1 depicts a network of software process evaluation methods and standards, including PSP, People CMM, SW-CMM, SA-CMM, SE-CMM, SSE-CMM, SECM (EIA/IS 731), SECAM, IPD-CMM, FAA-iCMM, SCE, SDCE, SDCCR, Trillium, Baldrige, EQA, BS 5750, TickIT, the ISO 9000 series, SPICE, DO-178B, MIL-STD-498, MIL-Q-9858, MIL-STD-499B, J-STD-016, EIA 632, EIA/IS 632, IEEE 1074, IEEE 1220, further IEEE, DOD and NATO standards, and the CMMI.]

Figure 1: Dependencies of software process evaluation methods and standards

The CMMI is structured into the five maturity levels, the process areas considered, the specific goals (SG) and generic goals (GG), the common features, and the specific practices (SP) and generic practices (GP). The process areas are defined as follows [Kulpa 2003]: a Process Area is a group of practices or activities performed collectively to achieve a specific objective. Such objectives could be requirements management at maturity level 2, requirements development at maturity level 3, or quantitative project management at level 4. The difference between the specific and the generic goals, practices, or process areas lies in the particular aspects or areas they address, in contrast to the general, IT- or company-wide analysis and improvement.

There are four common features:
o The commitment to perform (CO)
o The ability to perform (AB)
o The directing implementation (DI)
o The verifying implementation (VE)

The CO is shown through senior management commitment, the AB is shown through training of personnel, the DI is demonstrated by managing configurations, and the VE is demonstrated by objectively evaluating adherence and by reviewing status with higher-level management.

The following Figure 2 shows the general relationships between the different components of the CMMI approach.

[Figure 2 shows process areas 1, 2, ..., n with their specific goals and generic goals, the associated specific practices and generic practices, and the capability levels.]

Figure 2: The CMMI model components

The CMMI gives us some guidance as to what is a required component, what is an expected component, and what is simply informative.

1.2 CMMI Levels

There are six capability levels (but five maturity levels), designated by the numbers 0 through 5 [SEI 2002], including the following process areas:

0. Incomplete: -
1. Performed: best practices;
2. Managed: requirements management, project planning, project monitoring and control, supplier agreement management, measurement and analysis, process and product quality assurance;
3. Defined: requirements development, technical solution, product integration, verification, validation, organizational process focus, organizational process definition, organizational training, integrated project management, risk management, integrated teaming, integrated supplier management, decision analysis and resolution, organizational environment for integration;
4. Quantitatively Managed: organizational process performance, quantitative project management;
5. Optimizing: organizational innovation and deployment, causal analysis and resolution.

Kulpa and Johnson consider the following specific goals and practices, relating to quantification, for achieving the different maturity levels [Kulpa 2003]:

Level 2: Measurement and Analysis: The purpose of Measurement and Analysis is to develop and sustain a measurement capability that is used to support management information needs.

Specific Practices by Specific Goal:
SG1 Align Measurement and Analysis Activities: Measurement objectives and activities are aligned with identified information needs and objectives.
SP1.1 Establish Measurement Objectives: Establish and maintain measurement objectives that are derived from identified information needs and objectives.
SP1.2 Specify Measures: Specify measures to address the measurement objectives.

SP1.3 Specify Data Collection and Storage Procedures: Specify how measurement data will be obtained and stored.
SP1.4 Specify Analysis Procedures: Specify how measurement data will be analyzed and reported.
SG2 Provide Measurement Results: Measurement results that address identified information needs and objectives are provided.
SP2.1 Collect Measurement Data: Obtain specified measurement data.
SP2.2 Analyze Measurement Data: Analyze and interpret measurement data.
SP2.3 Store Data and Results: Manage and store measurement data, measurement specifications, and analysis results.
SP2.4 Communicate Results: Report results of measurement and analysis activities to all relevant stakeholders.

Level 2: Process and Product Quality Assurance: The purpose of Process and Product Quality Assurance is to provide staff and management with objective insight into processes and associated work products.

Specific Practices by Specific Goal:
SG1 Objectively Evaluate Processes and Work Products: Adherence of the performed process and associated work products and services to applicable process descriptions, standards, and procedures is objectively evaluated.
SP1.1 Objectively Evaluate Processes: Objectively evaluate the designated performed processes against the applicable process descriptions, standards, and procedures.
SP1.2 Objectively Evaluate Work Products and Services: Objectively evaluate the designated work products and services against the applicable process descriptions, standards, and procedures.
SG2 Provide Objective Insight: Noncompliance issues are objectively tracked and communicated, and resolution is ensured.
SP2.1 Communicate and Ensure Resolution of Noncompliance Issues: Communicate quality issues and ensure resolution of noncompliance issues with the staff and managers.
SP2.2 Establish Records: Establish and maintain records of the quality assurance activities.

Level 3: Verification: The purpose of Verification is to ensure that selected work products meet their specified requirements.

Specific Practices by Specific Goal:
SG1 Prepare for Verification: Preparation for verification is conducted.
SP1.1 Select Work Products for Verification: Select the work products to be verified and the verification methods that will be used for each.
SP1.2 Establish the Verification Environment: Establish and maintain the environment needed to support verification.
SP1.3 Establish Verification Procedures and Criteria: Establish and maintain verification procedures and criteria for the selected work products.
SG2 Perform Peer Reviews: Peer reviews are performed on selected work products.
SP2.1 Prepare for Peer Reviews: Prepare for peer reviews of selected work products.
SP2.2 Conduct Peer Reviews: Conduct peer reviews on selected work products and identify issues resulting from the peer review.
SP2.3 Analyze Peer Review Data: Analyze data about preparation, conduct, and results of the peer reviews.
SG3 Verify Selected Work Products: Selected work products are verified against their specified requirements.
SP3.1 Perform Verification: Perform verification on the selected work products.
SP3.2 Analyze Verification Results and Identify Corrective Action: Analyze the results of all verification activities and identify corrective action.
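As a small illustration of the kind of analysis meant by SP 2.3 (Analyze Peer Review Data), the following sketch derives simple preparation and efficiency indicators from review records. The records and indicator choices are invented for illustration; they are not part of the CMMI text or of this preprint's metrics set.

```python
# Illustrative only: deriving preparation, conduct and result indicators
# from hypothetical peer review records (pages, hours, defects found).

reviews = [
    # (pages, preparation hours, meeting hours, defects found)
    (24, 3.0, 1.5, 11),
    (40, 2.0, 2.0, 7),
    (18, 2.5, 1.0, 9),
]

for pages, prep, meet, defects in reviews:
    prep_rate = pages / prep              # pages prepared per hour
    defect_density = defects / pages      # defects found per page
    efficiency = defects / (prep + meet)  # defects found per review hour
    print(f"{pages:3d} pages: {prep_rate:5.1f} pages/h preparation, "
          f"{defect_density:.2f} defects/page, {efficiency:.1f} defects/h")
```

Such indicators can later feed the verification and peer review measures listed in Section 2.2.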

Level 3: Validation: The purpose of Validation is to demonstrate that a product or product component fulfills its intended use when placed in its intended environment.

Specific Practices by Specific Goal:
SG1 Prepare for Validation: Preparation for validation is conducted.
SP1.1 Select Products for Validation: Select products and product components to be validated and the validation methods that will be used for each.
SP1.2 Establish the Validation Environment: Establish and maintain the environment needed to support validation.
SP1.3 Establish Validation Procedures and Criteria: Establish and maintain procedures and criteria for validation.
SG2 Validate Product or Product Components: The product or product components are validated to ensure that they are suitable for use in their intended operating environment.
SP2.1 Perform Validation: Perform validation on the selected products and product components.
SP2.2 Analyze Validation Results: Analyze the results of the validation activities and identify issues.

Level 3: Decision Analysis and Resolution: The purpose of Decision Analysis and Resolution is to analyze possible decisions using a formal evaluation process that evaluates identified alternatives against established criteria.

Specific Practices by Specific Goal:
SG1 Evaluate Alternatives: Decisions are based on an evaluation of alternatives using established criteria.
SP1.1 Establish Guidelines for Decision Analysis: Establish and maintain guidelines to determine which issues are subject to a formal evaluation process.
SP1.2 Establish Evaluation Criteria: Establish and maintain the criteria for evaluating alternatives, and the relative ranking of these criteria.
SP1.3 Identify Alternative Solutions: Identify alternative solutions to address issues.
SP1.4 Select Evaluation Methods: Select the evaluation methods.
SP1.5 Evaluate Alternatives: Evaluate alternative solutions using the established criteria and methods.
SP1.6 Select Solutions: Select solutions from the alternatives based on the evaluation criteria.

Level 4: Quantitative Project Management: The purpose of the Quantitative Project Management process area is to quantitatively manage the project's defined process to achieve the project's established quality and process-performance objectives.

Specific Practices by Specific Goal:
SG1 Quantitatively Manage the Project: The project is quantitatively managed using quality and process-performance objectives.
SP1.1 Establish the Project's Objectives: Establish and maintain the project's quality and process-performance objectives.
SP1.2 Compose the Defined Process: Select the subprocesses that compose the project's defined process, based on historical stability and capability data.
SP1.3 Select the Subprocesses that Will Be Statistically Managed: Select the subprocesses of the project's defined process that will be statistically managed.
SP1.4 Manage Project Performance: Monitor the project to determine whether the project's objectives for quality and process performance will be satisfied, and identify corrective action as appropriate.
SG2 Statistically Manage Subprocess Performance: The performance of selected subprocesses within the project's defined process is statistically managed.
SP2.1 Select Measures and Analytic Techniques: Select the measures and analytic techniques to be used in statistically managing the selected subprocesses.
SP2.2 Apply Statistical Methods to Understand Variation: Establish and maintain an understanding of the variation of the selected subprocesses using the selected measures and analytic techniques.
SP2.3 Monitor Performance of the Selected Subprocesses: Monitor the performance of the selected subprocesses to determine their capability to satisfy their quality and process-performance objectives, and identify corrective action as necessary.

SP2.4 Record Statistical Management Data: Record statistical and quality management data in the organization's measurement repository.

Level 5: Causal Analysis and Resolution: The purpose of Causal Analysis and Resolution is to identify causes of defects and other problems and take action to prevent them from occurring in the future.

Specific Practices by Specific Goal:
SG1 Determine Causes of Defects: Root causes of defects and other problems are systematically determined.
SP1.1 Select Defect Data for Analysis: Select the defects and other problems for analysis.
SP1.2 Analyze Causes: Perform causal analysis of selected defects and other problems and propose actions to address them.
SG2 Address Causes of Defects: Root causes of defects and other problems are systematically addressed to prevent their future occurrence.
SP2.1 Implement the Action Proposals: Implement the selected action proposals that were developed in causal analysis.
SP2.2 Evaluate the Effect of Changes: Evaluate the effect of changes on process performance.
SP2.3 Record Data: Record causal analysis and resolution data for use across the project and organization.

Addressing the basics of project management, the CMMI considers the following components for the management of the IT processes [SEI 2002]:

[Figure 3 shows the advanced project management process areas (IPM for IPPD, QPM, RSKM, ISM) and their relations to the basic project management, process management, and engineering and support process areas, exchanging items such as process performance objectives, baselines and models, the organization's standard processes and supporting assets, statistical management data, lessons learned, planning and performance data, quantitative objectives and the subprocesses to statistically manage, the project's defined process, coordination and commitments among project stakeholders, shared vision and integrated team structure, product architecture for structuring teams, supplier monitoring data, risk taxonomies and parameters, risk status, risk exposure due to unstable processes, identified risks, risk mitigation plans, and corrective action.]

Figure 3: The CMMI project management process areas

In Figure 3, QPM stands for Quantitative Project Management, IPM for Integrated Project Management, IPPD for Integrated Product and Process Development, RSKM for Risk Management, and ISM for Integrated Supplier Management.

1.3 CMMI Metrication

In order to manage the software process quantitatively, the CMMI defines a set of metrics examples. Some of these appropriate software measurement intentions are [SEI 2002]:

Examples of quality and process performance attributes for which needs and priorities might be identified include the following:
o Functionality
o Reliability
o Maintainability
o Usability
o Duration
o Predictability
o Timeliness
o Accuracy

Examples of quality attributes for which objectives might be written include the following:
o Mean time between failures
o Critical resource utilization
o Number and severity of defects in the released product
o Number and severity of customer complaints concerning the provided service

Examples of process performance attributes for which objectives might be written include the following:
o Percentage of defects removed by product verification activities (perhaps by type of verification, such as peer reviews and testing)
o Defect escape rates
o Number and density of defects (by severity) found during the first year following product delivery (or start of service)
o Cycle time
o Percentage of rework time

Examples of sources for objectives include the following:
o Requirements
o Organization's quality and process-performance objectives
o Customer's quality and process-performance objectives
o Business objectives
o Discussions with customers and potential customers
o Market surveys

Examples of sources for criteria used in selecting subprocesses include the following:
o Customer requirements related to quality and process performance
o Quality and process-performance objectives established by the customer
o Quality and process-performance objectives established by the organization
o Organization's performance baselines and models
o Stable performance of the subprocess on other projects
o Laws and regulations

Examples of product and process attributes include the following:
o Defect density
o Cycle time
o Test coverage

Example sources of the risks include the following:
o Inadequate stability and capability data in the organization's measurement repository
o Subprocesses having inadequate performance or capability
o Suppliers not achieving their quality and process-performance objectives

o Lack of visibility into supplier capability
o Inaccuracies in the organization's process performance models for predicting future performance
o Deficiencies in predicted process performance (estimated progress)
o Other identified risks associated with identified deficiencies

Examples of actions that can be taken to address deficiencies in achieving the project's objectives include the following:
o Changing quality or process performance objectives so that they are within the expected range of the project's defined process
o Improving the implementation of the project's defined process so as to reduce its normal variability (reducing variability may bring the project's performance within the objectives without having to move the mean)
o Adopting new subprocesses and technologies that have the potential for satisfying the objectives and managing the associated risks
o Identifying the risk and risk mitigation strategies for the deficiencies
o Terminating the project

Examples of subprocess measures include the following:
o Requirements volatility
o Ratios of estimated to measured values of the planning parameters (e.g., size, cost, and schedule)
o Coverage and efficiency of peer reviews
o Test coverage and efficiency
o Effectiveness of training (e.g., percent of planned training completed and test scores)
o Reliability
o Percentage of the total defects inserted or found in the different phases of the project life cycle
o Percentage of the total effort expended in the different phases of the project life cycle

Sources of anomalous patterns of variation may include the following:
o Lack of process compliance
o Undistinguished influences of multiple underlying subprocesses on the data
o Ordering or timing of activities within the subprocess
o Uncontrolled inputs to the subprocess
o Environmental changes during subprocess execution
o Schedule pressure
o Inappropriate sampling or grouping of data

Examples of criteria for determining whether data are comparable include the following:
o Product lines
o Application domain
o Work product and task attributes (e.g., size of product)
o Size of project

Examples of where the natural bounds are calculated include the following (a small computational sketch follows these example lists):
o Control charts
o Confidence intervals (for parameters of distributions)
o Prediction intervals (for future outcomes)

Examples of techniques for analyzing the reasons for special causes of variation include the following:
o Cause-and-effect (fishbone) diagrams
o Designed experiments
o Control charts (applied to subprocess inputs or to lower level subprocesses)
o Subgrouping (analyzing the same data segregated into smaller groups based on an understanding of how the subprocess was implemented facilitates isolation of special causes)

Examples of when the natural bounds may need to be recalculated include the following:
o There are incremental improvements to the subprocess
o New tools are deployed for the subprocess
o A new subprocess is deployed

o The collected measures suggest that the subprocess mean has permanently shifted or the subprocess variation has permanently changed

Examples of actions that can be taken when a selected subprocess performance does not satisfy its objectives include the following:
o Changing quality and process-performance objectives so that they are within the subprocess process capability
o Improving the implementation of the existing subprocess so as to reduce its normal variability (reducing variability may bring the natural bounds within the objectives without having to move the mean)
o Adopting new process elements and subprocesses and technologies that have the potential for satisfying the objectives and managing the associated risks
o Identifying risks and risk mitigation strategies for each subprocess process capability deficiency

Examples of other resources provided include the following tools:
o System dynamics models
o Automated test-coverage analyzers
o Statistical process and quality control packages
o Statistical analysis packages

Examples of training topics include the following:
o Process modelling and analysis
o Process measurement data selection, definition, and collection

Examples of work products placed under configuration management include the following:
o Subprocesses to be included in the project's defined process
o Operational definitions of the measures, their collection points in the subprocesses, and how the integrity of the measures will be determined
o Collected measures

Examples of activities for stakeholder involvement include the following:
o Establishing project objectives
o Resolving issues among the project's quality and process-performance objectives
o Appraising performance of the selected subprocesses
o Identifying and managing the risks in achieving the project's quality and process-performance objectives
o Identifying what corrective action should be taken

Examples of measures used in monitoring and controlling include the following:
o Profile of subprocesses under statistical management (e.g., number planned to be under statistical management, number currently being statistically managed, and number that are statistically stable)
o Number of special causes of variation identified

Examples of activities reviewed include the following:
o Quantitatively managing the project using quality and process-performance objectives
o Statistically managing selected subprocesses within the project's defined process

Examples of work products reviewed include the following:
o Subprocesses to be included in the project's defined process
o Operational definitions of the measures
o Collected measures

Based on these quantifications, the CMMI defines: a `managed process` is a performed process that is planned and executed in accordance with policy; employs skilled people having adequate resources to produce controlled outputs; involves relevant stakeholders; is monitored, controlled, and reviewed; and is evaluated for adherence to its process description.
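As a small illustration of how the natural bounds and special causes mentioned in the lists above might be computed, the following sketch derives XmR (individuals and moving range) control limits from an invented series of measurements and compares them with a hypothetical objective. It is one possible technique under stated assumptions, not a procedure prescribed by the CMMI or by this preprint.

```python
# Illustrative sketch only: XmR (individuals / moving range) control limits
# as one way to calculate the "natural bounds" of a subprocess measure.
# The measurement data and the objective value below are invented.

def xmr_limits(values):
    """Return (lower, centre, upper) natural process limits."""
    if len(values) < 2:
        raise ValueError("need at least two observations")
    centre = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    # 2.66 is the conventional XmR constant (3 / d2 for subgroups of size 2)
    return centre - 2.66 * mr_bar, centre, centre + 2.66 * mr_bar

# Hypothetical defect densities (defects per KLOC) of successive builds
densities = [4.2, 3.8, 5.1, 4.4, 4.9, 3.6, 4.1, 5.0]
lower, centre, upper = xmr_limits(densities)
print(f"natural bounds: [{lower:.2f}, {upper:.2f}], centre {centre:.2f}")

# Points outside the natural bounds hint at special causes of variation
special = [x for x in densities if x < lower or x > upper]
print("candidate special causes:", special)

# Comparing the natural bounds with an objective (density must stay below 5.5)
objective_max = 5.5
if upper <= objective_max:
    print("subprocess is capable of meeting the objective")
else:
    print("objective outside natural bounds: reduce variability, "
          "change the objective, or adopt a new subprocess")
```

The final comparison mirrors the actions listed above: if the objective lies outside the natural bounds, either the variability, the objective, or the subprocess itself has to change.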

2 Software Measurement Intentions

2.1 The CAME Measurement Framework

The following measurement and evaluation framework, addressed to the software product, process, and resources, was developed at the University of Magdeburg [Dumke 1999]. The measurement framework is embedded in some strategic aspects of the IT area in organizations and societies, as shown in Figure 4.

[Figure 4 relates society, the organization, and the IT area to the CAME strategy, the CAME framework, and the CAME tools.]

Figure 4: Main areas relating to the software measurement and evaluation framework

We briefly describe some essential aspects of this framework and the characteristics of the framework environments. The CAME strategy is related to the experience of measurement frameworks or metrics programs which are embedded in the enterprise area ([Dumke 2002], [Eickelmann 2000], [Fehrling 2003], [Kitchenham 1997], [Munson 2003]) and stands for:

Community: the necessity of a group or a team that is motivated and has the knowledge of software measurement to install software metrics. In general, the members of these groups are organised in metrics communities such as our German Interest Group on Software Metrics.

Acceptance: the agreement of the (top) management to install a metrics program in the (IT) business area. This aspect is strongly connected with the knowledge about required budgets and personnel resources.

Motivation: the production of measurement and evaluation results in a first metrics application which demonstrates the convincing benefits of applying metrics. This very important aspect can be achieved by applying essential results from (world-wide) practice which are easy to understand and should motivate the management. One of the problems with this aspect is the fact that the management wants to obtain one single (quality) number as a summary of all measured characteristics.

Engagement: the acceptance of spending effort to implement the software measurement as a permanent metrics system (with continued measurement, different statistical analyses, metrics set updates, etc.). This aspect also includes the requirement to dedicate personnel resources such as measurement teams.

The CAME framework consists of the following four phases, which are defined to install a metrics program in the IT area and which can be used to evaluate the measurement level of this metrics program itself (see also [Dumke 2001], [Fenton 1997], [Kitchenham 1995], [Putnam 2003], [Zuse 1998]):

Choice: the selection of metrics based on a special or general measurement view on the kind of measurement and the related measurement goals,

Adjustment: the investigation and definition of the measurement characteristics of the metrics for the specific application field,

Migration: the installation of a high metrication coverage based on semantic relations between the metrics along the whole life cycle and along the system architecture,

Efficiency: the automation level of the construction of a tool-based measurement for the used metrics.

The phases of this framework will be explained in the following sections, including the detailed aspects of software measurement evaluation and the role of the CAME tools.

The Measurement Choice involves the following two essential questions: What is possible to measure? vs. What is necessary to measure? Obviously, we only want to measure what is necessary. But in most software engineering areas this is unknown (especially for modern software development paradigms or methodologies such as software agents and multi-agent systems). The first framework step includes the choice of the software metrics and measures. Therefore, we must define the set of software metrics explicitly [Dumke 2003]. The structure of this set of metrics is based on the following classification principles:

software product measurement and evaluation is based on the three components: model, implementation and documentation (see Figure 5),

[Figure 5 sketches the product metrication along three columns: software architecture, software operation, and software documentation, covering aspects such as the problem domain, product data, configuration and task management, development documents, user interface, data accessing and handling, task behaviour, marketing documents, tutorials, and user manuals, evaluated for appropriateness, completeness, and readability.]

Figure 5: Simplified visualisation of the product metrication

Note that the metrication process depends on the kind of development method, the application area of the software system, the implementation paradigm, etc.

software process measurement and evaluation is based on the process aspects: controlling, phases/steps and methodologies (see Figure 6),

[Figure 6 sketches the process metrication along three columns: the software life cycle (problem definition, requirement analysis/specification, design, implementation, field test, maintenance; phases and milestones), the software management (controlling, versioning, project management, quality management, configuration management; aspects and workflow), and the software methodology (suitability, support, development approach, methodology and paradigm, upper and lower CASE; evaluation and efficiency).]

Figure 6: Simplified visualisation of the process metrication

software resources measurement and evaluation is based on the three resource parts: personnel, software and hardware (see Figure 7).

[Figure 7 sketches the resources metrication along three columns: personnel (skills, communication, paradigm, user, customer, development team, test team, maintenance team; productivity), software resources (compatibility, reliability, COTS, CASE, system software, architectures; performance), and hardware resources (availability, (mobile) computers, peripherals, hosts, networks; performance).]

Figure 7: Simplified visualisation of the resources metrication

Our framework starts with the investigation of the chosen metrics and assumes an underlying choice method such as

the general measurement goal planning by [Basili 1986] (see also [Wohlin 2000]), which considers different measurement goals such as understanding of systems, assessment, proof of hypothesis, understanding of metrics, etc.,

the Goal Question Metric (GQM) paradigm [Solingen 1999], which is directed at the improvement of a special aspect or component of the software system related to a special goal.

The measurement choice step defines the static characteristics of the software measurement process [Feiler 1993]. Note that the choice of software metrics or software measures decides which areas are under control and which areas are out of control in the IT department.
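To make the GQM idea mentioned above a little more concrete, the following minimal sketch shows how a goal is refined into questions that are answered by metrics. The concrete goal, questions, and metrics are invented examples, not content from [Solingen 1999] or from this preprint.

```python
# Minimal GQM sketch: one goal, several questions, metrics per question.
# All entries below are hypothetical illustrations.

gqm_plan = {
    "goal": "Improve the reliability of component X from the maintainer's viewpoint",
    "questions": {
        "What is the current defect density?": ["defects per KLOC"],
        "How quickly are failures resolved?": ["mean time to repair", "open problem reports"],
        "Is reliability improving over releases?": ["defects per release", "MTBF trend"],
    },
}

for question, metrics in gqm_plan["questions"].items():
    print(f"{question}  ->  measured by: {', '.join(metrics)}")
```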

The Measurement Adjustment is related to the experience (expressed in values) of the measured attributes for the evaluation. The adjustment includes the metrics validation ([Card 2000], [Kitchenham 1995], [Zelkowitz 1997]) and the determination of the metrics algorithm based on the measurement theory ([Henderson 1996], [Zuse 2003]). The steps in the measurement adjustment are

the determination of the scale type and (if possible) the unit,

the determination of the favourable values (as thresholds) for the evaluation of the measurement component, e.g. by
o discussion or brainstorming in the development or quality team,
o analysing and using the examples in the literature,
o using the thresholds of the metrics tools,
o taking the results of appropriate case studies and experimentation,

the tuning of the thresholds as
o approximation during the software development from other project components,
o application of a metrics tool for a chosen software product that was classified as a good qualitative example,

the calibration of the scale (as transformation of the numerical scale part to the empirical one), which depends on the improvement of the knowledge in the problem domain.

In the adjustment step we mainly consider the metrics characteristics addressed to the qualitative evaluation (nominal and ordinal scale types) or to the quantitative evaluation (interval or ratio scale types).

The Measurement Migration step is aimed at the dynamic aspects of the measurement framework or metrics program. This means that we must install a metrics-based network over the software product, process, and resources components as an Internal Measurement Process (IMP). We migrate the idea of metrication to all components of software development and maintenance. Note that most existing software measurement approaches or frameworks do not consider this step explicitly. First intentions of this idea are described as complexity traces in [Ebert 1993], as measurement through the life cycle in [Cool 1993], and as granularity of object-oriented systems in [Abreu 1995]. Some examples of these kinds of migration for software products are [Dumke 1999]

metrics tracing along the software life cycle, e.g. #notions (problem definition), #classes (specification), #new-defined-classes (design), #implemented-classes (implementation),

metrics refinement along the software life cycle, e.g. informal description of a specified service (text metrics), PDL description of a service (design metrics), Java form of a service (code metrics),

metrics granulation related to the architecture, e.g. in an object-oriented development the system, the component, the class/object and the method.

In the process and resources area, semantic characteristics such as process phases and resource versions are also considered. Observing the software metrics as a class hierarchy, we can understand the measurement migration as the definition and design of the metrics behaviour. On the other hand, the migration step includes the definition and installation of the External Measurement Process (EMP) as software measurement integration. This means that we must consider the final goals of software measurement in the IT area. Hence, we need all of the process steps such as measurement, evaluation, exploitation and application (assessment, decision support, improvement) in a persistent manner ([Eickelmann 2000], [Jacquet 1997], [Wohlin 2000]).
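A minimal sketch of the adjustment step described above: a metric is given a unit, a scale type, and favourable values (thresholds) against which measured values are evaluated. The metric name, unit, and threshold values are hypothetical examples, not figures from the paper.

```python
# Illustrative sketch of measurement adjustment: thresholds map a measured
# value onto an ordinal evaluation. All concrete values are assumptions.

from dataclasses import dataclass

@dataclass
class AdjustedMetric:
    name: str
    unit: str             # determined in the adjustment step
    scale: str            # e.g. "ratio", "ordinal"
    thresholds: tuple     # (good, acceptable) favourable values

    def evaluate(self, value: float) -> str:
        good, acceptable = self.thresholds
        if value <= good:
            return "good"
        if value <= acceptable:
            return "acceptable"
        return "needs improvement"

coupling = AdjustedMetric("coupling between objects", "count", "ratio",
                          thresholds=(5, 10))
print(coupling.evaluate(7))   # -> "acceptable"

# Calibration (re-mapping the numerical scale to the empirical interpretation)
# could then be implemented by tuning the thresholds as knowledge improves.
```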

The Measurement Efficiency step includes the instrumentation or automation of the measurement process by tools. It requires analysing the algorithmic character of the software metrics and the possibility of integrating tool-based control cycles into the software development or maintenance process. We call these metrics tools CAME (Computer Assisted software Measurement and Evaluation) tools [Dumke 1996]. In most cases, it is necessary to combine different metrics tools and techniques related to the measurement phases.

Finally, we can describe the software measurement intentions as follows:

We don't have any general system of measures in software engineering like in physics. Hence, in software development we must also consider rules of thumb, statements of trends, analogue conclusions, expertise, estimations and predictions ([Dumke 2003], [Endres 2003]).

We also don't have any standardised measurement system which implements such a system of measures. Therefore, we must use the general techniques of assessment (continuous, periodic or certified), general evaluation, experience and experimentation. Sometimes the experimentation is not immediately used for decision support, improvement or controlling; we also use experimentation for the understanding of new paradigms or the cognition of new kinds of problems ([Basili 1986], [Wohlin 2000]).

Software measurement instruments are mostly not based on a physical analogy such as the column of mercury used to measure temperature. In most cases, software measurement is counting [Kitchenham 1995].

Software measurement has a context and is not finished with measurement values or thresholds. Software measurement can be a generic measurement and analysis process ([Card 2000], [Jacquet 1997]).

Empirical techniques are divided into informal observation, formal experiments, industrial case studies, and benchmarking exercises or surveys ([Juristo 2003], [Kitchenham 1997]).

According to [Henderson 1996], the software engineering metrics area should place more emphasis on the validity of the mathematical (and statistical) tools which have been (and are currently being) used in their development and use; areas which gave cause for concern in the past include the use of dimensionally incorrect equations, incorrect plotting of equations and consequent incorrect inferences, the sloppy use of mathematical notation and of calculated values, and the lack of underpinning mathematical models.

Hence, the application of software metrics based on different methodologies or frameworks requires statistical methods ([Juristo 2003], [Munson 2003], [Pandian 2003], [Sigpurwalla 1999], [Wohlin 2000], [Zuse 1998]).

2.2 The CMMI Metrics Set by Kulpa and Johnson

The following set of metrics is defined by Kulpa and Johnson in order to cover the quantification requirements of the different CMMI levels [Kulpa 2003].

CMMI Level 2:

Requirements Management
1. Requirements volatility (percentage of requirements changes)
2. Number of requirements by type or status (defined, reviewed, approved, and implemented)
3. Cumulative number of changes to the allocated requirements, including total number of changes proposed, open, approved, and incorporated into the system baseline
4. Number of change requests per month, compared to the original number of requirements for the project
5. Amount of time spent, effort spent, cost of implementing change requests
6. Number and size of change requests after the Requirements phase is completed
7. Cost of implementing a change request
8. Number of change requests versus the total number of change requests during the life of the project
9. Number of change requests accepted but not implemented
10. Number of requirements (changes and additions to the baseline)

Project Planning
11. Completion of milestones for the project planning activities compared to the plan (estimates versus actuals)
12. Work completed, effort and funds expended in the project planning activities compared to the plan
13. Number of revisions to the project plan
14. Cost, schedule, and effort variance per plan revision
15. Replanning effort due to change requests
16. Effort expended over time to manage the project compared to the plan
17. Frequency, causes, and magnitude of the replanning effort

Project Monitoring and Control
18. Effort and other resources expended in performing monitoring and oversight activities
19. Change activity for the project plan, which includes changes to size estimates of the work products, cost/resource estimates, and schedule
20. Number of open and closed corrective actions or action items
21. Project milestone dates (planned versus actual)
22. Number of project milestone dates made on time
23. Number and types of reviews performed
24. Schedule, budget, and size variance between planned and actual reviews
25. Comparison of actuals versus estimates for all planning and tracking items

Measurement and Analysis
26. Number of projects using progress and performance measures
27. Number of measurement objectives addressed

Supplier Agreement Management
28. Cost of the COTS (commercial off-the-shelf) products
29. Cost and effort to incorporate the COTS products into the project
30. Number of changes made to the supplier requirements
31. Cost and schedule variance per supplier agreement
32. Costs of the activities for managing the contract compared to the plan
33. Actual delivery dates for contracted products compared to the plan
34. Actual dates of prime contractor deliveries to the subcontractor compared to the plan
35. Number of on-time deliveries from the vendor, compared with the contract
36. Number and severity of errors found after delivery
37. Number of exceptions to the contract to ensure schedule adherence
38. Number of quality audits compared to the plan

39. Number of Senior Management reviews to ensure adherence to budget and schedule versus the plan
40. Number of contract violations by supplier or vendor

Process and Product Quality Assurance (QA)
41. Completions of milestones for the QA activities compared to the plan
42. Work completed, effort expended in the QA activities compared to the plan
43. Number of product audits and activity reviews compared to the plan
44. Number of process audits and activities versus those planned
45. Number of defects per release and/or build
46. Amount of time/effort spent in rework
47. Amount of QA time/effort spent in each phase of the life cycle
48. Number of reviews and audits versus number of defects found
49. Total number of defects found in internal reviews and testing versus those found by the customer or end user after delivery
50. Number of defects found in each phase of the life cycle
51. Number of defects injected during each phase of the life cycle
52. Number of noncompliances written versus the number resolved
53. Number of noncompliances elevated to senior management
54. Complexity of module or component (McCabe, McClure, and Halstead metrics)

Configuration Management (CM)
55. Number of change requests or change board requests processed per unit of time
56. Completions of milestones for the CM activities compared to the plan
57. Work completed, effort expended, and funds expended in the CM activities
58. Number of changes to configuration items
59. Number of configuration audits conducted
60. Number of fixes returned as "Not Yet Fixed"
61. Number of fixes returned as "Could Not Reproduce Error"
62. Number of violations of CM procedures (noncompliance found in audits)
63. Number of outstanding problem reports versus rate of repair
64. Number of times changes are overwritten by someone else (or number of times people have the wrong initial version or baseline)
65. Number of engineering change proposals proposed, approved, rejected, implemented
66. Number of changes by category to code source, and to supporting documentation
67. Number of changes by category, type, and severity
68. Source lines of code stored in libraries placed under configuration control

CMMI Level 3:

Requirements Development
69. Cost, schedule, and effort expended for rework
70. Defect density of requirements specifications
71. Number of requirements approved for build (versus the total number of requirements)
72. Actual number of requirements documented (versus the total number of estimated requirements)
73. Staff hours (total and by Requirements Development activity)
74. Requirements status (percentage of defined specifications out of the total approved and proposed; number of requirements defined)
75. Estimates of total requirements, total requirements definition effort, requirements analysis effort, and schedule
76. Number and type of requirements changes

Technical Solution
77. Cost, schedule, and effort expended for rework
78. Number of requirements addressed in the product or product component design
79. Size and complexity of the product, product components, interfaces, and documentation
80. Defect density of technical solutions work products (number of defects per page)
81. Number of requirements by status or type throughout the life of the project (for example, number defined, approved, documented, implemented, tested, and signed-off by phase)
82. Problem reports by severity and length of time they are open

83. Number of requirements changed during implementation and test
84. Effort to analyze proposed changes for each proposed change and cumulative totals
85. Number of changes incorporated into the baseline by category (e.g., interface, security, system configuration, performance, and usability)
86. Size and cost to implement and test incorporated changes, including initial estimate and actual size and cost
87. Estimates and actuals of system size, reuse, effort, and schedule
88. The total estimated and actual staff hours needed to develop the system by job category and activity
89. Estimated dates and actuals for the start and end of each phase of the life cycle
90. Number of diagrams completed versus the estimated total diagrams
91. Number of design modules/units proposed
92. Number of design modules/units delivered
93. Estimates and actuals of total lines of code - new, modified, and reused
94. Estimates and actuals of total design and code modules and units
95. Estimates and actuals for total CPU hours used to date
96. The number of units coded and tested versus the number planned
97. Errors by category, phase discovered, phase injected, type, and severity
98. Estimates of total units, total effort, and schedule
99. System tests planned, executed, passed, or failed
100. Test discrepancies reported, resolved, or not resolved
101. Source code growth by percentage of planned versus actual

Product Integration
102. Product-component integration profile (i.e., product-component assemblies planned and performed, and number of exceptions found)
103. Integration evaluation problem report trends (e.g., number written and number closed)
104. Integration evaluation problem report aging (i.e., how long each problem report has been open)

Verification
105. Verification profile (e.g., the number of verifications planned and performed, and the defects found; perhaps categorized by verification method or type)
106. Number of defects detected by defect category
107. Verification problem report trends (e.g., number written and number closed)
108. Verification problem report status (i.e., how long each problem report has been open)
109. Number of peer reviews performed compared to the plan
110. Overall effort expended on peer reviews compared to the plan
111. Number of work products reviewed compared to the plan

Validation
112. Number of validation activities completed (planned versus actual)
113. Validation problem reports trends (e.g., number written and number closed)
114. Validation problem report aging (i.e., how long each problem report has been open)

Organizational Process Focus
115. Number of process improvement proposals submitted, accepted, or implemented
116. CMMI maturity or capability level
117. Work completed, effort and funds expended in the organization's activities for process assessment, development, and improvement compared to the plans for these activities
118. Results of each process assessment, compared to the results and recommendations of previous assessments

Organizational Process Definition
119. Percentage of projects using the process architectures and process elements of the organization's set of standard processes
120. Defect density of each process element of the organization's set of standard processes
121. Number of on-schedule milestones for process development and maintenance
122. Costs for the process definition activities

Organizational Training
123. Number of training courses delivered (e.g., planned versus actual)
124. Post-training evaluation ratings
125. Training program quality surveys

126. Actual attendance at each training course compared to the projected attendance
127. Progress in improving training courses compared to the organization's and projects' training plans
128. Number of training waivers approved over time

Integrated Project Management for IPPD
129. Number of changes to the project's defined process
130. Effort to tailor the organization's set of standard processes
131. Interface coordination issue trends (e.g., number identified and closed)

Risk Management
132. Number of risks identified, managed, tracked, and controlled
133. Risk exposure and changes to the risk exposure for each assessed risk, and as a summary percentage of management reserve
134. Change activity for the risk mitigation plans (e.g., processes, schedules, funding)
135. Number of occurrences of unanticipated risks
136. Risk categorization volatility
137. Estimated versus actual risk mitigation effort
138. Estimated versus actual risk impact
139. The amount of effort and time spent on risk management activities versus the number of actual risks
140. The cost of risk management versus the cost of actual risks
141. For each identified risk, the realized adverse impact compared to the estimated impact

Integrated Teaming
142. Performance according to plans, commitments, and procedures for the integrated team, and deviations from expectations
143. Number of times team objectives were not achieved
144. Actual effort and other resources expended by one group to support another group or groups, and vice versa
145. Actual completion of specific tasks and milestones by one group to support the activities of other groups, and vice versa

Integrated Supplier Management
146. Effort expended to manage the evaluation of sources and selection of suppliers
147. Number of changes to the requirements in the supplier agreement
148. Number of documented commitments between the project and the supplier
149. Interface coordination issue trends (e.g., number identified and number closed)
150. Number of defects detected in supplied products (during integration and after delivery)

Decision Analysis and Resolution
151. Cost-to-benefit ratio of using formal evaluation processes

Organizational Environment for Integration
152. Parameters for key operating characteristics of the work environment

CMMI Level 4:

Organizational Process Performance
153. Trends in the organization's process performance with respect to changes in work products and task attributes (e.g., size growth, effort, schedule, and quality)

Quantitative Project Management
154. Time between failures
155. Critical resource utilization
156. Number and severity of defects in the released product
157. Number and severity of customer complaints concerning the provided service
158. Number of defects removed by product verification activities (perhaps by type of verification, such as peer reviews and testing)
159. Defect escape rates
160. Number and density of defects by severity found during the first year following product delivery or start of service

161. Cycle time
162. Amount of rework time
163. Requirements volatility (i.e., number of requirements changes per phase)
164. Ratios of estimated to measured values of the planning parameters (e.g., size, cost, and schedule)
165. Coverage and efficiency of peer reviews (i.e., number/amount of products reviewed compared to total number, and number of defects found per hour)
166. Test coverage and efficiency (i.e., number/amount of products tested compared to total number, and number of defects found per hour)
167. Effectiveness of training (i.e., percent of planned training completed and test scores)
168. Reliability (i.e., mean time-to-failure usually measured during integration and systems test)
169. Percentage of the total defects inserted or found in the different phases of the project life cycle
170. Percentage of the total effort expended in the different phases of the project life cycle
171. Profile of subprocesses under statistical management (i.e., number planned to be under statistical management, number currently being statistically managed, and number that are statistically stable)
172. Number of special causes of variation identified
173. The cost over time for the quantitative process management activities compared to the plan
174. The accomplishment of schedule milestones for quantitative process management activities compared to the approved plan (i.e., establishing the process measurements to be used on the project, determining how the process data will be collected, and collecting the process data)
175. The cost of poor quality (e.g., amount of rework, re-reviews and re-testing)
176. The costs for achieving quality goals (e.g., amount of initial reviews, audits, and testing)

CMMI Level 5:

Organizational Innovation and Deployment
177. Change in quality after improvements (e.g., number of reduced defects)
178. Change in process performance after improvements (e.g., change in baselines)
179. The overall technology change activity, including number, type, and size of changes
180. The effect of implementing the technology change compared to the goals (e.g., actual cost saving to projected)
181. The number of process improvement proposals submitted and implemented for each process area
182. The number of process improvement proposals submitted by each project, group, and department
183. The number and types of awards and recognitions received by each of the projects, groups, and departments
184. The response time for handling process improvement proposals
185. Number of process improvement proposals accepted per reporting period
186. The overall change activity including number, type, and size of changes
187. The effect of implementing each process improvement compared to its defined goals
188. Overall performance of the organization's and projects' processes, including effectiveness, quality, and productivity compared to their defined goals
189. Overall productivity and quality trends for each project
190. Process measurements that relate to the indicators of the customers' satisfaction (e.g., survey results, number of customer complaints, and number of customer compliments)

Causal Analysis and Resolution
191. Defect data (problem reports, defects reported by the customer, defects reported by the user, defects found in peer reviews, defects found in testing, process capability problems, time and cost for identifying the defect and fixing it, estimated cost of not fixing the problem)
192. Number of root causes removed
193. Change in quality or process performance per instance of the causal analysis and resolution process (e.g., number of defects and changes in baseline)
194. The costs of defect prevention activities (e.g., holding causal analysis meetings and implementing action items), cumulatively
195. The time and cost for identifying the defects and correcting them compared to the estimated cost of not correcting the defects
196. Profiles measuring the number of action items proposed, open, and completed
197. The number of defects injected in each stage, cumulatively, and over releases of similar products
198. The number of defects
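Several of the measures above reduce to simple ratios once the raw counts are collected. The following sketch, with invented input data, illustrates two of them: requirements volatility (measures 1 and 163) and the defect density of a work product (measures 70 and 80). The function names and the sample values are assumptions made for illustration.

```python
# Illustrative computations (with invented data) for two measures from the
# Kulpa/Johnson set: requirements volatility and defect density.

def requirements_volatility(changed: int, baseline_total: int) -> float:
    """Percentage of requirements changes relative to the baseline."""
    return 100.0 * changed / baseline_total

def defect_density(defects: int, size_pages: int) -> float:
    """Defects per page of a specification or design document."""
    return defects / size_pages

print(f"volatility: {requirements_volatility(changed=27, baseline_total=180):.1f} %")
print(f"defect density: {defect_density(defects=42, size_pages=120):.2f} defects/page")
```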

2.3 The CMMI-Based Organization's Measurement Repository

The following section describes the main activities for defining and implementing measurement repositories for use in an organizational context. The repository contains both product and process measures that are related to the organization's set of standard processes ([SEI 2002]). It also contains or refers to the information needed to understand and interpret the measures and assess them for reasonableness and applicability. For example, the definitions of the measures are used to compare similar measures from different processes.

Typical Work Products:
1. Definition of the common set of product and process measures for the organization's set of standard processes
2. Design of the organization's measurement repository
3. Organization's measurement repository (i.e., the repository structure and support environment)
4. Organization's measurement data

Subpractices:
1. Determine the organization's needs for storing, retrieving, and analyzing measurements.
2. Define a common set of process and product measures for the organization's set of standard processes. The measures in the common set are selected based on the organization's set of standard processes. The common set of measures may vary for different standard processes. Operational definitions for the measures specify the procedures for collecting valid data and the point in the process where the data will be collected. Examples of classes of commonly used measures include the following:
   o Estimates of work product size (e.g., pages)
   o Estimates of effort and cost (e.g., person hours)
   o Actual measures of size, effort, and cost
   o Quality measures (e.g., number of defects found, severity of defects)
   o Peer review coverage
   o Test coverage
   o Reliability measures (e.g., mean time to failure)
   Refer to the Measurement and Analysis process area for more information about defining measures.
3. Design and implement the measurement repository.
4. Specify the procedures for storing, updating, and retrieving measures.
5. Conduct peer reviews on the definitions of the common set of measures and the procedures for storing and retrieving measures. Refer to the Verification process area for more information about conducting peer reviews.
6. Enter the specified measures into the repository. Refer to the Measurement and Analysis process area for more information about collecting and analyzing data.
7. Make the contents of the measurement repository available for use by the organization and projects as appropriate.
8. Revise the measurement repository, common set of measures, and procedures as the organization's needs change. Examples of when the common set of measures may need to be revised include the following:
   o New processes are added
   o Processes are revised and new product or process measures are needed
   o Finer granularity of data is required
   o Greater visibility into the process is required
   o Measures are retired
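The following sketch shows one possible shape of such a repository: operational definitions are stored alongside the collected measures, and an organizational baseline can be derived from the stored data. It is a minimal illustration under our own assumptions, not the repository design prescribed by the CMMI.

```python
# Minimal sketch (one possible design) of an organization's measurement
# repository: operational definitions plus collected measures per project.

from dataclasses import dataclass, field
from statistics import mean

@dataclass
class MeasureDefinition:
    name: str
    unit: str
    collection_point: str   # where in the standard process the data is collected
    procedure: str          # how the data is obtained and stored

@dataclass
class MeasurementRepository:
    definitions: dict = field(default_factory=dict)
    data: dict = field(default_factory=dict)   # measure name -> [(project, value)]

    def define(self, d: MeasureDefinition) -> None:
        self.definitions[d.name] = d

    def store(self, measure: str, project: str, value: float) -> None:
        if measure not in self.definitions:
            raise KeyError(f"no operational definition for '{measure}'")
        self.data.setdefault(measure, []).append((project, value))

    def baseline(self, measure: str) -> float:
        """Simple organizational baseline: mean over all stored values."""
        return mean(v for _, v in self.data[measure])

repo = MeasurementRepository()
repo.define(MeasureDefinition("peer review coverage", "%", "end of design phase",
                              "reviewed pages / total pages, from review records"))
repo.store("peer review coverage", "Project A", 85.0)
repo.store("peer review coverage", "Project B", 72.0)
print(repo.baseline("peer review coverage"))   # -> 78.5
```

Storing the operational definition next to the data is what allows similar measures from different processes to be compared, as noted above.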


More information

Measurement Information Model

Measurement Information Model mcgarry02.qxd 9/7/01 1:27 PM Page 13 2 Information Model This chapter describes one of the fundamental measurement concepts of Practical Software, the Information Model. The Information Model provides

More information

Interpretation and lesson learned from High Maturity Implementation of CMMI-SVC

Interpretation and lesson learned from High Maturity Implementation of CMMI-SVC Interpretation and lesson learned from High Maturity Implementation of CMMI-SVC Agenda and Topics Opening Recap High Maturity Process Areas Main Questions for High Maturity Process Improvement Pilot Lessoned

More information

Software Quality Standards and. from Ontological Point of View SMEF. Konstantina Georgieva

Software Quality Standards and. from Ontological Point of View SMEF. Konstantina Georgieva SMEF 10-11 June, 2010 Software Quality Standards and Approaches from Ontological Point of View Konstantina Georgieva Otto-von-Guericke University Magdeburg Department of Computer Science, Software Engineering

More information

Software Process Improvement Framework for Software Outsourcing Based On CMMI Master of Science Thesis in Software Engineering and Management

Software Process Improvement Framework for Software Outsourcing Based On CMMI Master of Science Thesis in Software Engineering and Management Software Process Improvement Framework for Software Outsourcing Based On CMMI Master of Science Thesis in Software Engineering and Management ZAHOOR UL ISLAM XIANZHONG ZHOU University of Gothenburg Chalmers

More information

Software Quality Assurance: VI Standards

Software Quality Assurance: VI Standards Software Quality Assurance: VI Standards Room E 3.165 Tel. 60-3321 Email: hg@upb.de Outline I Introduction II Software Life Cycle III Quality Control IV Infrastructure V Management VI Standards VII Conclusion

More information

Process Improvement. Objectives

Process Improvement. Objectives Process Improvement Ian Sommerville 2004 Software Engineering, 7th edition. Chapter 28 Slide 1 Objectives To explain the principles of software process improvement To explain how software process factors

More information

Truly Managing a Project and Keeping Sane While Wrestling Elegantly With PMBOK, Scrum and CMMI (Together or Any Combination)

Truly Managing a Project and Keeping Sane While Wrestling Elegantly With PMBOK, Scrum and CMMI (Together or Any Combination) Truly Managing a Project and Keeping Sane While Wrestling Elegantly With PMBOK, Scrum and CMMI (Together or Any Combination) Neil Potter The Process Group Lead Appraiser / Improvement Coach Organization

More information

ITIL-CMM Process Comparison

ITIL-CMM Process Comparison ITIL-CMM Process Comparison For More information: l.lee@pinkelephant.com s.crymble@pinkelephant.com www.pinkelephant.com Page 1 Pink Elephant understands many organizations are currently striving to improve

More information

Reaching CMM Levels 2 and 3 with the Rational Unified Process

Reaching CMM Levels 2 and 3 with the Rational Unified Process Reaching CMM Levels 2 and 3 with the Rational Unified Process Rational Software White Paper TP174 Table of Contents INTRODUCTION... 1 LEVEL-2, REPEATABLE... 3 Requirements Management... 3 Software Project

More information

Camber Quality Assurance (QA) Approach

Camber Quality Assurance (QA) Approach Camber Quality Assurance (QA) Approach Camber s QA approach brings a tested, systematic methodology, ensuring that our customers receive the highest quality products and services, delivered via efficient

More information

CAPABILITY MATURITY MODEL INTEGRATION

CAPABILITY MATURITY MODEL INTEGRATION CAPABILITY MATURITY MODEL INTEGRATION Radu CONSTANTINESCU PhD Candidate, University Assistant Academy of Economic Studies, Bucharest, Romania E-mail: radu.constantinescu@ie.ase.ro Web page: http:// www.raduconstantinescu.ase.ro

More information

Testing Metrics. Introduction

Testing Metrics. Introduction Introduction Why Measure? What to Measure? It is often said that if something cannot be measured, it cannot be managed or improved. There is immense value in measurement, but you should always make sure

More information

Process Improvement -CMMI. Xin Feng

Process Improvement -CMMI. Xin Feng Process Improvement -CMMI Xin Feng Objectives History CMMI Why CMMI CMMI representations 4/11/2011 Software Engineering 2 Process Improvement Achieve both qualityand productivity ( 生 产 力 ) It is not necessary

More information

Certified Software Quality Engineer (CSQE) Body of Knowledge

Certified Software Quality Engineer (CSQE) Body of Knowledge Certified Software Quality Engineer (CSQE) Body of Knowledge The topics in this Body of Knowledge include additional detail in the form of subtext explanations and the cognitive level at which the questions

More information

Implementation of ANSI/AAMI/IEC 62304 Medical Device Software Lifecycle Processes.

Implementation of ANSI/AAMI/IEC 62304 Medical Device Software Lifecycle Processes. Implementation of ANSI/AAMI/IEC 62304 Medical Device Software Lifecycle Processes.. www.pharmout.net Page 1 of 15 Version-02 1. Scope 1.1. Purpose This paper reviews the implementation of the ANSI/AAMI/IEC

More information

The Configuration Management process area involves the following:

The Configuration Management process area involves the following: CONFIGURATION MANAGEMENT A Support Process Area at Maturity Level 2 Purpose The purpose of is to establish and maintain the integrity of work products using configuration identification, configuration

More information

A Report on The Capability Maturity Model

A Report on The Capability Maturity Model A Report on The Capability Maturity Model Hakan Bayraksan hxb07u 29 November 2009 G53QAT Table of Contents Introduction...2 The evolution of CMMI...3 CMM... 3 CMMI... 3 The definition of CMMI... 4 Level

More information

Process Improvement. From the Software Engineering Institute:

Process Improvement. From the Software Engineering Institute: Process Improvement From the Software Engineering Institute: The Software Capability Maturity Model (SW-CMM, CMMI) (Especially CMMI V1.1 Tutorial) The Personal Software Process (PSP) (Also see The Team

More information

An Introduction to. Metrics. used during. Software Development

An Introduction to. Metrics. used during. Software Development An Introduction to Metrics used during Software Development Life Cycle www.softwaretestinggenius.com Page 1 of 10 Define the Metric Objectives You can t control what you can t measure. This is a quote

More information

CENTRE (Common Enterprise Resource)

CENTRE (Common Enterprise Resource) CENTRE (Common Enterprise Resource) Systems and Software Engineering Platform designed for CMMI compliance Capability Maturity Model Integration (CMMI) is a process improvement approach that provides organizations

More information

Introduction to SEIs Capability Maturity Model Integration (CMMI)

Introduction to SEIs Capability Maturity Model Integration (CMMI) Introduction to SEIs Capability Maturity Model Integration (CMMI) Rajiv Kapur, Ph.D. President and CEO Cura Consulting Solutions Principal, CCI Group Adjunct Professor, Industrial & Systems Engineering,

More information

Concept of Operations for the Capability Maturity Model Integration (CMMI SM )

Concept of Operations for the Capability Maturity Model Integration (CMMI SM ) Concept of Operations for the Capability Maturity Model Integration (CMMI SM ) August 11, 1999 Contents: Introduction CMMI Overview Concept for Operational Use of the CMMI Migration to CMMI Models Concept

More information

Supporting the CMMI Metrics Framework thru Level 5. Márcio. Silveira. page 1

Supporting the CMMI Metrics Framework thru Level 5. Márcio. Silveira. page 1 September 03-23-05 2009 EDS-Electronic Electronic Data Systems do Brasil Ltda. Márcio Silveira page Agenda Objective EDS Overall Process Improvement Strategy Measurement Elements of the CMMI Model M&A

More information

Moving from ISO9000 to the Higher Levels of the Capability Maturity Model (CMM)

Moving from ISO9000 to the Higher Levels of the Capability Maturity Model (CMM) Moving from ISO9000 to the Higher Levels of the Capability Maturity Model (CMM) Pankaj Jalote 1 Infosys Technologies Ltd. Bangalore 561 229 Fax: +91-512-590725/590413 Jalote@iitk.ernet.in, jalote@iitk.ac.in

More information

Using Measurement to translate Business Vision into Operational Software Strategies

Using Measurement to translate Business Vision into Operational Software Strategies Using Measurement to translate Business Vision into Operational Software Strategies Victor R. Basili University of Maryland and Fraunhofer Center - Maryland BUSINESS NEEDS Any successful business requires:

More information

The Compelling Case For CMMI-SVC: CMMI-SVC, ITIL & ISO20000 demystified

The Compelling Case For CMMI-SVC: CMMI-SVC, ITIL & ISO20000 demystified The Compelling Case For CMMI-SVC: CMMI-SVC, ITIL & ISO20000 demystified T: 01748 821824 E: marketing@lamri.com Agenda What is CMMI-SVC? How Does CMMI-SVC Relate to Existing Models? CMMI-SVC and ISO 20000

More information

Towards a new approach of continuous process improvement based on CMMI and PMBOK

Towards a new approach of continuous process improvement based on CMMI and PMBOK www.ijcsi.org 160 Towards a new approach of continuous process improvement based on CMMI and PMBOK Yassine Rdiouat 1, Naima Nakabi 2, Khadija Kahtani 3 and Alami Semma 4 1 Department of Mathematics and

More information

TEST METRICS AND KPI S

TEST METRICS AND KPI S WHITE PAPER TEST METRICS AND KPI S Abstract This document serves as a guideline for understanding metrics and the Key performance indicators for a testing project. Metrics are parameters or measures of

More information

ISO 9001/TL 9000 and CMMI Comparison

ISO 9001/TL 9000 and CMMI Comparison ISO 9001/TL 9000 and CMMI Comparison www.questforum.org Copyright QuEST Forum 2007 1 Purpose This summary is intended to give those familiar with CMMI a general sense of the additional requirements contained

More information

How To Understand And Understand The Cmm

How To Understand And Understand The Cmm W H I T E P A P E R SEI's Capability Maturity Model Integrated (CMMI) Relative to ICM's CMII (Rev B) SUMMARY CMMI is built on a set of integrated processes and includes CM as a supporting process. The

More information

Capability Maturity Model Integrated (CMMI)

Capability Maturity Model Integrated (CMMI) When the Outcome Matters Capability Maturity Model Integrated (CMMI) Configuration Management Considerations Gerard Dache Gerard.dache@psgs.com 703-560-9477 Agenda SEI Overview Capability Maturity Models

More information

ISO, CMMI and PMBOK Risk Management: a Comparative Analysis

ISO, CMMI and PMBOK Risk Management: a Comparative Analysis ISO, CMMI and PMBOK Risk Management: a Comparative Analysis Cristine Martins Gomes de Gusmão Federal University of Pernambuco / Informatics Center Hermano Perrelli de Moura Federal University of Pernambuco

More information

How To Create A Process Measurement System

How To Create A Process Measurement System Set Up and Operation of a Design Process Measurement System Included below is guidance for the selection and implementation of design and development process measurements. Specific measures can be found

More information

Using the Agile Methodology to Mitigate the Risks of Highly Adaptive Projects

Using the Agile Methodology to Mitigate the Risks of Highly Adaptive Projects Transdyne Corporation CMMI Implementations in Small & Medium Organizations Using the Agile Methodology to Mitigate the Risks of Highly Adaptive Projects Dana Roberson Quality Software Engineer NNSA Service

More information

V. Phani Krishna et al, / (IJCSIT) International Journal of Computer Science and Information Technologies, Vol. 2 (6), 2011, 2915-2919

V. Phani Krishna et al, / (IJCSIT) International Journal of Computer Science and Information Technologies, Vol. 2 (6), 2011, 2915-2919 Software Quality Assurance in CMM and XP- A Comparative Study CH.V. Phani Krishna and Dr. K.Rajasekhara Rao CSE Department, KL University, Guntur dt., India. Abstract Software Quality Assurance is a planned

More information

CENTRE (Common Enterprise Resource)

CENTRE (Common Enterprise Resource) CENTRE (Common Enterprise Resource) Systems and Software Engineering Platform designed for CMMI compliance Capability Maturity Model Integration (CMMI) is a process improvement approach that provides organizations

More information

SW Process Improvement and CMMI. Dr. Kanchit Malaivongs Authorized SCAMPI Lead Appraisor Authorized CMMI Instructor

SW Process Improvement and CMMI. Dr. Kanchit Malaivongs Authorized SCAMPI Lead Appraisor Authorized CMMI Instructor SW Process Improvement and CMMI Dr. Kanchit Malaivongs Authorized SCAMPI Lead Appraisor Authorized CMMI Instructor Topics of Presentation Why improvement? What is CMMI? Process Areas and Practices in CMMI

More information

ADOPTION AND UP GRADATION OF CMMI: PROSPECT OF SOFTWARE INDUSTRY OF BANGLADESH. A Thesis

ADOPTION AND UP GRADATION OF CMMI: PROSPECT OF SOFTWARE INDUSTRY OF BANGLADESH. A Thesis ADOPTION AND UP GRADATION OF CMMI: PROSPECT OF SOFTWARE INDUSTRY OF BANGLADESH A Thesis Submitted to the Department of Computer Science and Engineering of BRAC University by Md. Samirul Haque Student ID:

More information

Software Engineering. Standardization of Software Processes. Lecturer: Giuseppe Santucci

Software Engineering. Standardization of Software Processes. Lecturer: Giuseppe Santucci Software Engineering Standardization of Software Processes Lecturer: Giuseppe Santucci Summary Introduction to Process Models The Capability Maturity Model Integration The ISO 12207 standard for software

More information

ITIL-CMMII Comparison

ITIL-CMMII Comparison ITIL-CMMII Comparison Today we can see and understand that many IT organizations are striving to improve how they do business throughout the organization. In doing so, many organizations undertake a number

More information

Leveraging CMMI framework for Engineering Services

Leveraging CMMI framework for Engineering Services Leveraging CMMI framework for Engineering Services Regu Ayyaswamy, Mala Murugappan Tata Consultancy Services Ltd. Introduction In response to Global market demand, several OEMs adopt Global Engineering

More information

Lecture Slides for Managing and Leading Software Projects. Chapter 1: Introduction

Lecture Slides for Managing and Leading Software Projects. Chapter 1: Introduction Lecture Slides for Managing and Leading Software Projects Chapter 1: Introduction developed by Richard E. (Dick) Fairley, Ph.D. to accompany the text Managing and Leading Software Projects published by

More information

Using Rational Software Solutions to Achieve CMMI Level 2

Using Rational Software Solutions to Achieve CMMI Level 2 Copyright Rational Software 2003 http://www.therationaledge.com/content/jan_03/f_cmmi_rr.jsp Using Rational Software Solutions to Achieve CMMI Level 2 by Rolf W. Reitzig Founder, Cognence, Inc. Over the

More information

Applying CMMI SM In Information Technology Organizations SEPG 2003

Applying CMMI SM In Information Technology Organizations SEPG 2003 Applying CMMI SM In Information Technology Organizations Mark Servello, Vice President Jim Gibson, Senior Consultant ChangeBridge, Incorporated Page 1 Portions Copyright 2002 Carnegie Mellon University

More information

SEI Level 2, 3, 4, & 5 1 Work Breakdown Structure (WBS)

SEI Level 2, 3, 4, & 5 1 Work Breakdown Structure (WBS) SEI Level 2, 3, 4, & 5 1 Work Breakdown Structure (WBS) 1.0 SEI Product 1.1 SEI Level 2 Product 1.1.1 SEI Level 2 Process 1.1.1.1 Requirements Management Process 1.1.1.2 Software Project Planning Process

More information

The Design and Improvement of a Software Project Management System Based on CMMI

The Design and Improvement of a Software Project Management System Based on CMMI Intelligent Information Management, 2012, 4, 330-337 http://dx.doi.org/10.4236/iim.2012.46037 Published Online November 2012 (http://www.scirp.org/journal/iim) The Design and Improvement of a Software

More information

Software Process Improvement CMM

Software Process Improvement CMM Software Process Improvement CMM Marcello Visconti Departamento de Informática Universidad Técnica Federico Santa María Valparaíso, Chile Software Engineering Institute Founded by the Department of Defense

More information

wibas Team CMMI-ITIL IT Maturity S e r v i c e s

wibas Team CMMI-ITIL IT Maturity S e r v i c e s wibas Team CMMI-ITIL ITIL integrated into CMMI IT Maturity S e r v i c e s 1 CMMI-ITIL Management Summary -2- Copyright 2007 wibas IT Maturity Services GmbH CMMI-ITIL ITIL is a reference model to improve

More information

PROJECT QUALITY MANAGEMENT

PROJECT QUALITY MANAGEMENT 8 PROJECT QUALITY MANAGEMENT Project Quality Management includes the processes required to ensure that the project will satisfy the needs for which it was undertaken. It includes all activities of the

More information

A Study on Software Metrics and Phase based Defect Removal Pattern Technique for Project Management

A Study on Software Metrics and Phase based Defect Removal Pattern Technique for Project Management International Journal of Soft Computing and Engineering (IJSCE) A Study on Software Metrics and Phase based Defect Removal Pattern Technique for Project Management Jayanthi.R, M Lilly Florence Abstract:

More information

PROJECT MANAGEMENT PLAN TEMPLATE < PROJECT NAME >

PROJECT MANAGEMENT PLAN TEMPLATE < PROJECT NAME > PROJECT MANAGEMENT PLAN TEMPLATE < PROJECT NAME > Date of Issue: < date > Document Revision #: < version # > Project Manager: < name > Project Management Plan < Insert Project Name > Revision History Name

More information

CSTE Mock Test - Part I - Questions Along with Answers

CSTE Mock Test - Part I - Questions Along with Answers Note: This material is for Evaluators reference only. Caters to answers of CSTE Mock Test - Part I paper. 1. A branch is (Ans: d) a. An unconditional transfer of control from any statement to any other

More information

The Challenge of Productivity Measurement

The Challenge of Productivity Measurement Proceedings: Pacific Northwest Software Quality Conference, 2006 The Challenge of Productivity Measurement David N. Card Q-Labs, Inc dca@q-labs.com Biography- David N. Card is a fellow of Q-Labs, a subsidiary

More information

Capability Maturity Model Integration (CMMI)

Capability Maturity Model Integration (CMMI) COPYRIGHT 2011 IJCIT, ISSN 2078-5828 (PRINT), ISSN 2218-5224 (ONLINE), VOLUME 02, ISSUE 01, MANUSCRIPT CODE: IJCIT-110748 Capability Maturity Model Integration (CMMI) Anasis Majumdar, Muhammad Ashiqe-Ur-Rouf,

More information

QUALITY ASSURANCE IN EXTREME PROGRAMMING Plamen Balkanski

QUALITY ASSURANCE IN EXTREME PROGRAMMING Plamen Balkanski International Journal "Information Theories & Applications" Vol.10 113 QUALITY ASSURANCE IN EXTREME PROGRAMMING Plamen Balkanski Abstract: Our previous research about possible quality improvements in Extreme

More information

SOFTWARE ASSURANCE STANDARD

SOFTWARE ASSURANCE STANDARD NOT MEASUREMENT SENSITIVE National Aeronautics and NASA-STD-8739.8 w/change 1 Space Administration July 28, 2004 SOFTWARE ASSURANCE STANDARD NASA TECHNICAL STANDARD REPLACES NASA-STD-2201-93 DATED NOVEMBER

More information

Quality Management. Lecture 12 Software quality management

Quality Management. Lecture 12 Software quality management Quality Management Lecture 12 Software quality management doc.dr.sc. Marko Jurčević prof.dr.sc. Roman Malarić University of Zagreb Faculty of Electrical Engineering and Computing Department of Fundamentals

More information

CMMI and IBM Rational Unified Process

CMMI and IBM Rational Unified Process IBM Software Group CMMI and IBM Rational Unified Process A practical route to greater development maturity CMMI Made Practical, London, 19-20 th March, 2007 Keith Mantell IBM Rational, UK keith_mantell@uk.ibm.com

More information

Future of CMM and Quality Improvement. Roy Ko Hong Kong Productivity Council

Future of CMM and Quality Improvement. Roy Ko Hong Kong Productivity Council Future of CMM and Quality Improvement Roy Ko Hong Kong Productivity Council 1 Agenda Future Development of CMMI CMMI and Small Organizations CMMI and Agile Development Good Enough Quality CMMI and Other

More information

Lecture 8 About Quality and Quality Management Systems

Lecture 8 About Quality and Quality Management Systems Lecture 8 About Quality and Quality Management Systems Kari Systä 10.03.2014 10.03.2014 TIE-21100/21106; K.Systä 1 Content of today s lecture Two weeks ago we discussed about testing and inspections, that

More information

MNLARS Project Audit Checklist

MNLARS Project Audit Checklist Audit Checklist The following provides a detailed checklist to assist the audit team in reviewing the health of a project. Relevance (at this time) How relevant is this attribute to this project or audit?

More information

Automated Office Systems Support Quality Assurance Plan. A Model DRAFT. December 1996

Automated Office Systems Support Quality Assurance Plan. A Model DRAFT. December 1996 Quality Assurance Plan A Model DRAFT United States Department of Energy Office of Nonproliferation and National Security Title Page Document Name: Publication Date: Draft, ontract Number: Project Number:

More information

International Association of Scientific Innovation and Research (IASIR) (An Association Unifying the Sciences, Engineering, and Applied Research)

International Association of Scientific Innovation and Research (IASIR) (An Association Unifying the Sciences, Engineering, and Applied Research) International Association of Scientific Innovation and Research (IASIR) (An Association Unifying the Sciences, Engineering, and Applied Research) International Journal of Engineering, Business and Enterprise

More information

Software Engineering Compiled By: Roshani Ghimire Page 1

Software Engineering Compiled By: Roshani Ghimire Page 1 Unit 7: Metric for Process and Product 7.1 Software Measurement Measurement is the process by which numbers or symbols are assigned to the attributes of entities in the real world in such a way as to define

More information

IA Metrics Why And How To Measure Goodness Of Information Assurance

IA Metrics Why And How To Measure Goodness Of Information Assurance IA Metrics Why And How To Measure Goodness Of Information Assurance Nadya I. Bartol PSM Users Group Conference July 2005 Agenda! IA Metrics Overview! ISO/IEC 21827 (SSE-CMM) Overview! Applying IA metrics

More information

Knowledge Infrastructure for Project Management 1

Knowledge Infrastructure for Project Management 1 Knowledge Infrastructure for Project Management 1 Pankaj Jalote Department of Computer Science and Engineering Indian Institute of Technology Kanpur Kanpur, India 208016 Jalote@iitk.ac.in Abstract In any

More information

Development, Acquisition, Implementation, and Maintenance of Application Systems

Development, Acquisition, Implementation, and Maintenance of Application Systems Development, Acquisition, Implementation, and Maintenance of Application Systems Part of a series of notes to help Centers review their own Center internal management processes from the point of view of

More information

Developing CMMI in IT Projects with Considering other Development Models

Developing CMMI in IT Projects with Considering other Development Models Developing CMMI in IT Projects with Considering other Development Models Anahita Ahmadi* MSc in Socio Economic Systems Engineering Organizational Process Development Engineer, International Systems Engineering

More information

Software Metrics & Software Metrology. Alain Abran. Chapter 4 Quantification and Measurement are Not the Same!

Software Metrics & Software Metrology. Alain Abran. Chapter 4 Quantification and Measurement are Not the Same! Software Metrics & Software Metrology Alain Abran Chapter 4 Quantification and Measurement are Not the Same! 1 Agenda This chapter covers: The difference between a number & an analysis model. The Measurement

More information

The Advantages of Using CENTRE

The Advantages of Using CENTRE CENTRE (Common Enterprise Resource) Systems and Software Engineering Platform designed for CMMI compliance Capability Maturity Model Integration (CMMI) is a process improvement approach that provides organizations

More information

Software Engineering: Analysis and Design - CSE3308

Software Engineering: Analysis and Design - CSE3308 CSE3308/DMS/2004/25 Monash University - School of Computer Science and Software Engineering Software Engineering: Analysis and Design - CSE3308 Software Quality CSE3308 - Software Engineering: Analysis

More information

Evaluation and Integration of Risk Management in CMMI and ISO/IEC 15504

Evaluation and Integration of Risk Management in CMMI and ISO/IEC 15504 Evaluation and Integration of Risk Management in CMMI and ISO/IEC 15504 Dipak Surie, Email : ens03dse@cs.umu.se Computing Science Department Umea University, Umea, Sweden Abstract. During software development,

More information

Application Support Solution

Application Support Solution Application Support Solution White Paper This document provides background and administration information on CAI s Legacy Application Support solution. PRO00001-MNGMAINT 080904 Table of Contents 01 INTRODUCTION

More information

Software Project Management and Support - Practical Support for CMMI -SW Project Documentation: Using IEEE Software Engineering Standards

Software Project Management and Support - Practical Support for CMMI -SW Project Documentation: Using IEEE Software Engineering Standards Software Project Management and Support - Practical Support for CMMI -SW Project Documentation: Using IEEE Software Engineering Standards John Walz The Sutton Group IEEE Computer Society Standards Activities

More information

How Rational Configuration and Change Management Products Support the Software Engineering Institute's Software Capability Maturity Model

How Rational Configuration and Change Management Products Support the Software Engineering Institute's Software Capability Maturity Model How Rational Configuration and Change Management Products Support the Software Engineering Institute's Software Capability Maturity Model by Bill Cottrell and John Viehweg Software Engineering Specialists

More information

Application of software product quality international standards through software development life cycle

Application of software product quality international standards through software development life cycle Central Page 284 of 296 Application of software product quality international standards through software development life cycle Mladen Hosni, Valentina Kirinić Faculty of Organization and Informatics University

More information

EASPI EASPI. The Integrated CMMI-based Improvement Framework for Test and Evaluation. Jeffrey L. Dutton Principal Consultant

EASPI EASPI. The Integrated CMMI-based Improvement Framework for Test and Evaluation. Jeffrey L. Dutton Principal Consultant The Integrated CMMI-based Improvement Framework for Test and Evaluation Jeffrey L. Dutton Principal Consultant Engineering and Services Performance Improvement LLC 22 Copyrights and Service Marks CMMI

More information

Process Improvement. Objectives

Process Improvement. Objectives Process Improvement cmsc435-1 Objectives To explain the principles of software process improvement To explain how software process factors influence software quality and productivity To introduce the SEI

More information

Quality Manual. DuraTech Industries, Inc. 3216 Commerce Street La Crosse, WI 54603 MANUAL SERIAL NUMBER 1

Quality Manual. DuraTech Industries, Inc. 3216 Commerce Street La Crosse, WI 54603 MANUAL SERIAL NUMBER 1 Quality Manual Approval Page Document: QA1000 Issue Date: 5/29/1997 Page 1 of 17 Revision Date: 5/20/2013 DuraTech Industries, Inc. 3216 Commerce Street La Crosse, WI 54603 MANUAL SERIAL NUMBER 1 This

More information

Request for Proposal for Application Development and Maintenance Services for XML Store platforms

Request for Proposal for Application Development and Maintenance Services for XML Store platforms Request for Proposal for Application Development and Maintenance s for ML Store platforms Annex 4: Application Development & Maintenance Requirements Description TABLE OF CONTENTS Page 1 1.0 s Overview...

More information

Darshan Institute of Engineering & Technology Unit : 7

Darshan Institute of Engineering & Technology Unit : 7 1) Explain quality control and also explain cost of quality. Quality Control Quality control involves the series of inspections, reviews, and tests used throughout the software process to ensure each work

More information

Software Process Improvement Software Business. Casper Lassenius

Software Process Improvement Software Business. Casper Lassenius Software Process Improvement Software Business Casper Lassenius Topics covered ² The process process ² Process measurement ² Process analysis ² Process change ² The CMMI process framework 2 Process ² Many

More information

An Integrated Model of ISO 9001:2000 and CMMI for ISO Registered Organizations

An Integrated Model of ISO 9001:2000 and CMMI for ISO Registered Organizations An Integrated Model of ISO 9001:2000 and CMMI for ISO Registered Organizations Chanwoo Yoo 1, Junho Yoon 1, Byungjeong Lee 2, Chongwon Lee 1, Jinyoung Lee 1, Seunghun Hyun 1, and Chisu Wu 1 1 School of

More information

CS 1632 SOFTWARE QUALITY ASSURANCE. 2 Marks. Sample Questions and Answers

CS 1632 SOFTWARE QUALITY ASSURANCE. 2 Marks. Sample Questions and Answers CS 1632 SOFTWARE QUALITY ASSURANCE 2 Marks Sample Questions and Answers 1. Define quality. Quality is the degree of goodness of a product or service or perceived by the customer. Quality concept is the

More information

SOFTWARE QUALITY IN 2002: A SURVEY OF THE STATE OF THE ART

SOFTWARE QUALITY IN 2002: A SURVEY OF THE STATE OF THE ART Software Productivity Research an Artemis company SOFTWARE QUALITY IN 2002: A SURVEY OF THE STATE OF THE ART Capers Jones, Chief Scientist Emeritus Six Lincoln Knoll Lane Burlington, Massachusetts 01803

More information

Appendix V Risk Management Plan Template

Appendix V Risk Management Plan Template Appendix V Risk Management Plan Template Version 2 March 7, 2005 This page is intentionally left blank. Version 2 March 7, 2005 Title Page Document Control Panel Table of Contents List of Acronyms Definitions

More information

Security Engineering Best Practices. Arca Systems, Inc. 8229 Boone Blvd., Suite 750 Vienna, VA 22182 703-734-5611 ferraiolo@arca.com.

Security Engineering Best Practices. Arca Systems, Inc. 8229 Boone Blvd., Suite 750 Vienna, VA 22182 703-734-5611 ferraiolo@arca.com. Tutorial: Instructor: Topics: Biography: Security Engineering Best Practices Karen Ferraiolo, Arca Systems, Inc. 8229 Boone Blvd., Suite 750 Vienna, VA 22182 703-734-5611 ferraiolo@arca.com This tutorial

More information

Procurement Programmes & Projects P3M3 v2.1 Self-Assessment Instructions and Questionnaire. P3M3 Project Management Self-Assessment

Procurement Programmes & Projects P3M3 v2.1 Self-Assessment Instructions and Questionnaire. P3M3 Project Management Self-Assessment Procurement Programmes & Projects P3M3 v2.1 Self-Assessment Instructions and Questionnaire P3M3 Project Management Self-Assessment Contents Introduction 3 User Guidance 4 P3M3 Self-Assessment Questionnaire

More information

Software Quality Management

Software Quality Management Software Lecture 9 Software Engineering CUGS Spring 2011 Kristian Sandahl Department of Computer and Information Science Linköping University, Sweden A Software Life-cycle Model Which part will we talk

More information