# An Introduction to Metrics Used During the Software Development Life Cycle


## Define the Metric Objectives

"You can't control what you can't measure." This quote comes from Tom DeMarco's book *Controlling Software Projects*, in which he describes how to organize and control a software project so that it is measurable in the context of time and cost projections. Control is the extent to which a manager can ensure a minimum of surprises: deviations from the plan should be signaled as early as possible so the team can react. Another quote from DeMarco's book, "The only unforgivable failure is the failure to learn from past failure," stresses the importance of estimating and measurement. Measurement is a recording of past effects used to quantitatively predict future effects.

## How do we define the Metrics?

Software testing, as a test development project, has deliverables such as test plans, test design, test development, and test execution. The objective of this task is to apply the principles of metrics to control the testing process. A metric is a measurable indication of some quantitative aspect of a system and has the following characteristics:

a) Measurable: A metric point must be measurable for it to be a metric, by definition. If the phenomenon cannot be measured, there is no way to apply management methods to control it.
b) Independent: Metrics need to be independent of human influence. There should be no way of changing the measurement other than changing the phenomenon that produced the metric.
c) Accountable: Any analytical interpretation of the raw metric data rests on the data itself, so it is necessary to save the raw data and keep a methodical audit trail of the analytical process.
d) Precise: Precision is a function of accuracy. The key to precision is that a metric is explicitly documented as part of the data collection process. If a metric varies, it can be measured as a range or tolerance.

A metric can be a result or a predictor. A result metric measures a completed event or process.
Examples include the actual total elapsed time to process a business transaction or the total test costs of a project. A predictor metric is an early-warning metric that has a strong correlation to some later result. An example is the response time predicted through statistical regression analysis when more terminals are added to a system, before that many terminals have actually been measured. A result or predictor metric can also be a derived metric: one derived from a calculation or graphical technique involving one or more other metrics.

The motivation for collecting test metrics is to make the testing process more effective. This is achieved by carefully analyzing the metric data and taking the appropriate action to correct problems. The starting point is to define the metric objectives of interest. Some examples include:

a) Defect analysis: Every defect must be analyzed to answer questions such as the root cause, how it was detected, when it was detected, and who detected it.
b) Test effectiveness: How well is testing doing, e.g., return on investment?
c) Development effectiveness: How well is development fixing defects?
d) Test automation: How much effort is expended on test automation?
e) Test cost: What are the resources and time spent on testing?
f) Test status: Where are we in the testing process?
g) User involvement: How much is the user involved in testing?
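The predictor metric described above, response time regressed against the number of terminals, can be sketched with an ordinary least-squares fit. The terminal counts and response times below are hypothetical, and the linear model is an illustrative assumption:

```python
# Hypothetical measurements: terminals attached vs. observed response time (seconds).
terminals = [10, 20, 30, 40, 50]
response_s = [1.1, 1.4, 1.9, 2.3, 2.7]

n = len(terminals)
mean_x = sum(terminals) / n
mean_y = sum(response_s) / n

# Ordinary least-squares fit: response = slope * terminals + intercept.
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(terminals, response_s)) \
        / sum((x - mean_x) ** 2 for x in terminals)
intercept = mean_y - slope * mean_x

# Predictor metric: response time at a load (80 terminals) not yet measured.
predicted = slope * 80 + intercept
print(f"predicted response time at 80 terminals: {predicted:.2f} s")
# prints "predicted response time at 80 terminals: 3.93 s"
```

Once real measurements at 80 terminals become available, comparing them against the prediction is what turns this into a useful early-warning metric.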

## How do we define the Metric Points?

Following are some metric points associated with the general metrics, together with the source or method used to compute each metric point. (A histogram is a graphical representation of measured values organized according to frequency of occurrence, used to pinpoint hot spots. Pareto analysis is the analysis of defect patterns to identify causes and sources.)

1) Defect analysis

| Metric Point | How to Compute |
| --- | --- |
| Distribution of defect causes | Histogram, Pareto analysis |
| Number of defects by cause over time | Multi-line graph |
| Number of defects by how found over time | Multi-line graph |
| Distribution of defects by module | Histogram, Pareto analysis |
| Distribution of defects by priority (critical, high, medium, low) | Histogram |
| Distribution of defects by functional area | Histogram |
| Distribution of defects by environment (platform) | Histogram, Pareto analysis |
| Distribution of defects by type (architecture, connectivity, consistency, database integrity, documentation, GUI, installation, memory, performance, security, standards and conventions, stress, usability, bad fixes) | Histogram, Pareto analysis |
| Distribution of defects by who detected (external customer, internal customer, development, QA, other) | Histogram, Pareto analysis |
| Distribution of defects by how detected (technical review, walkthrough, JAD, prototyping, inspection, test execution) | Histogram, Pareto analysis |
| Distribution of defects by severity (high, medium, low) | Histogram |

2) Development effectiveness

| Metric Point | How to Compute |
| --- | --- |
| Average time for development to repair a defect | Total repair time / Number of repaired defects |

3) Test automation

| Metric Point | How to Compute |
| --- | --- |
| Percent of manual vs. automated testing | Cost of manual test effort / Total test cost |

4) Test cost

| Metric Point | How to Compute |
| --- | --- |
| Distribution of cost by cause | Histogram, Pareto analysis |
| Distribution of cost by application | Histogram, Pareto analysis |
| Percent of costs for testing | Total testing cost / Total system cost |
| Total costs of testing over time | Line graph |
| Average cost of locating a defect | Total cost of testing / Number of defects detected |
| Anticipated costs of testing vs. actual cost | Comparison |
| Average cost of locating a requirements defect with requirements reviews | Requirements review costs / Defects found during requirements reviews |
| Average cost of locating a design defect with design reviews | Design review costs / Defects found during design reviews |
| Average cost of locating a code defect with code reviews | Code review costs / Defects found during code reviews |
| Average cost of locating a defect with test execution | Test execution costs / Defects found during test execution |
| Number of testing resources over time | Line plot |

5) Test effectiveness

| Metric Point | How to Compute |
| --- | --- |
| Percentage of defects discovered during maintenance | Number of defects discovered during maintenance / Total number of defects uncovered |
| Percent of defects uncovered due to testing | Number of errors detected through testing / Total system defects |
| Average effectiveness of a test | Number of tests / Total system defects |
| Value returned while reviewing requirements | Value returned during requirements review / Requirements test costs |
| Value returned while reviewing design | Value returned during design review / Design test costs |
| Value returned while reviewing programs | Value returned during program review / Program test costs |
| Value returned during test execution | Value returned during testing / Test costs |
| Effect of testing changes | Number of tested changes / Problems attributable to the changes |
| People's assessment of effectiveness of testing | Subjective scaling (1-10) |
| Average time for QA to verify a fix | Total QA verification time / Total number of defects to verify |
| Number of defects over time | Line graph |
| Cumulative number of defects over time | Line graph |
| Number of application defects over time | Multi-line graph |

6) Test extent

| Metric Point | How to Compute |
| --- | --- |
| Percent of statements executed | Number of statements executed / Total statements |
| Percent of logical paths executed | Number of logical paths executed / Total number of paths |
| Percent of acceptance criteria tested | Acceptance criteria tested / Total acceptance criteria |

Additional test-extent metric points are tracked as counts over time: number of requirements tested, number of statements executed, number of data elements exercised, number of decision statements executed, number of tests ready to run, number of test runs, number of tests run without defects uncovered, and number of defects corrected.

7) User involvement

| Metric Point | How to Compute |
| --- | --- |
| Percentage of user testing | User testing time / Total test time |
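Most of these metric points are simple counts and ratios over a defect log. A minimal sketch in Python, with hypothetical log entries and illustrative field names:

```python
from collections import Counter
from statistics import mean

# Hypothetical defect log; the field names are illustrative, not from the text.
defects = [
    {"cause": "requirements", "repair_hours": 6.0},
    {"cause": "code",         "repair_hours": 2.5},
    {"cause": "code",         "repair_hours": 3.0},
    {"cause": "design",       "repair_hours": 4.5},
    {"cause": "code",         "repair_hours": 1.5},
]

# Defect analysis: distribution of defect causes (the data behind a histogram),
# ranked by frequency as a starting point for Pareto analysis.
by_cause = Counter(d["cause"] for d in defects)
print(by_cause.most_common())

# Development effectiveness: total repair time / number of repaired defects.
avg_repair = mean(d["repair_hours"] for d in defects)
print(f"average repair time: {avg_repair:.2f} hours")
```

The same pattern extends to the other distributions in the tables above: group the log by module, severity, environment, or detection method instead of cause.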

## How do we Define Test Metrics used in Reporting?

The most common test report is a simple matrix that indicates the test cases, the test objectives, and the results of testing at any point in time. The following six tasks define how a test team can define metrics to be used in test reporting. An important part of these tasks is to ensure that the data (i.e., the measures) needed to create the test metrics is available.

1) Establish a test metrics team: The measurement team should include individuals who:
a) Have a working knowledge of quality and productivity measures
b) Are knowledgeable in the implementation of statistical process control tools
c) Have a working understanding of benchmarking techniques
d) Know the organization's goals and objectives
e) Are respected by their peers and management

The measurement team may consist of two or more individuals, relative to the size of the organization. Representatives should come from management and from development and maintenance projects. For an average-size organization, the measurement team should have between three and five members.

2) Inventory existing IT measures: The inventory of existing measures should be performed in accordance with a plan. Should problems arise during the inventory, the plan and the inventory process should be modified accordingly. The formal inventory is a systematic and independent review of all existing measures and metrics captured and maintained. All identified data must be validated to determine whether they are valid and reliable.

The inventory process should start with an introductory meeting of the participants. The objective of this meeting is to review the inventory plan with management and with representatives of the projects to be inventoried. A sample agenda for the introductory meeting is:
a) Introduce all members
b) Review the scope and objectives of the inventory process
c) Summarize the inventory processes to be used
d) Establish the communication channels to use
e) Confirm the inventory schedule with major target dates

The inventory involves these activities:
a) Review all measures currently being captured and recorded: Measures should include, but not be limited to, functionality, schedule, budgets, and quality.
b) Document all findings: Measures should be defined, samples captured, and the related software and methods of capture documented. Data file names and media locations should be recorded. It is critical that this be as complete as possible in order to determine the consistency of activities among different projects.
c) Conduct interviews: These interviews should determine what measurement data is captured and how it is processed. Through observation, the validity of the data can be determined.

3) Develop a consistent set of metrics: To implement a common set of test metrics for reporting that enables senior management to quickly assess the status of each project, it is critical to develop a list of consistent measures spanning all project lines. Initially this can be challenging, but with cooperation and some negotiation a reasonable list of measures can be drawn up. Organizations with mature processes will have an easier time completing this step, as will those with automated tools that collect data.

4) Define desired test metrics: The objective of this task is to use the information collected in the tasks above to define the metrics for the test reporting process. The major criteria of this task include:
a) Description of desired output reports
b) Description of common measures

c) Source of common measures and associated software tools for capture
d) Definition of data repositories (centralized and/or segregated)

5) Develop and implement the process for collecting measurement data: The objective of this step is to document the process used to collect the measurement data. Implementation will involve the following activities:
a) Document the workflow of the data capture and reporting process
b) Procure software tool(s) to capture, analyze, and report the data, if such tools are not currently available
c) Develop and test system and user documentation
d) Beta-test the process using a small to medium-size project
e) Resolve all management and project problems
f) Conduct training sessions for management and project personnel on how to use the process and interrelate the reports
g) Roll out the test status process

6) Monitor the process: Monitoring the test reporting process is very important because the metrics reported must be understood and used. It is essential to monitor the outputs of the system to ensure usage. The more successful the test reporting process is, the better the chance that management will want to use it and perhaps expand the reporting criteria.

## How do we Define Effective Test Metrics?

A metric is a mathematical number that shows a relationship between two variables. Software metrics are measures used to quantify status or results. This includes items that are directly measurable, such as lines of code, as well as items that are calculated from measurements, such as earned value. Metrics specific to testing include data regarding testing, defect tracking, and software performance. The following definitions of metrics are available:

a) Metric: a quantitative measure of the degree to which a system, component, or process possesses a given attribute.
b) Process metric: a metric used to measure characteristics of the methods, techniques, and tools employed in developing, implementing, and maintaining the software system.
c) Product metric: a metric used to measure characteristics of the documentation and code.
d) Software quality metric: a function whose inputs are software data and whose output is a single numerical value that can be interpreted as the degree to which the software possesses a given attribute affecting its quality.

Testers are typically responsible for reporting their test status at regular intervals. The following measurements generated during testing are applicable:
a) Total number of tests
b) Number of tests executed to date
c) Number of tests executed successfully to date

Data concerning software defects include:
a) Total number of defects corrected in each activity
b) Total number of defects detected in each activity
c) Average duration between defect detection and defect correction
d) Average effort to correct a defect
e) Total number of defects remaining at delivery

Some basic measurement concepts are described below to help testers use quantitative data effectively.
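The test-status and defect measurements listed above are straightforward to compute once the raw data is captured. A minimal sketch, with hypothetical counts and dates:

```python
from datetime import date

# Hypothetical test-status data; the numbers are illustrative only.
total_tests = 120
executed = 95
passed = 88

print(f"executed to date: {executed}/{total_tests} ({executed / total_tests:.0%})")
print(f"passed to date: {passed}/{executed} ({passed / executed:.0%})")

# Average duration between defect detection and defect correction,
# from hypothetical (detected, corrected) date pairs.
detected_fixed = [
    (date(2024, 3, 1), date(2024, 3, 4)),
    (date(2024, 3, 2), date(2024, 3, 3)),
    (date(2024, 3, 5), date(2024, 3, 10)),
]
avg_days = sum((fixed - found).days for found, fixed in detected_fixed) / len(detected_fixed)
print(f"average detection-to-correction time: {avg_days:.1f} days")
```

Reporting these at regular intervals, as the text recommends, is then just a matter of re-running the computation against the current log.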

## How do we compare Objective versus Subjective Measures?

Measurement can be either objective or subjective. An objective measure is one that can be obtained by counting; objective data is hard data, such as defects, hours worked, and completed deliverables. Subjective data normally has to be calculated; it is a person's perception of a product or activity. For example, a subjective measure would involve such attributes of an information system as how easy it is to use and the skill level needed to execute the system.

As a general rule, subjective measures are much more important than objective measures. For example, it is more important to know how effective a person is at performing a job (a subjective measure) than whether or not they got to work on time (an objective measure). QAI believes that the more difficult something is to measure, the more valuable that measure is. Individuals tend to want objective measures because they believe they are more reliable than subjective ones. It is unfortunate but true that many bosses are more concerned that workers arrive on time and do not leave early than with how productive those workers are during the day. You may have observed the type of people who always want to arrive at work before the boss, because they believe meeting objective measures is more important than meeting subjective measures, such as how easy the systems they built are to use.

## How Do You Know a Metric is Good?

Before a measure is approved for use, there are certain tests it must pass. QAI has identified the following tests to which each measure and metric should be subjected before it is approved for use:

a) Reliability: the consistency of measurement. If the measure were taken by two people, would the same results be obtained?
b) Validity: the degree to which a measure actually measures what it was intended to measure.
c) Ease of use and simplicity: functions of how easy it is to capture and use the measurement data.
d) Timeliness: whether the data was reported in sufficient time to affect the decisions needed to manage effectively.
e) Calibration: the movement of a metric so it becomes more valid, for example, changing a customer survey so it better reflects the true opinions of the customers.

## What are the Standard Units of Measure?

A measure is a single attribute of an entity; it is the basic building block of a measurement program. Measurement cannot be used effectively until the standard units of measure have been defined. You cannot intelligently talk about lines of code until the measure "lines of code" has been defined. For example, lines of code may mean lines of code written, executable lines of code written, or even non-compound lines of code written: a line containing a compound statement, such as a nested IF statement two levels deep, would be counted as two or more lines of code. In addition, organizations may wish to use weighting factors; for example, one verb might be weighted as more complex than other verbs in the same programming language. Measurement programs can be started with as few as five or six standard units of measure, and rarely exceed 50.

## How do we compare Productivity versus Quality?

Quality is an attribute of a product or service; productivity is an attribute of a process. They have frequently been called two sides of the same coin, because one has a significant impact on the other. There are two ways in which quality can drive productivity. The first, and undesirable, method is to lower or fail to meet quality standards. For example, if one chose to eliminate the testing and rework components of a system development process, productivity as measured in lines of code per hour worked would increase.
This is done frequently in information services under the guise of completing projects on time. While testing and rework may not be eliminated, they are not complete when the project is placed into production. The second method for improving productivity through

quality is to improve processes so that defects do not occur, thus minimizing the need for testing and rework. The QAI approach uses quality improvement processes to drive productivity.

## What are the Categories of Test Metric?

While there are no generally accepted categories of metrics, many test organizations have found it helpful to establish categories for status and reporting purposes. The metric categories, and the metrics within them, provide an inventory of the metrics that testers will use in status reporting and final test reports. An examination of many reports prepared by testers shows that the following eight metric categories are commonly used:

a) Metrics unique to test
b) Complexity measurements
c) Project metrics
d) Size measurements
e) Defect metrics
f) Product measures
g) Satisfaction metrics
h) Productivity metrics

## What are the Metrics that are Unique to Test?

This category includes metrics such as defect removal efficiency, defect density, and mean time to last failure. The following are examples of metrics unique to test:

a) Defect removal efficiency: the percentage of total defects occurring in a phase or activity that are removed by the end of that activity.
b) Defect density: the number of defects in a particular product.
c) Mean time to failure: the average operational time before a software system fails.
d) Mean time to last failure: an estimate of the time it will take to remove the last defect from the software.
e) Coverage metrics: the percentage of instructions or paths executed during tests.
f) Test cycles: the number of testing cycles required to complete testing, which may be related to the size or complexity of the software system.
g) Requirements tested: the percentage of requirements tested during testing.

## How do we measure the Complexity?

This category includes quantitative values, accumulated by a predetermined method, that measure the complexity of a software product. The following are examples of complexity measures:

a) Size of module/unit: larger modules/units are considered more complex.
b) Logic complexity: the number of opportunities to branch/transfer within a single module.
c) Documentation complexity: the difficulty level of reading the documentation, usually expressed as an academic grade level.

## Project Metrics

This category includes the status of the project, including milestones, budget and schedule variance, and project scope changes. The following are examples of project metrics:

a) Percent of budget utilized
b) Days behind or ahead of schedule
c) Percent of change of project scope
d) Percent of project completed (not a budget or schedule metric, but rather an assessment of the functionality/structure completed at a given point in time)
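The size and logic-complexity measures above can be approximated directly from source code. The sketch below is a rough heuristic, not a standard method: it counts non-blank, non-comment lines as module size and occurrences of branch keywords as branch opportunities, and the keyword list is an assumption for illustration only.

```python
# Rough size and logic-complexity counts for a Python module.
# The branch-keyword list is an illustrative assumption, not a standard.
BRANCH_KEYWORDS = ("if ", "elif ", "for ", "while ", "except", "case ")

def size_and_complexity(source: str) -> tuple[int, int]:
    loc = 0       # non-blank, non-comment lines (one flavor of "lines of code")
    branches = 0  # opportunities to branch within the module
    for line in source.splitlines():
        stripped = line.strip()
        if not stripped or stripped.startswith("#"):
            continue
        loc += 1
        branches += sum(stripped.startswith(k) or f" {k}" in stripped
                        for k in BRANCH_KEYWORDS)
    return loc, branches

module = """\
def classify(n):
    # negative, zero, or positive
    if n < 0:
        return "negative"
    elif n == 0:
        return "zero"
    return "positive"
"""
print(size_and_complexity(module))  # prints (6, 2)
```

A production measurement program would instead parse the code (e.g., with an AST) so that the definition of "line of code" and "branch" is explicit and repeatable, as the standard-units discussion earlier requires.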

## Size Measurements

This category includes methods developed primarily for measuring the size of software systems, such as lines of code and function points. These can also be used to measure software-testing productivity. Sizing is important for normalizing data for comparison with other projects. The following are examples of size metrics:

a) KLOC: thousand lines of code, used primarily with statement-level languages.
b) Function points: a defined unit of size for software.
c) Pages or words of documentation.

## Defect Metrics

This category includes values associated with numbers or types of defects, usually related to system size (such as defects per 1,000 lines of code or defects per 100 function points), severity of defects, uncorrected defects, etc. The following are examples of defect metrics:

a) Defects related to the size of the software.
b) Severity of defects, such as very important, important, and unimportant.
c) Priority of defects: the importance of correcting defects.
d) Age of defects: the number of days a defect has been uncovered but not corrected.
e) Defects uncovered in testing.
f) Cost to locate a defect.

## Product Measures

This category includes measures of a product's attributes, such as performance, reliability, and usability. An example of a product measure:

a) Defect density: the expected number of defects that will occur in a product during development.

## Satisfaction Metrics

This category includes the assessment, by the customers of testing, of the effectiveness and efficiency of testing. The following are examples of satisfaction metrics:

a) Ease of use: the amount of effort required to use the software and/or its documentation.
b) Customer complaints: some relationship between customer complaints and the size of the system or the number of transactions processed.
c) Customer subjective assessment: a rating system that asks customers to rate their satisfaction with different project characteristics on a scale, for example 1-5.
d) Acceptance criteria met: the number of user-defined acceptance criteria met at the time the software goes operational.
e) User participation in software development: an indication of the users' desire to produce high-quality software on time and within budget.

## Productivity Metrics

This category addresses the effectiveness of test execution. Examples of productivity metrics are:

a) Cost of testing in relation to overall project costs: assumes a commonly accepted ratio of the cost of development versus test.
b) Under budget/ahead of schedule.
c) Software defects uncovered after the software is placed into operational status.
d) Amount of testing using automated tools.
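Two of the metrics named in these categories, defect removal efficiency and defect density, reduce to simple ratios. A minimal sketch with hypothetical counts:

```python
# Hypothetical defect counts; the numbers are illustrative only.
found_in_test = 45        # defects removed before release
found_in_production = 5   # defects that escaped to operations
kloc = 12.0               # size of the release, in thousand lines of code

# Defect removal efficiency: share of total defects removed by end of testing.
dre = found_in_test / (found_in_test + found_in_production)

# Defect density: defects per KLOC, normalized for cross-project comparison.
density = (found_in_test + found_in_production) / kloc

print(f"defect removal efficiency: {dre:.0%}")    # prints "defect removal efficiency: 90%"
print(f"defect density: {density:.1f} defects/KLOC")
```

Because both metrics depend on defects found after release, they can only be computed retrospectively, which is why they are typically reported in final test reports rather than interim status reports.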


### SOFTWARE QUALITY & SYSTEMS ENGINEERING PROGRAM. Quality Assurance Checklist

SOFTWARE QUALITY & SYSTEMS ENGINEERING PROGRAM Quality Assurance Checklist The following checklist is intended to provide system owners, project managers, and other information systems development and

### CSSE 372 Software Project Management: Managing Software Projects with Measures

CSSE 372 Software Project Management: Managing Software Projects with Measures Shawn Bohner Office: Moench Room F212 Phone: (812) 877-8685 Email: bohner@rose-hulman.edu Dimensional Analysis Abuse Learning

### ADMINISTRATIVE SUPPORT AND CLERICAL OCCUPATIONS SIN 736 1

Following are the Contractor Site and Government Site Labor Categories for SIN 736-1, SIN 736-1, and SIN 736-5. Please do not hesitate to contact us at gsataps@amdexcorp.com if you have any questions ADMINISTRATIVE

### Quick Reference Guide Interactive PDF Project Management Processes for a Project

Project Processes for a Project Click the Knowledge Area title (below and left in blue underline) to view the details of each Process Group. Project Process Groups and Knowledge Areas Mapping Project Process

### IMPLEMENTATION NOTE. Validating Risk Rating Systems at IRB Institutions

IMPLEMENTATION NOTE Subject: Category: Capital No: A-1 Date: January 2006 I. Introduction The term rating system comprises all of the methods, processes, controls, data collection and IT systems that support

### Develop Project Charter. Develop Project Management Plan

Develop Charter Develop Charter is the process of developing documentation that formally authorizes a project or a phase. The documentation includes initial requirements that satisfy stakeholder needs

### A Framework for Project Metrics

A Framework for Project Metrics Deciding what to measure and how to measure it August 13, 2007 Dr. Gary J. Evans, PMP CVR/IT Consulting LLC www.cvr-it.com Welcome! Focus of this workshop: Project Metrics

### PROJECT MANAGEMENT PLAN CHECKLIST

PROJECT MANAGEMENT PLAN CHECKLIST The project management plan is a comprehensive document that defines each area of your project. The final document will contain all the required plans you need to manage,

### ISO 9001 Quality Systems Manual

ISO 9001 Quality Systems Manual Revision: D Issue Date: March 10, 2004 Introduction Micro Memory Bank, Inc. developed and implemented a Quality Management System in order to document the company s best

### Project Integration Management

Integration Initiating ning Executing Monitoring & Controlling Closing 4.1 Develop Charter Statement Of Work Business Case 4.2 Develop 4.3 Direct and Manage Work 4.4 Monitor and Control Work 4.5 Perform

### Practical Metrics for Managing and Improving Software Testing

Practical Metrics for Managing and Improving Software Testing Presented By: Shaun Bradshaw shaun.bradshaw@zenergytechnologies.com Slide 1 Part 1 Test Metrics Ten key metrics testers should track One bonus

### Performance Testing Process A Whitepaper

Process A Whitepaper Copyright 2006. Technologies Pvt. Ltd. All Rights Reserved. is a registered trademark of, Inc. All other trademarks are owned by the respective owners. Proprietary Table of Contents

### Software Metrics. Er. Monika Verma*, Er. Amardeep Singh **, Er. Pooja Rani***, Er. Sanjeev Rao****

Software Metrics Er. Monika Verma*, Er. Amardeep Singh **, Er. Pooja Rani***, Er. Sanjeev Rao**** * M.Tech(Hons-CSE), B.Tech.(CSE), Assistant Professor in Computer Sc.and Engg Department, Swami Vivekanand

### SOFTWARE REQUIREMENTS

SOFTWARE REQUIREMENTS http://www.tutorialspoint.com/software_engineering/software_requirements.htm Copyright tutorialspoint.com The software requirements are description of features and functionalities

### SOFTWARE TESTING. Yogesh Singh MM CAMBRIDGE UNIVERSITY PRESS

SOFTWARE TESTING Yogesh Singh MM CAMBRIDGE UNIVERSITY PRESS Contents List of Figures List of Tables Preface Acknowledgements xi xv xxi xxiii 1. Introduction 1 1.1 Some Software Failures 1 1.1.1 The Explosion

### SOFTWARE DEVELOPMENT STANDARD FOR SPACECRAFT

SOFTWARE DEVELOPMENT STANDARD FOR SPACECRAFT Mar 31, 2014 Japan Aerospace Exploration Agency This is an English translation of JERG-2-610. Whenever there is anything ambiguous in this document, the original

### Surgi Manufacturing Quality Manual

Surgi Manufacturing Page 1 of 18 Approvals: QA: Eng. Mgt. : A Date: 18Aug98 1. Introduction... 4 1.1 Scope... 4 1.2 Purpose... 4 1.3 Authority... 4 1.4 Issue of the Manual... 4 1.5 Amendments... 4 1.6

### Integration Mgmt / Initiating Process Group 4.1 Develop Project Charter

1 Mgmt / Initiating Process Group 4.1 Develop Project Charter Project statement of work Business case Agreements Facilitation techniques Project charter 26/02/2013 18:23:36 1 2 Mgmt / Planning Process

### Formal Software Testing. Terri Grenda, CSTE IV&V Testing Solutions, LLC www.ivvts.com

Formal Software Testing Terri Grenda, CSTE IV&V Testing Solutions, LLC www.ivvts.com Scope of Testing Find defects early Remove defects prior to production Identify Risks Unbiased opinion When Should Testing

### Quality Assurance Plan PPM Version 2.0

Quality Assurance Plan PPM Version 2.0 U.S. Department of Housing and Urban Development PPM Version 2.0 January 2014 Quality Assurance Plan Document Control Information

### NORTH AMERICA OPERATIONS. (Fairmont and Montreal Facilities) QUALITY MANUAL. Prepared to comply with the requirements of ISO 9001:2008

WEIGH-TRONIX CANADA ULC NORTH AMERICA OPERATIONS (Fairmont and Montreal Facilities) QUALITY MANUAL Prepared to comply with the requirements of ISO 9001:2008 Meets or exceeds the requirements for design,

### Optimization of Software Quality using Management and Technical Review Techniques

Optimization of Software Quality using Management and Technical Review Techniques Inibehe Emmanuel Akpannah Post Graduate Student (MSc. Information Technology), SRM University, Chennai, India Abstract

### QA Procedure Page 1 Appendix A

*Changes from the previous QASE noted by "yellow" highlight of block Evaluation Summary Company: Prepared By: Section Element Manual Audit OK Objective Evidence 2.1 Objective of Quality Assurance Program

### MNLARS Project Audit Checklist

Audit Checklist The following provides a detailed checklist to assist the audit team in reviewing the health of a project. Relevance (at this time) How relevant is this attribute to this project or audit?

### TestScape. On-line, test data management and root cause analysis system. On-line Visibility. Ease of Use. Modular and Scalable.

TestScape On-line, test data management and root cause analysis system On-line Visibility Minimize time to information Rapid root cause analysis Consistent view across all equipment Common view of test

### Industry Metrics for Outsourcing and Vendor Management

Industry Metrics for Outsourcing and Vendor Management Scott Goldfarb Q/P Management Group, Inc. 10 Bow Street Stoneham, Massachusetts 02180 sgoldfarb@qpmg.com Tel: (781) 438-2692 FAX (781) 438-5549 www.qpmg.com

### QUALITY ASSURANCE IN EXTREME PROGRAMMING Plamen Balkanski

International Journal "Information Theories & Applications" Vol.10 113 QUALITY ASSURANCE IN EXTREME PROGRAMMING Plamen Balkanski Abstract: Our previous research about possible quality improvements in Extreme

### Chapter 11 Audit Sampling Concepts

Chapter 11 Audit Sampling Concepts Review Questions 11-1 A representative sample is one in which the characteristics of interest for the sample are approximately the same as for the population (that is,

### VisuAL PM Sampler. By Projerra Management Inc. For the PMBOK Guide Fourth Edition

VisuAL PM Sampler By Projerra Inc. For the PMBOK Guide Fourth Edition VisuAL PM Sampler By Projerra Inc. THANK YOU Thank you for downloading the VisuAL PM Sampler by Projerra Inc. We hope that this small

### Measurements & Metrics. Software measurement and metrics. Definitions

Measurements & Metrics Not everything that can be counted counts, and not everything that counts can be counted. -Albert Einstein 1 Software measurement and metrics Software measurement is concerned with

### Effective Test Management Practices

Effective Test Management Practices Dr. Magdy Hanna Chairman International Institute for Software Testing mhanna@testinginstitute.com http:// Principles-1 What is most frustrating in your role as a test

### SoMA. Automated testing system of camera algorithms. Sofica Ltd

SoMA Automated testing system of camera algorithms Sofica Ltd February 2012 2 Table of Contents Automated Testing for Camera Algorithms 3 Camera Algorithms 3 Automated Test 4 Testing 6 API Testing 6 Functional

### Supplier Quality Requirements

Key Elements Procedure #2 Supplier Quality Requirements Section List Of Contents Page 1.0 Foreword... 2 2.0 Basic Quality Requirements 2 3.0 Part Specific Quality Considerations 4 4.0 Advanced Product

### Amajor benefit of Monte-Carlo schedule analysis is to

2005 AACE International Transactions RISK.10 The Benefits of Monte- Carlo Schedule Analysis Mr. Jason Verschoor, P.Eng. Amajor benefit of Monte-Carlo schedule analysis is to expose underlying risks to

### Software Project Management Matrics. Complied by Heng Sovannarith heng_sovannarith@yahoo.com

Software Project Management Matrics Complied by Heng Sovannarith heng_sovannarith@yahoo.com Introduction Hardware is declining while software is increasing. Software Crisis: Schedule and cost estimates

### ORACLE PROJECT PLANNING AND CONTROL

ORACLE PROJECT PLANNING AND CONTROL (Formerly Oracle Project Management) KEY FEATURES COLLABORATIVE PROJECT PLANNING Define a project work breakdown structure to align plans to execution Coordinate financial

### Quality Control Inspection Program Counterstone of a High-Performance Project Organization

Quality Control Inspection Program Counterstone of a High-Performance Project Organization Jose E. Reyes Panamanian Association of Project Management, Panama e-mail: jreyes@apgp-ipma.org DOI 10.5592/otmcj.2012.1.2

### Smarter Balanced Assessment Consortium. Recommendation

Smarter Balanced Assessment Consortium Recommendation Smarter Balanced Quality Assurance Approach Recommendation for the Smarter Balanced Assessment Consortium 20 July 2012 Summary When this document was

### Input, Output and Tools of all Processes

1 CIS12-3 IT Project Management Input, Output and Tools of all Processes Marc Conrad D104 (Park Square Building) Marc.Conrad@luton.ac.uk 26/02/2013 18:22:06 Marc Conrad - University of Luton 1 2 Mgmt /

### Service Delivery Module

Service Delivery Module Software Development Methodology -India follows international industry standards and has adopted the standard methodology in our Software Development Life Cycle (SDLC). It is a

### Closed Loop Quality Management: Integrating PLM and Quality Management

Integrating PLM and Quality Management In recent Aberdeen research of over 500 manufacturers it was shown that 100% of Best-in-Class manufacturers having both a Quality Management solution and Product

### CORPORATE QUALITY MANUAL

Corporate Quality Manual Preface The following Corporate Quality Manual is written within the framework of ISO 9001:2008 Quality System by the employees of CyberOptics. CyberOptics recognizes the importance

### Quality Assurance - Karthik

Prevention is better than cure Quality Assurance - Karthik This maxim perfectly explains the difference between quality assurance and quality control. Quality Assurance is a set of processes that needs

### Test Execution through Metrics

Test Execution through Jayaprakash Prabhakar QA Manager Consumer Engineering McAfee Inc 2007 McAfee, Inc. Inc. Agenda 2 What is? Flow of Collection Who are the interested Parties? When to collect? Various

### Current Standard: Mathematical Concepts and Applications Shape, Space, and Measurement- Primary

Shape, Space, and Measurement- Primary A student shall apply concepts of shape, space, and measurement to solve problems involving two- and three-dimensional shapes by demonstrating an understanding of:

### Reaching CMM Levels 2 and 3 with the Rational Unified Process

Reaching CMM Levels 2 and 3 with the Rational Unified Process Rational Software White Paper TP174 Table of Contents INTRODUCTION... 1 LEVEL-2, REPEATABLE... 3 Requirements Management... 3 Software Project

### PMO Metrics Recommendations

Introduction The purpose of this document is to recommend metrics to be used by the Project Management Office (PMO) to measure and analyze their project and PMO success. The metrics are divided into Project

### Request for Proposal for Application Development and Maintenance Services for XML Store platforms

Request for Proposal for Application Development and Maintenance s for ML Store platforms Annex 4: Application Development & Maintenance Requirements Description TABLE OF CONTENTS Page 1 1.0 s Overview...

### <name of project> Software Project Management Plan

The document in this file is adapted from the IEEE standards for Software Project Management Plans, 1058-1998, which conforms to the requirements of ISO standard 12207 Software Life Cycle Processes. Tailor

### Requirements-Based Testing Process in Practice (Originally presented as Getting it right the first time )

International Journal of Industrial Engineering and Management (), Vol.1 No 4, 2010, pp. 155-161 Available online at www.ftn.uns.ac.rs/ijiem ISSN 2217-2661 Requirements-Based Testing Process in Practice

### Agile Master Data Management A Better Approach than Trial and Error

Agile Master Data Management A Better Approach than Trial and Error A whitepaper by First San Francisco Partners First San Francisco Partners Whitepaper Executive Summary Market leading corporations are

### Softjourn, Inc. s QA Testing Process

Softjourn, Inc. s QA Process Date of Last Update: June 05, 2007 Version: 2.0 Author: Softjourn, Inc. Headquarters 39270 Paseo Padre Pkwy #251 Fremont, CA 94536 USA p: +1.510.744.1528 f: +1. 815.301.2772

### Quality Management. Lecture 12 Software quality management

Quality Management Lecture 12 Software quality management doc.dr.sc. Marko Jurčević prof.dr.sc. Roman Malarić University of Zagreb Faculty of Electrical Engineering and Computing Department of Fundamentals

### Software Engineering Introduction & Background. Complaints. General Problems. Department of Computer Science Kent State University

Software Engineering Introduction & Background Department of Computer Science Kent State University Complaints Software production is often done by amateurs Software development is done by tinkering or

### The 10 Knowledge Areas & ITTOs

This document is part of a series that explain the newly released PMBOK 5th edition. These documents provide simple explanation and summary of the book. However they do not replace the necessity of reading

### Introduction to Software Engineering. 8. Software Quality

Introduction to Software Engineering 8. Software Quality Roadmap > What is quality? > Quality Attributes > Quality Assurance: Planning and Reviewing > Quality System and Standards 2 Sources > Software

### EST.03. An Introduction to Parametric Estimating

EST.03 An Introduction to Parametric Estimating Mr. Larry R. Dysert, CCC A ACE International describes cost estimating as the predictive process used to quantify, cost, and price the resources required

### PROJECT MANAGEMENT PLAN TEMPLATE < PROJECT NAME >

PROJECT MANAGEMENT PLAN TEMPLATE < PROJECT NAME > Date of Issue: < date > Document Revision #: < version # > Project Manager: < name > Project Management Plan < Insert Project Name > Revision History Name

### QDA Q-Management A S I D A T A M Y T E S P E C S H E E T. From stand-alone applications to integrated solutions. Process optimization tool

QDA Q-Management Q-Management is the powerful base software package within ASI DATAMYTE s QDA suite that facilitates achievement and verification of quality goals such as process control, cost reduction,

### Quality Manual. DuraTech Industries, Inc. 3216 Commerce Street La Crosse, WI 54603 MANUAL SERIAL NUMBER 1

Quality Manual Approval Page Document: QA1000 Issue Date: 5/29/1997 Page 1 of 17 Revision Date: 5/20/2013 DuraTech Industries, Inc. 3216 Commerce Street La Crosse, WI 54603 MANUAL SERIAL NUMBER 1 This

### Software Quality. Software Quality Assurance and Software Reuse. Three Important Points. Quality Factors

Software Quality Software Quality Assurance and Software Reuse Peter Lo Conformance to explicitly-stated functional and performance requirements, explicitly-documented development standards, and implicit

### INTERNATIONAL STANDARD ON AUDITING 500 AUDIT EVIDENCE CONTENTS

INTERNATIONAL STANDARD ON AUDITING 500 AUDIT EVIDENCE (Effective for audits of financial statements for periods beginning on or after December 15, 2009) CONTENTS Paragraph Introduction Scope of this ISA...

### 8. Master Test Plan (MTP)

8. Master Test Plan (MTP) The purpose of the Master Test Plan (MTP) is to provide an overall test planning and test management document for multiple levels of test (either within one project or across