Software Project Measurement




We can't accurately measure software, yet we must have measures if we are to understand large-scale design. This lecture discusses: the practical aims of measurement; the measures appropriate to them; ways of identifying and prioritising measurement issues; how to put together a measurement plan; the limitations common to many measurement methods; and the use of measurement indicators and estimators.

5.1 Why Measure?

In traditional, structured lifecycles we want to:

- assess and manage risk
- trade design decisions against others
- track progress
- justify objectives

It is difficult to do any of these things in an objective way unless we have some picture of where we are in a project and how much progress we have made in design.

5.2 What is it Useful to Measure?

Although software itself resists absolute measurement, there are many aspects of software projects for which measurement (even rough or indirect measurement) may be useful:

- Schedule: Is it on time?
- Cost: Can we afford to finish it?
- Growth: Will it scale?
- Quality: Is it well made?
- Ability: How good are we at design?
- Technology: Is the technology viable?

These interact. For example, design ability influences the cost of running to completion, which interacts with the way the project schedule is put together, which has an impact on software quality, which contributes to the growth of the system. In the sections below we look at each of these aspects in more detail, listing important categories of measure and suggesting some appropriate units of measure for these categories.

5.2.1 Schedule Issues

Milestone measures:
- Date of delivery

Work unit measures:
- Component status
- Requirement status
- Paths tested
- Problem report status
- Reviews completed
- Change request status

5.2.2 Cost Issues

Personnel measures:
- Effort
- Staff experience
- Staff turnover

Financial performance measures:
- Earned value
- Cost

Environment availability measures:
- Availability dates
- Resource utilisation

5.2.3 Growth Issues

Product size and stability measures:
- Lines of code
- Components
- Words of memory
- Database size

Functional size and stability measures:
- Requirements
- Function points
- Change request workload

5.2.4 Quality Issues

Defect measures:
- Problem reports
- Defect density
- Failure interval

Rework measures:
- Rework size
- Rework effort

5.2.5 Ability Issues

Process maturity measures:
- Capability maturity model level

Productivity measures:
- Product size/effort
- Functional size/effort

5.2.6 Technology Issues

Performance measures:
- Cycle time
- Response time

Resource utilisation measures:
- CPU utilisation
- I/O utilisation
- Memory utilisation

5.3 Identifying and Prioritising Issues

The issues described above are not equally important for all projects. It is necessary, therefore, to find the ones which matter and prioritise them. Identification is possible from various sources, including:

- Risk assessments
- Project constraints (e.g. budgets)
- Leveraging technologies (e.g. COTS)
- Product acceptance criteria
- External requirements
- Past projects

Prioritisation usually requires some succinct form of contrast between issues, based on previous projects. Sometimes it is possible to obtain (rough) probabilities of occurrence of identified issues; these can then be modified according to the impact each would have if it became a real problem and the exposure which your particular project has to that sort of problem. The table below is an example presentation of this sort of prioritisation (a small code sketch of this weighting appears after the checklist in Section 5.4).

Issue                   Probability of occurrence   Relative impact   Project exposure
Aggressive schedule     1.0                         10                10
Unstable requirements   1.0                         8                 8
Staff experience        1.0                         5                 8
Reliability reqs        0.9                         3                 4
COTS performance        0.2                         9                 1

5.4 Making a Measurement Plan

It is likely that you will want to take measurements several times during the course of a large software project, and in those circumstances a measurement plan will be needed. The following are some of the things measurement plans typically contain:

- Issues and measures
- Data sources
- Levels of measurement
- Aggregation structure
- Frequency of collection
- Method of access
- Communication and interfaces
- Frequency of reporting
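To make the last two sections concrete, here is a minimal sketch in Python. The scoring rule (probability of occurrence times relative impact) is an assumption made for illustration; the notes do not prescribe a formula, although the exposure column in the Section 5.3 table roughly tracks it. The plan fields mirror the checklist above, and every concrete plan value is hypothetical.

```python
# Illustrative sketch only: ranks the issues from Section 5.3's table and
# drafts a measurement-plan entry (Section 5.4) for the top-ranked issue.
# Assumption: score = probability x impact; the notes do not fix a formula.

issues = [
    # (issue, probability of occurrence, relative impact, project exposure)
    ("Aggressive schedule",   1.0, 10, 10),
    ("Unstable requirements", 1.0,  8,  8),
    ("Staff experience",      1.0,  5,  8),
    ("Reliability reqs",      0.9,  3,  4),
    ("COTS performance",      0.2,  9,  1),
]

# Rank issues by the assumed score.
ranked = sorted(issues, key=lambda i: i[1] * i[2], reverse=True)
for name, prob, impact, exposure in ranked:
    print(f"{name:22s} score={prob * impact:5.1f} exposure={exposure:2d}")

# A measurement-plan entry for the top issue, using the fields listed in
# Section 5.4. All values below are hypothetical examples.
plan_entry = {
    "issue": ranked[0][0],
    "measures": ["milestone dates", "work unit progress"],
    "data_sources": ["project schedule", "change request tracker"],
    "level_of_measurement": "component",
    "aggregation_structure": "component -> subsystem -> system",
    "frequency_of_collection": "weekly",
    "method_of_access": "project database query",
    "communication_and_interfaces": "summary to project board",
    "frequency_of_reporting": "monthly",
}
print(plan_entry)
```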

5.5 Limitations of Measurement

Measurement is necessary but fallible and subject to many practical limitations. Some of these, concerning the measurement categories introduced in Section 5.2, are listed below.

- Milestones don't measure effort; they only give critical paths
- It is difficult to compare the relative importance of measures
- Incremental design requires measurement of incomplete functions
- Important measures may be spread across components
- The cost of design is not an indicator of performance
- Current resource utilisation may not be best
- Reliable historical data is hard to find
- Some software statistics are time-consuming to collect
- Some measures only apply after coding has been done
- Size doesn't map directly to functionality, complexity or quality
- There is a time lag between problems and their appearance in reports
- Changes suggested by one performance indicator may affect others
- Often no distinction is made between work and re-work
- Overall capability maturity level may not predict performance on a specific project
- Technical performance measures are often not as precise as they may seem
- Technical resource utilisation may only be known after integration and testing

Be sure also to consider how you will maintain the quality of your measurement data, for example (a small validation sketch follows this list):

- Are units of measure comparable (e.g. lines of code in Ada versus Java)? Is normalisation needed?
- What are acceptable ranges for data values?
- Can we tolerate gaps in the data supplied?
- When does a change to values amount to re-planning?
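A minimal sketch of how such data-quality rules might be applied mechanically. The notes only pose these as questions, so the range limits, gap tolerance and the Ada/Java normalisation factors below are all invented for illustration.

```python
# Illustrative data-quality checks for collected measures (Section 5.5).
# All thresholds and language factors are hypothetical; decide your own.

ACCEPTABLE_RANGE = (0, 500_000)   # hypothetical bounds on lines-of-code values
MAX_GAP_FRACTION = 0.1            # hypothetical tolerance for missing samples

# Hypothetical normalisation so Ada and Java counts are roughly comparable.
LANGUAGE_FACTOR = {"ada": 1.0, "java": 0.8}

def validate(samples, language):
    """Check a series of (period, lines_of_code) samples; None marks a gap."""
    gaps = sum(1 for _, v in samples if v is None)
    if gaps / len(samples) > MAX_GAP_FRACTION:
        raise ValueError("too many gaps in supplied data")
    lo, hi = ACCEPTABLE_RANGE
    bad = [(p, v) for p, v in samples if v is not None and not lo <= v <= hi]
    if bad:
        raise ValueError(f"values outside acceptable range: {bad}")
    factor = LANGUAGE_FACTOR[language]
    return [(p, None if v is None else v * factor) for p, v in samples]

print(validate([("Jan", 12_000), ("Feb", 13_200), ("Mar", 14_500)], "java"))
```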

[Figure: Example Indicator: Design Progress — number of units completing design, actual versus planned, plotted against project time.]

[Figure: Example Indicator: Effort — staff hours per month, actual versus planned, plotted against project time.]

[Figure: Example Estimator: Size-Effort — staff months (log scale) plotted against number of lines of source code, with upper and lower 95% bounds.]
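A size-effort estimator of the kind shown in the third figure is typically obtained by regressing logged effort against logged size over past projects. Here is a minimal sketch of that approach; the historical data is invented, and only the log-log form and 95% bands echo the figure, since the notes do not prescribe a fitting method.

```python
# Illustrative size-effort estimator: fit log(effort) against log(size)
# over past projects, then predict effort with rough 95% bounds for a
# new system. The historical data below is invented for illustration.

import numpy as np

loc    = np.array([5_000, 12_000, 30_000, 60_000, 150_000])  # lines of code
months = np.array([6.0,   15.0,   35.0,   80.0,   210.0])    # staff months

x, y = np.log(loc), np.log(months)
slope, intercept = np.polyfit(x, y, 1)   # least-squares line in log space
residuals = y - (slope * x + intercept)
sigma = residuals.std(ddof=2)            # residual spread about the fit

def estimate(new_loc):
    """Return (lower, central, upper) staff-month estimates for new_loc."""
    mid = np.exp(slope * np.log(new_loc) + intercept)
    band = np.exp(1.96 * sigma)          # crude 95% multiplicative band
    return mid / band, mid, mid * band

low, mid, high = estimate(40_000)
print(f"40 KLOC -> {mid:.0f} staff months (95% range {low:.0f}..{high:.0f})")
```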

ADDENDUM: The Capability Maturity Model

The Capability Maturity Model was developed by the Software Engineering Institute at Carnegie Mellon University (Pittsburgh, PA).

Initial level
At this level, an organisation does not have effective management procedures or project plans. If formal procedures for project control exist, there are no organisational mechanisms to ensure that they are used consistently. The organisation may successfully develop software, but the characteristics of the software and the software process will be unpredictable.

Repeatable level
An organisation has formal management, quality assurance and configuration control processes in place. The organisation can successfully repeat projects of the same type. However, there is no formal process model; project success is dependent on individual managers motivating a team and on organisational folklore acting as an intuitive process description.

Defined level
An organisation has defined its process and thus has a basis for qualitative process improvement. Formal procedures are in place to ensure that the process is followed in all projects.

Managed level
An organisation has a defined process and a formal programme of quantitative data collection. Process and product metrics are collected and fed into the process improvement activity.

Optimising level
An organisation is committed to continuous process improvement. Process improvement is budgeted and planned and is an integral part of the organisation's process.