ESTABLISHING A MEASUREMENT PROGRAM



The most important rule: understand that software measurement is a means to an end, not an end in itself.

Three key reasons for software measurement:

- Understanding software
  - baseline models and relationships
  - key process characteristics
- Managing software projects
  - planning and estimating
  - tracking actuals versus estimates
  - validating models
- Guiding process improvement
  - understanding
  - assessing
  - packaging

Understanding Software

The most important reason for measurement is to understand your business:

- How much are we spending on software development?
- Where do we allocate and use resources throughout the life cycle?
- How much effort do we expend specifically on testing software?
- What types of errors and changes are typical on our projects?
- How long will it take me to finish testing this software?
- Is reliability a function of testing time? Should I impose stronger testing standards?

So we need to build baseline models and relationships as a basis for all forms of understanding (a small sketch of deriving such baselines follows this slide).

Associated with whatever we want to understand is a set of characteristics that provides insight into achieving that particular goal.

If we want to understand the cost of development, key characteristics include:

- distribution of effort among development activities
- typical cost per line of code
- cost of maintenance
- hours spent on documentation
- computer resources required
- amount of rework expected

If we want to understand the reliability of our systems, key characteristics include:

- number and classes of defects found
- how and when defects are found
- pass/fail rates for integration and system testing
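To make the idea of a baseline concrete, here is a minimal sketch in Python. All figures are hypothetical stand-ins for an organization's own historical project records.

```python
# A minimal sketch of deriving simple baseline relationships from
# historical project records. All figures are hypothetical; in practice
# they would come from your organization's own project archive.

projects = [
    # (name, size in KLOC, total effort in staff-hours, defects found)
    ("A", 32, 5200, 410),
    ("B", 18, 2900, 230),
    ("C", 55, 9800, 640),
]

total_size = sum(size for _, size, _, _ in projects)
total_effort = sum(effort for _, _, effort, _ in projects)
total_defects = sum(defects for _, _, _, defects in projects)

# Two baseline relationships: effort per KLOC and defects per KLOC.
print(f"Baseline effort:  {total_effort / total_size:.0f} staff-hours per KLOC")
print(f"Baseline defects: {total_defects / total_size:.1f} defects per KLOC")
```

Even two or three ratios of this kind give new estimates and new projects something to be compared against.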

Managing Software Projects

- Planning and estimating: build models of relationships for key variables.
- Tracking actuals versus estimates: track your progress in real time and compare it to your baselines (a small tracking sketch follows this slide).
- Validating models: learn how and when your models are changing so you can modify them.

Focus on applying results rather than collecting data.

Guiding Process Improvement

The three basic steps are understanding, assessing, and packaging:

- Understanding and characterizing establishes where you are.
- Assessing involves learning what works and what doesn't.
- Packaging involves making what you have learned a part of your business.
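As an illustration of tracking actuals against estimates, here is a minimal sketch. The milestones, effort figures, and the 15 percent variance threshold are all illustrative assumptions, not prescribed values.

```python
# A minimal sketch of tracking actual effort against estimates per
# milestone and flagging large overruns for possible replanning.

milestones = [
    # (milestone, estimated staff-hours, actual staff-hours)
    ("requirements", 400, 430),
    ("design",       600, 720),
    ("coding",       900, 880),
]

THRESHOLD = 0.15  # flag milestones more than 15% over estimate (illustrative)

for name, estimated, actual in milestones:
    variance = (actual - estimated) / estimated
    flag = "  <-- replan?" if variance > THRESHOLD else ""
    print(f"{name:12s} est={estimated:5d} act={actual:5d} "
          f"variance={variance:+.0%}{flag}")
```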

Key Issues for Setting Up a Program

- Understand the goals: prioritize them.
- Understand how to apply measurement: there are multiple customers for the results.
- Set expectations for change: measurement introduces change.
- Plan to achieve an early success: show the investment is worthwhile.
- Focus locally: the gain should go to the local organization.
- Start small: let the scope evolve based upon success.
- Organize the analysts separately from the developers: their goals and processes are different.
- Make sure the measures apply to the goals: don't collect data for data's sake.
- Keep the number of measures to a minimum: there is a real cost associated with measurement.
- Avoid over-reporting measurement data: make the results as crisp and clear as possible.
- Budget for the cost of the measurement program: include all costs in planning, and tailor the program to fit the goals and budget.
- Plan to spend at least three times as much on data analysis and use as on data collection: the real payoff is in the analysis and use.

Costs

- The cost of data collection should not add more than 2 percent to the software development or maintenance budget. This includes completing forms, participating in interviews, attending training sessions, and helping characterize project development.
- The data processing element of the measurement program may cost 3 to 7 percent of the total development budget. This includes collecting, archiving, validating, and maintaining the measurement data.
- The analysis element of the measurement program may cost 5 to 15 percent of the total project budget. This includes design of studies, information analysis, project interaction, and packaging.

(A rough budgeting sketch applying these percentages appears after the Core Measures list below.)

Core Measures

- Cost
  - reporting period dates
  - total effort
  - effort by development and maintenance activity
- Errors
  - dates the error was reported and corrected
  - effort to isolate and correct the error
  - source and class of the error
- Process characteristics
  - identification of the programming language
  - indication of the use of significant processes
  - description of measurement study goals
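Here is the rough budgeting sketch referred to above, applying the stated cost percentages to a hypothetical development budget of $2,000,000.

```python
# Applying the measurement-program cost guidelines to a hypothetical
# $2,000,000 development budget. The budget figure is illustrative;
# the percentages are the guidelines from this section.

budget = 2_000_000

collection = budget * 0.02                     # data collection: ~2%
processing = (budget * 0.03, budget * 0.07)    # data processing: 3-7%
analysis   = (budget * 0.05, budget * 0.15)    # analysis and use: 5-15%

print(f"Data collection:  ${collection:,.0f}")
print(f"Data processing:  ${processing[0]:,.0f} - ${processing[1]:,.0f}")
print(f"Analysis and use: ${analysis[0]:,.0f} - ${analysis[1]:,.0f}")
```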

Core Measures (continued)

- Project dynamics
  - changes to requirements
  - changes to code
  - growth of code
  - predicted characteristics
- Project characteristics
  - development dates
  - total effort
  - project size
  - component information
  - software classification

Guidelines for collecting core measures (a record-structure sketch follows this list):

- Collect effort data at least monthly, preferably weekly.
- Clarify the scope of the effort collection (who, what, when, why).
- Collect defect data only for configuration-controlled software.
- Do not expect to measure error correction effort precisely.
- Do not expect to find generalized, well-defined process measures.
- Do not expect to find a database of process measurements (check reports).
- Understand the high-level process characteristics.
- Use simple definitions of life cycle phases.
- Use lines of code to represent size.
- Specify which software is to be counted.
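As an illustration, here is a minimal sketch of record structures for the core measures above. Every class and field name is an assumption to be tailored to your own forms and definitions.

```python
# A minimal sketch of record structures for the core measures.
# All names are illustrative assumptions, not a standard schema.
from dataclasses import dataclass
from datetime import date

@dataclass
class ErrorRecord:
    reported: date        # date the error was reported
    corrected: date       # date the error was corrected
    effort_hours: float   # effort to isolate and correct (expect imprecision)
    source: str           # e.g. "requirements", "design", "code"
    error_class: str      # e.g. "logic", "interface", "data"

@dataclass
class ProjectRecord:
    start: date
    end: date
    total_effort_hours: float
    size_loc: int         # lines of code; specify exactly what is counted
    language: str
    classification: str   # e.g. "flight software", "ground tools"
```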

Operation of a Measurement Program

- Do not expect to automate data collection.
- Make providing data easy.
- Use commercially available tools.
- Expect measurement data to be flawed, inexact, and inconsistent.

Example Goals: Understanding

Language Evolution
- Goal: Analyze the project set to characterize the language usage trend, from the point of view of the organization.
- Measures: project dates, sizes, and languages.

Project Profiles
- Goal: Analyze the project set to characterize the levels and trends of code reuse, from the point of view of the organization.
- Measures: project dates, sizes, and percentage of reuse; total effort on each project.

Cost vs. Size
- Goal: Analyze the project set to characterize the cost of reusing code versus the cost of producing new code, from the point of view of the organization.
- Measures: project size, dates, reuse, and effort data.
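The example goals in this and the following slides share a common template: analyze an object, for a purpose, with respect to a focus, from a point of view. A minimal sketch of representing goals this way, so they can be catalogued alongside their measures (all class and field names are illustrative):

```python
# A minimal sketch of a structured goal record matching the template
# used by the example goals. Names here are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class MeasurementGoal:
    obj: str        # what is analyzed, e.g. "the project set"
    purpose: str    # e.g. "characterize"
    focus: str      # e.g. "the language usage trend"
    viewpoint: str  # e.g. "the organization"
    measures: list[str] = field(default_factory=list)

    def statement(self) -> str:
        return (f"Analyze {self.obj} to {self.purpose} {self.focus} "
                f"from the point of view of {self.viewpoint}.")

goal = MeasurementGoal(
    "the project set", "characterize", "the language usage trend",
    "the organization",
    measures=["project dates", "sizes", "languages"],
)
print(goal.statement())
```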

Example Goals: Understanding (continued)

Effort Distribution
- Goal: Analyze the project set to characterize the cost of each life-cycle phase and the characteristics of staffing profiles, from the point of view of the organization.
- Measures: project phase dates, effort data, and developer activity data.

Cost of Major Activities
- Goal: Analyze the project set to characterize the cost of maintenance, documentation, quality assurance, and configuration management, from the point of view of the organization.
- Measures: developer activity data, effort, and software size.

Defect Rate
- Goal: Analyze the project set to characterize the average rate of uncovering defects, the defect rate in delivered software, and which life-cycle phases yield the most defects, from the point of view of the organization (a small sketch follows this slide).
- Measures: project size, phase dates, and reported defects.

Error Classes
- Goal: Analyze the project set to characterize which types of defects occur most often, from the point of view of the organization.
- Measures: reported defect information.

Defects vs. Size and Complexity
- Goal: Analyze the project set to characterize the relationship between defect rates and module size and complexity, from the point of view of the organization.
- Measures: error reports by module, module size, and module complexity.

Growth Rate Dynamics
- Goal: Analyze the project to characterize the local rate of code production, from the point of view of the organization.
- Measures: phase dates and weekly count of completed code.
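As a small illustration of the Defect Rate goal, this sketch computes an overall defect rate and the per-phase breakdown. The counts and project size are hypothetical.

```python
# A minimal sketch of the Defect Rate goal: which phases yield the most
# defects, and the overall rate per KLOC. All numbers are hypothetical.
from collections import Counter

size_kloc = 40
defects_by_phase = Counter({
    "design": 35, "coding": 120, "integration test": 60, "system test": 25,
})

total = sum(defects_by_phase.values())
print(f"Overall rate: {total / size_kloc:.1f} defects per KLOC")
for phase, count in defects_by_phase.most_common():
    print(f"{phase:17s} {count:4d} ({count / total:.0%})")
```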

Example Goals: Managing

Projected Cost, Schedule, and Phases
- Goal: Estimate cost, schedule, effort, and defects. Analyze the project characteristics to predict the cost, schedule, effort, and defects, from the point of view of the project manager.
- Measures: project size estimate, project characteristics, models, and relationships.

Project Dynamics
- Goal: Estimate cost, schedule, effort, and defects. Analyze the project characteristics to predict the expected growth rate, change rate, and defect rate of source code, from the point of view of the manager.
- Measures: project size estimate, project characteristics, models, and relationships.

Example Goals: Tracking

Tracking Code Production
- Goal: Analyze the project code growth to track it against the estimated code growth, from the point of view of the project manager (a small sketch follows this slide).
- Measures: biweekly count of source library size, and the manager's updated at-completion estimates.

Tracking Software Change Stability
- Goal: Analyze the project requirements and design changes to monitor the project's stability, from the point of view of the project manager.
- Measures: changes to source code and the manager's predicted estimates.

Tracking Staff Effort for Possible Replanning
- Goal: Analyze the project staffing profile to monitor it and replan staffing as appropriate, from the point of view of the manager.
- Measures: changes to source code and the manager's predicted estimates.
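A minimal sketch of the Tracking Code Production goal: compare biweekly source-library sizes against straight-line growth toward the at-completion estimate. The numbers, and the assumption of linear planned growth, are illustrative only.

```python
# A minimal sketch of tracking code production against an estimate.
# All figures are hypothetical; planned growth is assumed linear here,
# though real plans often use a nonlinear growth profile.

estimate_at_completion = 40_000   # manager's at-completion size estimate (LOC)
planned_periods = 10              # biweekly reporting periods to delivery
actual_sizes = [3_500, 7_200, 10_100, 12_800]  # observed library sizes

for period, actual in enumerate(actual_sizes, start=1):
    planned = estimate_at_completion * period / planned_periods
    print(f"period {period:2d}: planned {planned:7,.0f}  "
          f"actual {actual:7,}  delta {actual - planned:+,.0f}")
```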

Example Goals: Tracking (continued)

Tracking Test Progress
- Goal: Analyze the project's closed vs. open defect profiles to estimate test progress, from the point of view of the manager (a small sketch follows this slide).
- Measures: failure report data and change data.

Tracking Software Defect Quality
- Goal: Analyze the project's defect profiles to assess defect quality, from the point of view of the manager.
- Measures: defect report data, historical models, and size estimates.

Checking Whether Cleanroom Is Being Used
- Goal: Analyze the project library's code growth to check for the Cleanroom code growth characteristics, from the point of view of the manager.
- Measures: project phase date estimates, completed source code, and historical models.

Example Goals: Guiding Improvement

Checking Whether Cleanroom Is Effective
- Goal: Analyze the Cleanroom method as applied to several projects to evaluate it with respect to effort, effort distribution, software size, and number of defects, from the point of view of the organization.
- Measures: effort, effort distribution, software size, size growth, and number and types of defects.

Checking Whether Independent Validation and Verification Is Effective
- Goal: Analyze Independent Validation and Verification as applied to several projects to evaluate it with respect to cost and extra defects uncovered, from the point of view of the organization.
- Measures: effort, and number and types of defects.
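A minimal sketch of the Tracking Test Progress goal: from failure report data, count open versus closed defects as of a given date. The dates are hypothetical stand-ins for real defect-tracking records.

```python
# A minimal sketch of an open vs. closed defect profile for tracking
# test progress. All dates are hypothetical.
from datetime import date

# (date reported, date closed or None if still open)
reports = [
    (date(2024, 3, 4), date(2024, 3, 8)),
    (date(2024, 3, 5), None),
    (date(2024, 3, 11), date(2024, 3, 20)),
    (date(2024, 3, 12), None),
]

as_of = date(2024, 3, 21)
open_count = sum(1 for _, closed in reports if closed is None or closed > as_of)
closed_count = len(reports) - open_count
print(f"As of {as_of}: {closed_count} closed, {open_count} open")
```

A falling open count against a rising closed count is the usual sign that testing is converging; a stubbornly flat open count suggests the opposite.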

Experience-Based Guidelines

- Data collection should not be the dominant element of process improvement; application of the measures is the goal.
- The focus of a measurement program must be self-improvement, not external comparison.
- Measurement data are fallible, inconsistent, and incomplete.
- The capability to quantify a process or product with measurement data is limited by the abilities of the analysts.
- Personnel treat measurement as an annoyance, not as a significant threat.
- Automation of measurement has limits.