Essential Metrics for Agile Project Management




Metrics for the transformational age
Essential Metrics for Agile Project Management
Alex Birke, Agile World 2015
Accenture, its logo, and 'High Performance. Delivered.' are trademarks of Accenture.

Why Metrics? "They help [...] to make decisions." (Eric Ries, Lean Startup)

Wait! Aren't Scrum's metrics, Burndown and Velocity, good enough?

Metrics especially for the current transformational decade. There are occasions when you are truly agile and can get rid of most metrics (lucky you!). But the fact is: many companies out there are still not agile enough.

What makes a good agile metric?
- As many as required, as few as possible
- Leading indicators over trailing indicators
- Measure results, not activity
- Assess trends, not snapshots
- Minimize overhead

A Set of Core Metrics for Agile Project Management*
- Scope: Scope Volatility
- Cost: Story Point Effort Performance Indicator
- Schedule: Release Slippage Risk Indicator, Sprint Burndown Performance Indicator, Release Burnchart Performance Indicator
- Quality: Running Tested Features, Cost of Rework
- Agile Benefits: Time to market, Business value delivered, User satisfaction, Employee engagement

*) Project management is agnostic of technology or domain

Scope Volatility (SV)

Definition: Scope Volatility depicts the amount of change in the size of the release scope, comparing the release scope size measured at the start of the release with the size after the last completed sprint.

Calculation:

    SV = (Current Size of Must-Have Scope - Initial Size of Must-Have Scope) / Initial Size of Must-Have Scope * 100

SV > 0: scope creep; SV < 0: scope drop; SV = 0: planned scope size retained (often defined as a corridor).

Alternative metric: Changed Scope %, which measures more than just the net difference in size.

Scope Volatility example

[Figure: Release Burn-Up Chart over Sprints 0-13. The Must-Have scope grows from 1143 SP to 1273 SP (Must-Have + Should-Have: 1330 SP), with annotations "Rating epics de-scoped" and "2 new Billing epics added".]

In the example above, Scope Volatility = (1273 SP - 1143 SP) / 1143 SP * 100 ≈ 11.4, i.e. about 11% scope creep.
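As a minimal sketch, the SV formula can be written as a small helper (the function name is ours; the figures are the ones from the burn-up example):

```python
def scope_volatility(initial_sp, current_sp):
    """Scope Volatility in percent: > 0 scope creep, < 0 scope drop."""
    return (current_sp - initial_sp) / initial_sp * 100

# Must-Have scope grew from 1143 SP at release start to 1273 SP:
print(round(scope_volatility(1143, 1273), 1))  # 11.4 -> scope creep
```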

Story Point Effort Performance Indicator (SPEPI)

Definition: SPEPI (akin to CPI, but sprint-wise) indicates whether the ongoing release is currently on budget, by comparing the planned effort per story point with the actual effort per story point.

Calculation:

    SPEPI = (Planned effort to be spent in the release so far / Planned story points delivered in the release so far)
          / (Actual effort spent on the release so far / Actual story points delivered in the release so far)

SPEPI = 1: as planned; < 1: cost overrun; > 1: under budget.

Story Point Effort Performance Indicator example

SPEPI = (875 / 630) / (957 / 490) ≈ 0.71, which indicates a cost overrun.
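The same calculation as a sketch in Python (function name is an assumption; the figures come from the example above):

```python
def spepi(planned_effort, planned_sp, actual_effort, actual_sp):
    """Planned effort per story point divided by actual effort per story point.
    1 = on budget, < 1 = cost overrun, > 1 = under budget."""
    return (planned_effort / planned_sp) / (actual_effort / actual_sp)

print(round(spepi(875, 630, 957, 490), 2))  # 0.71 -> cost overrun
```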

Release Slippage Risk Indicator (RSRI)

Definition: RSRI indicates whether at least a minimal viable release (MVR) can be deployed to production on the scheduled date.

Calculation:

    RSRI = Past Sprint Productivity (P) / Required Sprint Productivity for the release date (RP)

RSRI = 1: as planned; < 1: delayed; > 1: ahead of plan.

Sprint Productivity (P)

Definition: Productivity is measured as generated value, in story points completed per person day.

Calculation:

    P = # story points [SP] completed in a sprint / # person days in a sprint

Required Productivity (RP)

Definition: RP is the productivity the team would need in the remaining sprints to complete at least the remaining Must-Have user stories, so that a Minimal Viable Release (MVR) can be deployed to production.

Calculation:

    RP = SP estimate of remaining Must-Have stories / Total planned effort in remaining sprints

Required Productivity example

Required Velocity = (3 SP + 3 SP + 8 SP + 3 SP) / 3 sprints = 17 / 3 ≈ 5.67 SP per sprint

Assuming 105 hours per upcoming sprint: RP = 5.67 SP / 105 hours ≈ 0.054 SP/hour

Release Slippage Risk Indicator example

RSRI = Past Productivity / Required Productivity = (5 SP / 90 hours) / (5.67 SP / 105 hours) = 0.056 / 0.054 ≈ 1.03, i.e. slightly ahead of plan.
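Putting P, RP, and RSRI together as a sketch (function names are ours; figures from the slides: 5 SP done in 90 hours, and 17 remaining Must-Have SP over 3 sprints of 105 hours each). Note that without intermediate rounding the ratio comes out at about 1.03 rather than the 1.02 shown on the slide:

```python
def productivity(sp_done, effort):
    """Past sprint productivity P: story points per unit of effort."""
    return sp_done / effort

def required_productivity(remaining_must_have_sp, remaining_effort):
    """RP: story points per unit of effort needed to finish the Must-Haves."""
    return remaining_must_have_sp / remaining_effort

def rsri(p, rp):
    """1 = as planned, < 1 = delayed, > 1 = ahead of plan."""
    return p / rp

p = productivity(5, 90)                          # past sprint: 5 SP in 90 h
rp = required_productivity(3 + 3 + 8 + 3, 3 * 105)  # 17 SP over 3 x 105 h
print(round(rsri(p, rp), 2))  # 1.03 -> MVR slightly ahead of plan
```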

Release BurnChart Performance Indicator (RBPI)

Definition: RBPI indicates whether a project is on schedule, by showing the variation of the amount of completed work against the amount of work planned.

Calculation:

    RBPI = Story points fully completed so far in the release / Story points planned so far in the release

RBPI = 1: as planned; < 1: delayed; > 1: ahead of plan.

Release BurnChart Performance Indicator example

[Figure: Release Burn-Up Chart over Sprints 0-13, showing the planned ideal burn-up per sprint (0, 20, 60, 160, 280, 400, 560, 720, 820, 920, 1015, 1110, 1210, 1330 SP) against actual completed work, with annotations "Rating epics de-scoped", "Summary Reports added", and "2 new Billing epics added".]

In the example above, at the end of Sprint 7: RBPI = 490 / 720 = 0.68, which indicates that fewer user stories were done than planned.
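The ratio itself is a one-liner; a minimal sketch with the Sprint 7 figures from the burn-up example (the function name is ours):

```python
def rbpi(completed_sp, planned_sp):
    """1 = on schedule, < 1 = delayed, > 1 = ahead of plan."""
    return completed_sp / planned_sp

# End of Sprint 7: 490 SP fully completed vs. 720 SP planned by then.
print(round(rbpi(490, 720), 2))  # 0.68 -> behind schedule
```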

Sprint Burndown Performance Indicator (SBPI)

Definition: SBPI shows the deviation of completed work from planned work for the current sprint ("sprint scope variance").

Calculation:

    SBPI = Ideal remaining work / Actual remaining work

SBPI = 1: as planned; < 1: delayed; > 1: ahead of plan.

Sprint Burndown Performance Indicator example

[Fig. 1: Task Burndown (effort-based). Fig. 2: User Story Burndown (story-point-based).]

Effort-based: SBPI = 120 hours / 160 hours = 0.75 (i.e. -25% deviation)
Story-point-based: SBPI = 40 SP / 50 SP = 0.80 (i.e. -20% deviation)
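As a sketch, the same formula works for both the effort-based and the story-point-based burndown (figures from the two example charts; function name is ours):

```python
def sbpi(ideal_remaining, actual_remaining):
    """1 = as planned, < 1 = delayed, > 1 = ahead of plan."""
    return ideal_remaining / actual_remaining

print(round(sbpi(120, 160), 2))  # 0.75 -> effort-based, -25% deviation
print(round(sbpi(40, 50), 2))    # 0.8  -> story-point-based
```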

Running Tested Features* (RTF)

Definition: RTF depicts the share of features that are still working (running) out of all features built to date.

Calculation:

    RTF = (# completed user stories that still pass all acceptance tests / total # of completed user stories to date) * 100

*) "A Metric Leading to Agility", Ron Jeffries

Running Tested Features example

In Sprint 6 the RTF rate was only 71%. This indicates that the remaining 29% of the user stories built and tested to date are no longer working and cannot be deployed to production.
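A minimal sketch of the RTF calculation; the slide does not give the underlying story counts, so the 25-of-35 figures below are hypothetical values chosen to reproduce the 71%:

```python
def rtf(still_passing, completed_to_date):
    """Percentage of completed stories that still pass all acceptance tests."""
    return still_passing / completed_to_date * 100

# Hypothetical Sprint 6 counts: 25 of 35 completed stories still pass.
print(round(rtf(25, 35)))  # 71 -> 29% of delivered stories are broken
```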

Cost of Rework: Delivered Defect Density (DDD)

Definition: DDD indicates the effectiveness of the review and testing activities, by measuring how many defects are found in the delivered product (increment) after its user stories were declared Done.

Calculation:

    DDD = Defects identified after Done-ness of user stories / Size of Done user stories in SP

Alternative: the Defect Rate metric is the # of defects identified in completed user stories divided by the total effort spent on all tasks to date.

Delivered Defect Rate (DDR) example

                        Sprint 1   Sprint 2   Sprint 3   Sprint 4
    Defects identified      7         12         19         25
    Eng. effort (cum.)    400        831       1262       1685
    Defect Rate             0.018      0.014      0.015      0.015

- Count only the defects logged after the story is marked complete by the developer.
- Engineering effort includes effort from the agile lifecycle tool for design, build, test, and defect-fix tasks, plus PO & SM time as a percentage of completed stories.

In the example above, the DDR trend is stable; therefore the delivered quality is stable.
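The per-sprint rates in the table can be reproduced with a small sketch (function name is ours; the cumulative defect and effort figures are taken from the table):

```python
def defect_rate(defects_after_done, engineering_effort):
    """Defects found after Done-ness per unit of engineering effort."""
    return defects_after_done / engineering_effort

# Cumulative figures from the table above, Sprints 1-4:
for defects, effort in [(7, 400), (12, 831), (19, 1262), (25, 1685)]:
    print(round(defect_rate(defects, effort), 3))
```

The rates hover around 0.015 defects per effort unit across sprints, which is what the slide reads as a stable trend.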

A Set of Core Metrics for Agile Project Management*
- Scope: Scope Volatility
- Cost: Story Point Effort Performance Indicator
- Schedule: Release Slippage Risk Indicator, Sprint Burndown Performance Indicator, Release Burnchart Performance Indicator
- Quality: Running Tested Features, Cost of Rework
- Agile Benefits: Time to market, Business value delivered, User satisfaction, Employee engagement

*) Project management is agnostic of technology or domain

Some further, sometimes helpful, metrics:
- Stakeholder Involvement Index
- Test Automation Coverage
- % Stories accepted
- (Retrospective) Process improvement
- Epic Progress Report

Questions & Answers