2004 IFPUG Conference
Performance Measurement of Software Application Development & Maintenance
The David Consulting Group
www.davidconsultinggroup.com
Measurement Must Consider

[Diagram: Continuous Process Improvement and Strategic Positioning (Business & Technical) drive business outcomes such as Deliver Value, Improve Productivity, Reduce Time to Market, Increase Quality, Satisfy Customer, Improve Competitive Position, Lower Costs, Increase Market Share, Improve Margins, and Increase Revenues, culminating in Shareholder Value]
Presentation Topics
- Measurement for Process Improvement
- Baseline your Performance
- Model Performance Improvements
Measurement for Process Improvement

QUANTITATIVE: measures how you are doing (Deliverable Size, Effort/Cost, Duration, Quality); yields Measured Performance.
QUALITATIVE: identifies what you are doing (Process, Methods, Skills, Tools, Environment); yields Capability Maturity.
Together they set a standard of performance: the Baseline of Performance.
What Gets Measured?

Business Related Measures (Unit Delivery Cost, Time To Market, Customer Satisfaction, Contribution): measure the impact of IT on the business.
Process Related Measures (Effectiveness, Integration, Compliance): identify trends and help to monitor progress.
Project Related Measures (Project Tracking, Estimating, Change Management): support effective, proactive use of measures.
Keys to a Measurement Program
- Measure from the user viewpoint
- Base metrics on the goals of the process being measured
- Keep measures simple & flexible
Utilize Results in Decision Making
- Improvements resulting from current and future initiatives must be measured.
- The basis for measuring improvements may include industry data and organizational baseline data.
- To improve its development practices, the organization must put a stake in the ground marking its current level of performance.
Presentation Topics
- Measurement for Process Improvement
- Baseline your Performance
- Model Performance Improvements
Baseline Activities
- Identify the sample set (typically project oriented)
- Collect baseline data (a minimal record sketch follows this list)
  - Project measures (e.g., effort, size, cost, duration, defects)
  - Project attributes (e.g., skill levels, tools, process)
- Analyze the data
  - Performance comparisons (identification of process strengths and weaknesses)
  - Industry averages and best practices
  - Performance modeling (identify high-impact areas)
- Report results
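The slide does not prescribe a data format; as a minimal sketch, one record per sampled project might look like the following (the field names are illustrative, not DCG's actual schema):

```python
from dataclasses import dataclass, field

@dataclass
class BaselineProject:
    """One project in the baseline sample set."""
    name: str
    function_points: int    # size in IFPUG function points
    effort_months: float    # total labor, person-months
    duration_months: float  # elapsed start-to-end time
    cost: float             # fully loaded cost, dollars
    delivered_defects: int  # defects found after implementation
    attributes: dict = field(default_factory=dict)  # e.g. {"skills": "high"}

    @property
    def delivery_rate(self) -> float:
        """Productivity: function points per person-month."""
        return self.function_points / self.effort_months
```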
Identify Sample Set & Collect Data

Research. MEASURES: Software Size, Level of Effort, Time to Market, Delivered Defects, Cost. CHARACTERISTICS: Skill Levels, Automation, Process, Management, User Involvement, Environment.
Analysis. PERFORMANCE LEVELS and PROFILES.
Results. Correlate performance levels to characteristics; substantiate the impact of characteristics; identify best practices.
Quantitative Data

Project data is collected for:
- start and end dates (duration)
- size, expressed in function points
- effort (labor) and cost
- defects, pre- and post-implementation

Project                         Start     End        FP    Effort   Schedule   Cost       Delivered
                                Date      Date             (Mths)   (Months)              Defects
Project abc                     8/18/03   1/9/04     122   24.08    4.75       $375,600   17
Project xyz                     3/15/03   12/10/03   111   8.63     8.75       $134,640   1
Project 123                     5/27/02   5/9/03     83    25.77    11.50      $401,958   3
Project 890                     8/15/03   10/19/03   52    5.50     2.25       $85,800    0
Capstone                        12/1/02   4/1/03     20    1.34     4.00       $20,880    6
CRS ROCI Release 4.09.02        2/19/04   3/4/04     19    1.23     0.50       $19,200    0
S-Tracker Upgrade               8/1/03    8/22/03    19    0.92     0.75       $14,400    1
Low and Grow CLIP Enhancement   3/12/03   5/28/03    12    1.27     2.50       $19,800    2
Account Status Change           9/15/03   2/9/04     12    6.15     4.75       $95,880    0
Reversal Automation             8/27/03   1/5/04     9     1.64     4.25       $25,560    4
NBR Cash Differentiation        8/5/03    1/5/04     8     3.43     4.50       $53,460    1
CROI Multi-Accounts             8/1/03    9/15/03    7     0.77     1.50       $12,000    0
Low and Grow CLIP Delay         11/13/02  1/19/03    7     0.35     2.00       $5,400     1
Dialer Redesign                 12/22/02  5/18/03    6     0.69     5.00       $10,800    2
CROI Enhancements (Feb 03)      2/2/03    2/13/03    3     0.25     0.40       $3,960     0
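The derived rates used later in the deck fall out of these raw measures by simple division. A sketch, using two rows from the table above:

```python
projects = [
    # (name, FP, effort in person-months, cost, delivered defects)
    ("Project abc", 122, 24.08, 375_600, 17),
    ("Project 890",  52,  5.50,  85_800,  0),
]

for name, fp, effort, cost, defects in projects:
    print(f"{name}: "
          f"{fp / effort:.1f} FP/person-month, "
          f"${cost / fp:,.0f} per FP, "
          f"{defects / fp:.3f} defects per FP")
# Project abc: 5.1 FP/person-month, $3,079 per FP, 0.139 defects per FP
# Project 890: 9.5 FP/person-month, $1,650 per FP, 0.000 defects per FP
```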
Analyze Project Attributes

MANAGEMENT: Team Dynamics, High Morale, Project Tracking, Project Planning, Automation, Management Skills
DEFINITION: Clearly Stated Requirements, Formal Process, Customer Involvement, Experience Levels, Business Impact
DESIGN: Formal Process, Rigorous Reviews, Design Reuse, Customer Involvement, Experienced Development Staff, Automation
BUILD: Code Reviews, Source Code Tracking, Code Reuse, Data Administration, Computer Availability, Experienced Staff
TEST: Formal Testing Methods, Test Plans, Development Staff Experience, Effective Test Tools, Customer Involvement, Automation
ENVIRONMENT: New Technology, Automated Process, Adequate Training, Organizational Dynamics, Certification
Software Practices Profile

Profile scores reflect the quality of the development practices used on a given project. Six categories are evaluated and scored; the higher the score, the higher the probability of a successful delivery.

Project     Profile Score   Mgmt    Def     Des     Build   Test    Env
Project 1   75.9            84.09   76.92   77.27   65.38   81.25   65.38
Project 2   60.3            68.18   64.10   56.82   61.54   59.38   50.00
Project 3   60.0            61.36   33.33   81.82   57.69   65.63   65.38
Project 4   54.5            61.36   64.10   45.45   61.54   50.00   42.31
Project 5   39.4            68.18   20.51   50.00   53.85   25.00   46.15
Project 6   35.3            38.64   20.51   15.91   69.23   43.75   38.46
Project 7   31.4            36.36   30.77   4.55    42.31   43.75   46.15
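The deck does not show how the six category scores roll up into the overall profile score. A simple average comes close but does not match exactly (Project 1 averages 75.0 against a reported 75.9), so the actual scheme presumably weights the categories. A sketch assuming equal weights:

```python
def profile_score(mgmt, definition, design, build, test, env):
    """Overall software-practices profile, assuming equal category
    weights -- an illustration only; DCG's actual weights may differ."""
    return sum((mgmt, definition, design, build, test, env)) / 6

print(round(profile_score(84.09, 76.92, 77.27, 65.38, 81.25, 65.38), 1))
# 75.0 (the slide reports 75.9 for Project 1)
```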
Strengths And Opportunities (An Example)

Definition Strengths:
- Requirements are clearly stated and stable
- Development and customers are experienced in the applications

Opportunities for Improvement:
- A more formal requirements-gathering process on larger projects
- More consistent use of prototyping on larger projects
- A formal review process
Establish A Baseline
- Size is expressed in terms of functionality delivered to the user.
- A representative selection of projects is measured.
- Rate of delivery is a measure of productivity.

[Chart: Organizational Baseline, plotting Software Size (function points, 0 to 2,200) against Rate of Delivery (function points per person-month, 0 to 36)]
Baseline Data Measures Performance

[Diagram: IT delivers a product to the customer; the deliverable is assessed against performance indicators and industry baselines]

FUNCTIONALITY DELIVERED: Functional Size; Business Value, Operational Efficiency, Ability to Compete, Customer Satisfaction
PRODUCTIVITY MEASURES: Duration, Cost, Quality; Delivery Rate, Time to Market, Cost per Unit of Work, Defect Density
COMPARATIVE ANALYSIS: Average and Best in Class Performance Levels; Performance Comparison, Identification of Risks, Profile of Performance, Best Practices Identification
Evaluating Performance Without Size

Project        Cost (000s)   Quality (Defects Released)
PO Special     $500          12
Vendor Mods    $760          18
Pricing Adj.   $ 80          5
Store Sys.     $990          22

Which project produced the best results?
Evaluating Performance With Size

Project        Size    Cost (000s)   Rate      Quality (Defects Released)   Density
PO Special     250     $500          $2,000    12                           .048
Vendor Mods    765     $760          $  993    18                           .023
Pricing Adj.   100     $ 80          $  800    5                            .050
Store Sys.     1498    $990          $  660    22                           .014

Size: function points. Rate: cost per function point. Density: delivered defects per function point.
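The normalization itself is simple division, which is the point: once size is known, cost and quality become comparable across projects of very different scale. For example, Store Sys. has the worst raw cost and defect counts but the best unit cost:

```python
# Store Sys.: largest raw cost and defect count, but cheapest per FP.
size_fp, cost, defects = 1498, 990_000, 22
print(f"${cost / size_fp:,.0f} per FP")           # $661 per FP (the slide truncates to $660)
print(f"{defects / size_fp:.3f} defects per FP")  # 0.015 defects per FP (the slide shows .014)
```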
Compare to Industry Benchmarks

[Chart: industry baseline performance, Software Size (function points, 0 to 2,200) vs Rate of Delivery (function points per person-month, 0 to 36)]
Function Points Per Person Month

Average of recent projects across different platforms:
Client Server: 17
Main Frame: 13
Web: 25
e-business Web: 15
Vendor Packages: 18
Data Warehouse: 9
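These rates turn a size estimate into a first-cut effort estimate by division; a sketch with a hypothetical 500 FP web project (the project and its size are illustrative, not from the deck):

```python
rates = {"Client Server": 17, "Main Frame": 13, "Web": 25,
         "e-business Web": 15, "Vendor Packages": 18, "Data Warehouse": 9}

size_fp = 500                    # hypothetical new Web project
effort = size_fp / rates["Web"]  # person-months at the benchmark rate
print(f"{effort:.0f} person-months")  # 20 person-months
```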
Function Points Supported By One FTE

Average corrective-maintenance support provided by one FTE:
Client Server: 642
Main Frame: 978
Web: 756
e-business Web: 438
Vendor Packages: 740
Data Warehouse: 392
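Likewise, these support figures convert a maintained portfolio size into a staffing estimate; a sketch with a hypothetical 3,900 FP mainframe portfolio:

```python
import math

supported_per_fte = {"Client Server": 642, "Main Frame": 978, "Web": 756,
                     "e-business Web": 438, "Vendor Packages": 740,
                     "Data Warehouse": 392}

portfolio_fp = 3_900  # hypothetical mainframe application portfolio
ftes = portfolio_fp / supported_per_fte["Main Frame"]
print(f"{math.ceil(ftes)} FTEs")  # 4 FTEs (3,900 / 978 = 3.99, rounded up)
```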
Track Improvements

[Chart: Track Progress; the Year 2 grouping of projects plotted against the Software Size (function points, 0 to 2,200) vs Rate of Delivery (function points per person-month, 0 to 36) baseline]
Presentation Topics
- Measurement for Process Improvement
- Baseline your Performance
- Model Performance Improvements
Model Improvements
- Model the impact of implementing selected process improvements.
- Evaluate the impact on productivity.
- Modeling is performed from several perspectives: Management, Design, Definition, Build, Test, and Environment improvements, plus SEI CMM-specific improvements.

EXAMPLE: Improvements are measured from the following baseline (a worked sketch follows):
- Average Project Size: 133 Function Points (FPs)
- Average Productivity: 10.7 FP/Effort Month (EM)
- Average Time-to-Market: 7.3 Months
- Average Cost/FP: $934.58
- Projected Delivered Defects/FP: .0301
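The deck does not give the modeled improvement percentages; as an illustration of the arithmetic, here is the baseline above under an assumed 15% productivity gain (labor rate held constant, so cost scales inversely with productivity):

```python
size_fp      = 133     # average project size (baseline)
productivity = 10.7    # FP per effort-month (baseline)
cost_per_fp  = 934.58  # dollars per FP (baseline)

gain = 0.15            # assumed 15% productivity improvement
new_productivity = productivity * (1 + gain)

baseline_effort = size_fp / productivity      # 12.4 effort-months
improved_effort = size_fp / new_productivity  # 10.8 effort-months
baseline_cost   = size_fp * cost_per_fp       # $124,299
improved_cost   = baseline_cost / (1 + gain)  # $108,086

print(f"effort: {baseline_effort:.1f} -> {improved_effort:.1f} effort-months")
print(f"cost:   ${baseline_cost:,.0f} -> ${improved_cost:,.0f}")
```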
Modeled Improvements

Current improvement initiatives (SEI) are appropriately targeted at the majority of weak spots revealed by the baseline results.
Strive for Continuous Improvement

[Chart: measured baseline of PERFORMANCE (productivity, costs, defects, resources, time to market) and CAPABILITIES (business value, deliverables, skill levels, technology, process, risks) plotted against project size (0 to 6,400 FP), with bands for Sub Performance, Organization Baseline, Industry Averages, and Best Practices]

SOFTWARE PROCESS IMPROVEMENT: Improvement Initiatives / Best Practices