Three Things I Wish I Learned in School
www.construx.com
© 2008 Construx Software Builders, Inc. All Rights Reserved.
#1 Motion ≠ Progress
The Cost of Defects
[Chart: cost to correct a defect, by the phase in which the defect is introduced (Requirements, Design, Construction) versus the phase in which it is corrected (Requirements, Design, Construction, System test, Post-Release). A requirements defect corrected post-release costs 50-100X as much as one corrected during requirements.]
Reference: [McConnell98]
The Cost of Defects (cont)
Decades of research confirm defect-cost increase:
Fagan, Michael E. 1976. "Design and Code Inspections to Reduce Errors in Program Development." IBM Systems Journal 15, no. 3: 182-211.
Humphrey, Watts S., Terry R. Snyder, and Ronald R. Willis. 1991. "Software Process Improvement at Hughes Aircraft." IEEE Software 8, no. 4 (July): 11-23.
Leffingwell, Dean. 1997. "Calculating the Return on Investment from More Effective Requirements Management." American Programmer 10(4): 13-16.
Willis, Ron R., et al. 1998. "Hughes Aircraft's Widespread Deployment of a Continuously Improving Software Process." Software Engineering Institute/Carnegie Mellon University, CMU/SEI-98-TR-006, May 1998.
Grady, Robert B. 1999. "An Economic Release Decision Model: Insights into Software Project Management." In Proceedings of the Applications of Software Measurement Conference, 227-239. Orange Park, FL: Software Quality Engineering.
Shull, et al. 2002. "What We Have Learned About Fighting Defects." Proceedings, Metrics 2002. IEEE, pp. 249-258.
Boehm, Barry, and Richard Turner. 2004. Balancing Agility and Discipline: A Guide for the Perplexed. Boston, Mass.: Addison-Wesley.
Frequency of Defects
Requirements: 56%
Design: 27%
Code: 7%
Other: 10%
About 83% of all defects exist before a single line of code is written!
Reference: [Mogyorodi03]
Software Project Effort in Theory
Requirements    w%
Design          x%
Construction    y%
Testing         z%
Total           100%
Rework Percentage (R%)
R% = (project effort spent on rework) / (total effort spent on project)
Estimates of defect rework ranged from 30 to 80 percent of total development effort (when no form of peer review is used) [Fagan96]
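As a minimal sketch of the arithmetic (the effort figures are hypothetical, not from the deck):

```python
def rework_percentage(rework_effort, total_effort):
    """R%: share of total project effort spent reworking defects."""
    return 100.0 * rework_effort / total_effort

# Hypothetical project: 5,100 hours of rework out of 10,000 total hours.
print(rework_percentage(5_100, 10_000))  # 51.0
```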
Software Project Effort in Practice
Requirements    w%
Design          x%
Construction    y%
Testing         z%
Rework (R%)     51%
Total           100%
Rework is not only the single largest driver of cost and schedule on a typical software project; it is larger than all the others combined!
Early Defect Detection is Key
[Chart: the same 50-100X cost-of-defects chart, annotated "Fix Here" over the early phases and "Not Here" over system test and post-release: correct a defect in the phase in which it was introduced, while the cost of correction is still low.]
Inspections
Critical examinations of software deliverables
Performed by technically competent individuals who did not produce the deliverable being considered
Identify defects as early in the software lifecycle as possible
Defects can be removed while the cost of removal is still low
Testing vs. Inspections

                                      Testing     Inspections
Effectiveness (% defects found)       60-70%      60-90%
Efficiency (hours/defect found)       6-8         0.8-1.2
Cost to repair (hours/defect fixed)   3.4-27.2    1.1-2.7

References: [Jones99], [O'Neill97], [Russell91], [Wagner06]
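A back-of-the-envelope sketch of what these ranges imply, using the midpoint of each range (illustrative arithmetic only, not figures from the cited studies):

```python
# Midpoints of the ranges in the table above (hours per defect); illustrative only.
TESTING = {"find": 7.0, "fix": 15.3}
INSPECTIONS = {"find": 1.0, "fix": 1.9}

def find_and_fix_hours(technique, defect_count):
    """Total hours to find and repair the given number of defects."""
    return defect_count * (technique["find"] + technique["fix"])

print(find_and_fix_hours(TESTING, 100))      # 2230.0 hours
print(find_and_fix_hours(INSPECTIONS, 100))  # 290.0 hours
```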
#2 Estimate ≠ Commitment
Estimates Have Probabilities
There is no such thing as a single-point estimate that is correct/meaningful
All estimates include probabilities, stated or implied (even if the estimator doesn't know it)
Estimate probabilities are not symmetrically distributed about a mean
[Chart: probability of a specific schedule outcome vs. schedule, with the nominal project schedule marked as the 50/50 point.]
Reference: [Tockey05]
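To make the asymmetry concrete, here is a hedged sketch (the lognormal model and its parameters are assumptions, not taken from [Tockey05]) showing why the 50/50 schedule is not the expected schedule:

```python
import random
import statistics

# Assumption: model schedule outcomes with a right-skewed lognormal distribution,
# since schedule risk is bounded below but effectively unbounded above.
random.seed(1)
outcomes_months = [random.lognormvariate(2.3, 0.4) for _ in range(100_000)]

median = statistics.median(outcomes_months)  # the nominal 50/50 schedule
mean = statistics.mean(outcomes_months)      # pulled higher by the long right tail
print(f"50/50 schedule: {median:.1f} months, expected schedule: {mean:.1f} months")
```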
Sources of Estimation Uncertainty
Will the customer want Feature X?
Will the customer want the cheap or the expensive version of Feature X?
If you implement the cheap version of Feature X, will the customer later want the expensive version after all?
How will Feature X be designed?
How long will it take to debug and correct mistakes made in implementing Feature X?
Measuring Estimate Uncertainty
Project: Framus

Milestone        Estimate   Act/Est
Scope defined    7 mos      1.43
Requirements     11 mos     0.91
Design           10 mos     1.00
Code             9 mos      1.11
Test             9.5 mos    1.05
Project          10 mos     1.00
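A small sketch of how the Act/Est column is derived: divide what the project actually took by the estimate made at each milestone (figures are the Framus example from the table above):

```python
# Estimates (in months) made at successive milestones of the Framus example.
estimates_months = {
    "Scope defined": 7.0,
    "Requirements": 11.0,
    "Design": 10.0,
    "Code": 9.0,
    "Test": 9.5,
}
actual_months = 10.0  # what the project actually took

for milestone, estimate in estimates_months.items():
    print(f"{milestone:15s} est {estimate:4.1f} mos  act/est = {actual_months / estimate:.2f}")
```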
Relative Error in Estimates
[Chart: relative error in the project schedule estimate (vertical axis from 0.6x to 1.6x) at each milestone: Scope defined, Requirements, Design, Code, Test, Project. The band narrows toward 1.0x as the project progresses.]
Relative Error in Estimates (cont)
[Chart: relative value (0.0 to 2.0) of each estimate at the milestone where it was made: Scope defined, Requirements, Design, Code, Testing, Actual.]
The Cone of Uncertainty
[Chart: estimate ranges for project scope (effort, size, features) and project schedule, relative to the eventual actual, at successive milestones: Initial product definition (0.25x-4x scope, 0.6x-1.6x schedule), Approved product definition (0.5x-2x, 0.8x-1.25x), Requirements specification, Product design specification, Detailed design specification, and Product, narrowing to 1.0x at the completed product.]
Source: Software Cost Estimation with Cocomo II (Boehm 2000)
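A hedged sketch of how the cone's schedule multipliers can be applied to a single-point estimate (the multipliers are the Cocomo II cone values shown above; the helper function and the 10-month figure are assumptions for illustration):

```python
# Cone of Uncertainty schedule multipliers (Boehm 2000), as read off the chart above.
SCHEDULE_CONE = {
    "Initial product definition":   (0.60, 1.60),
    "Approved product definition":  (0.80, 1.25),
    "Requirements specification":   (0.85, 1.15),
    "Product design specification": (0.90, 1.10),
}

def schedule_range(nominal_months, milestone):
    """Turn a single-point schedule estimate into a range for the current milestone."""
    low, high = SCHEDULE_CONE[milestone]
    return nominal_months * low, nominal_months * high

print(schedule_range(10, "Initial product definition"))  # (6.0, 16.0)
print(schedule_range(10, "Requirements specification"))  # roughly (8.5, 11.5)
```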
Using Your Cone of Uncertainty
[Chart: the project's own schedule estimates, in weeks (0 to 40), plotted as ranges at each milestone: Scope defined, Requirements, Design, Code, Testing, Actual.]
Cloud of Uncertainty?
[Chart: the same project scope (0.25x-4x) and project schedule (0.6x-1.6x) axes plotted against time through Product Definition, Detailed Requirements, Product Design, and Detailed Design, but with the estimate range staying wide instead of narrowing.]
Cone of Uncertainty Given Fixed Schedule

Priority   Feature   Best-case estimate   Worst-case estimate
1          F7        9                    16
2          F3        5                    9
3          F9        6                    10
4          F2        1                    3
5          F1        3                    7
6          F5        4                    8
7          F6        6                    11
8          F10       2                    6
9          F4        5                    8
10         F8        6                    10
Cone of Uncertainty Given Fixed Schedule (cont)
[Chart: the prioritized features binned as In, Not sure, or Out at each milestone (Scope defined, Requirements, Design, Code, Testing, Actual), with the "Not sure" band shrinking as the estimates tighten.]
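One way to read the last two slides together is sketched below (a hedged illustration: the 40-unit schedule budget and the exact cut rules are assumptions, not taken from the deck). Features whose cumulative worst-case estimate fits the fixed schedule are In, features whose cumulative best-case estimate does not fit are Out, and the rest are Not sure:

```python
# Prioritized features from the earlier table: (name, best-case, worst-case) estimates.
FEATURES = [("F7", 9, 16), ("F3", 5, 9), ("F9", 6, 10), ("F2", 1, 3), ("F1", 3, 7),
            ("F5", 4, 8), ("F6", 6, 11), ("F10", 2, 6), ("F4", 5, 8), ("F8", 6, 10)]

def classify(features, budget):
    """Walk the features in priority order and bin them as In / Not sure / Out."""
    best_total = worst_total = 0
    bins = {"In": [], "Not sure": [], "Out": []}
    for name, best, worst in features:
        best_total += best
        worst_total += worst
        if worst_total <= budget:
            bins["In"].append(name)        # fits even if everything goes badly
        elif best_total <= budget:
            bins["Not sure"].append(name)  # fits only if things go well
        else:
            bins["Out"].append(name)       # does not fit even in the best case
    return bins

print(classify(FEATURES, budget=40))
# {'In': ['F7', 'F3', 'F9', 'F2'], 'Not sure': ['F1', 'F5', 'F6', 'F10'], 'Out': ['F4', 'F8']}
```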
#3 One software process does not fit all projects
One Size Does Not Fit All
[Chart: example systems arranged by extent of distribution (custom/in-house, vertical market, horizontal market) and required reliability (personal use through life-critical), spanning both applications and systems software.]
Examples range from a personal card game, mailing-label printer, sales-lead tracker, useful word-processor macro, or brochure-ware website, through desktop publishing software, video-store management software, management information systems, e-commerce sites, spreadsheets, tax programs, video-display drivers, insurance quote programs, and a realtime racetrack betting system for a single track, up to desktop and mainframe operating systems, embedded pacemaker controls, embedded fuel-injector systems, and 777 flight-control software.
Activity ≠ Phase
Software development consists of well-known activities:
Requirements
Design
Construction
Test
These activities may or may not be performed in phases
The Two Extremes: 100% Sequential vs. 100% Iterative
[Diagram: in 100% sequential development (e.g., waterfall), Requirements, Design, Construction, and System Test are each completed 100% before the next begins. In 100% iterative development (e.g., Extreme Programming), the activities are repeated in each of Iteration 1, 2, 3, 4, ... n.]
A Spectrum of Software Processes
[Diagram: lifecycles arranged along a spectrum from plan-based lifecycles (production-oriented development) to agile lifecycles (research-oriented development): Pure Waterfall, Staged Delivery, Planned Iterations, Feature Driven Development, Scrum, XP, and RUP.]
Selecting a Lifecycle
Does the lifecycle meet the project's need to:
Control cost and schedule?
Provide progress visibility?
Handle requirements or design uncertainty?
Produce reliable, maintainable products?
Deliver business value early (i.e., in stages)?
Be simple to understand and easy to use?
References
[Fagan96] Michael Fagan, Foreword in Wheeler, Brykczynski, and Meeson, Software Inspection: An Industry Best Practice, IEEE Computer Society Press, 1996
[Jones99] Capers Jones, "Software Quality in 1999: What Works and What Doesn't," Software Quality Research Inc., 1999
[McConnell98] Steve McConnell, Software Project Survival Guide, Microsoft Press, 1998
[Mogyorodi03] Gary Mogyorodi, "What is Requirements-Based Testing?," CrossTalk, March 2003
[O'Neill97] Don O'Neill, "Setting Up a Software Inspection Program," CrossTalk, February 1997
[Russell91] Glen Russell, "Experience with Inspection in Ultralarge-Scale Developments," IEEE Software, Vol. 8, No. 1, January 1991
[Tockey05] Steve Tockey, Return on Software: Maximizing the Return on Your Software Investment, Addison-Wesley, 2005
[Wagner06] Stefan Wagner, "A Literature Survey of the Quality Economics of Defect-Detection Techniques," Proceedings of the 2006 ACM/IEEE International Symposium on Empirical Software Engineering (ISESE '06), 2006
Contact Information
Construx: Delivering Software Project Success
Training
Consulting/Coaching
Software Tools
sales@construx.com
construx.com
(425) 636-0100