The Economics of Software Quality

The Economics of Software Quality
Barry Boehm, USC
Trustworthy Software Process Presentation, June 3, 2009
boehm@usc.edu (http://csse.usc.edu)

Outline
- The business case for software quality
  - Example: net value of test aids
  - Value depends on stakeholder value propositions
- What qualities do stakeholders really value?
  - Safety, accuracy, response time, ...
  - Attributes often conflict
- Software quality in a competitive world
- Conclusions

Software Quality Business Case
- Software quality investments compete for resources with investments in functionality, response time, adaptability, speed of development, ...
- Each investment option generates curves of net value and ROI:
  - Net value: NV = PV(benefit flows - cost flows)
  - Present value: PV = flows at future times, discounted back to the present
  - Return on investment: ROI = NV / PV(cost flows)
- The value of benefits varies by stakeholder and role
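
To make the slide's definitions concrete, here is a minimal Python sketch (the discount rate, flow values, and function names are mine, not from the talk): discount the benefit and cost flows, then form NV and ROI exactly as defined above.

```python
def present_value(flows, rate):
    """Discount a list of per-period flows back to time zero."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows, start=1))

def business_case(benefits, costs, rate=0.10):
    pv_benefits = present_value(benefits, rate)
    pv_costs = present_value(costs, rate)
    nv = pv_benefits - pv_costs   # NV = PV(benefit flows - cost flows)
    roi = nv / pv_costs           # ROI = NV / PV(cost flows)
    return nv, roi

# A quality investment: pay 100 up front, reap benefits in later periods.
nv, roi = business_case(benefits=[0, 60, 60, 60], costs=[100, 10, 10, 10])
print(f"NV = {nv:.1f}, ROI = {roi:.2f}")
```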

Software Testing Business Case
- Vendor proposition:
  - "Our test data generator will cut your test costs in half."
  - "We'll provide it to you for 30% of your test costs."
  - "After you run all your tests for 50% of your original costs, you're 20% ahead."
- Any concerns with the vendor proposition?
  - The test data generator is value-neutral*: every test case and every defect is treated as equally important
  - Usually, 20% of test cases cover 80% of the business value

* As are most current software engineering techniques

20% of Features Provide 80% of Value: Focus Testing on These (Bullock, 2000)
[Chart: percent of value for correct customer billing vs. customer type. A small number of customer types account for most of the value; an automated test generation tool treats all tests as having equal value.]

Value-Based Testing Provides More Net Value
[Chart: net value NV vs. percent of tests run. Value-based testing peaks at (30, 58); the test data generator only reaches (100, 20).]

% tests run | Test Data Generator (Cost, Value, NV) | Value-Based Testing (Cost, Value, NV)
0           | 30, 0, -30                            | 0, 0, 0
10          | 35, 10, -25                           | 10, 50, 40
20          | 40, 20, -20                           | 20, 75, 55
30          | 45, 30, -15                           | 30, 88, 58
40          | 50, 40, -10                           | 40, 94, 54
...         | ...                                   | ...
100         | 80, 100, +20                          | 100, 100, 0
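
The table's shape is easy to reproduce. Below is a sketch with invented numbers chosen only to echo the slide's 20/80 assumption: 100 tests worth 100 units of value in total, where the top 20 tests carry 80 units; the generator halves per-test cost but charges 30 up front and runs tests in value-neutral order.

```python
import random

# Invented Pareto-shaped test values: top 20% of tests carry 80% of value.
values = [4.0] * 20 + [0.25] * 80   # sums to 100

def nv_curve(run_order, cost_per_test, upfront=0.0):
    """Cumulative net value after each test in run_order."""
    total_value, curve = 0.0, []
    for i, v in enumerate(run_order, start=1):
        total_value += v
        curve.append(total_value - upfront - i * cost_per_test)
    return curve

value_based = nv_curve(sorted(values, reverse=True), cost_per_test=1.0)

neutral_order = values[:]
random.Random(0).shuffle(neutral_order)   # the generator ignores value
generator = nv_curve(neutral_order, cost_per_test=0.5, upfront=30.0)

print("Value-based NV after 20% of tests:", value_based[19])  # 80 - 20 = 60
print("Generator NV after all tests:", generator[-1])         # 100 - 80 = 20
```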

There is No Universal Quality-Value Metric
- Different stakeholders rely on different value attributes:
  - Protection: safety, security, privacy
  - Robustness: reliability, availability, survivability
  - Quality of service: performance, accuracy, ease of use
  - Adaptability: evolvability, interoperability
  - Affordability: cost, schedule, reusability
- Value attributes continue to tier down
  - Performance: response time, resource consumption (CPU, memory, communications)
- Value attributes are scenario-dependent
  - e.g., 5 seconds normal response time; 2 seconds in a crisis
- Value attributes often conflict, most often with performance and affordability

Major Information System Quality Stakeholders
[Diagram: stakeholder classes arranged around the information system]
- Dependents: passengers, patients
- Information suppliers: citizens, companies
- Information brokers: financial services, news media
- Information consumers: decisions, education, entertainment
- Mission controllers: pilots, distribution controllers
- Developers, acquirers, administrators

Overview of Stakeholder/Value Dependencies
[Table: strength of each stakeholder class's direct dependency on each value attribute class (Protection, Robustness, Quality of Service, Adaptability, Affordability), for Info Suppliers/Dependents, Info Brokers, Info Consumers, Mission Controllers/Administrators, and Developers/Acquirers. ** = critical; * = significant; blank = insignificant or indirect.]

Elaboration of Stakeholder/Value Dependencies
[Table: the same stakeholder classes (Info Suppliers, Dependents - mission-critical and uncritical, Info Brokers, Info Consumers, Mission Controllers/Administrators, Developers/Maintenance/Acquirers) rated against the tiered-down quality attributes: Protection (safety, security, privacy); Robustness (reliability, availability, survivability); Quality of Service (performance; accuracy and consistency; accessibility, ease of use, difficulty of misuse); Adaptability (evolvability, interoperability); Affordability (cost, schedule, reusability). ** = critical; * = significant.]

Implications for Quality Engineering
- There is no universal quality metric to optimize
- Need to identify the system's success-critical stakeholders, and their quality priorities
- Need to balance satisfaction of stakeholder dependencies
  - Stakeholder win-win negotiation
  - Quality attribute tradeoff analysis
- Need value-of-quality models, methods, and tools

Outline
- The business case for software quality
- What qualities do stakeholders really value?
- Software quality in a competitive world
  - Is market success a monotone function of quality?
  - Is quality really free?
  - Is faster, better, cheaper really achievable?
  - Value-based vs. value-neutral methods
- Conclusions

Competing on Cost and Quality (adapted from Michael Porter, Harvard)
[Chart: ROI vs. price or cost point. ROI has two peaks: the cost leader (low cost, acceptable quality) and the quality leader (acceptable cost, high quality). Between them lies "stuck in the middle"; beyond them lie "too cheap (low quality)" and "overpriced".]

Competing on Schedule and Quality: A Risk Analysis Approach
- Risk exposure: RE = Prob(Loss) * Size(Loss)
- Loss can be financial, reputation, future prospects, ...
- For multiple sources of loss: RE = Σ_sources Prob(Loss_source) * Size(Loss_source)

Example RE Profile: Time to Ship - Loss Due to Unacceptable Quality
[Chart: RE = P(L) * S(L) vs. time to ship (amount of testing). Shipping early means many defects (high P(L)) and critical defects (high S(L)); more testing leaves few defects (low P(L)) and minor defects (low S(L)), so this risk curve falls with time.]

Example RE Profile: Time to Ship - Adding Loss Due to Market Share Erosion
[Chart: a second curve rises with time to ship. Shipping late means many rivals (high P(L)) and strong rivals (high S(L)); shipping early means few, weak rivals (low P(L), low S(L)).]

Example RE Profile: Time to Ship - Sum of Risk Exposures
[Chart: summing the falling quality-risk curve and the rising market-risk curve yields a U-shaped total risk exposure with a sweet spot: the amount of testing that minimizes total RE.]
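
A toy model shows how the sweet spot emerges. The functional forms and constants below are my assumptions, not calibrated data: quality risk decays with testing time, market risk grows, and the sweet spot is the argmin of their sum, RE = Σ P(L) * S(L).

```python
import math

def re_defects(t, p0=0.9, decay=0.08, size=100):
    """Quality risk: fewer, less critical defects as testing time t grows."""
    return p0 * math.exp(-decay * t) * size

def re_market(t, growth=0.05, size=100):
    """Market risk: rivals erode share the longer you take to ship."""
    return (1 - math.exp(-growth * t)) * size

total = {t: re_defects(t) + re_market(t) for t in range(61)}
sweet_spot = min(total, key=total.get)
print(f"Sweet spot: ship after {sweet_spot} time units "
      f"(total RE = {total[sweet_spot]:.1f})")
```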

Comparative RE Profile: Safety-Critical System
[Chart: a higher S(L) for defects raises the quality-risk curve, moving the high-Q sweet spot to the right of the mainstream sweet spot: more testing before shipping.]

Comparative RE Profile: Internet Startup
[Chart: a higher S(L) for delays raises the market-risk curve, moving the low-TTM (time to market) sweet spot to the left of the mainstream sweet spot: less testing, earlier shipping.]

How Much Architecting is Enough? A COCOMO II Analysis
[Chart: percent of time added to overall schedule vs. percent of project schedule devoted to initial architecture and risk resolution, for 10, 100, and 10000 KSLOC projects. Total added schedule = architecting time + added schedule devoted to rework (the COCOMO II RESL factor); each curve has a sweet spot, which moves right as project size grows. Sweet spot drivers: rapid change pushes it leftward; high assurance pushes it rightward.]
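
A similarly toy-level version of the RESL tradeoff (the constants are invented; the real COCOMO II calibration differs): rework shrinks as architecture investment grows, and bigger projects pay more rework per unit of skipped architecture, so the sweet spot moves right with size.

```python
def total_added_schedule(arch_pct, ksloc):
    # Assumed form: rework falls hyperbolically with architecture effort
    # and grows with project size; not the calibrated RESL equation.
    rework_pct = (ksloc ** 0.3) * 18.0 / (1.0 + arch_pct)
    return arch_pct + rework_pct

for ksloc in (10, 100, 10_000):
    sweet = min(range(61), key=lambda a: total_added_schedule(a, ksloc))
    print(f"{ksloc:>6} KSLOC: sweet spot at ~{sweet}% architecting")
```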

Interim Conclusions
- It is unwise to try to compete on both cost/schedule and quality
  - Some exceptions: a major technology or marketplace edge
- There are no one-size-fits-all cost/schedule/quality strategies
- Risk analysis helps determine how much testing (prototyping, formal verification, etc.) is enough
  - Buying information to reduce risk
- It is often difficult to determine parameter values
  - Some COCOMO II values are discussed next

RELY Rating: Software Development Cost/Quality Tradeoff
(COCOMO II calibration to 161 projects)

RELY rating | Defect risk                  | Rough MTBF (mean time between failures) | Relative cost/source instruction | Example
Very High   | Loss of human life           | 100 years                               | 1.26                             | Safety-critical
High        | High financial loss          | 2 years                                 | 1.10                             | Commercial quality leader
Nominal     | Moderate recoverable loss    | 1 month                                 | 1.00                             | In-house support software
Low         | Low, easily recoverable loss | 1 day                                   | 0.92                             | Commercial cost leader
Very Low    | Slight inconvenience         | 1 hour                                  | 0.82                             | Startup demo

Quality is Free: Did Philip Crosby's book get it all wrong?
- Investments in dependable systems:
  - Cost extra for simple, short-life systems
  - Pay off for high-value, long-life systems

Software Life-Cycle Cost vs. Quality
[Chart: relative cost vs. COCOMO II RELY rating. Development cost alone rises with RELY: 0.82 (Very Low), 0.92 (Low), 1.00 (Nominal), 1.10 (High), 1.26 (Very High). Adding maintenance cost flattens the curve, since low-dependability software is cheap to build but expensive to maintain: roughly 1.23 at Very Low, falling toward 0.99-1.07 around High and Very High, with a 70% maintenance share raising the low-RELY end further.]

Software Ownership Cost vs. Quality
[Chart: adding operations reverses the curve. With operational-defect cost at Nominal dependability equal to the entire software life-cycle cost, relative cost to develop, maintain, own, and operate is about 2.55 at Very Low and 1.52 at Low, falling to roughly 0.76 (High) and 0.69 (Very High). With operational-defect cost of zero, the development-only curve applies.]
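
One way to read the life-cycle arithmetic is sketched below. The development multipliers are the slide's COCOMO II values; the maintenance multipliers and the 70% maintenance share applied here are my assumptions, so the outputs only approximate the plotted curve.

```python
develop = {"VL": 0.82, "L": 0.92, "N": 1.00, "H": 1.10, "VH": 1.26}
# Assumed: low-dependability software costs more per unit to maintain.
maint_mult = {"VL": 1.80, "L": 1.35, "N": 1.00, "H": 0.90, "VH": 0.85}

MAINT_SHARE = 0.70  # maintenance as a fraction of life-cycle effort

for rely, dev in develop.items():
    life_cycle = (1 - MAINT_SHARE) * dev + MAINT_SHARE * dev * maint_mult[rely]
    print(f"{rely:>2}: develop = {dev:.2f}, life cycle = {life_cycle:.2f}")
```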

Outline
- The business case for software quality
- What qualities do stakeholders really value?
- Software quality in a competitive world
  - Is market success a monotone function of quality?
  - Is quality really free?
  - Is faster, better, cheaper really achievable?
  - Value-based vs. value-neutral methods
- Conclusions

Cost, Schedule, Quality: Pick Any Two?
[Diagram: C, S, Q as the corners of a tradeoff triangle.]
- Alternative: consider C, S, and Q as the independent variables and the feature set as the dependent variable

Tradeoffs Among Cost, Schedule, and Reliability: COCOMO II
[Chart: cost ($M, 0-9) vs. development time (months, 0-50) for a 100-KSLOC set of features, with one curve per (RELY, MTBF in hours) setting: (VL, 1), (L, 10), (N, 300), (H, 10K), (VH, 300K). For the fixed feature set, cost/schedule/RELY: pick any two points. You can pick all three with a 77-KSLOC set of features.]

C, S, Q as Independent Variables
- Determine the desired delivered defect density (D4), or a value-based equivalent
- Prioritize desired features via QFD, IPT, stakeholder win-win
- Determine the core capability (see the sketch below)
  - 90% confidence of achieving D4 within cost and schedule
  - Balance parametric models and expert judgment
- Architect for ease of adding next-priority features
  - Hide sources of change within modules (Parnas)
- Develop the core capability to the D4 quality level, usually in less than the available cost and schedule
- Add next-priority features as resources permit
- Versions of this approach were used successfully on 32 of 34 USC digital library projects
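
A minimal sketch of the core-capability step (the feature names, priorities, and costs are invented): sort features by stakeholder priority and take the largest prefix that fits the budget with a confidence margin; the remainder are added later as resources permit.

```python
features = [  # (name, stakeholder priority, estimated cost)
    ("search", 9, 30), ("checkout", 10, 25), ("reports", 6, 20),
    ("themes", 3, 15), ("export", 5, 10),
]

def core_capability(features, budget, confidence_margin=0.9):
    usable = budget * confidence_margin   # hold back cost/schedule reserve
    core, spent = [], 0
    for name, _, cost in sorted(features, key=lambda f: -f[1]):
        if spent + cost <= usable:        # greedily keep what fits
            core.append(name)
            spent += cost
    return core

print(core_capability(features, budget=77))  # cf. the 77-KSLOC subset idea
```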

Value-Based vs. Value-Neutral Methods
- Value-based defect reduction
- Constructive Quality Model (COQUALMO)
- Information Dependability Attribute Value Estimation (iDAVE) model

Value-Based Defect Reduction Example: The Goal-Question-Metric (GQM) Approach
- Goal: Our supply chain software packages have too many defects. We need to get their defect rates down.
- Question: ?

Value-Based GQM Approach I
- Q: How do software defects affect system value goals? Ask why the initiative is needed:
  - Order processing: too much downtime on the operations critical path; too many defects in operational plans; too many new-release operational problems
- G: New system-level goal: decrease software-defect-related losses in operational effectiveness, with the high-leverage problem areas above as specific subgoals
- New Q: ?

Value-Based GQM Approach II
- New Q: Perform system problem-area root cause analysis: ask why problems are happening, via models
- Example: downtime on the critical path
  - Order items -> Validate order -> Validate items in stock -> Schedule packaging, delivery -> Produce status reports -> Prepare delivery packages -> Deliver order
  - Where are the primary software-defect-related delays?
  - Where are the biggest improvement-leverage areas?
    - Reducing software defects in the Scheduling module
    - Reducing non-software order-validation delays
    - Taking status reporting off the critical path
    - Downstream, getting a new Web-based order entry system
- Ask "why not?" as well as "why?"

Value-Based GQM Results
- Defect tracking weighted by system-value priority
  - Focuses defect removal on the highest-value effort
  - Significantly higher effect on bottom-line business value, and on customer satisfaction levels
- Engages software engineers in system issues
  - Fits the increasing system-criticality of software
- Strategies are often helped by quantitative models: COQUALMO, iDAVE
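
Value-weighted defect tracking can be as simple as the sketch below (the defect data is invented): rank open defects by the business value they put at risk per unit of fix effort, instead of by raw counts.

```python
defects = [  # (id, affected function, business value at risk, fix cost)
    ("D1", "order validation", 50, 3),
    ("D2", "status reports", 5, 1),
    ("D3", "scheduling", 30, 2),
    ("D4", "UI polish", 2, 1),
]

# Highest value-at-risk per unit of fix effort first.
for did, func, value, cost in sorted(defects, key=lambda d: -d[2] / d[3]):
    print(f"{did} ({func}): value/cost = {value / cost:.1f}")
```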

Current COQUALMO
[Diagram: COCOMO II and COQUALMO share inputs: a software size estimate plus software platform, project, product, and personnel attributes. COCOMO II produces the software development effort, cost, and schedule estimate. COQUALMO's Defect Introduction Model and Defect Removal Model, the latter driven by defect removal profile levels (automation, reviews, testing), produce the number of residual defects and the defect density per unit of size.]

Defect Removal Rating Scales (COCOMO II, p. 263)

Automated Analysis:
- Very Low: simple compiler syntax checking
- Low: basic compiler capabilities
- Nominal: compiler extensions; basic requirements and design consistency checking
- High: intermediate-level module analysis; simple requirements/design checking
- Very High: more elaborate requirements/design checking; basic distributed-processing analysis
- Extra High: formalized specification and verification; advanced distributed-processing analysis

Peer Reviews:
- Very Low: no peer review
- Low: ad-hoc informal walk-throughs
- Nominal: well-defined preparation and review, minimal follow-up
- High: formal review roles, well-trained people, and basic checklists
- Very High: root cause analysis, formal follow-up, use of historical data
- Extra High: extensive review checklists, statistical control

Execution Testing and Tools:
- Very Low: no testing
- Low: ad-hoc test and debug
- Nominal: basic testing; test criteria based on checklists
- High: well-defined test sequences and a basic test coverage tool system
- Very High: more advanced test tools and preparation; distributed monitoring
- Extra High: highly advanced tools; model-based testing

Defect Removal Estimates (at Nominal Defect Introduction Rates)

Composite defect removal rating | Delivered defects/KSLOC
Very Low   | 60
Low        | 28.5
Nominal    | 14.3
High       | 7.5
Very High  | 3.5
Extra High | 1.6
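
COQUALMO's removal side multiplies out roughly as in the sketch below (the introduction rate and per-activity removal fractions are illustrative assumptions, not the calibrated model): each activity removes a fraction of the defects that survive the previous one.

```python
introduced_per_ksloc = 60.0   # assumed composite defect introduction rate

# Assumed removal effectiveness for each activity.
removal = {"automated_analysis": 0.30, "peer_reviews": 0.55, "testing": 0.60}

residual = introduced_per_ksloc
for activity, fraction in removal.items():
    residual *= 1 - fraction   # each pass filters what is left
    print(f"after {activity}: {residual:.1f} defects/KSLOC")
# 60 * 0.70 * 0.45 * 0.40 = 7.6, near the slide's "High" composite rating
```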

iDAVE Model
[Diagram of the iDAVE model.]

ROI Analysis Results Comparison: Business Application vs. Space Application
[Chart: ROI (axis from -2 to 16) of dependability investment levels (N->H, H->VH, VH->XH) for Sierra Order Processing (a business application) and a Planetary Rover (a space application); plotted values include 0.32 and -0.98.]

Conclusions: Software Quality (SWQ)
- There is no universal SWQ metric to optimize
  - Need to balance stakeholder SWQ value propositions
  - Increasing need for value-based approaches to SWQ
  - Methods and models are emerging to address these needs
- Faster, better, cheaper is feasible, if feature content can be a dependent variable
- Quality is free for stable, high-value, long-life systems, but not for dynamic, lower-value, short-life systems
- Future trends intensify SWQ needs and challenges: criticality, complexity, decreased control, faster change