The Economics of Software Reliability




The Economics of Software Reliability
Barry Boehm, ISSRE 2003 Keynote Address
November 19, 2003
boehm@sunset.usc.edu
http://sunset.usc.edu
11/19/03 - CSE 1

Outline
- The business case for software reliability
  - Example: net value of test aids
  - Value depends on stakeholder value propositions
- What are stakeholders really relying on?
  - Safety, accuracy, response time, ...
  - Attributes often conflict
- Software dependability in a competitive world
- Conclusions

Software Reliability Business Case
- Software reliability investments compete for resources
  - With investments in functionality, response time, adaptability, speed of development, ...
- Each investment option generates curves of net value and ROI
  - Net Value: NV = PV(benefit flows - cost flows)
  - Present Value: PV = flows at future times, discounted
  - Return on Investment: ROI = NV / PV(cost flows)
- Value of benefits varies by stakeholder and role

Software Testing Business Case
- Vendor proposition:
  - "Our test data generator will cut your test costs in half"
  - "We'll provide it to you for 30% of your test costs"
  - "After you run all your tests for 50% of your original costs, you're 20% ahead"
- Any concerns with the vendor proposition?
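The NV, PV, and ROI definitions and the vendor proposition above can be sketched in a few lines. This is a minimal illustration, assuming a constant discount rate; the cash-flow figures are invented for the example, not from the talk.

```python
def present_value(flows, rate=0.10):
    """Discount a list of per-period cash flows back to time zero."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

def net_value(benefit_flows, cost_flows, rate=0.10):
    """NV = PV(benefit flows - cost flows)."""
    return present_value([b - c for b, c in zip(benefit_flows, cost_flows)], rate)

def roi(benefit_flows, cost_flows, rate=0.10):
    """ROI = NV / PV(cost flows)."""
    return net_value(benefit_flows, cost_flows, rate) / present_value(cost_flows, rate)

# Vendor proposition, ignoring discounting: original test budget 100,
# tool costs 30, remaining testing costs 50 -> you spend 80, "20% ahead".
original, tool, remaining = 100.0, 30.0, 50.0
savings = original - (tool + remaining)
print(savings / original)  # 0.2
```

The vendor's arithmetic is correct in value-neutral terms; the next slides question the assumption that every test contributes equal value.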

Software Testing Business Case (continued)
- The test data generator is value-neutral*
  - Every test case and every defect is treated as equally important
  - Usually, 20% of test cases cover 80% of business value
  - (* As are most current software engineering techniques)

20% of Features Provide 80% of Value: Focus Testing on These (Bullock, 2000)
[Chart: % of value for correct customer billing vs. customer type (1-15); an automated test generation tool treats all tests as having equal value]

Value-Based Testing Provides More Net Value
[Chart: net value NV vs. percent of tests run; value-based testing peaks at (30, 58), while the test data generator only reaches (100, 20)]

  Percent of    Test Data Generator    Value-Based Testing
  tests run     Cost  Value  NV        Cost  Value  NV
      0          30     0    -30         0     0     0
     10          35    10    -25        10    50    40
     20          40    20    -20        20    75    55
     30          45    30    -15        30    88    58
     40          50    40    -10        40    94    54
    ...          ...   ...   ...       ...   ...   ...
    100          80   100    +20       100   100     0
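The slide's table can be replayed as data to locate each strategy's best stopping point. The figures below are taken directly from the slide; the helper functions are a sketch.

```python
# (cost, value) by percent of tests run, from the slide:
# value-neutral test data generator (TDG) vs. value-based testing (VBT).
tdg = {0: (30, 0), 10: (35, 10), 20: (40, 20), 30: (45, 30),
       40: (50, 40), 100: (80, 100)}
vbt = {0: (0, 0), 10: (10, 50), 20: (20, 75), 30: (30, 88),
       40: (40, 94), 100: (100, 100)}

def net_values(table):
    """Net value at each stopping point: value delivered minus cost spent."""
    return {pct: value - cost for pct, (cost, value) in table.items()}

def best(table):
    """Stopping point with the highest net value."""
    nv = net_values(table)
    pct = max(nv, key=nv.get)
    return pct, nv[pct]

print(best(vbt))  # (30, 58): stop after the highest-value 30% of tests
print(best(tdg))  # (100, 20): the generator only pays off if you run everything
```

Running the highest-value tests first dominates the value-neutral tool at every stopping point except 100%.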

Major Information System Dependability Stakeholders
- Dependents: passengers, patients
- Information Suppliers: citizens, companies
- Information Brokers: financial services, news media
- Information Consumers: decisions, education, entertainment
- Mission Controllers: pilots, distribution controllers
- Developers, Acquirers, Administrators

Overview of Stakeholder/Value Dependencies
- Strength of direct dependency on value attribute: ** critical; * significant; blank = insignificant or indirect
- Attributes: Protection, Robustness, Quality of Service, Adaptability, Affordability
- Stakeholders: Info Suppliers/Dependents; Info Brokers; Info Consumers; Mission Controllers/Administrators; Developers/Acquirers
[Table: per-cell ratings not legible in the transcript]

Elaboration of Stakeholder/Value Dependencies
- Rows (attributes, tiered): Protection (Safety, Security, Privacy); Robustness (Reliability, Availability, Survivability); Quality of Service (Performance; Accuracy, Consistency; Accessibility, ease of use, difficulty of misuse); Adaptability (Evolvability, Interoperability); Affordability (Cost, Schedule, Reusability)
- Columns (stakeholders): Info Suppliers/Dependents (mission-critical vs. uncritical); Info Brokers; Info Consumers; Mission Controllers; Developers/Acquirers; Maintenance/Administrators
[Table: per-cell ** / * ratings not legible in the transcript]

Implications for Dependability Engineering
- There is no universal dependability metric to optimize
- Need to identify a system's success-critical stakeholders
  - And their dependability priorities
- Need to balance satisfaction of stakeholder dependencies
  - Stakeholder win-win negotiation
  - Dependability attribute tradeoff analysis
- Need value-of-dependability models, methods, and tools

Outline
- The business case for software reliability
- What are stakeholders really relying on?
- Software dependability in a competitive world
  - Is market success a monotone function of dependability?
  - Is quality really free?
  - Is "faster, better, cheaper" really achievable?
  - Value-based vs. value-neutral methods
- Conclusions

Competing on Cost and Quality (adapted from Michael Porter, Harvard)
[Chart: ROI vs. price/cost point. From low to high price: too cheap (low quality); cost leader (acceptable quality); stuck in the middle; quality leader (acceptable cost); overpriced]

Competing on Schedule and Quality: A Risk Analysis Approach
- Risk Exposure: RE = Prob(Loss) * Size(Loss)
  - Loss: financial; reputation; future prospects; ...
- For multiple sources of loss: RE = Σ over sources of Prob(Loss) * Size(Loss)

Example RE Profile: Time to Ship
- Loss due to unacceptable dependability
[Chart: RE = P(L) * S(L) vs. time to ship (amount of testing). Shipping early: many defects (high P(L)), critical defects (high S(L)); shipping late: few defects (low P(L)), minor defects (low S(L))]

Example RE Profile: Time to Ship
- Loss due to unacceptable dependability
- Loss due to market share erosion
[Chart: adds a rising market-risk curve. Shipping late: many rivals (high P(L)), strong rivals (high S(L)); shipping early: few rivals (low P(L)), weak rivals (low S(L))]

Example RE Profile: Time to Ship - Sum of Risk Exposures
[Chart: the two risk-exposure curves summed; the "sweet spot" is the amount of testing that minimizes total risk exposure]
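The sweet-spot idea in the risk-exposure profiles above can be sketched numerically. The two loss curves here are made up for illustration (an exponentially falling defect risk and a linearly rising market-share risk); only their shapes follow the slides.

```python
import math

def re_defects(t):
    """P(L)*S(L) for unacceptable dependability; shrinks as testing time t grows."""
    return 10 * math.exp(-0.5 * t)

def re_market(t):
    """P(L)*S(L) for market-share erosion; grows the longer shipping is delayed."""
    return 0.5 * t

def total_re(t):
    """Sum of risk exposures over both loss sources."""
    return re_defects(t) + re_market(t)

# Sweet spot: the amount of testing that minimizes total risk exposure,
# found here by a simple grid search over 0..10 time units.
times = [i / 10 for i in range(0, 101)]
sweet = min(times, key=total_re)
print(round(sweet, 1))  # 4.6 for these illustrative curves
```

A safety-critical system (larger defect S(L)) pushes the minimum right; an internet startup (larger delay S(L)) pushes it left, matching the comparative profiles on the next slides.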

Comparative RE Profile: Safety-Critical System
[Chart: a higher S(L) for defects shifts the sweet spot toward more testing (high-quality sweet spot), relative to the mainstream sweet spot]

Comparative RE Profile: Internet Startup
[Chart: a higher S(L) for delays shifts the sweet spot toward less testing (low-TTM sweet spot), relative to the mainstream sweet spot. TTM: time to market]

Interim Conclusions
- Unwise to try to compete on both cost/schedule and quality
  - Some exceptions: major technology or marketplace edge
- There are no one-size-fits-all cost/schedule/quality strategies
- Risk analysis helps determine how much testing (prototyping, formal verification, etc.) is enough
  - Buying information to reduce risk
- Often difficult to determine parameter values
  - Some COCOMO II values discussed next

Software Development Cost/Quality Tradeoff
- COCOMO II calibration to 161 projects

  RELY Rating | Defect Risk                   | Rough MTBF (mean time between failures)
  Very High   | Loss of human life            | 100 years
  High        | Financial loss                | 2 years
  Nominal     | Moderate recoverable loss     | 1 month
  Low         | Low, easily recoverable loss  | 1 day
  Very Low    | Slight inconvenience          | 1 hour
[Chart: relative cost/source instruction from 0.8 to 1.3; Nominal (in-house support software) = 1.0]

Software Development Cost/Quality Tradeoff (continued)
[Charts: successive annotations place market sectors on the tradeoff line — startup demo at 0.82 (Very Low), commercial cost leader at 0.92 (Low), in-house support software at 1.00 (Nominal), commercial quality leader at 1.10 (High), safety-critical at 1.26 (Very High)]
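The RELY development multipliers above can be captured as a simple lookup. The multipliers are as read from the slides; the helper is a sketch, not the full COCOMO II effort model.

```python
# COCOMO II RELY development-effort multipliers (calibrated to 161 projects),
# with the market-sector annotations from the slides.
RELY_DEV = {
    "Very Low": 0.82,   # slight inconvenience (startup demo)
    "Low": 0.92,        # low, easily recoverable loss (commercial cost leader)
    "Nominal": 1.00,    # moderate recoverable loss (in-house support software)
    "High": 1.10,       # financial loss (commercial quality leader)
    "Very High": 1.26,  # loss of human life (safety-critical)
}

def relative_cost(rating):
    """Relative development cost per source instruction at a RELY rating."""
    return RELY_DEV[rating]

# Moving a product from commercial cost leader to commercial quality leader:
print(round(relative_cost("High") / relative_cost("Low"), 2))  # 1.2: ~20% more effort
```

The ~20% spread between cost leader and quality leader is why, per the Porter chart, trying to occupy both positions usually fails.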

Quality is Free
- Did Philip Crosby's book get it all wrong?
- Investments in dependable systems:
  - Cost extra for simple, short-life systems
  - Pay off for high-value, long-life systems

Software Life-Cycle Cost vs. Dependability
[Chart: relative cost to develop vs. COCOMO II RELY rating — Very Low 0.82, Low 0.92, Nominal 1.00, High 1.10, Very High 1.26]

Software Life-Cycle Cost vs. Dependability (continued)
[Chart: adds relative cost to maintain — Very Low 1.23, Low 1.10, High 0.99, Very High 1.07]

Software Life-Cycle Cost vs. Dependability (continued)
[Chart: adds a combined develop-plus-maintain curve at 70% maintenance, e.g. 1.11 at Very Low and 1.05 at Low]

Software Ownership Cost vs. Dependability
- Operational-defect cost at Nominal dependability = software life-cycle cost
[Chart: adds relative cost to own and operate. With operational-defect costs included, ownership cost rises to 2.55 at Very Low and 1.52 at Low, and falls to 0.76 at High and 0.69 at Very High; the operational-defect-cost = 0 curve matches the life-cycle curve]

Outline
- The business case for software reliability
- What are stakeholders really relying on?
- Software dependability in a competitive world
  - Is market success a monotone function of dependability?
  - Is quality really free?
  - Is "faster, better, cheaper" really achievable?
  - Value-based vs. value-neutral methods
- Conclusions
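The life-cycle cost slides combine development and maintenance multipliers, weighting maintenance at 70% of life-cycle cost. The sketch below reproduces that arithmetic; the maintenance multipliers are read off the chart and may be approximate.

```python
# Relative cost multipliers by COCOMO II RELY rating, as read from the slides.
DEV = {"Very Low": 0.82, "Low": 0.92, "Nominal": 1.00,
       "High": 1.10, "Very High": 1.26}
MAINT = {"Very Low": 1.23, "Low": 1.10, "Nominal": 1.00,
         "High": 0.99, "Very High": 1.07}

def life_cycle_cost(rating, maint_fraction=0.70):
    """Weighted develop-plus-maintain cost, maintenance at 70% by default."""
    return (1 - maint_fraction) * DEV[rating] + maint_fraction * MAINT[rating]

for rating in DEV:
    print(rating, round(life_cycle_cost(rating), 2))
```

At 70% maintenance the Very Low rating comes out around 1.11 and Low around 1.05, matching the chart: skimping on reliability saves development cost but loses it back in maintenance, even before operational defect losses are counted.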

Cost, Schedule, Quality: Pick Any Two?
[Diagram: C, S, Q]

Cost, Schedule, Quality: Pick Any Two?
- Consider C, S, Q as the independent variable
- Feature set as the dependent variable
[Diagram: C, S, Q with the feature set as the dependent variable]

C, S, Q as Independent Variable
- Determine Desired Delivered Defect Density (D4)
  - Or a value-based equivalent
- Prioritize desired features
  - Via QFD, IPT, stakeholder win-win
- Determine core capability
  - 90% confidence of D4 within cost and schedule
  - Balance parametric models and expert judgment
- Architect for ease of adding next-priority features
  - Hide sources of change within modules (Parnas)
- Develop core capability to the D4 quality level
  - Usually in less than the available cost and schedule
- Add next-priority features as resources permit
- Versions used successfully on 32 of 34 digital library projects

Value-Based vs. Value-Neutral Methods
- Value-based defect reduction
- Constructive Quality Model (COQUALMO)
- Information Dependability Attribute Value Estimation (iDAVE) model
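The "feature set as dependent variable" process above can be sketched as a prioritization under a fixed budget: hold cost, schedule, and quality constant and let feature content flex. The feature names, costs, and values below are invented for illustration.

```python
def core_capability(features, budget):
    """Greedily pick the highest value-per-cost features that fit the budget.

    features: list of (name, cost, value) tuples; budget: total cost allowed.
    A sketch of core-capability selection, not a full win-win negotiation.
    """
    chosen, spent = [], 0
    for name, cost, value in sorted(features, key=lambda f: f[2] / f[1],
                                    reverse=True):
        if spent + cost <= budget:
            chosen.append(name)
            spent += cost
    return chosen, spent

# Hypothetical prioritized feature list for an order-processing system.
features = [("order entry", 30, 90), ("status reports", 20, 30),
            ("scheduling", 25, 80), ("analytics", 40, 50)]
print(core_capability(features, budget=60))  # (['scheduling', 'order entry'], 55)
```

Lower-priority features ("status reports", "analytics") wait for a later increment, delivered as resources permit.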

Value-Based Defect Reduction Example: Goal-Question-Metric (GQM) Approach
- Goal: Our supply chain software packages have too many defects. We need to get their defect rates down.
- Question: ?

Value-Based GQM Approach I
- Q: How do software defects affect system value goals? Ask why the initiative is needed
  - Order processing
  - Too much downtime on the operations critical path
  - Too many defects in operational plans
  - Too many new-release operational problems
- G: New system-level goal: decrease software-defect-related losses in operational effectiveness
  - With the high-leverage problem areas above as specific subgoals
- New Q: ?

Value-Based GQM Approach II
- New Q: Perform system problem-area root cause analysis: ask why problems are happening, via models
- Example: downtime on the critical path
  - Order items -> Validate order -> Validate items in stock -> Schedule packaging, delivery -> Prepare delivery packages -> Deliver order (with status reports produced along the way)
- Where are the primary software-defect-related delays? Where are the biggest improvement-leverage areas?
  - Reducing software defects in the Scheduling module
  - Reducing non-software order-validation delays
  - Taking status reporting off the critical path
  - Downstream, getting a new Web-based order entry system
- Ask "why not?" as well as "why?"

Value-Based GQM Results
- Defect tracking weighted by system-value priority
  - Focuses defect removal on the highest-value effort
- Significantly higher effect on bottom-line business value
  - And on customer satisfaction levels
- Engages software engineers in system issues
  - Fits the increasing system-criticality of software
- Strategies often helped by quantitative models
  - COQUALMO, iDAVE

Current COQUALMO
[Diagram: inputs — software size estimate; software platform, project, product, and personnel attributes; defect removal profile levels (automation, reviews, testing). COCOMO II feeds a Defect Introduction Model and a Defect Removal Model, producing the software development effort, cost, and schedule estimate, the number of residual defects, and the defect density per unit of size]

Defect Removal Rating Scales (COCOMO II, p. 263)
- Rows: Automated Analysis; Peer Reviews; Execution Testing and Tools
- Columns: Very Low, Low, Nominal, High, Very High, Extra High
- Legible entries: Automated Analysis ranges from simple compiler syntax checking (Very Low) through basic compiler capabilities and requirements/design consistency checking to formalized specification and verification with advanced distributed processing (Extra High); Peer Reviews from no peer review (Very Low) through ad-hoc informal walkthroughs and well-defined preparation and review to root-cause analysis with formal follow-up, historical data, extensive checklists, and statistical control (Extra High); Execution Testing and Tools from no testing (Very Low) through ad-hoc test and debug and checklist-based test criteria to highly advanced tools with model-based testing and distributed monitoring (Extra High)
[Table: intermediate cell text is garbled in the transcript]

Defect Removal Estimates
- Nominal defect introduction rates

  Composite Defect Removal Rating | Delivered Defects / KSLOC
  Very Low                        | 60
  Low                             | 28.5
  Nominal                         | 14.3
  High                            | 7.5
  Very High                       | 3.5
  Extra High                      | 1.6

iDAVE Model
[Diagram only]
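The delivered-defect densities above scale directly with system size, which makes the removal-rating tradeoff concrete. The densities are from the slide; the helper is a sketch, not the COQUALMO model itself.

```python
# Delivered defects per KSLOC at nominal defect introduction rates,
# by composite defect-removal rating (from the COQUALMO slide).
DELIVERED_DEFECTS_PER_KSLOC = {
    "Very Low": 60.0, "Low": 28.5, "Nominal": 14.3,
    "High": 7.5, "Very High": 3.5, "Extra High": 1.6,
}

def residual_defects(ksloc, removal_rating):
    """Estimated residual defects for a system of the given size in KSLOC."""
    return ksloc * DELIVERED_DEFECTS_PER_KSLOC[removal_rating]

# A 100-KSLOC system: stepping removal from Nominal to Very High
print(residual_defects(100, "Nominal"))    # about 1430 residual defects
print(residual_defects(100, "Very High"))  # 350.0
```

Each step up the composite rating roughly halves the delivered defect density, so the value-based question is which defects the extra removal effort should target.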

ROI Analysis Results Comparison: Business Application vs. Space Application
[Chart: ROI (roughly -2 to 16) vs. dependability investment levels (Nominal->High, High->Very High, Very High->Extra High) for Sierra order processing and a planetary rover; values shown include 0.32 and -0.98]

Conclusions: Software Reliability (SWR)
- There is no universal SWR metric to optimize
  - Need to balance stakeholder SWR value propositions
- Increasing need for value-based approaches to SWR
  - Methods and models emerging to address needs
- "Faster, better, cheaper" is feasible
  - If feature content can be a dependent variable
- Quality is free for stable, high-value, long-life systems
  - But not for dynamic, lower-value, short-life systems
- Future trends intensify SWR needs and challenges
  - Criticality, complexity, decreased control, faster change