Software Cost and Productivity Model




Software Cost and Productivity Model
presented to Ground Systems Architecture Workshop 2004, Manhattan Beach, California
presented by J. E. Gayek, L. G. Long, K. D. Bell, R. M. Hsu, and R. K. Larson* of The Aerospace Corporation
31 March 2004
*Currently with the National Aeronautics and Space Administration
© 2004 The Aerospace Corporation

Acknowledgements
- Sponsorship and funding for this research came from an FY03 Aerospace Corporation internal research & development initiative
- Special thanks are extended to the following for their support: Nancy Kern, Carolyn Latta

Topics of Presentation: Background, Data Collection & Definitions, Data Characterization, Software Metrics, Conclusion

Rationale for Project
- Software is a vital part of space & ground systems
- Software development is often a cost & schedule risk to acquisition:
  - The amount of effort is under-estimated
  - The productivity of staff is over-estimated
  - The cost for a unit of labor is under-estimated
- The severity of cost & schedule risk of new software can be gauged by examining the development of completed modules of similar functionality
- Aerospace previously developed a database & estimating relationships (1996) to address this need; used by SMC / NRO / NASA
- An update is needed to investigate the impact of current techniques and languages

Areas of Investigation
- Productivity: What is the likelihood the contractor can achieve a development rate of X source lines of code (SLOC) per developer-month (DM)? What is the approximate distribution of productivity?
- Schedule: What schedule duration can be expected for a given SLOC size?
- Effort: How many DM are required to develop software of a given SLOC size, operating environment, and application domain?

Topics of Presentation: Background, Building the Analysis Database, Data Characterization, Software Metrics, Conclusion

Building the Analysis Database
- The call for data went to:
  - Aerospace program offices
  - Government organizations that Aerospace supports (e.g., SMC, Air Force Cost Analysis Agency)
  - DoD and civilian agencies Aerospace does not support on a routine basis (e.g., Navy Cost Analysis Division)
  - Contractors in the space industry
- Data were sought on development efforts that commenced or completed after January 1, 1996
- Software development comprises product design, code, and CSCI testing; front-end software system requirements and tail-end system-level integration and test are not included

Software Complexity
Software complexity is expressed in terms of operational environment and application domain.
- Operational Environment (Platform): Military Ground, Military Mobile, Mil-Spec Avionics, Unmanned Space
- Application Domain: Command/Control, Signal Processing, Database, Simulation, Mission Planning, Support, Operating System (OS)/Executive, Test

Definitions of Operating Environments

Military Ground: This environment includes ground-based mission-critical software. The hardware platform is usually located at a fixed site where diagnosis of faults can be performed readily. The developer may even be on-site or readily available for updates and revisions. Since the consequences of failure are the least of the four environments (faults being readily fixable), demands on reliability are not as high.

Military Mobile: Software residing in a vehicle, such as a van, trailer, deployable structure, or ship. Operational consequences of a software fault are higher than for fixed sites. Software may interact with hardware through real-time measurements or open-loop control of servomechanisms.

Mil-Spec Avionics: Software for operation or control of aircraft or similar vehicles. Often real-time; may contain closed-loop control algorithms or control of hardware. May operate critical instrument gauges, perform measurements, or perform data reduction from sensors. Required reliability is high; software errors may result in loss of mission, vehicle, and/or life.

Unmanned Space: Similar to Mil-Spec Avionics in reliability requirements and mission impact. Defects in system control software may result in loss of mission; other software may be revised and uploaded depending on circumstance and vehicle design. The operating environment is often difficult to replicate on the ground, so data-processing software may require extensive advance simulation to validate its effectiveness. Often operates in real time, performs measurements with hardware interaction, and controls the vehicle with closed-loop feedback.

Source: SEER-SEM Manual, version 6.0.28, December 2002 (Galorath Inc., 100 N. Sepulveda Blvd., El Segundo, Calif., 90245)

Definitions of Software Domains (1/2)

Command & Control: Examples include network monitoring, network control and switching, sensor control, signal/telemetry processing, message processing, data reduction/analysis, mission control, and command processing.

Database: Software that collects, stores, organizes, and indexes information. Examples include database generation and database management systems.

Mission Planning: Software used to support mission-planning activities such as space mission planning, aircraft mission planning, scenario generation, feasibility analysis, route planning, and image/map manipulation.

Operating System/Executive: Software that controls basic hardware operations and serves as a platform for applications to run. Multi-user operating systems provide management and security of system users. Operating system functions may include networking, security, file management, device drivers, display drivers, multi-processing, multi-tasking, multi-threading, and real-time operating systems.

Signal Processing: Software used to enhance, transform, filter, convert, or compress data signals. Signal processing has application in many areas such as communications, flight systems, sonar, radar, and medical systems. Large volumes of data are processed using complex algorithms, often with real-time operating requirements.

Source: SEER-SEM Manual, version 6.0.28, December 2002 (Galorath Inc., 100 N. Sepulveda Blvd., El Segundo, Calif., 90245)

Definitions of Software Domains (2/2)

Simulation: Software that evaluates numerous scenarios and summarizes processes or events to simulate physical processes, complex systems, or other phenomena that may not have simple empirical relationships. Examples include environment simulation, system simulation, emulation, process flow, network simulation, operations flow, and system reliability programs.

Support: All software used to aid the development, testing, and support of applications: systems, test and maintenance, and trainer software.

Test: Software used for testing and evaluating hardware and software systems. Examples include test case generation, test case data recording, test case data reduction/analysis, and test driver/stub tools and programs.

Source: SEER-SEM Manual, version 6.0.28, December 2002 (Galorath Inc., 100 N. Sepulveda Blvd., El Segundo, Calif., 90245)

Data Requested
- Program Name
- CSCI Name / Functional Description
- Language*
- SLOC (New/Actuals)*
- SLOC (Original Estimate)*
- SLOC (Revised Estimate)
- SLOC (Reuse)*
- SLOC (Equivalent)*
- Labor Effort (Man-months)*
- Labor Effort (Hours)
- COTS Packages (Type)
- COTS Integration Code (SLOC)
- COTS Effort (Man-months)
- COTS Effort (Hours)
- Software Level*
- Operating Environment*
- Application Domain*
- Software Development Completion (Year)
- Peak Development Staff (Number)
- Data Source / Comments
*Key quantity for the basic analyses

Data Dictionary (1/2)
- Program Name: The name of the program for the information being provided.
- CSCI Name/Functional Description: The name of the CSCI and/or functional description.
- Language: The primary computer programming language used in the software development.
- SLOC (New/Actuals): The actual source lines of code (SLOC) from the delivered development.
- SLOC (Original Estimate): The estimated SLOC from the beginning of the project.
- SLOC (Revised Estimate): Any revised SLOC estimate made after the beginning of the project but prior to the delivered development.
- SLOC (Reuse): The SLOC size designated as reuse or modified code.
- SLOC (Equivalent): The equivalent SLOC size when the element contains a combination of new and reused code.
- Labor Effort (Man-months): The labor effort in man-months required to develop the software.
- Labor Effort (Hours): The labor effort in hours required to develop the software.

Data Dictionary (2/2)
- COTS Packages (Type): Major emphasis of the COTS package (e.g., scientific, administrative).
- COTS Integration Code (SLOC): Amount of code required to integrate a COTS product.
- COTS Effort (Man-months): Effort required for familiarization and testing of COTS products, in man-months.
- COTS Effort (Hours): Effort required for familiarization and testing of COTS products, in labor hours.
- Software Level: The software level of the data record (e.g., Project, CSCI, CSC).
- Operating Environment: The environment that best describes the primary mission of the software (e.g., mil-ground, unmanned space).
- Application Domain: The application that best describes the primary function of the software (e.g., mission planning, command & control, test).
- Software Development Completion (Yr): The year in which the software development was completed.
- Peak Development Staff (Number): The peak staffing per month, in man-months, for this development.
- Data Source/Comments: The source of the data record and any other comments that may be warranted.

Adjustments to Data for Analysis
- Not all responses included SLOC(Equivalent) when SLOC(New/Actuals) and SLOC(Reuse) were given, yet SLOC(Equivalent) is a key parameter for the analyses. An estimate of SLOC(Equivalent) was created using SEER-SEM (default values) with the given values of SLOC(New/Actuals), SLOC(Reuse), operating environment, and application domain; a sketch of the general idea follows this list.
- Not all responses reported software levels (CSCI vs. Project) that made sense. For purposes of analysis, activities with SLOC(Equivalent) < 200K were treated as CSCI; otherwise, the activity was treated as a Project.
- From here onward, SLOC refers to SLOC(Equivalent), unless stated otherwise.
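The SEER-SEM equivalent-size computation itself is not spelled out in the deck. As a rough illustration of the idea, here is a minimal sketch assuming COCOMO-style adaptation fractions; the weights below are hypothetical placeholders, not SEER-SEM defaults:

```python
def equivalent_sloc(new_sloc, reused_sloc,
                    design_mod=0.4, code_mod=0.3, integration_req=0.3):
    """Estimate SLOC(Equivalent) from new and reused code.

    The adaptation fractions are hypothetical COCOMO-style values;
    the study itself used SEER-SEM defaults, which differ.
    """
    # Fraction of new-code effort each reused line is assumed to cost:
    # a weighted mix of design modified, code modified, and integration required.
    adaptation_factor = 0.4 * design_mod + 0.3 * code_mod + 0.3 * integration_req
    return new_sloc + adaptation_factor * reused_sloc

# Example: 40,000 new SLOC plus 20,000 reused SLOC
print(equivalent_sloc(40_000, 20_000))  # 40,000 + 0.34 * 20,000 = 46,800.0
```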

Topics of Presentation: Background, Building the Analysis Database, Data Characterization, Software Metrics, Conclusion

Data Record Summary

APPLICATION          MIL GROUND   MIL MOBILE   MIL-SPEC AVIONICS   UNMANNED SPACE
COMMAND/CONTROL          45           35               2                 25
DATABASE                 13            5               4                  0
MISSION PLANNING         45           14               1                  1
OS/EXECUTIVE              9            3               9                  5
SIGNAL PROCESSING        27           21               2                 11
SIMULATION               19            0               1                 25
SUPPORT                 104            1               1                  4
TEST                     13            2               4                  1
SUBTOTALS               275           81              24                 72

TOTAL DATABASE SIZE = 452 records

Observations
- None of the data records contain COTS information. Is it being collected? Did respondents choose not to provide data on CSCIs with COTS? Is there less use of COTS than expected?
- None of the data records contain labor effort in hours, so reported labor in man-months is used for the productivity analyses.
- Although some data providers gave the cost to produce the code, such information was not used; conversions would introduce uncertainties.
- 186 records contain data useful for productivity analysis; 27 records contain data useful for code growth analysis; 22 records contain both code growth and productivity information.

Topics of Presentation: Background, Building the Analysis Database, Data Characterization, Software Metrics, Conclusion

Summary of Productivity Analysis
- 186 records contain productivity information
- C, C++, and Ada are the predominantly reported languages; in the 1996 study, most reported languages were Fortran, Jovial, and other higher-order languages
- Comparing productivity distributions with the 1996 data indicates an increase in median productivity levels (the sketch below shows how the percentiles are computed):
  - 31% increase in the combined military ground & military mobile environments (2003 study median: 138 SLOC/DM)
  - 33% increase in the combined mil-spec avionics & unmanned space environments (2003 study median: 64 SLOC/DM)
- Median productivity of C/C++ in the combined military ground & military mobile environments is 148 SLOC/DM
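The percentile statements above come from empirical distributions of per-record productivity. A minimal sketch of the computation, using made-up rates rather than the study's 186 records:

```python
import numpy as np

# Hypothetical SLOC/DM productivity values for completed developments;
# the study's actual records are not reproduced here.
rates = np.array([62, 88, 105, 131, 138, 152, 190, 240, 310, 455])

print("Median productivity: %.0f SLOC/DM" % np.median(rates))  # 145 for this sample

# Empirical CDF: probability a development progressed at LESS THAN rate x,
# i.e., the quantity plotted on the charts that follow.
def prob_less_than(x, sample=rates):
    return np.mean(sample < x)

print(prob_less_than(150))  # 0.5 for this sample
```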

[Chart: Productivity (Military Ground & Military Mobile)] Cumulative probability that a software development progressed at less than a given rate (lines/developer-month), comparing the 2003 and 1996 distributions. 50th percentiles: 138 SLOC/DM (2003) vs. 105 SLOC/DM (1996). The 1996 curve is based on 112 mil-ground & mil-mobile software data records; the 2003 curve on 135.

[Chart: Productivity (C and C++)] Cumulative probability that a software development progressed at less than a given rate (lines/developer-month) for the 2003 data; 50th percentile: 148 SLOC/DM. Based on 41 C & C++ military ground & military mobile software data records.

Summary of Code Growth Analysis
- 27 data records contain code growth information; all are associated with the military mobile environment, and none had reused code
- Analysis is based on the code growth multiplier, computed per record as in the sketch below: Multiplier = SLOC(New/Actuals) / SLOC(Original Estimate)
- 80% of these records show code growth; 60% have multipliers of at least 1.2
- Summary statistics on code growth multipliers: low 0.52, average 1.49, high 5.01
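A minimal sketch of the multiplier computation and summary statistics, using made-up estimate/actual pairs rather than the study's 27 records:

```python
import numpy as np

# Hypothetical (original estimate, delivered actual) SLOC pairs.
estimates = np.array([20_000, 45_000, 10_000, 60_000])
actuals   = np.array([26_000, 52_000,  9_000, 96_000])

# Multiplier = SLOC(New/Actuals) / SLOC(Original Estimate)
multipliers = actuals / estimates
print(multipliers)                                              # ~[1.30 1.16 0.90 1.60]
print("Share showing growth:", np.mean(multipliers > 1.0))      # 0.75
print("Share with multiplier >= 1.2:", np.mean(multipliers >= 1.2))  # 0.5
print("Low / average / high: %.2f / %.2f / %.2f"
      % (multipliers.min(), multipliers.mean(), multipliers.max()))  # 0.90 / 1.24 / 1.60
```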

[Chart: Distribution of Code Growth Multipliers] Histogram of multiplier frequency with a cumulative-percentage overlay. Bin intervals: 0.2(k-1) < m <= 0.2k for k = 1, 2, ..., 15, plus an overflow bin for m > 3.0. Low: 0.52; average: 1.49; high: 5.01.

Summary of Development Schedule
- The data indicate that moderately sized records (i.e., scores of KSLOC) can be completed in the same time as small records (i.e., thousands of SLOC); larger activities can use increased staffing, working in parallel
- There are apparent minimum and maximum schedules: no record lasted more than 52 months
- Activities projected to last more than 48 months (the apparent maximum) may be cancelled, renamed, redirected, or restarted, and so do not appear in the database
- The apparent minimum appears to be a linear function of SLOC (see the worked example below): MinSched[months] = 4.5 + 0.000139*SLOC
- 99% (180 out of 182) of the applicable records fall within the apparent minimum and maximum schedule bounds
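The minimum-schedule bound is simple arithmetic to apply; for example:

```python
def min_schedule_months(sloc):
    """Apparent minimum schedule from the study: 4.5 + 0.000139 * SLOC."""
    return 4.5 + 0.000139 * sloc

print(min_schedule_months(100_000))  # 4.5 + 13.9 = 18.4 months
print(min_schedule_months(10_000))   # 4.5 + 1.39 = 5.89 months
```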

[Chart: Distribution of Project Schedule] Scatter plot of development schedule (months, 0-70) versus number of source lines of code (0-300,000), with the apparent "maximum" and "minimum" schedule lines overlaid. Based on 182 military software records.

Cost Estimating Relationship Analysis
- Nonlinear regression analysis (i.e., curve fitting, illustrated below) was conducted to find the relationship between SLOC and the associated number of DM
- Executed on environment/application pairs with at least 5 data points
- The resulting functional relationship can be called a cost estimating relationship (CER) if the number of DM is equated with cost
- Results in 11 CERs (down from 12 CERs in 1996)
- Gains in 2003: Military Mobile/Mission Planning; Military Mobile/Signal Processing; Mil-Spec Avionics/Operating System/Executive
- Losses in 2003: Military Ground/Database; Military Ground/Mission Planning; Unmanned Space/Database; Unmanned Space/Test
- None of the environments has a database CER using 2003 study data
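As an illustration of the curve-fitting step, a minimal sketch using scipy's nonlinear least squares on made-up (SLOC, DM) pairs. The power-plus-constant form matches several of the CERs on the next chart; the %SEE formula shown is one common definition, as the deck does not state the exact one used:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical (SLOC, developer-month) pairs for one environment/domain cell;
# the study required at least 5 data points per cell.
sloc = np.array([2_562, 8_000, 25_000, 60_000, 154_378], dtype=float)
dm   = np.array([  30.,   55.,   120.,   280.,   1_050.])

def cer(s, a, b, c):
    # Power-law-plus-constant form, e.g. DM = a * SLOC^b + c
    return a * s**b + c

(a, b, c), _ = curve_fit(cer, sloc, dm, p0=[1e-3, 1.2, 20.0], maxfev=10_000)
print(f"DM = {a:.3e} * SLOC^{b:.3f} + {c:.2f}")

# One common definition of percent standard error of estimate (%SEE):
# RMS of relative residuals, adjusted for degrees of freedom.
fit = cer(sloc, a, b, c)
see = np.sqrt(np.sum(((dm - fit) / fit) ** 2) / (len(dm) - 3))
print(f"%SEE = {100 * see:.1f}%")
```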

CER Results (DM = developer-months; SLOC ranges in source lines of code; cells without a CER had too few data points or too poor a fit)

Command & Control
- Military Ground (9 pts; range 6,000-54,400): DM = (8.900x10^-3)*SLOC - 12.21; %SEE 42.2%; %Bias 0.00%; R^2 = 0.7611
- Military Mobile (30 pts; range 5,000-111,256): DM = (1.351x10^-3)*SLOC^1.171; %SEE 28.2%; %Bias 0.00%; R^2 = 0.8261
- Mil-Spec Avionics (2 pts): no CER
- Unmanned Space (10 pts; range 13,298-155,077): DM = (1.687x10^-2)*SLOC; %SEE 32.3%; %Bias 0.00%; R^2 = 0.9817

Database
- Military Ground (3 pts), Military Mobile (5 pts; range 11,123-38,519), Mil-Spec Avionics (4 pts), Unmanned Space (0 pts): no CER provided (R^2 < 0.1)

Mission Planning
- Military Mobile (13 pts; range 5,169-77,693): DM = (6.990x10^-3)*SLOC + 41.39; %SEE 27.0%; %Bias 0.00%; R^2 = 0.8796
- Military Ground (4 pts), Mil-Spec Avionics (1 pt), Unmanned Space (1 pt): no CER

Operating System/Executive
- Mil-Spec Avionics (9 pts; range 2,500-94,809): DM = (5.115x10^-3)*SLOC + 172.9; %SEE 51.0%; %Bias 0.00%; R^2 = 0.5597
- Military Ground (4 pts), Military Mobile (3 pts), Unmanned Space (2 pts): no CER

Signal Processing
- Military Ground (9 pts; range 13,530-143,026): DM = (5.561x10^-3)*SLOC + 21.91; %SEE 53.8%; %Bias 0.00%; R^2 = 0.6580
- Military Mobile (21 pts; range 4,270-140,925): DM = (9.354x10^-3)*SLOC + 33.99; %SEE 34.3%; %Bias 0.00%; R^2 = 0.9375
- Mil-Spec Avionics (2 pts): no CER
- Unmanned Space (8 pts; range 2,200-66,700): DM = (2.105x10^-2)*SLOC - 38.65; %SEE 79.0%; %Bias 0.00%; R^2 = 0.8723

Simulation
- Military Ground (5 pts; range 11,100-140,200): DM = (3.766x10^-6)*SLOC^1.641 + 48.42; %SEE 31.3%; %Bias 0.00%; R^2 = 0.9807
- Military Mobile (0 pts), Mil-Spec Avionics (0 pts), Unmanned Space (3 pts): no CER

Support
- Military Ground (17 pts; range 2,562-154,378): DM = (3.051x10^-4)*SLOC^1.256 + 23.48; %SEE 33.1%; %Bias 0.00%; R^2 = 0.9491
- Military Mobile (1 pt), Mil-Spec Avionics (1 pt), Unmanned Space (2 pts): no CER

Test
- Military Ground (9 pts; range 6,113-34,000): DM = (3.065x10^-4)*SLOC^1.326; %SEE 48.7%; %Bias 0.00%; R^2 = 0.477
- Military Mobile (2 pts), Mil-Spec Avionics (4 pts), Unmanned Space (1 pt): no CER

[Chart: CER & Associated Data (Military Ground/Support)] Total effort (developer-months, 0-1,200) versus software size (SLOC, 0-160,000), showing the 17 Military Ground/Support records and the best-fit CER curve. # pts: 17; range: 2,562-154,378 SLOC; CER: DM = (3.051x10^-4)*SLOC^1.256 + 23.48; %SEE 33.1%; %Bias 0.00%; R^2 = 0.9491.
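For a quick sanity check (not part of the original deck), evaluating this CER at a mid-range size:

```python
def support_cer_dm(sloc):
    """Military Ground / Support CER: DM = (3.051e-4) * SLOC^1.256 + 23.48."""
    return 3.051e-4 * sloc ** 1.256 + 23.48

print(round(support_cer_dm(100_000)))  # ~605 developer-months
```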

Topics of Presentation: Background, Building the Analysis Database, Data Characterization, Software Metrics, Conclusion

Conclusions
- Median software productivity appears to have increased since the 1996 study; use of newer languages (e.g., C, C++) may be a major factor
- Code growth continues to be an issue: the majority of records with this data show growth, one by as much as 5x
- Data are not available in several areas:
  - COTS (reason unknown)
  - Labor effort in hours (could be obtained if earned value management (EVM) data were collected in dollars and hours)
- Need to make sure SPOs are acquiring the right data now so future analyses can be performed