Comparative Analysis of COCOMO II, SEER-SEM and True S Software Cost Models


Raymond Madachy, Barry Boehm
USC Center for Systems and Software Engineering
{madachy, boehm}@usc.edu

1. Abstract

We have been assessing the strengths, limitations, and improvement needs of cost, schedule, quality and risk models for NASA flight projects. The primary cost models used in this domain for critical flight software are COCOMO II, SEER-SEM and True S. A comparative survey and analysis of these models against a common database of NASA projects was undertaken.

A major part of this work is defining transformations between the different models through "Rosetta Stones" that describe the mappings between their cost factors. With these Rosetta Stones, projects can be represented in all models in a fairly consistent manner and differences in their estimates can be better understood. Top-level Rosetta Stones map the factors between the models, and detailed ones map the individual ratings between the corresponding factors. Most of the Rosetta Stone mappings between factors are one-to-one, but some are one-to-many. The Rosetta Stones developed so far allow one to convert COCOMO II estimate inputs into corresponding SEER-SEM or True S inputs, or vice versa. The NASA data came in the COCOMO format and was converted to SEER-SEM and True S factors per the Rosetta Stones, so this initial study was largely limited to a COCOMO viewpoint. The current Rosetta Stones need further review and must deal with incommensurate quantities from model to model.

The cost models performed well when assessed against the NASA data despite these drawbacks, the absence of contextual data and potential flaws in the factor transformations. The current set of Rosetta Stones has provided a usable framework for analysis, but more should be done, including developing two-way and/or multiple-way Rosetta Stones and partial factor-to-factor mappings. Factors unique to some models should be addressed, and detailed translations between the size inputs should be developed, including COTS and reuse sizing. Remaining work also includes elaborating the detailed Rosetta Stone for the new True S model and rigorous review of all the top-level and detailed Rosetta Stones.

Conclusions for existing model usage and new model development are provided. In practice no one model should be preferred over all others, and it is best to use a variety of methods.

Future work involves repeating the analysis with the refined Rosetta Stones, updated calibrations, improved models and new data.

2. Introduction

This research is assessing the strengths, limitations, and improvement needs of existing cost, schedule, quality and risk models for critical flight software under the NASA Ames project Software Risk Advisory Tools. This report focuses only on the cost model aspect.

A comparative survey and analysis of cost models used by NASA flight projects is described. The models include COCOMO II, SEER-SEM and True S. We look at evidence of accuracy, the need for calibration, and the use of knowledge bases to reflect specific domain factors. The models are assessed against a common database of relevant NASA projects. The primary focus is on flight projects, but part of the work also looks at related sub-domains for critical NASA software; these are assessed as applicable in some of the following analyses. This report also addresses the critical NASA domain factors of high reliability and high complexity, and how the cost models address them.

Transformations between the models are also developed, so that projects can be represented in all models in a consistent manner and to help understand why estimates may vary. There is a more thorough treatment of the public domain USC COCOMO II related models, as they are planned for usage in the current research and the current datasets are in the COCOMO format. Conclusions for existing model usage and new model development are provided. The SEER-SEM and True S model vendors provided minor support and have identified areas of improvement as described later in this report.

2.1 Cost Models Used

The most frequently used cost and schedule models for critical flight software being evaluated are the COCOMO II, True S (previously PRICE S) and SEER-SEM parametric models. COCOMO II is a public domain model that USC continually updates, and it is implemented in several commercial tools. True S and SEER-SEM are both proprietary commercial tools with unique features that also share some aspects with COCOMO. All three have been extensively used and tailored for flight project domains. Other industry cost models such as SLIM, Checkpoint and Estimacs have not been nearly as frequently used for flight software and are more oriented towards business applications. A previous comparative survey of software cost models can be found in [Boehm et al. 2000b].

A previous study at JPL analyzed the same three models, COCOMO II, SEER-SEM and PRICE S, with respect to some of their flight and ground projects [Lum et al. 2001]. In that case each model estimate was a separate data point. The current approach differs because the data came only in the COCOMO model format and required translation to the other models.

2.1.1 COCOMO II

The COCOMO (COnstructive COst MOdel) cost and schedule estimation model was originally published in [Boehm 1981]. The COCOMO II research effort was started in 1994, and the model continues to be updated at USC, the home institution of research for the COCOMO model family.

COCOMO II, defined in [Boehm et al. 2000], has three submodels: Applications Composition, Early Design and Post-Architecture. They can be combined in various ways to deal with different software environments. The Applications Composition model is used to estimate effort and schedule on projects typically done as rapid application development. The Early Design model involves the exploration of alternative system architectures and concepts of operation; typically, not enough is known at that point to make a detailed fine-grained estimate. This model is based on function points (or lines of code when available) and a set of five scale factors and seven effort multipliers. The Post-Architecture model is used when top-level design is complete, detailed information about the project is available, and the software architecture is well defined. It uses Source Lines of Code and/or Function Points for the sizing parameter, adjusted for reuse and breakage; a set of 17 effort multipliers; and a set of five scale factors that determine the economies/diseconomies of scale of the software under development.

USC provides a public domain tool for COCOMO II. The primary vendor tools that offer the COCOMO II model family include the following:

- Costar, offered by Softstar Systems, has a complete COCOMO II implementation with tools for calibration and the Constructive Systems Engineering Model (COSYSMO). Softstar Systems provided a COCOMO II calibration spreadsheet used in support of this research (see Acknowledgements).
- The Cost Xpert tool, offered by the Cost Xpert Group, has a superset of the COCOMO II Post-Architecture submodel. It has additional linear cost drivers and additional constraint factors on effort and schedule.
- The True Planner tool from PRICE Systems has a COCOMO II model that can be invoked in lieu of the True S model.

The remainder of this report only considers the COCOMO II Post-Architecture submodel.

2.1.2 True S (formerly PRICE S)

True S is the successor product to the PRICE S model offered by PRICE Systems. PRICE S was originally developed at RCA for internal use on software projects such as the Apollo moon program, and was then released in 1977 as a proprietary model. Many of the model's central algorithms were published in [Park 1988]. See the PRICE Systems website for more information.

The PRICE S model consists of three submodels that enable estimating costs and schedules for the development and support of computer systems.

The model covers business systems, communications, command and control, avionics, and space systems. PRICE S includes features for reengineering, code generation, spiral development, rapid development, rapid prototyping, object-oriented development, and software productivity measurement. Size inputs include SLOC, function points and/or Predictive Object Points (POPs).

The switch to True S is taking place during this work. Hence some of the descriptions retain the old PRICE S terminology (such as the Rosetta Stone) while we move towards a complete True S implementation. All numeric estimate results are for the latest True S model.

2.1.3 SEER-SEM

SEER-SEM is a product offered by Galorath, Inc. This model is based on the original Jensen model [Jensen 1983] and has been on the market for over 15 years. Its parametric modeling equations are proprietary. Descriptive material about the model can be found in [Galorath-Evans 2006].

The scope of the model covers all phases of the project life cycle, from early specification through design, development, delivery and maintenance. It handles a variety of environmental and application configurations, and models different development methods and languages. Development modes covered include object-oriented, reuse, COTS, spiral, waterfall, prototype and incremental development. Languages covered are 3rd and 4th generation languages (C++, FORTRAN, COBOL, Ada, etc.), as well as application generators. The SEER-SEM cost model allows probability levels of estimates and constraints on staffing, effort or schedule, and it builds estimates upon a knowledge base of existing projects. Estimate outputs include effort, cost, schedule, staffing, and defects. Sensitivity analysis is also provided. Many sizing methods are available, including lines of code and function points. See the Galorath Inc. website for more information.

3. Model Comparison

This section describes the major similarities and differences between the models. Analyses of their performance against project data, calibration and knowledge bases are addressed in Sections 4 and 5.

3.1 Algorithms

As described in [Lum et al. 2001], all three models essentially boil down to the common effort formula shown in Figure 1. Size of the software is provided in a number of available units, cost factors describe the overall environment, and calibrations may take the form of coefficients adjusted for actual data or other types of factors that account for domain-specific attributes. The total effort is calculated and then decomposed by phases or activities according to different schemes in the models.

Figure 1: Common Core Effort Formula

    Effort = A * Size^B * EM

    Effort - effort in person-months
    A - calibrated constant
    B - scale factor
    EM - effort multiplier from cost factors

The inputs (size, cost factors and calibrations) feed the effort equation, and the total effort is then decomposed by phase and activity.

All models allow size to be expressed as lines of code, function points, object-oriented metrics and others. Each model has its own respective cost factors for the linear effort multiplier term, and each model specifies the B scale factor in slightly different ways (either directly or through other factors). True S and SEER-SEM have factors for the project type or domain, which COCOMO II currently does not. The model WBS phases and activities are addressed in Section 3.4.
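To make the shape of this common formula concrete, here is a minimal sketch in Python. The constants and multiplier list are illustrative placeholders only (A is set near the COCOMO II default cited later in this report), not any tool's calibrated settings.

```python
from math import prod

def core_effort(size_ksloc, A, B, effort_multipliers):
    """Common core effort formula: Effort = A * Size^B * EM,
    where EM is the product of the individual cost factor multipliers.
    Returns effort in person-months."""
    return A * size_ksloc ** B * prod(effort_multipliers)

# Illustrative values only: A near the COCOMO II default (2.96 per this
# report), B > 1 giving a diseconomy of scale, and three hypothetical
# cost factor multipliers.
print(core_effort(100.0, A=2.96, B=1.10, effort_multipliers=[1.26, 1.00, 0.87]))
```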

3.2 Size

All models support size inputs for new and adapted software, where adapted software can be modified or reused without change. Automatically translated or generated code is also supported in some of the models. The models differ with respect to their detailed parameters for the categories of software, as shown in Table 1. Commercial Off-The-Shelf (COTS) software is not addressed here, but is a future research activity. COCOMO II can treat COTS as reused software or be used in conjunction with the COCOTS model [Boehm et al. 2000]. SEER-SEM and True S have more extensive COTS models.

Table 1: Model Size Inputs

New software:
- COCOMO II: New Size
- SEER-SEM: New Size
- True S: New Size; New Size Non-executable

Adapted software:
- COCOMO II: Adapted Size; % Design Modified (DM); % Code Modified (CM); % Integration Required (IM); Assessment and Assimilation (AA); Software Understanding (SU) 1; Programmer Unfamiliarity (UNFM) 1
- SEER-SEM: Pre-exists Size 2; Deleted Size; Redesign Required %; Reimplementation Required %; Retest Required %
- True S: Adapted Size; Adapted Size Non-executable; % of Design Adapted; % of Code Adapted; % of Test Adapted; Reused Size; Reused Size Non-executable; Deleted Size; Code Removal Complexity

Automatically translated and generated code:
- COCOMO II: Adapted SLOC; Automatic Translation Productivity; % of Code Reengineered
- SEER-SEM: Auto Generated Code Size; Auto Translated Code Size
- True S: Auto Generated Size Non-executable; Auto Translated Size Non-executable

1 - Not applicable for reused software
2 - Specified separately for Designed for Reuse and Not Designed for Reuse

COCOMO II allows for sizing in SLOC or function points. SEER-SEM and True S provide both of those along with additional size units. User-defined proxy sizes can be developed for any of the models and converted back to SLOC or function points. Future work can also be undertaken to develop model translations between the size input parameters. These would consist of rules or guidelines to convert size inputs between models, and can be supplemented with knowledge base settings.

3.3 Cost Factor Rosetta Stones

This section describes the mappings, or transformations, between cost factors in the different models. With this information, COCOMO II estimate inputs can be converted into corresponding SEER-SEM or True S (or PRICE S) inputs, or vice versa. It also illustrates differences in the models that help explain why estimates may vary. Top-level Rosetta Stones map the factors between the models, and the detailed ones map the individual ratings between the corresponding factors.

An integrated top-level Rosetta Stone for all of the COCOMO II factors is shown in Table 2. Most of the mappings between factors are one-to-one, but some are one-to-many (e.g., SEER-SEM has platform factor ratings split into target and host). In the case of True S, many of the COCOMO II factors have direct corollaries to sub-factors in aggregate True S factors. For example, the COCOMO personnel factors are represented as sub-factors under the aggregate True S factor for Development Team Complexity. Table 3 and Table 4 show the additional factors in SEER-SEM and True S for which there are no analogs in COCOMO II.

Table 2: Integrated Top-Level Rosetta Stone for COCOMO II Factors

COCOMO II Factor | SEER-SEM Factor(s) | True S Factor(s)

SCALE DRIVERS
Precedentedness | none | none
Development Flexibility | none | Operating Specification
Architecture/Risk Resolution | none | none
Team Cohesion | none | Development Team Complexity
Process Maturity | none 1 | Organizational Productivity - CMM Level

PRODUCT ATTRIBUTES
Required Software Reliability | Specification Level - Reliability | Operating Specification
Data Base Size | none | Code Size Non-executable
Product Complexity | Complexity (Staffing); Application Class Complexity | Functional Complexity
Required Reusability | Reusability Level Required; Software Impacted by Reuse | Design for Reuse
Documentation Match to Lifecycle Needs | none | Operating Specification

PLATFORM ATTRIBUTES
Execution Time Constraint | Time Constraints | Project Constraints - Communications and Timing
Main Storage Constraint | Memory Constraints | Project Constraints - Memory & Performance
Platform Volatility | Target System Volatility; Host System Volatility | Hardware Platform Availability 3

PERSONNEL ATTRIBUTES
Analyst Capability | Analyst Capability | Development Team Complexity - Capability of Analysts and Designers
Programmer Capability | Programmer Capability | Development Team Complexity - Capability of Programmers
Personnel Continuity | none | Development Team Complexity - Team Continuity
Application Experience | Application Experience | Development Team Complexity - Familiarity with Platform
Platform Experience | Development System Experience; Target System Experience | Development Team Complexity - Familiarity with Product
Language and Toolset Experience | Programmer's Language Experience | Development Team Complexity - Experience with Language

PROJECT ATTRIBUTES
Use of Software Tools | Software Tool Use | Design, Code and Test Tools
Multi-site Development | Multiple Site Development | Multi Site Development
Required Development Schedule | none 2 | Start and End Date

1 - The SEER-SEM Process Improvement factor rates the impact of improvement, not the CMM level
2 - Schedule constraints are handled differently
3 - A software assembly input factor
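As an illustration of how a top-level Rosetta Stone can be applied mechanically, the sketch below encodes a few of the Table 2 rows as a lookup table. The factor names follow the table; the dictionary structure and function name are our own illustrative scaffolding, not part of any vendor tool.

```python
# Subset of the Table 2 top-level mappings (COCOMO II -> SEER-SEM, True S).
# None means the target model has no corresponding factor.
TOP_LEVEL_ROSETTA = {
    "Required Software Reliability": {
        "SEER-SEM": ["Specification Level - Reliability"],
        "True S": ["Operating Specification"],
    },
    "Platform Volatility": {  # a one-to-many mapping
        "SEER-SEM": ["Target System Volatility", "Host System Volatility"],
        "True S": ["Hardware Platform Availability"],
    },
    "Precedentedness": {"SEER-SEM": None, "True S": None},
}

def map_factor(cocomo_factor, target_model):
    """Return the target model's factor(s) for a COCOMO II factor,
    or None when no analog exists."""
    return TOP_LEVEL_ROSETTA[cocomo_factor][target_model]

print(map_factor("Platform Volatility", "SEER-SEM"))
```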

Table 3: SEER-SEM Cost Factors with no COCOMO II Mapping

PERSONNEL CAPABILITIES AND EXPERIENCE
- Practices and Methods Experience

DEVELOPMENT SUPPORT ENVIRONMENT
- Modern Development Practices
- Logon thru Hardcopy Turnaround
- Terminal Response Time
- Resource Dedication
- Resource and Support Location
- Process Volatility

PRODUCT DEVELOPMENT REQUIREMENTS
- Requirements Volatility (Change) 1
- Test Level 2
- Quality Assurance Level 2
- Rehost from Development to Target

PRODUCT REUSABILITY
- Software Impacted by Reuse

DEVELOPMENT ENVIRONMENT COMPLEXITY
- Language Type (Complexity)
- Host Development System Complexity
- Application Class Complexity 3
- Process Improvement

TARGET ENVIRONMENT
- Special Display Requirements
- Real Time Code
- Security Requirements

1 - COCOMO II uses the Requirements Evolution and Volatility size adjustment factor
2 - Captured in the COCOMO II Required Software Reliability factor
3 - Captured in the COCOMO II Complexity factor

Table 4: True S Cost Factors with no COCOMO II Mapping

To be provided.

3.3.1 COCOMO II to SEER-SEM

Table 5 shows the detailed correspondence between COCOMO II and SEER-SEM factors, with guidelines to convert ratings between the two models for applicable factors. In some cases the SEER-SEM factors cover different ranges than COCOMO, and some of the conversions in Table 5 are best approximations. Not all factors have direct corollaries. The settings of the SEER-SEM factors may be defaulted according to project type and domain choices in the knowledge bases.

Table 5: COCOMO II to SEER-SEM Factors

COCOMO II Factor(s) | SEER-SEM Factor(s)

SCALE DRIVERS
Precedentedness | none
Development Flexibility | none
Architecture/Risk Resolution | none
Team Cohesion | none
Process Maturity | none 1

PRODUCT ATTRIBUTES
Required Software Reliability | Specification Level - Reliability 2
Data Base Size | none
Product Complexity | Complexity (Staffing) 3
Required Reusability | Reusability Level Required
Documentation Match to Lifecycle Needs | none

PLATFORM ATTRIBUTES
Execution Time Constraint | Time Constraints
Main Storage Constraint | Memory Constraints
Platform Volatility | Target System Volatility; Host System Volatility

PERSONNEL ATTRIBUTES
Analyst Capability | Analyst Capabilities
Programmer Capability | Programmer Capabilities
Personnel Continuity | none
Application Experience | Analyst's Application Experience
Platform Experience | Development System Experience; Target System Experience
Language and Toolset Experience | Programmer's Language Experience

PROJECT ATTRIBUTES
Use of Software Tools | Automated Tools Use
Multi-site Development | Multiple Site Development
Required Development Schedule | none 4

1 - The SEER-SEM Process Improvement factor rates the impact of improvement instead of the CMM level
2 - Related SEER-SEM factors include Test Level and Quality Assurance Level, which are also usually driven by reliability requirements
3 - SEER-SEM also has Application Class Complexity to rate at the program level, and other complexity factors for the development environment
4 - Schedule constraints are handled differently in the models
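A detailed Rosetta Stone can be applied the same way, one rating at a time. The sketch below shows the general mechanism with a hypothetical rating map for a single factor; the actual per-rating correspondences are those of Table 5, and because the two models' rating ranges differ, the function raises on ratings outside the mapped range rather than guessing.

```python
# Hypothetical rating-level map for one factor (COCOMO II -> SEER-SEM).
# Real conversions come from Table 5; some are best approximations
# because the two models' rating scales do not align exactly.
TIME_CONSTRAINT_MAP = {
    "Nominal": "Nominal",
    "High": "High",
    "Very High": "Very High",
    "Extra High": "Extra High",
}

def convert_rating(cocomo_rating, rating_map):
    """Translate a COCOMO II rating to the other model's nearest rating."""
    if cocomo_rating not in rating_map:
        raise ValueError(f"No defined conversion for {cocomo_rating!r}")
    return rating_map[cocomo_rating]

print(convert_rating("Very High", TIME_CONSTRAINT_MAP))
```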

3.3.2 COCOMO II to PRICE S and True S

Table 2 showed the top-level view of the COCOMO II to True S Rosetta Stone. Because the new product is currently being phased in, the Rosetta Stone will be refined for the next level of True S sub-factors and will be provided at a later date. The more complete Rosetta Stone in Table 6 and Table 7 shows the correspondence to the PRICE S model. The factor names shown are being replaced with the modernized terms in True S in order to elaborate the detailed Rosetta Stone between COCOMO II and True S.

Table 6: COCOMO II to PRICE S Rosetta Stone

COCOMO Factor(s) | PRICE S Factor(s)

SCALE DRIVERS
Precedentedness | none
Development Flexibility | none
Architecture/Risk Resolution | none
Team Cohesion | none
Process Maturity | none

PRODUCT ATTRIBUTES
Required Software Reliability | PLTFM
Data Base Size | none
Product Complexity | PROFAC, APPL
Required Reusability | CPLX (+0.3 at Very High, +0.5 at Extra High)
Documentation Match to Lifecycle Needs | none

PLATFORM ATTRIBUTES
Execution Time Constraint | UTIL - time (0.85 at Very High, 0.95 at Extra High)
Main Storage Constraint | UTIL - memory (0.85 at Very High, 0.95 at Extra High)
Platform Volatility | CPLX

Table 7: COCOMO II to PRICE S Rosetta Stone (Continued)

COCOMO Factor(s) | PRICE S Factor(s)

PERSONNEL ATTRIBUTES
Analyst Capability | CPLX1 (-0.1 at Very High)
Programmer Capability | CPLX1 (-0.1 at Very High)
Personnel Continuity | none
Application Experience | CPLX1 (-0.1 at Very High)
Platform Experience | CPLX1 (-0.1 at Very High)
Language and Toolset Experience | PROFAC (-0.1 at Very High)

PROJECT ATTRIBUTES
Use of Software Tools | CPLX1
Multi-site Development | none
Required Development Schedule | Development Start Date (mandatory)

Another aspect is "normalizing" True S against COCOMO II nominal conditions and matching their diseconomies of scale. A baseline normalization is needed against which factors can be changed to represent the projects already modeled with COCOMO II. Figure 2 shows the normalization between True S and COCOMO II.

Figure 2: Example of Normalizing True S and COCOMO II (effort in person-months vs. size in KSLOC for COCOMO II nominal and True S at complexity settings of 5, 5.5 and 6; True S parameters: Functional Complexity = 5-6, Operating Specification = 1.0, Organizational Productivity = 1.33, Development Team Complexity = 2.5)

The final determined values that most closely match True S to all-nominal conditions in COCOMO II are listed below:

- Functional Complexity is in the range of 5-6, and a value of 5.5 is suggested
- Organizational Productivity = 1.33
- Development Team Complexity = 2.5
- Operating Specification = 1.0
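The normalization itself can be framed as a small search: run True S over a range of sizes for each candidate complexity setting, and pick the setting whose effort curve is closest to the COCOMO II all-nominal curve. A sketch of that idea follows; true_s_effort is a stand-in stub (the real model is proprietary), while the COCOMO II nominal constants are the published COCOMO II.2000 defaults.

```python
import math

def cocomo_nominal_effort(ksloc, A=2.94, B=1.0997):
    # COCOMO II.2000 all-nominal effort in person-months
    return A * ksloc ** B

def true_s_effort(ksloc, complexity):
    # Stand-in stub: the real True S model is proprietary. Any monotone
    # function of size and complexity serves for this sketch.
    return 1.8 * complexity * ksloc ** 1.08

def best_complexity(candidates, sizes):
    """Pick the complexity whose curve minimizes squared log-error
    against the COCOMO II nominal curve over the given sizes."""
    def error(c):
        return sum((math.log(true_s_effort(s, c)) -
                    math.log(cocomo_nominal_effort(s))) ** 2 for s in sizes)
    return min(candidates, key=error)

print(best_complexity([5.0, 5.5, 6.0], sizes=[10, 50, 100, 500]))
```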

3.4 Phases and Activities

Reconciliation of the effort work breakdown structures (WBS) is necessary for valid comparison between models: if estimates are to be compared, they need to cover the same activities. The common estimate baseline consists of the elaboration and construction phases for software activities (per the COCOMO II default), as shown in Figure 3. Additionally, the NASA 94 data came in the COCOMO format and is assumed to cover those activities; hence a model that estimates more must have some activities subtracted out for a valid comparison. The correspondence of the common baseline and the core effort coverage of the different models is also shown in Figure 3.

Figure 3: Model Phases and Activities Coverage

- COCOMO II: phases Inception, Elaboration, Construction, Transition and Maintenance; activities Management, Environment/CM, Requirements, Design, Implementation, Assessment and Deployment.
- True S: activities Design, Programming, Data, SEPGM, Q/A and CFM across the Concept, System Requirements, Software Requirements, Preliminary Design, Detailed Design, Code/Unit Test, Integration & Test, Hardware/Software Integration, Field Test, System Integration & Test and Maintenance phases.
- SEER-SEM: activities Management, SW Reqmnts, Design, Code, Data Prep, Test, CM and QA across the System Requirements Design, Software Requirements Analysis, Preliminary Design, Detailed Design, Code and Unit Test, Component Integrate and Test, Program Test, and System Integrate Thru OT&E phases.
- Legend: core effort coverage per model; common estimate baseline; effort add-on as % of core coverage; effort add-on with revised model.

Due to the differences, the SEER-SEM and True S estimates were refined by subtracting the activities described below.

3.4.1 True S

True S provides a two-tier tree of estimates. The lower tier contains core engineering effort only, as shown in Figure 4; the figure shows which elements should be subtracted for the common estimate baseline. The upper tier is a superset line item that includes systems engineering and program management (SEPM) activities. A table of its outputs is shown in Table 8 with the corresponding items to subtract for the common baseline set of activities.

Figure 4: Sample True S Engineering Estimate with Effort Items to Subtract

Table 8: Sample True S SEPM-Level Estimate with Effort Items to Subtract

The SEPM-level labor requirement table for the sample software assembly (Engine Control) includes these line items, in months: Software Maintenance; Manage Project; Perform Configuration Management; Perform Joint Technical Reviews; Perform Quality Assurance; Plan and Oversee; Plan Software Development; Write Documentation; Analyze System Requirements; Design System; Perform Assembly Integration and Test; Software Requirements Analysis; Software Design; Code and Unit Test; Software Integration and Test; Software Qualification Test; Perform HW/SW Integration and Test; Perform Software Product Evaluations; Perform System Qualification Test. The items outside the common estimate baseline are subtracted from the total to produce the revised total.
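The WBS reconciliation reduces to simple bookkeeping over labor line items. The sketch below shows the subtraction with hypothetical line items and person-month values; which items fall outside the common baseline follows the markings in Figure 4 and Table 8, and the particular values and choices here are illustrative only.

```python
# Hypothetical SEPM-level line items in person-months. Items outside the
# common estimate baseline (per the Table 8 / Figure 4 markings) are removed.
estimate = {
    "Manage Project": 9.0,
    "Perform Quality Assurance": 4.0,
    "Software Requirements Analysis": 12.0,
    "Software Design": 20.0,
    "Code and Unit Test": 30.0,
    "Software Integration and Test": 18.0,
    "Perform System Qualification Test": 6.0,
}
outside_baseline = {"Manage Project", "Perform Quality Assurance",
                    "Perform System Qualification Test"}

total = sum(estimate.values())
subtracted = sum(v for k, v in estimate.items() if k in outside_baseline)
revised_total = total - subtracted
print(f"total={total}, subtracted={subtracted}, revised={revised_total}")
```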

3.4.2 SEER

Figure 5 shows a summarized SEER-SEM estimate and the items subtracted out in this analysis to make the estimates equivalent.

Figure 5: Sample SEER-SEM Estimate with Effort Items to Subtract (activity columns: Management, SW Reqmnts, Design, Code, Data Prep, Test, CM, QA, Total; rows: System Requirements Design, S/W Requirements Analysis, Preliminary Design, Detailed Design, Code and Unit Test, Component Integrate and Test, Program Test, System Integrate Thru OT&E, Development Total, Maintenance, Life Cycle Total)

3.5 Critical Domain Factors

The vendor models provide elaborations of reliability and complexity factors beyond what COCOMO II provides. These are critical domain factors of relevance to NASA flight projects. Table 9 shows how the models address them.

Table 9: Vendor Elaborations of Critical Domain Factors

COCOMO II: Required Software Reliability; Product Complexity
SEER-SEM 1: Specification Level - Reliability; Test Level; Quality Assurance Level; Complexity (Staffing); Language Type (Complexity); Host Development System Complexity; Application Class Complexity
True S: Operating Specification Level (platform and environment settings for required reliability, portability, structuring and documentation); Functional Complexity; Application Type; Language (including object-oriented)

1 - SEER-SEM factors are supplemented with, and may be impacted via, knowledge base settings for Platform, Application, Acquisition Method, Development Method, Development Standard, Class, and Component Type (COTS only)

SEER-SEM has an extensive set of knowledge base choices. Table 10 shows example knowledge bases applicable to NASA flight projects.

Table 10: Example SEER-SEM Knowledge Bases Relevant to NASA Projects

Platform knowledge bases: Avionics; Ground System Non Critical; Ground-Based Mission Critical; Manned Space; Unmanned Space
Application knowledge bases: Flight Systems

In True S the Operating Specification factor describes the intended operating environment and defines the degree of required reliability, portability, structuring and documentation. Table 11 lists the categories for space software. Additional categories for military environments may also apply.

Table 11: True S Operating Specification Choices for Space Projects

Space Software: Unmanned; Manned

4. Model Analysis Against NASA 94 Project Data

The research team was provided the NASA 94 set of projects. Of the 95 projects, only 13 are listed as flight projects. All of the remaining analysis is predicated on those 13 projects alone, except where noted otherwise (COCOMO II was also applied to five ground embedded projects). The data came in the COCOMO 81 format. The projects were converted to COCOMO II per the guidelines in [Reifer et al. 1999] and further converted to SEER-SEM or True S factors per the Rosetta Stones in this report. See Appendix A for examples of the original and transformed data.

The database covers flight and ground projects, and some ground projects are embedded. An analysis of the critical factor distributions for reliability and complexity indicates that flight projects, as expected, exhibit patterns of both higher reliability and complexity. Figure 6 and Figure 7 show the distributions of these factors in the database. These spreads are as expected, which also supports the contention that the projects provided in the COCOMO 81 format are well conditioned.

They are internally consistent and standardized in their reporting of these factors.

Figure 6: Reliability Distribution (percent of projects by RELY rating, from Very Low through Extra High, for flight, ground embedded and ground other projects)

Figure 7: Complexity Distribution (percent of projects by CPLX rating, from Very Low through Extra High, for flight, ground embedded and ground other projects)

4.1 Analysis Method

The process flow in Figure 8 shows the sequence of steps used in this analysis. Only the first pass through is described in this overall section. In subsequent iterations with refined and/or additional data, not all of the steps will be performed. For example, the need for calibration was amply demonstrated in the first round of analysis, and the uncalibrated model runs are not necessary in subsequent data iterations.

The sequences of the vendor tool runs will also vary slightly to reflect their recommended best practices.

Figure 8: Model Analysis Flow (starting from the NASA 94 database: apply the COCOMO 81 to COCOMO II Rosetta Stone; select relevant domain projects; COCOMO II path: uncalibrated analysis, calibration via Costar, outlier analysis, calibrated analysis; SEER-SEM path: apply the COCOMO II to SEER Rosetta Stone, uncalibrated analysis with additional factors defaulted, apply knowledge base settings, analysis with knowledge base settings, analysis with calibration and refined settings; True S path: apply the COCOMO II to True S Rosetta Stone, set additional factors for the application domain, analysis with application domain settings; consolidated analysis, then iterate as additional data arrives. Not all steps are performed on iterations 2-n.)

The model performances were evaluated with standard figures of merit per the equations below, based on comparisons between actual and estimated effort for n projects in a dataset:

    Relative Error (RE) = (Estimated Effort - Actual Effort) / Actual Effort
    Magnitude of Relative Error (MRE) = |Estimated Effort - Actual Effort| / Actual Effort
    Mean Magnitude of Relative Error (MMRE) = (Sum of MRE) / n
    Root Mean Square (RMS) = ((1/n) * Sum of (Estimated Effort - Actual Effort)^2)^(1/2)
    Prediction Level PRED(L) = k/n, where k = the number of projects in a set of n projects whose MRE <= L
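These figures of merit are straightforward to compute. A minimal sketch follows, using a hypothetical list of paired actual and estimated efforts for illustration.

```python
def accuracy_measures(actuals, estimates, pred_level=0.30):
    """Compute MMRE, RMS and PRED(L) from paired actual/estimated efforts."""
    n = len(actuals)
    mres = [abs(est - act) / act for act, est in zip(actuals, estimates)]
    mmre = sum(mres) / n
    rms = (sum((est - act) ** 2
               for act, est in zip(actuals, estimates)) / n) ** 0.5
    pred = sum(1 for mre in mres if mre <= pred_level) / n
    return mmre, rms, pred

# Hypothetical person-month data for illustration only
actuals = [100.0, 240.0, 60.0, 500.0]
estimates = [90.0, 300.0, 58.0, 420.0]
mmre, rms, pred30 = accuracy_measures(actuals, estimates, pred_level=0.30)
print(f"MMRE={mmre:.0%}, RMS={rms:.1f}, PRED(30)={pred30:.0%}")
```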

Each run consists of a logical group of projects for which estimates are compared to actuals using the above measures. The progressive effects of calibrations and other adjustments are evaluated this way. To help interpret some of the results, we also looked at the error distribution for any biases. All the numeric estimated values for each model run are contained in the calibration analysis spreadsheets provided with [USC-CSE 2006] for further investigation.

4.2 Outlier Analysis

Investigation of the project data and the calibration results identified a single outlier project. Its productivity is out of bounds by orders of magnitude with respect to the rest of the productivity spread. It is also the second smallest project (though the other small projects on its scale are in line with the rest of the productivity distribution). The data reporting is potentially suspect, or it may be indicative of a single individual or extremely small team. At very small project sizes the effects of individuals tend to predominate, and this could be an extremely productive and unencumbered individual or small team. On modern NASA projects with increased complexity and size, it is highly unlikely that a single individual will create an entire CSCI. Because of the large disparity in productivity and these other reasons, subsequent analyses are performed without the outlier in the dataset except where noted otherwise. It does not seem to be representative of projects we wish to estimate. For additional reference, [USC-CSE 2006] lists all results with the outlier included.

The dataset lists size as PSLOC. If this is physical SLOC, then the size is probably overstated with respect to the COCOMO II standard of logical source statements. USC has not been able to determine the exact unit of measurement or obtain further context information such as the language. If physical lines were counted, then a conversion factor can be used to estimate logical statements. Without that information, the current calibrations are predicated on the reported size units (i.e., the A constant is relevant when size is estimated using the same measure as the reported results).

4.3 COCOMO

We calibrated for each embedded domain and for the combined embedded flight domains. The results of the COCOMO II calibrations for the combined flight domains are in Figure 9. See [USC-CSE 2006] for the corresponding results when the outlier project is included. To expand the analysis space, we also assessed COCOMO against embedded ground projects. In that case the calibrated coefficient turned out to be less than the default COCOMO value. This result can be further explored with additional data or clarifications against the current dataset.
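One common way to calibrate the linear coefficient A, consistent with the calibration guidance in [Boehm et al. 2000], is to solve for A in log space while holding the size exponent and effort multipliers at their model values; the calibrated A is then the default scaled by the geometric mean of actual-to-estimated ratios. The sketch below illustrates this with hypothetical data.

```python
import math

def calibrate_A(actual_efforts, nominal_estimates, A_default):
    """Recalibrate the multiplicative constant A against local data.

    nominal_estimates are model estimates computed with A_default and each
    project's rated cost factors; B and the multipliers stay fixed.
    """
    logs = [math.log(act / est)
            for act, est in zip(actual_efforts, nominal_estimates)]
    return A_default * math.exp(sum(logs) / len(logs))

# Hypothetical data: the model underestimates by roughly half overall,
# so the calibrated A comes out near twice the default.
print(calibrate_A([200.0, 400.0, 120.0], [100.0, 210.0, 60.0], A_default=2.96))
```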

Figure 9: COCOMO Results for Embedded Flight Projects (Avionics + Science)

Dataset: NASA 94; Category: Avionics + Science; Mode: Embedded; Number of Projects = 12; calibrated A = 5.80. Scatterplot of calibrated effort estimates vs. actuals (person-months), with the effort prediction summary:

Measure | Uncalibrated | Calibrated
MMRE | 43% | 29%
PRED(10) | 9% | 36%
PRED(20) | 9% | 45%
PRED(30) | 18% | 64%
PRED(40) | 36% | 73%

4.4 SEER

The first SEER-SEM runs, for the uncalibrated and initial knowledge base settings, were a blind study in which actuals were not used. The Rosetta Stone was used to map to SEER-SEM inputs for the uncalibrated run; no further factors were touched. In the second round, USC chose three knowledge base settings without extensive knowledge of SEER-SEM. The previous COCOMO-derived factors overrode any conflicts in the settings. In the most recent iteration, SEER-SEM experts made further settings in the knowledge bases, with use of actuals for calibration, and including the outlier project.

4.5 True S

These results for True S represent the first round of applying an expert-determined mapping to True S without calibration. Table 12 shows the True S project settings. Additional calibrations are still being done with True S, and those results will be provided later.

Table 12: True S Project Settings

The settings table covers, for each of the 13 projects: Operating Specification, IPT Use, CMM level, Organizational Productivity, Project Constraints, Development Team Complexity (capability of analysts and programmers, team continuity, familiarity with product and platform, experience with language), Functional Complexity, Design for Reuse, Multiple Site Development, Hardware Platform Availability (an assembly input), Start and End Date, SLOC, and Design, Code and Test Tools. Eleven projects use the Unmanned Space operating specification and two use Military Airborne; Organizational Productivity values range from 0.82 to 0.95; personnel capability ratings range from Capable to Expert, with experience from under 2 years to over 10 years; and project sizes range from about 3,000 to 233,000 SLOC.

Some data uncertainties may impact True S as they do the other models. The True S results may need adjustment for the PSLOC size definition. There are also unknowns about the Operating Specification used: some of the projects may be manned space, but without that knowledge, subsystems cannot be assigned to manned or unmanned. The level of specification may also not be consistent and homogeneous across a spacecraft.

5. Effects of Calibration and Knowledge Bases

The effects of calibration, knowledge base settings and other adjustments are evaluated with the respective models.

5.1 COCOMO II

The improvement effects of calibration for the different project subgroups are clearly seen in the performance measures. MMRE improves in Figure 10 for all cases, and PRED(40) also improves in each case shown in Figure 11.

Figure 10: COCOMO II MMRE Calibration Effect (uncalibrated vs. calibrated MMRE for Flight Avionics Embedded, Flight Science Embedded, Flight (All) and Ground Embedded project types)

Figure 11: COCOMO II PRED(40) Calibration Effect (uncalibrated vs. calibrated PRED(40) for the same project types)

A summary of the COCOMO II calibrations for the different sub-groups is shown in Table 13. The same information with the outlier project included is provided in [USC-CSE 2006].

Table 13: Summary of COCOMO II Linear Effort Calibrations

Project Group (# of Projects) | Calibrated Coefficient
Flight Avionics Embedded (10) | A = 6.13
Flight Science Embedded (2) | A = 4.38
Flight (All) (12) | A = 5.80
Ground Embedded (5) | A =

5.2 SEER

Similar progressive improvement trends were exhibited with the SEER-SEM model runs in Figure 12 and Figure 13. The first run used the COCOMO parameter settings translated into SEER-SEM with no further adjustments (the uncalibrated case). The next progressive improvement was the setting of knowledge bases by USC personnel without extensive knowledge of SEER-SEM, which considerably improved the model performance. The last set was calibrated by SEER-SEM personnel with further adjustments to the knowledge base settings and project-specific adjustments.

Figure 12: SEER-SEM MMRE Progressive Adjustment Effects (uncalibrated, initial knowledge base settings, and calibrated/project-adjusted runs for Flight Avionics Embedded and Flight (All) project types)

Figure 13: SEER-SEM PRED(40) Progressive Adjustment Effects (same runs and project types)

One conclusion from the SEER-SEM analysis is that although statistical calibration helps, it is very important to properly characterize the technical and programmatic characteristics of the software being estimated. The SEER-SEM results, both uncalibrated and calibrated, improved significantly with more accurate information about platform, application, effective size and other parameters. It is suspected that performance could be even better, without calibration, given a complete technical description of the software modules and a "full-up" estimate.

5.3 True S

In the True S model the factor Application Type serves as a domain setting, similar to the SEER-SEM knowledge base settings. Setting Application Type affects other factors, including:

- Functional Complexity
- Operating Specification
- Development Team Productivity
- sizing-related parameters

True S was handled somewhat differently than SEER-SEM. Instead of choosing a particular application type to preset these factors, other factors were independently adjusted to represent the detailed project characteristics. The factor adjustments used for input in the True S runs are shown in Table 12.

5.4 Model Performance Summaries

A summary of the model performances in their last runs (COCOMO II calibrated, SEER-SEM with refined knowledge base settings, True S with application domain settings) is shown in Table 14. A scatterplot of the effort estimates vs. actuals is shown in Figure 14.

Table 14: Model Performance Summaries

Measure | Model 1 | Model 2 | Model 3
MMRE | 29% | 39% | 49%
PRED(10) | 36% | 20% | 17%
PRED(20) | 45% | 50% | 42%
PRED(30) | 64% | 50% | 50%
PRED(40) | 73% | 60% | 58%

Figure 14: Effort Estimates vs. Actuals for All Models (estimated vs. actual effort in person-months for Models 1, 2 and 3)

5.5 Additional Analysis Runs

This analysis will be repeated as more data is received on the research project. As described above, only a subset of the steps will be performed in the next iterations of analysis. The process may also vary if project data is received in multiple model formats, e.g., using the full set of SEER-SEM or True S parameters and bypassing the Rosetta Stones from COCOMO II.

A minor revision of the first round of results will be done first. This is necessary because Galorath has provided refined data from the NASA 94 dataset: in particular, the equivalent size of three of the flight projects is lower than the values used in the initial runs. This largely explains why Galorath was able to include the apparent outlier project in their final SEER-SEM run. The other model analyses will be re-performed, but a quick assessment of the differences indicates the overall results will vary negligibly. More importantly, the root cause of the data discrepancy will be investigated. Further impacts on the overall results and conclusions, if any, will be reported then.

A data collection initiative is also underway at NASA to collect more modern data for this project. On the NASA MOCA research grant, USC has received actuals on the recent Shuttle Abort and Flight Management (SAFM) project, and it will be incorporated into the analysis [Madachy-Boehm 2006].

6. Conclusions and Future Work

This paper has presented an overview of software cost models with a focus on critical flight software. The primary cost models were assessed against a relevant database of NASA projects and performed well, particularly given the absence of contextual data and potential flaws in the factor transformations. When using the NASA 94 dataset, calibration and knowledge base judgments for the domain improved all model performance versus using default parameter values.

This study was performed by persons highly familiar with COCOMO but not necessarily with SEER-SEM or True S. The vendors of these models provided minor support, but do not yet certify or sanction the data or information contained in this report. Specific vendor concerns include:

- the study was limited to a COCOMO viewpoint only
- the current Rosetta Stones need review and may be weak translators from the original data
- the results are not indicative of model performance due to ignored parameters
- risk and uncertainty were ground-ruled out
- data sanity checking is needed.

NASA flight projects are typified by extremely high reliability and complexity. Characterizations of the database projects in terms of these important descriptors provided useful and interesting results. Distributions of factor ratings for complexity and reliability showed relevant patterns across the subgroups in the database. They also helped to confirm that the COCOMO factor ratings were done consistently across the projects and adhere to the COCOMO rating criteria. All models support effort variance due to these factors, but the vendor models provide additional elaborations of these factors and domain-specific defaults.

Within the COCOMO II scope and subset of analyses, it can be concluded that the overall embedded flight domain calibration with this data is a linear coefficient of approximately A = 6. The value is slightly less for embedded science vs. avionics projects, but the sample size for science projects was extremely small, and more data should be incorporated for a more robust and credible calibration.

Successive experiments with the SEER-SEM model illustrated that the model performance measures markedly improved when incorporating knowledge base information for the domains. Simple educated guesses on the knowledge base choices, made without extensive SEER-SEM knowledge, produced far better estimates than strictly uncalibrated estimates.

The initial uncalibrated runs from COCOMO II and SEER-SEM both underestimated the projects by approximately 50% overall. That is also reflected in the calibrated COCOMO II coefficient being about twice the default (A of about 6 vs. A = 2.96).

For all models (COCOMO II, SEER-SEM, True S), calibration against the different subgroups exhibited nearly equivalent trends for embedded flight projects. The model performance measures for the individual flight groups (avionics or science) or combined together (avionics plus science) were about the same, and the improvement trends between uncalibrated and calibrated were identical when the outlying project was excluded.

Major future thrusts include refining and expanding the project dataset, and updating the COCOMO model(s) for flight applications. The calibrations are all derived with respect to the reported size termed PSLOC (likely physical lines). Further investigations to directly capture logical source statements, or to use conversion factors, may yield different calibrations for using COCOMO II with its current size definitions.

The vendor models provide more granular factors for the overall effects captured in the COCOMO II Complexity (CPLX) factor. One consideration is to elaborate the current COCOMO definition with more levels of detail specifically interpreted for critical flight project applications. The COCOMO II Required Software Reliability (RELY) factor is also being elaborated for high dependability and security applications, and that research will be brought to bear on the current effort.

The COCOMO II and COQUALMO models are being updated on this project for new technologies, IV&V techniques and new mission requirements (e.g., increased reliability for security and safety). Additional project data and Delphi studies are both being used. The revised models will undergo the same analysis and be re-calibrated for flight projects with the additional data.

One option to expand COCOMO II is to embed a knowledge-based capability into the model specifically for NASA projects. An example could be based on the observed reliability and complexity factor distributions: a sub-domain is selected by the user (flight science, flight avionics, ground embedded, etc.) and factor settings are defaulted in the model.

This study has been helpful in reducing sources of misinterpretation across the models, and the current set of Rosetta Stones and other model comparisons has provided a usable framework for analysis, but considerably more should be done, including:

- developing two-way and/or multiple-way Rosetta Stones
- explicitly identifying residual sources of uncertainty across models and their estimates not fully addressable by Rosetta Stones
- addressing factors unique to some models but not others
- developing translations between the size input parameters
- addressing COTS estimation and sizing
- many-to-many factor mappings
- partial factor-to-factor mappings
- similar factors that affect estimates in different ways: linear, multiplicative, exponential, other
- imperfections in data: subjective rating scales, code counting, counting of other size factors, effort/schedule counting, endpoint definitions and interpretations, WBS element definitions and interpretations.

The study participants welcome sponsorship of further joint efforts to pin down sources of uncertainty, and to more explicitly identify the limits to comparing estimates across models.

A more rigorous review of the detailed cost factor Rosetta Stones should be completed. Some of the future work rests on the vendors: they will be further reviewing their specific sections and clarifying the USC interpretations of their work breakdown structures, factor interpretations and other aspects. Additional approaches for calibration are also being evaluated. Remaining work on the Rosetta Stones includes elaborating the detailed Rosetta Stone for True S, and rigorous review of all the top-level and detailed Rosetta Stones.

The analysis process can also be improved on several fronts. The recommended sequence for vendor tool usage is to first set knowledge bases before setting the COCOMO-translated parameters. It is also desirable to capture estimate inputs in all three model formats, and to try different translation directionalities. This analysis has also identified additional information on the projects that could be useful. The vendors are involved in this aspect, and the model analyses are likely to be re-iterated for several reasons, including additional or refined data assumptions.

In practice no one model should be preferred over all others. The key to arriving at sound estimates is to use a variety of methods and tools and then investigate the reasons why the estimates provided by one might differ significantly from those provided by another. If the practitioner can explain such differences to a reasonable level of satisfaction, then it is likely that he or she has a good grasp of the factors driving the costs of the project at hand, and will be better equipped to support the project planning and control functions performed by management.

Future work involves repeating the analysis with updated calibrations, revised domain settings, improved models and new data. It is highly desirable to incorporate more recent NASA project data in the cost model analyses. The MOCA project collected actuals on the SAFM project, more data is being solicited, and it will all be used to update the analysis and support research demands for current data. Other data concerns include the units of size measurement in the NASA 94 dataset, which should be investigated for the reasons previously stated. With modern and more comprehensive data, COCOMO II and the other models can be further improved and tailored as necessary for NASA project usage.

6.1 References

[Boehm 1981] Boehm B., Software Engineering Economics, Englewood Cliffs, NJ, Prentice-Hall, 1981

[Boehm et al. 2000] Boehm B., Abts C., Brown W., Chulani S., Clark B., Horowitz E., Madachy R., Reifer D., Steece B., Software Cost Estimation with COCOMO II, Prentice-Hall, 2000

[Boehm et al. 2000b] Boehm B., Abts C., Chulani S., Software Development Cost Estimation Approaches - A Survey, USC-CSE, 2000

[Boehm et al. 2004] Boehm B., Bhuta J., Garlan D., Gradman E., Huang L., Lam A., Madachy R., Medvidovic N., Meyer K., Meyers S., Perez G., Reinholtz K.L., Roshandel R., Rouquette N., Using Empirical Testbeds to Accelerate Technology Maturity and Transition: The SCRover Experience, Proceedings of the 2004 International Symposium on Empirical Software Engineering, IEEE Computer Society, 2004

[Galorath 2005] Galorath Inc., SEER-SEM User Manual, 2005

[Galorath-Evans 2006] Galorath D., Evans M., Software Sizing, Estimation, and Risk Management, Auerbach Publications, 2006

[Jensen 1983] Jensen R., An Improved Macrolevel Software Development Resource Estimation Model, Proceedings of the 5th ISPA Conference, 1983

[Lum et al. 2001] Lum K., Powell J., Hihn J., Validation of Spacecraft Software Cost Estimation Models for Flight and Ground Systems, JPL Report, 2001

[Madachy 1997] Madachy R., Heuristic Risk Assessment Using Cost Factors, IEEE Software, May 1997

[Madachy-Boehm 2006] Madachy R., Boehm B., A Model of Options and Costs for Reliable Autonomy (MOCA) Final Report, report submitted to NASA for USRA contract #4481, 2006

[Park 1988] Park R., The Central Equations of the PRICE Software Cost Model, COCOMO User's Group Meeting, 1988

[PRICE 2005] PRICE Systems, True S User Manual, 2005

[Reifer et al. 1999] Reifer D., Boehm B., Chulani S., The Rosetta Stone - Making COCOMO 81 Estimates Work with COCOMO II, Crosstalk, 1999

[USC-CSE 2006] USC Center for Software Engineering, Model Comparison Report, Report to NASA AMES, Draft Version, July 2006

7. Acknowledgements

This work is supported by the NASA Ames Research Center Cooperative Agreement No. NNA06CB29A for Software Risk Advisory Tools. Our colleagues helping on this research at NASA include Mike Lowry, Julian Richardson and Tim Menzies. It also would not have been possible without the contributions of other colleagues and generous organizations. We particularly thank Galorath Inc. and PRICE Systems for providing us with their tool information and people. Thanks are due to all the people mentioned below.

Tim Hohmann at Galorath Inc. was our primary technical contact for SEER-SEM support. Additional assistance and support from Galorath Inc. came from Dan Galorath, Karen McRitchie and Bob Hunt.

David Seaver was our primary contact and provided technical support from PRICE Systems, and James Otte also provided early assistance. Jairus Hihn and Sherry Stukes from NASA JPL supported this analysis. Dan Ligett from Softstar Systems graciously provided a calibration spreadsheet that was modified for this research.

8. Appendix A: NASA 94 Original and Transformed Data

Table 15: Original COCOMO 1981 Data for NASA 94 Avionics, Embedded Projects

project | rely data cplx time stor virt turn acap aexp pcap vexp lexp modp tool sced
hst | h vh vh xh xh h h n n n l l n n h
hst | h h h vh xh h h h h h h h h n n
spl | h l vh vh xh l n vh vh vh vl vl h h n
spl | h l vh vh xh l n vh vh vh vl vl h h n
sts | vh h vh xh xh n n h h h h h h n h
sts | vh h vh xh xh n l h h h h h h n h
sts | vh h xh xh xh n n h h h h h h n h
gal | vh l vh vh xh l l h l n vl l l h h
sts | vh h vh xh xh n n h h h h h h n h
gro | h n vh vh vh h h n n n l l n n h
gro | h n vh vh vh h h n n n l l n n h

All projects are in the Avionics category, flight (f), embedded mode. Additional columns in the dataset are the record number, center, year, equivalent physical KSLOC (equivphyskloc) and actual effort (act_effort).

Table 16: COCOMO II Transformed Data for NASA 94 All Embedded Projects



More information

COCOMO-SCORM Interactive Courseware Project Cost Modeling

COCOMO-SCORM Interactive Courseware Project Cost Modeling COCOMO-SCORM Interactive Courseware Project Cost Modeling Roger Smith & Lacey Edwards SPARTA Inc. 13501 Ingenuity Drive, Suite 132 Orlando, FL 32826 Roger.Smith, Lacey.Edwards @Sparta.com Copyright 2006

More information

Topics. Project plan development. The theme. Planning documents. Sections in a typical project plan. Maciaszek, Liong - PSE Chapter 4

Topics. Project plan development. The theme. Planning documents. Sections in a typical project plan. Maciaszek, Liong - PSE Chapter 4 MACIASZEK, L.A. and LIONG, B.L. (2005): Practical Software Engineering. A Case Study Approach Addison Wesley, Harlow England, 864p. ISBN: 0 321 20465 4 Chapter 4 Software Project Planning and Tracking

More information

Cost Estimation Strategies COST ESTIMATION GUIDELINES

Cost Estimation Strategies COST ESTIMATION GUIDELINES Cost Estimation Strategies Algorithmic models (Rayleigh curve Cost in week t = K a t exp(-a t 2 ) Expert judgment (9 step model presented later) Analogy (Use similar systems) Parkinson (Work expands to

More information

Handbook for Software Cost Estimation

Handbook for Software Cost Estimation JPL D-26303, Rev. 0 Handbook for Software Cost Estimation Prepared by: Karen Lum Michael Bramble Jairus Hihn John Hackney Mori Khorrami Erik Monson Document Custodian: Jairus Hihn Approved by: Frank Kuykendall

More information

IDC Reengineering Phase 2 & 3 US Industry Standard Cost Estimate Summary

IDC Reengineering Phase 2 & 3 US Industry Standard Cost Estimate Summary SANDIA REPORT SAND2015-20815X Unlimited Release January 2015 IDC Reengineering Phase 2 & 3 US Industry Standard Cost Estimate Summary Version 1.0 James Mark Harris, Robert M. Huelskamp Prepared by Sandia

More information

Article 3, Dealing with Reuse, explains how to quantify the impact of software reuse and commercial components/libraries on your estimate.

Article 3, Dealing with Reuse, explains how to quantify the impact of software reuse and commercial components/libraries on your estimate. Estimating Software Costs This article describes the cost estimation lifecycle and a process to estimate project volume. Author: William Roetzheim Co-Founder, Cost Xpert Group, Inc. Estimating Software

More information

Fundamentals of Measurements

Fundamentals of Measurements Objective Software Project Measurements Slide 1 Fundamentals of Measurements Educational Objective: To review the fundamentals of software measurement, to illustrate that measurement plays a central role

More information

Cost Estimation Driven Software Development Process

Cost Estimation Driven Software Development Process Cost Estimation Driven Software Development Process Orsolya Dobán, András Pataricza Budapest University of Technology and Economics Department of Measurement and Information Systems Pázmány P sétány 1/D

More information

Assessing Quality Processes with ODC COQUALMO

Assessing Quality Processes with ODC COQUALMO Assessing Quality Processes with ODC COQUALMO Ray Madachy, Barry Boehm USC {madachy, boehm}@usc.edu 2008 International Conference on Software Process May 10, 2008 USC-CSSE 1 Introduction Cost, schedule

More information

Measurement Information Model

Measurement Information Model mcgarry02.qxd 9/7/01 1:27 PM Page 13 2 Information Model This chapter describes one of the fundamental measurement concepts of Practical Software, the Information Model. The Information Model provides

More information

Cost Estimation for Secure Software & Systems

Cost Estimation for Secure Software & Systems Background Cost Estimation for Secure Software & Systems Ed Colbert Dr. Barry Boehm Center for Systems & Software Engineering, University of Southern California, 941 W. 37th Pl., Sal 328, Los Angeles,

More information

Software Migration Project Cost Estimation using COCOMO II and Enterprise Architecture Modeling

Software Migration Project Cost Estimation using COCOMO II and Enterprise Architecture Modeling Software Migration Project Cost Estimation using COCOMO II and Enterprise Architecture Modeling Alexander Hjalmarsson 1, Matus Korman 1 and Robert Lagerström 1, 1 Royal Institute of Technology, Osquldas

More information

Process Models and Metrics

Process Models and Metrics Process Models and Metrics PROCESS MODELS AND METRICS These models and metrics capture information about the processes being performed We can model and measure the definition of the process process performers

More information

Project Planning and Project Estimation Techniques. Naveen Aggarwal

Project Planning and Project Estimation Techniques. Naveen Aggarwal Project Planning and Project Estimation Techniques Naveen Aggarwal Responsibilities of a software project manager The job responsibility of a project manager ranges from invisible activities like building

More information

(Refer Slide Time: 01:52)

(Refer Slide Time: 01:52) Software Engineering Prof. N. L. Sarda Computer Science & Engineering Indian Institute of Technology, Bombay Lecture - 2 Introduction to Software Engineering Challenges, Process Models etc (Part 2) This

More information

Software Engineering. Dilbert on Project Planning. Overview CS / COE 1530. Reading: chapter 3 in textbook Requirements documents due 9/20

Software Engineering. Dilbert on Project Planning. Overview CS / COE 1530. Reading: chapter 3 in textbook Requirements documents due 9/20 Software Engineering CS / COE 1530 Lecture 4 Project Management Dilbert on Project Planning Overview Reading: chapter 3 in textbook Requirements documents due 9/20 1 Tracking project progress Do you understand

More information

Modern Tools to Support DoD Software Intensive System of Systems Cost Estimation

Modern Tools to Support DoD Software Intensive System of Systems Cost Estimation Modern Tools to Support DoD Software Intensive System of Systems Cost Estimation Jo Ann Lane and Barry Boehm University of Southern California Center for Systems and Software Engineering Abstract Many

More information

Software Cost and Productivity Model

Software Cost and Productivity Model Software Cost and Productivity Model presented to Ground Systems Architecture Workshop 2004 Manhattan Beach, California presented by J. E. Gayek, L. G. Long, K. D. Bell, R. M. Hsu, and R. K. Larson* The

More information

Software Cost Estimation Metrics Manual for Defense Systems

Software Cost Estimation Metrics Manual for Defense Systems Software Cost Estimation Metrics Manual for Defense Systems Brad Clark USC Ray Madachy Naval Postgraduate School 29 th International Forum on COCOMO and Systems/Software Cost Modeling October 22, 2014

More information

COCOMO II and Big Data

COCOMO II and Big Data COCOMO II and Big Data Rachchabhorn Wongsaroj*, Jo Ann Lane, Supannika Koolmanojwong, Barry Boehm *Bank of Thailand and Center for Systems and Software Engineering Computer Science Department, Viterbi

More information

Dr. Barry W. Boehm USC Center for Software Engineering

Dr. Barry W. Boehm USC Center for Software Engineering 7th Annual Practical Software and Systems Measurement Users Group Conference Keystone, CO July 16, 2003 Dr. Barry W. Boehm USC 1 Workshop Agenda Day 1 (1:30 AM 5:00 PM 7/16) Next-level tutorial Review

More information

6.0 RELIABILITY ALLOCATION

6.0 RELIABILITY ALLOCATION 6.0 RELIABILITY ALLOCATION Reliability Allocation deals with the setting of reliability goals for individual subsystems such that a specified reliability goal is met and the hardware and software subsystem

More information

DOMAIN-BASED EFFORT DISTRIBUTION MODEL FOR SOFTWARE COST ESTIMATION. Thomas Tan

DOMAIN-BASED EFFORT DISTRIBUTION MODEL FOR SOFTWARE COST ESTIMATION. Thomas Tan DOMAIN-BASED EFFORT DISTRIBUTION MODEL FOR SOFTWARE COST ESTIMATION by Thomas Tan A Dissertation Presented to the FACULTY OF THE USC GRADUATE SCHOOL UNIVERSITY OF SOUTHERN CALIFORNIA In Partial Fulfillment

More information

Summary of GAO Cost Estimate Development Best Practices and GAO Cost Estimate Audit Criteria

Summary of GAO Cost Estimate Development Best Practices and GAO Cost Estimate Audit Criteria Characteristic Best Practice Estimate Package Component / GAO Audit Criteria Comprehensive Step 2: Develop the estimating plan Documented in BOE or Separate Appendix to BOE. An analytic approach to cost

More information

Software Metrics & Software Metrology. Alain Abran. Chapter 4 Quantification and Measurement are Not the Same!

Software Metrics & Software Metrology. Alain Abran. Chapter 4 Quantification and Measurement are Not the Same! Software Metrics & Software Metrology Alain Abran Chapter 4 Quantification and Measurement are Not the Same! 1 Agenda This chapter covers: The difference between a number & an analysis model. The Measurement

More information

Identifying Factors Affecting Software Development Cost

Identifying Factors Affecting Software Development Cost Identifying Factors Affecting Software Development Cost Robert Lagerström PhD Student at Industrial Information and Control Systems School of Electrical Engineering KTH Royal Institute of Technology Stockholm,

More information

Software Development Cost Estimation Approaches A Survey 1. Barry Boehm, Chris Abts. University of Southern California. Los Angeles, CA 90089-0781

Software Development Cost Estimation Approaches A Survey 1. Barry Boehm, Chris Abts. University of Southern California. Los Angeles, CA 90089-0781 Software Development Cost Estimation Approaches A Survey 1 Barry Boehm, Chris Abts University of Southern California Los Angeles, CA 90089-0781 Sunita Chulani IBM Research 650 Harry Road, San Jose, CA

More information

Contents. Today Project Management. Project Management. Last Time - Software Development Processes. What is Project Management?

Contents. Today Project Management. Project Management. Last Time - Software Development Processes. What is Project Management? Contents Introduction Software Development Processes Project Management Requirements Engineering Software Construction Group processes Quality Assurance Software Management and Evolution Last Time - Software

More information

COCOMO II Model Definition Manual

COCOMO II Model Definition Manual COCOMO II Model Definition Manual Acknowledgments COCOMO II is an effort to update the well-known COCOMO (Constructive Cost Model) software cost estimation model originally published in Software Engineering

More information

The Challenge of Productivity Measurement

The Challenge of Productivity Measurement Proceedings: Pacific Northwest Software Quality Conference, 2006 The Challenge of Productivity Measurement David N. Card Q-Labs, Inc [email protected] Biography- David N. Card is a fellow of Q-Labs, a subsidiary

More information

Speeding up Level 3 CMM Certification Process with Estimation Tool General Dynamics Calgary

Speeding up Level 3 CMM Certification Process with Estimation Tool General Dynamics Calgary Speeding up Level 3 CMM Certification Process with Estimation Tool General Dynamics Calgary Implementing a commercial estimation software tool has eliminated one to two years of data collection, quickening

More information

Extending CMMI Level 4/5 Organizational Metrics Beyond Software Development

Extending CMMI Level 4/5 Organizational Metrics Beyond Software Development Extending CMMI Level 4/5 Organizational Metrics Beyond Software Development CMMI Technology Conference and User Group Denver, Colorado 14-17 November 2005 Linda Brooks Northrop Grumman Corporation Topics

More information

Supporting Workflow Overview. CSC532 Fall06

Supporting Workflow Overview. CSC532 Fall06 Supporting Workflow Overview CSC532 Fall06 Objectives: Supporting Workflows Define the supporting workflows Understand how to apply the supporting workflows Understand the activities necessary to configure

More information

Software Engineering. Reading. Effort estimation CS / COE 1530. Finish chapter 3 Start chapter 5

Software Engineering. Reading. Effort estimation CS / COE 1530. Finish chapter 3 Start chapter 5 Software Engineering CS / COE 1530 Lecture 5 Project Management (finish) & Design CS 1530 Software Engineering Fall 2004 Reading Finish chapter 3 Start chapter 5 CS 1530 Software Engineering Fall 2004

More information

Application of software product quality international standards through software development life cycle

Application of software product quality international standards through software development life cycle Central Page 284 of 296 Application of software product quality international standards through software development life cycle Mladen Hosni, Valentina Kirinić Faculty of Organization and Informatics University

More information

To introduce software process models To describe three generic process models and when they may be used

To introduce software process models To describe three generic process models and when they may be used Software Processes Objectives To introduce software process models To describe three generic process models and when they may be used To describe outline process models for requirements engineering, software

More information

What is a life cycle model?

What is a life cycle model? What is a life cycle model? Framework under which a software product is going to be developed. Defines the phases that the product under development will go through. Identifies activities involved in each

More information

SOFTWARE CONFIGURATION MANAGEMENT GUIDEBOOK

SOFTWARE CONFIGURATION MANAGEMENT GUIDEBOOK Office of Safety and Mission Assurance NASA-GB-9503 SOFTWARE CONFIGURATION MANAGEMENT GUIDEBOOK AUGUST 1995 National Aeronautics and Space Administration Washington, D.C. 20546 PREFACE The growth in cost

More information

Software Development Life Cycle

Software Development Life Cycle 4 Software Development Life Cycle M MAJOR A J O R T TOPICSO P I C S Objectives... 52 Pre-Test Questions... 52 Introduction... 53 Software Development Life Cycle Model... 53 Waterfall Life Cycle Model...

More information

Software Cost Estimation: A Tool for Object Oriented Console Applications

Software Cost Estimation: A Tool for Object Oriented Console Applications Software Cost Estimation: A Tool for Object Oriented Console Applications Ghazy Assassa, PhD Hatim Aboalsamh, PhD Amel Al Hussan, MSc Dept. of Computer Science, Dept. of Computer Science, Computer Dept.,

More information

Lecture Objectives. Software Life Cycle. Software Engineering Layers. Software Process. Common Process Framework. Umbrella Activities

Lecture Objectives. Software Life Cycle. Software Engineering Layers. Software Process. Common Process Framework. Umbrella Activities Software Life Cycle Lecture Objectives What happens in the life of software To look at the life cycle of a software To understand the software process and its related elements To relate to the different

More information

COCOMO (Constructive Cost Model)

COCOMO (Constructive Cost Model) COCOMO (Constructive Cost Model) Seminar on Software Cost Estimation WS 2002 / 2003 presented by Nancy Merlo Schett Requirements Engineering Research Group Department of Computer Science University of

More information

Comparison of SDLC-2013 Model with Other SDLC Models by Using COCOMO

Comparison of SDLC-2013 Model with Other SDLC Models by Using COCOMO International Journal of Emerging Science and Engineering (IJESE) Comparison of SDLC-2013 Model with Other SDLC Models by Using COCOMO Naresh Kumar, Pinky Chandwal Abstract There exist a large number of

More information

CHAPTER_3 SOFTWARE ENGINEERING (PROCESS MODELS)

CHAPTER_3 SOFTWARE ENGINEERING (PROCESS MODELS) CHAPTER_3 SOFTWARE ENGINEERING (PROCESS MODELS) Prescriptive Process Model Defines a distinct set of activities, actions, tasks, milestones, and work products that are required to engineer high quality

More information

SoftwareCostEstimation. Spring,2012

SoftwareCostEstimation. Spring,2012 SoftwareCostEstimation Spring,2012 Chapter 3 SOFTWARE COST ESTIMATION DB Liu Software Cost Estimation INTRODUCTION Estimating the cost of a software product is one of the most difficult and error-prone

More information

BCS THE CHARTERED INSTITUTE FOR IT. BCS HIGHER EDUCATION QUALIFICATIONS BCS Level 6 Professional Graduate Diploma in IT SOFTWARE ENGINEERING 2

BCS THE CHARTERED INSTITUTE FOR IT. BCS HIGHER EDUCATION QUALIFICATIONS BCS Level 6 Professional Graduate Diploma in IT SOFTWARE ENGINEERING 2 BCS THE CHARTERED INSTITUTE FOR IT BCS HIGHER EDUCATION QUALIFICATIONS BCS Level 6 Professional Graduate Diploma in IT SOFTWARE ENGINEERING 2 EXAMINERS REPORT Friday 2 nd October 2015 Answer any THREE

More information

Incorporating Data Mining Techniques on Software Cost Estimation: Validation and Improvement

Incorporating Data Mining Techniques on Software Cost Estimation: Validation and Improvement Incorporating Data Mining Techniques on Software Cost Estimation: Validation and Improvement 1 Narendra Sharma, 2 Ratnesh Litoriya Department of Computer Science and Engineering Jaypee University of Engg

More information

COCOMO II Model Definition Manual

COCOMO II Model Definition Manual COCOMO II Model Definition Manual Version 1.4 - Copyright University of Southern California Acknowledgments This work has been supported both financially and technically by the COCOMO II Program Affiliates:

More information

Software Engineering. Introduction. Software Costs. Software is Expensive [Boehm] ... Columbus set sail for India. He ended up in the Bahamas...

Software Engineering. Introduction. Software Costs. Software is Expensive [Boehm] ... Columbus set sail for India. He ended up in the Bahamas... Software Engineering Introduction... Columbus set sail for India. He ended up in the Bahamas... The economies of ALL developed nations are dependent on software More and more systems are software controlled

More information

Simulation for Business Value and Software Process/Product Tradeoff Decisions

Simulation for Business Value and Software Process/Product Tradeoff Decisions Simulation for Business Value and Software Process/Product Tradeoff Decisions Raymond Madachy USC Center for Software Engineering Dept. of Computer Science, SAL 8 Los Angeles, CA 90089-078 740 570 [email protected]

More information

Software Engineering. Software Processes. Based on Software Engineering, 7 th Edition by Ian Sommerville

Software Engineering. Software Processes. Based on Software Engineering, 7 th Edition by Ian Sommerville Software Engineering Software Processes Based on Software Engineering, 7 th Edition by Ian Sommerville Objectives To introduce software process models To describe three generic process models and when

More information

Software Development Cost and Time Forecasting Using a High Performance Artificial Neural Network Model

Software Development Cost and Time Forecasting Using a High Performance Artificial Neural Network Model Software Development Cost and Time Forecasting Using a High Performance Artificial Neural Network Model Iman Attarzadeh and Siew Hock Ow Department of Software Engineering Faculty of Computer Science &

More information

Deducing software process improvement areas from a COCOMO II-based productivity measurement

Deducing software process improvement areas from a COCOMO II-based productivity measurement Deducing software process improvement areas from a COCOMO II-based productivity measurement Lotte De Rore, Monique Snoeck, Geert Poels, Guido Dedene Abstract At the SMEF2006 conference, we presented our

More information

A Comparative Evaluation of Effort Estimation Methods in the Software Life Cycle

A Comparative Evaluation of Effort Estimation Methods in the Software Life Cycle DOI 10.2298/CSIS110316068P A Comparative Evaluation of Effort Estimation Methods in the Software Life Cycle Jovan Popović 1 and Dragan Bojić 1 1 Faculty of Electrical Engineering, University of Belgrade,

More information

PMI PMBOK & ESTIMATING PMI PMBOK & ESTIMATING PMI PMBOK & ESTIMATING PMI PMBOK & ESTIMATING PMI PMBOK & ESTIMATING PMI PMBOK & ESTIMATING

PMI PMBOK & ESTIMATING PMI PMBOK & ESTIMATING PMI PMBOK & ESTIMATING PMI PMBOK & ESTIMATING PMI PMBOK & ESTIMATING PMI PMBOK & ESTIMATING PMI PMBOK & ESTIMATING PMI PMBOK & ESTIMATING PMI PMBOK & ESTIMATING PMI PMBOK & ESTIMATING PMI PMBOK & ESTIMATING PMI PMBOK & ESTIMATING 03-23-05 Christine Green, PMI PMBOK and Estimating EDS, Delivery

More information

Software Engineering. Objectives. Designing, building and maintaining large software systems

Software Engineering. Objectives. Designing, building and maintaining large software systems Software Engineering Objectives Designing, building and maintaining large software systems To define software engineering and explain its importance To discuss the concepts of software products and software

More information

Software Engineering/Courses Description Introduction to Software Engineering Credit Hours: 3 Prerequisite: 0306211(Computer Programming 2).

Software Engineering/Courses Description Introduction to Software Engineering Credit Hours: 3 Prerequisite: 0306211(Computer Programming 2). 0305203 0305280 0305301 0305302 Software Engineering/Courses Description Introduction to Software Engineering Prerequisite: 0306211(Computer Programming 2). This course introduces students to the problems

More information

And the Models Are 16-03-2015. System/Software Development Life Cycle. Why Life Cycle Approach for Software?

And the Models Are 16-03-2015. System/Software Development Life Cycle. Why Life Cycle Approach for Software? System/Software Development Life Cycle Anurag Srivastava Associate Professor ABV-IIITM, Gwalior Why Life Cycle Approach for Software? Life cycle is a sequence of events or patterns that are displayed in

More information

Introduction to Software Paradigms & Procedural Programming Paradigm

Introduction to Software Paradigms & Procedural Programming Paradigm Introduction & Procedural Programming Sample Courseware Introduction to Software Paradigms & Procedural Programming Paradigm This Lesson introduces main terminology to be used in the whole course. Thus,

More information

Factors Influencing Software Development Productivity - State of the Art and Industrial Experiences

Factors Influencing Software Development Productivity - State of the Art and Industrial Experiences Factors Influencing Software Development Productivity - State of the Art and Industrial Experiences Adam Trendowicz, Jürgen Münch Fraunhofer Institute for Experimental Software Engineering Fraunhofer-Platz

More information

WBS, Estimation and Scheduling. Adapted from slides by John Musser

WBS, Estimation and Scheduling. Adapted from slides by John Musser WBS, Estimation and Scheduling Adapted from slides by John Musser 1 Today Work Breakdown Structures (WBS) Estimation Network Fundamentals PERT & CPM Techniques Gantt Charts 2 Estimation Predictions are

More information

PROJECT COST MANAGEMENT

PROJECT COST MANAGEMENT 7 PROJECT COST MANAGEMENT Project Cost Management includes the processes required to ensure that the project is completed within the approved budget. Figure 7 1 provides an overview of the following major

More information

Software Engineering Question Bank

Software Engineering Question Bank Software Engineering Question Bank 1) What is Software Development Life Cycle? (SDLC) System Development Life Cycle (SDLC) is the overall process of developing information systems through a multi-step

More information

Effect of Schedule Compression on Project Effort

Effect of Schedule Compression on Project Effort Effect of Schedule Compression on Project Effort Ye Yang, Zhihao Chen, Ricardo Valerdi, Barry Boehm Center for Software Engineering, University of Southern California (USC-CSE) Los Angeles, CA 90089-078,

More information

A Fool with a Tool: Improving Software Cost and Schedule Estimation

A Fool with a Tool: Improving Software Cost and Schedule Estimation 2006 International Software Measurement and Analysis Conference A Fool with a Tool: Improving Software Cost and Schedule Estimation Ian Brown, CFPS Booz Allen Hamilton A fool with a tool is still a fool.

More information

Establishing Great Software Development Process(es) for Your Organization. By Dale Mayes [email protected]

Establishing Great Software Development Process(es) for Your Organization. By Dale Mayes DMayes@HomePortEngineering.com Establishing Great Software Development Process(es) for Your Organization By Dale Mayes [email protected] Class: ETP-410 Embedded Systems Conference San Francisco 2005 Abstract: There are

More information

Software Process for QA

Software Process for QA Software Process for QA Basic approaches & alternatives CIS 610, W98 / M Young 1/7/98 1 This introduction and overview is intended to provide some basic background on software process (sometimes called

More information

Applying CMMI SM In Information Technology Organizations SEPG 2003

Applying CMMI SM In Information Technology Organizations SEPG 2003 Applying CMMI SM In Information Technology Organizations Mark Servello, Vice President Jim Gibson, Senior Consultant ChangeBridge, Incorporated Page 1 Portions Copyright 2002 Carnegie Mellon University

More information

CS 389 Software Engineering. Lecture 2 Chapter 2 Software Processes. Adapted from: Chap 1. Sommerville 9 th ed. Chap 1. Pressman 6 th ed.

CS 389 Software Engineering. Lecture 2 Chapter 2 Software Processes. Adapted from: Chap 1. Sommerville 9 th ed. Chap 1. Pressman 6 th ed. CS 389 Software Engineering Lecture 2 Chapter 2 Software Processes Adapted from: Chap 1. Sommerville 9 th ed. Chap 1. Pressman 6 th ed. Topics covered Software process models Process activities Coping

More information

Software Life Cycle Processes

Software Life Cycle Processes Software Life Cycle Processes Objective: Establish a work plan to coordinate effectively a set of tasks. Improves software quality. Allows us to manage projects more easily. Status of projects is more

More information

Your Software Quality is Our Business. INDEPENDENT VERIFICATION AND VALIDATION (IV&V) WHITE PAPER Prepared by Adnet, Inc.

Your Software Quality is Our Business. INDEPENDENT VERIFICATION AND VALIDATION (IV&V) WHITE PAPER Prepared by Adnet, Inc. INDEPENDENT VERIFICATION AND VALIDATION (IV&V) WHITE PAPER Prepared by Adnet, Inc. February 2013 1 Executive Summary Adnet is pleased to provide this white paper, describing our approach to performing

More information

A Characterization Taxonomy for Integrated Management of Modeling and Simulation Tools

A Characterization Taxonomy for Integrated Management of Modeling and Simulation Tools A Characterization Taxonomy for Integrated Management of Modeling and Simulation Tools Bobby Hartway AEgis Technologies Group 631 Discovery Drive Huntsville, AL 35806 256-922-0802 [email protected]

More information

Software Engineering Introduction & Background. Complaints. General Problems. Department of Computer Science Kent State University

Software Engineering Introduction & Background. Complaints. General Problems. Department of Computer Science Kent State University Software Engineering Introduction & Background Department of Computer Science Kent State University Complaints Software production is often done by amateurs Software development is done by tinkering or

More information

Iterative Project Management 1

Iterative Project Management 1 Iterative Project Management Module 2 Objectives Understand issues for Project Managers (PM) who use iterative development by: Learning how the PM monitors and steers an iterative project towards success.

More information

University of Southern California COCOMO Reference Manual

University of Southern California COCOMO Reference Manual USC COCOMOII Reference Manual University of Southern California COCOMO Reference Manual 1 This manual is compatible with USC-COCOMOII.1999 version 0. Copyright Notice This document is copyrighted, and

More information

ICS 121 Lecture Notes Spring Quarter 96

ICS 121 Lecture Notes Spring Quarter 96 Software Management Cost Estimation Managing People Management Poor managment is the downfall of many software projects Ð Delivered software was late, unreliable, cost several times the original estimates

More information

10 Keys to Successful Software Projects: An Executive Guide

10 Keys to Successful Software Projects: An Executive Guide 10 Keys to Successful Software Projects: An Executive Guide 2000-2006 Construx Software Builders, Inc. All Rights Reserved. www.construx.com Background State of the Art vs. State of the Practice The gap

More information