A Manager's Checklist for Validating Software Cost and Schedule Estimates
Special Report CMU/SEI-95-SR-004
January 1995

A Manager's Checklist for Validating Software Cost and Schedule Estimates

Robert E. Park

Approved for public release. Distribution unlimited.

Carnegie Mellon University
Pittsburgh, Pennsylvania 15213
This technical report was prepared for the

SEI Joint Program Office
HQ ESC/ENS
5 Eglin Street
Hanscom AFB, MA

The ideas and findings in this report should not be construed as an official DoD position. It is published in the interest of scientific and technical information exchange.

Review and Approval

This report has been reviewed and is approved for publication.

FOR THE COMMANDER

SIGNATURE ON FILE
Thomas R. Miller, Lt Col, USAF
SEI Joint Program Office

This work is sponsored by the U.S. Department of Defense.

Copyright 1995 by Carnegie Mellon University

This document is available through Research Access, Inc., 800 Vinial Street, Pittsburgh, PA.

Copies of this document are available through the National Technical Information Service (NTIS). For information on ordering, please contact NTIS directly: National Technical Information Service, U.S. Department of Commerce, Springfield, VA.

This document is also available through the Defense Technical Information Center (DTIC). DTIC provides access to and transfer of scientific and technical information for DoD personnel, DoD contractors and potential contractors, and other U.S. Government agency personnel and their contractors. To obtain a copy, please contact DTIC directly: Defense Technical Information Center, Attn: FDRA, Cameron Station, Alexandria, VA.
3 Use of any trademarks in this report is not intended in any way to infringe on the rights of the trademark holder.
Table of Contents

Acknowledgments iii
1 Introduction 1
2 The Validation Checklist 3
References 5
Checklist for Validating Software Cost and Schedule Estimates Checklist-1
Acknowledgments

The following people have contributed to this work, either as reviewers of earlier versions or as part of the team that helped generate ideas and material for the checklist.

Edward Averill
Dinah Beres, US Navy (NAWC, China Lake)
Reg Burd, Computer Data Systems, Inc.
Anita Carleton
John P. Chihorek, Loral Aeronutronic
Lyle Cocking, GDE Systems Inc.
Cenap Dada, US Army (CECOM RD&E Center)
Joe Dean, Tecolote Research, Inc.
Leonard Flens, AF Pentagon Communications Agency
Judy Galorath, Galorath Associates, Inc.
Gary Gaston, Defense Logistics Agency (DPRO Lockheed, Fort Worth)
Wolfhart Goethert
Jim Hart
Will Hayes
Joseph F. Lipari, AF Pentagon Communications Agency
Reed Little
Colonel Russell Logan, AF Pentagon Communications Agency
Joan Lovelace, The MITRE Corporation
John MacDonald, E-Systems
Jordan B. Matejceck, US Department of Commerce (NOAA)
Barbara Meyers, USAF (Joint STARS System Program Office)
Thomas E. Moulds, J. F. Taylor, Inc.
Paul Oliver, Booz-Allen & Hamilton
Bill Peterson
Joseph L. Podolsky, Hewlett Packard
Michael Rissman
Jim Rozum
Dick Stutzke, SAIC (Huntsville)
Ed Tilford, Sr., Fissure
Bob Verge, SAIC (Orlando)
Todd Webb, Autodesk
Roy Williams, Loral Federal Systems
A Manager's Checklist for Validating Software Cost and Schedule Estimates

Abstract. This report provides a checklist of questions to ask and evidence to look for when assessing the credibility of a software cost and schedule estimate. The checklist can be used either to review individual estimates or to motivate and guide organizations toward improving their software estimating processes and practices.

1 Introduction

When you or your boss or customer receive a cost or schedule estimate for a software project, what do you look for to determine your willingness to rely on that estimate? This report provides a checklist to help you address and evaluate a number of issues that are key to estimates we can trust. The checklist guides you through the seven questions in this table:

Seven Questions to Ask When Assessing Your Willingness to Rely On a Cost and Schedule Estimate

1. Are the objectives of the estimate clear and correct?
2. Has the task been appropriately sized?
3. Are the estimated cost and schedule consistent with demonstrated accomplishments on other projects?
4. Have the factors that affect the estimate been identified and explained?
5. Have steps been taken to ensure the integrity of the estimating process?
6. Is the organization's historical evidence capable of supporting a reliable estimate?
7. Has the situation changed since the estimate was prepared?

Each question is illustrated with elements of evidence that, if present, support the credibility of the estimate. The answers you receive should help you judge the extent to which you can safely use the estimate for planning, tracking, or decision making. The checklist can also be used to motivate and guide organizations toward improving their software estimating processes and practices.
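Because the checklist is a fixed set of questions and evidence items, it can be captured directly as data, which makes reviews repeatable across estimates. The sketch below is not part of the report; it is a minimal illustration (all names are hypothetical) of recording which evidence items were found for each question and tallying coverage.

```python
# A minimal sketch (not from the SEI report) of recording checklist
# results: each question holds the evidence items reviewed and whether
# each was found, so evidence coverage can be tallied per question.

from dataclasses import dataclass, field

@dataclass
class Question:
    text: str
    evidence: dict[str, bool] = field(default_factory=dict)  # item -> found?

    def coverage(self) -> float:
        """Fraction of evidence items found (0.0 if none reviewed)."""
        return sum(self.evidence.values()) / len(self.evidence) if self.evidence else 0.0

checklist = [
    Question("Are the objectives of the estimate clear and correct?"),
    Question("Has the task been appropriately sized?"),
    # ... the remaining five questions from the table above ...
]

checklist[0].evidence = {
    "Objectives stated in writing": True,
    "Life cycle clearly defined": False,
}

for q in checklist:
    print(f"{q.coverage():4.0%}  {q.text}")
```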
2 The Validation Checklist

We present the checklist for validating software cost and schedule estimates on the pages that follow. You may make, use, or distribute as many copies as you wish, so long as you reproduce the entire checklist, including the copyright notice.

The purpose of the checklist is not to impose criteria, but to arm you with questions to ask when deciding whether or not to rely on a particular estimate. Only you can determine which questions to ask and whether or not the answers you get provide sufficient evidence to satisfy your requirements for using the estimate.

Although we prepared this checklist to help you evaluate estimates for software costs and schedules, almost everything in it applies equally to hardware and systems engineering projects. If you have responsibilities for developing hardware or integrated systems, you may find that altering the word "software" wherever it appears will make the checklist applicable to estimates for these systems as well.

The checklist in this report was produced as part of the SEI's Software Cost Estimating Improvement Initiative [Park 94]. Please let us know if you find it helpful, or if you have suggestions for improving its usefulness for your organization. For related checklists that can help you evaluate the processes used for making software estimates, please see the SEI report Checklists and Criteria for Evaluating the Cost and Schedule Estimating Capabilities of Software Organizations [Park 95].
References

[Goethert 92] Goethert, Wolfhart B., et al. Software Effort Measurement: A Framework for Counting Staff-Hours (CMU/SEI-92-TR-21, ADA258279). Pittsburgh, Pa.: Carnegie Mellon University, September 1992.

[Park 92] Park, Robert E., et al. Software Size Measurement: A Framework for Counting Source Statements (CMU/SEI-92-TR-20, ADA258304). Pittsburgh, Pa.: Carnegie Mellon University, September 1992.

[Park 94] Park, Robert E.; Goethert, Wolfhart B.; & Webb, J. Todd. Software Cost and Schedule Estimating: A Process Improvement Initiative (CMU/SEI-94-SR-03). Pittsburgh, Pa.: Carnegie Mellon University, May 1994.

[Park 95] Park, Robert E. Checklists and Criteria for Evaluating the Cost and Schedule Estimating Capabilities of Software Organizations (CMU/SEI-95-SR-05). Pittsburgh, Pa.: Carnegie Mellon University, January 1995.
Checklist for Validating Software Cost and Schedule Estimates

This checklist is designed to help managers assess the credibility of software cost and schedule estimates. It identifies seven issues to address and questions to ask when determining your willingness to accept and use a software estimate. Each question is associated with elements of evidence that, if present, support the credibility of the estimate.

Copyright 1995 by Carnegie Mellon University. Special report CMU/SEI-95-SR-004. This work has been sponsored by the U.S. Department of Defense.

Issue 1. Are the objectives of the estimate clear and correct?

Evidence of Credibility
- The objectives of the estimate are stated in writing.
- The life cycle to which the estimate applies is clearly defined.
- The tasks and activities included in (and excluded from) the estimate are clearly identified.
- The tasks and activities included in the estimate are consistent with the objectives of the estimate.

Issue 2. Has the task been appropriately sized?

Evidence of Credibility
- A structured process has been used to estimate and describe the size of the software product.
- A structured process has been used to estimate and describe the extent of reuse.
- The processes for estimating size and reuse are documented.
- The descriptions of size and reuse identify what is included in (and excluded from) the size and reuse measures used.
- The measures of reuse distinguish between code that will be modified and code that will be integrated as-is into the system.
- The definitions, measures, and rules used to describe size and reuse are consistent with the requirements (and calibrations) of the models used to estimate cost and schedule.
- The size estimate was checked by relating it to measured sizes of other software products or components.
- The size estimating process was checked by testing its predictive capabilities against measured sizes of completed products.
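Issue 2 asks that reuse measures distinguish modified code from code integrated as-is. One widely used way to fold both into a single size number is an adaptation adjustment factor such as COCOMO's; the sketch below assumes that scheme purely for illustration, since the checklist itself does not prescribe any particular formula.

```python
# A hedged sketch of one common reuse-adjustment scheme (the COCOMO
# adaptation adjustment factor). The checklist only asks that modified
# and as-is code be distinguished and that the rules match the cost
# model's calibration; this specific formula is an assumption.

def equivalent_size(new_sloc: int, adapted_sloc: int,
                    pct_design_mod: float, pct_code_mod: float,
                    pct_integration: float) -> float:
    """Equivalent new size: new code plus adapted code scaled by
    AAF = 0.4*DM + 0.3*CM + 0.3*IM, where DM, CM, IM are fractions
    (0..1) of design modified, code modified, and integration effort
    required."""
    aaf = 0.4 * pct_design_mod + 0.3 * pct_code_mod + 0.3 * pct_integration
    return new_sloc + adapted_sloc * aaf

# Example: 20 KSLOC new, 50 KSLOC adapted with modest modification.
print(equivalent_size(20_000, 50_000, 0.10, 0.20, 0.30))  # -> 29500.0
```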
Issue 3. Are the estimated cost and schedule consistent with demonstrated accomplishments on other projects?

Evidence of Credibility
- The organization has a structured process for relating estimates to actual costs and schedules of completed work. The process is documented. The process was followed.
- The cost and schedule models that were used have been calibrated to relevant historical data. (Models of some sort are needed to provide consistent rules for extrapolating from previous experience.)
- The cost and schedule models quantify demonstrated organizational performance in ways that normalize for differences among software products and projects, so that a simple, unnormalized lines-of-code per staff-month extrapolation is not the basis for the estimate.
- The consistency achieved when fitting the cost and schedule models to historical data has been measured and reported.
- The values used for cost and schedule model parameters appear valid when compared to values that fit the models well to past projects.
- The calibration of cost and schedule models was done with the same versions of the models that were used to prepare the estimate.
- The methods used to account for reuse recognize that reuse is not free. (The estimate accounts for activities such as interface design, modification, integration, testing, and documentation that are associated with effective reuse.)
- Extrapolations from past projects account for differences in application technology. (For example, data from projects that implemented traditional mainframe applications require adjustments if used as a basis for estimating client-server implementations. Some cost models provide capabilities for this; others do not.)
- Extrapolations from past projects account for observed, long-term trends in software technology improvement. (Although some cost models attempt this internally, the best methods are usually based on extrapolating measured trends in calibrated organizational performance.)
- Extrapolations from past projects account for the effects of introducing new software technology or processes. (Introducing a new technology or process can initially reduce an organization's productivity.)
- Work-flow schematics have been used to evaluate how this project is similar to (and how it differs from) projects used to characterize the organization's past performance.
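The evidence for Issue 3 hinges on calibration: fitting a model to completed projects and measuring how well it fits. Below is a minimal sketch, assuming a simple power-law effort model fitted in log-log space with mean magnitude of relative error (MMRE) as the reported consistency measure. Real cost models have many more parameters, and the historical data here are invented for illustration.

```python
# A minimal calibration sketch: fit Effort = a * Size**b to an
# organization's completed projects and report MMRE as the
# "consistency achieved when fitting" the model. The model form and
# the statistic are assumptions, not requirements of the checklist.

import math

def calibrate(sizes_ksloc, efforts_pm):
    """Least-squares fit of log(E) = log(a) + b*log(S)."""
    n = len(sizes_ksloc)
    xs = [math.log(s) for s in sizes_ksloc]
    ys = [math.log(e) for e in efforts_pm]
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = math.exp(my - b * mx)
    return a, b

def mmre(a, b, sizes_ksloc, efforts_pm):
    """Mean magnitude of relative error of the calibrated model."""
    return sum(abs(e - a * s ** b) / e
               for s, e in zip(sizes_ksloc, efforts_pm)) / len(efforts_pm)

sizes   = [12.0, 34.0, 50.0, 8.0, 90.0]      # KSLOC, completed projects
efforts = [35.0, 110.0, 180.0, 20.0, 360.0]  # person-months
a, b = calibrate(sizes, efforts)
print(f"Effort = {a:.2f} * Size^{b:.2f}, MMRE = {mmre(a, b, sizes, efforts):.0%}")
```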
Issue 4. Have the factors that affect the estimate been identified and explained?

Evidence of Credibility
- A written summary of parameter values and their rationales accompanies the estimate.
- Assumptions have been identified and explained.
- A structured process such as a template or format has been used to ensure that key factors have not been overlooked.
- Uncertainties in parameter values have been identified and quantified.
- A risk analysis has been performed, and risks that affect cost or schedule have been identified and documented. (Elements addressed include issues such as probability of occurrence, effects on parameter values, cost impacts, schedule impacts, and interactions with other organizations.)
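One concrete way to show that uncertainties in parameter values have been "identified and quantified" is to carry ranges rather than point values through the estimate and report the resulting cost distribution. The sketch below assumes triangular distributions and an invented labor rate purely for illustration.

```python
# A hedged Monte Carlo sketch: sample size and productivity from
# assumed triangular ranges and report cost percentiles. All ranges
# and rates here are illustrative, not figures from the report.

import random

random.seed(1)  # reproducible illustration

def sample_cost():
    size = random.triangular(40_000, 80_000, 55_000)  # SLOC: low, high, likely
    productivity = random.triangular(120, 250, 180)   # SLOC per staff-month
    cost_per_sm = 15_000                              # dollars per staff-month
    return size / productivity * cost_per_sm

costs = sorted(sample_cost() for _ in range(10_000))
for pct in (10, 50, 90):
    print(f"P{pct}: ${costs[int(pct / 100 * len(costs))]:,.0f}")
```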
Issue 5. Have steps been taken to ensure the integrity of the estimating process?

Evidence of Credibility
- Management reviewed and agreed to the values for all descriptive parameters before costs were estimated.
- Adjustments to parameter values to meet a desired cost or schedule have been documented.
- If a dictated schedule has been imposed, the estimate is accompanied by an estimate of
  - the normal schedule, and
  - the additional expenditures required to meet the dictated schedule.
- Adjustments to parameter values to meet a desired cost or schedule are accompanied by management action that makes the values realistic.
- More than one cost model or estimating approach has been used, and the differences in results have been analyzed and explained.
- People from related but different projects or disciplines were involved in preparing the estimate.
- At least one member of the estimating team is an experienced estimator, trained in the cost models that were used.
- Estimators independent of the performing organization concur with the reasonableness of the parameter values and estimating methodology.
- The groups that will be doing the work accept the estimate as an achievable target.
- Memorandums of agreement have been completed and signed with the other organizations whose contributions affect cost or schedule.
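For the dictated-schedule item above, a credible package pairs the normal schedule with the extra cost of meeting the compressed one. The toy comparison below assumes a cube-root schedule relationship and an inverse-proportional effort penalty; the coefficients are placeholders, and in practice both numbers should come from the organization's calibrated cost model.

```python
# A minimal sketch comparing a dictated schedule against the normal
# schedule. The cube-root law T = c * Effort**(1/3) and the simple
# compression penalty are assumptions for illustration only.

def normal_schedule_months(effort_pm: float, c: float = 2.5) -> float:
    return c * effort_pm ** (1 / 3)

def compressed_effort(effort_pm: float, dictated_months: float) -> float:
    """Inflate effort when the dictated schedule is shorter than
    normal (inverse-proportional penalty; real models differ)."""
    t_normal = normal_schedule_months(effort_pm)
    if dictated_months >= t_normal:
        return effort_pm
    return effort_pm * (t_normal / dictated_months)

nominal = 200.0  # estimated person-months
print(f"normal schedule: {normal_schedule_months(nominal):.1f} months")
print(f"effort if 12 months dictated: {compressed_effort(nominal, 12.0):.0f} PM")
```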
Issue 6. Is the organization's historical evidence capable of supporting a reliable estimate?

Evidence of Credibility
- The estimating organization has a method for organizing and retaining information on completed projects (a historical database).
- The database contains a useful set of completed projects.
- Elements included in (and excluded from) the effort, cost, schedule, size, and reuse measures in the database are clearly identified. (See, for example, the SEI checklists for defining effort, schedule, and size measures [Goethert 92, Park 92].)
- Schedule milestones (start and finish dates) are described in terms of criteria for initiation or completion, so that work accomplished between milestones is clearly bounded.
- Records for completed projects indicate whether or not unpaid overtime was used. Unpaid overtime, if used, has been quantified, so that recorded data provide a valid basis for estimating future effort.
- Cost models that were used for estimating have been used also to provide consistent frameworks for recording historical data. (This helps ensure that comparable terms and parameters are used across all projects, and that recorded data are suitable for use in the estimating models.)
- The data in the historical database have been examined to identify inconsistencies, and anomalies have been corrected or explained. (This is best done with the same cost models that are used for estimating.)
- The organization has a structured process for capturing effort and cost data from ongoing projects.
- The producing organization holds postmortems at the completion of its projects
  - to ensure that recorded data are valid, and
  - to ensure that events that affected costs or schedules get recorded and described while they are still fresh in people's minds.
- Information on completed projects includes
  - the life-cycle model used, together with the portion covered by the recorded cost and schedule
  - actual (measured) size, cost, and schedule
  - the actual staffing profile
  - an estimate at completion, together with the values for cost model parameters that map the estimate to the actual cost and schedule
  - a work breakdown structure or alternative description of the tasks included in the recorded cost
  - a work-flow schematic that illustrates the software process used
  - nonlabor costs
  - management costs
  - a summary or list of significant deliverables (software and documentation) produced by the project
  - a summary of any unusual issues that affected cost or schedule
- Evolution in the organization's work-flow schematics shows steady improvement in the understanding and measurement of its software processes.
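The "information on completed projects" items listed under Issue 6 map naturally onto a record schema for the historical database. Below is a hedged sketch of one such record; the field names and example values are illustrative assumptions, not an SEI-prescribed format.

```python
# A hedged sketch of a historical-database record mirroring the
# Issue 6 items. Field names and example values are hypothetical.

from dataclasses import dataclass, field

@dataclass
class CompletedProject:
    name: str
    life_cycle_model: str             # plus portion covered by the data
    actual_size_sloc: int             # measured, counting rules noted
    actual_effort_pm: float           # person-months, overtime handling noted
    actual_schedule_months: float
    unpaid_overtime_pm: float = 0.0   # quantified so data stay comparable
    estimate_at_completion_pm: float = 0.0
    staffing_profile: list[float] = field(default_factory=list)   # heads/month
    cost_model_params: dict[str, float] = field(default_factory=dict)
    nonlabor_cost: float = 0.0
    management_cost: float = 0.0
    deliverables: list[str] = field(default_factory=list)
    unusual_issues: list[str] = field(default_factory=list)

p = CompletedProject("Flight Data Recorder CSCI", "waterfall (SRR to FQT)",
                     62_000, 180.0, 16.0, unpaid_overtime_pm=12.0)
print(p.name, p.actual_size_sloc)
```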
Issue 7. Has the situation changed since the estimate was prepared?

Evidence of Credibility
- The estimate has not been invalidated by recent events, changing requirements, or management action (or inaction).
- The estimate is being used as the basis for assigning resources, deploying schedules, and making commitments.
- The estimate is the current baseline for project tracking and oversight.
