A Cure for Unanticipated Cost and Schedule Growth
Thomas J. Coonce and Glen B. Alleman
We have lots of data. Let's use it to create more credible estimates and help tame the growth beast.
Why Are We Here?
In spite of the estimating community's efforts to provide credible estimates, government programs still seem to deliver less than promised, cost more than planned, and take longer than needed. There are many reasons, some well established, some hypothesized. When estimates are consistently biased low:
- Decisions of choice are distorted
- Cost growth causes more growth as programs are stretched out to fund portfolios with fixed budgets
- Taxpayers become more cynical and negative about government
- The estimating community's credibility is seriously questioned
Why Are We Here? (Concluded)
This presentation will:
- Summarize many of the documented and hypothesized reasons why programs deliver less, cost more, and are late
- Provide a broad brush of what the community has done to improve the imbalance
- Assert that we cannot solve all the root causes, but we can effectively use historical experience (reference class forecasting) to provide more credible estimates for future systems
- Propose and discuss a number of changes needed in the estimating, acquisition, and contracting communities to restore balance and credibility and go a long way toward taming the growth beast
Cost and Schedule Growth
In 1982, an unnamed witness at a House Armed Services Committee hearing stated, "Enough material has been written on the subject of cost growth during the last ten years to fill a Minuteman silo." Unfortunately, cost growth is still with us. In the decade since that testimony, enough additional information on cost growth has been written to fill a second Minuteman silo.[1]

Cost/Budget Growth[2]

Study                  Average  Median  Percent of Projects Which Experienced Growth
NASA in the 90s        36%      26%     78%
NASA in the 70s        43%      26%     75%
NASA in the 80s (GAO)  83%      60%     89%
DoD RDT&E              45%      27%     76%

1. Cost Growth in DoD Major Programs: A Historical Perspective, Col. Harry Calcutt, April 1993
2. Hamaker and Schaffer, NASA, 2004
Cost & Schedule Growth Summary at NASA: Combined 30-Mission Average Growth Over and Above Reserves[3]
Development cost growth: 42% from Phase B start, 29% from PDR, 21% from CDR
Phase B/C/D schedule growth: 29% from Phase B start, 23% from PDR, 19% from CDR
3. Internal NASA Study, 2009
Cost and Schedule Growth (Continued)
Many researchers have tried to understand the root causes of growth. Here is a list from one study:[4]
Requirements related
- Poor initial requirements definition
- Poor performance/cost trade-offs during development
- Changes in quantity requirements
Estimating related
- Errors due to limitations in estimating procedures
- Failure to understand and account for technical risks
- Poor inflation estimates
- Top-down pressure to reduce estimates
- Lack of valid independent cost estimates
4. Calcutt, April 1993
Cost and Schedule Growth (Continued)
Root causes from Col. Calcutt's study (continued):
Program management related
- Lack of program management expertise
- Mismanagement/human error
- Over-optimism
- Schedule concurrency
- Program stretch-outs to keep production lines open
Contracting related
- Lack of competition
- Contractor buy-in
- Use of the wrong type of contract
- Inconsistent contract management/administration procedures
- Too much contractor oversight
Waste
- Excess profits
- Contractors overstaffed
- Unreasonable contractor indirect costs
- Taking too long to resolve undefinitized contracts
Cost and Schedule Growth (Continued)
Root causes from Col. Calcutt's study (concluded):
Budget related
- Funding instabilities caused by trying to fund too many programs
- Funding instabilities caused by congressional decisions
- Inefficient production rates due to stretching out programs
- Failure to fund management reserves
- Failure to fund programs at most likely cost
Cost and Schedule Growth (Concluded)
Root causes cited by the Office of Performance Assessment and Root Cause Analyses (PARCA):[5]
Inception related
- Unrealistic performance expectations
- Unrealistic baseline estimates for cost or schedule
- Immature technologies or excessive manufacturing or integration risk
Execution related
- Unanticipated design, engineering, manufacturing, or technology integration issues
- Changes in procurement quantities
- Inadequate program funding or funding instability
- Poor performance by government or contractor personnel
5. Report to Congress on Performance Assessment and Root Cause Analyses, Office of the Under Secretary of Defense for Acquisition, Technology and Logistics, March 2014, p. 7
A Broad Brush of What the Estimating Community Has Done to Tame the Growth Beast
- Instituted independent estimating organizations at various levels within DoD and civilian agencies
- Developed cost estimates using analogous historical data (reference class forecasting)
- Required a Cost Analysis Requirements Description (CARD) to ensure cost estimates are based on the agreed requirements
- Developed a variety of professional training and certification programs, e.g., Certified Cost Estimator/Analyst (CCEA), Certified Parametric Practitioner (CPP), AACE certifications, and PMI
- Augmented independent estimating teams with program management and scheduling personnel, e.g., NASA and DoE
- Continued to collect historical cost and technical data to improve parametric cost estimates
- Began to develop estimates using planned top-level schedules and historical head counts (recognition that time and people are big cost drivers), another form of reference class forecasting
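Reference class forecasting reduces to a small calculation: collect growth factors (actual versus estimate) from analogous completed programs, then apply the uplift at a chosen confidence level to the new program's point estimate. A minimal sketch, in which the growth factors and the point estimate are purely hypothetical:

```python
import numpy as np

# Hypothetical cost-growth factors (actual cost / initial estimate) from an
# analogous reference class of completed programs.
growth = np.array([1.05, 1.12, 1.26, 1.78, 1.33, 0.98, 1.60, 1.21, 1.45, 1.10])

point_estimate = 200.0  # $M, bottom-up estimate for the new program

# Reference class forecasting: budget at the uplift for the desired confidence
# level rather than trusting the (typically optimistic) point estimate.
uplift_70 = np.percentile(growth, 70)
budget_70 = point_estimate * uplift_70
print(f"70%-confidence budget: ${budget_70:.0f}M (uplift {uplift_70:.2f}x)")
```

The same mechanic applies to schedule growth factors; only the reference class data changes.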
A Broad Brush of What the Estimating Community Has Done (Concluded)
- Begun to set cost and schedule targets based on the historical variability of costs and schedules
- The Weapon Systems Acquisition Reform Act (WSARA) of 2009 required DoD programs to be budgeted at the 80% cost confidence level[6]
- NASA requires programs to be budgeted with a 70% probability of meeting both cost and schedule targets
6. According to the FY 2011 Annual Report on Cost Assessment Activities by the Director, Cost Assessment and Program Evaluation (CAPE), the WSARA requirement for confidence levels was eliminated in the National Defense Authorization Act for Fiscal Year 2011 (Public Law ). Today, the requirement is to select a confidence level that provides a high degree of confidence that the program can be completed without the need for significant adjustment to program budgets.
So What?
The estimating community is in the best position to understand, document, and communicate the myriad reasons for cost and schedule growth. We are the masters at collecting the data and evidence! But it is not our role to make the changes; we can only advise.
We can, however, improve our estimates by using our historical data more effectively, and we can persuade government leadership to require contractors to do the same.
A Few Observations
Our estimates are typically formed around a product-oriented structure, and we have great historical databases upon which to develop credible estimates. We typically estimate individual WBS elements by developing Cost Estimating Relationships (CERs) of the form Cost = a + b·x^c, where the input variable x is a cost driver such as weight.
[Figure: combined cost-modeling uncertainty and technical risk: historical data points scattered around the CER curve, with standard-percent-error bounds, plotted against the cost driver (weight), yielding the WBS cost estimate.]
A Few Observations (Continued)
But we have difficulty persuading government leadership to increase their estimates to reflect the historical variances, because they can't relate those estimates to their implementation plans, which look like this:
Uncertainty in the PM's Plan Must Be Driven by Historical Data
[Figure: a chain of schedule segments, each carrying time-independent (TI $) and time-dependent (TD $) costs, with duration uncertainty on each task, burn-rate uncertainty on each segment, and discrete risks (with probability of occurrence) before project end. TD $ = segment duration × burn rate.]
TI = Time-Independent Cost: does not change as the schedule slips. Example: materials.
TD = Time-Dependent Cost: increases as the schedule slips. Example: level of effort; the marching army.
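The TD-cost relationship above can be propagated through historical uncertainty with a simple Monte Carlo simulation. A sketch for a single segment, using hypothetical triangular distributions to stand in for historical duration and burn-rate variability:

```python
import numpy as np

rng = np.random.default_rng(7)
N = 20_000  # Monte Carlo trials

# Hypothetical segment: both duration and burn rate are uncertain.
# Triangular(min, mode, max) is a common stand-in when only bounds are known.
duration  = rng.triangular(10.0, 12.0, 18.0, N)   # months
burn_rate = rng.triangular(0.8, 1.0, 1.3, N)      # $M per month
ti_cost   = 5.0                                   # $M, time-independent (materials)

# TD $ = segment duration x burn rate; total segment cost = TD $ + TI $.
total = duration * burn_rate + ti_cost

print(f"Mean cost: ${total.mean():.1f}M, "
      f"70th percentile: ${np.percentile(total, 70):.1f}M")
```

Summing such segment distributions across the plan (respecting schedule logic) is what produces the IMS-based cost distribution.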
A Few Observations (Concluded)
[Figure: a CER-based probability density function and cumulative distribution function, built from historical cost data by WBS, set equal to an Integrated Master Schedule-based PDF and CDF, built from historical activity durations and associated costs.]
Rationale for Budgeting at the 70 Percent Joint Confidence Level
- Schedule drives a large component of cost
- We want programs to deliver on or before the promised date and at or below the budgeted cost; this is the problem we are trying to solve
- We should have a better than 50/50 chance of meeting planned targets (don't we owe this to the taxpayers?)
- There is no general consensus within the estimating community about the right joint confidence level upon which to set budgets
- Little empirical data (too soon to tell if NASA's experiment is working); more research is needed
- Until then, pick a reasonably high number and see if it works
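A joint confidence level is a statement about cost and schedule together, not separately. A minimal sketch with hypothetical, correlated cost and duration draws, illustrating why budgeting at the marginal 70th percentiles delivers less than 70% joint confidence:

```python
import numpy as np

rng = np.random.default_rng(11)
N = 50_000

# Hypothetical correlated draws of program cost ($M) and duration (months).
# Schedule drives cost, so the two are positively correlated (rho = 0.7 here).
mean = [100.0, 48.0]
cov  = [[15.0**2, 0.7 * 15.0 * 6.0],
        [0.7 * 15.0 * 6.0, 6.0**2]]
cost, months = rng.multivariate_normal(mean, cov, N).T

def jcl(c_target, t_target):
    """Joint confidence: fraction of trials at or under BOTH targets."""
    return np.mean((cost <= c_target) & (months <= t_target))

# Targets set at the marginal 70th percentiles give < 70% joint confidence;
# they must be raised until the joint probability itself reaches 0.70.
c70, t70 = np.percentile(cost, 70), np.percentile(months, 70)
print(f"Marginal-70% targets -> JCL = {jcl(c70, t70):.0%}")
```

The gap between marginal and joint confidence is why the proposal insists on a joint probability distribution rather than separate cost and schedule S-curves.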
A Proposed Solution, Step 1: Program Office Creates a Request for Information (RFI)
Inputs: program capabilities (requirements) and historical activity durations and associated costs.
1.0 Develop a draft Integrated Master Plan (IMP)
2.0 Develop a summary-level Integrated Master Schedule (IMS)
3.0 Develop a summary-level baseline cost-loaded IMS
4.0 Create a Risk Register (RR)
5.0 Create a joint probability distribution
6.0 Decide on the target cost and completion date
7.0 Prepare the RFI package with the IMS and RR, and issue the RFI to industry
A Proposed Solution, Step 2: Contractors Review and Submit a Revised Plan
- Is the top-level plan logical given the technical challenges and capabilities required? If not, what other activities should be included or dropped, and what changes in logic are required?
- Are activity durations consistent (within family) with your experience?
- Are the costs consistent (within family) with your experience?
- Is the PMO's perspective on risks realistic? If not, which risks are overstated, which are understated, which were missed, and what is your assessment of the probabilities and consequences for those risks?
- Revise and submit the contractor-modified high-level cost-loaded plan, updated Risk Register, and probability assessment
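The Risk Register exchange above comes down to agreeing on probabilities and consequences, which then feed the joint probability distribution. A sketch using a hypothetical three-risk register, simulating which discrete risks fire on each trial:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 20_000

# Hypothetical risk register: (probability of occurrence, cost consequence $M).
risks = [(0.30, 4.0),   # late parts delivery
         (0.15, 9.0),   # test facility unavailable
         (0.50, 2.5)]   # software integration rework

baseline = 80.0  # $M, point estimate without discrete risks

# Each trial: each risk fires independently with its stated probability.
hits = np.column_stack([rng.random(N) < p for p, _ in risks]).astype(float)
impacts = np.array([c for _, c in risks])
total = baseline + hits @ impacts

expected_exposure = sum(p * c for p, c in risks)
print(f"Simulated risk exposure: ${total.mean() - baseline:.2f}M "
      f"(analytic p*c sum: ${expected_exposure:.2f}M)")
```

Changing a disputed probability or consequence and re-running immediately shows its effect on the distribution, which is the substance of the PMO-contractor dialogue.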
A Proposed Solution, Step 3: Program Office Uses the RFI Responses to Improve the RFP
Assess the (multiple) responses to the RFI, then:
9.0 Update the IMP/IMS, the Risk Register, and/or modify the requirements, producing updated program capabilities (requirements), an updated Risk Register (RR), and an updated cost-loaded IMS
10.0 Re-create the joint probability distribution
11.0 Decide on the targeted cost and completion date
12.0 Create the Request for Proposal (RFP)
A Proposed Solution, Step 4: Bidders Prepare Responses to the RFP
Inputs: updated program capabilities (requirements) and the contractor's historical activity durations and associated costs.
13.0 Create the contractor's version of the IMP
14.0 Develop a summary-level IMS
15.0 Develop a detailed cost- or resource-loaded IMS
16.0 Create an updated Risk Register (the contractor's RR)
17.0 Create a joint probability distribution
18.0 Specify the joint confidence level of the targeted cost and completion date
19.0 Complete preparation of the proposal and respond to the RFP
A Proposed Solution, Step 5: Program Office Evaluates and Oversees the Contractors
Government program offices should:
- Award contracts based on the credibility of the historical uncertainty data used and the discrete risks for the joint cost and schedule proposed
- Hold requirements stable after contract award
- Require contractors to submit updated Risk Registers and probability statements associated with Best Case, Worst Case, and Most Likely Estimates at Complete in Format 5 of the Integrated Program Management Reports (IPMRs) every six months (what gets measured gets managed)
A Proposed Solution, Step 6: Contractors Execute to the Plan
Winning contractors should:
- Set Performance Measurement Baselines (PMBs) using more detailed cost- or resource-loaded Integrated Master Schedules (IMSes), with at least a 50% joint probability of meeting cost and schedule targets
- Set up objective measures of progress at IBRs that are directly connected back to user-desired capabilities through appropriate Technical Performance Measures (TPMs)
- Record progress (Budgeted Cost of Work Performed) using the pre-defined set of progress criteria
- Maintain risk registers and use them to provide probability statements of cost and completion dates every six months
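Recording BCWP against pre-defined criteria yields the standard earned-value indices used to support those probability statements. A minimal sketch with hypothetical status-period numbers:

```python
# Hypothetical status at month 6 against the Performance Measurement Baseline.
bcws = 12.0  # $M, Budgeted Cost of Work Scheduled (planned value)
bcwp = 10.5  # $M, Budgeted Cost of Work Performed (earned value)
acwp = 11.8  # $M, Actual Cost of Work Performed
bac  = 40.0  # $M, Budget At Completion

cpi = bcwp / acwp   # cost efficiency: value earned per dollar spent
spi = bcwp / bcws   # schedule efficiency: value earned vs. value planned
eac = bac / cpi     # simple CPI-based Estimate At Complete

print(f"CPI={cpi:.2f}, SPI={spi:.2f}, EAC=${eac:.1f}M")
```

A CPI and SPI below 1.0, as here, signal that the risk register and the Estimate-at-Complete probability statements should both be revisited at the next six-month submission.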
More on Why
- We don't really need to understand why we have cost and schedule growth; we just need to estimate future programs using all the historical experience
- We need to speak the same language the government PM speaks. Independent product-oriented estimates based on historical data are right if the probabilities are set right, but government program managers have difficulty relating these estimates to their plans or to potential bidders' plans
- Activity-based estimates that are grounded in historical data help government PMs revise their plans based on well-communicated reasons for the cost and schedule variations, i.e., what happened to similar programs in the past
- Initial government probabilistic estimates that are based on a program's activity-based plan and recognize the natural variation in the cost and schedule performance of historical projects should tame the growth beast if the joint probabilities are greater than 70 percent
- Estimates that are activity-based aid government PMs and contractors in managing the contracts during execution. The language is the same, and the focus is on their plan (the PMB) and the risks!
Some How Challenges
- The estimating community needs to start collecting common development activity duration data and associated costs; some of this may already be available from the schedule data contained in the Earned Value Management Central Repository (EVM-CR)
- Government program offices have to step up their game:
  - Think through the development of the system capabilities and document them in an IMP
  - Create a notional summary-level cost-loaded activity-based plan
  - Get help on the historical variation of activity durations and associated costs
  - Coordinate with the acquisition community on the RFI and follow-on RFP process
- The Integrated Program Management Report (IPMR) Data Item Description (DID) would need to be updated:
  - Require the Integrated Master Plan (IMP) as part of the RFP submission
  - Add submission of the contractor's Risk Register and instrumented native probability models every six months
- Others?