CPM-500 Principles of Technical Management Lesson F: Project Schedule Risk Analysis Presented by David T. Hulett, Ph.D. Hulett & Associates, LLC Info@projectrisk.com www.projectrisk.com IPMC 2002 Fall Conference Professional Education Program 1
Defining Risk Project risk is an uncertain event or condition that, if it occurs, has a positive or negative effect on a project objective.* *Guide to the Project Management Body of Knowledge (PMBOK Guide), 2000 Project Management Institute 2
Opportunities and Threats Most Likely Estimate Optimistic Estimate Opportunities Threats 3
Why Manage Risk? Ignoring the risk does not make it go away Our objective is to turn vague uncertainty into identified, quantified risk 4
Iron Triangle of Project Objectives The three main project objectives (a.k.a. Triple Constraint of project management) Technical Cost Schedule Environmental compliance, Safety, Being a good corporate citizen 5
Which Objective Defines the Project? The objectives are interdependent When one is unmovable, the others need to be flexible Any pressure on one will be transmitted to the others Cost Time Technical 6
Performance Objective Traditionally Comes First Projects are usually defined by technical, performance, quality or reliability specifications The obvious question for a project management professional: How much will such a project cost? How long will it take? Cost Time Technical 7
But, Schedule May be the Main Goal If the time factor is crucial The project may be de-scoped => problem with customer Resources may be added => cost more to finish on time Cost Time Technical 8
Or, Cost May Be the Limiting Factor Limits on cost will impact both schedule and performance Performance may be relaxed or reduced to limit cost May use less-capable labor, less overtime => takes longer Cost Time Technical 9
The Objectives are Interrelated Pressures from one objective will impinge on others Pressures also impact on quality and business goals Cost Time Technical 10
Establish Priority of Project Objectives: Example Matrix of objectives (Cost, Technical Performance, Schedule) against priority levels (Must Have, Nice to Have, Accept Result) 11
Measuring Objectives Technical vs. Cost and Time Cost is denominated in dollars, time in days Technical objectives No common unit of measurement Weight Speed, range, capacity, climbing rate Software function points, lines of code Reliability Mean time between failure Quality measures, costs 12
Using Technical Objectives Measurements Technical objectives are not comparable A dollar is a dollar, a day is a day In Reliability Analysis (Fault Tree) each failure mode has a probability of failure; there is one measure of success Otherwise: How do you trade off climbing rate with carrying capacity or reliability? 13
Risk Management Processes Risk Management Planning Risk Identification Qualitative Risk Analysis Quantitative Risk Analysis Risk Response Planning Risk Monitoring and Control Source: Guide to the Project Management Body of Knowledge (PMBOK Guide), 2000 Project Management Institute 14
Risk Management Plan Plan your approach to risk management on THIS PROJECT Who will manage the risk management process? Determine approach: quantitative, qualitative, narrative How frequently will the risk analysis cycle be done? Budget for the risk management activities Reference: Project Risk Analysis and Management (PRAM) Guide, Association for Project Management, 1997 15
Risk Management Processes Risk Management Planning Risk Identification Qualitative Risk Analysis Quantitative Risk Analysis Risk Response Planning Risk Monitoring and Control Source: Guide to the Project Management Body of Knowledge (PMBOK Guide), 2000 Project Management Institute 16
Risk Breakdown Structure 17
Technical Risk Examples Major state-of-the-art advance may be needed Requirements are not stable or are complex and may be difficult to meet Operating environment is difficult, and may pose problems Logistical challenges in getting equipment to site -- may be more difficult than anticipated 18
Technical Risk (continued) Integration requires interfaces between project sections that may be complex Reliability / maintainability standards may be challenging The design is incomplete or our approach may not work Concurrency (overlapping of phases) may result in confusion, missteps and rework 19
Technical Risk (continued) Test and Evaluation program may not be capable of assessing performance Modeling and Simulation may not be adequate to support the program through all phases Production capabilities may not be adequate for the demanding configuration 20
Technical Risk (continued) The developer may not be capable to design and manufacture the system Budget resources may be insufficient Customer and Contractor management teams may not be sufficient or adequate Time available may be insufficient Source: Risk Management Guide for DoD Acquisition, Defense Acquisition University, January 2000 21
SEI Categories for Software Development Risk Examine risks in several areas Technical aspects of engineering software products Environment within which the development takes place Constraints to successful software development Software Engineering Institute at Carnegie Mellon University www.sei.cmu.edu 22
Technical Risk - Top Ten Checklist of Software Risks - Barry Boehm Barry Boehm has written several articles that are included in his edited volume: Software Risk Management, IEEE Computer Society Press, 1989 (out of print). This list is from his presentation to the Southern California Risk Management Symposium, Sept. 2002 23
The Top Ten Software Risk Items Barry Boehm (risk item: risk management techniques)
1. Personnel Shortfalls: Staffing with top talent; key personnel agreements; incentives; team-building; training; tailoring process to skill mix; peer reviews
2. Unrealistic schedules and budgets: Business case analysis; design to cost; incremental development; software reuse; requirements descoping; adding more budget and schedule
3. COTS; external components: Qualification testing; benchmarking; prototyping; reference checking; compatibility analysis; vendor analysis; evolution support analysis
4. Requirements mismatch; gold plating: Stakeholder win-win negotiation; business case analysis; mission analysis; ops-concept formulation; user surveys; prototyping; early users manual; design/develop to cost
5. User interface mismatch: Prototyping; scenarios; user characterization (functionality, style, workload) 24
The Top Ten Software Risk Items Barry Boehm (continued)
6. Architecture, performance, quality: Architecture tradeoff analysis and review boards; simulation; benchmarking; modeling; prototyping; instrumentation; tuning
7. Requirements changes: High change threshold; information hiding; incremental development (defer changes to later increments)
8. Legacy software: Design recovery; phaseout options analysis; wrappers/mediators; restructuring
9. Externally-performed tasks: Reference checking; pre-award audits; award-fee contracts; competitive design or prototyping; team-building
10. Straining Computer Science capabilities: Technical analysis; cost-benefit analysis; prototyping; reference checking 25
Common Colds of the Software World: Capers Jones Creeping user requirements Excessive schedule pressure Poor quality Inaccurate estimation of costs Inaccurate metrics and measurement Management malpractice Silver bullet syndrome Source: Capers Jones, Assessment and Control of Software Risks, Prentice Hall, Yourdon Press, 1994 26
Checklist Example Project Risk Checklist (Project Name, Project Manager; for each risk: Description of Uncertainty, Evaluation of Risk (present, importance), Resolution (Action, responsible))
Risk Type Technology, Risk Area Design: Complexity; State-of-the-Art; Integration
Risk Type Organizational, Risk Area Objectives: Unclear; Changing; Risk Area Resources: Compete w/ other proj.; Inexperienced
Risk Type Customer, Risk Area Expectations: Unrealistic; Risk Area Interface: Vague, changing; Not timely; Risk Area Funding: Intermittent
Risk Type Regulatory, Risk Area Permit Required: Uncertain requirements; Uncertain timing 27
Risk Identification Tools: Brainstorming Have the right people in the room Expose the people to the project Have them prepare for the session Get them off-site to concentrate Use the checklist developed on other projects Lessons Learned files Synergy among the participants 28
Sources of Data on Technical Risk Comparison with similar systems Relevant lessons learned Experience Results from tests and prototype development Data from engineering and other models Specialist and expert judgment Analysis of plans and related documents Modeling and simulation Source: Risk Management Guide for DoD Acquisition, Defense Acquisition University, January 2000 29
Risk Management Processes Risk Management Planning Risk Identification Qualitative Risk Analysis Quantitative Risk Analysis Risk Response Planning Risk Monitoring and Control Source: Guide to the Project Management Body of Knowledge (PMBOK Guide), 2000 Project Management Institute 30
Willoughby Templates In the early 1980s, W. J. Willoughby, Jr. was chairman of the Defense Science Board DSB developed a way to look at risk in Defense Acquisition programs using a simple template approach What is the problem? How can it be addressed? When in the project life cycle should risk mitigation take place? Represents Lessons Learned 31
Defense Science Board's Findings and Recommendations Most new weapon systems are less than satisfactory; they require burdensome maintenance and logistics and additional equipment in order to meet the needs.. programs cannot succeed for technical reasons A poorly designed product cannot be tested efficiently, produced or deployed. Manufacturing problems will overwhelm production schedules and costs 32
Defense Science Board's Findings and Recommendations (continued) Corrective measures by the DoD have focused on establishing a series of management checkpoints and review activities; this approach has been responsible for adding numerous layers of management and has tended to compartmentalize, matrixize and polarize the major areas of the acquisition process: design, test and production; they do not describe the industrial process 33
Defense Science Board's Findings and Recommendations (continued) The probable cause (of the problems facing acquisition programs) is inadequate engineering and manufacturing disciplines combined with improperly defined and implemented logistics programs. Identify and establish critical engineering processes and their control methods Source: Transition from Development to Production, Solving the Risk Equation, DoD 4246.7-M, January 1984 34
Overriding Attributes of the Defense Science Board's Recommendations Assurance of design maturity Assessment of contractor's design policy Measurement of test stability Near absence of failures in development testing of a stable design Certification of manufacturing processes Design for production and proof of process 35
Willoughby Templates Top Level of the Willoughby Templates 36
Willoughby Templates Detail 37
Willoughby Templates Detail (continued) 38
Willoughby Templates Detail (continued) 39
Example Template: Design Requirements Area of Risk Accurate and complete specification of the design reference mission profile is required Sometimes the profile does not correspond to the ultimate service use Often the profile is left to the contractor's discretion 40
Example Template: Design Requirements (continued) Outline for Reducing Risk Functional mission profile shows all functions on a time scale, prepared by the government customer Environmental mission profile shows the surroundings affecting the system Contractor prepares profiles based on the government's, and these become the design requirements Timeline During concept phase (JMSNS phase) 41
Qualitative Ranking of Risks Group the risks into categories for appropriate action This may be enough to manage risk effectively Risk Ranking Stop Light Condition Risk Management Action High Risk RED Resolve or mitigate in baseline plan Moderate Risk YELLOW Resolve or develop a contingency plan Low Risk GREEN Leave resolution to project team 42
Maxwell Risk Driver Assessment Matrix Maxwell Risk Driver Assessment Framework (each driver rated Very Low / Low / Medium / High / Very High)
1. Required Technical Advancement: Nothing New / Minor Modifications Only / State of the Art / Major Modifications / Beyond State of the Art
2. Technology Status: Currently in Use / Prototype Exists / Under Development / In Design / Concept Stage
3. Complexity: Simple / Somewhat Complex / Moderately Complex / Highly Complex / Highly Complex with Uncertainties
4. Interaction / Other Risk Dependencies: Independent of Additional Risk Drivers / Dependent on One Additional Risk Driver / Dependent on Two Additional Risk Drivers / Dependent on Three Additional Risk Drivers / Dependent on More than Three Additional Risk Drivers
5. Process Controls: Statistical Process Controls (SPC) / Documented Controls (No SPC) / Limited Controls / Inadequate Controls / No Known Controls
6. Manufacturing Precision: High / Adequate / Limited Margins / Known but Inadequate / Unknown 43
Maxwell Risk Driver Assessment Matrix (continued) (each driver rated Very Low to Very High)
Reliability: Historically High / Average / Known Limited Problems / Serious Problems of Unknown Scope
Producibility: Established / Demonstrated / Feasible / Known Difficulties / Infeasible
Criticality to Mission: Nonessential / Minimum Impact / Known Alternatives Available / Possible Alternatives Exist / "Show Stopper"
Cost: Established / Known History or Close Analogies / Predicted by Calibrated Model / Out of Range of Experience / Unknown or Unsupported Estimate
Schedule: Demonstrated / Historical Similarity / Validated Analysis / Inadequate Analysis / Unknown or Unsupported Estimate 44
Probability and Impact Define Risk Risk is defined by two dimensions of a possible event Probability that the event will occur Impact on the project objectives (cost, time, performance) if it does occur Discussions about risk rely on these two dimensions The two attributes of probability and impact must be considered separately Impact is independent of how likely the event is 45
Qualitative Assessment of Probability from Technology Maturity
Scientific research ongoing: Very High
Concept design formulated for performance and qualifications: High
Concept design tested for performance and qualification concerns at bench scale: Moderate
Critical functions / characteristics demonstrated at pilot scale: Moderate
Full-scale prototype hardware passed qualification tests with ACWA feedstocks: Low
More than one full scale facility operational and deployed: Very Low 46
Probability of Risk from Process Complexity
Five or more processes/technologies including complicated interfaces or complex operations: Very High
Two to four processes/technologies with complicated interfaces and complex operations: High
Two processes/technologies with either complicated interfaces or complex operations: Moderate
Two processes/technologies in a single processing train, and standard interfaces with routine operations: Low
One process/technology and standard interfaces with routine operations: Very Low
Complicated interfaces: multiphase streams, extreme conditions beyond industrial norms
Complex operations: multiple interactive and interrelated activities beyond industrial norms 47
Probability of Risk from Process Difficulty
No comparable process is operating on an industrial scale, and more than one of C, T, TP, or PC is expected to exceed state of the art: Very High
No comparable process is operating on an industrial scale, and at least one of the requirements for C, T, TP, or PC is expected to exceed state of the art: High
Integrated process is a combination of standard industrial processes, and C, T, TP, or PC exceed the norm for these processes: Moderate
Integrated process is a combination of standard industrial processes, and C, T, PC or TP are within the norm for these processes: Low
Standard industrial process meets C, T, TP, and PC requirements: Very Low
C = Conversion efficiency, T = Tolerance or precision, TP = Throughput, PC = Process controls 48
Qualitative Assessment of Impact on Performance
System requirement not achieved, safety and environmental objectives jeopardized: Very High
System requirement not achieved, safety and environmental objectives satisfied: High
Degradation of system performance eliminates all margins: Moderate
Degradation of subsystem performance, decrease in system performance (still above requirement): Moderate
Potential degradation of subsystem performance, but system level not affected: Low
No effect on subsystem or system performance (includes producibility and support): Very Low 49
Qualitative Assessment of P & I Emphasis on Impact Qualitative Risk Analysis: Probability - Impact Approach to Project Risk Analysis
(rows: probability; columns: impact on project objective, Very Low / Low / Moderate / High / Very High)
Very high: Mod / Mod / High / High / High
High: Low / Mod / Mod / High / High
Moderate: Low / Mod / Mod / High / High
Low: Low / Low / Mod / Mod / High
Very low: Low / Low / Low / Mod / Mod 50
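Read as a lookup table, a probability-impact matrix like the one above can be sketched in a few lines; the cell values below are an assumed reading of the slide, and the helper name `rate` is illustrative.

```python
# Impact levels, in column order (assumed from the slide layout).
LEVELS = ["Very Low", "Low", "Moderate", "High", "Very High"]

# Hypothetical encoding of the matrix: each row is a probability level,
# each cell the qualitative risk rating for that probability/impact pair.
PI_MATRIX = {
    "Very High": ["Mod", "Mod", "High", "High", "High"],
    "High":      ["Low", "Mod", "Mod", "High", "High"],
    "Moderate":  ["Low", "Mod", "Mod", "High", "High"],
    "Low":       ["Low", "Low", "Mod", "Mod", "High"],
    "Very Low":  ["Low", "Low", "Low", "Mod", "Mod"],
}

def rate(probability: str, impact: str) -> str:
    """Look up the qualitative rating for a probability/impact pair."""
    return PI_MATRIX[probability][LEVELS.index(impact)]

print(rate("High", "Very High"))    # High
print(rate("Very Low", "Moderate")) # Low
```

Note how the ratings climb faster along the impact axis than the probability axis, which is what "emphasis on impact" means in practice.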
Probability and Impact by the Numbers Assessing probability and impact verbally is too vague, easy to make mistakes On purpose By design Because of inattention Defining the levels by objective criteria helps Applying numbers to risk probability and impact seems to improve concentration, increase discipline 51
Probability by the Numbers: Technology Maturity (natural)
Scientific research ongoing: 0.9
Concept design formulated for performance and qualifications: 0.8
Concept design tested for performance and qualification concerns at bench scale: 0.6
Critical functions / characteristics demonstrated at pilot scale: 0.4
Full-scale prototype hardware passed qualification tests with ACWA feedstocks: 0.3
More than one full scale facility operational and deployed: 0.1 52
Impact by the Numbers: Performance Objective (more problematic) Many people feel comfortable assigning numbers to the impacts Numbers imply cardinality, meaning that the relation between the numbers means something An impact rated .6 is 3 times as bad as one rated .2 DoD recommends avoiding cardinal numbers in favor of ordinal rankings (one is worse than the lower one) E.g. Impacts rated A / B / C / D / E 53
Compare Impact Neutral and Impact Aversion Example of Different Impact Scales Reflecting Organizational Preferences
(Risk Impact Level: Linear, Impact Neutral / Non-Linear, Impact Averse)
Very High: 0.9 / 3.2
High: 0.7 / 1.6
Moderate: 0.5 / 0.8
Low: 0.3 / 0.4
Very Low: 0.1 / 0.2 54
Impact on Performance by the Numbers
System requirement not achieved, safety and environmental objectives jeopardized: 3.2
System requirement not achieved, safety and environmental objectives satisfied: 1.6
Degradation of system performance eliminates all margins: 0.8
Degradation of subsystem performance, decrease in system performance (still above requirement): 0.4
Potential degradation of subsystem performance, but system level not affected: 0.2
No effect on subsystem or system performance (includes producibility and support): 0.1 55
Computing the Technical Risk Score The Risk Score can be computed by multiplying probability times impact Prob. of .4 and impact of .7 yields a score of .28 The organization can determine the cutoff for each level of Risk Score Risk Ranking Stop Light Condition Cut-Off for Risk Score High Risk RED .30 < X Moderate Risk YELLOW .15 < X < .30 Low Risk GREEN X < .15 56
Computing the Risk Score Probability and Impact Risk Scores Risk = P x I
(rows: probability; columns: impact on a ratio scale of 0.2 / 0.4 / 0.8 / 1.6 / 3.2)
0.9: 0.18 0.36 0.72 1.44 2.88
0.7: 0.14 0.28 0.56 1.12 2.24
0.5: 0.10 0.20 0.40 0.80 1.60
0.3: 0.06 0.12 0.24 0.48 0.96
0.1: 0.02 0.04 0.08 0.16 0.32 57
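The Risk Score arithmetic and the stop-light cut-offs from the preceding slides can be sketched directly; the function names are illustrative.

```python
# Minimal sketch of Risk Score = Probability x Impact with the slide's
# stop-light cut-offs; function names are illustrative.
def risk_score(probability: float, impact: float) -> float:
    """Risk Score = Probability x Impact (rounded for display)."""
    return round(probability * impact, 2)

def stoplight(score: float) -> str:
    """Cut-offs from the slide: RED above .30, YELLOW .15-.30, GREEN below .15."""
    if score > 0.30:
        return "RED"
    if score > 0.15:
        return "YELLOW"
    return "GREEN"

score = risk_score(0.4, 0.7)
print(score, stoplight(score))  # 0.28 YELLOW
```

The same two functions reproduce any cell of the P x I score matrix, e.g. `risk_score(0.9, 3.2)` gives 2.88, well into RED.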
DoD Generally Recommends Against Using Cardinal Values for Impact This (use of numbers to calculate risk such as P x I) may be suitable if both likelihood and consequences have been quantified using compatible cardinal scales or calibrated ordinal scales (e.g. using Analytic Hierarchy Process). In such a case mathematical manipulation of values may be meaningful Source: Risk Management Guide for DoD Acquisition, DAU, January 2000 58
DoD Generally Recommends Against Using Cardinal Values for Impact (continued) In many cases, however, risk scales are actually just raw (uncalibrated) ordinal scales, reflecting only relative standing between scale levels and not actual numerical differences. Any mathematical operations performed on results from uncalibrated ordinal scales can provide information that will at best be misleading, if not completely meaningless 59
Risk Management Processes Risk Management Planning Risk Identification Qualitative Risk Analysis Quantitative Risk Analysis Risk Response Planning Risk Monitoring and Control Source: Guide to the Project Management Body of Knowledge, 2000 Project Management Institute 60
Decision Tree Analysis
Making Decisions with Uncertainty Decision analysis helps structure the problem Disciplined approach Decisions to be made Events that can happen, likelihood and impact Costs and benefits of making some decisions, having some events happen 62
Completed Simple Tree with Decisions, Events, Costs, Rewards and Probabilities
Plant Decision:
Greenfield (cost 150): Market Demand High 35%, payoff 500 (net 350); Low 65%, payoff 300 (net 150)
Retrofit (cost 35): Market Demand High 35%, payoff 400 (net 365); Low 65%, payoff 200 (net 165)
None (cost 0): Market Demand High 35%, payoff 300 (net 300); Low 65%, payoff 150 (net 150) 63
Folding Back To Solve the Tree Value of $235 moves from right to left
Greenfield: EV = 35% x 350 + 65% x 150 = 220
Retrofit: EV = 35% x 365 + 65% x 165 = 235 (the best choice, and the value of the decision)
None: EV = 35% x 300 + 65% x 150 = 202.5 64
Changing Costs may Change Results and Value With the Retrofit cost raised from 35 to 65 (net payoffs 335 and 135), its EV falls to 205; Greenfield (EV 220) becomes the best choice, and None stays at 202.5 65
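The fold-back calculation above can be sketched in a few lines of code, using the costs, probabilities and payoffs from the slides; the helper names are illustrative.

```python
# A minimal sketch of "folding back" the plant-decision tree; payoffs,
# probabilities and costs are from the slides, helper names are illustrative.

def expected_value(cost, outcomes):
    """Net expected value of an option: prob-weighted payoffs minus cost."""
    return sum(p * payoff for p, payoff in outcomes) - cost

options = {
    # option: (up-front cost, [(probability, gross payoff), ...])
    "Greenfield": (150, [(0.35, 500), (0.65, 300)]),
    "Retrofit":   (35,  [(0.35, 400), (0.65, 200)]),
    "None":       (0,   [(0.35, 300), (0.65, 150)]),
}

evs = {name: round(expected_value(cost, outs), 1)
       for name, (cost, outs) in options.items()}
best = max(evs, key=evs.get)
print(evs)   # {'Greenfield': 220.0, 'Retrofit': 235.0, 'None': 202.5}
print(best)  # Retrofit
```

Raising the Retrofit cost to 65, as on the last slide, drops its EV to 205 and makes Greenfield the best choice, so the sketch also serves as a quick sensitivity check.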
System Reliability: System Failure Analysis
Purposes of a System Failure Analysis Analyze the possible ways a facility, product or system might fail Understand how to build a more fault-tolerant facility Discover what happened if it failed (e.g. plane crash) Determine design that is efficient use of funds to keep the facility running Evaluate different competing designs from the failure perspective 67
Simple Failure Analysis Model What makes the room go dark in the evening? We have two lamps, desk and floor Specify the objective To have a conversation, we need at least one light, and failure means that both lights go out (BOTH .. AND) To read by, we need both lights, and failure means one light goes out (EITHER .. OR) 68
Quantitative Analysis of the Two-Element Fault Tree Suppose that the failure rates are as follows: Floor lamp fails 4% of the time over a month Desk lamp fails 5% of the time over a month What is the likelihood that the room will be completely dark in the evening, over the month? An And Gate requires: Both the Floor Lamp and the Desk Lamp must Fail These are redundant systems, failure is unlikely 69
AND Gate -- Only 0.2% Likelihood of a Completely Dark Room Fault Tree Analysis (likelihood of occurrence and joint likelihood)
Floor On, Desk On: 96.0% x 95.0% = 91.2%
Floor On, Desk Off: 96.0% x 5.0% = 4.8%
Floor Off, Desk On: 4.0% x 95.0% = 3.8%
Floor Off, Desk Off: 4.0% x 5.0% = 0.2%
Total Likelihood: 100.0% 70
Complete Darkness? Simple AND Gate in Software The likelihood of failure is shown on the solved fault tree (And Gate result) 71
Suppose Bright Light and Both Lamps are Necessary? This is an OR GATE The condition of Bright Light fails if: Either the Floor Lamp or the Desk Lamp Fails This is a more common occurrence These are not redundant systems any more 72
OR Gate 8.8% Likely That At Least One Lamp Fails Likelihood of failure increases to 8.8%: the sum of the three states with at least one lamp off (4.8% + 3.8% + 0.2%) 73
Fault Tree for Room Relatively Dark, with Probabilities -- OR Gate The likelihood that the either / or failure occurs is 8.8% (Or Gate result on the solved fault tree) 74
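The AND-gate and OR-gate results above follow from two one-line probability formulas for independent failures:

```python
# Sketch of the two-lamp fault tree: AND gate (both must fail for complete
# darkness) vs OR gate (either failing is enough to lose bright light),
# using the failure rates from the slides.
p_floor = 0.04  # floor lamp fails 4% of the time over a month
p_desk = 0.05   # desk lamp fails 5% of the time over a month

# AND gate: completely dark only if both independent lamps fail.
p_and = p_floor * p_desk

# OR gate: at least one lamp fails (complement of "both survive").
p_or = 1 - (1 - p_floor) * (1 - p_desk)

print(f"AND gate: {p_and:.1%}")  # AND gate: 0.2%
print(f"OR gate:  {p_or:.1%}")   # OR gate:  8.8%
```

The OR-gate complement form is equivalent to summing the three joint states with at least one lamp off (4.8% + 3.8% + 0.2%), which is how the slide's table arrives at 8.8%.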
Risk Management Processes Risk Management Planning Risk Identification Qualitative Risk Analysis Quantitative Risk Analysis Risk Response Planning Risk Monitoring and Control Source: Guide to the Project Management Body of Knowledge 2000 Project Management Institute 75
Project Life Cycle, Risk and Risk Management Chart: over the total project life cycle (Concept, Development, Implementation, Termination; Plan then Execute), opportunity and risk decrease over time while the amount at stake increases; the effect of risk management is greatest early 76
Evaluate a Risk Response (Handling) Option Can it be feasibly implemented? What is its expected effectiveness? Is it affordable? Is time available to develop the option? What is its impact on the schedule? What effect does it have on the system's technical performance? Source: Risk Management Guide for Acquisition, DAU 2000 77
Risk Avoidance Eliminating the risk event Relaxing the project objective Severing the link to the project objective Any risk for which the Triple Constraint can be relaxed 78
Risk Mitigation (Handling) Design Immaturity Risk Fall back to a less demanding design Extend the time of the design phase to get it right Rapid prototyping of the user interface Design Uncertainty Risk Parallel development This is an expensive risk response strategy 79
Risk Mitigation (Handling) Process Immaturity Risk Test components Test the integrated system at a scale factor Semi-works tests at a larger scale May need to test some components at full-scale 80
Some other Handling Options Trade Studies Arrive at a balance of engineering requirements Incremental Development Design for later upgrade Technology Maturation Plan to replace current technology with preferred technology on Unit 2 Robust Design Use advanced design and manufacturing techniques to promote quality, may be costly 81
Some other Handling Options Design of Experiments Identify critical design factors and focus on those Open Systems Carefully selected commercial specifications and standards Use Standard Items / Software reuse Modeling / simulation of the system Manufacturing screening to identify deficient manufacturing processes Source: Risk Management Guide for Acquisition, DAU 2000 82
Software Risks: Capers Jones Creeping User Requirements New and unanticipated user requirements added after the project is initiated and estimated User requirements creep at about 1% per month Mitigation Prototyping can reduce the severity and volume of this 83
Software Risks: Capers Jones Poor Quality Result of technical and cultural issues Not use effective defect prevention, removal therapies Corporate culture not committed to quality Mitigation Test planning Requirements analysis up front Formal inspection Beta testing with many users Formal defect prevention 84
Software Risks: Capers Jones Inaccurate Metrics It was shown as early as 1978 that lines of code cannot safely be used to size projects, especially when multiple languages exist. Impact is poor measurement of productivity, cost and schedule estimation and project planning Mitigation Jones promotes the use of function points as accurate metrics 85
Software Risks: Capers Jones Management Malpractice Managers not trained for their jobs, not rewarded for good project management skills Culture lacks awareness of good practices No training or good curriculum in schools Mitigation Understand what is needed Allocate training resources, acquire courses Elevate project management status in company Establish best in class culture of professionalism 86
Checklist of Software Risk Items: Barry Boehm Developing the wrong software functions Risk management techniques Organization analysis Mission analysis Ops-concept formulation User surveys Prototyping Early users manuals 87
Checklist of Software Risk Items: Barry Boehm Developing the wrong user interface Risk management techniques Prototyping Scenarios Task analysis User characterization (functionality, style, workload) 88
Checklist of Software Risk Items: Barry Boehm Gold plating Risk management techniques Requirements scrubbing Prototyping Cost-benefit analysis Design to Cost 89
Risk Transfer Key for large projects Contract provisions include conditions for customer or supplier paying Type of contract may deflect from customer to contractor or vice versa Fixed price contracts may limit customer s cost exposure It may increase cost if there are engineering change orders 90
Risk Deflection Between Customer and Contractor / Supplier by Contract Chart: relative risk (0-100) of contractor and customer across contract types from Cost Plus to Fixed Price; customer risk falls and contractor risk rises as the contract moves toward fixed price 91
Contingency Planning Plan B is the contingency if the baseline plan does not work out Plan B will be developed, then kept for emergency Identify the events that might trigger the need for Plan B Identify the value or characteristics of the trigger events that would initiate Plan B Monitor those trigger points in Risk Monitoring and Control 92
Risk Acceptance: Set a Dollar Contingency Reserve The best way is to: Quantify risks and perform a Monte Carlo simulation Agree on the organization s risk tolerance and pick the point on the cumulative distribution A short cut that may be helpful -- use the averages with Method of Moments 93
Risk Acceptance: Take the Right Risks Accepting the Right Risks and Setting Contingency Amounts
Risk Accepted: Subcontractor late; Likelihood (a) 40%; Impact (b) 250; Expected Value (c = a x b) 100; Mitigation Strategy: Use Old Subcontractor; Mitigation Cost (d) 250; Expected Savings (d - c) 150
Expected value of the risk is less than mitigation costs Take the right risks 94
But Establish a Contingency Reserve
Subcontractor late: Likelihood 40% x Impact 250 = Expected Value 100
Part not pass the test: 30% x 350 = 105
Design assumption faulty: 20% x 650 = 130
Total: 335
Set the right contingency reserve Set at the expected value if have enough risks 95
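The accept-or-mitigate decision and the contingency-reserve total from the two slides above can be sketched together; the `None` entries stand in for risks whose mitigation was not priced on the slides.

```python
# Sketch of accepting the right risks and sizing the contingency reserve,
# using the figures from the slides.

risks = [
    # (name, likelihood, impact, cost of the mitigation alternative)
    ("Subcontractor late",       0.40, 250, 250),   # mitigation: Use Old Subcontractor
    ("Part not pass the test",   0.30, 350, None),  # no mitigation priced
    ("Design assumption faulty", 0.20, 650, None),
]

reserve = 0.0
for name, p, impact, mitigation_cost in risks:
    ev = p * impact
    # Mitigate only when the mitigation costs less than the expected loss;
    # otherwise accept the risk and fund it from the reserve.
    if mitigation_cost is not None and mitigation_cost < ev:
        print(f"{name}: mitigate (cost {mitigation_cost} < EV {ev:.0f})")
    else:
        print(f"{name}: accept, reserve {ev:.0f}")
        reserve += ev

print(f"Total contingency reserve: {reserve:.0f}")  # Total contingency reserve: 335
```

Here the subcontractor risk is accepted because mitigating (250) costs more than its expected value (100), so all three expected values flow into the 335 reserve, matching the slide.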
Content of the Risk Register Date risk last revised Name Description Responsibility for monitoring, mitigating, reporting Risk before further risk mitigation described in the Risk Register 96
Content of the Risk Register (continued) Risk mitigation options chosen Timing of the action -- now, later, contingent Fallback plans if action fails Amount of risk expected to remain after risk management 97
Risk Management Processes Risk Management Planning Risk Identification Qualitative Risk Analysis Quantitative Risk Analysis Risk Response Planning Risk Monitoring and Control Source: Guide to the Project Management Body of Knowledge, 2000 Project Management Institute 98
Risk Monitoring and Control Risk Analysis is a continuing requirement Cycle through identification, assessment, quantification and response planning Keep track of the identified risks (watch list) Identify residual risks Assure the execution of risk plans or if new plans should be developed 99
Risk Monitoring and Control (continued) Test and Evaluation (T&E) monitoring the performance of selected risk handling options and developing new risk assessments Test-Analyze-and-Fix (TAAF) use a period of dedicated testing to identify and correct deficiencies Demonstration Events points in the program where technology and risk abatement are demonstrated Program metrics measure how well the system is achieving its objectives, monitor corrective action Source: Risk Management Guide for Acquisition, DAU 2000 100
Earned Value Analysis Earned Value Management System (EVMS) Helps forecast the budget and schedule at completion Augments quantitative risk analysis Based on data from the project itself, in an early stage Earned Value has exhibited a reliable ability to predict results at completion early in the project (e.g. at 20% complete) 101
Picture of a Troubled Project Chart: Earned Value Management System, dollars vs. project months; Earned Value runs below the Plan (behind schedule) and below Actual cost (over cost) 102
Earned Value Forecast of a Troubled Project Chart: Plan, Earned Value, Actual, Projected EV and Projected Actual in dollars vs. project months, showing Time Now and the projected status at planned completion 103
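The slides show the troubled project's curves but not the underlying arithmetic; a minimal sketch using the standard earned value formulas (CPI = EV/AC, SPI = EV/PV, EAC = BAC/CPI), with illustrative sample figures not taken from the charts:

```python
# Standard earned value indices and forecast; the input numbers below are
# illustrative, chosen to match the "troubled project" pattern in the charts.

def evm_forecast(pv, ev, ac, bac):
    """Return cost/schedule indices and the estimate at completion."""
    cpi = ev / ac   # cost performance index (< 1 means over cost)
    spi = ev / pv   # schedule performance index (< 1 means behind schedule)
    eac = bac / cpi # estimate at completion, assuming current CPI persists
    return cpi, spi, eac

# Behind schedule (EV < PV) and over cost (AC > EV), as in the charts.
cpi, spi, eac = evm_forecast(pv=50, ev=40, ac=60, bac=100)
print(f"CPI={cpi:.2f} SPI={spi:.2f} EAC={eac:.0f}")  # CPI=0.67 SPI=0.80 EAC=150
```

This is the simplest EAC formula; it embodies the empirical finding on the previous slide that early performance (e.g. at 20% complete) reliably predicts results at completion.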
Technical Performance Measurement Technical Performance Measurement (TPM) Identify key technical goals or targets to be met through the project Set the targets in the schedule, usually at key milestones Measure technical achievements Compare measured achievements to the technical baseline Variances between actual achievements and the baseline give early warning of technical risk to completing the project satisfactorily 104
Technical Performance Measurement (continued) Targets should have meaning for technical risks Technical targets all support the successful completion of the project These targets should have predictive value If they are not met, you are in trouble Identifying and scheduling the technical performance is key 105
Technical Performance Measurement (continued) Variances from the technical baseline early in the project are hard to make up later Variances are accompanied with schedule and budget implications Some projects go on with only partial completion of technical requirements at milestones These projects may be in trouble Catching up may be difficult and / or costly and / or take more time 106
Technical Performance Starts Well, Falls Behind Chart: Technical Performance Measurement, technical metric vs. project months; Technical Performance To-Date tracks the Technical Plan at first, then falls below it, opening a technical shortfall 107
Forecasting Technical Risk with TPM Chart: Technical Performance Metric of a Troubled Project; Technical Plan, Technical Performance To-Date and Technical To-Go vs. project months, showing Time Now and the status at revised completion 108
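A hedged sketch of the TPM forecasting idea: compare achieved metric values to the plan and project the recent trend forward; the data points are illustrative, not read from the chart.

```python
# Illustrative TPM forecast: the achieved metric starts on plan, falls
# behind, and its recent trend is projected to planned completion.
plan =     {5: 30, 10: 55, 15: 80, 20: 100}  # month -> planned metric value
achieved = {5: 30, 10: 45, 15: 55}           # starts well, falls behind

months = sorted(achieved)
# Simple trend from the last two measurements (a deliberately naive model).
m1, m2 = months[-2], months[-1]
trend = (achieved[m2] - achieved[m1]) / (m2 - m1)
forecast_at_20 = achieved[m2] + trend * (20 - m2)
shortfall = plan[20] - forecast_at_20
print(forecast_at_20, shortfall)  # 65.0 35.0
```

The projected shortfall at planned completion is the early warning the slides describe: catching up a technical variance later tends to be difficult, costly, or slow.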
Project Closeout Data gathering Lessons Learned database Help future projects Make it possible for future project managers to access the data easily, learn the lessons clearly 109