A Study of Systems Engineering Effectiveness: Building a Business Case for Systems Engineering



Building a Business Case for Systems Engineering

NO WARRANTY. THIS CARNEGIE MELLON UNIVERSITY AND SOFTWARE ENGINEERING INSTITUTE MATERIAL IS FURNISHED ON AN "AS-IS" BASIS. CARNEGIE MELLON UNIVERSITY MAKES NO WARRANTIES OF ANY KIND, EITHER EXPRESSED OR IMPLIED, AS TO ANY MATTER INCLUDING, BUT NOT LIMITED TO, WARRANTY OF FITNESS FOR PURPOSE OR MERCHANTABILITY, EXCLUSIVITY, OR RESULTS OBTAINED FROM USE OF THE MATERIAL. CARNEGIE MELLON UNIVERSITY DOES NOT MAKE ANY WARRANTY OF ANY KIND WITH RESPECT TO FREEDOM FROM PATENT, TRADEMARK, OR COPYRIGHT INFRINGEMENT. Use of any trademarks in this presentation is not intended in any way to infringe on the rights of the trademark holder. This presentation may be reproduced in its entirety, without modification, and freely distributed in written or electronic form without requesting formal permission. Permission is required for any other use. Requests for permission should be directed to the Software Engineering Institute at permission@sei.cmu.edu. This work was created in the performance of Federal Government Contract Number FA8721-05-C-0003 with Carnegie Mellon University for the operation of the Software Engineering Institute, a federally funded research and development center. The Government of the United States has a royalty-free government-purpose license to use, duplicate, or disclose the work, in whole or in part and in any manner, and to have or permit others to do so, for government purposes pursuant to the copyright license under the clause at 252.227-7013.

Context

The value of SE is appreciated by some, disputed by a few, and not understood by many.

Quantitative evidence of the value of SE is sparse:
- Gruhl, Werner: Lessons Learned, Cost/Schedule Assessment Guide. NASA Comptroller's Office, 1992
- Honour, Eric: Understanding the Value of Systems Engineering. 2004

Weaknesses in SE continue to impact program success:
- GAO-09-362T: "managers rely heavily on assumptions about system[s] which are consistently too optimistic. These gaps are largely the result of a lack of a disciplined systems engineering analysis"

SE costs are evident: resources spent, elapsed schedule.
SE benefits are less obvious and less tangible: cost avoidance, improved efficiency, risk avoidance, better products.

Background

In 2006, NDIA embarked on a project to collect quantitative evidence of SE value:
- NDIA formed the SE Effectiveness Committee (SEEC).
- The SEEC conducted the SE Effectiveness Study:
  - Developed a survey to collect information from defense contractors
  - Queried individual projects to assess the SE capabilities applied, resulting project performance, and other factors influencing project performance
  - Received responses from 64 projects
  - Analyzed the data and identified the strength of relationships between SE activities and project performance
  - Published results in 2007 and 2008 (http://www.sei.cmu.edu/reports/08sr034.pdf)
  - Showed valuable relationships between many SE activities and project performance

SE Effectiveness Survey

The survey assessed three areas:
- SE Deployment: assessment of 71 SE artifacts, based on CMMI-SE/SW/IPPD
- Project Performance: compliance with budget, compliance with schedule, satisfaction of requirements
- Other Factors: project challenge, acquirer capability, project environment
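The next slide buckets each project by a composite SE capability score and a composite performance score. A purely illustrative sketch of how such composites could be formed and bucketed is below; the 1-to-4 rating scale, the equal weighting, and the function names are assumptions for illustration, not the study's published scoring method.

```python
# Hypothetical sketch: average per-artifact assessments into a composite
# score, then bucket it with the thresholds shown on the next slide.
# The 1.0-4.0 rating scale and equal weighting are illustrative assumptions.

def composite_score(artifact_ratings):
    """Average a project's artifact ratings (each assumed to be 1.0-4.0)."""
    return sum(artifact_ratings) / len(artifact_ratings)

def capability_bucket(x):
    """Bucket a composite SE capability score per the next slide's bands."""
    if x <= 2.5:
        return "lower"
    elif x < 3.0:
        return "moderate"
    return "higher"

ratings = [3.2, 2.8, 3.5, 2.9]   # illustrative artifact ratings
x = composite_score(ratings)      # 3.1
print(capability_bucket(x))       # "higher"
```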

The Bottom Line

Project Performance vs. Total SE Capability (percentage of projects in each capability group):
- Lower Capability (x ≤ 2.5), N = 13: 39% lower performance, 46% moderate performance, 15% best performance
- Moderate Capability (2.5 < x < 3.0), N = 17: 29% lower performance, 59% moderate performance, 12% best performance
- Higher Capability (x ≥ 3.0), N = 16: 31% lower performance, 13% moderate performance, 56% best performance

Performance bands: Best (x > 3.0), Moderate (2.5 ≤ x ≤ 3.0), Lower (x < 2.5). Gamma = 0.32, p = 0.04.

For the projects that did the least SE, only 15% delivered the best project performance. For the projects that did the most SE, 56% delivered the best project performance.
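The Gamma reported above is the Goodman-Kruskal gamma, a measure of association between two ordinal variables. A minimal sketch of its computation follows; the cell counts are approximations inferred from the percentages and group sizes above, and the function name is illustrative rather than taken from the study.

```python
# Minimal sketch of the Goodman-Kruskal gamma statistic used on this slide.
# The 3x3 counts below are inferred from the slide's percentages and group
# sizes (N = 13, 17, 16), so treat them as approximate.

def goodman_kruskal_gamma(table):
    """Compute gamma = (C - D) / (C + D) for an ordinal contingency table,
    where C and D count concordant and discordant pairs of observations."""
    n_rows, n_cols = len(table), len(table[0])
    concordant = discordant = 0
    for i in range(n_rows):
        for j in range(n_cols):
            for k in range(i + 1, n_rows):   # rows strictly below row i
                for m in range(n_cols):
                    if m > j:
                        concordant += table[i][j] * table[k][m]
                    elif m < j:
                        discordant += table[i][j] * table[k][m]
    return (concordant - discordant) / (concordant + discordant)

# Rows: SE capability (lower, moderate, higher)
# Cols: project performance (lower, moderate, best)
counts = [
    [5, 6, 2],    # lower capability,    N = 13
    [5, 10, 2],   # moderate capability, N = 17
    [5, 2, 9],    # higher capability,   N = 16
]

print(f"gamma = {goodman_kruskal_gamma(counts):.2f}")  # ~0.32, as reported
```

Running this on the inferred counts reproduces the slide's Gamma of roughly 0.32; the p-value quoted on the slide is not computed in this sketch.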

Product Architecture Capability vs. Project Performance

The product architecture assessment examined:
- High-level product structure documentation, including multiple views
- Interface descriptions

Better product architecture has a moderately strong to strong positive relationship with better performance.

Summary of Relationships

Relationship of SE processes to program performance, measured by Gamma (strength of relationship), from strongest to weakest:

  Reqts + Tech Solution      49%
  Architecture               40%
  Trade Studies              37%
  Technical Solution         36%
  IPT Capability             34%
  Reqts Devel & Mgmt         33%
  Overall SE Capability      32%
  Validation                 28%
  Risk Mgmt                  28%
  Verification               25%
  Product Integration        21%
  Config Mgmt                13%
  Project Planning           13%
  Project Monitor/Control   -13%

These composite measures span the range from strong relationships at the top of the list, through moderately strong to strong and moderately strong relationships, to weak relationships at the bottom.

Moving Forward

Study results have been adopted by several major aerospace and defense suppliers, which have:
- Used the survey instrument to assess their internal projects
- Compared results against benchmarks established by the study
- Used results to guide SE process improvement activities

Further outreach:
- Presented study results and recommendations to OSD in 2007
- Held discussions with IEEE in 2009 regarding extension of the study to a wider audience
- Briefed OSD leadership (Mr. Stephen Welby) in May 2010
  - Received an enthusiastic response
  - Interest in gathering more data
  - Some interest in disseminating data throughout DoD
  - Some interest in incorporating findings into DoD acquisition guidance

So, here we are today.

The NEW SE Effectiveness Committee

Role, designee, and affiliations:
- Project Manager: William Lyons, IEEE AESS Board of Governors, The Boeing Company
- Deputy Project Manager: Robert C. Rassa, President, NDIA Systems Engineering Division, Raytheon Systems Company
- Deputy Project Manager: Alan R. Brown, Chair, NDIA Systems Engineering Effectiveness Committee, The Boeing Company
- OSD Liaison: Michael McLendon, OSD (DDR&E)*
- Lead Researcher: Joseph P. Elm, Software Engineering Institute

Companies represented on the SE Effectiveness Committee: Boeing, Oliva Engineering, Textron Systems, Georgia Tech, OSD, USAF AFMC/EN, Harris, Raytheon, USAF SAF/AQRE, INCOSE, Sikorsky, Northrop Grumman, Lockheed Martin, Software Engineering Institute, General Dynamics.

* On IPA assignment from the Software Engineering Institute

The Mission

Promote the achievement of quantifiable and persistent improvement in project performance through appropriate application of systems engineering principles and practices:
- Identify principles and practices shown to provide benefit
  - This is an extension and a confirmation of the prior NDIA study
- Assist DoD, industry, and academia in developing the guidance and direction to implement those principles and practices
- Assist DoD, industry, and academia in establishing a means of monitoring and tracking the results of these efforts
  - An ongoing data collection and analysis process

Tenets of the NEW SE Effectiveness Survey

- All data will be submitted anonymously: no data collected will identify the respondent, project, or organization.
- All data will be handled confidentially:
  - Data will be submitted directly to a secure web site managed by the SEI. The SEI is a federally funded research and development center; it does not compete with any responding organizations, and it frequently operates as a trusted broker in matters of confidential and proprietary information.
  - Only authorized SEI staff will have access to the submitted data.
- Only aggregated data will be released to the participants and the public: no released data will be traceable to a project, person, or organization.

Participation

Our target audience is project managers, chief engineers, lead systems engineers, etc. of projects delivering products (not services):
- Not limited to defense industries; all industries are welcome
- Not limited to US companies; all are welcome

Reaching potential respondents:
- Grass-roots approach: broadcast an invitation to participate to members of participating organizations (NDIA, IEEE-AESS, INCOSE)
- Top-down approach: identify SE leadership at major companies
  - Network through participating organizations (NDIA, IEEE-AESS, INCOSE)
  - Contact them directly and solicit their support to identify potential respondents within their companies and promote participation

Status 1

- Survey questionnaire complete (an update of the 2006 questionnaire)
- Contacted several major defense contractors (via committee members) to solicit participants: Boeing, General Dynamics, Harris, Lockheed Martin, Northrop Grumman, Sikorsky, Textron, and others
- Canvassed the memberships of IEEE-AESS and INCOSE to identify participant candidates:

  Organization   Inquiries Sent   Inquiries Delivered   Responses Rec'd (to date)   Invitations Sent (to date)
  IEEE-AESS      3555             3341                  15                          15
  INCOSE         7947             7042                  95                          95
  NDIA           0                0                     0                           0

Status 2

Survey response collection:
- Web site open and active through 31-Jan-2012
- Number of survey responses started (to date): 95
- Number of survey responses completed (to date): 62

Why should you participate?

- It's good for you: a better understanding of the effectiveness of specific SE practices will help you do your job better, and help you justify SE efforts to your management.
- It's good for your company: a business case for SE will help your company apply resources where they can have the most impact.
- It's good for the world: better SE leading to better projects will produce lower costs, faster deliveries, and better performance for systems.

As in the prior NDIA study of SE effectiveness, survey participants will receive early access to study results, enabling them to evaluate their SE practices against an industry benchmark.

Please Help Us Make this Study a Success!

For more information, contact:
- William F. Lyons, IEEE-AESS Board of Governors: william.f.lyons@boeing.com
- Joseph P. Elm, Software Engineering Institute: jelm@sei.cmu.edu
- Alan R. Brown, NDIA SE Effectiveness Committee Chair: alan.r.brown2@boeing.com
- Robert C. Rassa, NDIA SE Division Chair: RCRassa@raytheon.com