Some Critical Success Factors for Industrial/Academic Collaboration in Empirical Software Engineering




Some Critical Success Factors for Industrial/Academic Collaboration in Empirical Software Engineering
Barry Boehm, USC (in collaboration with Vic Basili)
EASE Project Workshop
November 7, 2003

Outline
- Relevant (U.S.) CeBASE experiences
  - NASA High Dependability Computing Program
  - Army Future Combat Systems
- Some critical success factors
  - Incremental results
  - Sustained upper management commitment
  - CRACK participants
  - No missing links in adoption chain
  - Fully collaborative activities
  - Careful definition of data, metadata
  - Careful handling of intellectual property
- Conclusions

Relevant CeBASE Experiences
- U.S. context; may be different in Japan
- Government-sponsored collaborations
  - NASA High Dependability Computing Program
  - Army Future Combat Systems
  - NASA Software Engineering Lab
  - FAA Air Traffic Control Systems
- Direct industry collaborations
  - USC, UMD, FC-MD affiliate programs

Applied Research: NASA High Dependability Computing Program
- Project Goal: Increase NASA's ability to engineer highly dependable software systems via the development of new techniques, processes, and technologies.
- Research Goal: Develop high-dependability technologies, assess their effectiveness under varying conditions, and transfer them into practice at NASA.
- Partners: CMU (PI), UMD, USC, MIT, U. Washington; 2002-2006
- UMD/USC Level of Effort: $5 million over 5 years
- Activities:
  - Empirical investigation of NASA and NASA-contractor dependability problems
  - Development of new technologies and engineering principles to address general forms of the problems
  - Evaluation and iterative improvement of our results using realistic testbeds
  - Model-based technology transfer, providing technology users with results on the effectiveness of the technology under varying conditions

HDCP Testbed Objectives
- Buy down risks of using new HDCP technologies
  - Pre-qualify new technologies in mission context
  - Enable cost-effective HDCP technology integration
- Dependability objectives vary by mission
  - Testbeds provide mission-relevant cost-effectiveness data
- Accelerate pace of HDCP technology maturity, relevance
  - Via early and accurate feedback
- Accelerate pace of technology transition
  - Usually around 18 years for software engineering technology

Accelerating Technology Maturity via Hierarchical Testbeds
- Level 1: Researcher-specific testbeds
  - Scenarios oriented around the researcher's technology
- Level 2: Common, distributable, mission-representative testbeds
  - Integrating Level 1 testbeds into a common framework
  - Full complement of supporting capabilities
- Level 3: On-site, off-line mission testbeds
  - Test technology on actual NASA computers and software
  - Ability to use Level 2 supporting capabilities
- Level 4: On-site, live mission platforms and software
  - Carefully prepared; the real proof of the pudding

SCRover Response to HDCP Testbed Criteria - I
- Representative of NASA, NASA-related missions
  - First external application of JPL MDS technology
  - Campus public safety robot using state-based autonomous control
  - Extensive review, support by JPL MDS personnel
- Full complement of supporting capabilities (current state)
  - Specs and code (UML, C++ baseline, xADL extension)
  - Mission scenario generations (MDS GEL-based)
  - Instrumentation (xADL/Mae assertion checks)
  - Tracers (seeded defects based on SCRover development)
  - Data analysis tools (xADL/Mae)
  - Experimental guidelines (FC-MD guidelines)

Defect Seeding
- Suppose an HDCP technology finds 3 defects
  - Is this 100% of 3 defects, or 3% of 100 defects?
- Defect seeding
  - Seed the testbed software with 10 defects
  - Suppose the HDCP technology finds 6 of the 10 seeded defects (60%)
  - Can estimate that it found 3 of 5 unseeded defects (60%)
- Assumptions
  - Seeded defects representative of existing defects
    - SCRover: obtained from project inspections, testing
    - Can also use representative NASA defect distributions
  - Test profile representative of operational profile
    - SCRover: use representative NASA mission scenarios
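The estimate above is a simple proportional argument: the detection rate observed on seeded defects is assumed to hold for real defects. A minimal sketch in Python using the slide's numbers (the function and variable names are illustrative, not part of the testbed tooling):

```python
# Defect-seeding estimate: assumes seeded defects are representative of
# real ones, so the seeded-defect detection rate applies to real defects.

def estimate_real_defects(seeded: int, seeded_found: int, unseeded_found: int):
    """Return (detection rate, estimated total real defects)."""
    detection_rate = seeded_found / seeded            # e.g. 6 / 10 = 0.6
    estimated_total = unseeded_found / detection_rate # e.g. 3 / 0.6 = 5
    return detection_rate, estimated_total

rate, total = estimate_real_defects(seeded=10, seeded_found=6, unseeded_found=3)
print(f"Detection rate: {rate:.0%}")             # 60%
print(f"Estimated real defects: {total:.0f}")    # ~5, so ~2 likely still latent
```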

Example Intervention: xADL/Mae
- Refined SCRover UML specs into xADL*
- Analyzed consistency, behavior with Mae tools
- Instrumented code with xADL assertions
- SCRover testbed a good match for ADL interventions
  - Straightforward UML-to-xADL elaboration
  - Basic testbed infrastructure in place; usable for run-time assertion checking
  - Modest level of effort: 160 person-hours over 2 months
- xADL/Mae able to find 15 of 38 known defects, plus 6 unknown defects
- Defect seeding analysis, defect distributions help determine what HDC techniques to apply next
- Successful comparative test of CMU Acme ADL
  - xADL and Acme found complementary defects
  - Led to NASA/USC/CMU effort to integrate, apply ADLs
* xADL: XML-based Architecture Description Language

Mae Defect Detection Yield by Type
[Bar chart: for each defect type (Interface, Class/Obj, Logic/Alg, Ambiguity, Data Values, Other, Inconsistency), the number of defects, the number represented in Mae, and the number Mae detected.]

Applied Research: Army Future Combat Systems (FCS)
- Complex system of systems (CSOS): $4 billion for Increment 1
- CeBASE funded by FCS and OSD Software Intensive Systems
  - Third year; $1.2 million per year
- Members of the SW Steering Committee and Program Office software support team
  - Ensure software issues are addressed throughout the program
  - Provide proactive expert consultation to the Program Office and integration contractor (Boeing)
  - Collaborate with Boeing to apply the risk-driven spiral model to software and system acquisition
  - Capture and analyze empirical experience data to support downstream program decisions and future CSOS acquisitions

Future Combat Systems Risk Example: Limited Speed of CSOS Software Development
- Many CSOS scenarios require close coupling of complex software across several systems and subsystems
- Well-calibrated software estimation models agree that there are limits to development speed in such situations
- Estimated development schedule in months for closely coupled SW, with size measured in equivalent KSLOC (thousands of source lines of code):

  Months ≈ 5 * (KSLOC)^(1/3)

  KSLOC:    300   1000   3000   10,000
  Months:    33     50     72      108

- Strategy to meet end-of-decade target (over 10,000 KSLOC): Use the SAIV process. Architect for parallel incremental development, rapid integration of smaller supplier components.
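The table follows directly from the cube-root schedule relation reconstructed above; a small sketch that reproduces the slide's rounded values:

```python
# Reproduce the slide's schedule table from Months ~= 5 * KSLOC**(1/3).

def schedule_months(ksloc: float) -> float:
    """Estimated development schedule in months for closely coupled software."""
    return 5 * ksloc ** (1.0 / 3.0)

for ksloc in (300, 1000, 3000, 10_000):
    print(f"{ksloc:>7} KSLOC -> {schedule_months(ksloc):4.0f} months")
# Prints approximately 33, 50, 72, and 108 months, matching the slide.
```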

How Much Architecting Is Enough? - A COCOMO II Analysis
[Chart: percent of time added to overall schedule vs. percent of project schedule devoted to initial architecture and risk resolution, plotted for 10 KSLOC, 100 KSLOC, and 10,000 KSLOC projects. Curves show the added schedule devoted to rework (the COCOMO II RESL factor) and the total % added schedule; each total curve has a "Sweet Spot" minimum.]
- Sweet Spot Drivers:
  - Rapid Change: moves the sweet spot leftward
  - High Assurance: moves the sweet spot rightward
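To make the shape of the tradeoff concrete, here is a toy sketch (not the calibrated COCOMO II RESL model): it assumes a purely illustrative exponential rework curve and finds the architecting investment that minimizes total added schedule, which is the "sweet spot" idea in the chart.

```python
import math

# Toy sweet-spot illustration: more up-front architecting means less
# added rework later. The rework curve and constants below are made up
# for illustration only; they are NOT COCOMO II calibration data.

def total_added_schedule(arch_pct: float,
                         rework_at_zero: float = 50.0,
                         decay: float = 0.07) -> float:
    """Total % added schedule = architecting % + assumed remaining rework %."""
    rework_pct = rework_at_zero * math.exp(-decay * arch_pct)
    return arch_pct + rework_pct

# Scan candidate architecting investments and report the minimum total.
candidates = range(0, 61, 5)
sweet_spot = min(candidates, key=total_added_schedule)
print(f"Sweet spot near {sweet_spot}% architecting "
      f"(total added schedule ~{total_added_schedule(sweet_spot):.0f}%)")
```

Larger, higher-assurance systems behave as if the rework penalty were steeper, pushing the minimum to the right; rapid change penalizes long architecting phases, pushing it to the left.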

Future Combat Systems Risk Example: COTS Upgrade Synchronization and Obsolescence
- Risk: Many subcontractors means a proliferation of evolving COTS interfaces
  - Strategy: Emphasize COTS interoperability in the source selection process. Establish a COTS tracking system and refresh strategy.
- Risk: Aggressively-bid subcontracts can lead to delivery of obsolete COTS
  - New COTS released every 8-9 months (GSAW)
  - COTS unsupported after 3 releases (GSAW)
  - An actual delivery: 120 COTS products; 46% unsupported
  - Strategy: Contract provisions ensuring delivery of refreshed COTS products.
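A quick back-of-the-envelope check of the obsolescence window these figures imply (the release interval, support policy, and delivery counts come from the slide; everything else is illustrative):

```python
# COTS obsolescence arithmetic from the slide's figures.

release_interval_months = (8, 9)   # new COTS release every 8-9 months (GSAW)
supported_releases = 3             # support dropped after 3 releases (GSAW)

low, high = (supported_releases * m for m in release_interval_months)
print(f"Support window: roughly {low}-{high} months after a product's release")

delivered_cots = 120               # products in the cited delivery
unsupported_fraction = 0.46        # 46% unsupported at delivery
print(f"Unsupported at delivery: ~{round(delivered_cots * unsupported_fraction)} "
      f"of {delivered_cots} products")
```

So a COTS product chosen early in an aggressively-bid, multi-year subcontract can easily fall out of vendor support (about two years) before delivery, which is why refresh provisions belong in the contract.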

CeBASE CSOS Experience Base: Risks, Issues, Lessons Learned
- Building a lessons-learned experience base to
  - learn from early phases of FCS
  - improve later phases of FCS
  - provide an experience base for other DoD projects
- Example experience bases:
  - An independent report of the top ten software risks as identified by the Software Team
  - A web-accessible software issue tracking system that captures select program issues brought to the attention of the software steering committee
  - A web-accessible lessons-learned experience base that analyzes and synthesizes the software problem areas and tracks their evolution and resolution over time

Outline
- Relevant (U.S.) CeBASE experiences
  - NASA High Dependability Computing Program
  - Army Future Combat Systems
=> Some critical success factors
  - Incremental results
  - Sustained upper management commitment
  - CRACK participants
  - No missing links in adoption chain
  - Fully collaborative activities
  - Careful definition of data, metadata
  - Careful handling of intellectual property
- Conclusions

Upper Management Commitment
- Personal and organizational commitment
- Stable sources of funding, key personnel, data
- Participation in reviews
- Responsiveness to problem situations

CRACK Participants
- Collaborative: otherwise no teamwork
- Representative: otherwise poorly-matched projects
- Authorized: otherwise authorization delays or misleading commitments
- Committed: otherwise missing participation, contributions
- Knowledgeable: otherwise delays, unacceptable products
To get value from the collaboration, don't send the people you won't miss. Do send your crack (expert) people.

No Missing Links in Adoption Chain
- Technology Developers -> Technology Advocates -> Early Adopters -> Mainstream Adopters
- Avoid communication gaps
  - About technology, user domain knowledge
- Ensure rapid adaptation to change, problems

Fully Collaborative Activities
- Some co-location; some electronic collaboration
- Coverage of all adoption-chain links
- Co-evaluation of processes, tools, methods, metrics
  - Common core with special industry extensions
- Group prioritization activities
- Stakeholder win-win negotiations

Careful Definition of Data, Metadata
- Common core with special industry extensions
- Management-relevant data
  - But not used in performance reviews
- Low data collection overhead
  - E.g., log file interpretation

Intellectual Property
- Data protection
- Data summarization
- Tool rights
- Non-disclosure agreements
- Don't overdo; don't underdo

Conclusions
- Some definite successes and failures
- Critical success factors explain most differences
  - Incremental results
  - Sustained upper management commitment
  - CRACK participants
  - No missing links in adoption chain
  - Fully collaborative activities
  - Careful definition of data, metadata
  - Careful handling of intellectual property