IEEE P1633 Software Reliability Best Practices
Nematollah Bidokhti, Working Group Member




Introduction and motivation

- Software reliability is a fundamental prerequisite for virtually all modern systems.
- Software reliability research has been generated over the last several decades, but guidance on how to apply the resulting models has lagged significantly.
- Practitioners need pragmatic guidance and tools to apply software reliability models to real software or firmware projects during each stage of the software development lifecycle.
- Reliability engineers may lack software development experience, while software engineers may be unfamiliar with methods for predicting software reliability.

IEEE 1633: Recommended Practice on Software Reliability

Background
- The initial 2008 release is undergoing revision by the IEEE Reliability Society (IEEE-RS).
- The update will provide actionable steps for analyzing, predicting, and measuring software reliability, as an aid for managing deployment and sustainment.

Purpose
- Defines recommended practices for predicting software reliability early in development, which facilitates planning, sensitivity analysis, and tradeoff studies.
- Defines recommended practices for estimating software reliability during test and operation.

Scope
- Determine whether software or firmware meets its reliability objective.
- Useful to organizations wishing to identify methods, equations, and criteria to quantitatively assess software or firmware reliability.
- Also beneficial to organizations acquiring software, which benefit from knowing the estimated reliability of the software prior to acquisition.

Objectives
- Equip reliability engineers, software quality engineers, and software managers with the tools needed to assess software reliability.

Table of contents

Section   Contents
1, 2, 3   Overview, definitions, and acronyms
4         Tailoring guidance
5         Actionable procedures with checklists and examples
5.1       Planning for software reliability
5.2       Develop a failure modes analysis
5.3       Apply SRE during development
5.4       Apply SRE during testing
5.5       Support the release decision
5.6       Apply SRE in operation
Annex A   Detailed procedures on predicting size, and supporting information for the predictive models
Annex B   Supporting information for the software reliability growth models
Annex C   Examples
Annex D   Supporting information on the software FMEA

Section 4 - SRE Tailoring

The document is geared toward four different roles, any industry, and any type of software, so Section 4 provides guidance for tailoring it:
- By role: recommended sections for reliability engineers, software QA, software managers, and acquisitions personnel.
- By life cycle: how to apply the document with an incremental life cycle model.
- By criticality: some SRE tasks are essential while others are project specific.

Section 5.1 - Planning for software reliability

This section provides the planning tasks that are essential prerequisites for the other sections and tasks:
- Characterize the software system:
  - Identify the software Line Replaceable Units (LRUs) of the system.
  - Construct a software Bill of Materials (BOM).
- Characterize the operational environment: identify the impact of the software design on the system and system design.
- Define failures and criticality.
- Perform a reliability risk assessment:
  - Assess product risks: safety considerations, security and vulnerability, product maturity.
  - Assess project and schedule risks: grossly inaccurate software size estimations, grossly overestimated reliability growth, defect pileup from release to release, and whether there are too many inherent risks for one release.
- Assess the data collection system.
- Review the available tools needed for software reliability.
- Develop a Software Reliability Program Plan (SRPP).

Section 5.2 - Develop Failure Modes Analysis

This section focuses on the three analyses that identify potential failure modes. Understanding the failure modes is essential for development, testing, and decision making. Real examples are included.
- Perform Defect Root Cause Analysis (RCA).
- Perform a Software Failure Modes Effects Analysis (SFMEA):
  - Prepare the SFMEA.
  - Analyze failure modes and root causes.
  - Identify consequences.
  - Mitigate.
  - Generate a Critical Items List (CIL).
  - Understand the differences between a HW FMEA and a SW FMEA.
- Include software in the system Fault Tree Analysis.

SFMEA and SFTA Viewpoints

These are complementary methods.

Section 5.3 - Apply SRE during development

This section presents methods to predict software reliability, defects and their discovery over time, failure rate, MTBF, availability, and other figures of merit.
- Identify/obtain the initial system reliability objective.
- Perform a software reliability assessment and prediction:
  - Collect data about the project and product.
  - Select a model for predicting software reliability to use early in development.
  - Apply software reliability models early in development:
    - Step 1: Predict total defects.
    - Step 2: Predict when the defects will be discovered over time.
    - Step 3: Predict failure rate.
    - Step 4: Predict reliability.
    - Step 5: Predict availability.
  - Apply software reliability models with incremental development.
  - Use the assessment to qualify a subcontractor, COTS, or FOSS vendor:
    - Assess the software reliability of COTS LRUs.
    - Assess the software reliability of Free Open Source Software (FOSS) LRUs.
  - Sanity check the prediction.
  - Merge the software reliability predictions into the overall system prediction.
- Determine an appropriate overall software reliability requirement:
  - Plan the reliability growth.
  - Perform a sensitivity analysis.
  - Allocate the required software reliability to the software LRUs.
- Employ software reliability metrics for the transition to testing.

If you can predict the fault profile, you can predict all of the other reliability figures of merit: the predictive models predict the fault profile first, and then the failure rate, MTBF, reliability, and availability are predicted from it.
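A minimal sketch of that chain, assuming one widely used exponential (Goel-Okumoto style) fault profile; the symbols a (total inherent defects), b (per-hour defect discovery rate), and MTTR are illustrative assumptions, and the standard's annexes cover other models:

    N(t) = a (1 - e^{-bt})              % expected cumulative faults discovered by time t
    \lambda(t) = dN/dt = a b e^{-bt}    % failure rate (intensity)
    MTBF(t) = 1 / \lambda(t)
    R(\tau) = e^{-\lambda(t) \tau}      % reliability over a mission of length \tau
    A = MTBF / (MTBF + MTTR)            % availability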

Section 5.4 - Apply SRE during testing

This section presents methods to estimate software reliability based on a variety of software reliability growth models (an illustrative sketch of applying one such model follows this list).
- Develop a reliability test suite:
  - Increase test effectiveness via fault insertion.
  - Measure test coverage.
- Collect fault and failure data during testing.
- Select reliability growth models based on:
  - The observed fault rate.
  - Inherent defect content.
  - The effort required to use the model(s).
  - The availability of data required for the model(s).
- Apply software reliability growth models with an incremental or evolutionary life cycle model.
- Apply software reliability metrics.
- Determine the accuracy of the predictive and reliability growth models.
- Revisit the defect root cause analysis.
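As a rough illustration, a hedged C++ sketch of evaluating an exponential growth model against observed cumulative fault counts; the parameters a and b are assumed to have been fitted elsewhere, and all names and numbers here are illustrative, not prescribed by the standard:

    #include <cmath>
    #include <cstdio>
    #include <utility>
    #include <vector>

    // Illustrative exponential (Goel-Okumoto style) growth model.
    // a = estimated total inherent defects, b = per-hour detection rate;
    // both are assumed to have been fitted from the collected test data.
    struct GrowthModel {
        double a, b;
        double cumulative(double t) const { return a * (1.0 - std::exp(-b * t)); }
        double intensity(double t)  const { return a * b * std::exp(-b * t); }
    };

    int main() {
        GrowthModel m{100.0, 0.002};  // hypothetical fitted parameters
        // (test hours, cumulative faults observed) - made-up data points
        std::vector<std::pair<double, double>> observed = {
            {100.0, 17.0}, {300.0, 44.0}, {600.0, 70.0}};
        // Compare observed cumulative faults to the model's prediction,
        // a simple check of the fitted model's accuracy.
        for (const auto& [t, n] : observed) {
            std::printf("t=%6.0f h  observed=%5.1f  predicted=%5.1f\n",
                        t, n, m.cumulative(t));
        }
        std::printf("current failure rate: %.4f failures/h\n", m.intensity(600.0));
        return 0;
    }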

Section 5.5 - Support the Release Decision

Once development and testing are complete, the SRE analyses, models, and metrics can be used to determine whether the release should be accepted.

The decision is based on:
- Requirements and operational profile coverage.
- Stress test coverage.
- Code coverage.
- Adequate defect removal.
- Confidence in the reliability estimates.

SRE tasks performed prior to acceptance (worked formulas for two of them follow this list):
- Determine release stability: do the reliability estimates meet the objective?
- Forecast additional test duration: if the objective hasn't been met, how many more test hours are required?
- Forecast the remaining defects and the effort required to correct them: will the forecasted defects pile up and impact the next release?
- Perform a software Reliability Demonstration Test (RDT): determine statistically whether the software meets the objective.
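Under the exponential model sketched earlier, two of these tasks have simple closed forms; these are illustrative assumptions rather than the standard's only prescribed formulas:

    % Additional test duration: with failure rate \lambda(t) = a b e^{-bt},
    % the test time at which a target rate \lambda_obj is reached is
    t_obj = (1/b) ln(a b / \lambda_obj),    \Delta t = t_obj - t_now

    % Zero-failure RDT: for an exponential failure model, demonstrating
    % MTBF >= \theta_0 at confidence C with no observed failures requires
    T = -\theta_0 ln(1 - C)    (e.g., T is about 2.3 \theta_0 at C = 0.9)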

Section 5.6 - Apply SRE in Operations

Once the software is deployed, its reliability should be monitored to assess any changes needed to previous analyses, predictions, and estimations:
- Employ software reliability metrics, such as Software Defects Per Million Hours (SWDPMH), to monitor operational reliability.
- Compare operational reliability to predicted reliability.
- Assess changes to previous characterizations or analyses.
- Archive operational data.

Software Reliability Tools - Examples
- Software BOM
- SFMEA
- SFIT
- SWDPMH

Software BOM (Software Bill of Materials)

Each software LRU in the BOM is assigned a class code:

SW BOM class       Code
SW assembly        210
Company owned      221
COTS               213
FOSS               224
Mutual agreement   231
License based      232

Key:
- Software assembly: consists of other software LRUs.
- Company owned: the software LRU is not outsourced.
- COTS: Commercial Off The Shelf software.
- FOSS: Free Open Source Software.
- Mutual agreement: the component has a special mutual agreement for its use.
- License based: requires a software license to use.

(The original slide shows an example tree in which the HW BOM decomposes into software LRUs labeled with these class codes.)
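If one were capturing such a BOM in code, a minimal C++ sketch might look like the following; the class names and codes come from the slide, while the types and field names are illustrative assumptions:

    #include <string>
    #include <vector>

    // SW BOM class codes as listed on the slide.
    enum class SwBomClass {
        SwAssembly      = 210,  // consists of other software LRUs
        CompanyOwned    = 221,  // the LRU is not outsourced
        Cots            = 213,  // commercial off the shelf
        Foss            = 224,  // free open source software
        MutualAgreement = 231,  // special mutual agreement for use
        LicenseBased    = 232   // requires a software license
    };

    // One line item in the software bill of materials (illustrative shape).
    struct SwBomItem {
        std::string name;
        SwBomClass bomClass;
        std::vector<SwBomItem> children;  // non-empty only for SW assemblies
    };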

Which one: RCA, SFMEA, or SFTA?

Software FMEAs can be conducted from different viewpoints:

- Functional (product-level viewpoint: requirements). Identifies failures related to timing, sequence, faulty data, and erroneous error messages for a component. Life cycle timing: at SRS completion.
- Interface (viewpoint: the interface between two components). Identifies failures related to timing, sequence, faulty data, and erroneous error messages between two components. Life cycle timing: at interface design spec completion.
- Detailed (viewpoint: at class or module level). Identifies all of the above plus memory management, algorithm, I/O, and database issues. Life cycle timing: when detailed design or code is complete.
- Production (viewpoint: process-related failures during development). Identifies problems with many defects and/or the ability to meet a schedule, as well as execution and tools. Life cycle timing: any time.
- Maintenance (viewpoint: changes to the software). Identifies problems when software is modified, installed, or updated. Life cycle timing: during maintenance.
- Usage (viewpoint: user friendliness, consistency, and documentation). Identifies software or documentation that is too difficult or inconsistent to be used properly. Life cycle timing: as early as possible, since these issues will influence design.

Software FMEA

(Example figure in the original slide.)

Failure Modes Examples

(Example figures in the original slide.)

Checklist for performing software fault insertion (a sketch of one inserted fault follows this list):

1. Collect customer field failure data and brainstorm failure modes applicable to the software.
2. Develop a software taxonomy for the system.
3. Identify which failure modes are most critical and applicable for each function of code.
4. Plan how to insert the faulty condition that causes each failure mode.
5. Insert the faulty condition by instrumenting the system.
6. Capture how the system behaves when the fault is inserted.
7. Evaluate the test results and make modifications to the requirements, design, or code as applicable.
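A minimal C++ sketch of steps 5 and 6, assuming compile-time instrumentation; the build flag, the function, and the chosen fault (an out-of-range sensor reading, i.e., a faulty-data failure mode) are illustrative, not prescribed by the standard:

    #include <cstdio>

    // Hypothetical sensor read. Compiling with -DFAULT_INSERTION activates
    // an inserted fault that simulates the "faulty data" failure mode.
    double read_sensor() {
    #ifdef FAULT_INSERTION
        return -1.0;   // inserted fault: corrupted, out-of-range reading
    #else
        return 42.0;   // stand-in for a real device read
    #endif
    }

    int main() {
        double v = read_sensor();
        // Step 6: capture how the system behaves when the fault is present.
        if (v < 0.0) {
            std::printf("fault handled: sensor value %.1f rejected\n", v);
            return 1;
        }
        std::printf("sensor value: %.1f\n", v);
        return 0;
    }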

SFIT

For IR-based fault insertion, the user applies a compiler that translates programs written in high-level languages such as C and C++ into a low-level intermediate representation (IR). The IR preserves type information from the source level while also representing the detailed control and data flow of the program, which makes it a precise place to insert faults.

SWDPMH

A normalized value that shows the rate at which software defects occur per million hours of user system/product usage:
- Every defect experienced by the user, regardless of its severity, is counted in SWDPMH.
- There is a short time lag between the software's release and the start of the SWDPMH calculation, to build an acceptable install base; it can be calculated as early as six months after release.
- It is a solid metric for gauging the health of the software in the early stages of deployment.

SWDPMH Calculation

The calculation uses the number of defects reported by customers, across all severities (1, 2, 3, ...), and the number of installed units. There are approximately 730 hours in a month, so assuming the software runs continually:

    SWDPMH = (sum of defects in the last 3 months * 1,000,000)
             / (sum of installed units for the last 3 months * 730)
             * correction factor

Example:

    SWDPMH = ((120 + 220 + 180) * 1,000,000) / ((11100 + 23300 + 67700) * 730) * 1.02
           = (520 * 1,000,000) / (102100 * 730) * 1.02
           = 7.12

A correction factor of 1.02 implies that 2% of customer-found defects did not have an associated version.
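The same arithmetic as a small C++ helper using the slide's numbers; the function name, the container types, and the trailing 3-month window are illustrative assumptions:

    #include <cstdio>
    #include <numeric>
    #include <vector>

    // SWDPMH per the slide: defects per million hours of usage over a
    // trailing window, assuming continuous operation (~730 h per month).
    double swdpmh(const std::vector<int>& defects_per_month,
                  const std::vector<int>& units_per_month,
                  double correction_factor) {
        const double defects =
            std::accumulate(defects_per_month.begin(), defects_per_month.end(), 0);
        const double unit_hours =
            std::accumulate(units_per_month.begin(), units_per_month.end(), 0) * 730.0;
        return defects * 1000000.0 / unit_hours * correction_factor;
    }

    int main() {
        // The slide's worked example: 3 months of defects and installed units.
        std::printf("SWDPMH = %.2f\n",
                    swdpmh({120, 220, 180}, {11100, 23300, 67700}, 1.02));  // 7.12
        return 0;
    }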

Summary

- IEEE P1633 puts forth recommended practices for applying qualitative software failure modes analyses and quantitative models to improve the product and to ensure that software or firmware is delivered with the required reliability.
- IEEE P1633 includes improved guidance that offers increased value and is more accessible to a broader audience:
  - Reliability engineers
  - Software quality engineers
  - Software managers
  - Acquisitions

Acknowledgement of IEEE 1633 Working Group members

Chair: Ann Marie Neufelder
Vice Chair: Lance Fiondella

Members: Martha Wetherholt, Peter Lakey, Robert Binder, Michael Siok, Ming Li, Ying Shi, Taz Daughtrey, Thierry Wandji, Michael Grottke, Andy Long, George Stark, Kishor Trivedi, Allen Nikora, Bakul Banerjee, Robert Raygan, Mary Ann DeCicco, Debra Greenhalgh, Mark Sims, Rajesh Murthy, Willie Fitzpatrick, Mark Ofori-kyei, Sonya Davis, Burdette Joyner, Marty Shooman, Andrew Mack, Loren Garroway, Sheila Prather, David Kraus, Christopher Swickline, Kevin Mattos, Kevin Frye, Clair Jones, Val Korszniak, Shane Smith