How Good Is the Software: A Review of Defect Prediction Techniques
Brad Clark and Dave Zubrow
Slide 1: How Good Is the Software: A Review of Defect Prediction Techniques
Brad Clark and Dave Zubrow
Pittsburgh, PA
Sponsored by the U.S. Department of Defense
(c) 2001 by Carnegie Mellon University, Version 1.0
Slide 2: Objectives
- Awareness of defect prediction and estimation techniques
- Awareness of the value of analyzing defect data for project management and process improvement activities
- Recommendations for getting started
Slide 3: Why Analyze and Predict Defects?
- Project management: assess project progress; plan defect detection activities
- Work product assessment: decide work product quality
- Process management: assess process performance; improve capability
Slide 4: Outline
► Background
  - Definitions
  - Measuring defects: tools and techniques; attributes
Technique Review
  - Project management
  - Work product assessment
  - Process improvement
Slide 5: Definition: Software Defect
Software defect: any flaw or imperfection in a software work product or software process.
- A software work product is any artifact created as part of the software process.
- A software process is a set of activities, methods, practices, and transformations that people use to develop and maintain software work products.
- A defect is frequently referred to as a fault or bug.
Focus on predicting those defects that affect project and product performance.
Slide 6: Defects as the Focus of Prediction
- Distinguish between major and minor defects: do not use minor or documentation defects in predictions; minor defects will inflate the estimate of latent product defects.
- Most defect prediction techniques used in planning rely on historical data.
- Defect prediction techniques vary in the types of data they require: some require little data, others more; some use work product characteristics, others require only defect data.
- Techniques have strengths and weaknesses depending on the quality of the inputs used for prediction.
Slide 7: Defect Attributes Available for Analysis
- Problem status: Open (Recognized, Evaluated, Resolved), Closed
- Problem type, software (product) defects: requirements defect, design defect, code defect, operational documentation defect, test case defect, other product defect
- Problem type, other problems: hardware problem, operating system problem, user mistake, operations mistake, new requirement/enhancement, undetermined (not repeatable, value not identified)
- Uniqueness: original, duplicate
- Criticality level
- Urgency
Slide 8: Additional Attributes to Consider
- Recognition: What is the problem? When was the problem reported? Who reported the problem?
- Evaluation: What work product caused the problem? What activity discovered the problem? What activity introduced the problem?
- Resolution: What work needs to be done? What work products will be affected by the change? What are the prerequisite changes?
- Closure: When are the changes expected? What configuration contains the changes?
A defect record structured along these lines is sketched below.
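As an illustration only, here is a minimal sketch of a defect record carrying the recognition, evaluation, resolution, and closure attributes above. The field names and types are assumptions for illustration, not a standard schema.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

@dataclass
class DefectReport:
    # Recognition: what the problem is, when and by whom it was reported
    description: str
    reported_on: date
    reported_by: str
    # Evaluation: origin and discovery of the problem
    causing_work_product: Optional[str] = None   # e.g., "design document"
    discovering_activity: Optional[str] = None   # e.g., "code inspection"
    introducing_activity: Optional[str] = None   # e.g., "detailed design"
    # Resolution: what the fix involves
    work_required: Optional[str] = None
    affected_work_products: List[str] = field(default_factory=list)
    prerequisite_changes: List[str] = field(default_factory=list)
    # Closure: when and in which configuration the fix lands
    changes_expected_on: Optional[date] = None
    fixed_in_configuration: Optional[str] = None

report = DefectReport("crash on empty input", date(2001, 5, 1), "tester A",
                      discovering_activity="system test")
```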
Slide 9: Project and Process Factors Correlated with Defect Insertion
- Requirements adequacy
- Application size
- Application complexity
- COTS and reused code
- Development team capability: problem solving, designing, coding
- Development team experience: application domain, language and tools, platform
- Process maturity
- Defect detection techniques
[Figure: the factors above feed into the number of product defects.]
Slide 10: Defect Discovery Sources (how the data are generated)
Defect detection techniques (V&V):
- Static, inspections: checklist-based inspection, perspective-based inspection, Fagan-based inspection
- Static, tool-based: complexity measures, language compilers, design measures
- Dynamic: path testing, scenario-based testing, module interface testing, user interface testing
Operational (post-deployment): user discovered, system administration, environmental
Slide 11: Outline
Background
  - Definitions
  - Measuring defects: tools and techniques; attributes
► Technique Review
  - Project management
  - Work product assessment
  - Process improvement
Slide 12: Defect Prediction Techniques
Project management:
- Empirical defect prediction
- Defect discovery profile
- COQUALMO
- Orthogonal defect classification
Work product assessment:
- Fault proneness evaluation (size, complexity, prior history)
- Capture/recapture analysis
Process improvement:
- Defect prevention program
- Statistical process control
Slide 13: Technique Review: Empirical Defect Prediction (Simple 4-Phase Model)
Description: number of defects per size (defect density, number of defects per thousand lines of code) based on historical data, enhanced with historical data on the injection distribution and V&V yield to produce a discovery profile by phase.
[Figure: four-phase defect flow with an estimated 100 total injected defects. Requirements: 10% injected, V&V yield 50% (5 removed). Design: 30% injected, 35 present, yield 50% (18 removed). Code: 50% injected, 67 present, yield 50% (33 removed). Test: 10% injected, yield 50%; an estimated 21 latent defects remain, so roughly 79 of the 100 injected defects are removed in-process.]
A computational sketch of this flow follows below.
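The following is a minimal sketch of the 4-phase flow, assuming the illustrative injection fractions and 50% yields shown in the figure; a real application would use calibrated historical values.

```python
# Illustrative 4-phase defect flow; the percentages are the slide's example
# values, not calibrated data. Each phase adds its injected share of the
# total, then its V&V step removes a fraction (the yield) of what is present.

PHASES = [
    # (phase, fraction of total defects injected, V&V yield)
    ("Requirements", 0.10, 0.50),
    ("Design",       0.30, 0.50),
    ("Code",         0.50, 0.50),
    ("Test",         0.10, 0.50),
]

def defect_flow(total_injected: float = 100.0) -> float:
    escaped = 0.0
    for name, inject_frac, vv_yield in PHASES:
        present = escaped + total_injected * inject_frac
        removed = present * vv_yield
        escaped = present - removed
        print(f"{name:12s} present={present:5.1f} removed={removed:5.1f}")
    return escaped

latent = defect_flow()
print(f"latent defects: {latent:.1f}")  # ~21.9; the slide rounds to 21
```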
Slide 14: Empirical Defect Prediction (2)
When to use: for planning (total defects) and in-process monitoring of defect discovery numbers (latent defects).
Required data: historical defect density data for planning; in-process data for monitoring.
Strengths: easy to use and understand; can be implemented with minimal data.
Weaknesses: requires stable processes and a standardized life cycle; does not account for changes in the product, personnel, platform, or project.
Slide 15: Technique Review: Defect Discovery Profile
Description: projection, based on time or phases, of the defect density (or number of defects) found in-process onto a theoretical discovery curve (Rayleigh); found in the SWEEP and STEER models.
Estimated defect density discovered in phase P:
  Defects/KSLOC(P) = E * (e^(-B(P-1)^2) - e^(-B*P^2))
where E = total defects/KSLOC, B = efficiency of the discovery process, and P = phase number.
Estimated latent defect density remaining after phase P:
  E * e^(-B*P^2)
A sketch of the profile computation follows below.
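Below is a minimal sketch directly encoding the two formulas above. The values of E and B are illustrative assumptions; in practice both are fitted to historical defect density data.

```python
import math

def density_in_phase(E: float, B: float, P: int) -> float:
    """Defect density expected to be discovered in phase P (P = 1, 2, ...)."""
    return E * (math.exp(-B * (P - 1) ** 2) - math.exp(-B * P ** 2))

def latent_after_phase(E: float, B: float, P: int) -> float:
    """Defect density still latent after phase P."""
    return E * math.exp(-B * P ** 2)

E, B = 20.0, 0.3  # illustrative: 20 defects/KSLOC total, fitted shape factor
for P in range(1, 7):
    print(f"phase {P}: found {density_in_phase(E, B, P):5.2f} "
          f"latent {latent_after_phase(E, B, P):5.2f}")
```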
Slide 16: Defect Discovery Profile (2)
When to use: as early in the life cycle as defect data collection permits.
Required data: historical defect density data, estimated and actual size, and consistently tracked defect counts.
Strengths: predicts defect density by time period, enabling estimation of the defects to be found in test.
Weaknesses: no insight into, or adjustment mechanism for, B; changes in the product, personnel, platform, or project will impact defect predictions.
Slide 17: Technique Review: COQUALMO
Description: a defect prediction model for the requirements, design, and coding phases, based on the sources of defect introduction and the discovery techniques used.
Estimated number of defects introduced per phase:
  Introduced = A * (Size)^B * QAF
where the QAF are 21 quality adjustment factors characterizing the people, product, platform, and project.
Estimated number of residual defects, per defect type j:
  Residual_j = C_j * (Defects Introduced)_j * (1 - DRF_j)
where the detection and removal factors (DRF) cover automated analysis, people reviews, and execution testing and tools.
A sketch of the model's shape follows below.
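As a sketch of the model's shape only: the constants A, B, C_j, the QAF ratings, and the DRF values below are placeholders, not the calibrated COQUALMO parameters.

```python
from math import prod  # Python 3.8+

def defects_introduced(size_ksloc: float, A: float = 10.0, B: float = 1.0,
                       qaf: tuple = (1.0,)) -> float:
    """Introduced = A * Size^B * product of quality adjustment factors."""
    return A * size_ksloc ** B * prod(qaf)

def residual_defects(introduced: float, C: float = 1.0,
                     drf: float = 0.5) -> float:
    """Residual_j = C_j * Introduced_j * (1 - DRF_j)."""
    return C * introduced * (1.0 - drf)

# Placeholder DRFs per phase, standing in for the combined effect of
# automated analysis, people reviews, and execution testing/tools.
for phase, drf in [("requirements", 0.4), ("design", 0.5), ("code", 0.6)]:
    intro = defects_introduced(size_ksloc=50, qaf=(1.1, 0.9))
    print(f"{phase}: residual ~ {residual_defects(intro, drf=drf):.1f}")
```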
Slide 18: COQUALMO (2)
When to use: in the planning phase of a project.
Required data: size of the product and ratings for the 21 quality adjustment factors.
Strengths: predicts defects for three phases; quantifies the effect of different discovery techniques on the detection and removal of defects; considers the effects of attributes such as product, personnel, project, and platform.
Weaknesses: covers a small number of phases; does not predict test or post-deployment defects.
Slide 19: Technique Review: Orthogonal Defect Classification
Description: classification and analysis of defects to identify project status, based on comparing current defects with historical patterns, and to identify areas for process improvement, based on analysis of defect types, triggers, impact, and source.
- Types are what was required for the fix, not the cause of the defect (e.g., function, assignment, interface).
- Triggers are catalysts that cause defects to surface (e.g., testing, inspection, conditions of operational use).
[Figure: example profile of defect types (algorithm, assignment, checking, timing) across phases (design, code, function test, system test).]
A sketch of the profile comparison follows below.
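A minimal sketch of one ODC-style comparison, assuming hypothetical historical and current defect-type counts: the current proportions of each type are compared against the historical profile, and large drifts are flagged for investigation.

```python
from collections import Counter

# Hypothetical counts; a real analysis would use classified defect records.
historical = Counter(algorithm=40, assignment=30, checking=20, timing=10)
current    = Counter(algorithm=10, assignment=25, checking=12, timing=3)

def proportions(counts: Counter) -> dict:
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

hist_p, curr_p = proportions(historical), proportions(current)
for defect_type in historical:
    drift = curr_p[defect_type] - hist_p[defect_type]
    flag = "  <-- investigate" if abs(drift) > 0.10 else ""
    print(f"{defect_type:10s} hist={hist_p[defect_type]:.2f} "
          f"curr={curr_p[defect_type]:.2f}{flag}")
```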
Slide 20: Orthogonal Defect Classification (2)
When to use: ongoing throughout the project.
Required data: an orthogonal defect classification scheme mapped to the development process; historical defect profiles.
Strengths: classifications linked to the process provide valuable insight; classification takes little time.
Weaknesses: requires development of a classification scheme, reliable classification of defects, and ongoing data collection and analysis; does not account for changes in people, process, or product.
Slide 21: Technique Review: Fault Proneness
Description: analysis of work product attributes to plan the allocation of defect detection resources (inspection and testing). A variety of models and heuristics exist:
- comparing cyclomatic complexity against a threshold
- various parametric models (e.g., discriminant analysis)
- reviewing module or component defect histories
Inputs: product characteristics (size, complexity, cohesion, coupling) and product history (number of defects found, number of modifications, amount of V&V) feed the fault proneness assessment.
A sketch of the threshold heuristic follows below.
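A minimal sketch of the simplest heuristic named above, assuming hypothetical per-module metrics and cutoffs: flag modules whose cyclomatic complexity or prior defect count exceeds a threshold, so inspection and test effort can be focused there first.

```python
from dataclasses import dataclass

@dataclass
class Module:
    name: str
    cyclomatic: int     # cyclomatic complexity
    prior_defects: int  # defects found in earlier releases

COMPLEXITY_THRESHOLD = 10  # illustrative cutoffs, not universal rules
DEFECT_THRESHOLD = 5

modules = [Module("parser", 23, 8), Module("ui", 7, 1), Module("io", 12, 2)]

fault_prone = [m for m in modules
               if m.cyclomatic > COMPLEXITY_THRESHOLD
               or m.prior_defects > DEFECT_THRESHOLD]
for m in fault_prone:
    print(f"inspect first: {m.name}")
```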
Slide 22: Fault Proneness (2)
When to use: test planning; during coding and testing.
Required data: size, complexity, coupling, historical defect data, etc.
Strengths: efficient and effective focusing of defect detection activities.
Weaknesses: in-process fault density by module or component may not predict operational fault density, so effort may be misdirected; models and assumptions are not likely to hold from one system to the next.
Slide 23: Technique Review: Capture/Recapture
Description: analysis of the pattern of defects detected within an artifact by independent defect detection activities (individual inspectors, or inspection versus test). The number of remaining defects is estimated from the overlap in defects identified independently by individual inspectors:
  N (estimated, in work product) = n(inspector 1) * n(inspector 2) / m
where m = number of defects found by both inspectors, and
  Remaining defects (est.) = N (estimated) - N (unique discovered)
A sketch of the calculation follows below.
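A minimal sketch of the two-inspector estimate, assuming hypothetical sets of defect identifiers. Note that the simple estimator is undefined when the two inspectors find no common defects; bias-corrected variants such as Chapman's handle that case.

```python
def capture_recapture(found_by_1: set, found_by_2: set) -> float:
    """Estimate total defects from the overlap of two independent searches."""
    overlap = len(found_by_1 & found_by_2)
    if overlap == 0:
        raise ValueError("no overlap: estimate undefined; consider a "
                         "bias-corrected variant such as Chapman's")
    return len(found_by_1) * len(found_by_2) / overlap

insp1 = {"D1", "D2", "D3", "D4", "D5", "D6"}
insp2 = {"D4", "D5", "D6", "D7", "D8"}
total_est = capture_recapture(insp1, insp2)  # 6 * 5 / 3 = 10
unique_found = len(insp1 | insp2)            # 8 distinct defects seen
print(f"estimated remaining: {total_est - unique_found:.1f}")  # ~2 left
```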
Slide 24: Capture/Recapture (2)
When to use: determining whether a work product should undergo re-inspection.
Required data: detailed defect descriptions from each inspector.
Strengths: can be used as soon as data are available.
Weaknesses: estimates of the number of remaining defects are best when stringent assumptions are met, and relaxing the assumptions requires more complicated estimation; the technique is more robust when used simply to predict whether a criterion for re-inspection has been exceeded.
Slide 25: Technique Review: Defect Prevention
Description: root cause analysis of the most frequently occurring defects.
- A sample of defect reports is selected for in-depth causal analysis.
- Actions are taken to make process changes or improve training, to eliminate the root cause of defects and prevent their recurrence.
[Figure: defect prevention process flow. A stage kickoff meeting, driven by a kickoff package from the repositories, starts each development stage; errors found during the stage feed a causal analysis meeting, whose suggested actions go into an action database; an action team implements the actions, and feedback on implemented actions flows back to the next stage kickoff.]
Slide 26: Defect Prevention (2)
When to use: prior to launching new projects or beginning new phases of a project.
Required data: historical defect data.
Strengths: allows comparison of defect trends over time to assess the impact and ROI of defect prevention activities.
Weaknesses: requires sampling of defects, in-depth analysis, and participation by engineers to identify root causes.
Slide 27: Technique Review: Statistical Process Control
Description: use of control charts to determine whether inspection performance is consistent with prior process performance. Process capability depicts the expected range of performance in terms of the selected attribute.
[Figure: control chart of total defects found per component inspection, plotted in inspection sequence. Individuals chart: CL = 15.81, UCL = 24.99, LCL = 6.634. Moving range chart: UCL = 11.27.]
A sketch of the chart-limit computation follows below.
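The slide's limits are consistent with an XmR (individuals and moving range) chart. Below is a minimal sketch computing such limits from hypothetical per-inspection defect counts, using the standard XmR constants 2.66 and 3.267.

```python
def xmr_limits(values):
    """Individuals (X) and moving range (MR) chart limits for an XmR chart."""
    mr = [abs(b - a) for a, b in zip(values, values[1:])]  # moving ranges
    mr_bar = sum(mr) / len(mr)
    cl = sum(values) / len(values)
    return {
        "X":  {"CL": cl, "UCL": cl + 2.66 * mr_bar, "LCL": cl - 2.66 * mr_bar},
        "MR": {"CL": mr_bar, "UCL": 3.267 * mr_bar},
    }

defects = [14, 18, 12, 20, 15, 16, 13, 17]  # hypothetical inspection counts
limits = xmr_limits(defects)
signals = [d for d in defects
           if not limits["X"]["LCL"] <= d <= limits["X"]["UCL"]]
print(limits)
print("out-of-control points:", signals)
```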
Slide 28: Statistical Process Control (2)
When to use: when inspections are being conducted.
Required data: current measures of the inspection process (e.g., defects found, preparation time, review rate); historical measures to develop the control chart.
Strengths: gives an indication of inspection and development process performance; signals provide rapid feedback suggesting re-inspection or a process anomaly.
Weaknesses: requires a stable process and real-time data collection and analysis.
Slide 29: Observations
For project management:
- models predict total defects in a product, and latent defects from in-process measures
- models use estimated and actual software size as a parameter
- models use additional factors to adjust defect estimates
For product quality:
- predictions can be developed from inspection data or product characteristics
For process improvement:
- expected process behavior can be used to gauge performance and identify opportunities for improvement
Slide 30: Observations (2)
- Prediction models are useful for planning and establishing expectations.
- Tracking against expectations: when deviations occur, some action is taken, such as reallocating resources toward defect detection, specific modules, or re-inspection. Most analysis techniques are not very explicit about the threshold that triggers investigation; the exception is control limits in SPC.
- Estimates are often inaccurate, but they are suggestive and add value to decision making and planning.
Slide 31: Recommendations for Getting Started
Get started, even with simple techniques:
- the data available will help determine the technique; the availability of historical data will drive model selection
- analyze for patterns across defects; don't just fix the defects
Measure product defects early in the life cycle:
- post-release defect tracking is the least helpful
- pre-release defect tracking by phase and by type is most helpful, but also more burdensome
The defect definition should match the intended use of the data:
- track project progress
- determine product quality
- assess process performance
Changes in the product, personnel, platform, or project must be measured and accounted for in predictions.
Slide 32: Call for Collaboration
If you are interested in using these techniques or studying their effectiveness, please contact:
Dave Zubrow
Software Engineering Measurement and Analysis
4500 Fifth Ave, Pittsburgh, PA
I look forward to hearing from you.
Slide 33: Resources and References (1)
General references:
- Fenton, N. and Neil, M. "A critique of software defect prediction models." IEEE Transactions on Software Engineering, May/June.
- Frederick, M. "Using defect tracking and analysis to improve software quality." University of Maryland, June.
- Florac, W. A. Software Quality Measurement: A Framework for Counting Problems and Defects (CMU/SEI-92-TR-22). Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University, September.
- Peng, W. W. and Wallace, D. R. "Software error analysis." NIST Special Publication.
Empirical defect prediction:
- Humphrey, W. Introduction to the Team Software Process. Reading, MA: Addison-Wesley.
- Weller, E. F. "Managing software projects using inspection data." IEEE Software.
Defect profile prediction technique:
- Gaffney, J., Roberts, W., and DiLorio, R. "A Process and Tool for Improved Software Defect Analysis and Quality Management." Software Technology Conference Proceedings, May.
COQUALMO prediction technique:
- Chulani, S. "Modeling Software Defect Introduction and Removal: COQUALMO." University of Southern California Center for Software Engineering, Technical Report USC-CSE.
Slide 34: Resources and References (2)
Orthogonal defect classification defect prediction technique:
- Chillarege, R., Bhandari, I., Chaar, J., Halliday, M., Ray, B., and Wong, M. "Orthogonal Defect Classification: A Concept for In-Process Measurements." IEEE Transactions on Software Engineering, vol. 18, no. 11, Nov. 1992.
- Bridge, N. and Miller, C. "Orthogonal Defect Classification: Using defect data to improve software development."
- El Emam, K. and Wieczorek, I. "The repeatability of code defect classifications." Proceedings of the Ninth International Symposium on Software Reliability Engineering, IEEE.
Fault proneness:
- Selby, R. and Basili, V. "Analyzing error-prone system structure." IEEE Transactions on Software Engineering, vol. 17, no. 2, Feb.
- Briand, L. C., Melo, W. L., and Wüst, J. "Assessing the applicability of fault-proneness models across object-oriented software projects." ISERN Report No. ISERN-00-06.
- El Emam, K. A Primer on Object-Oriented Measurement. National Research Council of Canada.
- Fenton, N. E. and Ohlsson, N. "Quantitative analysis of faults and failures in a complex software system." IEEE Transactions on Software Engineering, vol. 26, no. 8, August 2000.
- Ohlsson, M. C. and Wohlin, C. "Identification of green, yellow, and red legacy components." Lund University, Sweden.
Slide 35: Resources and References (3)
Capture/recapture analysis:
- Briand, L. and Freimut, B. G. "A comprehensive evaluation of capture-recapture models for estimating software defect content." IEEE Transactions on Software Engineering, vol. 26, no. 6, June 2000.
- Humphrey, W. Introduction to the Team Software Process. Reading, MA: Addison-Wesley.
- Petersson, H. and Wohlin, C. "An empirical study of experience-based software defect content estimation methods." Lund University.
Defect prevention program:
- Mays, R. G., Jones, C. L., Holloway, G. J., and Studinski, D. P. "Experiences with defect prevention." IBM Systems Journal, vol. 29, no. 1, 1990.
- Grady, R. B. "Software failure analysis for high-return process improvement decisions." Hewlett-Packard Journal, August.
- Gale, J. L., Tirso, J. R., and Burchfield, C. A. "Implementing the defect prevention process in the MVS interactive programming organization." IBM Systems Journal, vol. 29, no. 1, 1990.
Statistical process control:
- Florac, W. A. and Carleton, A. D. Measuring the Software Process: Statistical Process Control for Software Process Improvement. Reading, MA: Addison-Wesley.