Software Metrics




Definitions

- Measure: a quantitative indication of the extent, amount, dimension, capacity, or size of some attribute of a product or process. Example: the number of errors.
- Metric: a quantitative measure of the degree to which a system, component, or process possesses a given attribute; a handle or estimate of a given attribute. Example: the number of errors found per person-hour expended.

Why Measure Software?

- Determine the quality of the current product or process
- Predict qualities of a product or process
- Improve the quality of a product or process

Example Metrics

- Defect rates and error rates
- Measured by individual, by module, and during development
- Errors should be categorized by origin, type, and cost

Metric Classification

- Products: explicit results of software development activities (deliverables, documentation, by-products)
- Processes: activities related to the production of software
- Resources: inputs into the software development activities (hardware, knowledge, people)

Product vs. Process

- Process metrics give insight into the process paradigm, software engineering tasks, work products, or milestones; they lead to long-term process improvement.
- Product metrics assess the state of the project: track potential risks, uncover problem areas, adjust workflow or tasks, and evaluate the team's ability to control quality.

Types of Measures

- Direct measures (internal attributes): cost, effort, LOC, speed, memory
- Indirect measures (external attributes): functionality, quality, complexity, efficiency, reliability, maintainability

Size-Oriented Metrics

Measure the size of the software produced:
- Lines of code (LOC); 1000 lines of code = 1 KLOC
- Effort measured in person-months
- Errors/KLOC, defects/KLOC, cost/LOC, documentation pages/KLOC
- LOC is programmer- and language-dependent

LOC Metrics

- Easy to use and easy to compute
- LOC of existing systems can be computed, but cost and requirements traceability may be lost
- Language- and programmer-dependent

Function-Oriented Metrics

- Function Point Analysis [Albrecht 79, 83]; International Function Point Users Group (IFPUG)
- An indirect measure, derived using empirical relationships based on countable (direct) measures of the software system (domain and requirements)

Computing Function Points

Count the following:
- Number of user inputs: distinct input from the user
- Number of user outputs: reports, screens, error messages, etc.
- Number of user inquiries: on-line input that generates some result
- Number of files: logical files (databases)
- Number of external interfaces: data files/connections used as interfaces to other systems

Then compute:

    FP = total count * [0.65 + 0.01 * sum(F_i)]

The total count is each count multiplied by a weighting factor determined for each organization via empirical data; the F_i (i = 1 to 14) are complexity adjustment values.
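The calculation above can be sketched in a few lines. This is a minimal illustration, not the full IFPUG procedure: the per-count weights below are typical textbook "average" values, and a real organization would calibrate them from its own empirical data; the example counts are made up.

```python
# Sketch of the function point calculation: FP = total_count * (0.65 + 0.01 * sum(F_i)).
# Weights are illustrative "average" values; organizations calibrate their own.
WEIGHTS = {"inputs": 4, "outputs": 5, "inquiries": 4, "files": 10, "interfaces": 7}

def function_points(counts, adjustment_values):
    """counts: dict keyed like WEIGHTS; adjustment_values: 14 ratings, each 0-5."""
    assert len(adjustment_values) == 14
    total_count = sum(WEIGHTS[k] * counts[k] for k in WEIGHTS)
    return total_count * (0.65 + 0.01 * sum(adjustment_values))

# Hypothetical system, all 14 adjustment factors rated "average" (3):
counts = {"inputs": 20, "outputs": 12, "inquiries": 8, "files": 4, "interfaces": 2}
fp = function_points(counts, [3] * 14)  # total_count = 226, FP = 226 * 1.07
```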

Complexity Adjustment

The 14 adjustment values come from questions such as:
1. Does the system require reliable backup and recovery?
2. Are data communications required?
3. Are there distributed processing functions?
4. Is performance critical?
5. Will the system run in an existing, heavily utilized operational environment?
6. Does the system require on-line data entry?
7. Does the on-line data entry require the input transaction to be built over multiple screens or operations?
8. Are the master files updated on-line?
9. Are the inputs, outputs, files, or inquiries complex?
10. Is the internal processing complex?
11. Is the code designed to be reusable?
12. Are conversions and installations included in the design?
13. Is the system designed for multiple installations in different organizations?
14. Is the application designed to facilitate change and ease of use by the user?

Using FP

- Errors per FP, defects per FP, cost per FP
- Pages of documentation per FP
- FP per person-month

FP and Languages

    Language    LOC/FP
    Assembly    320
    C           128
    COBOL       106
    FORTRAN     106
    Pascal       90
    C++          64
    Ada          53
    VB           32
    SQL          12
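Given the LOC/FP table above, converting a function point count into a rough size estimate is a simple lookup; the 100-FP figure below is hypothetical.

```python
# Rough size estimate from function points using the LOC/FP table above.
LOC_PER_FP = {"Assembly": 320, "C": 128, "COBOL": 106, "FORTRAN": 106,
              "Pascal": 90, "C++": 64, "Ada": 53, "VB": 32, "SQL": 12}

def estimated_loc(fp, language):
    """Estimated lines of code to implement fp function points in a language."""
    return fp * LOC_PER_FP[language]

estimated_loc(100, "C")    # 12800 LOC
estimated_loc(100, "C++")  # 6400 LOC
```

The same system is predicted to need five times less code in SQL than in C, which is why FP is preferred over LOC for cross-language comparisons.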

Using FP

- FP- and LOC-based metrics have been found to be relatively accurate predictors of effort and cost
- A baseline of historical information is needed to use them properly
- Language-dependent
- Productivity factors: people, problem, process, product, and resources
- FP cannot easily be reverse-engineered from existing systems

Complexity Metrics

- LOC: a function of complexity, but language- and programmer-dependent

Halstead's Software Science (entropy measures)

- n1: number of distinct operators
- n2: number of distinct operands
- N1: total number of operators
- N2: total number of operands

Example:

    if (k < 2) {
        if (k > 3)
            x = x * k;
    }

- Distinct operators: if ( ) { } > < = * ;
- Distinct operands: k 2 3 x
- n1 = 10, n2 = 4, N1 = 13, N2 = 7

Halstead's Metrics (amenable to experimental verification, 1970s)

- Length: N = N1 + N2
- Vocabulary: n = n1 + n2
- Estimated length: N^ = n1 log2(n1) + n2 log2(n2), a close estimate of length for well-structured programs
- Purity ratio: PR = N^ / N
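Plugging the counts from the example snippet into these formulas gives concrete values; this sketch also computes the volume V = N log2(n) defined in the next section.

```python
import math

# Halstead's measures for the example snippet:
#   if (k < 2) { if (k > 3) x = x*k; }
# Operator/operand counts are the ones given above.
n1, n2 = 10, 4   # distinct operators, distinct operands
N1, N2 = 13, 7   # total operators, total operands

length = N1 + N2                                      # N  = 20
vocabulary = n1 + n2                                  # n  = 14
est_length = n1 * math.log2(n1) + n2 * math.log2(n2)  # N^ ~ 41.22
purity_ratio = est_length / length                    # PR ~ 2.06
volume = length * math.log2(vocabulary)               # V  ~ 76.15 bits
```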

Program Complexity

- Volume: V = N log2(n), the number of bits needed to provide a unique designator for each of the n items in the program vocabulary
- Program effort: E = V/L, where L = V*/V and V* is the volume of the most compact implementation of the design
- Effort is a good measure of program understandability

McCabe's Complexity Measures

- McCabe's metrics are based on a control-flow representation of the program; a program graph is used to depict control flow
- Nodes represent processing tasks (one or more code statements)
- Edges represent control flow between nodes
- Flow graph notation covers sequence, if-then-else, while, and until constructs

Cyclomatic Complexity

- The size of the set of independent paths through the graph (the basis set)
- V(G) = E - N + 2, where E is the number of flow-graph edges and N is the number of nodes
- Equivalently, V(G) = P + 1, where P is the number of predicate nodes

Example Flow Graph

    i = 0;
    while (i < n-1) do
        j = i + 1;
        while (j < n) do
            if A[i] < A[j] then
                swap(A[i], A[j]);
        end do;
        i = i + 1;
    end do;

(The flow graph for this code has nodes numbered 1-7; figure omitted.)

Computing V(G)

- V(G) = 9 - 7 + 2 = 4 (edges minus nodes plus 2)
- V(G) = 3 + 1 = 4 (predicate nodes plus 1)

Basis set:
- 1, 7
- 1, 2, 6, 1, 7
- 1, 2, 3, 4, 5, 2, 6, 1, 7
- 1, 2, 3, 5, 2, 6, 1, 7

Another Example: given a flow graph with nodes numbered 1-9 (figure omitted), what is V(G)?
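With the graph in hand, V(G) = E - N + 2 is a one-line computation. The edge list below is one reading of the 7-node flow graph for the nested-loop example, reconstructed from the basis paths; the exact node numbering is an assumption.

```python
# Cyclomatic complexity from an edge list: V(G) = E - N + 2.

def cyclomatic_complexity(edges):
    nodes = {n for edge in edges for n in edge}
    return len(edges) - len(nodes) + 2

# Assumed edges of the 7-node flow graph for the nested-loop example,
# consistent with the basis paths listed above:
edges = [(1, 2), (1, 7), (2, 3), (2, 6), (3, 4), (3, 5),
         (4, 5), (5, 2), (6, 1)]
cyclomatic_complexity(edges)  # 9 edges, 7 nodes -> V(G) = 4
```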

Meaning

- V(G) is the number of enclosed regions/areas of the planar flow graph
- The number of regions increases with the number of decision paths and loops
- It is a quantitative measure of testing difficulty and an indication of ultimate reliability
- Experimental data show that V(G) should be no more than 10; testing is very difficult above this value

McClure's Complexity Metric

- Complexity = C + V, where C is the number of comparisons in a module and V is the number of control variables referenced in the module
- Similar to McCabe's, but with regard to control variables

Metrics and Software Quality: FURPS

- Functionality: features of the system
- Usability: aesthetics, documentation
- Reliability: frequency of failure, security
- Performance: speed, throughput
- Supportability: maintainability

Measures of Software Quality

- Correctness: defects/KLOC (a defect is a verified lack of conformance to requirements); failures per hours of operation
- Maintainability: mean time to change (change request to new version: analyze, design, etc.); cost to correct
- Integrity: fault tolerance, security and threats
- Usability: training time, skill level necessary to use, increase in productivity, subjective questionnaire or controlled experiment

Quality Model

- Product operation: reliability, efficiency, usability
- Product revision: maintainability, testability
- Product transition: portability, reusability

High-Level Design Metrics [Card & Glass 90]

- Structural complexity of a module i: S(i) = f_out(i)^2, where fan-out is the number of modules immediately subordinate to i (directly invoked by i)
- Data complexity: D(i) = v(i) / [f_out(i) + 1], where v(i) is the number of inputs and outputs passed to and from i
- System complexity: C(i) = S(i) + D(i)
- As each increases, the overall complexity of the architecture increases

System Complexity Metric

Another metric:
- complexity(i) = length(i) * [f_in(i) + f_out(i)]^2, where length is LOC and fan-in is the number of modules that invoke i

Graph-based measures:
- Nodes + edges
- Modules + lines of control
- Depth of tree, arc-to-node ratio
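The Card & Glass definitions above translate directly into code; the fan-out and I/O counts in the example call are made-up illustration values.

```python
# Card & Glass structural, data, and system complexity, as defined above.

def structural_complexity(fan_out):
    return fan_out ** 2                    # S(i) = f_out(i)^2

def data_complexity(num_io, fan_out):
    return num_io / (fan_out + 1)          # D(i) = v(i) / (f_out(i) + 1)

def system_complexity(num_io, fan_out):
    return structural_complexity(fan_out) + data_complexity(num_io, fan_out)

# Hypothetical module: 3 directly invoked modules, 6 inputs/outputs.
system_complexity(num_io=6, fan_out=3)  # S = 9, D = 1.5, C = 10.5
```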

Coupling

Data and control flow:
- d_i: input data parameters
- c_i: input control parameters
- d_o: output data parameters
- c_o: output control parameters

Global:
- g_d: global variables used for data
- g_c: global variables used for control

Environmental:
- w: the number of modules called
- r: the number of modules that call this module

Metric for Coupling

- m = d_i + a*c_i + d_o + b*c_o + g_d + c*g_c + w + r
- Module coupling: M_c = k/m, with k = 1
- a, b, c, and k can be adjusted based on actual data

Component-Level Metrics

- Cohesion (internal interaction): difficult to measure; see Bieman 94, TSE 20(8), which uses data slices from program slices
- Coupling (external interaction)
- Complexity of program flow

Using Metrics: The Process

- Select appropriate metrics for the problem
- Utilize the metrics on the problem
- Assess and give feedback

The cycle: formulate, collect, analyze, interpret, feedback.
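A minimal sketch of this coupling metric, with the weights a, b, c and the constant k defaulting to 1 as in the notes; the parameter counts in the example call are hypothetical.

```python
# Module coupling M_c = k / m, with
# m = d_i + a*c_i + d_o + b*c_o + g_d + c*g_c + w + r.

def module_coupling(di, ci, do, co, gd, gc, w, r, a=1, b=1, c=1, k=1):
    m = di + a * ci + do + b * co + gd + c * gc + w + r
    return k / m  # larger M_c means the module is less coupled

# Hypothetical module: 2 data inputs, 1 control input, 1 data output,
# no globals, calls 1 module, called by 2 modules -> m = 7.
module_coupling(di=2, ci=1, do=1, co=0, gd=0, gc=0, w=1, r=2)  # 1/7
```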

Metrics for the Object-Oriented

- Chidamber & Kemerer 94, TSE 20(6): metrics specifically designed to address object-oriented software
- Class-oriented metrics; direct measures

Weighted Methods per Class

- WMC = sum of c_i for i = 1 to n, where c_i is the complexity (e.g., volume, cyclomatic complexity) of each method
- Must normalize
- What about inherited methods? Be consistent.

Depth of Inheritance Tree

- DIT is the maximum length from a node to the root (base class)
- Lower-level subclasses inherit a number of methods, making behavior harder to predict
- However, more methods are reused in deeper DIT trees

Number of Children

- NOC is the number of subclasses immediately subordinate to a class
- As NOC grows, reuse increases, but the abstraction may be diluted
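DIT and NOC can be read straight off a class hierarchy. The sketch below uses a hypothetical single-inheritance hierarchy of classes A through F, stored as a parent-to-children map.

```python
# DIT and NOC over a hypothetical class hierarchy (single inheritance):
#   A is the root; B and C extend A; D and E extend B; F extends D.
HIERARCHY = {"A": ["B", "C"], "B": ["D", "E"], "C": [], "D": ["F"], "E": [], "F": []}
PARENT = {child: parent for parent, kids in HIERARCHY.items() for child in kids}

def dit(cls):
    """Depth of Inheritance Tree: path length from cls up to the root."""
    depth = 0
    while cls in PARENT:
        cls = PARENT[cls]
        depth += 1
    return depth

def noc(cls):
    """Number of Children: immediate subclasses of cls."""
    return len(HIERARCHY[cls])

dit("F")  # 3 (F -> D -> B -> A)
noc("A")  # 2
```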

Coupling Between Object Classes

- CBO is the number of collaborations between two classes
- As collaboration increases, reuse decreases
- A CRC card (Classes, Responsibilities, Collaborations) lists the number of collaborations

Response for a Class

- RFC is the number of methods that could be called in response to a message to a class
- Testing effort increases as RFC increases

Lack of Cohesion in Methods

LCOM is poorly described in Pressman; more precisely:
- Consider a class C with n methods M_1, ..., M_n
- Let I_j be the set of instance variables used by M_j; there are n such sets I_1, ..., I_n
- P = {(I_i, I_j) | I_i ∩ I_j = ∅}
- Q = {(I_i, I_j) | I_i ∩ I_j ≠ ∅}
- If all n sets I_i are ∅, then let P = ∅
- LCOM = |P| - |Q| if |P| > |Q|; LCOM = 0 otherwise

Example LCOM

- Take a class C with methods M_1, M_2, M_3
- I_1 = {a, b, c, d, e}, I_2 = {a, b, e}, I_3 = {x, y, z}
- P = {(I_1, I_3), (I_2, I_3)}; Q = {(I_1, I_2)}
- Thus LCOM = 2 - 1 = 1

Explanation

- LCOM is the number of empty intersections minus the number of non-empty intersections
- This is a notion of the degree of similarity of methods: if two methods use common instance variables, they are similar
- An LCOM of zero (P = Q or P < Q) is not necessarily maximally cohesive

Class Size

- CS: the total number of operations (inherited, private, public) and the number of attributes (inherited, private, public)

Number of Operations Overridden

- A large NOO indicates possible problems with the design: poor abstraction in the inheritance hierarchy
- May be an indication of too much responsibility for a class
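The LCOM definition above is short enough to implement directly and check against the worked example:

```python
from itertools import combinations

# LCOM = |P| - |Q| if |P| > |Q|, else 0, where P counts disjoint pairs of
# instance-variable sets and Q counts pairs that share a variable.
def lcom(instance_var_sets):
    p = q = 0
    for a, b in combinations(instance_var_sets, 2):
        if a & b:
            q += 1   # the two methods share at least one instance variable
        else:
            p += 1   # disjoint: no shared instance variables
    return p - q if p > q else 0

# The worked example: I1 = {a,b,c,d,e}, I2 = {a,b,e}, I3 = {x,y,z}.
sets = [{"a", "b", "c", "d", "e"}, {"a", "b", "e"}, {"x", "y", "z"}]
lcom(sets)  # P = 2, Q = 1 -> LCOM = 1
```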

Number of Operations Added

- NOA is the number of operations added by a subclass
- As operations are added, the subclass moves farther away from its superclass
- As depth increases, NOA should decrease

Specialization Index

- SI = (NOO * L) / M_total, where L is the level in the class hierarchy and M_total is the total number of methods
- Higher values indicate a class in the hierarchy that does not conform to the abstraction

Method Inheritance Factor

- MIF = [sum over i of M_i(C_i)] / [sum over i of M_a(C_i)]
- M_i(C_i) is the number of methods inherited and not overridden in C_i
- M_a(C_i) is the number of methods that can be invoked with C_i
- M_d(C_i) is the number of methods declared in C_i
- M_a(C_i) = M_d(C_i) + M_i(C_i): all that can be invoked = new or overriding + inherited
- MIF is in [0, 1]; MIF near 1 means little specialization, MIF near 0 means large change
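A minimal MIF sketch for a hypothetical two-class system: a base class declaring 8 methods, and a derived class declaring 3 new or overriding methods while inheriting 6 of the base's methods unchanged.

```python
# MIF = sum of inherited-and-not-overridden methods over sum of invocable
# methods, where invocable = declared + inherited-not-overridden.
classes = [
    {"inherited_not_overridden": 0, "declared": 8},  # hypothetical Base
    {"inherited_not_overridden": 6, "declared": 3},  # hypothetical Derived
]

def mif(classes):
    inherited = sum(c["inherited_not_overridden"] for c in classes)
    invocable = sum(c["inherited_not_overridden"] + c["declared"] for c in classes)
    return inherited / invocable  # always in [0, 1]

mif(classes)  # 6 / 17, i.e. roughly 0.35 -- substantial specialization
```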

Coupling Factor

- CF = [sum over i != j of is_client(C_i, C_j)] / (TC^2 - TC)
- is_client(x, y) = 1 iff a relationship exists between client class x and server class y; 0 otherwise
- TC^2 - TC is the total number of relationships possible (total classes squared, minus the diagonal)
- CF is in [0, 1], with 1 meaning high coupling

Polymorphism Factor

- PF = [sum over i of M_o(C_i)] / [sum over i of M_n(C_i) * DC(C_i)]
- M_n(C_i) is the number of new methods, M_o(C_i) the number of overriding methods, and DC(C_i) the number of descendant classes of base class C_i
- PF is the number of methods that redefine inherited methods, divided by the maximum number of possible distinct polymorphic situations

Operation-Oriented Metrics

- Average operation size (LOC, volume)
- Number of messages sent by an operation
- Operation complexity (e.g., cyclomatic)
- Average number of parameters per operation: the larger the number, the more complex the collaboration

Encapsulation

- Lack of cohesion
- Percent public and protected
- Public access to data members
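The coupling factor reduces to counting actual client/server pairs against the TC^2 - TC possible ones; the four classes and three relationships below are hypothetical.

```python
# Coupling Factor: actual client->server relationships over the
# TC^2 - TC possible ordered pairs of distinct classes.

def coupling_factor(total_classes, client_pairs):
    possible = total_classes ** 2 - total_classes
    return len(client_pairs) / possible  # in [0, 1]; 1 means high coupling

# Hypothetical system: 4 classes, 3 relationships out of 12 possible.
coupling_factor(4, [("A", "B"), ("A", "C"), ("B", "D")])  # 0.25
```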

Inheritance

- Number of root classes
- Fan-in (multiple inheritance)
- NOC, DIT, etc.