FUNCTION POINT ANALYSIS: Sizing The Software Deliverable. BEYOND FUNCTION POINTS: So you've got the count. Now what?




FUNCTION POINT ANALYSIS: Sizing The Software Deliverable. BEYOND FUNCTION POINTS: So you've got the count. Now what? 2008

Course Objectives
The primary webinar objectives are to:
- Review function point methodology
- Demonstrate best practices in strategic planning, metrics, and management using Function Point Analysis
Copyright 2006. David Consulting Group

Course Standard
Basis for this course: The David Consulting Group's course Function Point Analysis: Sizing The Software Deliverable contains material extracted from the International Function Point Users Group (IFPUG) Counting Practices Manual (CPM), Version 4.2. The CPM is reproduced in this document with the permission of IFPUG.

Course Agenda
- Function Point/IFPUG History
- Counting: A High-Level View
- Using Function Points

Module 1: Function Point Overview
- Function Point/IFPUG History
- Counting: A High-Level View
- Using Function Points

IFPUG History
The International Function Point Users Group (IFPUG) was chartered in 1986.
IFPUG purpose:
- To promote and encourage the use of Function Points
- To develop consistent and accurate counting guidelines
- To maintain the public standard for sizing software (the CPM)
Benefits:
- Networking with other counters
- Management reporting
- Research projects
- Web site, Metric Views, Voice
- Certifications for Function Point methodology and measurement

IFPUG History: Function Point History
- Function Points were introduced in 1979 by A. Albrecht of IBM at a joint Share/Guide conference
- Now used worldwide: North America, Central America, South America, Australia, Europe, and Asia
- Recognized functional size measure for ISO standards
- IFPUG white papers published on a variety of measurement-related topics
- IFPUG CPM Version 4.2 issued in 2005; subsequent releases now at 4.2.2, with a new version 4.2.3 due in January 2009
- IFPUG Certification in Software Measurement introduced in 2004
- ISMA fall conferences on Function Point methodology and measurement topics

Module 1: Function Point Overview
- Function Point/IFPUG History
- Sizing and Counting: A High-Level View
- Using Function Points

Why Sizing Is Important
- Requirements: expectation management
- Estimation: effective management practice, resource management
- Process improvement: process management
- Change control: project management

Characteristics of an Effective Sizing Metric
- Meaningful to both developer and business user
- Defined (industry recognized)
- Consistent (methodology)
- Easy to learn and apply
- Accurate, statistically based
- Available when needed (early)

Why Function Point Analysis?
Function Point Analysis is a standardized method for measuring the functionality delivered to an end user.
- Consistent method
- Easy to learn
- Available early in the lifecycle
- Acceptable level of accuracy
- Meaningful internally and externally
Function Point counts have replaced lines-of-code counts as a sizing metric that can be applied consistently and with a high degree of accuracy.

What Are Function Points? IFPUG Definition
A standard method for measuring software development from the user's point of view: a standardized, structured method of measuring the size of an application or project based upon a systematic classification of user requirements (processes) into system components (functions). The unit used to measure software functions is the function point, a unit of measure for software size based upon the functionality of the system.

What Is the User View? Key Definitions
- A user can be a business analyst, an administrator, an end user, or another system.
- The user view represents a formal description of the user's business needs in the user's language; developers translate the user information into information technology language in order to provide a solution.
- A function point count is accomplished using information in a language common to both users and developers.
- The user view is a description of the business function; it is approved by the user, can be used to count function points, and can vary in physical form (documentation formats).

What Do We Count? FPs Take the Logical View
Physical view:
- Lines of code, or programs/modules
- Physical databases and/or files
- Physical transactions (screens)
Logical functionality:
- Logically grouped stores of user data
- Elementary processes which leave the business in a consistent state

What DO We Count? The Physical View
Crossing the application boundary:
- Input files and input transactions (batch interfaces)
- Screens (adds, changes, deletes, queries)
- Control information
- Reports, files, XML, views
- Output files and output transactions (batch interfaces)
- Other outputs: XML, fiche, tape, diskettes, letters, notices, alarms
Inside the boundary:
- Internal files (tables, data files, control files)
Outside the boundary:
- Tables and files referenced (not maintained)

What Do We Count? The Logical View
Within the application boundary: Internal Logical Files. Crossing the boundary, to and from users and other applications: External Inputs, External Outputs, and External Inquiries. Referenced across the boundary from other applications: External Interface Files.

How Do We Count? FP Evaluates These Components Logically
Data functions:
- Internal groupings of data, called Internal Logical Files (ILF)
- External groupings of data, or External Interface Files (EIF)
- The term "file" does not refer to files in the physical sense; rather, it refers to logical data stores, entities, objects, or superclasses
Transaction functions:
- External Inputs (EI)
- External Outputs (EO)
- External Inquiries (EQ)
Plus General System Characteristics

How Do We Count? Component Complexity
FP counting is based on identification rules and complexity rules (CPM). Components are assessed for complexity according to the rules.

Component                       Low   Avg.   High
Internal Logical File (ILF)     x 7   x 10   x 15
External Interface File (EIF)   x 5   x 7    x 10
External Input (EI)             x 3   x 4    x 6
External Output (EO)            x 4   x 5    x 7
External Inquiry (EQ)           x 3   x 4    x 6

HOW Do We Count?
1. Identify the processes that are identifiable to the user: files, reports, inputs, outputs, transactions (system functions).
2. Classify system components into function point categories (function point entities). {A report could be an External Output.}
3. Assign each entity a complexity rating (low, average, high) and a weight (in function points). {High = 7 fps}
4. Add all the weights together and apply a system complexity factor to get the final count.
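The weighting in steps 2-4 can be sketched in a few lines. This is a minimal sketch of the unadjusted count only (the system complexity factor of step 4 is omitted), and the example component counts are hypothetical, not from the course:

```python
# IFPUG complexity weights (low / average / high) per component type,
# as listed in the complexity table above.
WEIGHTS = {
    "ILF": {"low": 7, "avg": 10, "high": 15},
    "EIF": {"low": 5, "avg": 7, "high": 10},
    "EI":  {"low": 3, "avg": 4, "high": 6},
    "EO":  {"low": 4, "avg": 5, "high": 7},
    "EQ":  {"low": 3, "avg": 4, "high": 6},
}

def unadjusted_fp(counts):
    """Sum weight * number-of-functions over every component/complexity pair.

    counts: e.g. {"ILF": {"low": 2}, "EI": {"avg": 3}}
    """
    return sum(
        WEIGHTS[component][complexity] * n
        for component, by_complexity in counts.items()
        for complexity, n in by_complexity.items()
    )

# Hypothetical system: 2 low ILFs, 3 average EIs, 1 high EO
example = {"ILF": {"low": 2}, "EI": {"avg": 3}, "EO": {"high": 1}}
print(unadjusted_fp(example))  # 2*7 + 3*4 + 1*7 = 33
```

The result would then be multiplied by the value adjustment factor derived from the General System Characteristics to reach the final count.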

What Can We Count?
Any system that has inputs or outputs, and has and/or uses data or control information:
- Inputs
- Outputs
- Queries
- Internal data stores
- External data sources

What Can We Count?
Any system that has inputs or outputs, and has and/or uses data or control information:
- Installed application: baseline (or application) count
- Development project: new system or subsystem
- Enhancement project: additions, changes, or deletions to a present system
- Maintenance or support projects: minor enhancements
Technical changes are not countable, nor are some types of maintenance.

What Can We Count? Enhancement Versus Maintenance
Adaptive maintenance: software maintenance performed to make a computer program usable in a changed environment.
- Includes modifications to meet new or changing business or technical requirements
- Initiated by business requests
- Enhancement requests are a subset of this group
Corrective maintenance: software maintenance performed to correct faults in hardware or software.
- Includes defect repair
- Ensures that previously delivered functionality performs as required
- Effort should be attributed to the original project that introduced the defect
Perfective maintenance: software maintenance performed to improve the performance, maintainability, or other attributes of a computer program.
- Includes system upgrades, performance optimization, and platform service agreement maintenance
- No business functionality

What Can We Count? Enhancement Versus Maintenance
Function Point Analysis is applicable to a subset of adaptive maintenance. It should not be used to size perfective or corrective maintenance work.
- Mixing enhancement work with maintenance work can be expedient or necessary, even efficient, but the hours must be tracked separately
- For accurate metrics, separation of work effort is necessary
- Preventive and/or corrective maintenance have zero function points
- Packaging work into releases and apportioning the releases is usually the most manageable approach

Using Function Points: WHEN Can We Count?
Function points can be applied across the lifecycle. An early estimate is approximated from the feasibility study, initial user requirements, and initial technical requirements; a revised estimate is counted from the final functional requirements; and the actual size is counted at delivery.

Life Cycle Phase         Size can be approximated?   Size can be measured?
Proposal                 Yes                         No
Requirements             Yes                         Yes
Design                   Yes                         Yes
Construction             Yes                         Yes
Delivery                 Yes                         Yes
Corrective Maintenance   Yes                         Yes

Why Do We Count? Benefits of FPA
- A tool to determine the benefit of an application to an organization by counting the functions that specifically match requirements; the purpose is to measure the functionality that the user requests and receives
- A measure of software development and maintenance activities independent of the technology used for implementation; a function point count is independent of platform, language, or technology
- A consistent, normalized measure across projects, organizations, and technology platforms; a normalization factor for software comparison
- A tool to measure units of software product in support of quality and productivity analysis
- A vehicle to estimate the cost and resources required for software development and maintenance (project attributes, work breakdown structure, etc. must also be considered)
- A tool to size a purchased application package

WHY Do We Count? An Example
To determine delivery rates for better estimation and process capability, in support of internal or outsourcing activities:
- Definition: the requirement's project size and complexity yield the function point size
- Capability: the rate of delivery, expressed as function points per effort month
- Estimate: effort, schedule, and costs

Module 1: Function Point Overview
- Function Point/IFPUG History
- Counting: A High-Level View
- Using Function Points

Using Function Points: Primary Usage
- Portfolio management: applications asset management, baseline counts, production environment
- Projects: estimation modeling, release management
- Performance metrics to support industry best practices

Using Function Points to Support Organizational Goals
Productivity rate (delivery rate) = staff hours / fp:
- 100 fp, 100 sh = 1 sh/fp
- 250 fp, 1000 sh = 4 sh/fp
- 10 fp, 2 sh = .2 sh/fp
- 500 fp, 25 sh = .05 sh/fp
- 860 fp, 1127 sh = 1.31 sh/fp
Cost per function point = costs / fps = $67,620 / 860 = $78.62/fp
Defects per function point = defects / fps = 4 / 860 = 1 defect per 215 fps
Support rate = portfolio size / FTEs; cost per fp supported = support $ / fps
Estimated project cost: project/SCR = 1000 fps; estimated staff hours = project fps x productivity rate = 1000 x 1.31 = 1310 sh; 1310 x rate/hr = 1310 x $60 = $78,600
Portfolio: 300,000 fps / 300 FTEs = 1000 fps/FTE; $12,000,000 / 300,000 fps = $40/fp
New system estimate = 20,000 fps; support staff estimate = 20,000 / 1000 = 20; support costs = 20,000 x $40 = $800,000
These project and application counts roll up to performance measures, organizational goal-based metrics, and program/executive-level strategic goals.
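The slide's arithmetic can be reproduced directly. This is a quick sketch; every figure below is the slide's illustrative example data, not a real benchmark:

```python
# Baseline measures from the example: 860 fp delivered in 1127 staff hours
rate = 1127 / 860                    # productivity: ~1.31 staff hours per fp
cost_per_fp = 67620 / 860            # ~ $78.62 per fp

# Estimating a new 1000-fp project at the measured delivery rate
est_hours = round(1000 * rate)       # 1310 staff hours
est_cost = est_hours * 60            # $78,600 at $60/hour

# Portfolio support metrics
fps_per_fte = 300_000 / 300                  # 1000 fp supported per FTE
support_cost_per_fp = 12_000_000 / 300_000   # $40 per fp
new_system_fps = 20_000
support_staff = new_system_fps / fps_per_fte           # 20 FTEs
support_costs = new_system_fps * support_cost_per_fp   # $800,000
```

Measured delivery rates like these are what turn a function point count into effort, cost, and staffing estimates.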

Common Metrics Formulas
- Productivity Rate = Effort in Hours / Size in Function Points
- Rework Percentage = Rework Hours / Total Project Hours
- Defect Removal Efficiency = Total Defects Found Prior to Delivery / Total Defects Discovered (before and after delivery)
- Production Defect Rate = Total Production Defects / Function Points Installed
- Warranty Defect Rate = Defects During Warranty Period / Function Points Delivered
- Scope Creep = (Added + Deleted + Changed Function Points) / Original Function Point Count
- Cost Efficiency = Actual Cost / Function Points Installed
- Project Duration = Project End Date - Project Start Date
- Estimated Project Duration = Estimated Project End Date - Estimated Project Start Date
- Development Cost = Effort x Hourly Rate (or Effort Costs) / Function Points Delivered
- Cost Variance = (Actual Cost - Estimated Cost) / Estimated Cost
- Effort Variance = (Actual Effort - Estimated Effort) / Estimated Effort (in hours)
- Duration Variance = (Actual Duration - Estimated Duration) / Estimated Duration (in months)
- Production Support Rate = Full-Time Equivalents (FTEs) / Application Function Points Supported
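A few of these formulas expressed as functions, with made-up input values; this is a sketch, and the function names are illustrative rather than IFPUG terminology:

```python
def defect_removal_efficiency(found_before_delivery, found_after_delivery):
    """Fraction of all discovered defects that were caught before delivery."""
    return found_before_delivery / (found_before_delivery + found_after_delivery)

def scope_creep(added_fp, deleted_fp, changed_fp, original_fp):
    """Requirements churn relative to the original function point count."""
    return (added_fp + deleted_fp + changed_fp) / original_fp

def variance(actual, estimated):
    """Generic variance formula: applies to cost, effort, or duration."""
    return (actual - estimated) / estimated

print(defect_removal_efficiency(95, 5))  # 0.95
print(scope_creep(50, 10, 40, 1000))     # 0.1
print(variance(1310, 1000))              # 0.31
```

Note that scope creep counts deletions as churn too: removing requested functionality still consumes effort and destabilizes the plan.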

Common Metrics Formulas: Scorecards
Development scorecard:
- Productivity Rate = Effort in Hours / Size in Function Points
- Scope Creep = (Added + Deleted + Changed Function Points) / Original Function Point Count
- Warranty Defect Rate = Defects During Warranty Period / Function Points Delivered
- Delivery Costs = Project Costs (Effort) / Function Points Delivered
Application support scorecard:
- Production Support Rate = Full-Time Equivalents (FTEs) / Function Points (Installed) Supported
- Production Defect Rate = Total Production Defects / Function Points Installed
- Cost Efficiency = Actual Cost / Function Points Installed
- Application Portfolio Ratio = Total Application Counts (Installed) Before / Total Application Counts (Installed) After

Best Practices in Benchmarking and Metrics
Organizations use application baselines for maintenance service levels and establish performance baseline levels for enhancement and new development activities. Focus on performance levels for currently active, business-critical applications for baselining, with expected performance levels in accordance with industry standards.
- Create an application portfolio profile categorizing applications based on platform, application type, volatility, etc.
- Apply industry-accepted measurement tools and techniques to size the defined application portfolio.
- Identify a core set of measures to baseline current performance levels.
- Measure performance levels for enhancement and new development activities.
- Compare supplier performance levels to internal and external benchmarks.

Effective Use of Baseline Data
Using the data from the portfolio and performance baselines, a comprehensive set of core measures can be established that gives management an increased ability to manage resources effectively, plan budgets, and make more informed decisions.
Portfolio management:
- Bring maintenance costs, effort, and defects in line with industry-average benchmarks, driving toward industry top-quartile performance
- Identify costly problem areas within the application portfolio
Project performance levels:
- Drive cost reductions and increase performance
- Deliver projects on time and within budget
Business value management:
- Monitor contribution to the business

Organizational Baseline and Performance Examples
Each cell shows a low-high range.

PRODUCTIVITY (FP/EM)   Client          Industry Average   Industry Best Practices
Overall                3 - 12          6 - 18             42 - 98
New                    6 - 12          6 - 14             42 - 77
Enhancement            3 - 10          10 - 18            56 - 98
Mainframe              3 - 8           6 - 12             42 - 86
Client Server          6 - 12          9 - 18             51 - 98

COST ($/FP)            Client          Industry Average   Industry Best Practices
Overall                $535 - $2345    $629 - $1692       $158 - $473
New                    $712 - $2345    $823 - $1692       $305 - $473
Enhancement            $535 - $1660    $629 - $1300       $158 - $289
Mainframe              $650 - $2345    $930 - $1692       $216 - $473
Client Server          $535 - $1245    $629 - $1154       $158 - $420

DURATION (Months)      Client          Industry Average   Industry Best Practices
Overall                5 - 12          8 - 17             3.0 - 7.8
New                    5 - 10          8 - 14             4.0 - 7.8
Enhancement            7 - 12          10 - 17            3.0 - 6.2
Mainframe              9 - 12          8 - 17             3.8 - 7.8
Client Server          5 - 10          9 - 14             3.0 - 7.5

QUALITY (Defects/FP)   Client          Industry Average   Industry Best Practices
Overall                .0478 - .7060   .0333 - .0556      .0000 - .0175
New                    .0478 - .6664   .0333 - .0556      .0095 - .0175
Enhancement            .0873 - .7060   .0400 - .0556      .0000 - .0098
Mainframe              .2568 - .7060   .0357 - .0556      .0095 - .0175
Client Server          .0478 - .5566   .0333 - .0526      .0000 - .0098

Note: Examples, not actual data.

Core Measures: Portfolio Management
Focus: recognizing and managing the cost of corrective maintenance (repairing defects); reducing maintenance costs.
Portfolio management measures:
- Assignment scope: number of resources required to support application functionality
- Growth rate: increase in the number of function points per application
- Stability ratio: number of changes per application function points
Application maintenance support measures:
- Maintainability: maintenance cost per function point
- Reliability: number of production failures per total application function points

Core Measures: Performance Levels
Focus: enhancement counts by project/release to monitor and manage performance (effort, cost, and quality).
Application enhancement measures:
- Delivery rate (FP/SM)
- Time to market
- Cost ratio ($/FP)
- Defect density (defects/FP)
Project management measures:
- Delivery rate (hours per function point): by platform, by type of development, by team, by application
- Speed of delivery (function points per elapsed month): by platform, by type of development, by team, by application

Using Function Points: Any questions?