The Latest Industry Data for Application Development and Maintenance




February 9, 2005
Software Quality Group of New England
www.davidconsultinggroup.com

Presentation Topics
- Qualitative and Quantitative Measurement
- Baseline Your Organization's Performance
- Implement Best Software Practices

2005 The David Consulting Group, Inc.

Qualitative and Quantitative Measurement
- Quantitative measurement (measures how you are doing): deliverable size, effort/cost, duration, quality. Yields measured performance and a baseline of performance.
- Qualitative measurement (identifies what you are doing): process, methods, skills, tools, environment. Yields capability maturity and a standard of performance.

Base Metrics on the Goals of the Process Being Measured
- Business-related measures: unit delivery cost, time to market, customer satisfaction
- Process-related measures: effectiveness, integration, compliance
- Project-related measures: project tracking, estimating, change management
- Contribution: measure the impact of IT on the business; identify trends and monitor progress; use measures in a proactive format

Utilize Measurement Results in Decision Making
- Improvements resulting from current and future initiatives must be measured.
- The basis for measuring improvements may include industry data and organizational baseline data.
- To improve its development practices, the organization must put a stake in the ground relative to its current performance level.

Collect & Report
- Identify a sample set (typically project oriented)
- Collect baseline data:
  - Project measures (e.g., effort, size, cost, duration, defects)
  - Project attributes (e.g., skill levels, tools, process)
- Analyze the data:
  - Performance comparisons (identification of process strengths and weaknesses)
  - Industry averages and best practices
  - Performance modeling (identify high-impact areas)
- Report results

Quantitative & Qualitative Assessments
- Research measures: software size, level of effort, time to market, delivered defects, cost
- Research characteristics: skill levels, automation, process management, user involvement, environment
- Analysis: performance levels and profiles
- Results: correlate performance levels to characteristics; substantiate the impact of characteristics; identify best practices

Analyze Project Attributes
- Management: team dynamics, high morale, project tracking, project planning, automation, management skills
- Definition: clearly stated requirements, formal process, customer involvement, experience levels, business impact
- Design: formal process, rigorous reviews, design reuse, customer involvement, experienced development staff, automation
- Build: code reviews, source code tracking, code reuse, data administration, computer availability, experienced staff, automation
- Test: formal testing methods, test plans, development staff experience, effective test tools, customer involvement
- Environment: new technology, automated process, adequate training, organizational dynamics, certification

Strengths & Opportunities (An Example)
Definition strengths:
- Requirements are clearly stated and stable
- Developers and customers are experienced in the applications
Opportunities for improvement:
- A more formal requirements-gathering process on larger projects
- More consistent use of prototyping on larger projects
- A formal review process

Establish a Baseline
- Size is expressed in terms of functionality delivered to the user.
- Rate of delivery is a measure of productivity.
- A representative selection of projects is measured to form the organizational baseline.
[Chart: software size (function points) vs. rate of delivery (function points per person month) for the baseline projects]

Compare to Industry Benchmarks
[Chart: the same baseline plot overlaid with industry baseline performance]

Function Points per Person Month
Average of recent projects across different platforms:

  Platform         FP per Person Month
  Client Server    17
  Mainframe        13
  Web              25
  e-business Web   15
  Vendor Packages  18
  Data Warehouse   9

Function Points Supported by One FTE
Average support provided for corrective maintenance by one FTE:

  Platform         FP per FTE
  Client Server    642
  Mainframe        978
  Web              756
  e-business Web   438
  Vendor Packages  740
  Data Warehouse   392
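Support rates like these translate directly into staffing estimates: divide each platform's portfolio size (in function points) by the FP one FTE can support. A minimal sketch, assuming a hypothetical portfolio (the portfolio sizes are made up for illustration; the per-FTE rates are those quoted above):

```python
import math

# FP supported by one FTE for corrective maintenance (rates quoted above)
fp_per_fte = {
    "Client Server": 642,
    "Mainframe": 978,
    "Web": 756,
    "e-business Web": 438,
    "Vendor Packages": 740,
    "Data Warehouse": 392,
}

# Hypothetical portfolio sizes in function points (illustrative only)
portfolio = {"Mainframe": 9780, "Web": 3024, "Client Server": 1926}

# FTEs needed per platform = portfolio size / FP supported per FTE, rounded up
for platform, size_fp in portfolio.items():
    ftes = math.ceil(size_fp / fp_per_fte[platform])
    print(f"{platform}: {ftes} FTE(s)")  # e.g. Mainframe: 10 FTE(s)
```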

Reduce the Gaps
- Model the impact of implementing selected process improvements.
- Evaluate the impact on productivity.
- Modeling is performed from several perspectives: management, definition, design, build, test, and environment improvements, plus SEI CMM-specific improvements.

Example baseline from which improvements are measured:
- Average project size: 133 function points (FP)
- Average productivity: 10.7 FP per effort month (EM)
- Average time to market: 7.3 months
- Average cost/FP: $934.58
- Projected delivered defects/FP: 0.0301

About Performance Modeling
- Develop models that use historical data points to analyze the impact of selected process improvements
- Provide a knowledge base for improved decision making
- Identify areas of high impact (e.g., productivity and quality)
- Create an atmosphere of measuring performance
- Provide an opportunity for comparison to industry best practices

DCG Database Reveals Best Practices
- Characteristics: project type, platform, database, method, language, metrics
- Complexity variables: logical algorithms, code structure, mathematical algorithms, performance, data relationships, memory, functional size, security, reuse, warranty
- Attributes: size, cost, effort, duration, defects
- Process: management, definition, design, build, test, environment
- Measures: skill levels, quality practices

Quantitative & Qualitative Performance Measurement
- Collection: collect quantitative data (size, effort, duration, cost, quality) and qualitative data (process, methods, skills, tools, management)
- Analysis: measured performance and capability profiles
- Results: baseline performance
- Action: opportunities for improvement; best practices

Quantitative Performance Evaluation
Quantitative assessment:
- Perform functional sizing on all selected projects.
- Collect data on project level of effort, cost, duration, and quality.
- Calculate productivity rates for each project, including functional size delivered per staff month, cost per functional size, time to market, and defects delivered.

Results (baseline productivity):
- Average project size: 133
- Average FP/SM: 10.7
- Average time to market: 6.9 months
- Average cost/FP: $939
- Delivered defects/FP: 0.0301
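The productivity rates named in the quantitative assessment are simple ratios over the collected project data. A sketch of the per-project calculation (the two project records are illustrative, not the baseline from the slides):

```python
from dataclasses import dataclass

@dataclass
class Project:
    name: str
    size_fp: int          # functional size in function points
    staff_months: float   # total level of effort
    cost: float           # total cost in dollars
    duration_months: float
    delivered_defects: int

# Illustrative records (hypothetical)
projects = [
    Project("Alpha", 150, 14.0, 140_000, 7.0, 5),
    Project("Beta",  100, 10.0,  95_000, 6.5, 3),
]

for p in projects:
    fp_per_sm = p.size_fp / p.staff_months            # productivity (FP/SM)
    cost_per_fp = p.cost / p.size_fp                  # unit cost ($/FP)
    defects_per_fp = p.delivered_defects / p.size_fp  # delivered quality
    print(f"{p.name}: {fp_per_sm:.1f} FP/SM, ${cost_per_fp:.0f}/FP, "
          f"{p.duration_months} months, {defects_per_fp:.4f} defects/FP")
```

Averaging these ratios across the sample set yields the kind of baseline-productivity summary shown above.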

Qualitative Performance Evaluation
Qualitative assessment:
- Conduct interviews with members of each project team.
- Collect project profile information.
- Develop performance profiles to display strengths and weaknesses among the selected projects.

Results (profile scores by attribute category):

  Project Name          Score  Mgmt   Defn   Design  Build  Test   Env
  Accounts Payable      55.3   47.73  82.05  50.00   46.15  43.75  50.00
  Priority One          27.6   50.00  48.72  11.36   38.46   0.00  42.31
  HR Enhancements       32.3   29.55  48.72   0.00   42.31  37.50  42.31
  Client Accounts       29.5   31.82  43.59   0.00   30.77  37.50  42.31
  ABC Release           44.1   31.82  53.85  34.09   38.46  53.13  42.31
  Screen Redesign       17.0   22.73  43.59   0.00   15.38   0.00  30.77
  Customer Web          40.2   45.45  23.08  38.64   53.85  50.00  34.62
  Whole Life            29.2   56.82  28.21  22.73   26.92  18.75  53.85
  Regional - East       22.7   36.36  43.59   0.00   30.77   9.38  30.77
  Regional - West       17.6   43.18  23.08   0.00   26.92   9.38  26.92
  Cashflow              40.6   56.82  71.79   0.00   38.46  43.75  38.46
  Credit Automation     23.5   29.55  48.72   0.00   38.46   6.25  26.92
  NISE                  49.0   38.64  56.41  52.27   30.77  53.13  53.85
  Help Desk Automation  49.3   54.55  74.36  20.45   53.85  50.00  38.46
  Formula One Upgrade   22.8   31.82  38.46   0.00   11.54  25.00  46.15

Modeled Improvements (sample data)
Process improvements modeled:
- Code reviews and inspections
- Requirements management
- Defect tracking
- Configuration management

Modeled profile scores after improvements:

  Project Name          Score  Mgmt   Defn   Design  Build  Test   Env
  Accounts Payable      75.3   61.73  82.05  60.00   60.15  53.75  50.00
  Priority One          57.6   57.00  55.72  18.36   45.46  22.00  49.31
  HR Enhancements       52.3   32.55  51.72  23.00   42.31  57.50  49.31
  Client Accounts       69.5   53.82  65.59  12.00   50.77  67.50  49.31
  ABC Release           74.1   55.82  69.85  49.09   52.46  63.13  49.31
  Screen Redesign       67.0   43.73  63.59  21.00   36.38  20.00  51.77
  Customer Web          59.2   49.45  27.08  58.64   53.85  54.00  49.62
  Whole Life            50.2   49.82  32.21  27.73   31.92  24.75  53.85
  Regional - East       57.7   59.36  49.59   0.00   30.77   9.38  50.77
  Regional - West       52.6   55.18  30.08   0.00   33.92  19.38  26.92
  Cashflow              67.6   66.82  71.79   0.00   49.46  53.75  49.46
  Credit Automation     60.5   41.55  78.72   0.00   50.46  26.25  46.92
  NISE                  79.0   68.64  76.41  62.27   65.77  53.13  53.85
  Help Desk Automation  79.3   64.55  74.36  47.45   63.85  54.00  58.46
  Formula One Upgrade   52.8   49.82  52.46   0.00   31.54  25.00  56.15

Baseline productivity:
- Average project size: 133
- Average FP/SM: 10.7
- Average time to market: 6.9 months
- Average cost/FP: $939
- Delivered defects/FP: 0.0301

Modeled productivity improvement:
- Average project size: 133
- Average FP/SM: 24.8
- Average time to market: 3.5 months
- Average cost/FP: $467
- Delivered defects/FP: 0.0075

Performance improvements:
- Productivity: ~ +131%
- Time to market: ~ -49%
- Defect ratio: ~ -75%
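The headline improvement percentages follow directly from the two sets of averages. A quick check of the arithmetic (values taken from the slide; the productivity change computes to about +132%, which the slide reports as roughly +131%):

```python
# Baseline vs. modeled averages from the slide
baseline = {"fp_per_sm": 10.7, "ttm_months": 6.9, "defects_per_fp": 0.0301}
improved = {"fp_per_sm": 24.8, "ttm_months": 3.5, "defects_per_fp": 0.0075}

def pct_change(before: float, after: float) -> float:
    """Percent change from the baseline value to the improved value."""
    return (after - before) / before * 100

for key, label in [("fp_per_sm", "Productivity"),
                   ("ttm_months", "Time to market"),
                   ("defects_per_fp", "Defect ratio")]:
    print(f"{label}: {pct_change(baseline[key], improved[key]):+.0f}%")
```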

Contact Information
David Consulting Group
Web site: www.davidconsultinggroup.com
David Garmus, dg@davidconsultinggroup.com