Productivity Measurement and Analysis: Best and Worst Practices
Dr. Bill Curtis, Director, CISQ
Consortium for IT Software Quality
A co-sponsored consortium bringing together IT executives and CISQ technical experts.
CISQ Objectives and Process
CISQ work groups define automated measures for:
- Automated Function Points
- Reliability
- Performance Efficiency
- Security
- Maintainability
Process: CISQ Exec Forum -> defined measures -> OMG approval -> ISO fast-track -> deployment workshops
Automated FP Specification
- OMG approved the Automated Function Point specification
- Specification developed by an international team led by David Herron of the David Consulting Group
- Mirrors IFPUG counting guidelines, but is automatable
- Commercial implementations expected in Q2 2013
Productivity Analysis Objectives
- Improvement
- Estimation
- Benchmarking
- Managing vendors
Productivity Analysis Measures
Primary measures:
- Size: instructions, functions, requirements
- Effort: hours, roles, phases
- Quality: functional, structural, behavioral
Adjustment measures:
- Demographics: application, project, organization
Software Productivity
Software Productivity = Size of software produced / Total effort expended to produce it
Release Productivity = Size of software developed, deleted, or modified / Total effort expended on the release
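The two ratios above can be sketched as simple functions. This is an illustrative sketch, assuming size is measured in function points and effort in person-hours; the names and units are not from any CISQ specification.

```python
# Illustrative sketch of the two productivity ratios from the slide.
# Units (function points, person-hours) are assumptions for the example.

def software_productivity(size_produced: float, total_effort: float) -> float:
    """Size of software produced / total effort expended to produce it."""
    return size_produced / total_effort

def release_productivity(fp_developed: float, fp_deleted: float,
                         fp_modified: float, release_effort: float) -> float:
    """Size developed, deleted, or modified / total effort on the release."""
    return (fp_developed + fp_deleted + fp_modified) / release_effort

# 180 FP of change delivered in 3600 person-hours:
print(release_productivity(120, 15, 45, 3600))  # 0.05 FP per person-hour
```

Note that release productivity counts deleted and modified functionality as well as new development, since all three consume effort.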
Size Measures
- Requirements-based: Use Case Points, Story Points. Use Case Points have not become widely used and need more development. Story Points are subjective to each team and are subject to several forms of bias.
- Computer instructions: Lines of Code. Most frequently used. Different definitions of a line can cause counts to vary by 10x. Smaller programs often accomplish the same functionality with higher-quality coding.
- Functions: Function Points. Popular in IT. Several variations of counting schemes exist (IFPUG, NESMA, Mark II, COSMIC, etc.). Manual counting is expensive and subjective; certified counters can differ by 10%.
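The claim that line counts vary with the definition of a "line" is easy to demonstrate. The sketch below, with an invented C snippet, compares three common (but by no means standardized) counting rules; the 10x figure in practice comes from far more divergent rules than these.

```python
# Illustrative sketch: the same source yields different "LOC" counts
# depending on the counting rule. The sample and rules are assumptions.

SAMPLE = '''\
// Sum the first n integers.
int sum(int n) {
    int total = 0;

    for (int i = 1; i <= n; i++)
        total += i;
    return total;
}
'''

def physical_lines(src: str) -> int:
    """Every physical line, including blanks and comments."""
    return len(src.splitlines())

def non_blank_lines(src: str) -> int:
    """Physical lines with at least one non-whitespace character."""
    return sum(1 for line in src.splitlines() if line.strip())

def logical_lines(src: str) -> int:
    """Crude rule: exclude blanks, pure // comments, and lone braces."""
    return sum(1 for line in src.splitlines()
               if line.strip() and not line.strip().startswith("//")
               and line.strip() not in ("{", "}"))

print(physical_lines(SAMPLE), non_blank_lines(SAMPLE), logical_lines(SAMPLE))
# → 8 7 5
```

Three definitions, three different sizes for the same program, which is why any LOC-based baseline must publish and automate its counting definition.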
Effort Measures: the Weakest Link
Sources of error:
- After-the-fact estimates
- Underreporting
- Lack of normalization: roles included, phases included, hours in a person-year
- Memory lapses
- Time-splicing
- Inconsistency
- Contract issues
- HR issues
- Impressions
Result: effort data are unreliable and inconsistent.
Lifetime Productivity Decline
- Assumption: productivity is a stable number.
- Reality: productivity is unstable and tends to decline.
- Incremental increases in technical debt drive a continuing decrease from the original productivity baseline, unless you take action!
Carry-forward Rework
- Release N: develop N + rework N. Unfixed defects from release N carry forward.
- Release N+1: develop N+1 + rework N+1 + rework N. Unfixed defects from release N+1 carry forward.
- Release N+2: develop N+2 + rework N+2 + rework N + rework N+1.
Each release's unfixed defects add rework to every subsequent release.
Quality-Adjusted Productivity
Quality-Adjusted Productivity = Release Productivity adjusted by f(technical debt)

Release productivity should be adjusted for:
1. Effort shifted forward for fixing functional defects added in this release
2. Effort shifted forward for fixing structural defects added in this release
3. Future effort caused by maintainability problems added in this release

Quality-Adjusted Productivity =
    (Function Points or Enhancement Function Points) /
    ( Effort (Release N)
      + Effort for carry-forward should-fix functional defects (Release N)
      + Effort for carry-forward should-fix structural defects (Release N)
      + f_effort(Maintainability (Release N)) )
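The adjustment can be sketched directly from the formula. This is a minimal sketch, assuming all three carry-forward terms have already been estimated in person-hours; in practice f_effort must be calibrated from an organization's own defect and maintainability data.

```python
# Illustrative sketch of quality-adjusted productivity. The effort
# figures and units (FP, person-hours) are assumptions for the example.

def quality_adjusted_productivity(function_points: float,
                                  release_effort: float,
                                  functional_defect_effort: float,
                                  structural_defect_effort: float,
                                  maintainability_effort: float) -> float:
    """Function points delivered, divided by release effort plus the
    carry-forward effort this release shifts into future releases."""
    adjusted_effort = (release_effort
                       + functional_defect_effort
                       + structural_defect_effort
                       + maintainability_effort)
    return function_points / adjusted_effort

# 500 FP in 5000 hours looks like 0.10 FP/hour unadjusted, but the
# release also shifted 1000 hours of carry-forward work forward:
print(500 / 5000)                                         # 0.1
print(quality_adjusted_productivity(500, 5000, 600, 300, 100))
```

The unadjusted figure overstates productivity precisely when a release achieves speed by accumulating technical debt.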
Productivity Baselines
pro-duc-tiv-i-ty base-line (n.) a value in a monotonically declining function that compares the amount of product produced to the amount of effort required to produce it... unless you take action.

Release Productivity = Volume of code developed, modified, or deleted / Total effort expended on the release
Quality-Adjusted Productivity = Release Productivity adjusted by f(technical debt)
Segment Applications (1)
[Chart: productivity distributions differ by segment — engineering programs, business programs, and small programs (sizes from roughly 1 KLOC upward).]
Segment Applications (2)
[Chart: baselines of lines of code per person-year (0 to 2500), plotted by year, 1980-1982.]
Demographic Measures
- Application level: type of application, languages, frameworks, technology platforms, application age, etc.
- Project level: development method, lifecycle phases, release frequency, team size, software tool usage, team experience, customer, requirements volatility, in- vs. outsourced development, location, etc.
- Organization level: organizational maturity, industry segment, etc.
Productivity Analysis Process
1) Establish executive leadership
2) Form measurement group
3) Automate data collection
4) Seek measure consensus
5) Check data validity
6) Enforce data management
7) Inspect data distributions
8) Segment applications
9) Iterate analyses
10) Pre-brief results
11) Structure final results
Best Practices: Initiation
Establish executive support:
- State objectives
- Track progress
- Enforce compliance
- Achieve insight
Charter measurement group:
- At least one full-time member
- Linked to improvement function
- Measurement and statistical expertise
- Communication skills
Automate data collection:
- Integrate with development environment
- Automate collection and reporting
- Train data providers
- Assist semi-automated collection
Seek measure consensus:
- Match measures to objectives
- Start with industry-standard measures
- Review with staff and management
- Publish and automate definitions
Best Practices: Gathering Data
Check data validity:
- Audit data submissions
- Make common-sense data checks
- Review questionable data with projects
- Correct issues
Enforce data management:
- Clarify application boundaries
- Establish change and version control
- Continually normalize measures
- Manage productivity baseline versions
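A common-sense validity check can be automated at submission time. The sketch below assumes each record carries a size and effort field; the field names and the plausibility thresholds are illustrative, not CISQ-defined, and each organization would calibrate its own.

```python
# Illustrative sketch of common-sense checks on a productivity data
# submission. Field names and thresholds are assumptions for the example.

def validate_submission(record: dict) -> list[str]:
    """Return a list of issues to review with the submitting project."""
    issues = []
    fp = record.get("function_points", 0)
    hours = record.get("hours", 0)
    if fp <= 0:
        issues.append("non-positive size")
    if hours <= 0:
        issues.append("non-positive effort")
    elif fp > 0:
        productivity = fp / hours
        # Flag implausible ratios (FP per person-hour) for human review
        # rather than silently accepting or rejecting them.
        if not 0.001 <= productivity <= 1.0:
            issues.append(f"implausible productivity {productivity:.3f}")
    return issues

print(validate_submission({"project": "A", "function_points": 180,
                           "hours": 3600}))  # → []
```

The point is not to reject odd data automatically, but to queue it for the "review questionable data with projects" step above.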
Best Practices: Analyzing Data
Inspect data distributions:
- Evaluate shape and range
- Investigate outliers and extreme values
- Revalidate strange data
- Establish impact on interpretation
Segment applications:
- Identify important demographic factors
- Test differences among factor categories
- Evaluate cross-factor relationships
- Establish baseline groupings
Iterate analyses:
- Develop overall and segmented baselines
- Study scatterplots of data
- Ask "why" questions about the results
- Run more analyses to answer questions
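Inspecting distributions before baselining can start with something as simple as an interquartile-range check. This sketch uses Tukey's 1.5-IQR rule on made-up productivity values; real analyses would use a statistics package, and the flagged values go back to the "revalidate strange data" step rather than being dropped.

```python
# Illustrative sketch: flag extreme values with Tukey's 1.5-IQR rule
# before computing a baseline. The data values are invented.

import statistics

def outliers(values: list[float]) -> list[float]:
    """Values lying more than 1.5 IQRs outside the quartiles."""
    q1, _, q3 = statistics.quantiles(values, n=4)
    iqr = q3 - q1
    low, high = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return [v for v in values if v < low or v > high]

# FP per person-hour for eight projects; one looks too good to be true.
productivity = [0.04, 0.05, 0.05, 0.06, 0.06, 0.07, 0.07, 0.31]
print(outliers(productivity))  # → [0.31]
```

An extreme value like the 0.31 project may be a genuine star performer, a mis-scoped application boundary, or underreported effort; only investigation tells which, and each answer changes the baseline's interpretation differently.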
Best Practices: Reporting Data
Pre-brief results:
- Discuss results with application managers
- Seek clarification and explanations
- Correct mistakes or misinterpretations
- Prepare application managers for the report to executives
Structure final results:
- Interview executives about their needs from the data
- Organize results to support decisions
- Highlight the important messages
- Anticipate questions
www.it-cisq.org
1. Join CISQ
2. Contribute to the blog
3. Use CISQ standards
4. Attend CISQ seminars: Berlin, June 19; NJ, Sept. 25; SF, Dec. 11
5. Initiate measurement
6. Improve continually
7. Build great software