Measurement of Automation

1 Measurement of Automation
Prepared and presented by Mark Fewster, Grove Consultants
Llwyncynhwyra, Cwmdu, Llandeilo, SA19 7EW, UK
Tel: +44 1558 685180  Fax: +44 1558 685181
Email: mark@grove.co.uk  www.grove.co.uk

Measurement of Automation - Contents
- Some principles
- Important automation metrics
- Internal automation metrics
- Guidelines and recommendations

2 Useful measures
A useful measure "supports effective analysis and decision making, and ... can be obtained relatively easily." (Bill Hetzel, Making Software Measurement Work, QED, 1993)
- easy measures may be more useful even though less accurate (e.g. car fuel economy)
- "useful" depends on objectives, i.e. what you want to know

A few principles
- any measure is better than no measure: it need not be perfect, and may not be adequate
- two measures are better than one?
- presentation of measures is important - see The Visual Display of Quantitative Information
- every measure has limitations
- any measure can be abused
- the currency of a measure must be meaningful

3 Measurement of Automation - Contents
- Some principles
- Important automation metrics
- Internal automation metrics
- Guidelines and recommendations

A particularly useful metric: EMTE
Equivalent manual test effort (EMTE):
- a simple measure
- easily understood
- useful in several ways
Test case EMTE = amount of effort required to run the test case manually
- other measures can be related to EMTE
- more meaningful than plain averages
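As an illustration only (not part of the original slides), a minimal sketch of how EMTE might be recorded per test case and summed for a suite; the field names and figures are assumptions:

```python
from dataclasses import dataclass

@dataclass
class TestCase:
    name: str
    emte_hours: float   # effort needed to run this test case manually (its EMTE)

# Hypothetical suite: 120 automated tests, each worth 20 minutes of manual effort
suite = [TestCase(f"tc_{i:03d}", emte_hours=20 / 60) for i in range(120)]

total_emte = sum(tc.emte_hours for tc in suite)
print(f"Total suite EMTE: {total_emte:.0f} hours")   # 40 hours, matching the example that follows
```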

4 Important measures for automation
Benefit
- related to the main objective(s) of automation
- to prove that effort spent automating gives greater value than it would if spent running tests manually
Recurring costs - the largest and most frequent:
- build cost (effort to automate new tests)
- failure analysis cost (effort investigating)
- maintenance cost (effort updating testware)

Measures of benefit
- objective: same tests, less effort
  e.g. 4 hours* to run 120 tests with 40 hrs EMTE; saving = 36 hours for 1 test cycle
- relate target to cost of automation
  e.g. cost to build = 40 minutes/test; for 120 tests, cost to build = 80 hours
  2 test cycles required to break even
  4 test cycles required for 2-fold ROI
*4 hours effort to invoke automated tests and analyse failures
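A minimal sketch (not from the slides) reproducing the break-even arithmetic in the example above; the exact fractional figures sit behind the slide's rounded cycle counts:

```python
# Worked example using the slide's own figures
emte_per_cycle = 40.0         # hours of manual effort the automated tests replace
run_effort_per_cycle = 4.0    # hours to invoke the automated tests and analyse failures
build_cost = 120 * (40 / 60)  # 40 minutes/test for 120 tests = 80 hours

saving_per_cycle = emte_per_cycle - run_effort_per_cycle   # 36 hours
break_even = build_cost / saving_per_cycle                 # ~2.2 cycles
two_fold_roi = 2 * build_cost / saving_per_cycle           # ~4.4 cycles

print(f"Saving per cycle: {saving_per_cycle:.0f} hours")
print(f"Break-even after: {break_even:.1f} test cycles")
print(f"2-fold ROI after: {two_fold_roi:.1f} test cycles")
```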

5 Measures of benefit
- objective: more thorough testing, same effort
  e.g. 4 hours* to run 120 tests with 40 hrs EMTE; saving = 36 hours
  use the 36 hours to specify and run new manual tests, then employ the usual metric(s) used to measure test thoroughness, e.g.
  - functional/structural coverage
  - number of test cases/conditions covered
  - additional (manual) execution time
*4 hours effort to invoke automated tests and analyse failures

Measures of benefit
- objective: unattended testing
  e.g. hours of additional (unattended) machine use
- objective: faster testing (same testing, less time)
  average number of tests run per hour of test effort
- objective: reduce testing cost
  average effort per test case run
- objective: run more tests
  number of test cases run
- objective: repeatable testing
  number of repeated runs
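A small illustrative sketch (not from the slides) computing two of the benefit measures named above; the figures are made up:

```python
tests_run = 120
test_effort_hours = 4.0   # effort to invoke the automated tests and analyse failures

tests_per_effort_hour = tests_run / test_effort_hours   # "faster testing" measure
effort_per_test = test_effort_hours / tests_run         # "reduce testing cost" measure

print(f"Tests run per hour of test effort: {tests_per_effort_hour:.0f}")
print(f"Average effort per test case run:  {effort_per_test * 60:.1f} minutes")
```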

6 Measures of build effort
Time taken to automate tests
- hours to add new or existing manual tests
- average across different test types
Proportion of equivalent manual test effort
- e.g. 1 hour to automate a 30 minute manual test = 2 times equivalent manual test effort
Suggestion - Target: < 2 times; Trend: decreasing 10% per year

Measures of failure analysis effort
Analysis effort for each test
- captured in fault report
- effort from first recognition through to resumption of test execution
- average hours (or minutes) per failed test case
- % tests failing
Suggestion - Target: 15 minutes; Trend: stable
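For illustration only (not from the slides), a sketch of the build-effort ratio and the failure-analysis measures above, using made-up figures:

```python
# Build effort as a proportion of EMTE
automation_effort_hours = 1.0   # effort to automate one test
manual_run_effort_hours = 0.5   # EMTE of the same test (30 minutes)
build_ratio = automation_effort_hours / manual_run_effort_hours
print(f"Build effort = {build_ratio:.1f} x EMTE (suggested target: < 2)")

# Failure analysis: average minutes per failed test, recognition through to resumption
analysis_minutes = [10, 25, 5, 40, 12]   # hypothetical figures from fault reports
failed, total = len(analysis_minutes), 120
print(f"Average analysis effort: {sum(analysis_minutes) / failed:.0f} minutes per failed test")
print(f"Tests failing: {100 * failed / total:.1f}%")
```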

7 Measures of maintenance effort
Maintenance effort of automated tests
- percentage of test cases requiring maintenance
- average effort per test case
- percentage of equivalent manual test effort
- frequency of software changes that impact automated tests
Suggestion - Target: < 10%; Trend: stable or decreasing

Measurement of Automation - Contents
- Some principles
- Important automation metrics
- Internal automation metrics
- Guidelines and recommendations
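A minimal sketch (not from the slides) of the maintenance measures above, with made-up figures:

```python
tests_total = 120
tests_needing_maintenance = 9
maintenance_effort_hours = 6.0
suite_emte_hours = 40.0

pct_maintained = 100 * tests_needing_maintenance / tests_total   # suggested target: < 10%
avg_effort = maintenance_effort_hours / tests_needing_maintenance
pct_of_emte = 100 * maintenance_effort_hours / suite_emte_hours

print(f"Tests requiring maintenance: {pct_maintained:.1f}%")
print(f"Average maintenance effort:  {avg_effort:.2f} hours per maintained test")
print(f"Maintenance effort as % of EMTE: {pct_of_emte:.1f}%")
```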

8 Internal (to automation) measures
Indicators of test automation health
- number of tests versus scripts
- total execution time of tests
- total EMTE
- number of uses (executions)
- number of software faults found
- number of test faults found
- number of new (automated) tests
- number of new scripts

Attributes of test automation
Reliability
- % automated tests with faults
- no. test iterations due to test faults
- no. false negatives (test failed but result OK)
Flexibility
- time to test an emergency fix
- time to identify and execute a specific subset of tests (e.g. for regression testing)
- time to restore archived tests
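Purely as an illustration (not from the slides), two of these health and reliability indicators computed from made-up figures:

```python
num_tests, num_scripts = 120, 45
executions = 600
false_negatives = 9   # runs where the test reported a failure but the software result was OK

tests_per_script = num_tests / num_scripts
false_negative_rate = 100 * false_negatives / executions

print(f"Tests per script:    {tests_per_script:.1f}")
print(f"False negative rate: {false_negative_rate:.1f}% of executions")
```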

9 Attributes of test automation
Usability
- average time to become confident and productive
- effort to perform main tasks (e.g. select/invoke tests, analyse failures)
- users' own rating
Robustness
- no. tests that fail due to one software fault
- frequency of test failures due to unexpected events

Attributes of test automation
Efficiency
- effort to perform tasks (e.g. automating tests, preparing tests for execution, ascertaining results)
- % test scripts used by at least x tests
Portability
- effort to prepare automated tests for execution in a new environment
- no. of different environments supported
- effort to make tests run using a different tool

10 Measurement of Automation - Contents
- Some principles
- Important automation metrics
- Internal automation metrics
- Guidelines and recommendations

Collecting the facts
- do not collect information for the sake of it
- decide how often you need to report
- decide who needs the information
- decide what information is needed
- decide why that information is needed
- expect to refine the data collection
Source: Isabel Evans, "Get your message across"

11 Presenting measures
- figures embedded in explanatory text: may not be read
- tables of figures: important information may not stand out
- graphs: particularly good for trends and comparison
- dashboard of instruments: helps to characterise information

Figures embedded in text
The latest testing effort proved to be more difficult than planned. Of the 30 programs to be tested, only 10 were delivered on time and 5 were delivered with only one week remaining of the test schedule. The quality of the software was also poorer than expected. In the first week 47 faults were found, of which 18 were of high severity and 19 medium severity. This compares with an expected total of 30, with a maximum of 10 high severity and 10 medium severity faults. During the next two weeks of testing a further 55 faults were found, 22 of them high severity and 16 medium severity.

12 Table of figures

No. Programs: 10   1   8   4   0   2   5   0
Days late:     0   2   4   6   8  10  12  14
(Inconsistent scales may be confusing.)

Week no.:   1   2   3   4
High:      18  14   8   3
Medium:    19  11   5   5
Low:       10   6  11   7
Total:     47  31  24  15

Graphs
[Two charts of the same data: No. Programs versus Days Late, and Faults Found (Low / Medium / High) per Week No.]

13 Dashboard (ideas)
[Example dashboard instrument: proportion of tests Completed (>60% / >30%) versus Untested.]

Graphics
Graphics reveal data - use them to display complex ideas clearly, precisely and efficiently, but...
"As to the propriety and justness of representing sums of money, and time, by parts of space, tho very readily agreed to by most men, yet a few seem to apprehend that there may possibly be some deception in it, of which they are not aware..."
William Playfair, The Commercial and Political Atlas (1786), quoted by Edward Tufte, The Visual Display of Quantitative Information (1983)
Source: Isabel Evans, "Get your message across", Eurostar 2002

14 The Lie Factor
Lie Factor = (size of effect shown in graphic) / (size of effect in data)

Week:         1    2    3    4
No. of bugs: 50  100  150  200

[Two charts of the same data: one using the diameter of the circle to show the data, one using the area of the circle to show the data.]

Presentation guidelines
- use charts / graphs where possible: make the information visually interesting
- keep important information visible: large and stuck to the wall
- use (primary / bright) colours
  - red signifies problems / risk
  - green signifies things are OK (in the UK, at least!)
- keep information up-to-date
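As an aside not on the original slide, a minimal sketch of how the Lie Factor plays out for the circle example, assuming Tufte's definition and that the perceived size of a circle is its area:

```python
first, last = 50, 200   # bugs in week 1 and week 4, from the data above

def effect(a, b):
    """Size of effect: relative change from a to b."""
    return (b - a) / a

data_effect = effect(first, last)            # 3.0

# Diameter proportional to value: the area the eye sees grows with the square
diameter_effect = effect(first**2, last**2)  # 15.0
print(f"Lie factor, diameter-scaled circles: {diameter_effect / data_effect:.1f}")  # 5.0

# Area proportional to value: perceived size tracks the data
area_effect = effect(first, last)
print(f"Lie factor, area-scaled circles:     {area_effect / data_effect:.1f}")      # 1.0
```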

15 Recommendations
- measure benefit with respect to objectives
  - these may change over time and must be kept challenging
- measure recurring costs
  - largest costs: build, failure analysis, maintenance
  - set targets for reduction (per test)
- be credible
  - don't artificially inflate

Measurement of Automation - Summary
- there are no perfect measures: measuring something is better than no measure
- the objectives of automation indicate the important metrics
- measure benefit - most important (since costs are more obvious)
- measure specific costs - so you know where to improve