TEST AUTOMATION EFFORT ESTIMATION - Best practices





"The subject of software estimating is definitely a black art," says Lew Ireland, former president of the Project Management Institute. Estimation is more of an art than a science, and is inherently prone to the negative effects of human bias. This paper introduces and outlines best practices for the effort estimation process in test automation projects. It also summarizes possible framework components for any test automation project.

1. Candidates for test automation

One of the classic mistakes of test automation teams is not choosing the right test cases for automation. Test automation scripts are a support device for manual testing, not a replacement for it. The customer will therefore focus on the return on investment (ROI) of each automation script built, since the initial investment is high. So choose tangible test cases to automate in each phase of development, and demonstrate their value to the customer.

How do you find good candidate test cases?

Sl. No  Complexity  No. of actions  No. of verifications  Good candidates (~ no. of executions)
1       Simple      < 5             < 5                   > 15 executions
2       Medium      > 5 - < 15      > 5 - < 10            > 8-10 executions
3       Complex     > 15 - < 25     > 10 - < 15           > 5-8 executions

Why do you need phase-wise test automation? Most test automation projects fail to yield ROI quickly because of rapid application change, unsuitable test cases, shaky frameworks and/or scripting issues. Such projects also catch fewer defects than they are supposed to. Root cause analysis showed us the necessity of phase-wise test automation rather than a one-go automation process. So we advise kicking off test automation with the critical test cases that are good candidates, and then slowly branching out to other areas as required. This approach gives the customer lower maintenance costs and gives you more business.
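The candidate table above can be encoded as a small helper. This is only an illustrative sketch: the function names are ours, the table leaves gaps at its boundaries (e.g. exactly 5 actions), and the execution thresholds take the lower bound of the ranges shown.

```python
def candidate_class(actions, verifications):
    """Classify a test case per the candidate table.

    Boundary handling is an assumption where the table leaves gaps
    (e.g. exactly 5 actions falls into Medium here).
    """
    if actions < 5 and verifications < 5:
        return "Simple"
    if actions <= 15 and verifications <= 10:
        return "Medium"
    return "Complex"


def worth_automating(complexity, executions):
    """A case is a good candidate if its expected execution count clears
    the threshold for its complexity class (lower bounds of the ranges
    in the table: >15, >8, >5)."""
    thresholds = {"Simple": 15, "Medium": 8, "Complex": 5}
    return executions > thresholds[complexity]


print(candidate_class(3, 2))           # Simple
print(worth_automating("Complex", 4))  # False
```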
Also remember that the framework needs constant updates alongside the script development process, so it becomes harder to maintain the framework if you have many scripts to develop in parallel.

What test types should be automated?

It is always good to script integration and/or system-level functional test cases, as they mostly bundle component-level test cases within them. This reduces your effort further and also finds good defects (which is, of course, the primary objective of any test automation project). Note that with a good framework, you can change the test scope and/or test type using configuration files.

2. Factors that affect test automation estimation

The following factors may have varying impact on the test automation effort calculation exercise.

1. Framework - Availability (Impact: High). A good framework makes scripting, debugging and maintenance easier. Understand that the framework needs continuous updating throughout script development.

2. Application - Repeat functionality (Impact: High). Automation is much easier when functionality repeats across the application (a keyword-driven approach is recommended here, so you do not end up writing more action/verification methods). If not, the effort of building libraries and/or scripts grows more linearly.

3. Project - Test scope (Impact: High). If both the application and the test scope are complex, automating each test case consumes a large effort.

4. Test tool - Support for the AUT (Impact: Medium). The chosen test tool may not support some application functionality, causing overhead. You may also find it harder to get started with open-source scripting and/or tools.

5. Scripter - Skill (Impact: Medium). This can cost the project. The right skills in the scripter are essential for any good test automation. If the customer is not willing to allow leverage in the estimate for this factor, do not forget to add the learning-curve cost to the overall time.

6. Application - Custom objects (Impact: Medium). The number of custom objects in the automation scope matters, because building and maintaining libraries for them is an overhead for the test automation team.

7. Application - Type: web, client-server or mainframe (Impact: Medium). For web applications, any commercial test tool has excellent utilities and support; otherwise, there is a good chance you will need to spend a large effort building libraries.

8. Application - Developed language & medium (Impact: Low). This matters when the selected test tool does not support specific verification checkpoints.

3. Grouping steps to determine complexity

This is an important exercise: done wrongly, it can skew the overall effort even after in-depth application analysis.

STEP 1: Find the number of actions and verification points for each test case in the automation scope, then chart them to find the average and the lower and upper control limits. The complexity derivation is then based on the AUT itself, not on generic industry standards. For example, step counts for 25 test cases:

TC     01 02 03 04 05 06 07 08 09 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25
Steps   8 12 15 12 16 30  4  8 22 21 12 27 22 15 11 15 12  9  8 19 21 42  6 22 22

[Chart: step count per test case, with average and control limits]

STEP 2: Based on the data chart:

Average step count  = 16
Lower control limit = 8
Upper control limit = 25

So the complexity can be grouped as:

Simple   <= 7 steps
Medium   8 - 16 steps
Complex  17 - 25 steps
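The paper does not state how the control limits were derived; taking the mean plus or minus one population standard deviation happens to reproduce the 8 and 25 above, so the sketch below uses that assumption:

```python
from statistics import mean, pstdev

# Step counts for the 25 test cases from the example above.
steps = [8, 12, 15, 12, 16, 30, 4, 8, 22, 21, 12, 27, 22,
         15, 11, 15, 12, 9, 8, 19, 21, 42, 6, 22, 22]

avg = mean(steps)          # 16.44
sigma = pstdev(steps)      # population standard deviation, ~8.41

lcl = round(avg - sigma)   # lower control limit
ucl = round(avg + sigma)   # upper control limit

print(round(avg), lcl, ucl)  # 16 8 25
```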

Recommendations:

1. Neither group test case steps too closely nor too widely when labeling complexity. Be aware that the pre-script development effort for each test script is considerable, as the following activities are time-consuming:
   1) Executing the test case manually before scripting, to confirm successful operation.
   2) Selecting and/or generating test data for the script.
   3) Creating the script template (header information, comments for steps, identifying the right reusables from the repository, and so on).
These efforts depend heavily on the number of steps in the test case. If test cases vary by only a few steps, this effort does not deviate much; but if they vary by many steps, even this effort differs widely.

2. Another factor in determining complexity is functionality repetition. If a test case is Complex by step count but its functionality is the same as another test case's, it can be labeled Medium or Simple (based on judgment).

3. If a test case's step count exceeds the upper control limit (~25 in this case), the additional steps should be counted as another test case. For example, TC-06, with 30 steps, should be labeled as 1 Complex + 1 Simple (30 - 25 = 5) test case. If a test case is marked Complex instead of Medium, your effort estimate shoots up and hurts your customer; miscalculating in the other direction hurts you. Complexity grouping is therefore a logical exercise with data as its input.
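Recommendation 3 can be sketched as follows (handling counts above two upper control limits via a loop is our extrapolation of the rule):

```python
def complexity(steps):
    """Label by the step-count bands derived above."""
    if steps <= 7:
        return "Simple"
    if steps <= 16:
        return "Medium"
    return "Complex"


def label(steps, ucl=25):
    """Split a test case whose step count exceeds the upper control
    limit into extra test cases, then label each piece."""
    labels = []
    while steps > ucl:
        labels.append("Complex")  # a full-UCL slice is Complex by definition
        steps -= ucl
    if steps > 0:
        labels.append(complexity(steps))
    return labels


print(label(30))  # ['Complex', 'Simple']  -- TC-06 from the example
```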

4. Framework design & estimation

"We have experienced a significant increase in software reusability and an overall improvement in software quality due to the application of framework concepts in the development and (re)use of semi-finished software architectures rather than just single components and the bond between them, that is, their interaction." - Wolfgang Pree [Pree94]

Many frameworks are available, commercially and as open source, either specific to a test tool or tool-agnostic. You will also find homebrew test automation frameworks specific to particular test tools. These frameworks save a lot of scripting, debugging and maintenance effort, but be aware that customizing the framework to the application is essential.

Characteristics of any good framework:
- Portable, extensible and reusable across and within projects.
- Easy plug-in/plug-out of functionality as application versions change.
- Loosely coupled with the test tool wherever possible.
- An extended recovery system and exception handling, to capture unhandled errors and keep runs going smoothly.
- Step, log and error information that provides easy debugging and customized reporting for scripts.
- Easy, loosely coupled driving of test data into the scripts.
- Easily controllable and configurable test scope for every test run.
- Simple, easy integration of test automation components with test management, defect tracking and configuration management tools.

Please note that framework efforts span a wide range, as framework size and scope depend purely on the application's nature, size and complexity. It is always good practice to create and/or customize the framework for initial needs, then add or update components and features and tune it as the project grows.
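One of the characteristics above, a configurable test scope per run, can be sketched with a plain INI file read at startup (the section and key names here are illustrative, not from the paper):

```python
import configparser

# Hypothetical scope file: which test type and suites this run covers.
SCOPE_INI = """
[run]
# smoke | regression | integration
test_type = regression
suites = login,checkout

[reporting]
log_level = INFO
"""

config = configparser.ConfigParser()
config.read_string(SCOPE_INI)

suites = [s.strip() for s in config["run"]["suites"].split(",")]
print(config["run"]["test_type"], suites)
```

A driver script reads this once per run, so changing the test scope or test type never requires touching the test scripts themselves.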

5. Scripting effort estimation template

Estimated effort is recorded per complexity band: Simple (< 8 steps), Medium (8-16 steps), Complex (17-25 steps).

SL.NO  SUB-COMPONENT                     REMARKS
1      Pre-script development
  a    Test case execution (manual)     For 1 iteration (assuming the scripter knows the navigation).
  b    Test data selection              For one data set (valid/invalid/erratic kind).
  c    Script template creation         A script template generation utility can avoid this.
  d    Identify the required reusables  Assuming a proper reusable traceability matrix exists.
2      Script development
  a    Application map creation         Assuming the number of objects = number of actions.
  b    Base scripting                   Normally these three go hand in hand; they are separated
  c    Add error/exception handling     here for analysis and reasoning.
  d    Implement framework elements
3      Script execution
  a    Script execution                 For n iterations (~ average iteration count).
  b    Verification & reporting         Assuming minimal defect reporting.
       Total effort per script

The overall effort calculation may include the following components:
1. Test requirement gathering & analysis
2. Framework design and development
3. Test case development (in case the available manual test cases are not compatible)
4. Script development
5. Integration testing and baselining
6. Test management
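Given per-activity estimates for one complexity band, the template totals up per script. The hour values below are placeholders, not figures from the paper; only the three-phase structure mirrors the template.

```python
# Hypothetical effort figures (in hours) for one Medium script,
# structured after the template's three phases.
pre_script = {
    "manual test case execution": 1.0,
    "test data selection": 0.5,
    "script template creation": 0.25,
    "identify required reusables": 0.5,
}
development = {
    "application map creation": 1.0,
    "base scripting": 3.0,
    "error/exception handling": 1.0,
    "framework elements": 1.0,
}
execution = {
    "script execution": 0.5,
    "verification & reporting": 0.5,
}

per_script = (sum(pre_script.values())
              + sum(development.values())
              + sum(execution.values()))
print(f"Effort per Medium script: {per_script} hours")  # 9.25
```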