Quality is the responsibility of the whole team




Quality is the responsibility of the whole team

- Tests do not create quality or take it away; testing is only a method of assessing the quality of a product.
- If development schedules get tight, make sure project managers do not use the test team as a quality gatekeeper at the end of the development process by skipping QA activities and testing at earlier stages of development.
- Include test coordinators in status meetings, or make sure they get all information that may influence the scope, prioritization, method and/or progress of testing activities, including test planning.
- Never trust the test team to be the quality gatekeeper.

October 2011/Dagny G. Pedersen 1

How to make the testers love you (tester = business expert)

- Configuration management procedures are not invented to make the developer's life miserable; lack of configuration management makes everybody miserable.
- System test is the end users' first encounter with the new system; think about how you can contribute to a GOOD first impression.
- Insist on project resources for deliverables and ways of working that promote good quality (a business term glossary is one example; coding standards and reviews are others).
- Defects that should never survive unit test and unit integration test:
  - Invalid input allowed (e.g. alphanumeric input in a numeric field, values out of range)
  - Missing error messages
  - Missing field labels, buttons, values in drop-down lists etc.

Test tools: Computer Aided Software Testing (CAST)

Topics:
- Types of test tools
- Test automation
- Selection and introduction of test tools

Why invest in test tools?

Some reasons why an enterprise needs software testing tools:
- The demand for testing is continuously growing: accelerating market demands for new releases of business systems, an increasing number of user interface platforms, and the need to test upgrades of a whole range of technical infrastructure components.
- Tools can make testing cheaper: there is just not enough time, money and human resources with the right knowledge to keep test coverage at an acceptable level using only manual testing.
- The business systems of today are so complex and diversified that it is impossible to assess their quality without a tool to support the test process and keep track of all quality attributes and their statuses.
- Multi-site organisations, with stakeholders, testers and developers spread over several sites and even different time zones, also call for a tool to ease the sharing of information.
- Tools can make testing more systematic.
- Some tests are not possible to do without tools.

ALM = Application Lifecycle Management

- The term is commonly used for products that somehow touch the application lifecycle, from requirements gathering and configuration management to project management and monitoring.
- New tools and improved versions of existing tools are continuously introduced to the market.
- No vendor supports ALL aspects of ALM in one single tool.

A myriad of vendors and tools

ALM tool model

- Project / task / resource management
- Requirement management tool
- Functional testing tool, including automation capabilities
- Incident management / tracking tool
- Test management tool
- Test data preparation tool
- Tools for test of special features
- Tools for performance / load / stress testing
- Configuration management

One key requirement for a test tool is OPEN INTERFACE capabilities.

Configuration management: the most important tool!

Lack of configuration management is the 8th deadly sin!
- A developer must always know which version he is deploying, and must always keep in mind changes in different branches of the code.
- A tester must always know which version of a product he is running his tests against, and use the corresponding version of the test script/procedure.

There is seldom a good excuse for bad configuration management. Still, a lot of developers and testers everywhere waste a lot of time due to lack of version control.

Tools for test of special features

- Security testing (data privacy, access control, virus checking)
- Usability testing
- Coverage tools
- Website link checkers to look for broken links
- Drivers for early verification of interface specifications
- Dynamic analysis tools (a debugger is one example, a performance testing tool another)
- Static code analyzers
- Cross-browser compatibility tools

Some basic terms

- Static testing tool: traverses and checks the code without executing it.
- Dynamic testing tool: executes the code.
- Test framework: a tool to make execution of a test object possible; a developer tool to isolate a component from its surroundings (or that replaces or generates drivers and stubs). Otherwise used in industry for:
  - Reusable and extensible testing libraries used to build testing tools (such tools are also called a "test harness")
  - A type of test automation design (like data-driven or keyword-driven)
  - The process for executing tests
- Test automation framework:
  - defines the format in which to express input variables and expected results
  - creates a mechanism to hook into or drive the application under test
  - executes the tests
  - reports results
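The first sense of "test framework" above, isolating a component from its surroundings with drivers and stubs, can be sketched as follows. All class and function names here are hypothetical, invented for illustration:

```python
# Hypothetical component under test: computes an order total using an
# external tax service. The test replaces (stubs) that dependency so
# the component can be executed in isolation.

class TaxService:
    """Real dependency: would normally call a remote service."""
    def rate_for(self, country):
        raise RuntimeError("not available in the test environment")

def order_total(net_amount, country, tax_service):
    """Component under test."""
    return net_amount * (1 + tax_service.rate_for(country))

class StubTaxService:
    """Stub: returns a fixed, known answer instead of the real one."""
    def rate_for(self, country):
        return 0.25  # fixed rate chosen for the test

# The test code acts as the driver: it supplies input, calls the
# component, and checks the result against the expected value.
total = order_total(100.0, "NO", StubTaxService())
assert total == 125.0
```

The stub makes the expected result predictable; the driver takes the place of whatever code would normally call the component.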

Test management tool

Supports the whole test process: test planning, test design, test execution and incident management.
- Test planning: coverage, status of test scripts (design, review, ready), who is responsible, priority
- Test design
- Test execution planning: test cycles, test sets, when and by whom
- Interface for executing tests and logging test results
- Progress and status reports (which tests are passed, failed, not completed, not run)
- Team collaboration (mail notification is one example)

A lot of these tools also support requirement management:
- Requirement prioritization, release planning, version handling
- Risk models to support risk-based testing

Test management tool requirements

- Provides the necessary metrics for quality assessment, including requirement traceability
- Supports multiple projects
- Supports agile as well as waterfall software development lifecycle models
- Supports distributed teams ("My page" is a very useful feature!)
- Integration with tools from other vendors, e.g. tools for test automation
- Ease of use
- Ease of customization (roles, workflows, access rights, attributes)
- Customer support
- Innovation
- TCO, unless your budget is unlimited

Using test results to assess quality

Metrics from test:
- Requirements coverage summary: requirements verified OK, requirements failed and requirements not yet tested
- Requirement incident count
- Summary of test execution status: number of tests passed, number of tests failed, number of tests not run
- Incident summary report showing the total number of incidents grouped by severity and status
- A list of all unresolved incidents
- Incident progress rate together with test run progress rate
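The first and third metrics above can be sketched in a few lines. The record layout is invented for illustration; a real test management tool would export something richer:

```python
from collections import Counter

# Hypothetical test-run records exported from a test management tool;
# field names are illustrative, not from any particular product.
runs = [
    {"requirement": "REQ-1", "status": "passed"},
    {"requirement": "REQ-1", "status": "passed"},
    {"requirement": "REQ-2", "status": "failed"},
    {"requirement": "REQ-3", "status": "not run"},
]

# Summary of test execution status: counts per status value.
execution_summary = Counter(r["status"] for r in runs)

# Requirements coverage summary: a requirement is "failed" if any of
# its tests failed, "verified OK" if all its tests passed, otherwise
# "not yet tested".
by_req = {}
for r in runs:
    by_req.setdefault(r["requirement"], []).append(r["status"])

coverage = {}
for req, statuses in by_req.items():
    if "failed" in statuses:
        coverage[req] = "failed"
    elif all(s == "passed" for s in statuses):
        coverage[req] = "verified OK"
    else:
        coverage[req] = "not yet tested"
```

The point is that both reports derive mechanically from the raw run log, which is exactly what a test management tool automates.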

Graphs supporting the test completion decision

- Incident progress rate graph
- Test run progress rate graph

Potential benefits of a test management tool

- Ease of access to information on test procedure tasks and test results
- Less biased (unbiased) assessment of quality
- Reuse of test scripts is simplified
- Test process improvement

NOTE!! Using a test management tool is no guarantee of high-quality deliveries

Metrics from the tool can show complete requirements coverage, 100% successful test execution and no open defects at test termination, and still result in a product release with a lot of defects. This is normally caused by a combination of several factors, like:
- Incomplete requirements
- Inadequate test data
- Poor test design
- Tests not focusing on the high-risk areas
- Testing tasks left out of plans because of insufficient time and resources
- Inadequate version handling of requirements, code and test assets
- ...

These factors are due to procedural shortcomings, lack of software testing skills etc., and cannot be helped by any tool.

Incident management tool

- Incident reports with documentation such as screen captures
- Categorization
- Prioritization
- Allocation
- Support for several incident types (defect, change request, issue, ...)
- Incident workflow depending on incident type
- Role-based access rights (not all project members should be allowed to close a defect)
- Statistical analysis (can, for instance, spot error-prone application areas)
- Integration with configuration management (which files were modified when fixing a defect)

Incident workflow example

States: New, Open, In Progress, Fixed, Test <Env 1>, Test <Env 2>, Test <Env 3>, Tested OK <Env 1>, Tested OK <Env 2>, Tested OK <Env 3>, Reopen, Rejected, Closed

Valid transitions and roles:
- New -> Open: bug fix manager, when assigning the issue to a developer
- Open -> In Progress: developer, when starting work on the incident
- In Progress -> Fixed: developer, when the defect is ready for deployment; assigns to deployment
- Fixed -> Test <Env 1>: deployment assigns to test manager when the fix is deployed in <Env 1>
- Test <Env 1> -> Tested OK <Env 1>: tester assigns to deployment, or to test manager if <Env 1> is the last stop
- New, Open, In Progress, Reopen -> Rejected: test manager, bug fix manager or developer assigns to tester
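A workflow like this is, at its core, a state machine: the tool only needs a table of legal transitions and a check on every move. A minimal sketch, using a subset of the states from the example (class and method names are illustrative):

```python
# Legal transitions for a subset of the slide's incident workflow.
VALID_TRANSITIONS = {
    "New": {"Open", "Rejected"},
    "Open": {"In Progress", "Rejected"},
    "In Progress": {"Fixed", "Rejected"},
    "Fixed": {"Test <Env 1>"},
    "Test <Env 1>": {"Tested OK <Env 1>", "Reopen"},
    "Reopen": {"Open", "Rejected"},
}

class Incident:
    def __init__(self):
        self.state = "New"

    def move_to(self, new_state):
        # Role checks (who may perform the move) would also go here.
        if new_state not in VALID_TRANSITIONS.get(self.state, set()):
            raise ValueError(f"illegal transition {self.state} -> {new_state}")
        self.state = new_state

incident = Incident()
incident.move_to("Open")         # bug fix manager assigns to developer
incident.move_to("In Progress")  # developer starts work
incident.move_to("Fixed")        # ready for deployment
# incident.move_to("Closed")     # would raise: Fixed must go via Test <Env 1>
```

Encoding the workflow as data rather than code is also what makes it customizable per incident type, as the previous slide requires.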

Tools for test design and test data generators (from Hans)

- From a database:
  - Generates databases and files from a schema
  - Filters contents from existing databases
  - Translates data between different formats
- From code:
  - In order to cover all code
  - Problem: missing code is not found
  - Problem: the code itself is the test basis :-(
- From interfaces:
  - Finds which data are used in interfaces
  - Can make a GUI map
  - Can generate erroneous input
- From (requirements) specifications or models:
  - Formal notation necessary: UML, MSCs, state charts
  - Model-based testing
- Generators for combinations
- Random generation

The creativity of a test designer cannot be replaced by tools, but standard tests can be generated. Expected results / test oracle: very difficult to generate.

Tools for static testing

- Tools that support the review process (plan, execute and evaluate results)
- Static analyzers (a developer's tool; see the chapter on static testing):
  - The compiler
  - Data flow and control flow anomalies
  - Code standard violations
  - Field boundary over-/underflow
  - Measurement of code complexity, to select code that needs review
- Model checking tools
- Website link checkers to look for broken links

These kinds of tools are cheap, and using them can save a lot of time and money!

Test automation

Test automation has for more than a decade been praised as the silver bullet to control the increasing cost of software testing. The myth that automation can reduce head count creates a dilemma when the initiators of automation require resources for test automation planning, script generation and maintenance of scripts.

Return on investment for test automation cannot be measured in head counts, but rather in these potential benefits:
- the ability to find defects up front
- better test coverage with the same number of testers
- less boring work
- better accuracy and repeatability
- the ability to uncover defects that manual testing cannot (like system resource overutilization)

Things to consider when planning test automation

- How is your manual testing process? The principle of finding defects as early as possible is valid also with automated testing.
- Submitting untested software to automated system test is very bad for the ROI of automation.

"If you automate chaos, you get faster chaos." (Dorothy Graham)

Planning: factors to consider when picking the best candidates for test automation

- Stability of the test object
- Detailed knowledge of the test object, and preferably also the road map for later releases
- Frequency of manual testing
- Complexity
- Test data: a test consumes data, and there is no practical way to refresh it. It must be possible to restore the test precondition automatically, either by roll-back or by running a script ahead of the test.
- Risk

Test of payment: a test automation example

Screens: payment registration screens, payment overview screen, payment detail screen
Services: payment, lists, future payments, account validation, display payment details

Test script flow:
1. Register payment
2. Go to payment overview
3. Select payment from step 1
4. Compare all fields with input from step 1
5. Return to payment overview
6. Cancel payment from step 1

Input: debit account, credit account, payment date, payment amount, ...

Test script design details: the payment screen is tested in every detail: invalid input (date, amount, account, missing input, message field and so on), use of the tabulator key, use of buttons.

Problems with the test design in the previous slide

- The test pretends to be a system test / acceptance test. Business testers can be misled into thinking that all business rules related to payments have been validated.
- Defects related to update of account balance and update of GL will remain undetected. In order to validate the remaining business process, testers will have to enter a new payment.
- Repeating tests on screen input will eventually not detect any new defects.

(Services involved: payment, update account, update GL, clearing)

Soft aspects related to test automation

- Business testers mistrust automatic test results and want to be assured by checking themselves.
- Business testers feel automation is a job threat.
- Business testers are intimidated by the technical harness surrounding automated scripts and feel they lose control of the test process.

Neglect of these aspects can stop the automation initiative.

Test automation initiative and organization

- Involve business testers early in the test automation initiative.
- Involve business testers in test planning and design.
- Communication: you cannot expect the business expert to read 10 pages of technical test design documentation. Think carefully about how new testers shall get sufficient information about the test coverage.
- Be accurate about which testing still has to be manual.
- Status reports must be a summary of manual and automated test coverage and execution results, so business testers get the full picture both during planning and during execution.
- Tool support must be adequate during build-up of the automated test suite, but also afterwards in everyday use of the test suite.
- Run a health check of the test environment up front, before running the business test scripts. Testers will mistrust scripts that often fail due to non-functional errors, and revert to manual testing.

Test automation: different tool types

- Capture/replay tools: very exposed to changes in the test object (but can be useful for performance testing).
- Pure script-based tools (writing programs to test programs): require development resources; test scripts also need debugging; test scripts also need maintenance; the challenge is keeping the test scripts aligned with the test object.
- Keyword-driven (action-word) test design tools.
- Data-driven testing tools: anything that has the potential to change (also called "variability"; this includes environment, end points, test data, locations, etc.) is separated out from the test logic (scripts) and moved into an external asset. This can be a configuration or a test dataset in a spreadsheet or database. The logic executed in the script is dictated by the data values.

A lot of tools on the market combine some of these characteristics.
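The data-driven idea can be shown in a few lines: the test logic stays fixed while the cases live in an external asset. Here a CSV string stands in for the spreadsheet, and the field names are invented for illustration:

```python
import csv
import io

# External asset: inputs and expected results, editable without
# touching the script. In practice this would be a spreadsheet or a
# database table.
EXTERNAL_ASSET = """amount,expected_valid
100.00,yes
-5.00,no
abc,no
"""

def amount_is_valid(text):
    """Logic under test: a payment amount must be a positive number."""
    try:
        return float(text) > 0
    except ValueError:
        return False

# The fixed script: one pass per data row.
results = []
for row in csv.DictReader(io.StringIO(EXTERNAL_ASSET)):
    expected = row["expected_valid"] == "yes"
    results.append(amount_is_valid(row["amount"]) == expected)

assert all(results)
```

Adding a new test case means adding a data row, not writing or debugging more script code, which is where the maintenance saving comes from.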

Keyword-driven tools

Pros:
- Maintenance is low in the long run
- Test cases are concise
- Test cases are readable for the stakeholders
- Test cases are easy to modify
- Keyword re-use across multiple test cases
- Not dependent on tool/language
- Division of labor: test case construction needs stronger domain expertise and lesser tool/programming skills, while keyword implementation requires stronger tool/programming skills with relatively lower domain skill
- Abstraction of layers
- Scripts can be used for both manual and automatic test execution

Cons:
- Longer time to market (compared to manual testing or the record-and-replay technique)
- Moderately high learning curve initially
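The division of labor above becomes concrete in a small sketch: the test case is a readable table of keywords that a domain expert can write, while each keyword is implemented once by someone with programming skills. All keyword and field names are invented for illustration:

```python
# Keyword implementations (written once, by the tool/programming side).
def open_screen(state, name):
    state["screen"] = name

def enter_field(state, field, value):
    state.setdefault("fields", {})[field] = value

def check_field(state, field, expected):
    assert state["fields"][field] == expected

KEYWORDS = {
    "open screen": open_screen,
    "enter field": enter_field,
    "check field": check_field,
}

# The test case itself (written by the domain expert): readable rows
# of keyword plus arguments, which could equally live in a spreadsheet.
TEST_CASE = [
    ("open screen", "payment registration"),
    ("enter field", "amount", "100.00"),
    ("check field", "amount", "100.00"),
]

# The engine: look up each keyword and run it against shared state.
state = {}
for keyword, *args in TEST_CASE:
    KEYWORDS[keyword](state, *args)
```

Because the test case is just data, the same rows can also serve as a manual test procedure, matching the last "pro" in the list.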

Cost of automation

- Training
- Script design (unless you can reuse the test design for manual testing)
- Documentation
- Test data
- Incident analysis (do not underestimate this: incidents are produced at a much higher rate, and when a test fails, is it caused by a setup bug, a test data bug, an automation bug or a product bug?)
- Test execution
- Change management and maintenance
- Servers and other infrastructure costs
- License fees and tool maintenance fees

Calculation of ROI for test automation

Common formula for calculating the winnings:
- Manual testing cost = manual preparation cost + cost of manual test execution x times executed
- Automated testing cost = automation preparation cost + cost of automated test execution x times executed

There are several flaws with this approach:
- You cannot count tests that you normally would not repeat with manual testing.
- You cannot equate manual testing with automated testing: the automated test does everything exactly the same way every time and will eventually not find more defects, while the human tester has awareness of all kinds of product characteristics.
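A worked example of the (flawed) formula above, with invented illustrative numbers, shows how a break-even point falls out of it:

```python
# Illustrative cost figures in hours; all numbers are invented.
manual_prep = 2.0    # design a manual test
manual_run = 1.0     # one manual execution
auto_prep = 20.0     # script and debug the automated test
auto_run = 0.1       # one automated execution

def manual_cost(n):
    """Manual preparation cost + manual execution cost x n."""
    return manual_prep + manual_run * n

def automated_cost(n):
    """Automation preparation cost + automated execution cost x n."""
    return auto_prep + auto_run * n

# Break-even: automation pays off once the saved execution cost covers
# the extra preparation cost, i.e. n = delta_prep / delta_run.
break_even = (auto_prep - manual_prep) / (manual_run - auto_run)
# About 20 executions with these numbers.
```

The slide's objection is visible here too: the formula silently assumes each of those n automated runs is worth the same as a manual run, which it is not.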

Service testing: 4 focus areas

To fully utilize the possible benefit of an SOA architecture, every service consumer cannot test all functionality of the service provider. Four primary focus areas:
- Functional testing: verifies the proper behavior of services and builds regression test suites to automate testing and baseline expected behavior.
- Performance testing: determines throughput and capacity statistics of the service.
- Interoperability testing: measures the design characteristics of a service as well as runtime adherence to standards and best practices (testing against technical design specifications).
- Security testing: data integrity, data privacy, and access control.

SOA test harness

A lot of services do not have a user interface. In the test harness, Service B is the test object; a driver substitutes Service A (the caller), and a stub substitutes Service C (the dependency):

Driver (substitute for Service A) --(1) request--> Service B --(2) request--> Stub (substitute for Service C)
Stub (substitute for Service C) --(3) response--> Service B --(4) response--> Driver (substitute for Service A)

Some commercial SOA testing tools

Company name          Product name
GreenHat              GH Tester
IBM Software Group    Rational Tester for SOA Quality
itko                  LISA
Parasoft              SOAtest
Crosscheck Networks   SOAPSonar
Hewlett-Packard       HP Service Test
SmartBear             soapUI (open source)

Risks with tools (from Hans)

- Unrealistic expectations
- Underestimating the time, cost and effort to introduce the tool (training, coaching, help, ...)
- Underestimating the time and cost of follow-up, change of processes, etc.
- Underestimating the need for maintenance of tools, and of the assets maintained or generated by tools
- Over-reliance on the tool
- Defects in the tool, and fighting against them
- Neglecting version control of test assets
- Neglecting relationships and interoperability between tools
- Tool vendor problems (going out of business, retiring the tool)
- Sometimes too much trust in the tool instead of intelligent choices
- Poor support
- Platform problems

Introduction of tools

Define your process, then look for tool support. Remember that all tools will require resources with skills.

Suggested sequence of tool introduction:

Book:                         My suggestion:
1. Incident management        1. Configuration management
2. Configuration management   2. Incident management
3. Test planning              3. Test planning
4. Test execution             4. Test specification
5. Test specification         5. Test execution

Tool selection process

1. Which testing tasks need tool support?
2. Tool requirement specification
3. Market research (long list of possible tools and vendors)
4. Tool demos; follow up references
5. Create a short list
6. Further studies of the short-list candidates; if a demonstration license is available, try the product yourself
7. Make a business case as a basis for following up costs and benefits
8. Review results and select

Selection criteria

- Learning curve for users (e.g. a new method)
- Ease of integration with the current environment
- Ease of integration with other tools
- How well the product fits the test object (not applicable for all tools)
- Platform (database, operating system, browsers)
- Vendor's market position, customer support, reliability
- License and maintenance

Weight the criteria, and weight how well the tools meet them.
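The final weighting step amounts to a weighted-sum ranking. A minimal sketch, with invented weights, tool names and scores purely for illustration:

```python
# Importance weight per criterion (invented numbers, higher = more important).
weights = {"learning curve": 3, "integration": 5, "platform": 2}

# How well each candidate tool meets each criterion, scored 1-5
# (tool names and scores are invented).
tools = {
    "Tool A": {"learning curve": 4, "integration": 2, "platform": 5},
    "Tool B": {"learning curve": 2, "integration": 5, "platform": 4},
}

def weighted_score(scores):
    """Sum of weight x score over all criteria."""
    return sum(weights[c] * scores[c] for c in weights)

# Rank candidates by weighted score, best first.
ranking = sorted(tools, key=lambda t: weighted_score(tools[t]), reverse=True)
```

Here the heavy weight on integration lets Tool B win despite its steeper learning curve; changing the weights, not the scores, is how the evaluation reflects the organization's priorities.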

Tool introduction

- Run a pilot: a pilot should result in a list of necessary tool implementation tasks (rules for usage, adaptation of the test procedure, naming standards, workflow, training program, ...)
- Evaluate results: what is there to gain?
- Adapt the process
- Train the users (just in time)
- Stepwise introduction
- Tool support is an ongoing task
- Evaluate the benefit and calculate TCO (Total Cost of Ownership) after 1-2 years. The cost of training and support must be included when you calculate TCO.

References

- Kaner, Bach, Pettichord: Lessons Learned in Software Testing, Wiley, 2002
- Gartner: Magic Quadrant for Integrated Software Quality Suites, ID number G00208975
- Dorothy Graham: Successful Test Automation, tutorial, Software 2011 (Dataforeningen)
- Mark Fewster & Dorothy Graham: Software Test Automation, Addison-Wesley, 1999
- http://www.oasis-open.org/
- www.soatesting.com
- http://www.thbs.com/pdfs/soa_test_methodology.pdf

Lists of tools:
- http://www.softwareqatest.com/qatweb1.html#load
- http://www.usefulusability.com/24-usability-testing-tools/#paper
- http://en.wikipedia.org/wiki/list_of_unit_testing_frameworks
- http://www.smashingmagazine.com/2011/08/07/a-dozen-cross-browser-testing-tools/
- http://www.satisfice.com/blog/ (testing blog of James Bach)