4. Test Design Techniques




4. Test Design Techniques
Hans Schaefer
hans.schaefer@ieee.org
http://www.softwaretesting.no/
2006-2010 Hans Schaefer

Contents
1. How to find test conditions and design test cases
2. Overview of techniques for designing test cases
3. How to choose techniques

How to Find Test Conditions and Design Test Cases
Testing requires concrete instructions. Three work steps:
1. Test condition
2. Test case
3. Test procedure and plan for execution

Terminology
Test design specification: A document specifying the test conditions (coverage items) for a test item and the detailed test approach, and identifying the associated high-level test cases. [After IEEE 829]
Test condition: An item or event of a component or system that could be verified by one or more test cases, e.g. a function, transaction, quality attribute, or structural element. (Something to be tested.)
Test case: A set of input values, execution preconditions, expected results and execution postconditions, developed for a particular objective or test condition, such as to exercise a particular program path or to verify compliance with a specific requirement. [After IEEE 610]
A test case should be traceable to its requirements!

Example Test Design Specification
Testing the user interface of www.nsb.no:
1. Is searching for trains part of the home page?
2. Single-direction search
3. Return-trip search
4. Search between Norway and neighbor countries
5. Ticket selling
6. Exception handling

Example Test Case
What: Searching connections between Norway and neighbor countries
Preconditions: Test client connected to the www.nsb.no server, displaying the home page.
Inputs:
  From stations: Bergen, Oslo, Trondheim, Narvik
  To stations: Stockholm C, Gøteborg C, Kiruna, Sundsvall
  Date and time: next Monday, 08:00
Expected results: Existing train connections (to be checked against www.bahn.de)
Postconditions: Test client still connected to www.nsb.no; no change in server database content.
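To make the structure explicit (this sketch is not part of the original slides), the example above can be captured as a record whose fields mirror the IEEE 610 definition; the field names and the requirement id REQ-SEARCH-4 are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class TestCase:
    """A test case record with the fields named in the slides."""
    name: str
    requirement: str      # traceability: which requirement/test condition is covered
    preconditions: str
    inputs: dict
    expected: str
    postconditions: str

nsb_search = TestCase(
    name="Search connections between Norway and neighbor countries",
    requirement="REQ-SEARCH-4",  # hypothetical requirement id
    preconditions="Client connected to www.nsb.no, home page displayed",
    inputs={
        "from": ["Bergen", "Oslo", "Trondheim", "Narvik"],
        "to": ["Stockholm C", "Gøteborg C", "Kiruna", "Sundsvall"],
        "when": "next Monday 08:00",
    },
    expected="Existing train connections (cross-checked via www.bahn.de)",
    postconditions="Still connected; server database unchanged",
)
print(nsb_search.requirement, "->", nsb_search.name)
```

Keeping the requirement id on the record is what makes the traceability demanded on the terminology slide cheap to query later.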

Context
(Diagram: a test design specification relates 1:n to test conditions, which relate n:m to test cases.)

Test Cases
What is tested here (the test condition)
Preconditions
Input
Expected results
Postconditions

Discussion: Have you ever checked that your bank correctly computes your account balance? How would you test this? See the next slide.
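One concrete answer to the discussion question (an invented sketch, not from the slides): compute the expected balance independently from the opening balance and the individual transactions, then compare it with what the statement reports. Amounts are in integer øre to avoid floating-point surprises:

```python
def expected_balance(opening: int, transactions: list[int]) -> int:
    """The balance we *should* see, computed independently of the bank's code."""
    return opening + sum(transactions)

# Invented sample data, in oere (1 NOK = 100 oere).
opening = 100_000                      # 1000.00 NOK
transactions = [-25_000, 120_050, -9_990]

reported_on_statement = 185_060        # the statement shows 1850.60 NOK
assert expected_balance(opening, transactions) == reported_on_statement
print("balance check passed")
```

The point is exactly the one the next slide makes: without an independently computed expected result, any reported balance looks plausible.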

Expected Results - How Important?
Aren't they self-evident? ("Expected result = the program behaves reasonably.")
Problems if you do not specify them:
- Under time pressure you overlook failures (everything looks plausible).
- During test design you do not think the specification through.
- You forget to check results that are not easily visible (memory leaks, buffer corruption, background data, ...).
What are they? The results you should get, judging from the specification of the test object.

Test Case Steps (not for exam)
1. Set-up: Acquire the necessary resources. Instantiate the unit under test and put it into the desired state. This may require other (surrounding, helper) units and test data.
2. Input: Send inputs to the unit under test.
3. Verify: Fetch the result of the stimulus. This may be output data or state changes. Collect these and return them to the test framework, which then reports the outcome to the tester.
4. Teardown: Release all resources used, in order to avoid interdependencies between tests.
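These four steps map directly onto the structure of common unit-test frameworks. A minimal sketch using Python's unittest, with an invented BankAccount as the unit under test:

```python
import unittest

class BankAccount:
    """Invented unit under test: a trivially simple account."""
    def __init__(self, balance=0):
        self.balance = balance
    def deposit(self, amount):
        self.balance += amount

class TestBankAccount(unittest.TestCase):
    def setUp(self):
        # 1. Set-up: instantiate the unit under test in the desired state.
        self.account = BankAccount(balance=100)

    def test_deposit(self):
        # 2. Input: send the stimulus to the unit under test.
        self.account.deposit(50)
        # 3. Verify: compare the observed state change with the expected result.
        self.assertEqual(self.account.balance, 150)

    def tearDown(self):
        # 4. Teardown: drop resources to avoid interdependencies between tests.
        self.account = None

if __name__ == "__main__":
    unittest.main()
```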

Test Procedure - How to Run the Test?
Test procedure: A document specifying a sequence of actions for the execution of a test. Also known as test script or manual test script. [After IEEE 829]

How Detailed Should a Test Procedure Be?
It depends on the tester's skills and knowledge:
- Worst case (test robot): everything spoon-fed - program, test script.
- Normally: specify the order of test cases and whatever needs special attention.
- Best case: only preparation and clean-up are specified; otherwise just run the tests.
It also depends on documentation requirements.

Test Procedure per IEEE 829-1998
1. Test procedure specification identifier (title)
2. Purpose
3. Special requirements
4. Procedure steps (include steps 4.1 through 4.10 as applicable):
   4.1 Log (how to)
   4.2 Set up
   4.3 Start
   4.4 Proceed
   4.5 Measure
   4.6 Shut down
   4.7 Restart
   4.8 Stop
   4.9 Wrap up
   4.10 Contingencies

What To Do During Test Design?
Analyze what to test -> test conditions.
Translate test conditions into test cases with test data.
Traceability is important: which test cases test what?
- Against specifications and requirements (-> impact analysis, requirements coverage)
Choose good names for your test cases and put them into a database, a test management tool, ...
Risk considerations: What is most important to test? What is going to be changed most?

Techniques for Test Design
- Black box - specification based
- White box - structure based: everything in the code
- Experience based - the tester's experience with bugs
- Random test
We have to choose among these!
Reference: A process for unit test design: http://www.ipl.com/pdf/p0829.pdf

Black Box Test Techniques
Black box = based on the specification, not on the code.
The specification may be the component specification.
(Input -> test object -> output)
Finds especially forgotten or misunderstood items.
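An invented illustration of specification-based design: suppose the specification says "orders of 500 NOK or more ship free; otherwise shipping costs 79 NOK". The test cases below are derived from that sentence alone, concentrated around the boundary; the implementation is only written out so the sketch can run:

```python
def shipping_cost(order_total: int) -> int:
    """Invented implementation of the one-sentence specification above."""
    return 0 if order_total >= 500 else 79

# Black-box test cases derived purely from the specification,
# chosen at and around the boundary value 500.
spec_cases = [
    (499, 79),  # just below the boundary
    (500, 0),   # exactly on the boundary
    (501, 0),   # just above the boundary
    (0, 79),    # minimum order total
]
for order_total, expected in spec_cases:
    assert shipping_cost(order_total) == expected, (order_total, expected)
print("all specification-based cases passed")
```

Note that these cases could have been written before the function existed, which is the point made on the next slide.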

More about Black Box
The specification may be:
- models (formal, informal)
- descriptions
- user manuals
- our thinking about the system and its components
Try to cover the whole specification.
Black-box techniques can be used BEFORE implementation!

White Box Test Techniques
White box = based on the code: everything in the code shall be executed.
The expected test result is still taken from the specification.
Works best at component level.
Also called: structure based, glass box, open box.
(Input -> test object -> output)
Finds superfluous code. Finds holes in the test. Important for security and safety requirements.
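A sketch of the structure-based idea (invented example): test cases are chosen so that every branch in the code executes at least once:

```python
def classify(score: int) -> str:
    """Invented unit under test with three branches."""
    if score < 0:
        raise ValueError("score must be non-negative")
    if score >= 60:
        return "pass"
    return "fail"

# Structure-based test set: one case per branch in the code.
assert classify(75) == "pass"   # covers the score >= 60 branch
assert classify(10) == "fail"   # covers the fallthrough branch
try:
    classify(-1)                # covers the error branch
except ValueError:
    pass
print("every branch executed at least once")
```

Running such tests under a coverage tool (for Python, e.g. coverage.py) reports any branches the tests missed, so further cases can be designed to cover the holes, as the next slide notes.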

More about White Box
Start from:
- the code
- design and interfaces
Try to cover the whole implementation.
Coverage can be measured by automated tools, and many coverage criteria are possible.
Coverage can be checked, and then more test cases can be designed to cover the holes.

Experience-Based Test
The tester knows typical problems. Focus on:
- typical faults
- pessimism
- unclear items
- special values
- special connections and contexts
- risk factors
- use of the program and its environment
Techniques: error guessing, fault attack, exploratory testing.
Very dependent on knowledge, skill and experience!
A good addition to black box and white box!
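A small fault-attack sketch (invented example): an experienced tester keeps a checklist of inputs that historically break string handling and throws all of them at the unit under test, with "does not crash" as the generic expected result:

```python
def normalize_name(name: str) -> str:
    """Invented unit under test: tidy up a person's name."""
    return " ".join(name.split()).title()

# Error-guessing checklist: inputs that experience says often cause trouble.
suspicious_inputs = [
    "",                       # empty string
    "   ",                    # whitespace only
    "a" * 10_000,             # very long input
    "O'Brien",                # embedded quote
    "名前",                    # non-ASCII
    "Bobby; DROP TABLE--",    # injection-style content
]
for value in suspicious_inputs:
    result = normalize_name(value)  # must not raise
    assert isinstance(result, str)
print("no crashes on the fault-attack list")
```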

Random Test
Really black box: input is random, produced by a generator.
A usage profile can be the basis for statistical testing.
(Input -> test object -> output)
Finds problems not findable by systematic test design methods!
Example: fuzzing (a sketch follows below). See http://www.codenomicon.com/freeftp-suite/
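A minimal fuzzing sketch (invented example): feed pseudo-random byte strings to a parser, with the generic expected result "rejects bad input cleanly, never crashes"; the seed is fixed so any failure is reproducible:

```python
import random

def parse_length_prefixed(data: bytes) -> bytes:
    """Invented unit under test: payload prefixed by a 1-byte length."""
    if not data:
        raise ValueError("empty input")
    length = data[0]
    if length > len(data) - 1:
        raise ValueError("length prefix exceeds payload")
    return data[1:1 + length]

random.seed(42)  # fixed seed: every failure is reproducible
for _ in range(10_000):
    blob = bytes(random.randrange(256) for _ in range(random.randrange(64)))
    try:
        parse_length_prefixed(blob)
    except ValueError:
        pass  # a clean rejection is an acceptable outcome
    # Any other exception escaping here would be a genuine defect found by chance.
print("10000 random inputs survived")
```

In practice the fuzzing oracle is usually just "no crash, no hang", which is why random testing complements rather than replaces the systematic techniques.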

Some Practical Questions

Plan for Execution / Test Procedure: How the Tests Are Executed
Test cases shall be executed, and this normally requires a plan - like the plan for a film shoot. Think about:
- installation
- smoke test
- main test
- new tests after changes (regression test)
- automated or not
- executed by whom, when, how, where
- priority
- dependencies between test cases
-> test execution schedule (a sketch follows below)
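The dependency point can be made concrete (invented example, not from the slides): given which test case depends on which, any topological order of the dependency graph is a valid execution schedule, which Python's standard graphlib computes directly:

```python
from graphlib import TopologicalSorter

# Invented example: each test case maps to the test cases it depends on.
dependencies = {
    "installation": [],
    "smoke_test": ["installation"],
    "main_search_test": ["smoke_test"],
    "ticket_selling_test": ["smoke_test", "main_search_test"],
    "regression_after_fix": ["ticket_selling_test"],
}

# A valid execution schedule lists every test after all its prerequisites.
schedule = list(TopologicalSorter(dependencies).static_order())
print(schedule)
# ['installation', 'smoke_test', 'main_search_test',
#  'ticket_selling_test', 'regression_after_fix']
```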

How To Choose Techniques
Factors to consider - always weigh several techniques, depending on:
- risk
- expected types of faults
- tester knowledge and experience
- test level
- tools
- type of software, application or problem
- customer requirements
- requirements from authorities, contract, third parties

How To Choose Techniques
Black box: always. This is the approach behind test-driven design and the V-model. Finds misunderstandings early. Finds unclear specifications.
White box: especially for low-level testing. Tools are available for nearly every programming language. A cheap insurance.

How To Choose Techniques
Experience-based test: can be very effective, but requires experience :-) Does not replace black-box methods.
Random test: not used often. Good against problem or project blindness. Used for very high reliability applications.

Heuristics for Test Conditions (not for exam)
Touring Heuristic - Mike Kelly, 20 Sep 2005, http://www.testingreflections.com/node/view/2823
Mnemonic: FCC CUTS VIDS, standing for the eleven tours below:
* Feature tour: Move through the application and get familiar with all the controls and features you come across.
* Complexity tour: Find the five most complex things about the application.
* Claims tour: Find all the information in the product that tells you what the product does.
* Configuration tour: Attempt to find all the ways you can change settings in the product in a way that the application retains those settings.
* User tour: Imagine five users for the product and the information they would want from the product or the major features they would be interested in.
* Testability tour: Find all the features you can use as testability features and/or identify tools you have available that you can use to help in your testing.
* Scenario tour: Imagine five realistic scenarios for how the users identified in the user tour would use this product.
* Variability tour: Look for things you can change in the application - and then try to change them.
* Interoperability tour: What does this application interact with?
* Data tour: Identify the major data elements of the application.
* Structure tour: Find everything you can about what comprises the physical product (code, interfaces, hardware, files, etc.).