Di 3.1. A Test Design Poster for Testers, Developers, and Architects. Peter Zimmerer



Di 3.1. January 26-30, 2009, Munich, Germany. ICM - International Congress Centre Munich.
A Test Design Poster for Testers, Developers, and Architects
Peter Zimmerer

Corporate Technology
A Test Design Poster for Testers, Developers, and Architects
OOP 2009, Munich, Germany
Peter Zimmerer, Principal Engineer
Siemens AG, CT SE 1
Corporate Technology, Corporate Research and Technologies
Software & Engineering, Development Techniques
81739 Munich, Germany
peter.zimmerer@siemens.com
http://www.siemens.com/research-and-development/
http://www.ct.siemens.com/
Copyright Siemens AG 2009. All rights reserved.

Contents
- Introduction
- Test design methods - here: methods, paradigms, techniques, styles, and ideas to create, derive, select, or generate a test case
- Motivation - who cares about test design methods?
- Examples and references
- Problem statement
- Poster: Test Design Methods on One Page
- Guidelines and experiences
- Backup: details on test design methods

Introduction - Test design methods
Good test design, i.e. high-quality test cases, is very important. There are many different test design methods and techniques:
- static, dynamic
- black-box, white-box, grey-box
- based on fault model, experience, exploratory
- statistical (user profiles), random (monkey)
The challenge is to adequately combine these methods dependent on the given problem, domain, and requirements. This is art as well! Black-box test design methods are often based on models (model-based testing).

Some systematic methods for test design
Black-box (models, interfaces, data):
- Requirements-based (traceability matrix)
- Use case-based testing, scenario testing
- Design by contract
- Equivalence class partitioning
- Classification-tree method
- Boundary value analysis
- State-based testing
- Cause-effect graphing
- Decision tables, decision trees
- Combinatorial testing (n-wise)
White-box (internal structure, paths):
- Control flow testing
- Data flow testing
Selection, usage, and applicability depend on:
- the specific domain (domain knowledge is required!)
- the software technology used
- the test requirements: required test intensity, quality criteria, risks
- the existing test basis: specifications, documents, models
- project factors: constraints and opportunities

Who cares / should care about test design methods?
- Requirements engineers: requirements testing
- Architects: architecture testing
- Developers: unit testing
- Test engineers: all the rest - system testing, ...
Important: collaboration between the different stakeholders.

Test levels - example: V model with architecture testing
(Diagram.) The left side of the V descends from User Requirements through System Requirements and Architecture, Design down to the Unit Specification and Coding; the right side ascends through Unit Testing, Integration Testing, System Testing, and Acceptance Testing. Annotations: architecture interrogation (interviews, interactive workshops); evaluation, prototyping, simulation; test case design based on analysis, reviews, previews, inspections; load model specification; static testing; code quality; management.
Architecture testing = any testing of architecture and architectural artifacts.

Test design methods - Preventive Testing - TDD
"Preventive testing is built upon the observation that one of the most effective ways of specifying something is to describe (in detail) how you would accept (test) it if someone gave it to you." David Gelperin, Bill Hetzel (<1990)
"Given any kind of specification for a product, the first thing to develop isn't the product, but how you'd test the product. Don't start to build a product till you know how to test it." Tom Gilb
"The act of designing tests is one of the most effective bug preventers known." Boris Beizer, 1983

Test design methods - Test basis
Selection, usage, and applicability of test design methods depend on the existing test basis: specifications, documents, models. Therefore, any person who is involved in any specification activity should know about test design methods, at least a little.
Example - Architect: Specify your architecture (especially the dynamic behavior) by using the right models, formats, and notations to provide an adequate test basis and to enable more effective and more efficient testing. BTW, how can you do this adequately without knowing anything about test design methods?

Test design methods - Risks
Test design methods address risks - see especially the fault-based test design methods. Test design methods are a rich source describing what might go wrong in the product or system. They are very helpful
- to identify and analyze risks w.r.t. architecture, software technologies, etc.
- to build in better design for testability (DFT)
Test design methods are an integral part of the testing strategy. BTW, how can you do this adequately without knowing anything about test design methods?

What is a Test Case?
"A set of input values, execution preconditions, expected results and execution postconditions, developed for a particular objective or test condition, such as to exercise a particular program path or to verify compliance with a specific requirement." ISTQB 2007, IEEE 610
A test case should include:
- unique identification: who am I?
- test goal, test purpose: why? link to requirements
- test conditions/requirements: what? coverage!!!
- preconditions: system state, environmental conditions
- test data: inputs, data, actions
- execution conditions: constraints, dependencies
- expected results: oracles, arbiters, verdicts, coverage, traces
- postconditions: system state, expected side effects, expected invariants, traces, environmental conditions
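These fields map naturally onto a small record type. A minimal sketch in Python (not from the original slides); the class name, field names, and sample values are illustrative assumptions:

```python
from dataclasses import dataclass, field

# Hypothetical record for the ISTQB / IEEE 610 test case fields listed above.
@dataclass
class TestCase:
    identifier: str                  # unique identification: who am I?
    goal: str                        # test goal / purpose: why? link to requirements
    conditions: list[str]            # test conditions / requirements: what? coverage
    preconditions: list[str]         # system state, environmental conditions
    test_data: dict[str, object]     # inputs, data, actions
    execution_conditions: list[str]  # constraints, dependencies
    expected_results: list[str]      # oracles, arbiters, verdicts
    postconditions: list[str] = field(default_factory=list)

tc = TestCase(
    identifier="TC-042",
    goal="Verify that login rejects an expired password (REQ-17)",
    conditions=["REQ-17"],
    preconditions=["user 'alice' exists", "alice's password has expired"],
    test_data={"user": "alice", "password": "old-secret"},
    execution_conditions=["authentication service reachable"],
    expected_results=["login refused", "error message shown"],
    postconditions=["no session created"],
)
print(tc.identifier, "-", tc.goal)
```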

Test design methods and test cases
Test design methods mainly help us to identify the test data (inputs, data, actions), i.e. especially the input values for the test cases, and hopefully provide some information about the expected results (oracles, arbiters, verdicts, coverage, traces), at least to some extent. Typically, test design methods alone cannot supply
- preconditions: system state, environmental conditions
- execution conditions: constraints, dependencies
- expected results: oracles, arbiters, verdicts, coverage, traces
- postconditions: system state, expected side effects, expected invariants, traces, environmental conditions
Generally these items must be determined from the test basis, depending on the context: project, domain, requirements, constraints.

Example
There are always too many test cases...

Examples - Demo
Microsoft PowerPoint, Microsoft Word 2003

Test effectiveness and formal (systematic) test design
There are studies showing advantages of systematic test design. There are also studies showing advantages of random testing. But do you really want to design your test cases only randomly?
"Formal test design was almost twice as effective in defect detection per test case as compared to expert (exploratory) type testing, and much more effective compared to checklist type testing." Bob Bartlett, SQS UK, 2006

Many testing books cover test design to some extent
- Boris Beizer: Software Testing Techniques
- Robert V. Binder: Testing Object-Oriented Systems: Models, Patterns, and Tools
- Lee Copeland: A Practitioner's Guide to Software Test Design
- Rick D. Craig, Stefan P. Jaskiel: Systematic Software Testing
- Tim Koomen et al.: TMap Next: For Result-driven Testing
- Glenford J. Myers: The Art of Software Testing
- Torbjörn Ryber: Essential Software Test Design
- James Whittaker: How to Break Software
- James Whittaker, Herbert Thompson: How to Break Software Security
- Standard for Software Component Testing by the British Computer Society Specialist Interest Group in Software Testing (BCS SIGIST) (see http://www.testingstandards.co.uk/)
There are many different training offerings by different providers.

Test design tools are typically focused and implement only a few test design methods - some examples:
- ATD Automated Test Designer (AtYourSide Consulting, http://www.atyoursideconsulting.com/): cause-effect graphing
- BenderRBT (Richard Bender, http://www.benderrbt.com/): cause-effect graphing, quick design (orthogonal pairs)
- CaseMaker (Diaz & Hilterscheid, http://www.casemaker.de/): business rules, equivalence classes, boundaries, error guessing, pairwise combinations, and element dependencies
- CTE (Razorcat, http://www.ats-software.de/): classification tree editor
- MaTeLo (Alltec, http://www.alltec.net/): Markov chains
- Qtronic (Conformiq, http://www.conformiq.com/): state-based testing
- Reactis (Reactive Systems, http://www.reactive-systems.com/): generation of test data from, and validation of, Simulink and Stateflow models
- Test Designer (Smartesting, http://www.smartesting.com/): state-based testing
- TestBench (Imbus, http://www.testbench.info/, http://www.imbus.de/): equivalence classes, work-flow / use-case-based testing

Problem statement
Starting from a risk-based testing strategy, an adequate test design is the key to effective and efficient testing. Automation of bad test cases is a waste of time and money!
There are many different test design methods, and they have been around for a long time (perhaps too many?), and a lot of books explain them in detail. There are different categorizations, classifications, and dimensions - namings, interpretations, and understandings of test design methods - which does not simplify their usage.
When we look into practice, we can see that often there is quite limited usage of these test design methods at all. What are the reasons behind that? How can we overcome this and improve our testing approaches?

Poster: Test Design Methods on One Page (1)
Idea: a systematic, structured, and categorized overview of different test design methods on one page.
- Focus more on using an adequate set of test design methods than on using only one single test design method in depth / perfection.
- Focus more on concrete usage of test design methods than on defining a few perfect test design methods in detail which are then not used in the project.
- Focus more on breadth than on depth: do not miss breadth because of too much depth.
- Do not miss the exploratory, investigative art of testing.

Poster: Test Design Methods on One Page (2)

Test Design Methods on One Page

Black-box (models, interfaces, data):
- Standards (e.g. ISO/IEC 9126, IEC 61508), norms, (formal) specifications, claims
- Requirements-based with traceability matrix (requirements <-> test cases)
- Use case-based testing (sequence diagrams, activity diagrams)
- CRUD (Create, Read, Update, Delete) (data cycles, database operations)
- Flow testing, scenario testing, soap opera testing
- User / operational profiles: frequency and priority / criticality (Software Reliability Engineering)
- Statistical testing (Markov chains)
- Random (monkey testing)
- Features, functions, interfaces
- Design by contract (built-in self test)
- Equivalence class partitioning
- Domain partitioning, category-partition method
- Classification-tree method
- Boundary value analysis
- Special values
- Test catalog / matrix for input values, input fields
- State-based testing (finite state machines)
- Cause-effect graphing
- Decision tables, decision trees
- Syntax testing (grammar-based testing)
- Combinatorial testing (pair-wise, orthogonal / covering arrays, n-wise)
- Time cycles (frequency, recurring events, test dates)
- Evolutionary testing

Grey-box:
- Dependencies / relations between classes, objects, methods, functions
- Dependencies / relations between components, services, applications, systems
- Communication behavior (dependency analysis)
- Trace-based testing (passive testing)
- Protocol-based (sequence diagrams, message sequence charts)

White-box (internal structure, paths):
- Control flow-based coverage (specification-based, model-based, code-based): statements (C0), nodes; branches (C1), transitions, links, paths; conditions, decisions (C2, C3); elementary comparison (MC/DC); interfaces (S1, S2)
- Static metrics: cyclomatic complexity (McCabe); metrics (e.g. Halstead)
- Data flow-based: read / write access; def / use criteria

Positive, valid cases: normal, expected behavior
Negative, invalid cases: invalid, unexpected behavior; error handling; exceptions

Fault-based / risk-based:
- Systematic failure analysis (Failure Mode and Effect Analysis, Fault Tree Analysis)
- Attack patterns (e.g. by James A. Whittaker)
- Error catalogs, bug taxonomies (e.g. by Boris Beizer, Cem Kaner)
- Bug patterns: standard, well-known bug patterns or produced by a root cause analysis
- Bug reports
- Fault model dependent on the used technology and the nature of the system under test
- Test patterns (e.g. by Robert Binder), questioning patterns (Q-patterns by Vipul Kocher)

Ad hoc, intuitive, based on experience, checklists:
- Error guessing
- Exploratory testing, heuristics, mnemonics (e.g. by James Bach, Michael Bolton)
- Fault injection
- Mutation testing

Regression (selective retesting):
- Retest all
- Retest by risk, priority, severity, criticality
- Retest by profile, frequency of usage, parts which are often used
- Retest changed parts
- Retest parts that are influenced by the changes (impact analysis, dependency analysis)

Key: categorization - methods, paradigms, techniques, styles, and ideas to create a test case; effort / difficulty / resulting test intensity (5 levels).

Poster: Test Design Methods on One Page (3)
Categories of test design methods are orthogonal and independent in some way but should be combined appropriately. The selection of the test design methods used depends on many factors, for example:
- Requirements of the system under test and the required quality
- Requirements for the tests - the quality of the tests, i.e. the required intensity and depth of the tests
- Testing strategy: effort / quality of the tests, distribution of the testing in the development process
- Existing test basis: specifications, documents, models
- Problem to be tested (domain) or rather the underlying question (use case)
- System under test or component under test
- Test level / test step
- Technologies used (software, hardware)
- Suitable tool support: for some methods absolutely required

Poster: Test Design Methods on One Page (4)
The effort / difficulty of the test design methods, or rather the resulting test intensity, is subdivided into 5 levels:
- very low, simple
- low
- medium
- high
- very high, complex
This division into levels depends on the factors given above for the selection of the test design methods and can therefore only be used as a first hint and guideline. A test design method may also be applied on a continuum, from intuitive use up to 100% complete use, as required. In addition, describe every test design method on one page to explain its basic message and intention.

Guidelines and experiences (1)
For beginners:
- Perhaps you are confused by the many test design methods.
- Start simple, step by step.
- Ask for help and advice from an experienced colleague, coach, or consultant.
For advanced, experienced testers (and developers!):
- Check your current approach against this poster, think twice, and improve incrementally.
- Use the poster as a checklist for existing test design methods.
Selection of test design methods is dependent on the context! So, you should adapt the poster to your specific needs.

Guidelines and experiences (2)
Pick up this poster and give it to every developer and tester in your team, or put it on the wall in your office, or make it the standard screensaver or desktop background for all team members, or even use the "Testing on the Toilet" approach by Google (see http://googletesting.blogspot.com/). The poster increases the visibility and importance of test design methods, especially also for developers, to improve unit testing. The poster facilitates a closer collaboration of testers and developers: you have something to talk about...

Summary
There exist many different methods for adequate test design, but in practice these test design methods are often used only sporadically and in a non-systematic way. The poster Test Design Methods on One Page, containing a systematic, structured, and categorized overview of test design methods, will help you to really get them used in practice in your projects. Do not miss breadth because of too much depth. This will result in better and smarter testing for testers, developers, and architects.

Backup - Details on test design methods

Requirements-based with traceability matrix
(Diagram: inventory tracking matrix.) The inventories are listed as rows (Requirements 1-3, Features 1-3, Use Cases 1-3, Objectives 1-3, Risks 1-3), the test cases as columns (Test Case 1 through Test Case 8), and each cell marks which test case covers which inventory item.
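Such a matrix is also easy to represent and query programmatically, e.g. to find uncovered inventory items. A minimal sketch, assuming hypothetical inventory items and test case IDs; in practice the matrix would be derived from the requirements management tool:

```python
# Hypothetical traceability matrix: inventory item -> covering test cases.
traceability = {
    "Requirement 1": {"TC1", "TC3"},
    "Requirement 2": {"TC2"},
    "Requirement 3": set(),          # not yet covered!
    "Feature 1":     {"TC4", "TC5"},
    "Use Case 1":    {"TC6"},
    "Risk 1":        {"TC7", "TC8"},
}

# Coverage gap analysis: every inventory item needs at least one test case.
uncovered = [item for item, tcs in traceability.items() if not tcs]
print("Uncovered inventory items:", uncovered)  # ['Requirement 3']
```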

Use case-based testing - Scenario testing
A scenario is a hypothetical story used to help a person think through a complex problem or system. Based e.g. on transaction flows, use cases, or sequence diagrams. A specific, i.e. more extreme, kind of scenario testing is the so-called soap opera testing.

Example: Soap opera testing
Ref.: Hans Buwalda: Soap Opera Testing, Better Software, February 2004

Soap opera testing - Test objectives
Corresponds to the traceability matrix.
Ref.: Hans Buwalda: Soap Opera Testing, Better Software, February 2004

Equivalence class partitioning and boundary values
Goal: create the minimum number of black-box tests needed while still providing adequate coverage. Two tests belong to the same equivalence class if you expect the same result (pass / fail) of each; testing multiple members of the same equivalence class is, by definition, redundant testing. Boundaries mark the point or zone of transition from one equivalence class to another. The program is more likely to fail at a boundary, so these are the best members of (simple, numeric) equivalence classes to use. More generally, you look to subdivide a space of possible tests into relatively few classes and to run a few cases of each. You'd like to pick the most powerful tests from each class.
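A minimal Python sketch illustrating both ideas on a hypothetical validator that accepts ages from 18 to 65; the function, the partition into classes, and the chosen values are illustrative assumptions, not from the slides:

```python
# Hypothetical system under test: accepts ages in the valid range [18, 65].
def accepts_age(age: int) -> bool:
    return 18 <= age <= 65

# Equivalence classes: one representative per class is enough in principle.
representatives = [
    (10, False),  # invalid: below the valid range
    (40, True),   # valid: inside the range
    (80, False),  # invalid: above the valid range
]

# Boundary value analysis: test at and next to each class boundary.
boundaries = [(17, False), (18, True), (65, True), (66, False)]

for age, expected in representatives + boundaries:
    assert accepts_age(age) == expected, f"unexpected result for age={age}"
print("all equivalence class and boundary tests passed")
```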

Example: a software thermostat
Ref.: Stuart Reid, EuroSTAR Conference 2006

Equivalence class partitioning and boundary value analysis with 2 parameters
(Diagram: a rectangular valid domain for two parameters a and b, bounded by a_min, a_max and b_min, b_max, with test points at and just beyond each boundary.)
Number of test cases for n parameters and one valid equivalence class: 4n + 1
Number of test cases for n parameters and one valid and one invalid equivalence class: 6n + 1
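The 4n + 1 count can be generated mechanically: for each of the n parameters, test min, min+1, max-1, and max while all other parameters stay at a nominal value, plus one all-nominal case. A small sketch with hypothetical parameter ranges:

```python
# Hypothetical parameter ranges: (min, max) for each of the n parameters.
ranges = {"a": (1, 100), "b": (10, 50)}  # n = 2

def bva_test_cases(ranges: dict[str, tuple[int, int]]) -> list[dict[str, int]]:
    """Standard boundary value analysis: 4n + 1 test cases."""
    nominal = {p: (lo + hi) // 2 for p, (lo, hi) in ranges.items()}
    cases = [dict(nominal)]  # the single all-nominal case
    for p, (lo, hi) in ranges.items():
        for value in (lo, lo + 1, hi - 1, hi):  # min, min+, max-, max
            case = dict(nominal)
            case[p] = value
            cases.append(case)
    return cases

print(len(bva_test_cases(ranges)))  # 4*2 + 1 = 9
```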

Example: Classification-tree method
Ref.: CTE XL, http://www.systematic-testing.com/

Example: State-based testing
(Diagram: a state transition diagram with states and transitions labeled Cond_i / action, and the corresponding state table.) Each table entry has the form S_new / X, where X is the action (or event) and S_new is the resulting new state; action N means do nothing.
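A minimal sketch of deriving tests from such a state table, assuming a hypothetical machine in the S_new / action notation described above; each entry yields one test step, and together the steps give transition coverage (provided each "given" state can be reached during test setup):

```python
# Hypothetical state table: (state, condition) -> (new_state, action).
# Action "N" means: do nothing.
transitions = {
    ("S1", "Cond_1"): ("S2", "A"),
    ("S1", "Cond_2"): ("S3", "C"),
    ("S2", "Cond_2"): ("S2", "N"),
    ("S2", "Cond_3"): ("S3", "B"),
    ("S3", "Cond_1"): ("S1", "D"),
}

def transition_tests(transitions):
    """Derive one test step per table entry (transition coverage)."""
    return [
        {"given": state, "when": cond, "then_state": new, "then_action": act}
        for (state, cond), (new, act) in transitions.items()
    ]

for step in transition_tests(transitions):
    print(step)
```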

Example: Cause-effect graphing
Requires dependencies between parameters and can get very complicated and difficult to implement.

Example: Combinatorial testing (pairwise testing)
Given a system under test with 4 parameters A, B, C, and D, where each parameter has 3 possible values:
- Parameter A: a1, a2, a3
- Parameter B: b1, b2, b3
- Parameter C: c1, c2, c3
- Parameter D: d1, d2, d3
A valid test input data set is e.g. {a1, b1, c1, d1}. Exhaustive testing would require 3^4 = 81 test cases. Only 9 test cases are already sufficient to cover all pairwise interactions of parameters.
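The claim that 9 test cases suffice can be checked directly with the classic L9 orthogonal array, a standard published design for four three-valued parameters (the concrete array below is not from the slides):

```python
from itertools import combinations, product

# L9 orthogonal array for 4 parameters with 3 levels each (levels 0, 1, 2).
L9 = [
    (0, 0, 0, 0), (0, 1, 1, 1), (0, 2, 2, 2),
    (1, 0, 1, 2), (1, 1, 2, 0), (1, 2, 0, 1),
    (2, 0, 2, 1), (2, 1, 0, 2), (2, 2, 1, 0),
]

# Verify: for every pair of parameter columns, every pair of values appears.
for c1, c2 in combinations(range(4), 2):
    covered = {(row[c1], row[c2]) for row in L9}
    assert covered == set(product(range(3), repeat=2))

print(f"{len(L9)} test cases cover all pairs; exhaustive would need {3**4}")
```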

Control flow-based
Unit/module, integration, system. Use it on different levels of abstraction: model, unit, integration, system.

Data flow-based
Defined / used paths:
- defined (d): for example, a value assigned to a variable, or a variable initialized
- used (u): for example, a variable used in a calculation or predicate; predicate-use (p-u), computation-use (c-u)
Test du-paths. Read / write access: data source and data sink. Use it on different levels of abstraction: model, unit, integration, system.
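To see why branch coverage (C1) is stronger than statement coverage (C0), consider this hypothetical function (an illustration, not from the slides): a single test executes every statement, yet the untaken else branch, where functionality is missing, is never observed:

```python
def member_price(price: float, is_member: bool) -> float:
    if is_member:
        price = price / 2  # members pay half
    # Bug: the non-member minimum handling fee was forgotten here,
    # i.e. the code that should run on the else path is missing.
    return price

# A single test executes every statement (100% statement / C0 coverage):
assert member_price(100.0, True) == 50.0
# ...but the is_member == False branch was never taken, so the missing
# non-member logic goes unnoticed. Branch (C1) coverage forces a second
# test such as member_price(100.0, False), which would expose the gap.
```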

Example: Heuristics and mnemonics*
- Boundary values
- CRUD (data cycles, database operations): Create, Read, Update, Delete
- HICCUPPS (oracle): History, Image, Comparable Products, Claims, User's Expectations, Product, Purpose, Statutes
- SF DePOT (San Francisco Depot) (product elements, coverage): Structure, Function, Data, Platform, Operations, Time
- CRUSSPIC STMPL (quality criteria): Capability, Reliability, Usability, Security, Scalability, Performance, Installability, Compatibility, Supportability, Testability, Maintainability, Portability, Localizability
- FDSFSCURA (testing techniques): Function testing, Domain testing, Stress testing, Flow testing, Scenario testing, User testing, Risk testing, Claims testing, Automatic Testing
- FCC CUTS VIDS (application touring): Feature tour, Complexity tour, Claims tour, Configuration tour, User tour, Testability tour, Scenario tour, Variability tour, Interoperability tour, Data tour, Structure tour
*Ref.: James Bach, Michael Bolton, Mike Kelly, and many more