Di 3.1. A Test Design Poster for Testers, Developers, and Architects. Peter Zimmerer


1 Di 3.1 · January 26-30, 2009, Munich, Germany · ICM - International Congress Centre Munich
A Test Design Poster for Testers, Developers, and Architects
Peter Zimmerer

2 Corporate Technology
A Test Design Poster for Testers, Developers, and Architects
OOP 2009, Munich, Germany
Peter Zimmerer, Principal Engineer
Siemens AG, CT SE
Corporate Technology, Corporate Research and Technologies
Software & Engineering, Development Techniques
81739 Munich, Germany
[email protected]
Copyright Siemens AG 2009. All rights reserved.

Contents
- Introduction
- Test design methods. Here: methods, paradigms, techniques, styles, and ideas to create, derive, select, or generate a test case
- Motivation: who cares about test design methods?
- Examples and references
- Problem statement
- Poster: Test Design Methods on One Page
- Guidelines and experiences
- Backup: details on test design methods

Page 2 · January 27, 2009 · Peter Zimmerer, CT SE

3 Introduction: Test design methods
- Good test design, i.e. high-quality test cases, is very important
- There are many different test design methods and techniques:
  - Static, dynamic
  - Black-box, white-box, grey-box
  - Based on fault model, experience, exploratory
  - Statistical (user profiles), random (monkey)
- The challenge is to adequately combine these methods depending on the given problem, domain, and requirements. This is art as well!
- Black-box test design methods are often based on models (model-based testing)

Some systematic methods for test design
Black-box (models, interfaces, data):
- Requirements-based (traceability matrix)
- Use case-based testing, scenario testing
- Design by contract
- Equivalence class partitioning
- Classification-tree method
- Boundary value analysis
- State-based testing
- Cause-effect graphing
- Decision tables, decision trees
- Combinatorial testing (n-wise)
White-box (internal structure, paths):
- Control flow testing
- Data flow testing
Selection, usage, and applicability depend on
- the specific domain (domain knowledge is required!)
- the software technology used
- test requirements: required test intensity, quality criteria, risks
- the existing test basis: specifications, documents, models
- project factors: constraints and opportunities

4 Who cares / should care about test design methods?
- Requirements engineers: requirements testing
- Architects: architecture testing
- Developers: unit testing
- Test engineers: all the rest: system testing, ...
- Important: collaboration between the different stakeholders

Test levels example: V model with architecture testing
[V-model figure: left side User Requirements, System Requirements, Architecture/Design, Unit Specification; right side Acceptance Testing, System Testing, Integration Testing, Unit Testing; Coding at the bottom. Annotations: architecture interrogation (interviews, interactive workshops); evaluation, prototyping, simulation; test case design based on the specifications; analysis, reviews, previews, inspections; load model specification; static testing; code quality management.]
Architecture testing = any testing of architecture and architectural artifacts

5 Test design methods: Preventive Testing, TDD
- "Preventive testing is built upon the observation that one of the most effective ways of specifying something is to describe (in detail) how you would accept (test) it if someone gave it to you." David Gelperin, Bill Hetzel (<1990)
- "Given any kind of specification for a product, the first thing to develop isn't the product, but how you'd test the product. Don't start to build a product till you know how to test it." Tom Gilb
- "The act of designing tests is one of the most effective bug preventers known." Boris Beizer, 1983

Test design methods: Test basis
- Selection, usage, and applicability of test design methods depend on the existing test basis: specifications, documents, models
- Therefore, any person who is involved in any specification activity should know about test design methods, at least a little
- Example, architect: specify your architecture (especially the dynamic behavior) using the right models, formats, and notations to provide an adequate test basis and enable more effective and more efficient testing
- BTW, how can you do this adequately without knowing anything about test design methods?

6 Test design methods: Risks
- Test design methods address risks (see especially fault-based test design methods)
- Test design methods are a rich source describing what might go wrong in the product or system
- Test design methods are very helpful
  - to identify and analyze risks w.r.t. architecture, software technologies, etc.
  - to build in better design for testability (DFT)
- Test design methods are an integral part of the testing strategy
- BTW, how can you do this adequately without knowing anything about test design methods?

What is a Test Case?
"A set of input values, execution preconditions, expected results and execution postconditions, developed for a particular objective or test condition, such as to exercise a particular program path or to verify compliance with a specific requirement." ISTQB 2007, IEEE 610
A Test Case should include
- unique identification: who am I?
- test goal, test purpose: why? Link to requirements
- test conditions/requirements: what? Coverage!
- preconditions: system state, environmental conditions
- test data: inputs, data, actions
- execution conditions: constraints, dependencies
- expected results: oracles, arbiters, verdicts, coverage, traces
- postconditions: system state, expected side effects, expected invariants, traces, environmental conditions
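The "a Test Case should include" checklist can be sketched as a small record type. This is only an illustration; the field names and example values below are invented, not from the talk:

```python
from dataclasses import dataclass, field

@dataclass
class TestCase:
    test_id: str                 # unique identification: "who am I?"
    purpose: str                 # test goal / purpose: "why?"
    requirement_ids: list        # test conditions: "what?" (link to requirements, coverage)
    preconditions: dict = field(default_factory=dict)        # system state, environment
    inputs: dict = field(default_factory=dict)               # test data: inputs, data, actions
    execution_conditions: dict = field(default_factory=dict) # constraints, dependencies
    expected_results: dict = field(default_factory=dict)     # oracle / verdict data
    postconditions: dict = field(default_factory=dict)       # expected side effects, invariants

# Hypothetical example instance.
tc = TestCase(
    test_id="TC-001",
    purpose="Verify login rejects an empty password",
    requirement_ids=["REQ-AUTH-3"],
    inputs={"user": "alice", "password": ""},
    expected_results={"verdict": "rejected"},
)
```

Note that a test design method typically only fills in the `inputs` (and with luck the `expected_results`); the remaining fields come from the test basis, as the next slide points out.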

7 Test design methods and test cases
- Test design methods mainly help us to identify the test data (inputs, data, actions), i.e. especially the input values for the test cases, and hopefully provide some information about the expected results (oracles, arbiters, verdicts, coverage, traces), at least to some extent
- Typically, test design methods alone cannot supply
  - preconditions: system state, environmental conditions
  - execution conditions: constraints, dependencies
  - expected results: oracles, arbiters, verdicts, coverage, traces
  - postconditions: system state, expected side effects, expected invariants, traces, environmental conditions
- Generally these items must be determined from the test basis, depending on the context: project, domain, requirements, constraints

Example: There are always too many test cases...

8 Examples: Demo
- Microsoft PowerPoint
- Microsoft Word 2003

Test effectiveness and formal (systematic) test design
- There are studies showing advantages of systematic test design. There are also studies showing advantages of random testing. But do you really want to design your test cases only randomly?
- "Formal test design was almost twice as effective in defect detection per test case as compared to expert (exploratory) type testing, and much more effective compared to checklist type testing." Bob Bartlett, SQS UK, 2006

9 Many testing books cover test design to some extent
- Boris Beizer: Software Testing Techniques
- Robert V. Binder: Testing Object-Oriented Systems: Models, Patterns, and Tools
- Lee Copeland: A Practitioner's Guide to Software Test Design
- Rick D. Craig, Stefan P. Jaskiel: Systematic Software Testing
- Tim Koomen et al.: TMap Next: For Result-driven Testing
- Glenford J. Myers: The Art of Software Testing
- Torbjörn Ryber: Essential Software Test Design
- James Whittaker: How to Break Software
- James Whittaker, Herbert Thompson: How to Break Software Security
- Standard for Software Component Testing by the British Computer Society Specialist Interest Group in Software Testing (BCS SIGIST)
- There are many different training offerings by different providers

Test design tools are typically focused and implement only a few test design methods. Some examples:
- ATD Automated Test Designer (AtYourSide Consulting): cause-effect graphing
- BenderRBT (Richard Bender): cause-effect graphing, quick design (orthogonal pairs)
- CaseMaker (Diaz & Hilterscheid): business rules, equivalence classes, boundaries, error guessing, pairwise combinations, and element dependencies
- CTE (Razorcat): classification tree editor
- MaTeLo (All4tec): Markov chains
- Qtronic (Conformiq): state-based testing
- Reactis (Reactive Systems): generation of test data from, and validation of, Simulink and Stateflow models
- Test Designer (Smartesting): state-based testing
- TestBench (Imbus): equivalence classes, work-flow / use-case-based testing

10 Problem statement
- Starting from a risk-based testing strategy, an adequate test design is the key to effective and efficient testing. Automation of bad test cases is a waste of time and money!
- Many different test design methods have been around for a long time (perhaps too many?), and a lot of books explain them in detail.
- There are different categorizations, classifications, and dimensions, namings, interpretations, and understandings of test design methods, which does not simplify their usage.
- When we look into practice, we see that these test design methods are often used only to a quite limited extent. What are the reasons behind that? How can we overcome this and improve our testing approaches?

Poster: Test Design Methods on One Page (1)
- Idea: a systematic, structured, and categorized overview of different test design methods on one page
- Focus more on using an adequate set of test design methods than on using only one single test design method in depth / to perfection
- Focus more on concrete usage of test design methods than on defining a few perfect test design methods in detail which are then not used in the project
- Focus more on breadth than on depth: do not miss breadth because of too much depth
- Do not miss the exploratory, investigative art of testing

11 Test Design Methods on One Page (poster)

Black-box (models, interfaces, data):
- Standards (e.g. ISO/IEC 9126, IEC 61508), norms, (formal) specifications, claims
- Requirements-based with traceability matrix (requirements test cases)
- Use case-based testing (sequence diagrams, activity diagrams)
- CRUD (Create, Read, Update, Delete) (data cycles, database operations)
- Flow testing, scenario testing, soap opera testing
- User / operational profiles: frequency and priority / criticality (Software Reliability Engineering)
- Statistical testing (Markov chains)
- Random (monkey testing)
- Features, functions, interfaces
- Design by contract (built-in self test)
- Equivalence class partitioning
- Domain partitioning, category-partition method
- Classification-tree method
- Boundary value analysis
- Special values
- Test catalog / matrix for input values, input fields
- State-based testing (finite state machines)
- Cause-effect graphing
- Decision tables, decision trees
- Syntax testing (grammar-based testing)
- Combinatorial testing (pair-wise, orthogonal / covering arrays, n-wise)
- Time cycles (frequency, recurring events, test dates)
- Evolutionary testing

Grey-box:
- Dependencies / relations between classes, objects, methods, functions
- Dependencies / relations between components, services, applications, systems
- Communication behavior (dependency analysis)
- Trace-based testing (passive testing)
- Protocol-based (sequence diagrams, message sequence charts)

White-box (internal structure, paths; specification-based, model-based, code-based):
- Control flow-based coverage: statements (C0), nodes; branches (C1), transitions, links, paths; conditions, decisions; elementary comparison (MC/DC); interfaces
- Static metrics: cyclomatic complexity (McCabe), metrics (e.g. Halstead)
- Data flow-based: read / write access, def / use criteria

Positive, valid cases: normal, expected behavior
Negative, invalid cases: invalid, unexpected behavior; error handling; exceptions

Fault-based, risk-based:
- Systematic failure analysis (Failure Mode and Effect Analysis, Fault Tree Analysis)
- Attack patterns (e.g. by James A. Whittaker)
- Error catalogs, bug taxonomies (e.g. by Boris Beizer, Cem Kaner)
- Bug patterns: standard, well-known bug patterns or produced by a root cause analysis
- Bug reports
- Fault model dependent on the used technology and nature of the system under test
- Test patterns (e.g. by Robert Binder), questioning patterns (Q-patterns by Vipul Kocher)

Ad hoc, intuitive, based on experience, checklists:
- Error guessing
- Exploratory testing, heuristics, mnemonics (e.g. by James Bach, Michael Bolton)

Fault injection, mutation testing

Regression (selective retesting):
- Retest all
- Retest by risk, priority, severity, criticality
- Retest by profile, frequency of usage, parts which are often used
- Retest changed parts
- Retest parts that are influenced by the changes (impact analysis, dependency analysis)

Key / categorization: methods, paradigms, techniques, styles, and ideas to create a test case; effort / difficulty / resulting test intensity (5 levels)

Poster: Test Design Methods on One Page (2)
- Categories of test design methods are orthogonal and independent in some way but should be combined appropriately.
- The selection of the test design methods used depends on many factors, for example:
  - Requirements of the system under test and the required quality
  - Requirements for the tests, i.e. the required intensity and depth of the tests
  - Testing strategy: effort / quality of the tests, distribution of the testing in the development process
  - Existing test basis: specifications, documents, models
  - Problem to be tested (domain) or rather the underlying question (use case)
  - System under test or component under test
  - Test level / test step
  - Used technologies (software, hardware)
  - Suitable tool support: for some methods absolutely required

12 Poster: Test Design Methods on One Page (3)
- The effort / difficulty of the test design methods, or rather the resulting test intensity, is subdivided into 5 levels:
  - very low, simple
  - low
  - medium
  - high
  - very high, complex
- This division into levels depends on the factors given above for the selection of the test design methods and therefore can only be used as a first hint and guideline.
- A test design method may also be used on a continuum from intuitive use up to 100% complete use, as required.
- In addition, describe every test design method on one page to explain its basic message and intention.

Guidelines and experiences (1)
- For beginners: perhaps you are confused by the many test design methods
  - Start simple, step by step
  - Ask for help and advice from an experienced colleague, coach, or consultant
- For advanced, experienced testers (and developers!): check your current approach against this poster, think twice, and improve incrementally
- Use the poster as a checklist for existing test design methods
- Selection of test design methods depends on the context! So you should adapt the poster to your specific needs.

13 Guidelines and experiences (2)
- Pick up this poster and give it to every developer and tester in your team, or put it on the wall in your office, or make it the standard screensaver or desktop background for all team members, or even use the "testing on the toilet" approach by Google
- The poster increases the visibility and importance of test design methods, especially also for developers, to improve unit testing
- The poster facilitates a closer collaboration of testers and developers: you have something to talk about...

Summary
- There exist many different methods for adequate test design.
- Looking into practice, these test design methods are often used only sporadically and in a non-systematic way.
- The poster "Test Design Methods on One Page", containing a systematic, structured, and categorized overview of test design methods, will help you to really get them used in practice in your projects. Do not miss breadth because of too much depth.
- This will result in better and smarter testing for testers, developers, and architects.

14 Backup: Details on test design methods

Requirements-based with traceability matrix: inventory tracking matrix
[Figure: a matrix with test cases (Test Case 1 ... Test Case 8) as columns and the inventories as rows: requirements (Requirement 1, 2, ...), features, use cases, objectives, and risks. Each cell marks which test case covers which inventory item.]
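The inventory tracking idea (each requirement, feature, use case, objective, or risk mapped to the test cases that cover it) can be sketched as a simple lookup; all item and test-case names below are hypothetical:

```python
# Hypothetical inventory items mapped to the test cases that cover them --
# a dictionary form of the inventory tracking matrix.
traceability = {
    "Requirement 1": {"TC-1", "TC-2"},
    "Requirement 2": {"TC-3"},
    "Feature 1": {"TC-2", "TC-4"},
    "Risk 1": set(),               # a coverage gap: no test case yet
}

# The matrix makes gaps visible: inventory items with no covering test case.
uncovered = [item for item, cases in traceability.items() if not cases]
print(uncovered)   # ['Risk 1']
```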

15 Use case-based testing: Scenario testing
- A scenario is a hypothetical story used to help a person think through a complex problem or system
- Based e.g. on transaction flows, use cases, or sequence diagrams
- A specific, i.e. more extreme, kind of scenario testing is the so-called soap opera testing

Example: Soap opera testing
Ref.: Hans Buwalda: Soap Opera Testing, Better Software, February 2004

16 Soap opera testing: Test objectives
- Corresponds to the traceability matrix
Ref.: Hans Buwalda: Soap Opera Testing, Better Software, February 2004

Equivalence class partitioning and boundary values
- Goal: create the minimum number of black-box tests needed while still providing adequate coverage.
- Two tests belong to the same equivalence class if you expect the same result (pass / fail) from each. Testing multiple members of the same equivalence class is, by definition, redundant testing.
- Boundaries mark the point or zone of transition from one equivalence class to another. The program is more likely to fail at a boundary, so these are the best members of (simple, numeric) equivalence classes to use.
- More generally, you look to subdivide a space of possible tests into relatively few classes and to run a few cases of each. You'd like to pick the most powerful tests from each class.
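A minimal sketch of the technique, assuming a hypothetical validation function `accept_age` whose valid range is 1..100 inclusive:

```python
def accept_age(age: int) -> bool:
    """Hypothetical system under test: valid ages are 1..100 inclusive."""
    return 1 <= age <= 100

# One representative per equivalence class (below / inside / above the range)...
class_representatives = {-5: False, 50: True, 200: False}
# ...plus the boundary values, where off-by-one defects tend to cluster.
boundary_values = {0: False, 1: True, 100: True, 101: False}

for value, expected in {**class_representatives, **boundary_values}.items():
    assert accept_age(value) == expected
```

Seven tests cover the whole integer input space at the points where it is most likely to break; any further value from the same classes would, by the argument above, be redundant.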

17 Example: a software thermostat
Ref.: Stuart Reid, EuroSTAR Conference 2006

Equivalence class partitioning and boundary value analysis with two parameters
(Figure: two parameters a and b with boundaries a_min, a_max, b_min, b_max and test points on and around the boundaries.)
Number of test cases for n parameters and one valid equivalence class: 4n + 1
Number of test cases for n parameters and one valid and one invalid equivalence class: 6n + 1
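Single-fault boundary value analysis varies one parameter at a time around its boundaries while the others stay at a nominal value, which is where the 4n + 1 count (and 6n + 1 once invalid values just outside the boundaries are added) comes from. A minimal sketch, with parameter names and ranges chosen only for illustration:

```python
def bva_cases(ranges, robust=False):
    """Single-fault boundary value analysis: vary one parameter at a time
    around its boundaries while all others stay at a nominal mid value.
    ranges: dict name -> (min, max). Yields dicts of input values."""
    nominal = {n: (lo + hi) // 2 for n, (lo, hi) in ranges.items()}
    yield dict(nominal)                    # the single all-nominal case
    for name, (lo, hi) in ranges.items():
        points = [lo, lo + 1, hi - 1, hi]  # min, min+, max-, max
        if robust:
            points += [lo - 1, hi + 1]     # invalid points just outside
        for p in points:
            case = dict(nominal)
            case[name] = p
            yield case

ranges = {"a": (0, 100), "b": (10, 50)}    # n = 2 parameters
assert len(list(bva_cases(ranges))) == 4 * 2 + 1               # 4n + 1 = 9
assert len(list(bva_cases(ranges, robust=True))) == 6 * 2 + 1  # 6n + 1 = 13
print("4n+1 =", len(list(bva_cases(ranges))))  # → 4n+1 = 9
```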

18 Example: Classification-tree method
Ref.: CTE XL

Example: State-based testing
(Figure: a state transition diagram and the equivalent state table; transitions are triggered by conditions Cond_1, Cond_2, ... and produce the actions A, B, C, D, or N.)
Each table entry has the form S_new / X, where X is the action (or event) and S_new is the resulting new state; the action N means do nothing.
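The state-table notation above translates directly into executable state-based tests: encode the table as a mapping and drive the machine with a condition sequence that exercises every transition (all-transitions coverage). The states, conditions, and actions below are illustrative, not taken from the slide's figure.

```python
# Minimal state machine as a transition table:
# (state, condition) -> (new_state, action); "N" means do nothing,
# matching the slide's S_new / X notation.
TABLE = {
    ("S1", "Cond_1"): ("S2", "A"),
    ("S2", "Cond_2"): ("S3", "B"),
    ("S2", "Cond_3"): ("S1", "N"),
    ("S3", "Cond_1"): ("S1", "C"),
}

def run(state, conditions):
    actions = []
    for cond in conditions:
        state, action = TABLE[(state, cond)]
        if action != "N":
            actions.append(action)
    return state, actions

# One test sequence that exercises every transition at least once.
SEQ = ["Cond_1", "Cond_3", "Cond_1", "Cond_2", "Cond_1"]
state, actions = run("S1", SEQ)
assert state == "S1"
assert actions == ["A", "A", "B", "C"]

# Check that the sequence really achieved all-transitions coverage.
covered, s = set(), "S1"
for cond in SEQ:
    covered.add((s, cond))
    s = TABLE[(s, cond)][0]
assert covered == set(TABLE)
print("all", len(TABLE), "transitions covered")
```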

19 Example: Cause-effect graphing
Requires dependencies between parameters and can get very complicated and difficult to implement.

Example: Combinatorial testing (pairwise testing)
Given a system under test with parameters A, B, C, and D. Each parameter has 3 possible values:
Parameter A: a1, a2, a3
Parameter B: b1, b2, b3
Parameter C: c1, c2, c3
Parameter D: d1, d2, d3
A valid test input data set is e.g. {a1, b1, c1, d1}. Exhaustive testing would require 3^4 = 81 test cases. Only 9 test cases are already sufficient to cover all pairwise interactions of the parameters.
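The claim that 9 test cases suffice for 4 parameters with 3 values each can be checked mechanically: the classic L9 orthogonal array hits every value pair for every parameter pair. A sketch (levels encoded as 0, 1, 2 instead of a1, a2, a3, etc.):

```python
from itertools import combinations, product

# The classic L9 orthogonal array: 9 rows, 4 parameters, 3 levels each.
L9 = [
    (0, 0, 0, 0), (0, 1, 1, 1), (0, 2, 2, 2),
    (1, 0, 1, 2), (1, 1, 2, 0), (1, 2, 0, 1),
    (2, 0, 2, 1), (2, 1, 0, 2), (2, 2, 1, 0),
]

# Verify: every pair of values for every pair of parameters occurs at
# least once — full pairwise (2-wise) coverage with 9 tests instead of
# 3**4 = 81 exhaustive combinations.
for i, j in combinations(range(4), 2):
    seen = {(row[i], row[j]) for row in L9}
    assert seen == set(product(range(3), repeat=2))

print(len(L9), "tests give full pairwise coverage (vs", 3**4, "exhaustive)")
```

For larger or irregular parameter spaces, greedy tools (e.g. pairwise test generators) produce near-minimal covering arrays instead of a fixed orthogonal array.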

20 Control flow-based
Unit/module, integration, system: use it on different levels of abstraction: model, unit, integration, system.

Data flow-based: defined / used paths
defined (d): for example, a value is assigned to a variable, or a variable is initialized
used (u): for example, a variable is used in a calculation or in a predicate; predicate-use (p-u), computation-use (c-u)
Test du-paths (definition-use paths).
Read / write access: data source and data sink.
Use it on different levels of abstraction: model, unit, integration, system.
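Control flow-based test design can be made concrete with a hand-instrumented function: each branch records its id, so a test run can be checked for branch coverage. The function and branch names are illustrative, not from the slides.

```python
# Hand-instrumented function: each branch records its id in `hit`, so a
# test suite can be checked for branch (decision) coverage.
hit = set()

def classify(x):
    if x < 0:
        hit.add("neg")
        return "negative"
    else:
        hit.add("nonneg")
        if x == 0:
            hit.add("zero")
            return "zero"
        hit.add("pos")
        return "positive"

ALL_BRANCHES = {"neg", "nonneg", "zero", "pos"}

# Three control flow-based tests suffice to execute every branch once.
assert classify(-5) == "negative"
assert classify(0) == "zero"
assert classify(3) == "positive"
assert hit == ALL_BRANCHES
print("branch coverage:", len(hit), "/", len(ALL_BRANCHES))
```

In practice a coverage tool does the instrumentation; the point here is only how a branch-coverage criterion drives the selection of test inputs.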

21 Example: Heuristics and mnemonics*
Boundary values
CRUD (data cycles, database operations): Create, Read, Update, Delete
HICCUPPS (oracles): History, Image, Comparable Products, Claims, User's Expectations, Product, Purpose, Statutes
SF DePOT ("San Francisco Depot") (product elements, coverage): Structure, Function, Data, Platform, Operations, Time
CRUSSPIC STMPL (quality criteria): Capability, Reliability, Usability, Security, Scalability, Performance, Installability, Compatibility, Supportability, Testability, Maintainability, Portability, Localizability
FDSFSCURA (testing techniques): Function testing, Domain testing, Stress testing, Flow testing, Scenario testing, User testing, Risk testing, Claims testing, Automatic testing
FCC CUTS VIDS (application touring): Feature tour, Complexity tour, Claims tour, Configuration tour, User tour, Testability tour, Scenario tour, Variability tour, Interoperability tour, Data tour, Structure tour
*Ref.: James Bach, Michael Bolton, Mike Kelly, and many more
