Test Management and Techniques




These slides are distributed under the Creative Commons License. In brief summary, you may make and distribute copies of these slides so long as you give the original author credit and, if you alter, transform or build upon this work, you distribute the resulting work only under a license identical to this one. For the rest of the details of the license, see http://creativecommons.org/licenses/by-sa/2.0/legalcode.

Outline
- Testing Fundamentals in 1 hour! (Black Box and Glass Box testing; Test Planning and Management)
- Test Execution
- Heuristic Risk-Based Testing
- Progress Tracking and Metrics

2. Test Management and Techniques
2.1 Testing Fundamentals in 1 hour!
2.2 Test Execution
2.3 Heuristic Risk-Based Testing
2.4 Progress Tracking and Metrics

General information, not ET specific! Testing Fundamentals in 1 hour!
- Black-Box and Glass-Box testing
- Test Strategy
- Test Planning
- Test Documentation and Structure
This is what you will learn about if you take a Testing Fundamentals class anywhere else...

General information, not ET specific! Black-Box Testing
- The tester pretends NOT to know how the program works
- Feed it interesting input (expected to lead to interesting output)
- Observe interesting outcomes
Kaner et al. 1999, Beizer 1990, Beizer 1995

General information, not ET specific! Glass-Box / White-Box Testing
The programmer will test based on access to, and understanding of, the source code:
- Focused testing: test the program in pieces
- Testing coverage: lines of code, branches, paths etc.
- Control flow: state transitions
- Data integrity: data manipulation, internal boundaries
- Algorithm-specific testing: numeric algorithms
Kaner et al. 1999, Beizer 1990, Beizer 1995
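To make the coverage idea concrete, here is a minimal sketch (the function and test names are illustrative, not from the slides): a function with two branches and the pair of tests needed to exercise both. A tool such as coverage.py, run with `coverage run --branch`, can report the resulting line and branch coverage.

```python
def classify(n: int) -> str:
    """Return 'negative' for n < 0, otherwise 'non-negative'."""
    if n < 0:
        return "negative"
    return "non-negative"

def test_negative_branch():
    # Exercises the True side of the 'if' and the first return.
    assert classify(-1) == "negative"

def test_non_negative_branch():
    # Exercises the False side of the 'if' and the fall-through return.
    assert classify(0) == "non-negative"

# Either test alone leaves one branch (and one line) uncovered; together
# they achieve 100% line and branch coverage of classify().
```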

General information, not ET specific! Test Strategy
How we plan to cover the product so as to develop an adequate assessment of quality. A good test strategy is:
- Product-specific
- Risk-focused
- Diversified
- Practical
From Rapid Software Testing, copyright 1996-2002 James Bach

General information, not ET specific! Test Plan
According to IEEE Std 829-1983: "A document describing the scope, approach, resources, and schedule of intended testing activities. It identifies test items, the features to be tested, the testing tasks, who will be doing each task, and any risks requiring contingency planning."
Kaner et al. 1999: "A test plan is a valuable tool to the extent that it helps you manage your testing project and find bugs. Beyond that, it is a diversion of resources."

General information, not ET specific! Test Plan, IEEE Std 829-1998 (IEEE Standard for Software Test Documentation):
- Test plan identifier
- Introduction
- Test items (What?)
- Features to be tested
- Features not to be tested
- Approach (How?)
- Item pass / fail criteria
- Suspension criteria and resumption criteria
- Test deliverables
- Testing tasks
- Environmental needs
- Responsibilities (Who?)
- Staffing and training needs
- Schedule (When?)
- Risks and contingencies
- Approvals
Alternatives can be found in Black 1999 and Kaner et al. 1999
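Since the IEEE 829 structure is essentially a checklist, it can be encoded directly. A minimal sketch (only the section names come from the standard as listed above; the helper function is a hypothetical illustration, not part of any standard tooling):

```python
# IEEE 829-1998 test plan sections, encoded as a checklist so a draft plan
# can be validated for completeness.

IEEE_829_SECTIONS = [
    "Test plan identifier", "Introduction", "Test items",
    "Features to be tested", "Features not to be tested", "Approach",
    "Item pass/fail criteria",
    "Suspension criteria and resumption criteria",
    "Test deliverables", "Testing tasks", "Environmental needs",
    "Responsibilities", "Staffing and training needs", "Schedule",
    "Risks and contingencies", "Approvals",
]

def missing_sections(plan: dict) -> list:
    """Return the sections a draft plan has left empty or omitted."""
    return [s for s in IEEE_829_SECTIONS if not plan.get(s)]

draft = {"Introduction": "Adds two numbers...", "Approach": "Risk-based"}
print(missing_sections(draft))  # everything except the two filled sections
```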

General information, not ET specific! Test Documentation (IEEE 829-1998)
- Test Plan (see previous slide)
- Test Design Specification: test approach and features to be tested
- Test Case Specification: input, expected output and environmental needs
- Test Procedure Specification: steps for executing a test case
- Test Item Transmittal Report: lists items for testing
- Test Log: chronological record of the execution of tests
- Test Incident Report: documents events that require investigation
- Test Summary Report: summarizes the results and provides evaluation

General information, not ET specific! Test documentation (diagram spanning static test planning, dynamic test planning, test execution and log keeping): objectives, policy and strategy for the test project; test plan; test item tree; test item spec.; test design spec.; test case spec. (application-based and function-based groups); test procedure spec.; test data; Test Item Transmittal Report; test log; event log; Incident Report (CR); error / change log; test summary report.

2. Test Management and Techniques
2.1 Testing Fundamentals in 1 hour!
2.2 Test Execution
2.3 Heuristic Risk-Based Testing
2.4 Progress Tracking and Metrics

Test Execution
- The problem of testing
- Test models, coverage and evaluation
- Overview of test techniques
- Boundaries and equivalence classes
- Bug reporting
- Test automation

The Universal Test Procedure: "Try it and see if it works."
- Learn about it: model it, speculate about it (Models)
- Configure it, operate it: know what to look for, see what's there (Coverage)
- Understand the requirements: identify problems, distinguish bad problems from not-so-bad problems (Evaluation)
From Rapid Software Testing, copyright 1996-2002 James Bach

All Product Testing is Something Like This (The Satisfice Testing Model): diagram relating the Project Environment, Test Techniques, Quality Criteria, Product Elements and Perceived Quality.
From Rapid Software Testing, copyright 1996-2002 James Bach

The Four Problems of Testing, mapped onto the Satisfice Testing Model: the Logistics Problem (project environment and test techniques), the Coverage Problem (product elements), the Evaluation Problem (quality criteria), and the Reporting Problem (perceived quality).
From Rapid Software Testing, copyright 1996-2002 James Bach

Models. A model is:
- A map of a territory
- A simplified perspective
- A relationship of ideas
- An incomplete representation of reality
- A diagram, list, outline, matrix...
No good test design has ever been done without models. The trick is to become aware of how you model the product, and learn different ways of modeling.
From Rapid Software Testing, copyright 1996-2002 James Bach

Coverage. Product coverage is the proportion of the product that has been tested. There are as many kinds of coverage as there are ways to model the product:
- Structural
- Functional
- Data
- Platform
- Operations
From Rapid Software Testing, copyright 1996-2002 James Bach

Evaluation. An evaluation strategy is how you know the product works. "...it works" really means "...it appeared to meet some requirement to some degree." Which requirement? Appeared how? To what degree? Perfectly? Just barely?
From Rapid Software Testing, copyright 1996-2002 James Bach

Evaluation: HICCUPP
- Consistent with History: present function behavior is consistent with past behavior.
- Consistent with our Image: function behavior is consistent with an image that the organization wants to project.
- Consistent with Comparable Products: function behavior is consistent with that of similar functions in comparable products.
- Consistent with Claims: function behavior is consistent with what people say it's supposed to be.
- Consistent with User's Expectations: function behavior is consistent with what we think users want.
- Consistent within Product: function behavior is consistent with behavior of comparable functions or functional patterns within the product.
- Consistent with Purpose: function behavior is consistent with apparent purpose.
From Rapid Software Testing, copyright 1996-2002 James Bach

Test Techniques. A test technique is a recipe for performing these tasks in order to reveal something worth reporting:
- Analyze the situation
- Model the test space
- Select what to cover
- Determine test oracles
- Configure the test system
- Observe the test system
- Evaluate the test result
Copyright 1996-2002 James Bach and Cem Kaner

General Test Techniques (1)
- Functional testing
- Domain testing
- Stress testing
- Flow testing
- User testing
- Risk testing
- Claims testing
- Random testing
- Regression testing
For any one of these techniques there's somebody, somewhere, who believes that it is the only way to test.
From Black Box Software Testing, copyright 1996-2002 Cem Kaner

General Test Techniques (2)
- Flow graphs and path testing
- Transaction-flow testing
- Data-flow testing
- Domain testing: boundaries and equivalence classes
- Syntax testing
- Logic-based testing
- States, state graphs, and transition testing
Beizer 1990

Five-fold Testing System. Five dimensions of testing:
- Testers: who does the testing.
- Coverage: what gets tested.
- Potential problems: why you're testing (what risk you're testing for).
- Activities: how you test; for example, exploratory testing.
- Evaluation: how to tell whether the test passed or failed.
Kaner et al. 2001b

People-based techniques: Who
- User testing
- Alpha testing
- Beta testing
- Bug bashes
- Subject-matter expert testing
- Paired testing
- Eat your own dog food
Kaner et al. 2001b

Coverage-based techniques: What
- Function testing
- Feature or function integration testing
- Menu tour
- Domain testing: equivalence class analysis, boundary testing, best representative testing
- Input field test catalogs or matrices: map and test all the ways to edit a field
- Logic testing
- State-based testing
- Path testing: statement and branch coverage
- Configuration coverage
- Specification-based testing
- Requirements-based testing
- Combination testing
Kaner et al. 2001b

Problem-based techniques: Why
- Risk-Based Testing (Amland 1999)
- Doing risk analysis for the purpose of finding errors (Bach 1999a)
- Constraint violations: input constraints, output constraints, computation constraints, storage (or data) constraints
Kaner et al. 2001b

Activity-based techniques: How
- Regression testing
- Scripted testing
- Smoke testing
- Exploratory testing
- Guerrilla testing
- Scenario testing
- Use case flow tests
- Installation testing
- Load testing
- Long-sequence testing
- Performance testing
Kaner et al. 2001b

Evaluation-based techniques
- Self-verifying data
- Comparison with saved results
- Comparison with a specification or other authoritative document
- Heuristic consistency: HICCUPP (see earlier slide)
- Oracle-based testing
Kaner et al. 2001b
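One of these techniques, comparison with saved results, is easy to sketch in code. This is a minimal, hypothetical illustration (the file name and function are assumptions, not from the slides): the first run saves the output for human approval, and later runs diff against that golden copy.

```python
from pathlib import Path

def matches_saved_result(actual: str, golden_path: str = "expected_report.txt") -> bool:
    """Compare the current output with a previously approved result."""
    golden = Path(golden_path)
    if not golden.exists():
        # First run: record the output so a human can review and approve it.
        golden.write_text(actual)
        return True
    return actual == golden.read_text()
```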

Boundaries and Equivalence Classes (Example Test Series). Here is the program's specification:
- This program is designed to add two numbers, which you will enter
- Each number should be one or two digits
- The program will print the sum
- Press Enter after each number
- To start the program, type ADDER
Before you start testing, do you have any questions about the spec?
From Black Box Software Testing, copyright 1996-2002 Cem Kaner

Boundaries and Equivalence Classes (Example Test Series), continued: The First Cycles of Testing. Basic strategy for dealing with new code:
1. Start with mainstream-level tests. Test the program with easy-to-pass values that will be taken as serious issues if the program fails.
2. Test broadly, rather than deeply. Check out all parts of the program quickly before focusing.
3. Look for more powerful tests. Once the program can survive the easy tests, put on your thinking cap and look systematically for challenges.
4. Pick boundary conditions. There will be too many good tests; you need a strategy for picking and choosing.
5. Do some exploratory testing. Run new tests every week, from the first week to the last week of the project.
From Black Box Software Testing, copyright 1996-2002 Cem Kaner

Equivalence Classes & Boundary Analysis. There are too many tests! There are 199 x 199 = 39,601 test cases just for values that might be valid:
- Definitely valid: 0 to 99
- Might be valid: -99 to -1
And there are infinitely many invalid cases: 100 and above, -100 and below, anything non-numeric. Some people want to automate these tests. Can you run all the tests? How will you tell whether the program passed or failed? We cannot afford to run every possible test. We need a method for choosing a few tests that will represent the rest. Equivalence analysis is the most widely used approach.
Testing Computer Software p. 4-5 and p. 125-132, Kaner 1999

Classical Equivalence Class and Boundary Analysis. To avoid unnecessary testing, partition (divide) the range of inputs into groups of equivalent tests, then treat an input value from an equivalence class as representative of the full group. Two tests are equivalent if we would expect that an error revealed by one would be revealed by the other. Boundaries mark the point or zone of transition from one equivalence class to another. The program is more likely to fail at a boundary, so boundary values are good members of equivalence classes to use.
Myers p. 45 and Black Box Software Testing, Kaner

Equivalence Classes & Boundary Analysis: Traditional Presentation (Myers). One input or output field; the valid values for the field fit within one (1) easily specified range.
- Valid values: -99 to 99
- Invalid values: < -99, > 99
From Black Box Software Testing, copyright 1996-2002 Cem Kaner

Equivalence Classes & Boundary Analysis: Building the Table (in practice). Relatively few programs will come to you with all fields fully specified; therefore, you should expect to learn what variables exist, and their definitions, over time. To build an equivalence class analysis over time, put the information into a spreadsheet. Start by listing variables, and add information about them as you obtain it. The table should eventually contain all variables: all input variables, all output variables, and any intermediate variables that you can observe. In practice, most tables that I've seen are incomplete; the best ones list all the variables and add detail for critical variables.
From Black Box Software Testing, copyright 1996-2002 Cem Kaner

Equivalence Classes & Boundary Analysis: Myers Boundary Table

Variable      | Valid Case Equivalence Classes | Invalid Case Equivalence Classes        | Boundaries and Special Cases        | Notes
First Number  | -99 to 99                      | >99; <-99; non-number; expressions      | 99, 100; -99, -100; /; :; 0; null entry |
Second Number | same as First                  | same as First                           | same as First                       |
SUM           | -198 to 198                    |                                         |                                     | Are there other sources of data for this variable? Ways to feed it bad data?

From Black Box Software Testing, copyright 1996-2002 Cem Kaner
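Translating the table into executable checks might look like the following sketch, assuming a hypothetical adder(first, second) wrapper around the ADDER program (the wrapper and its ValueError behaviour are assumptions for illustration):

```python
import pytest

def adder(first: int, second: int) -> int:
    """Stand-in for the ADDER program from the specification."""
    if not (-99 <= first <= 99) or not (-99 <= second <= 99):
        raise ValueError("each number must be one or two digits")
    return first + second

@pytest.mark.parametrize("first,second,expected", [
    (99, 99, 198),     # upper boundaries; also the SUM upper bound
    (-99, -99, -198),  # lower boundaries; also the SUM lower bound
    (0, 0, 0),         # special case from the table
])
def test_valid_boundaries(first, second, expected):
    assert adder(first, second) == expected

@pytest.mark.parametrize("first,second", [(100, 0), (-100, 0), (0, 100)])
def test_just_outside_the_boundaries(first, second):
    with pytest.raises(ValueError):
        adder(first, second)
```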

Bug Reporting / Advocacy. You are what you write.
- Bug reports are the primary work of testers
- Programmers rely on your reports for vital information
- Your advocacy drives the repair of the bugs you report
- Any bug report that you write is an advocacy document that calls for the repair of the bug
- Make your bug report an effective sales tool
Kaner et al. 2001b

Exercise: How do you advocate this bug? (Screenshot: searching for a book by title, author or ISBN.)

Test Automation
Kaner, Bach, Pettichord 2001b:
- Use automation when it advances the mission of testing
- Select your automation strategy based on your context
- A test tool is not a strategy
- Test automation is a significant investment
- Test automation is design and programming
Fewster & Graham 1999: "Automating chaos just gives faster chaos"

2. Test Management and Techniques
2.1 Testing Fundamentals in 1 hour!
2.2 Test Execution
2.3 Heuristic Risk-Based Testing
2.4 Progress Tracking and Metrics

Heuristic Risk-Based Testing. This kind of analysis is always available to you; no calculator required. A heuristic method is a useful method that doesn't always work: a checklist of open-ended questions, suggestions or guidewords. Two approaches:
- Inside-Out: What can go wrong here?
- Outside-In (predefined lists of risk): Quality Criteria Categories, Generic Risk Lists, Risk Catalogs
James Bach, 1999a

Heuristic Risk-Based Testing: Inside-Out. What can go wrong here? What risks are associated with this thing? Requires technical insight.
- Vulnerabilities: what weaknesses or possible failures?
- Threats: what situation might exploit a vulnerability and trigger a failure?
- Victims: who or what would be impacted by potential failures, and how badly?
James Bach, 1999a

A Possible Inside-Out Process. Sit down with one of the programmers / developers. Create a non-threatening, friendly atmosphere (coffee? tea? beer?). Start asking about what he is working on (you need to understand the technology well enough to ask questions!):
- Can he explain what the functions are supposed to do?
- Is he aware of how his function fits into the big picture? I.e., does he know all (the main) interfaces? And data flows?
- Does he know where to find the information he needs?
- Can he draw a (high-level) diagram of the process / flow?
- Is he aware of data being read (used), modified, deleted etc.?
- What kind of error / exception handling has been implemented?
- What are the complex and / or complicated part(s) of his functions? Why?
- What kind of unit testing has been / will be performed? How?
Look for uncertainty, contradictions, inconsistency etc. They are all indicators of potential problems: misunderstandings, incorrect implementation and possibly poor unit testing. Go and do some testing in these problem areas!

Heuristic Risk-Based Testing: Outside-In. What things are associated with this kind of risk? Use a predefined set of potential risks:
- Quality Criteria Categories
- Generic Risk Lists
- Risk Catalogs
James Bach, 1999a

Heuristic Risk-Based Testing
- Quality Criteria Categories: capability, reliability, usability, performance, installability, compatibility, supportability, testability, maintainability, portability, localizability
- Generic Risk List: complex, new, changed, upstream dependency, downstream dependency, critical, precise, popular, strategic, third-party, distributed, buggy, recent failure
- Risk Catalogues: see Testing Computer Software by Kaner, Falk and Nguyen
James Bach, 1999a

Communicating Heuristic RBT: Risk Watch List, Risk / Task Matrix, Component Risk Matrix.

Component Risk Matrix:
Component         | Risk   | Risk Heuristics
Printing          | Normal | Distributed, popular
Report Generation | Higher | New, strategic, third-party, complex, critical
Installation      | Lower  | Popular, usability, changed
Clipart Library   | Lower  | Complex

James Bach, 1999a
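Because a component risk matrix is just structured data, it can drive the test order directly. A minimal sketch using the matrix above (the ranking logic is an illustration, not from the source):

```python
# Rank components so the riskiest are tested first.
RISK_ORDER = {"Higher": 0, "Normal": 1, "Lower": 2}

components = [
    ("Printing", "Normal", "distributed, popular"),
    ("Report Generation", "Higher", "new, strategic, third-party, complex, critical"),
    ("Installation", "Lower", "popular, usability, changed"),
    ("Clipart Library", "Lower", "complex"),
]

for name, risk, heuristics in sorted(components, key=lambda c: RISK_ORDER[c[1]]):
    print(f"{risk:<6} {name}: {heuristics}")
# Report Generation prints first, Printing second, the two Lower items last.
```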

Other Risk Catalogues
- www.bugnet.com
- www.cnet.com (news)
- Testing Computer Software: includes Kaner, Falk and Nguyen's list of bugs
- Software Testing Techniques: includes Boris Beizer's taxonomy of bugs
- Computer-Related Risks, Peter G. Neumann
- Managing the Testing Process: includes Rex Black's list of common software failures

Web / E-business Risks
- No second chance; risks outside your control
- Unlimited potential users: anyone can visit
- Many potential points of failure
- No control over client platform or client configuration
- Which browser to support? Required plug-ins not available
- The context of web transactions is a risk area: cookies, network connections, firewalls
- Usability
- Internationalisation
- Etc...
Paul Gerrard

2. Test Management and Techniques
2.1 Testing Fundamentals in 1 hour!
2.2 Test Execution
2.3 Heuristic Risk-Based Testing
2.4 Progress Tracking and Metrics

- Establish a strong and supportive testing role.
- Re-visit your strategy and mission every day.
- Advocate for bugs and testability.
- Maintain three lists: risks, issues, coverage.
- Test with your team.
- Continuously report the status and dynamics of testing.
From Rapid Software Testing, copyright 1996-2002 James Bach

Metrics - Overview
- Defect analysis: defect density, defect aging, defect trends
- Planning and progress tracking: planned vs. actual, progress indicators, number of hours (days) remaining, Estimated to Complete (EtC)
- Iterative projects: trend analysis
- Coverage: identifies how complete the testing is (or will be)
- Defect Detection Percentage, DDP (overall test efficiency)
Further reading: Fenton (1997)
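Two of these metrics are simple enough to pin down in code. The definitions below are common ones rather than necessarily Amland's (defect density per KLOC and defect aging in days are assumptions for illustration):

```python
from datetime import date

def defect_density(defect_count: int, kloc: float) -> float:
    """Defects per thousand lines of code."""
    return defect_count / kloc

def defect_age_days(opened: date, closed: date) -> int:
    """Number of days a defect stayed open."""
    return (closed - opened).days

print(defect_density(45, 30.0))                              # 1.5 defects/KLOC
print(defect_age_days(date(2002, 3, 1), date(2002, 3, 15)))  # 14 days
```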

Planning and Progress Tracking (chart): on-line test cases completed; number of test cases plotted over time, with series for Planned, Executed and QAed.

Progress Indicators - To-be vs. Actual (charts): "To Be Fixed vs. Actually Fixed" plots the number of faults to be fixed against those actually fixed over time; "To Be Re-tested vs. Actually Re-tested" plots faults to be re-tested, actually re-tested, and rejected.

Progress Indicators - Hours (chart): number of hours for finding one fault and for fixing one; hours per fault for Test and Fix, plotted over time.

Estimated to Complete. ETC for system test based on:
- Number of hours testing per fault found
- Number of hours fixing per fault
- Number of faults found per function
- Number of fixes being rejected
- Number of remaining tests (functions to be tested)
(Chart: calculated ETC and actual hours over time; "Actual to Complete" vs. "Estimated to Complete" at time t.)
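The slide lists the inputs to ETC but not how to combine them, so the sketch below is one plausible combination and entirely an assumption: expected remaining faults times test-plus-fix hours per fault, inflated for rejected fixes.

```python
def estimate_to_complete(remaining_functions: int,
                         faults_per_function: float,
                         test_hours_per_fault: float,
                         fix_hours_per_fault: float,
                         rejection_rate: float) -> float:
    """One possible ETC model built from the inputs on the slide."""
    expected_faults = remaining_functions * faults_per_function
    # Rejected fixes come back for another fix-and-retest round.
    rework_factor = 1.0 + rejection_rate
    return expected_faults * (test_hours_per_fault + fix_hours_per_fault) * rework_factor

print(estimate_to_complete(40, 2.5, 3.0, 4.0, 0.15))  # 805.0 hours
```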

Iteration Trend Analysis (charts): verification points vs. days for three iterations. Iteration 1: approx. 7200 and approx. 5000; Iteration 2: approx. 6200 and approx. 4200; Iteration 3: approx. 6900 and approx. 5700.

Test planning and execution coverage. Identify units to be tested: use cases, use case flows, requirements, etc. Build and prioritise test cases (based on risk). Track (see the sketch below):
- Test preparation/planning: number of test cases built for each priority
- Test execution: number of tests executed for each priority, with status
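Tracking preparation and execution per priority reduces to counting. A minimal sketch with invented records (the priorities, statuses and data shape are assumptions for illustration):

```python
from collections import Counter

# Hypothetical test-case records: (priority, status).
test_cases = [
    ("High", "passed"), ("High", "failed"), ("High", "not run"),
    ("Medium", "passed"), ("Low", "not run"),
]

built = Counter(priority for priority, _ in test_cases)
executed = Counter(priority for priority, status in test_cases if status != "not run")

for priority in ("High", "Medium", "Low"):
    print(f"{priority}: built {built[priority]}, executed {executed[priority]}")
```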

Defect Detection Percentage

DDP = TotalNumberOfBugsInTest / (TotalNumberOfBugsInTest + TotalNumberOfBugsInProduction)
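A worked example of the formula above (numbers invented for illustration): if testing finds 90 bugs and 10 more later surface in production, DDP = 90 / (90 + 10) = 90%, i.e. testing caught 90% of the bugs that were there to find.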

Example: Requirements Coverage. Diagram generated from TestDirector, Mercury Interactive.

Risk - ordinal scale! Ordinal scale: the numbers are ordered, but no arithmetic can be applied. A risk exposure of 3 IS bigger than 2; we do NOT know HOW much bigger!
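A small illustration of why this matters (the values are invented): ordering operations on ordinal risk scores are meaningful, but arithmetic such as averaging is not, because the distance between adjacent scores is undefined.

```python
risks = {"Printing": 1, "Installation": 2, "Report Generation": 3}

print(max(risks, key=risks.get))         # valid: ordering works -> "Report Generation"
print(sum(risks.values()) / len(risks))  # 2.0 -- computable, but the number means nothing
```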

Ordinal vs. Linear Scales (chart): is taking the average OK?

Gilb's Law

Einstein...

Summary
- Testing Fundamentals in 1 hour! (Black Box and Glass Box testing; Test Planning and Management)
- Test Execution
- Heuristic Risk-Based Testing
- Progress Tracking and Metrics