Review of SW Testing Techniques & Software Testing Strategies (Chapter 18)



Review of SW Testing Techniques (Chapter 17) 2

Software Testing Techniques Provide systematic guidance for designing tests that: Exercise the internal logic of a program (white-box test case design techniques) Exercise the input and output requirements of a program (black-box) The goal is to uncover errors, bugs, misunderstandings of requirements, etc. 3

Software Testing Techniques Execute the program before the customer does, to reduce the number of errors detected by customers. In order to find the highest possible number of errors, software testing techniques must be used 4

Software Testing Techniques Constructive phases: Analysis (produces the requirements spec.), Design (produces the design document), Implementation (produces the code). Destructive phase: Test (produces the test cases). 5

What is testing and why do we do it? Testing is the filter to catch defects before they are discovered by the customer Every time the program is run by a customer, it generates a test-case. We want our test cases to find the defects first. Software development is a human activity with huge potential for errors Testing before release helps assure quality and saves money 6

Test Cases Design Test cases should be designed to have the highest likelihood of finding problems Can test by either: Black-box - using the specifications of what the software should do Tests are derived from the I/O specification. Used in most functional tests. Other names: data-driven, input/output-driven. White-box - testing internal paths and workings of the software Examine internal program structure and derive tests from an examination of the program's logic. Used to develop test cases for unit and integration testing Other names: Glass-box, Logic-driven, Structural. 7

White-Box Testing Uses the control structure of the program/design to derive test cases We can derive test cases that: Guarantee that all independent paths within a module have been visited at least once. Exercise all logical decisions on their TRUE or FALSE sides Execute all loops at their boundaries 8

A few White-Box Strategies Statement Requires each statement of the program to be executed at least once. Branch Requires each branch to be traversed at least once. Multi-condition Requires each condition in a branch to be evaluated. 9

More White-Box Strategies Basis Path Execute all independent control-flow paths through the code. Based on graph theory: Thomas McCabe's cyclomatic complexity, V(G) = #edges - #nodes + 2. Cyclomatic complexity is a SW metric that measures the complexity of a program; the larger V(G), the more complex. Data Flow Selects test data based on the locations of the definitions and uses of variables. 10

Statement Coverage The criterion is to require every statement in the program to be executed at least once Weakest of the white-box tests. Specified by the F.D.A. as the minimum level of testing. 11

Branch Coverage This criterion requires enough test cases such that each decision has a TRUE and FALSE outcome at least once. Another name: Decision coverage More comprehensive than statement coverage. 12

Branch Coverage Example:
void example(int a, int b, float *x) {
    if ((a > 1) && (b == 0))
        *x /= a;
    if ((a == 2) || (*x > 1))
        *x = *x + 1;
}
Test case(s) 1. a=2, b=0, x=3 2. a=3, b=1, x=1 13

Branch Coverage Test cases: 1. a=2, b=0 & x=3 covers path ace 2. a=3, b=1 & x=1 covers path abd What happens with data that takes abe, or acd? [Flow graph: decision a tests a > 1 && b == 0, with true edge c executing x /= a and false edge b; decision d tests a == 2 || x > 1, with true edge e executing x++.] 14

Basis Path Execute all independent flow paths through the code. Based on a flow graph. An independent flow path is one that introduces at least 1 new set of statements or conditions Must move along at least 1 new edge on the flow graph [Flow graph notation symbols for: sequence, if, while, until] 15

Corresponding Flow Graph [Flow graph of an averaging routine, nodes 1-13: node 1 initializes i = 1; total.input = total.valid = 0; sum = 0. The loop (nodes 2-3) runs while value[i] <> -999 and total.input < 100. Inside the loop: total.input++; if value[i] >= min && value[i] <= max then total.valid++ and sum = sum + value[i]; then i++; enddo. After the loop: if total.valid > 0 then aver = sum / total.valid, else aver = -999; done.] 16

Number of Paths V(G) = E - N + 2 = 17 - 13 + 2 = 6 The flow graph has six regions (R1-R6), so six independent paths are required for basis path testing. 17
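The routine behind this flow graph can be sketched in C. This is a reconstruction from the graph labels only (the array parameter form and the 0-based index are assumptions):

```c
/* Sketch of the averaging routine described by the flow graph above.
 * Reads values until the -999 sentinel or 100 inputs, averages those
 * within [min, max]; names taken from the graph labels. */
float average(const float value[], float min, float max) {
    int i = 0;                        /* 0-based here; the graph uses i = 1 */
    int total_input = 0, total_valid = 0;
    float sum = 0.0f, aver;

    while (value[i] != -999.0f && total_input < 100) {
        total_input++;
        if (value[i] >= min && value[i] <= max) {
            total_valid++;
            sum += value[i];
        }
        i++;
    }
    if (total_valid > 0)
        aver = sum / (float)total_valid;
    else
        aver = -999.0f;               /* sentinel: no valid input */
    return aver;
}
```

Tracing test data through this code is exactly how the six basis paths would be exercised, e.g. an empty input drives the path through aver = -999.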

Black-Box Testing Focuses on the functional requirements of the software without regard to the internal structure. Other names: data-driven, input/output-driven or behavior testing Used in most system-level testing: functional, performance, recovery, security & stress Tests are set up to exercise the full functional requirements of the system 18

Black Box Testing Finds errors in... Incorrect or missing functions (compare to white box) Interface errors Errors in external data structures Behavior or performance problems (combinations of input that make it behave poorly) Initialization and termination errors (sensitivity to certain inputs, e.g., performance) Black-box testing is done much later in the process than white-box testing. 19

A few Black-box Strategies Exhaustive input testing A test strategy that uses every possible input condition as a test case. Ideal, but not possible! Random Test cases are created from a pseudo-random generator. Broad spectrum, but not focused; hard to determine the result of the test. 20

Black-box Strategies Equivalence Partitioning A black-box testing method that divides the input domain of a program into classes of data from which test cases can be derived. Boundary Value Analysis A test case design technique that complements equivalence partitioning by selecting test cases at the edges of the class. 21

Boundary Value Analysis Experience shows that test cases exploring boundary conditions have a high payoff. E.g., most program errors occur in loop control. Different from equivalence partitioning: rather than selecting any element in the class, BVA selects tests at the edges of the class. In addition to input conditions, test cases can also be derived for output conditions. Similar to equivalence partitioning: first identify the equivalence classes, then look at the boundaries. 22
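As a sketch, consider a hypothetical routine that accepts values in the range [1, 100] (the function and its range are illustrative, not from the slides):

```c
/* Hypothetical routine under test: accepts values in the range [1, 100]. */
int in_range(int value) {
    return value >= 1 && value <= 100;
}

/* Equivalence partitioning gives one valid class [1, 100] and two
 * invalid classes (< 1 and > 100). Boundary value analysis then picks
 * tests at and just around each edge rather than arbitrary members:
 *   0 (just below min), 1 (min), 2 (just above min),
 *   99 (just below max), 100 (max), 101 (just above max). */
```

An off-by-one error such as writing `value > 1` instead of `value >= 1` is caught only by the boundary test at 1, which is exactly the payoff the slide describes.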

Test Case Documentation Minimum information for a test case: Identifier Input data Expected output data Recommended to also record the condition being tested (hypothesis). The format of the test case document changes depending on what is being tested. Always include design worksheets. 23

Simple Test Case Format Id Condition Input Data Expected 24

Test Case Formats Testing worksheet: Test Case Identifier (serial number) Condition (narrative or predicate) Input (stimulus data or action) Expected Output (results) Test results: Actual Output (results) Status (Pass/Fail) 25
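A testing-worksheet row maps naturally onto a record type; a minimal sketch (field names are illustrative, not prescribed by the slides):

```c
/* One row of the testing worksheet above, as a C record. */
struct test_case {
    int         id;         /* serial number              */
    const char *condition;  /* narrative or predicate     */
    const char *input;      /* stimulus data or action    */
    const char *expected;   /* expected output (results)  */
    const char *actual;     /* actual output (results)    */
    int         passed;     /* status: 1 = pass, 0 = fail */
};
```

A table of such records can then be iterated by a driver, which fills in `actual` and `passed` as each case runs.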

Use this Test Case format for your Project Test Name/Number Test Objective Test Description Test Conditions Expected Results Actual Results 26

ANSI/IEEE Test Case Outline Test-case-specification Identifier A unique identifier Test Items Identify and briefly describe the items and features to be exercised by this case Input Specifications Specify each input required to execute the test case. Output Specifications Specify all of the outputs and features required of the test items. 27

ANSI/IEEE Test Case Outline Environmental needs Hardware Software Other Special procedural requirements Describe any special constraints on the test procedures which execute this test case. Intercase dependencies List the IDs of test cases which must be executed prior to this test case 28

Software Testing Strategies Chapter 18 29

Software Testing Strategies A formal plan for your tests. What to test? What to test when new components are added to the system? When to start testing with customer? Testing is a set of activities that can be planned in advance and conducted systematically. 30

Software Testing Strategies Testing begins in the small and progresses to the large. Start with a single component and move upward until you test the whole system. Early tests detect design and implementation errors; as you move upward, you start to uncover errors in the requirements. 31

Software Testing Strategies Characteristics of testing strategies: Testing begins at the component level, for OO at the class or object level, and works outward toward the integration of the entire system. Different testing techniques, such as white-box and black-box, are appropriate at different times in the testing process. For small projects, testing is conducted by the developers. For large projects, an independent testing group is recommended. Testing and debugging are different activities, but debugging must be included in any testing strategy. 32

Testing can be used for verification and validation (V&V) of the Software: Verification - Are we building the product right? Does the software correctly implement a specific function? Validation - Are we building the right product? Has the software been built to the customer's requirements? Traceability The goal is to make sure that the software meets the organization's quality measures 33

Conventional SW Testing can be viewed as a series of four steps, each tied to a development phase: Unit Testing (Implementation) Integration Testing (Design) Validation Testing (Analysis) System Testing (System Engineering) 34

Unit Testing Testing focuses on each component unit individually. Unit testing heavily depends on white-box testing techniques. Tests of all components can be conducted in parallel. Use the component level design description, found in the design document, as a guide to testing. 35

Unit Testing Things to consider as you write test cases for each component: Interface to the module - does information flow in and out properly? Local data structures - do local structures maintain data correctly during the algorithm's execution? Boundary conditions - all boundary conditions should be tested. Look at loops, first and last record processed, limits of arrays. This is where you will find most of your errors. Independent paths - try to execute each statement at least once. Look at all the paths through the module. Error handling paths - trigger all possible errors and see that they are handled properly, messages are helpful and correct, exception-condition processing is correct and error messages identify the location of the true error. 36

Unit Testing Things to consider as you write test cases for each component: Stubs and Drivers: Additional SW used to help test components. Driver a main program that passes data to the component and displays or prints the results. Stub - a "dummy" sub-component or sub-program used to replace components/programs that are subordinate to the unit being tested. The stub may do minimal data manipulation, print or display something when it is called and then return control to the unit being tested. 37

Stubs and Drivers [Diagram: a Driver invokes the Module to be tested, which calls two Stubs; the Driver collects the Results.] 38
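A minimal sketch of the stub half of this picture (the unit, its subordinate component, and all names here are hypothetical):

```c
#include <stdio.h>

/* Stub: stands in for a subordinate price-lookup component that is
 * not ready yet; it does minimal work, reports it was called, and
 * returns a canned answer. */
float lookup_price(int item_id) {
    printf("stub: lookup_price(%d) called\n", item_id);
    return 10.0f;
}

/* Module to be tested: calls the subordinate component. */
float order_total(int item_id, int quantity) {
    return lookup_price(item_id) * (float)quantity;
}
```

The driver would then be a small `main()` that passes test data to `order_total()` and prints or checks the results, as in the diagram above.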

Conventional SW Testing can be viewed as a series of four steps, each tied to a development phase: Unit Testing (Implementation) Integration Testing (Design) Validation Testing (Analysis) System Testing (System Engineering) 39

Integration Testing Testing focuses on the design and the construction of the SW architecture. This is when we fit the units together. Integration testing includes both verification and program construction. Black-box testing techniques are mostly used. Some white-box testing may be used for major control paths. Look to the SW design specification for information to build test cases. 40

Integration Testing Different approaches to Integration: Top-down Bottom-up Sandwich Big Bang! 41

Top-down Approach Modules are integrated by moving down through the control hierarchy, beginning with the main program. Two approaches to Top-down integration: Depth-first integration Integrate all components on a major control path. Breadth-first integration Integrates all components at a level and then moves down. 42

Top-down Integration Steps Main is used as a test driver and stubs are used for all other components Replace subordinate stubs one at a time (the order depends on the approach used, depth or breadth) with actual components. Retest as each component is integrated. On completion of a set of tests, replace another stub with the real component. Do regression testing to make sure new modules didn't introduce new errors 43
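The stub-replacement step can be sketched with a compile-time switch (module names M1, M2 and the switch are illustrative, not from the slides):

```c
/* Top-down integration sketch: M1 is the real top-level module; its
 * subordinate M2 starts life as a stub and is swapped for the real
 * component when it is integrated. */
#ifdef USE_REAL_M2
int m2(int x) { return x * x; }       /* real component, added later   */
#else
int m2(int x) { (void)x; return 0; }  /* stub: minimal canned answer   */
#endif

int m1(int x) { return m2(x) + 1; }   /* top-level module under test   */
```

Each time a stub is replaced (e.g., recompiling with `-DUSE_REAL_M2`), the earlier tests are re-run, which is the regression step the slide describes.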

Depth-first Example [Module hierarchy diagram: M1 at the top with subordinate modules M2-M6; depth-first integration follows one control path down the hierarchy before moving to the next.] 44

Breadth-first integration [Same module hierarchy: M1 at the top with subordinate modules M2-M6; breadth-first integration adds all components at one level before moving down.] Stub as you go 45

Top-down Integration Advantages: Major control is tested first, since decision making is usually found in the upper levels of the hierarchy. If depth-first is used, a complete function of the SW will be available for demonstration. Disadvantages: Many stubs may be needed. Testing may be limited because stubs are simple; they do not perform the functionality of the true modules. 46

Integration Testing Different approaches to Integration: Top-down Bottom-up Sandwich Big Bang! 47

Bottom-up Approach Construction and testing begins at the lowest levels in the program structure. Implemented through the following steps: 1. Low-level components are combined into clusters or builds that perform a specific SW function. 2. A driver is written to control the test case input and output. 3. The cluster or build is tested. 4. Drivers are removed and the clusters or builds are combined, moving upward in the program structure. See textbook page 491. 48

Bottom-up Approach Advantage: No need for stubs. Disadvantage: Drivers are needed. 49

Integration Testing Different approaches to Integration: Top-down Bottom-up Sandwich Big Bang! 50

Sandwich approach The sandwich approach is the best of both worlds, and minimizes the need for drivers and stubs. A top-down approach is used for the top components and bottom-up is used for the lower levels. 51

Big Bang approach Put the whole system together and watch the fireworks! (An approach used by many undergrads.) Avoid this, and choose a strategy. 52

Regression Testing Performed each time the software changes, i.e., when a new component is added. To make sure that new changes didn't break existing functionality that used to work fine. Re-execute some tests to capture errors caused as side effects of the new changes. 53

Validation Testing Validation of the requirements 54

Validation Testing Testing ensures that all functional, behavioral and performance requirements of the system were satisfied. These were detailed in the SW requirements specification. Look in the validation criteria section. Only Black-box testing techniques are used. The goal is to prove conformity with the requirements. 55

Validation Testing For each test case, one of two possible conditions will exist: 1. The test for the function or performance criteria will conform to the specification and is accepted. 2. A deviation will exist and a deficiency is uncovered. This could be an error in the SW or a deviation from the specification: the SW works, but it does not do what was expected. 56

Validation Testing Alpha and Beta Testing A series of acceptance tests to enable the customer to validate all requirements. Conducted by the end users and not the developer/tester/system engineer. Alpha At the developer's site, by the customer. Beta At the customer's site, by the end user; live testing. 57

System Testing Verifies that the new SW system integrates with the existing environment. This may include other systems, hardware, people and databases. 58

System Testing Black-box testing techniques are used. Put your software in action with the rest of the system. Many types of errors could be detected, but who will accept responsibility? 59

System Testing Types Recovery Testing - force the SW to fail in a number of ways and verify that it recovers properly and in the appropriate amount of time. Security Testing - attempt to break the security of the system. Stress Testing- execute the system in a way that requires an abnormal demand on the quantity, frequency or volume of resources. Performance Testing- For real-time and embedded systems, test the run-time performance of the system. 60

System Testing Types Regression Testing again re-executing a subset of all test cases that have already been conducted to make sure we have not introduced new defects. 61

Next: OO Testing, Chapter 23 62