PRESENTATION T1, Thursday, Nov 11, 1999
The New Frontier... Third Generation Software Testing Automation
Ed Kit
International Conference On Software Testing, Analysis & Review, NOV 8-12, 1999, BARCELONA, SPAIN
The New Frontier Third Generation Software Testing Automation Edward Kit www.sdtcorp.com sdt@sdtcorp.com
Essential Testing Challenges
1. How to design and document inspectable tests?
2. What is an effective test automation architecture?
3. How to integrate test design and automation?
Key Benefits:
- Better, more maintainable tests
- Automation achieved with fewer technical testers
- Reduced regression, function, and system testing costs
- Higher motivation for participants
Slide 2
Test Automation: Serious Problems
- Lack of an effective test automation architecture
- Lack of required competencies: test design, technical automation, the application
- Using capture/playback at the wrong time
- Lack of sufficient resources: not enough time for automation implementation, the ratio of testers to developers, dedicated capital equipment
Slide 3
Test Architecture Key Recommendations
- Create interfaces between key framework components
- Separate, yet integrate, test design and automation
- Recognize that the proper use of several tools is essential
- Allow for capture/playback tool independence
- Provide infrastructure for effective capture/playback
- Create one engine that can process all automated tests
- Customize framework components for the organization
- Evolve the architecture as test technology matures
Slide 4
Proven Test Architecture
- TestFrame is an example of a proven test architecture, created in 1994 by Hans Buwalda at CMG
- The TestFrame approach has been used successfully by hundreds of projects in Europe
- SDT successfully used TestFrame to test ReviewPro, a Web-based Enterprise Verification Application that brings automation to Technical Reviews and Inspections
- SDT has extended TestFrame to include the SDT Test Design Templates
- SDT and CMG are working together to evolve TestFrame
- TestFrame has been used for on-line, Web, batch, API, and embedded applications, for function, system, and acceptance testing
Reference: [Buwalda, 1998]
Slide 5
Automation Recommendation #1: Just Say No to Capture/Playback (C/P)!
Slide 6
Automation Recommendation #2: Separate and Bridge
- Test Design: create test cases using design techniques and spreadsheets
- Bridge: create user scenarios from the test cases using action words
- Test Automation: process the user scenarios
Slide 7
Automation Architecture Overview
Test Architecture -> Test Plan -> Feature Hierarchy -> Test Design -> Test Cases -> User Scenarios (via the Action Word Dictionary) -> Automation Design -> Automation Script -> Execute Tests -> Test Effectiveness
Slide 8
Automation Architecture - Tools That Can Help
- Test Architecture: TestFrame, Visio, Word (for architecture review: ReviewPro)
- Test Plan: Word, Visio
- Feature Hierarchy: Excel
- Test Design: Excel (for design review: ReviewPro)
- Test Cases: AETGWeb
- Action Word Dictionary: Word
- User Scenarios: Excel
- Automation Design: Word
- Automation Script: TestFrame, Capture/Playback Tool
- Execute Tests: TestFrame
- Test Effectiveness: Word
Slide 9
Common Design Techniques Summary
1. Equivalence Partitioning - single input conditions
2. Equivalence Partitioning - combinations
3. Boundary Value Analysis
4. Output Forcing
5. State Models
6. Error Guessing
References: [Jorgensen, 1995], [Kit, 1996]
Slide 10
Feature Hierarchy Spreadsheet
ReviewPro Key Features - Primary Forms - Reviewer's Log Form

  GUI Attributes                 WorkSheet   Priority
  Drop Down Lists:
    Disposition Code             DDDC        High
    Severity                     DDSE        High
    Status                       DDST        Med
    Entry Type                   DDET        Med
    Document Under Review        DDDU        High
  Edit Fields:
    Location                     EFLO        High
    Summary                      EFSU        Med
  Link Fields:
    Hot Doc Link                 LFHD        High
    General Link                 LFGL        High

Slide 11
Test Design Spreadsheet Template
  Matrix ID:        Feature Hierarchy:
  Matrix Summary:
  Author:           Date:           Version:
  Test Design:      Technique:      Feature Combination:
  Risk Analysis:    Impact:         Likelihood:
  Columns per test case: Test Case ID / Test Case Validity / Priority / Test Condition / Expected Results
Slide 12
Test Design Template Choices
Test Design:
- Technique: Equivalence Class | Boundary Value | Output Condition | Special Value | State Transition
- Feature Combination: Yes | No
Risk Analysis:
- Impact: High | Medium | Low | Unknown
- Likelihood: High | Medium | Low | Unknown
Test Case Validity: Valid | Invalid
Priority: High | Medium | Low
Reference: [Kit, 1999]
Slide 13
Risk Management: Failure Impact & Fault Likelihood
Failure Impact: How significant is the impact if the features addressed in this test design fail? For example: the system goes down, someone dies, the basic application fails, money is lost, a penalty is applied, users sue.
Fault Likelihood: How likely is it that people will make mistakes in the features addressed in this test design that lead to software faults and, in turn, to product failures? Indicators that increase the likelihood of faults getting into the system include: weakness in this part of the system architecture, inexperienced team members, a geographically distributed team, an aggressive schedule.
Slide 14
Test Design Example - The Classic Triangle Routine The routine accepts three integer values as input; these are interpreted as representing the lengths of the sides of a triangle. All sides must be at least 1 and have an upper limit of 200. The sides are entered as a comma-delimited list, e.g., 3, 4, 5 by the operator using a keyboard. The routine outputs a message that states whether the triangle is scalene (no sides equal), isosceles (two sides equal) or equilateral (all sides equal). The output message is one of the following strings: Equilateral, Isosceles, Scalene, NoTriangle, IllegalInput. Slide 15
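The specification above can be sketched directly in Python. The function name `classify_triangle` and the details of input handling are assumptions for illustration; the slide defines only the external behavior.

```python
def classify_triangle(line: str) -> str:
    """Classify a comma-delimited triple of sides per the specification.

    Returns one of: Equilateral, Isosceles, Scalene, NoTriangle,
    IllegalInput.
    """
    try:
        # Exactly three integer sides; anything else is illegal input.
        a, b, c = sorted(int(s) for s in line.split(","))
    except ValueError:
        return "IllegalInput"
    if not all(1 <= side <= 200 for side in (a, b, c)):
        return "IllegalInput"
    if a + b <= c:          # fails the triangle inequality
        return "NoTriangle"
    if a == b == c:
        return "Equilateral"
    if a == b or b == c:
        return "Isosceles"
    return "Scalene"
```

For example, `classify_triangle("3, 4, 5")` returns `"Scalene"`, and the non-numeric side in `"79, 24, null"` yields `"IllegalInput"`.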
Triangle Test Design
  Matrix ID: BFC                Feature Hierarchy: Triangle/Baseline
  Matrix Summary: Triangle Baseline Fundamental Cases
  Author: Ed Kit                Date: 4/10/99       Version: 2.5
  Test Design: Technique: Equiv. Class   Feature Combination: No
  Risk Analysis: Impact: High            Likelihood: Low

  ID     Validity  Priority  Side A  Side B  Side C  Test Description   Expected Results
  BFC01  Valid     High      100     100     100     Typical Equilat.   Output = Equilateral
  BFC02  Valid     High      100     100     10      Typical Isosceles  Output = Isosceles
  BFC03  Valid     High      30      40      50      Typical Scalene    Output = Scalene
  BFC04  Invalid   High      100     68      190     Typical Illegal    Output = NoTriangle
  BFC05  Invalid   High      79      24      null    Only Two Sides     Output = IllegalInput
  BFC06  Invalid   High      201     190     60      One Side Too Big   Output = IllegalInput

Slide 16
Triangle Boundary Value Analysis Test Design

  ID     Validity  Priority  Side A  Side B  Side C  Test Condition            Expected Results
  BVA01  Valid     High      1       1       1       All sides min             Output = Equilateral
  BVA02  Valid     High      200     200     200     All sides max             Output = Equilateral
  BVA03  Invalid   High      201     190     12      1 side max+1; rest legal  Output = IllegalInput
  BVA04  Invalid   High      201     201     201     All sides max+1           Output = IllegalInput
  BVA05  Valid     High      1       200     200     Extreme valid Isosceles   Output = Isosceles
  BVA06  Invalid   High      1       1       200     2 sides min, 1 max        Output = NoTriangle
  BVA07  Valid     Med       1       2       2       Small valid Isosceles     Output = Isosceles
  BVA08  Valid     Med       199     2       199     Isosceles near limit      Output = Isosceles
  BVA09  Invalid   High      1       2       1       Nearly minimum            Output = NoTriangle

Slide 17
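A matrix like the one above can drive automated tests directly as data. The sketch below is an assumption about how the rows might be executed; the inline `classify_triangle` is a stand-in for the routine under test, written from the slide's specification.

```python
def classify_triangle(line):
    """Stand-in for the triangle routine under test (per the spec)."""
    try:
        a, b, c = sorted(int(s) for s in line.split(","))
    except ValueError:
        return "IllegalInput"
    if not all(1 <= side <= 200 for side in (a, b, c)):
        return "IllegalInput"
    if a + b <= c:
        return "NoTriangle"
    if a == b == c:
        return "Equilateral"
    return "Isosceles" if a == b or b == c else "Scalene"

# Each tuple mirrors one column of the BVA design matrix:
# (test case ID, input, expected output).
BVA_MATRIX = [
    ("BVA01", "1,1,1",       "Equilateral"),
    ("BVA02", "200,200,200", "Equilateral"),
    ("BVA03", "201,190,12",  "IllegalInput"),
    ("BVA04", "201,201,201", "IllegalInput"),
    ("BVA05", "1,200,200",   "Isosceles"),
    ("BVA06", "1,1,200",     "NoTriangle"),
    ("BVA07", "1,2,2",       "Isosceles"),
    ("BVA08", "199,2,199",   "Isosceles"),
    ("BVA09", "1,2,1",       "NoTriangle"),
]

def run_matrix(matrix):
    """Execute every row; return the IDs of failing test cases."""
    return [tid for tid, inp, expected in matrix
            if classify_triangle(inp) != expected]
```

The spreadsheet stays the single source of truth for the design; the runner only interprets it.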
User Scenario
A User Scenario is formed by stringing together test cases previously defined in the test design matrix: TC023 -> TC098 -> TC135 -> TC257
Slide 18
Document User Scenarios

  ID    Validity  Priority  Order (TC023, TC098, TC135, TC257)  Description       Expected Results
  US01  Valid     High      1, 2, 3, 4                          Typical Thread 1  Expected Result 01
  US02  Valid     High      1, 3, 2, 4                          Typical Thread 2  Expected Result 02
  US03  Valid     High      2, 1, 3, 4                          Typical Thread 3  Expected Result 03
  US04  Invalid   Med.      --, 3, 2, 1                         Illegal Thread 1  Expected Result 04
  US05  Invalid   High      2, 1, --, 3                         Illegal Thread 2  Expected Result 05

Slide 19
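One way to execute such a scenario matrix is to treat each scenario as an ordered list of test-case IDs resolved against a registry. This is a hypothetical sketch; the registry contents are placeholders, not real test cases.

```python
# Registry of previously designed test cases. Each entry is a callable
# that drives the application; here each just records that it ran.
TEST_CASES = {
    "TC023": lambda ctx: ctx.append("TC023"),
    "TC098": lambda ctx: ctx.append("TC098"),
    "TC135": lambda ctx: ctx.append("TC135"),
    "TC257": lambda ctx: ctx.append("TC257"),
}

# US01 from the matrix: TC023, TC098, TC135, TC257 in positions 1-4.
US01 = ["TC023", "TC098", "TC135", "TC257"]

def run_scenario(step_ids):
    """Execute each referenced test case, in order, against a shared
    context (a stand-in for application state)."""
    context = []
    for step in step_ids:
        TEST_CASES[step](context)
    return context
```

Reordering or omitting steps (as in US04 and US05) only means editing the list, not the test cases themselves.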
Action Words: Key to the Bridge
Action Words:
- Establish a high-level application usage abstraction
- Standardize application actions
- Enable communication between Test Design and the Test Case Processor
Tips for designing action words:
- Keep the abstraction at a high level
- Determine what set of user actions the test tool should perform for a specific Action Word
- The scope of the test determines the Action Word level
- Create an Action Word Dictionary
- Translate each user scenario into an Action Word spreadsheet
Slide 20
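The core of an action-word processor is a dispatch table mapping each word to a function that performs the corresponding user actions. The following is a minimal sketch, not the actual TestFrame implementation; the action words and the in-memory `products` store are illustrative assumptions.

```python
products = {}   # stand-in for the application under test
log = []        # messages the application would display

def enter_product(code, product, colour, ptype, weight):
    """Attempt to add a product; product codes must be unique."""
    if code in products:
        log.append("Value in field product code not allowed")
    else:
        products[code] = (product, colour, ptype, weight)
        log.append("Transaction executed correctly")

def check_product(code, product, colour, ptype, weight):
    """Verify that a product is present with the given attributes."""
    log.append("present" if products.get(code) ==
               (product, colour, ptype, weight) else "absent")

# The Action Word Dictionary as a dispatch table.
ACTIONS = {
    "enter product": enter_product,
    "insert product": enter_product,   # slide 21 uses both spellings
    "check product": check_product,
}

def process(rows):
    """Interpret spreadsheet rows: the first cell is the action word,
    the remaining cells are its arguments."""
    for action, *args in rows:
        ACTIONS[action](*args)
```

Processing the rows from the example spreadsheet (enter p2, insert a duplicate p2, enter p3, check p3) reproduces the expected messages without any capture/playback scripting.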
Action Word Spreadsheet Example
User Scenario US0129   version 1.1   date 2/6/99   author Hans Buwalda

Section 1 - Test Case CL02: Product codes must be unique

                   product code  product  colour  type  weight
  enter product    p2            nail     black   AAX   1
  expect message   Transaction executed correctly
  insert product   p2            nut      grey    AAX   1
  expect message   Value in field product code not allowed

  A product with another code is allowed:
  enter product    p3            nail     black   AAX   1

  Check for presence of product:
  check product    p3            nail     black   AAX   1
  delete button

Section 2 - Test Case CL13: All fields need to be filled.

Slide 21
More Uses For Spreadsheet Test Design
- Configuration Design
- Performance Testing: Load and Stress
- Context Switching
- Client / Server
- Design and re-use test design components: Test Cases -> User Scenarios -> Stress Scenarios
Slide 22
Typical Web Configuration Combinations
A partial set of configuration choices for ReviewPro, a Web-based Review and Inspection application:
- Client Browsers (5): Netscape (3.0, 4.0, 4.5), Internet Explorer (3.0, 4.0)
- Client Operating Systems (5): Windows (3.x, 95, 98, NT), Sun (Solaris 2.51)
- Application Server Operating Systems (3): NT (3.51, 4.0 with Service Pack 3, 4.0 with Service Pack 4)
- Database Types (4): Sybase, Oracle, Informix, Microsoft
- Web Server Software (4): MS IIS, Apache, Netscape (Enterprise, FastTrack)
Slide 23
The Combination Mess The goal: re-run a core set of tests in the right mix of configurations to achieve effective coverage and find defects. From the previous slide, there are: 5 * 5 * 3 * 4 * 4 = 1200 possible configurations. Assume there are 500 core tests. This results in the need to run: 1200 * 500 = 600,000 tests! Using a real-world web application, ReviewPro, the calculation resulted in 4,800,000 tests! There must be a more practical solution! (There is.) Slide 24
Dealing with the Combination Mess
Reductions must be made, yet sufficient coverage is required. Choices include:
- Use a spreadsheet to manually select a reasonable subset.
- Use a tool that automatically selects a small number.
Slide 25
Design Configurations Using Spreadsheets
ReviewPro Configuration Test Categories:

  ID     Validity  Priority  Client Browser  Client OS    Appl. Server OS  Database   Web Server SW  Expected Results
  RCT01  Valid     High      NS 4.5          Win98        NT 4 SP 3        Oracle     MS IIS         Expected Result RCT01
  RCT02  Valid     High      IE 4.0          WinNT        NT 4 SP 4        Sybase     Enterprise     Expected Result RCT02
  RCT03  Valid     High      NS 4.0          Solaris 2.5  NT 3.51          Microsoft  FastTrack      Expected Result RCT03
  RCT04  Invalid   High      IE 3.0          Win95        NT 4 SP 3        Informix   Apache         Expected Result RCT04
  RCT05  Valid     High      NS 3.0          Win3.x       NT 4 SP 3        Oracle     MS IIS         Expected Result RCT05

Slide 26
Reducing Test Cases - Reducing Testing Costs
Combinatorial design theory can be used to reduce the number of tests when an astronomical number of test scenarios is possible. Bellcore developed a system called AETG (Automatic Efficient Test Generator) to generate tests for unit, system, and interoperability testing. AETGWeb is commercially available as a service: customers interact with the AETG software over the Internet on a secure connection, enter test specifications, and receive generated test cases in return. The goal is to substantially reduce testing costs.
References: [Dalal, 1996], [Sherwood, 1994]
Slide 27
AETG Configurations

  #   Browser  Client_OS     Server_OS  Database   Web_Server_SW
  1   NS4.5    Win3.x        NT3.51     Informix   FastTrack
  2   NS4      Solaris 2.51  NT4SP4     Microsoft  FastTrack
  3   IE3      Win3.x        NT4SP4     Sybase     Apache
  4   NS4.5    Win95         NT3.51     Oracle     Apache
  5   IE4      WinNT         NT3.51     Microsoft  Enterprise
  6   NS4      Win98         NT4SP4     Oracle     FastTrack
  7   NS3      Win98         NT3.51     Sybase     Enterprise
  8   IE3      WinNT         NT4SP4     Oracle     FastTrack
  9   IE4      Solaris 2.51  NT4SP4     Informix   Apache
  10  NS4.5    Win98         NT4SP4     Sybase     FastTrack
  11  IE3      Solaris 2.51  NT4SP3     Informix   Enterprise
  12  IE3      Win95         NT4SP3     Microsoft  MSIIS
  13  IE4      Win3.x        NT4SP4     Oracle     Enterprise
  14  NS3      WinNT         NT4SP3     Sybase     Apache
  15                                               MSIIS
  16                                               Apache
  17                                               Enterprise
  18                                               MSIIS
  19                                               Apache
  20                                               MSIIS
  21                                               Enterprise
  22                                               FastTrack
  23                                               MSIIS
  24  NS3      Win3.x        NT4SP3     Microsoft  FastTrack
  25  NS4      WinNT         NT3.51     Informix   Enterprise

Nearly 50:1 reduction: 25 configurations instead of 1200; 12,500 tests instead of 600,000.
Slide 27a
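The intuition behind this reduction can be shown with a simple greedy pairwise cover: repeatedly pick the configuration that covers the most not-yet-covered pairs of factor values. This is only a sketch of the combinatorial idea, not the actual AETG algorithm, and the factor lists below merely mirror the ReviewPro choices from the earlier slide.

```python
from itertools import combinations, product

FACTORS = [
    ["NS3", "NS4", "NS4.5", "IE3", "IE4"],                 # browser
    ["Win3.x", "Win95", "Win98", "WinNT", "Solaris2.51"],  # client OS
    ["NT3.51", "NT4SP3", "NT4SP4"],                        # server OS
    ["Sybase", "Oracle", "Informix", "Microsoft"],         # database
    ["MSIIS", "Apache", "Enterprise", "FastTrack"],        # web server
]

def pairs_of(config):
    """All (factor-index, value) pairs exercised by one configuration."""
    return set(combinations(list(enumerate(config)), 2))

def greedy_pairwise(factors):
    """Greedily select configurations until every pair of factor
    values appears in at least one selected configuration."""
    candidates = list(product(*factors))
    uncovered = set()
    for config in candidates:
        uncovered |= pairs_of(config)
    chosen = []
    while uncovered:
        best = max(candidates, key=lambda c: len(pairs_of(c) & uncovered))
        chosen.append(best)
        uncovered -= pairs_of(best)
    return chosen
```

For these five factors the full product is 1200 configurations, yet the greedy cover needs only a few dozen; at least 25 are required, since the two 5-value factors alone form 25 distinct pairs.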
A Spreadsheet for Load & Stress Tests
  Matrix ID: MLS                 Feature Hierarchy: Master Load & Stress
  Matrix Summary: Ensure that representative user scenarios scale
  Risk Analysis: Impact: High    Likelihood: High

  ID     Validity  Priority  Description           # of Users (US001 / US009 / US012)  Expected Results
  MLS01  Valid     High      Entry Level System    10 / 15 / 10                        Expected Result MLS01
  MLS02  Valid     High      Normal Loaded System  100 / 150 / 100                     Expected Result MLS02
  MLS03  Valid     High      Max Valid             170 / 100 / 80                      Expected Result MLS03
  MLS04  Invalid   High      Too Many Users        200 / 150 / 200                     Expected Result MLS04

Slide 28
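Reusing a user scenario for load testing can amount to running the same scenario once per simulated user, concurrently. The sketch below assumes a thread-per-user model for illustration; the scenario body is a placeholder where the real action-word steps would go.

```python
import threading

def user_scenario(results, user_id):
    # Placeholder for the scenario's real action-word steps.
    results.append(("US001", user_id))

def load_test(scenario, n_users):
    """Run `scenario` concurrently for `n_users` simulated users and
    return whatever each run recorded."""
    results = []   # list.append is atomic in CPython, so this is safe
    threads = [threading.Thread(target=scenario, args=(results, i))
               for i in range(n_users)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results
```

Scaling from MLS01 to MLS04 then only means changing `n_users`, exactly as the matrix varies the user counts per scenario.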
TestFrame Engine - Context Switching
  switch context   Mortgages
  enter client     John Jones   200000   30
  switch context   Loans
  enter client     John Jones   15000
Sets of action words drive per-context navigation layers for each target system (target system 1 ... target system n).
Slide 29
TestFrame Engine - Client Server
  begin cluster    Mortgages
  enter account    John Jones   200000   30
  end cluster
  start server     James   Mortgages
Client navigation communicates with the server navigation layers, which drive the target system.
Slide 30
TestFrame Roles and Responsibilities
- Test Architect -- Creates the overall approach to verification and validation, including an integrated approach to test process and automation
- Test Planner/Manager -- Provides test planning, schedule, scope, resources, etc.
- Test Automation Engineer -- Creates the Test Case Processor script
- Test Designer -- Creates and documents the test design; participates in test design inspections
- Test Executor -- Runs and evaluates tests
Slide 31
Automation Architecture Roles
- Test Architecture: Architect
- Test Plan: Manager
- Feature Hierarchy: Designer
- Test Design: Designer
- Test Cases: Designer
- Action Word Dictionary: Designer
- User Scenarios: Designer
- Automation Design: AE (Automation Engineer)
- Automation Script: AE
- Execute Tests: Executor
- Test Effectiveness: Manager
Slide 32
Case Study - Key Test Tool Usage
- Requirements Management (e.g., DOORS, Requisite Pro, RTM) - Requirements Repository
- Technical Review Management (e.g., ReviewPro) - Review Repository
- Test Design Spreadsheet Template (e.g., Excel) - Test Design Repository
- Test Case Processor: Application-Specific Code plus Test Case Processor Engine (e.g., TestFrame)
- Capture/Playback Tool (e.g., WebTest/WinRunner, Robot, SilkTest, QAPlayback) - drives the Software Under Test
- Test Results Report
Slide 33
Benefits of an Effective Test Architecture
- Better, more maintainable tests
- Improved test design and development
- Reduced costs, especially for regression testing
- Higher motivation for participants
- Fewer highly technical testers required
- Less sensitivity to target system changes
- A better organizational approach: clearer separation of tasks, tangible products, accountability
Slide 34
Summary
- Create an effective Test Automation Architecture
- Focus on Test Design
- Build a bridge between Test Design and Automation
- Use spreadsheets for: Feature Decomposition, Basic Test Case Design, Configuration Combination Design, Load and Stress Design
- Verify - Don't forget Technical Reviews and Inspections
Slide 35
References
- Buwalda, Hans, "Testing with Action Words," STAR, May 1998
- Dalal, Siddhartha R. (with Cohen, Parelius, and Patton), "The Combinatorial Design Approach to Automatic Test Generation," International Symposium on Software Engineering, 1996
- Jorgensen, Paul C., Software Testing: A Craftsman's Approach, CRC Press, 1995
- Kit, Edward, Software Testing in the Real World, Addison Wesley Longman, 1996
- Kit, Edward, "Integrated, Effective Test Design and Automation," Software Development Magazine, February 1999
- Sherwood, George B., "Effective Testing of Factor Combinations," STAR, 1994
Slide 36
The End The New Frontier Third Generation Software Testing Automation Edward Kit www.sdtcorp.com sdt@sdtcorp.com
Ed Kit Edward Kit, founder and president of Software Development Technologies, is well known as a test expert, author, and keynote speaker at testing conferences. His best-selling book, Software Testing in the Real World: Improving the Process, has been adopted as a standard by companies around the world such as Sun Microsystems, Exxon, Chase Manhattan Bank and Cadence Design Systems. His feature articles in Software Development Magazine have outlined new state-of-the-practice test automation models that are currently being adopted around the world. Mr. Kit continues to advise clients on bringing practical and proven software quality practices to their development efforts.