
Meta-Framework: A New Pattern for Test Automation

Ryan Gerard
Symantec, Security 2.0
6595 Dumbarton Circle, Fremont, CA
1-310-892-0821
ryan_gerard@symantec.com

Amit Mathur
Symantec, Security 2.0
6595 Dumbarton Circle, Fremont, CA
1-510-742-2657
amit_mathur@symantec.com

ABSTRACT
When undertaking a test automation project involving a large piece of software, there is no one-size-fits-all automation application. It is common to have multiple pieces that one must execute and collect results from as part of a larger automation strategy. These pieces may be different frameworks, interfaces, and test drivers, written in different programming languages, that one must work with. A problem arises when one asks the question: how do you automate the automation tools?

To simplify the execution and management of the individual automation tools and frameworks, we in the Symantec Security 2.0 group have identified a test automation pattern that we call the Meta-Framework. This pattern provides a method for automating multiple pieces as part of a larger automation strategy. The meta-framework provides a framework abstraction layer that allows separate automation pieces to be executed and have their results reported back in a standardized way. A meta-framework essentially defines a scalable abstraction for automation strategies involving multiple automated pieces.

Internally, we have been working on a meta-framework for a new web-based application with a massive supporting backend. We have successfully used our meta-framework to abstract the execution of xunit frameworks in various languages, CLI test drivers, and a GUI automation tool, with our results saved in a test case management system that is abstracted just as any other tool would be. This paper discusses the context of the problem we are solving, provides an overview of the meta-framework pattern, covers the pieces necessary to build a real meta-framework, and describes the trials and tribulations of our experience building one internally.

1. INTRODUCTION
As part of a larger automation strategy within a project, it is typical to use many different frameworks and tools to manage your automation. One could have xunit frameworks for various languages handling unit testing, and open-source tools such as JMeter driving load for performance testing. The tools can all be run individually, but we also want to define a common reporting mechanism. The Meta-Framework test automation pattern describes how to create a framework that can manage and execute other frameworks while recording their results in a common manner.

2. MOTIVATION
As projects become larger, the number of tools in use, the amount of source code to test, and the number of languages involved all increase the complexity of the testing environment. The trade-off between time and test coverage remains the fundamental challenge for all testers [1], and automation at some level must be introduced. We are not trying to present our solution as unique; it is not. In fact, the basic ideas have already been realized in a piece of open-source software known as DTET. What we are suggesting, however, is that this solution is part of an abstract test automation pattern. When many languages and platforms are necessary for testing, and multiple tools and frameworks are available for use, managing such a complex automation environment is a project in itself. In addition, one must be able to plan and prepare for any tools and frameworks that future requirements may demand.

3. FORCES
The forces involved with this problem include:
1. The number of automation tools and frameworks makes a single automation strategy difficult to manage.
2. One must be able to fit in new tools and frameworks for future automation requirements.
3. Reporting the results of each individual automation tool needs to be standardized.

4. SOLUTION
We suggest a solution known as a meta-framework. This is a highly extensible framework that acts as middleware between the automation tools and frameworks and the test case management system. The framework is invoked using scripts written in whatever language you choose to implement the meta-framework in. These scripts execute your automated test cases using the necessary tools or frameworks on the host you define. The scripts examine the output of each tool and compare it against pre-defined expected output for that tool. Multiple tools can be run in sequence in this manner. At the end of the script, the results are sent back to your test case management system to be recorded.
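To make the shape of such a script concrete, here is a minimal sketch of what a meta-framework test script might look like. It is illustrative only: the host name, tool command, and expected-output pattern are assumptions, not our actual code, and the reporting step is stubbed out (Section 6 describes how our implementation reports results).

  #!/usr/bin/perl
  # Illustrative meta-framework test script (all names are hypothetical).
  use strict;
  use warnings;

  # The script declares the test: the host, the command to run, and the
  # output that indicates success.
  my %test = (
      name     => 'cli_version_check',
      host     => 'testhost01',
      command  => 'mytool --version',
      expected => qr/version \d+\.\d+/,
  );

  # Execute the tool on the target host and capture stdout and stderr.
  my $output = `ssh $test{host} '$test{command}' 2>&1`;

  # Compare the actual output against the expected output.
  my $status = ($? == 0 && $output =~ $test{expected}) ? 'PASS' : 'FAIL';

  # Hand the verdict to the test case management system
  # (stubbed here; Section 6 shows one way to do this over HTTP).
  print "$test{name} on $test{host}: $status\n";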

Figure 1. Conceptual Meta-Framework. (Frameworks and tools connect through the meta-framework (MF) to the test case management system (TCMS).)

When new automation requirements come up, or new tools or frameworks need to be added to the automation system for new automated test cases, a new script is written for the framework. This script defines the test case to be run, the host to run it on, the tools to be executed, and the expected results. The new tools should be placed in well-known locations that the script can easily access. In order to have the tight integration necessary with the test case management system, we recommend using an in-house tool that can be modified if necessary. As an alternative, one could have the results written to some text format that is read at a later point in time and imported into the test case management system.

Because you must define the host on which to run the test, the tests can be run in a distributed manner, allowing for better automated distributed testing. In this case, one must modify the automated meta-framework script to copy over and install the necessary tools and components.

Figure 2. Distributed Automation. (The meta-framework drives tests on Host 1 and Host 2 and reports to the test case management system.)

4.1 Distributed and Remote Automation
Given the modular nature of this framework, performing distributed and remote automated tests against the system becomes a more manageable problem. The test case management system can be located on one machine, with communication to and from the framework done using well-known protocols (HTTP, SOAP, etc.). To enable this functionality, the meta-framework implementation must track all hosts that it can connect to and define the method of communication. This is a small price in overhead in exchange for remote automation.

As a result of this functionality, the meta-framework test scripts must be much more robust. For instance, if one needs a specific framework or tool in order to execute a test case on a remote machine, the test script itself must correctly copy, install, and configure this framework or tool as a pre-condition to test execution.

Figure 3. Remote Automation. (The meta-framework drives tests remotely on Hosts 2, 3, and 4.)

5. CONSEQUENCES
The quality of this system depends highly on the quality of the test cases and meta-framework scripts being created. This is true in any test environment, but it is even more true for automated test cases. Since these are test cases that you essentially create and then forget about unless they fail, the test cases must be of high quality. There is an additional overhead cost in creating scripts that can execute your automated tools. We believe that the overhead is worthwhile, in that it helps to vet out any possible issues in actually executing the test case.

6. FRAMEWORK IMPLEMENTATION
We decided to have test case results sent via HTTP to the test case management system. This allowed distributed hosts to easily send result data back to the test case management system. To do this, we modified our internal test case management system to accept incoming data via POST and insert the data into our database, so that the results of a test could be read via the web.

We implemented the meta-framework itself in Perl. We defined a test object that manages the execution and collection of command-line errors and output, and an object that communicates with our test case management system. The automated test scripts were also written in Perl.
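As a concrete illustration of the result-reporting step just described, the following sketch POSTs a result record to the test case management system. The endpoint URL and form fields are assumptions made for illustration; they are not the actual interface of our internal system.

  #!/usr/bin/perl
  # Sketch: report one test result to the TCMS over HTTP POST.
  use strict;
  use warnings;
  use LWP::UserAgent;

  sub report_result {
      my ($test_id, $status, $log) = @_;

      my $ua  = LWP::UserAgent->new(timeout => 30);
      my $res = $ua->post(
          'http://tcms.example.com/results',   # hypothetical TCMS endpoint
          {
              test_id => $test_id,
              status  => $status,              # e.g. 'PASS' or 'FAIL'
              log     => $log,                 # captured tool output
          },
      );

      warn 'TCMS reporting failed: ' . $res->status_line . "\n"
          unless $res->is_success;
  }

  report_result(42, 'PASS', 'all checks succeeded');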
The scripts were defined such that one could specify the host on which to execute the test, the commands necessary to execute the test, and the correct and expected output to search for. The execution of the test scripts is fairly straightforward and intuitive. We wrote a small runner utility that allows you to execute an individual script or a directory of scripts. This runner utility can then be invoked from a cron job, or triggered from another system (the build system, for example).
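A runner of this kind can be very small. The sketch below, which is illustrative rather than our actual utility, accepts either a single framework script or a directory of scripts and runs each one in turn.

  #!/usr/bin/perl
  # Sketch: run one framework test script, or every script in a directory.
  use strict;
  use warnings;

  my $target = shift @ARGV
      or die "usage: runner.pl <script.pl | directory>\n";

  # Expand a directory into its individual test scripts.
  my @scripts = -d $target ? glob("$target/*.pl") : ($target);

  for my $script (@scripts) {
      print "=== running $script ===\n";
      system($^X, $script);                    # run with the same perl binary
      warn "$script reported failure\n" if $? != 0;
  }

A crontab entry such as "0 2 * * * /opt/mf/runner.pl /opt/mf/tests" (paths hypothetical) would then execute the whole suite nightly, and the build system could invoke the same runner after each build.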

The entire system was built for Linux, but we had Windows components to test as well. To execute cross-platform, we installed Cygwin and an SSH daemon on our Windows clients; in this way, we could access and manipulate the Windows environment as if it were a Linux environment.

7. IMPLEMENTATION ISSUES
One of the biggest issues we ran into was training all the necessary parties on the installation, configuration, and use of this new automation framework. Our team is distributed, and in-person training with detailed examples and ample Q&A time seemed to be the most effective way to train people, but this was not always possible.

The idea of using Cygwin to communicate was very nice in theory, but tended to have problems in practice. Cygwin is a very nice platform, with most major Linux utilities included; however, we occasionally had SSH connection problems to Cygwin, which went largely unexplained.

8. FRAMEWORK IN USE
We decided to implement our automation using a multi-layered strategy. We started by identifying the three focuses for our automation: unit, functional, and system testing. Within these three areas, we identified the three testing interfaces we wanted to exercise: the API, CLI, and GUI. This created a matrix of nine possible uses of the framework where automation was possible. We decided on a strategy for which tools to use to automate each situation, and we were realistic that not everything can be automated.

Table 1. Test Interfaces and Focus Matrix

            Unit               Functional                  System
  API       xunit framework    Framework + test drivers    Framework + test drivers
  CLI       n/a                Framework                   Framework
  GUI       Selenium           Selenium                    ?

Because the test scripts for the framework can execute from the command line without a problem, we decided that the best way to automate the CLI was to write the automated tests directly into the framework test scripts. We proceeded to install the various xunit frameworks we needed to properly unit test the code, and to collect the various test drivers we needed to execute the tests. After evaluating a few Linux-based cross-browser GUI automation tools, we decided that Selenium was the best project with which to automate the GUI.

With the tools in place, we began to write test scripts to automate our test cases. To exercise the xunit tests written by the developers, our test scripts executed the unit test suites from the command line and verified that the output indicated success. To exercise the API for functional and system tests, we wrote test scripts that executed our test drivers and verified that the output indicated success. For the CLI test cases, we again wrote test scripts that executed the tests from the command line and verified that the output indicated success.

The GUI automation involved a bit more overhead. We first had to install and configure Selenium RC on the server from which we were executing the tests. Next, we had to write Selenium test scripts; luckily Selenium supports many languages, including Java, Ruby, Perl, PHP, .NET, Python, and JavaScript. After writing and verifying our Selenium GUI test scripts, we wrapped the execution of those Selenium scripts in scripts that could be executed by the framework. Using this meta-framework implementation, we were able to successfully automate a portion of our testing across three interfaces, all managed by one framework.
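To show what "wrapping" a GUI test means in practice, here is a sketch of a framework script that runs a Selenium-based test written in Perl and judges the run by its exit code and output. The test script name and the failure markers are assumptions; the only requirement is that the framework can launch the test from the command line and recognize success in its output.

  #!/usr/bin/perl
  # Sketch: wrap a Selenium GUI test so the meta-framework can run it.
  # Assumes a Selenium RC server is already running on the test host.
  use strict;
  use warnings;

  my $output = `perl gui_login_test.pl 2>&1`;   # hypothetical Selenium test
  my $exit   = $? >> 8;

  # Treat the run as passing only if the test exited cleanly and its
  # output contains no failure marker.
  my $passed = ($exit == 0 && $output !~ /not ok|FAILED/i);

  print $passed ? "gui_login_test: PASS\n" : "gui_login_test: FAIL\n$output";
  exit($passed ? 0 : 1);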
9. FRAMEWORK USE ISSUES
We discovered, to our chagrin, that using this framework for xunit testing was not the best use of our time. Given the distributed nature of test execution, our assumption that we would have source code available to test was a fallacy. In the end, executing the unit tests simply does not provide the tester as much value as it provides the developer who wrote the tests. These automated tests might still be useful on the build system, but more likely than not the build system already has a method for executing unit tests. If we did this project over again, we would likely drop the xunit requirement and the xunit frameworks from the automation.

The best and easiest use of this framework, we discovered, was executing command-line tools and utilities. The test frameworks to be integrated typically have dependencies that are difficult to manage, whereas automating many simple command-line tools turned out to be much easier than expected.

10. KNOWN USES
The TETware test framework provides a local and distributed testing environment similar to our framework. It does not provide the test case management integration that our framework defines; however, we are confident it could be modified for such a purpose.

11. RELATED PATTERNS
The pattern this most closely resembles is the Façade design pattern. A façade object provides a simplified interface to a large body of code [2]. While the Façade pattern provides a simplified interface, our pattern does not exactly provide a new interface so much as it acts as middleware between the tools and the test case management system.

12. CONCLUSIONS
The meta-framework test automation pattern provides a way to manage an increasingly complex test automation environment by providing middleware between the various tools and frameworks and the test case management system. This allows for easy extensibility when new automation tools need to be incorporated into a larger automation strategy.

13. FUTURE WORK
We would like to work on a generic meta-framework implementation that can be released as open source to the wider community. We believe this is a useful concept that mid- to large-sized projects could put to good use.

14. ACKNOWLEDGMENTS
Our thanks to AST for organizing this conference and accepting us into it.

15. REFERENCES
[1] Kaner, Cem. Pattern: Scenario Testing. Retrieved April 9, 2007 from http://www.testing.com/test-patterns/patterns/patternscenario-testing-kaner.html
[2] Façade Pattern. Retrieved April 9, 2007 from http://en.wikipedia.org/wiki/fa%c3%a7ade_pattern

Presentation Slides

Slide 1. Meta-Framework: A New Pattern for Test Automation
Amit Mathur, Sr. QA Manager
Ryan Gerard, QA Engineer

Slide 2. Automation Problem
[Diagram: separate, unconnected automation tools — JUnit, Selenium-RC, a C++ test tool, WinRunner.]

Slide 3. Automation Solution
[Diagram: the meta-framework coordinating JUnit, CppUnit, the C++ test tool, Selenium-RC, and WinRunner.]

Slide 4. Meta-Framework as a Pattern
- The solution isn't novel.
- Patterns are abstract solutions to common problems.
- Find more information on test patterns: http://www.testing.com/test-patterns/patterns/

Slide 5. Thank You!
Ryan Gerard, ryan_gerard@symantec.com
Amit Mathur, amit_mathur@symantec.com
http://searchforquality.blogspot.com

2006 Symantec Corporation. All rights reserved.