AUTOMATION TESTED: an independent comparison of Selenium and Provar
www.provartesting.com | info@provartesting.com
Acknowledgements

The following has been reproduced with permission from the authors of the original study. Although some descriptions have been modified and/or extended for clarity, details of the original analyses run and the results observed have been retained without modification.

Introduction

The following comparison study was undertaken by a Global System Integrator (GSI) on behalf of a large US multinational. The company had developed a number of Salesforce test cases using the GSI's Selenium framework, which the GSI had built up over a number of years. The company sought a comparison study after encountering the following challenges with Selenium:

- Java developers were required to create and maintain tests
- UI test failures were common due to broken locators (see the illustrative sketch below)
- Significant test script updates were required when testing a new environment, testing new deployments or refreshing an existing environment

The company chose to evaluate Provar against the Selenium framework to understand whether it could offer a better return on investment.

High Level Summary of Results

The following provides a high level summary of the report. Full details can be read in the individual sections below.

- Test case authoring and debugging: Provar was over four times faster than the Selenium framework in authoring and debugging the comparison test cases.
- Test case execution: Provar was over four times faster than the Selenium framework in executing the comparison test cases.
- Test case maintenance: On average, Provar was six times faster than the Selenium framework across a range of normal maintenance activities.
- Test suite integration: Test suite integration is not recommended with a Selenium framework owing to the time and effort required to deliver and maintain it. Provar showed better out-of-the-box capabilities for test suite integration; many different types of integration capability were observed, including email and database testing, web services and running via continuous integration systems.

Comparison Phases

The comparison study covered the following phases of test automation:

- Test case authoring and debugging
- Test case execution
- Test case maintenance
- Test suite integration
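To illustrate the broken-locator challenge noted in the Introduction: Visualforce pages render auto-generated component ids that can change between deployments, orgs and environment refreshes, which is a common reason recorded scripts stop working. The following is a minimal Java/WebDriver sketch; the ids and labels are invented for illustration and are not taken from the study's framework.

```java
import org.openqa.selenium.By;

public class LocatorStyles {

    // Fragile: Visualforce renders auto-generated component ids such as
    // "j_id12:j_id27:amount" (a made-up example) that can change between
    // deployments, orgs and environment refreshes, breaking recorded scripts.
    static final By FRAGILE_AMOUNT = By.id("j_id12:j_id27:amount");

    // More resilient: anchor on the visible field label and walk to the
    // adjacent input, so the locator survives regenerated ids.
    static final By LABEL_ANCHORED_AMOUNT = By.xpath(
            "//label[normalize-space()='Amount']/ancestor::tr//input");
}
```

Hand-tuning locators in this way is exactly the maintenance work examined in the sections that follow.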
Comparison Test Cases

Five test cases were chosen to evaluate Provar against the Selenium framework.

Test Case 1 (Low complexity, Opportunity Management): Create a new opportunity; create and assert related lists.
Test Case 2 (Medium complexity, Opportunity Management): Create a new opportunity; add 3 products via a custom Visualforce page.
Test Case 3 (Medium complexity, Opportunity Management): Locate and edit an existing opportunity; verify the value of hidden fields; iterate across the functionality above.
Test Case 4 (Medium complexity, Account & Contacts): Create a new contact; verify the account record; add an affiliation related list; verify primary contacts.
Test Case 5 (High complexity, Risk Process (Visualforce)): Test a business workflow with complex Visualforce elements, including alerts and dialogue boxes.

This document details the difference in approach for each phase when delivering test automation, including timings.

Test Case Authoring

Summary of Results: Provar was over four times faster than the Selenium framework in authoring and debugging tests.

The following table shows a detailed breakdown of timings per test case.

Time taken to author and debug (hours)

Test Case    Provar    Selenium framework
1            0.5       2
2            1         5.5
3            1         8
4            1         6
5            2         5
Total        5.5       26.5

Individual timings vary across test cases, but in every test case a significant efficiency gain can be observed when authoring and debugging with Provar.
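For context on where the authoring hours go, the sketch below shows roughly what a hand-coded Selenium WebDriver test for a scenario like Test Case 1 can look like in Java. It is an illustrative assumption only: the URL, locators and button names are hypothetical and do not come from the GSI's framework, which would add its own login handling, configuration, data management and reporting layers.

```java
import java.time.Duration;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;

public class CreateOpportunityTest {

    public static void main(String[] args) {
        WebDriver driver = new ChromeDriver();
        WebDriverWait wait = new WebDriverWait(driver, Duration.ofSeconds(15));
        try {
            // Hypothetical org URL; a real framework would also handle login,
            // environment configuration, test data and reporting.
            driver.get("https://example.my.salesforce.com/006/e");

            // Hand-written locators for every field touched by the test.
            driver.findElement(By.xpath(
                    "//label[normalize-space()='Opportunity Name']/ancestor::tr//input"))
                  .sendKeys("Automation Tested Opportunity");
            driver.findElement(By.name("save")).click(); // hypothetical button name

            // Assert that the expected related list rendered on the detail page.
            wait.until(ExpectedConditions.visibilityOfElementLocated(
                    By.xpath("//h3[contains(normalize-space(),'Products')]")));
            System.out.println("Related list is visible - test passed.");
        } finally {
            driver.quit();
        }
    }
}
```

Even for a simple scenario, every locator, wait and assertion is written, compiled and debugged by hand, which is where the per-test-case hours above are spent.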
The following shows the average time taken to perform various authoring and debugging activities in both tools (times are averages in minutes).

Recording process
  Provar (avg 20 mins): Provar builds tests using its Test Builder, a Chrome application. Test Builder performs all the actions against the page using the recorded script, which removes the need to re-run the test case.
  Selenium framework (avg 20 mins): Firefox Selenium IDE is used to record steps. This tracks mouse clicks, so there is no guarantee that the script works until the test is re-run.

Re-working locators
  Provar (avg 0 mins): Field locators are suggested automatically with Provar's metadata integration. Initial mapping does not require updates.
  Selenium framework: Selenium IDE maps using ID. Field IDs frequently fail for Visualforce pages, and when changing or refreshing environments.

Creating functions
  Provar (avg 10 mins): A test case can be converted into a function by ticking a checkbox ("Callable Test").
  Selenium framework: Functions are created manually by Selenium developers. Each one may require modification and retesting.

Adding data-driven testing (DDT)
  Provar (avg 15 mins): Data can be imported from Excel or a database.
  Selenium framework: Excel sheets and database integration are possible, but must be programmed and tested on a field-by-field basis.

Mapping tables and related lists
  Provar (avg 0 mins): The tool supports list views, related lists, reports and data lookups. Provar recognises these elements when the page is mapped.
  Selenium framework (avg 40 mins): The steps recorded initially capture every field by ID. To implement reliably, a developer needs to write code to recognise the table and loop through the results.

Create seed data
  Provar (avg 10 mins): Creation or cloning of test data is supported via a drag-and-drop interface within the tool.
  Selenium framework (avg 20 mins): Data is coded individually in each test.

Clean up data
  Provar (avg 5 mins): Provar supports automated cleanup where enabled.
  Selenium framework (avg 10 mins): Code is added to delete the correct objects.

Debugging
  Provar (avg 20 mins): The tool provides debugging options and allows test and locator updates without restarting the test. This results in a single debug run per script.
  Selenium framework (avg 40 mins): Each change to a test case requires compiling and re-execution. This can result in multiple debug iterations.
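The "Adding data-driven testing (DDT)" activity above notes that Excel integration must be hand-coded in a Selenium framework. A minimal sketch of what that typically involves is shown below, assuming the Apache POI library and a hypothetical workbook layout (one opportunity per row, with name and stage columns); none of this comes from the study's actual framework.

```java
import java.io.FileInputStream;
import java.util.ArrayList;
import java.util.List;
import org.apache.poi.ss.usermodel.DataFormatter;
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;
import org.apache.poi.ss.usermodel.WorkbookFactory;

public class ExcelTestData {

    // Reads one record per row from the first sheet of a workbook, skipping
    // the header row. The two-column layout (opportunity name, stage) is a
    // made-up example; each real sheet needs its own mapping code.
    public static List<String[]> readRows(String workbookPath) throws Exception {
        List<String[]> rows = new ArrayList<>();
        DataFormatter formatter = new DataFormatter();
        try (FileInputStream in = new FileInputStream(workbookPath);
             Workbook workbook = WorkbookFactory.create(in)) {
            Sheet sheet = workbook.getSheetAt(0);
            for (Row row : sheet) {
                if (row.getRowNum() == 0) {
                    continue; // header row
                }
                rows.add(new String[] {
                        formatter.formatCellValue(row.getCell(0)), // opportunity name
                        formatter.formatCellValue(row.getCell(1))  // stage
                });
            }
        }
        return rows;
    }
}
```

Each new data source or column layout needs similar code plus its own testing, whereas the study reports Provar importing data from Excel or a database without programming.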
Test Case Execution

Summary of Results: Provar was over four times faster than the Selenium framework in the execution of the same test scenarios.

The following table shows a detailed breakdown of timings per test case.

Test Case Execution (minutes:seconds)

Test Case    Provar    Selenium framework
1            1:00      6:
2            1:16      22:00
3            2:19      13:00
4            3:20      6:00
5            3:13      9:00
Total        11:08     46:

The primary causes of these timing differences were as follows:

Page Layout Testing
  Provar: Provar has the ability to detect automatically whether Salesforce layouts and fields have loaded in a browser. This avoids errors due to loading times and allows Provar tests to execute in a timely manner.
  Selenium framework: Selenium IDE typically adds 10 seconds of wait time after each test step to allow for loading. The tool does not have inbuilt recognition of whether elements have loaded. This can cause errors due to loading times and can make test cases slow to execute.

Visualforce Testing
  Provar: Provar has the ability to set a specific wait time. It can also add intelligent wait conditions, e.g. Wait for Field to appear, or Wait for background processes to complete.
  Selenium framework: Selenium IDE typically adds 10 seconds of wait time after each test step. The tool does not have inbuilt recognition of whether elements have loaded.

API Testing
  Provar: Provar supports the use of test APIs: SOQL, APEX, Bulk and Salesforce CRUD APIs.
  Selenium framework: Selenium libraries do not support these test APIs, but they can be manually programmed against the SFDC APIs.

Salesforce Object Extraction and Assertion
  Provar: Provar provides a mechanism to extract every record type for a layout into an Excel (.xls) spreadsheet. This includes picklist values and row/column position on the page layout.
  Selenium framework: Not available.
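The comparison above attributes much of the execution-time gap to fixed 10-second pauses inserted after each recorded step. A common hand-coded alternative in Selenium WebDriver is an explicit wait that polls until the element is ready and then proceeds immediately; the sketch below (with a hypothetical locator) contrasts the two approaches.

```java
import java.time.Duration;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;

public class WaitStrategies {

    // Hypothetical label-anchored locator for a field on the page under test.
    private static final By AMOUNT_FIELD = By.xpath(
            "//label[normalize-space()='Amount']/ancestor::tr//input");

    // Recorded-script style: pause a fixed 10 seconds after the step,
    // whether the page needed 1 second or 9 to finish loading.
    static void fixedPause() throws InterruptedException {
        Thread.sleep(10_000);
    }

    // Explicit wait: poll up to 10 seconds but continue as soon as the field
    // is visible, keeping execution time close to the actual load time.
    static void waitForField(WebDriver driver) {
        new WebDriverWait(driver, Duration.ofSeconds(10))
                .until(ExpectedConditions.visibilityOfElementLocated(AMOUNT_FIELD));
    }
}
```

Writing and tuning such waits for every step is the manual effort the study contrasts with Provar's built-in detection of whether layouts and fields have loaded.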
Test Case Maintenance

Summary of Results: On average, Provar was six times faster than the Selenium framework across a range of normal maintenance activities.

A subset of common maintenance activities was undertaken in both tools to identify the average time required in each. Maintenance activities were included for both page layouts and Visualforce pages. The following shows the average time taken to perform these maintenance activities in both tools.

Page Layout Maintenance Activities

Minor repositioning of a field
  Provar (avg 0 mins): Provar detected the change automatically from metadata. No update required.
  Selenium framework: Updates required to execute the test case, update code, rerun the test case and debug. Iterations often required.

Update of a field label
  Provar (avg 0 mins): Provar detected the change automatically from metadata. No update required.
  Selenium framework: Updates required to remap … Iterations sometimes required.

Renaming of a field
  Provar (avg 10 mins): Field remapped by setting a breakpoint in Test Builder, Provar's Chrome application. Achieved in one test run.
  Selenium framework: Updates required to remap …
Addition of a new field
  Provar (avg 10 mins): Field remapped by setting a breakpoint in Test Builder, Provar's Chrome application. Achieved in one test run.
  Selenium framework: Updates required to map …

Removal of a field
  Provar (avg 5 mins): Provar detected the change automatically from metadata and alerted the tester through the Problems View within the application. The field was removed declaratively through the Provar UI before the test run.
  Selenium framework (up to 60 mins*): Removal identified after execution of the test and analysis of failures. More significant troubleshooting is anticipated when the change is not communicated to the tester. Updates required to update code and retest. Iterations …

Total for Section
  Provar: 0:25 (h:mm)
  Selenium framework: 2:– 3:00 (h:mm)*

Visualforce Maintenance Activities

Cosmetic change that updates the Salesforce-generated id of a field (j_12:j_13:…)
  Provar (avg 0 mins): No update required due to Provar's native Visualforce mapping capability.
  Selenium framework: Updates required to update code and retest. Iterations often required.

Minor repositioning of a field
  Provar (avg 0 mins): No update required due to Provar's native Visualforce mapping capability.
  Selenium framework: Updates required to remap value …

Update of a field label
  Provar (avg 0 mins): No update required due to Provar's native Visualforce mapping capability.
  Selenium framework: Updates required to remap value …

Renaming of a variable inside APEX code for a field
  Provar (avg 10 mins): Locator edited by setting a breakpoint in Test Builder, Provar's Chrome application. Achieved in one test run.
  Selenium framework: Updates required to remap value …

Addition of a new field
  Provar (avg 10 mins): Field added by setting a breakpoint in Test Builder, Provar's Chrome application. Achieved in one test run.
  Selenium framework: Updates required to remap value …

Removal of a field
  Provar (avg 10 mins): Field removed declaratively through the Provar UI.
  Selenium framework: Updates required to remap value …

Total for Section
  Provar: 0:30 (h:mm)
  Selenium framework: 3:00 (h:mm)

Overall Total
  Provar: 0:55 (h:mm)
  Selenium framework: 5:– 6:00 (h:mm)*

* It was observed that effort could increase significantly in the Selenium framework when relevant changes were not communicated to the tester, requiring the tester to troubleshoot the root cause. This troubleshooting time cannot be estimated precisely, so a range has been given. Note: this observation applies to the Selenium framework only. Provar's metadata integration was seen to proactively alert the tester when problems caused by metadata changes were likely to lead to test failure. This allowed the tester to address changes directly.
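Much of the Selenium-side maintenance above comes down to editing locators scattered through recorded scripts and then recompiling and retesting. One common way to contain that cost in hand-coded frameworks (not something described in the study itself) is to centralise locators in page-object classes, so a renamed or repositioned field becomes a one-line change. A minimal, hypothetical sketch:

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;

// Page-object style class: keeping locators in one place means a renamed or
// moved field is a single edit rather than changes scattered across every
// recorded script. The locator below is hypothetical.
public class OpportunityEditPage {

    // Single definition of the locator used by all tests for this field.
    private static final By AMOUNT = By.xpath(
            "//label[normalize-space()='Amount']/ancestor::tr//input");

    private final WebDriver driver;

    public OpportunityEditPage(WebDriver driver) {
        this.driver = driver;
    }

    public void setAmount(String value) {
        driver.findElement(AMOUNT).clear();
        driver.findElement(AMOUNT).sendKeys(value);
    }
}
```

Even with this structure, each metadata change still requires a developer to edit code, recompile and rerun, which is the effort captured by the maintenance timings above.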
Test Suite Integration

Integration of test suites was not within the scope of the comparison test scenarios. The following observations are based on an understanding of the two solutions' capabilities developed through the earlier test phases.

Summary of Results: Test suite integration is not recommended with a Selenium framework owing to the time and effort required to deliver and maintain it. The Provar tool shows better out-of-the-box capabilities for test suite integration, including email and database testing, web services and running via continuous integration systems.

It was observed that these test suite integration scenarios could be programmed manually with the Selenium framework. However, it is anticipated that the Selenium framework would require significant time and effort to deliver and maintain such functionality. Many of these scenarios, while technically possible within a custom framework, would not be advisable from an ROI perspective (a sketch of one such hand-coded check appears at the end of this section).

In Provar's case, however, many different integration capabilities were observed out of the box while using the tool, including email and database testing, web services and running via continuous integration systems. For the purposes of the study it was observed that many more test scenarios would now be in scope for test automation. The following are available with Provar out of the box or with minimal configuration:

- Email testing (MS Exchange or Gmail)
- Database testing (Oracle, MySQL, PostgreSQL)
- Web services (REST and SOAP)
- File reading and writing
- Messaging systems (WebSphere MQ)
- Running via continuous integration systems, e.g. Jenkins
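To give a sense of what "database testing" means when it has to be hand-built into a Selenium framework, the sketch below uses plain JDBC to verify that an expected record exists. The connection URL, table and column names are hypothetical and not drawn from the study; a framework would need to write, parameterise and maintain code along these lines for each such check, whereas Provar's database integration was observed as an out-of-the-box capability.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class DatabaseCheck {

    // Verifies that an integration wrote the expected record to a reporting
    // database. The JDBC URL, credentials and schema are placeholders.
    static boolean opportunityExported(String jdbcUrl, String user, String password,
                                       String opportunityName) throws Exception {
        String sql = "SELECT COUNT(*) FROM exported_opportunities WHERE name = ?";
        try (Connection conn = DriverManager.getConnection(jdbcUrl, user, password);
             PreparedStatement stmt = conn.prepareStatement(sql)) {
            stmt.setString(1, opportunityName);
            try (ResultSet rs = stmt.executeQuery()) {
                return rs.next() && rs.getInt(1) > 0;
            }
        }
    }
}
```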
Conclusions

The company sought a comparison study after encountering the following challenges with Selenium:

- Java developers were required to create and maintain tests
- UI test failures were common due to broken locators
- Significant test script updates were required when testing a new environment, testing new deployments or refreshing an existing environment

Through evaluation of the five test cases, Provar was found to be between four and six times faster than the Selenium framework across test case authoring, debugging, test execution and test maintenance activities. Provar also displayed promising integration capabilities. In addition, the tool requires little to no development capability to create and maintain tests.

This study concludes that the company's adoption of Provar would successfully address the challenges encountered with Selenium and provide a beneficial return on investment. Adoption of the tool is recommended.

Want to know more?

Want to find out how Provar can help your company reduce testing time and drive better ROI? Get in touch today for more details or to arrange a demo.