Summer Internship 2013, Group No. 4 - Enhancement of JMeter
Week 1 - Report 1
27/5/2013
Naman Choudhary

For the first week I was given two papers to study. The first one was "Web Service Testing Tools: A Comparative Study". There are commercial as well as open-source testing tools available in the market with different features and functionalities. The paper focused on the testing of web services built on two standards: SOAP (Simple Object Access Protocol) and WSDL (Web Service Description Language).

SOAP - SOAP Version 1.2 is a lightweight protocol intended for exchanging structured information in a decentralized, distributed environment. It uses XML technologies to define an extensible messaging framework, providing a message construct that can be exchanged over a variety of underlying protocols. The framework has been designed to be independent of any particular programming model and other implementation-specific semantics.

WSDL - WSDL is an XML format for describing network services as a set of endpoints operating on messages containing either document-oriented or procedure-oriented information. The operations and messages are described abstractly, and then bound to a concrete network protocol and message format to define an endpoint. Related concrete endpoints are combined into abstract endpoints (services). WSDL is extensible, allowing the description of endpoints and their messages regardless of what message formats or network protocols are used to communicate.

There are a number of open-source web service testing tools available in the software market. Although the core functions of these tools are similar, they differ in functionality, features, usability and interoperability.

(1) JMeter - JMeter is an open-source testing tool developed by the Apache Software Foundation (ASF) and distributed under the Apache License. The core function of JMeter is to load test client/server applications, but it can also be used for performance measurement.
Further, JMeter is also helpful in regression testing by facilitating the creation of test scripts with assertions.

Features of JMeter:-
- Supports full multithreading.
- Offers high extensibility due to the use of pluggable components, e.g. timers, samplers, etc.
- Offers a user-friendly Graphical User Interface (GUI).
- Offers a number of statistical reports and graphical analyses.

(2) soapUI - soapUI is an open-source testing tool for Service Oriented Architecture (SOA) and web service testing. It is developed by SmartBear Software and is provided freely under the GNU LGPL. soapUI facilitates quick creation of advanced performance tests and execution of automated functional tests.

(3) Storm - Storm is a free and open-source tool for testing web services, developed by Erik Araojo. Storm is written in F# and is distributed under the New BSD license.

Features of Storm:-
- Allows testing of web services written using any technology (.Net, Java, etc.).
- Supports dynamic invocation of web service methods, even those that have input parameters of complex data types.
- Facilitates editing/manipulation of raw SOAP requests.
- The GUI is very simple and user-friendly.
- Multiple web services can be tested simultaneously, which saves time and speeds up the testing schedule.

To test the representative testing tools, each tool needs to be configured to run the tests. The configuration includes installation, setting up the test environment, test parameters, test data collection, report analysis, etc. Each tool is configured to test the sample web services and gather test results. The tests were conducted on an Intel Core 2 Duo 2.0 GHz machine with 3 GB RAM, running Microsoft Windows 7 Ultimate, over a 2 Mbps DSL Internet connection.

Results of the comparison tests:-

On the basis of response times - Each tool has a different architecture and different internal processes for carrying out tasks, which provides a basis for comparing the tools in terms of response time. From the results, we observe that JMeter takes more time in responding to web services than the other two tools. Storm behaves better than JMeter but is not as promising as soapUI. In this test, soapUI outperforms the other two testing tools and can be regarded as the fastest tool in terms of response time. The reason for JMeter's large response time is that JMeter partitions each request packet into two and waits for the response to the first partition before sending the second, which increases the overall response time.

On the basis of average throughput - Throughput is the measure of the number of requests that can be served by a web service in a specified time period. Only JMeter and soapUI support this type of testing.
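The throughput measure described above reduces to a simple calculation; a minimal sketch in Python (the sample numbers are illustrative, not taken from the paper):

```python
# Throughput: number of requests served by the web service per unit time.
def throughput(num_requests, elapsed_seconds):
    """Requests served per second over the measured test window."""
    return num_requests / elapsed_seconds

# e.g. a hypothetical run: 1200 requests completed in a 60-second window.
print(throughput(1200, 60.0))  # 20.0 requests/second
```

Load testing tools report this figure per sampler or aggregated over the whole test plan.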
The results of the throughput test demonstrate that JMeter has better throughput than soapUI.

On the basis of number of kilobytes processed per second - The figures in the paper show that the number of kilobytes processed per second by JMeter is higher than by soapUI. This is consistent with the throughput attribute, as JMeter has the better throughput.

The second paper which I studied was "Scalability Factors of JMeter in Performance Testing Projects". Performance testing is, in general, testing performed to determine how a system performs in terms of responsiveness and stability under a particular workload. It can also serve to investigate, measure, validate or verify other quality attributes of the system, such as scalability, reliability and resource usage. Performance testing can be of the following types:-

1. Load testing - Load testing is the simplest form of performance testing. A load test is usually conducted to understand the behaviour of the system under a specific expected load. This load can be the expected concurrent number of users on the application performing a specific number of transactions within the set duration. This test gives the response times of all the important business-critical transactions. If the database, application server, etc. are also monitored, then this simple test can itself point towards bottlenecks in the application software.

2. Stress testing - Stress testing is normally used to understand the upper limits of capacity within the system. This kind of test is done to determine the system's robustness under extreme load and helps application administrators determine whether the system will perform sufficiently if the current load goes well above the expected maximum.
3. Endurance testing - Endurance testing essentially involves applying a significant load to a system for an extended, significant period of time. The goal is to discover how the system behaves under sustained use. The target is to detect potential memory leaks and monitor performance degradation, that is, to ensure that the throughput and/or response times after some long period of sustained activity are as good as or better than at the beginning of the test.

To carry out effective performance testing of web applications, one has to ensure that sufficiently powerful hardware is used to generate the required load levels. At the same time, one would prefer to avoid investing in unnecessarily expensive hardware. Currently the only benchmark available is how many virtual users the tool can support on different hardware configurations. The benchmark table in the paper indicates that the load generation capability of the tool (in terms of the number of virtual users per machine/agent) is a function of the underlying hardware configuration. In fact, the number of virtual users per machine that can be supported by any load generating tool is not only a function of the underlying hardware but also depends on various application-specific parameters and tool configurations. Virtual users are scripts that emulate the steps of a real user using the application.

Design of a load-generation tool - Here is how the load generation tool fires a set of requests for a single virtual user. It is important to note that the protocol engine fires requests synchronously:
- The protocol engine fires Request A to the application under test.
- The protocol engine waits for Response A before it proceeds with the execution of Request B.
- Once Response A is received by the protocol engine, it is stored in memory for analysis and processing. This response is discarded from memory only after Request B is sent.
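The synchronous firing sequence above can be sketched as a simple loop; send_request below is a placeholder for the protocol engine's blocking network call, not a real JMeter API:

```python
# Sketch of the synchronous protocol-engine loop for one virtual user.
# send_request() stands in for a blocking network call (hypothetical).
def send_request(name):
    return f"response to {name}"

def run_virtual_user(requests):
    held_response = None  # response kept in memory for analysis/processing
    timeline = []
    for req in requests:
        response = send_request(req)  # blocks: the next request cannot be
                                      # fired until this response arrives
        held_response = None          # the previous response is discarded only
                                      # now, after the next request has gone out
        held_response = response      # hold the new response for processing
        timeline.append((req, response))
    return timeline

print(run_virtual_user(["Request A", "Request B"]))
```

Because each virtual user holds one response in memory and blocks while waiting, the per-user memory/CPU footprint discussed next follows directly from this design.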
The maximum number of virtual users that can be simulated on given hardware is dictated by the average memory/CPU footprint of each virtual user. The memory/CPU footprint is in turn affected by the application response and the complexity of the script to be simulated.

Scalability factors:- The various factors that affect the scalability of any load generation tool can be categorized as follows:
1. Application-specific factors: average size of response, average response time, underlying protocol.
2. Load generation tool related: complexity of client-side processing (i.e. the test script), load generating tool architecture and configuration.
3. Hardware configuration of the load client (the machine hosting the load generating tool).

Effect of response time:- The graph in the paper depicts the variation of the optimal number of virtual users across response time values, for a constant response size of 20 KB. The optimal number of virtual users increases with increase in response time.

Effect of response size:-
Application response size has a massive effect on the load generation capabilities of JMeter. The optimal number of virtual users drops from around 500 to a measly 115 when the application response size increases from 20 KB to 100 KB.

Effect of protocol change:- A considerable decrease in the load generation capability of JMeter is observed when the underlying protocol changes from HTTP to HTTPS. The load generation capability decreases by 50% or more when the protocol is HTTPS (with a simple script complexity).

Estimating the number of load generators required to simulate the expected load levels is an important activity in the performance test strategy and planning phase. The number of load generators required can be calculated as:

    number of load generators = (total virtual users that need to be simulated) / (max virtual users that can be simulated per load generator)

I would like to conclude this week's report. From what I have studied this week, the possible improvement for JMeter on the basis of these two papers is the improvement of response time, as we have seen that JMeter outperforms the other open-source tools in every other department.
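As a closing illustration, the load-generator estimate discussed above amounts to a rounded-up division; a minimal sketch (the numbers are hypothetical, not from the paper):

```python
# Load generators needed = ceil(total virtual users / max users per generator).
import math

def load_generators_needed(total_virtual_users, max_users_per_generator):
    """Round up: a fractional generator still requires a whole machine."""
    return math.ceil(total_virtual_users / max_users_per_generator)

# e.g. simulating 2000 virtual users when one machine sustains about 500:
print(load_generators_needed(2000, 500))  # 4
```

The rounding up matters in planning: simulating 2100 users at 500 per machine needs 5 machines, not 4.2.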