CHAPTER 7 RESULT ANALYSIS AND STATISTICS

7.1 Introduction to manual vs automated testing




Testing is considered the most crucial part of the software development life cycle: it lets users measure quality attributes such as the correctness and robustness of a particular web service. Manual testing is tiresome because a great deal of time is spent on the analysis, identification and execution of test cases; tools are therefore needed to automate test-case generation and execution for web services. Many tools on the market generate test cases automatically and help the user analyze request/response pairs. Automated testing concentrates on generating test cases correctly and then executing them, which saves time and effort. It can also be viewed as model-based testing: the basic functional requirements are modelled first, and test cases are then generated automatically from the well-framed model. In manual testing, the observed results are compared against the expected behaviour and the observations are recorded, whereas in automated testing all the cases are pre-defined. Once the comparison is over, it is easy to extend the functionality of an automated suite, which is not practical with manual testing.

7.2 How to perform web services testing

Testing web services from the client side involves:
1. Creating a Java project.
2. Choosing the web services under consideration.
3. Selecting a web service client.
4. Creating a web service project.
5. Creating the classes.

The information related to the client is taken as input to the web service. Figure 7.1 a shows the interface, which acts as the contract between the consumer and the producer. At the component level, both black-box and white-box testing come into the picture and are necessary for each function. The practical test cases are designed using boundary value analysis, path testing, forced-error testing and equivalence partitioning. In Eclipse, a web service client can be created via File > New > Other > Web Services > Web Service Client. The following interface is generated by the Web Service Client wizard.

Figure 7.1 a: Calculator.java - interface

Client-side testing is done by choosing the conceptual web service and the web service client. The Java file Calculator.java describes the functionality of the web service

which acts as the client. To create a client from the WSDL, you have to generate both a configuration file and the Java files from the WSDL. The client configuration describes the bindings that allow the consumer to communicate with the host system. Figure 7.1 a shows Calculator acting as the interface, with java.lang.String appearing as a parameter and return type. Strings are objects, instances of the class java.lang.String, and their character indexing starts from 0 (zero). The implementation is shown in Figure 7.1 b.

Figure 7.1 b: Calculator.java - Implementation
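Since the figures themselves are not reproduced here, the following is a minimal sketch of what such a JAX-RPC style calculator interface and a trivial local implementation might look like. The method names follow the operations listed for Calculator.wsdl in Section 7.3; everything else (parameter types, class names) is illustrative rather than the generated code.

```java
import java.rmi.Remote;
import java.rmi.RemoteException;

// Sketch of a JAX-RPC style service endpoint interface: in a generated
// client these methods mirror the WSDL operations.
interface Calculator extends Remote {
    int add(int a, int b) throws RemoteException;
    int subtract(int a, int b) throws RemoteException;
    int multiply(int a, int b) throws RemoteException;
    int divide(int a, int b) throws RemoteException;
}

// Trivial local implementation, used here only to exercise the interface.
public class CalculatorImpl implements Calculator {
    public int add(int a, int b)      { return a + b; }
    public int subtract(int a, int b) { return a - b; }
    public int multiply(int a, int b) { return a * b; }
    public int divide(int a, int b)   { return a / b; }

    public static void main(String[] args) {
        CalculatorImpl calc = new CalculatorImpl();
        System.out.println(calc.add(2, 3));      // 5
        System.out.println(calc.divide(10, 2));  // 5
    }
}
```

In the generated client, the implementation would marshal each call into a SOAP request instead of computing locally.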

Figure 7.2: CalculatorProxy.java

Figure 7.2 shows CalculatorProxy.java, a proxy class whose type is constructed at run time. A proxy class is usually declared public, and it implements only those interfaces that are specified at the time of its creation. javax.xml.rpc.Stub is the base interface for all classes of type Stub; the javax.xml.rpc package also contains interfaces such as Call and Service, and holds the core JAX-RPC APIs for the client.
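The runtime-proxy idea can be illustrated with the standard java.lang.reflect.Proxy API: the proxy class is created at run time and implements exactly the interfaces named at creation. The Greeter interface below is hypothetical; a real JAX-RPC stub would marshal the invocation into a SOAP call inside the handler.

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;

// Illustrates the runtime-proxy idea behind generated stubs.
interface Greeter {
    String greet(String name);
}

public class ProxyDemo {
    public static Greeter makeProxy() {
        InvocationHandler handler = (proxy, method, args) -> {
            // A real stub would marshal the call into a SOAP request here.
            return "Hello, " + args[0];
        };
        return (Greeter) Proxy.newProxyInstance(
                Greeter.class.getClassLoader(),
                new Class<?>[] { Greeter.class },  // interfaces fixed at creation
                handler);
    }

    public static void main(String[] args) {
        Greeter g = makeProxy();
        System.out.println(g.greet("world"));  // Hello, world
    }
}
```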

Figure 7.3: CalculatorServiceLocator

The Service Locator is a pattern used to look up service components such as Enterprise JavaBeans (EJBs) and Java Message Service (JMS) destinations. J2EE clients communicate with these components and would otherwise each have to create them, although some clients are capable of locating their service components themselves. To reduce code complexity, several clients can reuse a single service locator. This behaviour is shown in Figure 7.3.
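A minimal sketch of the caching idea behind the Service Locator pattern follows. The real CalculatorServiceLocator is generated code; here the registry and suppliers are illustrative stand-ins for the JNDI lookups a J2EE locator would perform.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Supplier;

// Sketch of the Service Locator pattern: lookups are performed once and
// cached, so several clients can reuse the locator instead of each
// creating its own service components.
public class ServiceLocator {
    private final Map<String, Object> cache = new HashMap<>();
    private final Map<String, Supplier<Object>> factories = new HashMap<>();

    // Register how a named service is created (a real locator would do a
    // JNDI lookup for an EJB home or a JMS destination here).
    public void register(String name, Supplier<Object> factory) {
        factories.put(name, factory);
    }

    public Object lookup(String name) {
        return cache.computeIfAbsent(name, n -> factories.get(n).get());
    }

    public static void main(String[] args) {
        ServiceLocator locator = new ServiceLocator();
        locator.register("calculator", () -> new Object());
        Object first = locator.lookup("calculator");
        Object second = locator.lookup("calculator");
        System.out.println(first == second); // true: cached, not re-created
    }
}
```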

Figure 7.4: CalculatorSoapBindingStub.java

SoapBindingStubs are usually generated automatically from the WSDL; note that not every SOAP toolkit provides these classes. They are client-side stubs: for every binding in the WSDL, WSDL2Java generates a corresponding SoapBindingStub.java, which

is shown in Figure 7.4. It is used as the stub on the client side; Apache Axis provides the WSDL2Java tool for generating these stubs.

Figure 7.5: Testing.java

In Figure 7.5 the class defined is Testing, and the proxy is created with new CalculatorProxy(). To communicate with the web service APIs, the client uses this proxy: the general idea of a proxy client is that it lets the web service APIs be invoked as if they were local methods, while the service interfaces themselves are described by WSDL files. setEndpoint() is a method of the proxy class through which the URL of a particular web service can be changed. Normally the endpoint URL points to localhost; if you do not set it, the URL from the WSDL is used.

7.3 Understanding test results from TCG

Figure 7.6: Calculator.wsdl Interface design

An interface contains methods that have to be implemented, and it is implemented by a class that defines all of those methods. Figure 7.6 shows the interface design of the Calculator.wsdl file: Calculator itself acts as the interface and contains the operations add, subtract, multiply and divide. An interface is thus a collection of method declarations with empty bodies.

7.4 Statistics for response time analysis

We categorize the operations into Lower, Medium, Medium Complex, Complex and High Complex. Consider the following example: Figure 7.7 shows the response time for an add request in the Calculator program. The response time is the amount of time taken to respond after a request is submitted for processing.

Figure 7.7: Response time for request add operation
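Response times such as those reported below can in principle be measured by timestamping a request before and after it completes. This is a minimal sketch, with a sleep standing in for the actual service call:

```java
// Minimal sketch of how a response time such as the 726 ms reported in
// this section can be measured: timestamp before and after the request
// and take the difference.
public class ResponseTimer {
    public static long timeMillis(Runnable request) {
        long start = System.nanoTime();
        request.run();                       // submit the request and wait
        long elapsedNanos = System.nanoTime() - start;
        return elapsedNanos / 1_000_000;     // convert to milliseconds
    }

    public static void main(String[] args) {
        // A 50 ms sleep stands in for a real SOAP round trip.
        long ms = timeMillis(() -> {
            try { Thread.sleep(50); } catch (InterruptedException e) { }
        });
        System.out.println("response time: " + ms + " ms");
    }
}
```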

We identify the above categories based on the number of attributes, assuming that the business logic involved in each operation is directly proportional to that number. The number of classified attributes is the combination of input and output attributes. The response time for 3 attributes is recorded as 726 ms when the business logic is minimal and there are no network hassles. For other operations, such as the Chennai emergency services, the response time is recorded as 936 ms (546 bytes) for the same 3 attributes, due to network hassles and heavier processing logic. This behaviour is shown in Figure 7.8.

Figure 7.8: Response time for request - (getfire)

Figure 7.8 shows the response time for the getfire request, which returns a fire station. The input given is latitude 35 and longitude 46, for which the fire station address is returned as 13.095446~80.164404~Ambattur Fire Station~ Ambattur Estate, Chennai ~ Near Ambattur police station ~ 600053 ~ 26251601. This is the fireresponse.

7.4.1 Load Testing

Load Testing - definition: the process of placing a load on a system and measuring its response is called load testing.

Figure 7.9: Response time for request - Currency Convert

Here you can observe the response time recorded for the Currency Convert request: 1513 ms (382 bytes). The request converts USD into INR (US Dollars to Indian Rupees), and the response is 55.945. Now let us create a test suite for this request; a collection of test cases is treated as a test suite. Add this test request to the test suite and give it a name, as shown in Figure 7.9.

Figure 7.10: Creating Test Suite

Figure 7.11: Naming the test case

Figures 7.10 and 7.11 show the creation of a test suite and the naming of the test case. When you click OK for the test suite, you are asked to enter a name for the test case; here we have given TS1 (Test Suite 1) and TC1 (Test Case 1). As soon as you click OK, the following window appears.

Figure 7.12: Adding request to test case

This is the Add Request to Test Case window, which specifies the options for adding the request to a test case. As you can see, only Add SOAP response assertion, Close request window and Show test case editor are selected. When you click OK, the first test step for the SOAP request Currency is created, as shown in Figure 7.12.

Figure 7.13: Running test case

You can run this test case as shown in Figure 7.13; after execution you will get the following screenshot.

Figure 7.14: Finished execution

After executing the test case you get the status FINISHED, and the response time taken to execute it is 2527 ms (see the bottom of the testing window).

Figure 7.15: Creating a new Load Test

Figure 7.15 shows how a load test can be performed for the test case just created. A load test is performed only after the test case has been executed. While load testing we need to keep in mind characteristic properties such as response time, size of the load (threads), memory utilization, CPU utilization and delay time.

Delay time: the time it takes to receive the first byte after a request is submitted for processing.

Figure 7.16: Naming Load Test

Figure 7.16 shows the name being specified for the load test, LD 1. When you click OK, a new load-testing window is created with the test steps; so far we have created only one test step. Load tests are usually performed under various testing environments with different tools, sometimes with several systems running multiple requests in parallel to measure the load placed on several machines.

Figure 7.17: Performing Load Test

Figure 7.17 shows the window for performing a load test, where the test steps can be run and the properties min, max, avg, cnt, tps, bytes, err and rat (times in ms) are reported.

Metrics and Parameters of Load Testing

The load test is usually performed with the following parameters:
1. TEST STEP
2. MIN
3. MAX
4. AVG
5. LAST
6. CNT
7. TPS
8. BYTES

9. ERR
10. RAT

TEST STEP - for each thread, the start-up delay that is set.
MIN, MAX, AVG - the minimum, maximum and average times the test step takes.
LAST - the last time for the test step.
CNT - the number of times the test step has been executed.
TPS - Transactions Per Second.
BPS - Bytes Per Second.
ERR - the number of assertion errors for the test step.
RAT - the failed-requests ratio (the percentage of requests that failed).

Figure 7.18: Load testing of Test Step 25% test coverage

Figure 7.18 shows the load test performed with 5 threads, the Simple strategy, a test delay of 1000 ms and a time limit of 60 seconds; the test coverage is 25%. When the strategy is set to Simple, the delay is configurable. In this context, the test step Currency has a CNT value of 35 within the predefined 60-second time limit, and a TPS of 2.34

which is small and is sometimes rounded to 3. TPS is calculated as the count divided by the time taken, and BPS as the number of bytes divided by the time taken; hence the BPS value is recorded as 889.

Figure 7.19: Load testing of Test Step 50% test coverage

In this context the TPS is 3.09, and the minimum and maximum response times are recorded as 629 ms and 1260 ms respectively. With 50% test coverage the TPS and BPS are recorded as 3 and 1177 respectively: about 3 transactions and 1177 bytes are processed per second. This is shown in Figure 7.19.
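The TPS and BPS formulas just given can be checked with a few lines of code; the request and byte counts below are round illustrative numbers rather than the values from the figures.

```java
// TPS = count / elapsed seconds; BPS = bytes / elapsed seconds.
public class LoadMetrics {
    public static double tps(long count, double seconds) {
        return count / seconds;
    }

    public static double bps(long bytes, double seconds) {
        return bytes / seconds;
    }

    public static void main(String[] args) {
        // 600 requests handled in a 60-second run:
        System.out.println(tps(600, 60));      // 10.0
        // 600000 bytes transferred in the same 60 seconds:
        System.out.println(bps(600_000, 60));  // 10000.0
    }
}
```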

Figure 7.20: Load testing of Test Step 100% test coverage

With 100% test coverage the TPS is recorded as 3, and the minimum and maximum response times as 622 ms and 1151 ms respectively; similarly the BPS is recorded as 1227. Between each batch of requests the test delay is 1000 ms. This behaviour is shown in Figure 7.20.

Figure 7.21: Load Test - Second run

Figure 7.21 shows all the properties of the second load-test run of the service, in which 398 entries took place in total. The test case was configured as before, with a 60-second limit and 5 threads; these predefined values can be altered before re-running the tests. The test case generator allows the user to run load tests with many threads, usually called (virtual) users; here there are 5. The appropriate number of virtual users depends on the memory, the response time of the target service and the CPU. Each thread processes the test case according to its own strategy and works on an exclusive copy of the test case and its steps, which avoids threading issues. Test cases are usually governed by timing constraints, so they can be started and stopped at any point in time, and several load tests can be performed in parallel. Once execution is finished, the statistics of the load test are collected and updated.
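The virtual-user model described above (each thread working on its own copy of the test case, with statistics aggregated after all threads finish) can be sketched as follows; the per-request work is a stand-in counter rather than a real SOAP call.

```java
import java.util.concurrent.atomic.AtomicLong;

// Sketch of the virtual-user model: each thread runs its own copy of
// the test case, and the statistics are aggregated once all threads
// have finished.
public class VirtualUsers {
    public static long run(int threads, int requestsPerThread) {
        AtomicLong completed = new AtomicLong();
        Thread[] users = new Thread[threads];
        for (int i = 0; i < threads; i++) {
            users[i] = new Thread(() -> {
                // Each virtual user works on its own copy of the steps.
                for (int r = 0; r < requestsPerThread; r++) {
                    completed.incrementAndGet(); // stand-in for one request
                }
            });
            users[i].start();
        }
        for (Thread t : users) {
            // Wait for every virtual user, then collect statistics.
            try { t.join(); } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }
        return completed.get();
    }

    public static void main(String[] args) {
        System.out.println(run(5, 10)); // 50
    }
}
```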

Figure 7.22 describes the third load-test run of the test case; Transactions Per Second (TPS) and Bytes Per Second (BPS) are recorded in their columns as 1 and 686. Following the formulas above, a test case that runs for 60 seconds and handles 600 requests gets a TPS of 10, and one that runs for 60 seconds and transfers 600000 bytes gets a BPS of 10000. In this case the run extends only to the pre-configured limit of 60 seconds. Since about two requests are submitted for processing per second, the step executes roughly twice a second, so the count is 122.

Figure 7.22: Load Test Third run

Further, it shows that the min, max and avg are recorded as 771, 8355 and 1,711 ms respectively. The CNT is 122 and the number of bytes is 46302. The TPS and BPS are recorded as 1.8 and 686 respectively, with a test delay of 1000 ms.

Statistical Graph for LD1

Figure 7.23: Statistics for LD1

Figure 7.23 shows the statistical graph for load test 1 (LD1). The thread count, error count and TPS are nominal and moderate, but the average time and BPS are higher (the blue and yellow representations); this is for the third run. Several options in this graph let the user update it at regular intervals; here the selected step covers all test steps. The graph is used to view the statistics over time and to identify abrupt changes. Setting the resolution to "data" determines how frequently the graph is updated.

Statistical History Graph for LD1

Figure 7.24: Statistics History for LD1

This graph shows the statistical history for LD1. The selected statistic is AVERAGE and the resolution is set to "data" so that the graph updates immediately, which also indicates how regularly updates are possible. We observed that the spread of values is very small, almost negligible, because there is only one test step; in this case we considered the statistical value over all steps. We can also compare the average time within the same test. When the number of threads does not change, even on the third run, there is no change in the response time (represented in black). This behaviour is shown in Figure 7.24.

Figure 7.25: Change in response times (20 threads)

In Figure 7.25 we observe a drastic change in the minimum, maximum and average times compared to the previous case, because the number of threads has been changed from 5 to 20. The response times change markedly, as do TPS and BPS, with an error rate of 1.

7.4.2 Security Testing

Security Testing - definition: a process used to provide assurance that a system protects its data and maintains its intended functionality. The security of a system depends on several factors such as authentication, verification and authorization.

Figure 7.26: Creating new Security Test

Figure 7.27: Naming Security Test ST 1

Figure 7.28: Security Test for Test Step

Figure 7.28 shows the security test for the created test step. As mentioned earlier, security testing covers properties such as verification, authentication and authorization; since no authentication has been created here, a warning message is shown that no security scans are available to run. With web services, security becomes a major problem because SOAP messages are usually transferred over the network in the clear, so anyone with a sniffer (a packet or protocol analyzer) can capture a SOAP message and study it. The best solution is to use HTTPS (HTTP over SSL, the Secure Sockets Layer) instead of HTTP so that the messages are encrypted. To implement this you have to install a certificate issued by a certificate authority on your server; we assume, and sometimes verify, that all certificates are trusted. A servlet-based SOAP platform normally knows nothing about SSL: it is the servlet container that takes care of encryption and decryption, so you have to configure the servlet container for SSL.
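As an illustration, setting up a servlet container for SSL typically amounts to adding an HTTPS connector to its configuration. The fragment below is a hedged sketch of a Tomcat-style connector; the port, keystore path and password are placeholders, and the exact attribute names vary between container versions.

```xml
<!-- Sketch of a Tomcat-style HTTPS connector. The keystore file and
     password are placeholders; the certificate installed in the
     keystore is the one issued by the certificate authority. -->
<Connector port="8443" protocol="HTTP/1.1"
           SSLEnabled="true" scheme="https" secure="true"
           keystoreFile="conf/server.keystore"
           keystorePass="changeit"
           clientAuth="false" sslProtocol="TLS" />
```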

Variation in response times for the gethospital request in the Chennai emergency services example.

Figure 7.29: Response time for request gethospital 1

Figure 7.29 shows that for the request gethospital the response time is initially recorded as 1815 ms, with 532 bytes transferred.

Figure 7.30: Response time for request - gethospital 2 (second run)

Figure 7.30 shows a response time of 3151 ms for the second run, with the same number of bytes, as there is no change in the number of threads initially considered.

Figure 7.31: Response time for gethospital request Resubmission (third run)

As shown in Figure 7.29, at the initial submission of the gethospital request the recorded response time is 1815 ms and the data transferred is 532 bytes; observe that the latitude and longitude inputs are 32 and 35 respectively. Figures 7.30 and 7.31 show the response times for the same gethospital request. The response time recorded in Figure 7.30 is 3151 ms with 532 bytes transferred, but when the same input is submitted again in the third run the recorded response time is 5360 ms, with the same number of bytes processed. This is shown in Figure 7.31.

From this we can say that the response times change radically while the number of bytes processed stays the same when requests are submitted with identical input parameters, even for the same number of attributes: the response to a submitted request depends not only on the number of attributes but also on heavy processing logic and network hassles.

7.5 Experimental environment

The experiments were run in a real-time environment with a 2.8 GHz/1.96 GHz processor, 4 GB DDR3 RAM and a 500 GB 7200 rpm hard disk. By real-time environment we mean that we tested both our own web services and third-party web services. Throughout the experiments we submitted randomly varied inputs in terms of the number of attributes, generating different variations of test results.

7.6 Experimental Results

The experimental results are presented here. First, as shown in Figure 7.32, the number of attributes is plotted on the X-axis and time in ms (milliseconds) on the Y-axis. As seen in the earlier Figures 7.7 and 7.8, the response time varies from input to input. The graph shows that from 2 or 3 attributes up to 99 attributes there is no major change in the response time.

Figure 7.32: Statistical graph 1 (legend: change in the response times)

Figure 7.33: Statistical graph 2 (legend: change in the response times)

As the number of attributes increases further, the response time increases with it, as seen in Figure 7.33; from the 100th attribute onwards there is a drastic change in the response time. Our framework handled the test requests well when they were submitted to the endpoint URL to generate the responses. Moreover, the response time depends sometimes on the number of attributes, and more often on the business logic and network hassles. The shaded portion of the figure shows the extent of the radical change in the response times.

Figure 7.34: Statistical graph 3 (legend: change in the response times)

Figure 7.34 shows that from 1000 to 10000 attributes the response times change radically, from 1600 to 10000 ms; the figure illustrates one example where the response times vary from run to run.

Figure 7.35: Pie Chart 1 - representation of number of attributes and response times

As shown in Figure 7.35, the pie chart relates the number of attributes taken as input to the corresponding response times. The green colour depicts inputs of 1 to 10 attributes; the recorded response time is negligible and barely visible because it merges with the yellow segment. The yellow colour shows 10 to 100 attributes, for which the recorded response time is very small. Similarly, the maroon colour shows 100 to 1000 attributes, for which the recorded response time is somewhat higher when

compared to the other inputs, perhaps between 1200 ms and 2000 ms; this behaviour can be seen in Figure 7.29. Finally, the blue colour shows that for 1000 to 10,000 attributes the response time lies between 2000 ms and 16000 ms.

Figure 7.36: Pie Chart 2 - Variation in response times

Figure 7.36 shows the variation in response times when larger numbers of attributes are considered compared to the previous pie chart. As shown, 5, 10, 20 and 100 are the percentages of the response times generated for 10, 100, 1000 and 10,000 attributes respectively.

7.7 Conclusion

We have observed that functional testing is made easy by our framework, and that the tester does not need any background knowledge of the underlying technology. Our framework tests whether any web service generates the right response when the right request is submitted. The response times change according to the threads, strategy and time limit: if the time limit exceeds the defined value the load test will fail, and if the number of threads is increased or the time limit changes, the minimum, maximum and average response times change automatically, as do the TPS and BPS values.