Performance Testing and Evaluation of Web Applications Using WAPT Pro




ISSN (Online): 2320-9801 ISSN (Print): 2320-9798 (An ISO 3297: 2007 Certified Organization) Vol. 3, Issue 7, July 2015

Performance Testing and Evaluation of Web Applications Using WAPT Pro

Dr. Deepak Dagar 1, Dr. Amit Gupta 2
Assistant Professor, Department of Management, Maharaja Agrasen Institute of Management Studies, Rohini, New Delhi, India 1
Associate Professor, Department of Management, Maharaja Agrasen Institute of Technology, Rohini, New Delhi, India 2

ABSTRACT: Web applications are widely known as the building blocks of typical service-oriented applications, and the performance of such a system depends mainly on the components of the web application. Testing a web application means finding errors in its content, function, usability, navigability, performance, capacity, and security. Performance testing is used to determine the responsiveness, throughput, reliability, and/or scalability of a system under a given workload. This paper presents performance testing features, types, and tools for testing web application performance.

KEYWORDS: TPS, TTPS, CPS, PDPS.

I. INTRODUCTION

1. Performance Testing
Performance testing is performed to determine the degree to which a system or component accomplishes its designated functions within given constraints on processing time and throughput rate. The purpose of the test is to measure characteristics such as response times, throughput, or the mean time between failures (for reliability testing).

II. PERFORMANCE TESTING TOOL

A performance testing tool usually has two main facilities: load generation and test transaction measurement. Load generation can simulate either multiple users or high volumes of input data. During execution, response time measurements are taken from selected transactions, and these are logged. Performance testing tools normally provide reports based on test logs, as well as graphs of load against response times.
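As a minimal illustration of the measurement facility described above, the sketch below times a single transaction and records the bytes transferred. It is a sketch only; `fetch` is a placeholder callable standing in for one scripted request, and the example URL is not from the paper.

```python
import time
import urllib.request

def measure(fetch):
    """Time one transaction and return (response_time_s, bytes_received).

    `fetch` is any callable returning the response body as bytes, so the
    timing logic can wrap urllib, a browser driver, or a test stub.
    """
    start = time.perf_counter()
    body = fetch()
    elapsed = time.perf_counter() - start
    return elapsed, len(body)

# Example against a live server (URL is a placeholder):
# rt, size = measure(lambda: urllib.request.urlopen("http://example.com/").read())
```

Separating the timing logic from the transport makes the same measurement code reusable for any transaction type the test script defines.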
Features or characteristics of performance testing tools include support for: generating a load on the system to be tested; measuring the timing of specific transactions as the load on the system varies; measuring average response times; and producing graphs or charts of responses over time.

III. LOAD TEST

A load test is a test type concerned with measuring the behavior of a component or system under increasing load, e.g. the number of parallel users and/or the number of transactions, to determine what load the component or system can handle. While doing performance testing we measure some of the following:

Copyright to IJIRCCE DOI: 10.15680/ijircce.2015. 3774 6965
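The two facilities named in Section II, load generation and transaction measurement, can be sketched together: several threads play virtual users, each timing its transactions into a shared log. This is a minimal sketch, not any specific tool's implementation; `transaction` is a placeholder for one scripted business transaction.

```python
import statistics
import threading
import time

def run_load_test(transaction, virtual_users, iterations):
    """Exercise `transaction` from several threads (load generation) and
    log each response time (measurement), as a load testing tool would."""
    timings, lock = [], threading.Lock()

    def user():
        for _ in range(iterations):
            start = time.perf_counter()
            transaction()                     # one scripted transaction
            elapsed = time.perf_counter() - start
            with lock:
                timings.append(elapsed)       # the test log

    threads = [threading.Thread(target=user) for _ in range(virtual_users)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return {"samples": len(timings),
            "avg_response_s": statistics.mean(timings),
            "max_response_s": max(timings)}
```

A report generated from such a log is the basis for the graphs of load against response time that these tools produce.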

Characteristics (SLA)                       Measurement (units)
Response Time                               Seconds
Hits per Second                             #Hits
Throughput                                  Bytes per Second
Transactions per Second (TPS)               #Transactions of a Specific Business Process
Total Transactions per Second (TTPS)        No. of Transactions
Connections per Second (CPS)                #Connections/Sec
Pages Downloaded per Second (PDPS)          #Pages/Sec

Table 1.1

IV. OBJECTIVE OF PERFORMANCE TESTING

The objective of a performance test is to ensure that the application works perfectly under load. However, the definition of "perfectly under load" may vary between systems. By defining an initial acceptable response time, we can benchmark whether the application is performing as anticipated.

V. CHARACTERISTICS OF PERFORMANCE TESTING

The importance of Transaction Response Time is that it gives the project team/application team an idea of how the application is performing, measured in time. With this information, they can tell users/customers the expected time for processing a request, or understand how their application performed.

5.1 Transaction Response Time
The Transaction Response Time encompasses the time taken for the request made to the web server, thereafter being processed by the web server and sent to the application server, which in most instances will make a request to the database server. All of this is then repeated in reverse, from the database server to the application server to the web server and back to the user. Note that the time taken by the request or data in network transmission is also factored in. Fig 1.1 illustrates Transaction Response Time.

Transaction Response Time = (t1 + t2 + t3 + t4 + t5 + t6 + t7 + t8 + t9) × 2

Transaction Response Time allows us to identify abnormalities when performance issues surface.
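The formula above sums the nine one-way segment times and doubles them for the return path, so the computation is trivial to express. The segment values below are illustrative only, not measurements from the paper.

```python
def transaction_response_time(segment_times):
    """Transaction Response Time = (t1 + t2 + ... + t9) x 2: the nine
    one-way times across the browser, web, application and database
    tiers, doubled to cover the return path."""
    return 2 * sum(segment_times)

# Nine illustrative one-way segment times in seconds (not measured values):
t = [0.05, 0.02, 0.01, 0.08, 0.04, 0.08, 0.01, 0.02, 0.05]
trt = transaction_response_time(t)
```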
This will show up as a slow transaction response that differs significantly (or slightly) from the average Transaction Response Time. From there, we can drill down further by correlating other measurements, such as the number of virtual users accessing the application at that point in time and system-related metrics (e.g. CPU utilization), to identify the root cause. By bringing together all the data collected during the load test, we can correlate the measurements to find trends and bottlenecks between the response time, the amount of load generated, and the payload of all the components of the application. Using Transaction Response Time, the project team can better relate to their users, using transactions as a

form of language protocol that their users can comprehend. Users will be able to know whether transactions (or business processes) are performing at an acceptable level in terms of time. So, when looking at this metric, it is very important to consider what the application is intended to do and how it is intended to work.

5.2 Hits per Second
A virtual client can request an HTML page, image, file, etc. Testing the application for hits per second will tell you whether there is a possible scalability issue with the application. For example, if the stress on an application increases but hits per second do not, there may be a scalability problem in the application. It is possible that out of a hundred hits on the application, the application server actually answered only one, and all the rest were served from the web server's cache or another caching mechanism.

5.3 Pages per Second
Pages per second measures the number of pages requested from the application per second. The higher the pages per second, the more work the application is doing per second. Measuring an explicit request in the script, or a frame in a frameset, provides a metric on how the application responds to actual work requests. Thus, if a script contains a Navigate command to a URL, this request is considered a page. If the returned HTML includes frames, they will also be considered pages, but any other elements retrieved, such as images or JS files, will be considered hits, not pages. This measurement is key to the end user's experience of application performance.

5.4 Throughput
The amount of data transferred across the network is called throughput. It considers only the amount of data transferred from the server to the client and is measured in bytes/sec. This is an important baseline metric and is often used to check that the application and its server connection are working.
Throughput measures the average number of bytes per second transmitted from the application being tested to the virtual clients running the test agenda during a specific reporting interval.

5.5 Round Trips
Another useful scalability and performance metric is the testing of round trips. Round trips compare the total number of times the test agenda was executed against the total number of times the virtual clients attempted to execute the agenda. The more times the agenda is executed, the more work is done by the test and the application. The test scenario the agenda represents influences the round trips measurement. This metric can provide all kinds of useful information, from the benchmarking of an application to the end-user availability of a more complex application.

5.6 Hit Time
Hit time is the average time in seconds it takes to successfully retrieve an element of any kind (image, HTML, etc.). The time of a hit is the sum of the connect time, send time, response time, and process time. It represents the responsiveness or performance of the application to the end user. The more stressed the application, the longer it should take to retrieve an average element. But, like hits per second, caching technologies can influence this metric. Getting the most from this metric requires knowledge of how the application will respond to the end user.

5.7 Time to First Byte
This measurement is important because end users often consider a site malfunctioning if it does not respond fast enough. Time to First Byte measures the number of seconds it takes a request to return its first byte of data to the test software's load generator. It represents the time from when the user presses Enter in the browser until the user starts receiving results. Generally, more concurrent user connections will slow the response time of a request, but there are also other possible causes of a slowed response.
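Time to First Byte can be approximated with the standard library alone: `http.client`'s `getresponse()` returns once the status line and headers have arrived, which serves as a first-byte proxy. This is a sketch under that assumption; the host and path are placeholders, not endpoints from the paper's tests.

```python
import http.client
import time

def time_to_first_byte(host, path="/", port=80):
    """Approximate Time to First Byte: seconds from sending the request
    until the first bytes of the response (status line) arrive."""
    conn = http.client.HTTPConnection(host, port, timeout=10)
    try:
        start = time.perf_counter()
        conn.request("GET", path)
        resp = conn.getresponse()      # returns once the first bytes arrive
        ttfb = time.perf_counter() - start
        resp.read()                    # drain the body
        return ttfb
    finally:
        conn.close()
```

A dedicated load generator measures this at the socket level; the sketch trades some precision for portability.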
5.8 Page Time
Page time calculates the average time in seconds it takes to successfully retrieve a page with all of its content. This statistic is similar to hit time but relates only to pages. In most cases this is a better statistic to work with because it deals with the true dynamics of the application. Since not all hits can be cached, this data is more helpful for tracking a user's experience (positive or frustrated). It is important to note that in many test software tools you can turn caching on or off depending on your application's needs.
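The distinction between hit time (Section 5.6) and page time can be made concrete: page time sums the hit times of each page's elements before averaging. This is a sketch; the timing lists are illustrative, not measured values.

```python
import statistics

def average_hit_time(hit_times):
    """Hit Time: mean seconds to retrieve any single element."""
    return statistics.mean(hit_times)

def average_page_time(pages):
    """Page Time: mean seconds to retrieve a page with all its content,
    where each page is given as the list of its elements' hit times."""
    return statistics.mean(sum(hits) for hits in pages)

# Illustrative timings: two pages, one with three elements, one with two.
pages = [[0.12, 0.05, 0.08], [0.30, 0.10]]
```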

5.9 Failed Rounds / Failed Rounds per Second
During a load test it is important to know that the application requests perform as expected. Failed rounds and failed rounds per second measure the number of rounds that fail. Sometimes a basic image or missing-page error (HTTP 404 error codes) can be set to fail a round, which stops the execution of the test agenda at that point and restarts at the top of the agenda, thus not completing that particular round.

VI. WEB APPLICATION ARCHITECTURE

A web application is an application that can be accessed by users through a web browser or a specialized user agent. The browser creates HTTP requests for specific URLs that map to resources on a web server. The server renders and returns HTML pages to the client, which the browser can display. The core of a web application is its server-side logic. The application can contain several distinct layers; the typical example is a three-layered architecture comprising presentation, business, and data layers. Figure 1.2 illustrates a typical web application architecture with common components grouped by different areas of concern.

VII. PERFORMANCE TESTING DONE ON FLIPKART.COM

Summary (_flipkart.com_july5):

Successful sessions: 8          Failed sessions: 19
Successful pages: 178           Failed pages: 19
Successful hits: 6894           Failed hits: 14
KBytes sent: 3974               KBytes received: 85322
Avg response time, sec (with page elements): 0.98 (2.78)

[Per-minute charts for the 10-minute run recorded successful vs. failed sessions; the detailed per-interval figures are not recoverable from this copy.]

[Further per-minute charts for the Flipkart run recorded successful vs. failed pages and hits, errors on pages (hits) as a percentage of all completed pages (hits), load agent utilization on localhost (CPU roughly 10-20%, memory around 100-124 MB at 4-6%, network about 1%), and the overall performance, errors, and average bandwidth graphs; the detailed per-interval figures are not recoverable from this copy.]

[A final chart recorded the average response time (without page elements) for _flipkart.com_july5.page_1: http://www.flipkart.com/]

VIII. PERFORMANCE TESTING USING WAPT PRO (SNAPDEAL.COM)

Summary (_snapdeal_5july):

Successful sessions: 0          Failed sessions: 35
Successful pages: 286           Failed pages: 35
Successful hits: 3661           Failed hits: 415
KBytes sent: 1562               KBytes received: 1524
Avg response time, sec (with page elements): 1.95 (26.2)

[Per-minute charts for the 10-minute run recorded successful vs. failed sessions and pages; the detailed per-interval figures are not recoverable from this copy.]

[Further per-minute charts for the Snapdeal run recorded successful vs. failed hits, successful sessions, pages, and hits per second (about 0.48 pages and 6.1 hits per second overall), and errors on pages (hits) as a percentage of all completed pages (hits); the detailed per-interval figures are not recoverable from this copy.]

[Load agent utilization charts for the Snapdeal run on localhost showed CPU at roughly 14-38%, memory around 91-109 MB at 4-5%, and network utilization of about 1-2%, alongside the overall performance and errors graphs; the detailed per-interval figures are not recoverable from this copy.]

[Final charts recorded the average bandwidth and the average response time (without page elements) for _snapdeal.page_1: http://www.snapdeal.com/]

IX. RESULT AND CONCLUSION

Performance Criteria               Flipkart    Snapdeal
Successful Sessions                8           0
Failed Sessions                    19          35
Successful Pages                   178         286
Failed Pages                       19          35
Successful Hits                    6894        3661
Failed Hits                        14          415
KB Sent                            3974        1562
KB Received                        85322       1524
Average Response Time (sec)        0.98        1.95
Load Utilization: CPU (%)          15          30
Load Utilization: Memory (MB)      118         105
Load Utilization: N/w

The results of the comparative performance tests conducted using WAPT Pro show that both websites perform well in terms of overall performance, average response time, average bandwidth, and error rate. However, WAPT Pro shows better results for Flipkart in terms of successful hits, failed hits, average response time, and CPU utilization.