Web Application Performance Testing Using Reactive Based Framework




International Journal of Research in Computer and Communication Technology (www.ijrcct.org), Vol 4, Issue 2, February 2015. ISSN (Online) 2278-5841, ISSN (Print) 2320-5156.

Web Application Performance Testing Using Reactive Based Framework

Yogita M. Rasal, Department of Computer Engineering, K. J. Somaiya College of Engineering, rasal.yogita@gmail.com
Sangeeta Nagpure, Department of Information Technology, K. J. Somaiya College of Engineering, sangeetanagpure@somaiya.edu

ABSTRACT

Web application performance testing plays an important role in providing Quality of Service (QoS) and is essential for satisfying users. This paper presents performance testing of a web application using a reactive-based framework that reduces the cost and increases the efficiency of performance testing. The framework retrieves the web log from the server side and uses it to recover the user pattern there; metrics derived from the user's perspective are then used to find the usage pattern at the client side. Finally, test cases can be generated automatically by solving an optimization problem with an evolutionary algorithm.

Keywords: performance testing; testing framework; automated test case generation; web applications.

1. INTRODUCTION

Web application development has grown by leaps and bounds; every day a variety of new web applications become available on the Internet for public use. The main characteristic of web software products is that they eliminate geographical barriers. Because of their advantages, such as fast search for information, easy collection of information, and communication facilities, web applications are continuously evolving in order to provide better solutions. However, these advantages impose a demand for high availability, and as web applications become more complex, testing them has become complex as well.
To ease the difficulty of web application testing, automated tools and testing frameworks are now available for different aspects of testing, such as unit testing, functional testing, and load and performance testing. Performance testing is very important for improving the reliability and flexibility of web applications and thereby satisfying users. Because web application usage is enormous, traditional testing techniques are not suitable, for several reasons [4]:

1) Some metrics, such as the type of users, the number of concurrent users, and the access methods, need to be predicted, because real scenarios are difficult to simulate.

2) Since performance is closely related to user satisfaction, issues of reactivity (how the user reacts to different server response times) should be considered in performance testing.

3) Performance and scalability are the focus of system testing, because a large number of users will access a service of a distributed web application concurrently.

To address these problems, a reactive-based performance testing framework [4] is implemented here. It monitors and retrieves user patterns for a web application through web logs and generates performance tests automatically using an evolutionary algorithm.

2. DISCUSSION

Web application development has become a significant area. Continual availability is one of the advantages of web applications. The reliability of a web application can be defined as the probability of failure-free completion of web operations, so testing of web applications is most important, and performance testing in particular is essential for satisfying users and improving the reliability and flexibility of web applications. Nowadays a number of techniques are used for performance testing of web applications; the details are dispersed across several related fields.
2.1 Performance Testing Tools

Traditional commercial performance testing tools include Mercury, SilkTest, LoadRunner and IBM Rational Performance Tester, and there are open source tools such as JMeter, OpenSTA, Load, etc. These tools help

to predict system performance by simulating thousands of users who concurrently access the web application, monitoring the system status, and then finding performance problems and tuning the system based on the results. However, none of them supports automated performance testing. An analysis and comparison of several existing tools that facilitate load testing and performance monitoring was presented in [1], in order to find the most appropriate tools by criteria such as ease of use, supported features, and license.

2.2 Reactivity-Based Performance Testing

In research on reactivity-based performance testing of web applications, reactivity represents the way a user behaves according to the quality of service provided. It is important to consider the user's perspective, because end users are the main determinant of the success of a web site. The work in [2] uses the USAR model, a workload generation model that considers user reactivity to web sites, to evaluate the performance of a web application. A reactivity-based scheduling mechanism that assigns priority according to user behavior has been designed and evaluated [3]; the authors also propose a hybrid admission control and scheduling mechanism that combines both reactive approaches. A reactive-based framework for automated performance testing is proposed in [4]: it monitors and retrieves user patterns for web applications through web logs at the server side; metrics derived from the user's perspective are then applied, and the usage pattern at the client side is obtained; finally, test cases are generated automatically by solving an optimization problem with an evolutionary algorithm.

Figure 1. Test case generation framework (web logs → usage pattern analysis → metrics from user perspective → automated test case generation → result analysis).

2.3 Deriving User Patterns from Web Logs

The web log is the best repository of information about usage; it keeps a record of even the smallest event.
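As a concrete illustration of deriving a user pattern from a web log, the sequence of request types in one user session can be turned into a first-order Markov model of transition probabilities. This is only a minimal sketch of the kind of analysis discussed in this section; the request names and the session are hypothetical.

```python
from collections import defaultdict

def transition_probabilities(requests):
    """Estimate a first-order Markov model from an ordered list of
    request types: P(next | current), from observed transition counts."""
    counts = defaultdict(lambda: defaultdict(int))
    for cur, nxt in zip(requests, requests[1:]):
        counts[cur][nxt] += 1
    model = {}
    for cur, nxts in counts.items():
        total = sum(nxts.values())
        model[cur] = {n: c / total for n, c in nxts.items()}
    return model

# Example session, as it might be extracted from a (hypothetical) web log:
session = ["Login", "Search", "Browse", "Search", "Browse", "Insert"]
model = transition_probabilities(session)
# Both visits to "Search" were followed by "Browse":
# model["Search"]["Browse"] == 1.0
```

In a full analysis the sessions would first be reconstructed from the raw log (grouping entries by user and time), and the per-session transitions aggregated into one model.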
A log file gives the ability to collect human-computer interaction data on a number of users over time. A log file analysis tool, Hawk, is presented in [5], together with a particular analysis technique, based on Markov chain analysis, that can be used to derive high-level software usage patterns. Log Explorer [6] was a web usage mining tool that analysed web log data; log file data of the NASA web server was used to extract useful patterns. Before pattern recognition is applied, the log file data is preprocessed to remove unwanted entries so that the extracted patterns are useful and relevant; the usage patterns extracted in this way help administrators manage the website's resources better. A method for discovering helpful patterns in the online server log file of an educational institute was discussed in [7].

2.4 Automated Test Case Generation for Performance Testing

There is much research on automated performance testing. One such work proposes a new reactivity-based performance testing framework [4] and provides a complete approach to generating test cases automatically from original web logs. An automated model-based testing technique [8] has been proposed to test a web application from its structural model. The Godzilla system presented in [9] was a fully automated test data generator for a fault-based testing strategy; it not only demonstrated that test data generation for mutation testing can be automated beyond the level of previous research and commercially available systems, but also provided a vehicle for integrating many current testing techniques. A set of related search-based testing algorithms [10] was introduced, adapted for web application testing and augmented with static and dynamic seeding. An activity-oriented approach to engineering automated tests for web applications was described with reference to a web application developed for grant funding agencies [11]. A method to generate performance test cases automatically based on genetic algorithms, for any system consisting of composite services, has been proposed in [12]; it considers the user's experience in the performance test model.
3. PROPOSED SYSTEM

Reactivity represents the way a user behaves according to the quality of service provided, and considering the user's perspective is important for the success of any web site. The reactive-based framework [4] is shown in Fig. 1; it performs test case generation in four phases, as follows.

1. Web log generation and processing. In this phase we develop a website for a construction company and launch it on a server, where it is used by a number of users. We then take the web log file from the server and process it to keep only the entries that are relevant to the testing process.

2. Deriving usage patterns from web logs. The server-side usage pattern diagram for the construction company application is derived as follows. Suppose we have request logs for each category of users, where each request log L includes the fields UID, RequestType, RequestTime and ExecutionTime. UID identifies the customer submitting the request and can be obtained from cookies, dynamic URLs, or other techniques; RequestType is the type of request; RequestTime is the time when the request arrives at the site; ExecutionTime is the execution time of the request. The execution time is not normally recorded in the HTTP log, but it can be recorded at the server side as follows:

    private int starttime;
    double etime;
    string reqtype = "Login";
    double thinktime;
    DateTime dt;
    SqlConnection con;

    protected void Page_Init(object sender, EventArgs e)
    {
        starttime = Environment.TickCount;
    }

    protected void Page_Load(object sender, EventArgs e)
    {
        // Connection string "A" is read from the configuration file.
        string str = ConfigurationManager.ConnectionStrings["A"].ToString();
        con = new SqlConnection(str);

        // Execution time in seconds, measured from page init to page load.
        int endtime = Environment.TickCount;
        double executiontime = (double)(endtime - starttime) / 1000.0;
        etime = executiontime;
        dt = DateTime.Now;
    }

A usage pattern model at the server side is derived from the web logs as shown in Fig. 2.
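The request-log records described above (UID, RequestType, RequestTime, ExecutionTime) could be parsed on the analysis side with a sketch like the following. The comma-separated layout and the timestamp format are assumptions for illustration, not part of the framework.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class LogEntry:
    uid: str
    request_type: str
    request_time: datetime
    execution_time: float  # seconds

def parse_line(line):
    """Parse one (assumed comma-separated) log record of the form
    UID,RequestType,RequestTime,ExecutionTime."""
    uid, rtype, rtime, etime = line.strip().split(",")
    return LogEntry(uid, rtype,
                    datetime.strptime(rtime, "%Y-%m-%d %H:%M:%S"),
                    float(etime))

entry = parse_line("u42,Login,2015-02-01 10:15:30,4.0")
# entry.request_type == "Login", entry.execution_time == 4.0
```

A list of such entries, grouped by UID and sorted by request_time, is the raw material from which the usage pattern model is built.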
The squares in Fig. 2 represent the RequestTypes, and a transition between request types carries the probability that one RequestType is followed by another. E represents the average execution time for each request and T the average think time before the next request at the server side.

Table 1: Average execution and think time at the server side

Request type | Avg Server Execution Time (Es) | Avg Server Think Time (Ts)
Login        | 4s                             | 6s
Insert       | 6s                             | 3s
Search       | 2s                             | 1s
Browse       | 5s                             | 2s

Figure 2. Usage pattern model at server side

3. Metrics based on the user's perspective. Metrics from the user's perspective are useful rules for performance testing, because user satisfaction is decisive for the success of a web site. Client-side behavior is derived because the execution time and think time seen at the client differ from those at the server; they are calculated as follows:

Tc = Ts - 2*Tn    (1)

where Ts is the server-side think time, Tc the client-side think time, and Tn the time for data transmission on the network;

Ec = Es + 2*Tn    (2)

where Es is the server-side execution time and Ec the client-side execution time. For example, with an assumed network transmission time Tn = 0.5s, the Login request of Table 1 gives Ec = 4 + 2(0.5) = 5s and Tc = 6 - 2(0.5) = 5s.

The give-up rate is a metric that reflects user satisfaction: since response time is usually the main reason for giving up waiting for a response, the give-up rate found in the web log can be taken as the user's view of the response time. The Performance Sensitivity Level (PSL) expresses the degree to which a request type is sensitive to response time. With these metrics, the server-side model of Fig. 2 can be transformed into a client-side usage pattern model, as shown in Fig. 3.

Table 2: Average execution and think time at the client side

Request type | Avg Client Execution Time (Ec) | Avg Client Think Time (Tc)
Login        | 2                              | 2
Insert       | 11                             | 5
Search       | 1                              | 4
Browse       | 11                             | 4

Figure 3. Usage pattern model at client side

4. Automated test case generation. Performance testing exercises how well the application handles load; the main goal of a performance test on a web application is to find out why the application is slow when accessed over the network, and thereby to help improve the access rate. Traditional testing techniques have difficulty completing this task; an evolutionary technique is an option that can be used for automated testing.
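A minimal sketch of such an evolutionary search is shown below. The encoding (a candidate test case as a mix of concurrent users per request type) and the fitness function (distance of the total load from a target level) are illustrative assumptions, not the paper's actual model, which would score candidates against measured response times and the client-side usage pattern.

```python
import random

random.seed(1)

# A candidate test case: how many concurrent users run each request type.
REQUEST_TYPES = ["Login", "Insert", "Search", "Browse"]
TARGET_LOAD = 100  # assumed target total number of concurrent users

def fitness(mix):
    # Stand-in objective: reward mixes whose total load is near the target.
    return -abs(sum(mix) - TARGET_LOAD)

def crossover(a, b):
    # One-point crossover between two parent mixes.
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(mix):
    # Nudge one request type's user count, keeping it non-negative.
    mix = list(mix)
    i = random.randrange(len(mix))
    mix[i] = max(0, mix[i] + random.randint(-5, 5))
    return mix

# Evolve a population of candidate workload mixes with elitism.
population = [[random.randint(0, 50) for _ in REQUEST_TYPES]
              for _ in range(20)]
for _ in range(100):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]
    children = [mutate(crossover(random.choice(parents),
                                 random.choice(parents)))
                for _ in range(10)]
    population = parents + children

best = max(population, key=fitness)
# best now encodes a user mix whose total load is close to TARGET_LOAD
```

Because the top half of each generation is carried over unchanged, the best candidate never gets worse; in the real framework, evaluating fitness would mean executing the generated test plan and measuring response times.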
The evolutionary technique treats test case generation as a numerical optimization problem; it can help to generate high-quality test plans and reduce cost by minimizing manual work. In future work we will implement automated test case generation using a genetic algorithm.

5. CONCLUSION

In this paper we have implemented a reactive-based framework for performance testing. First, a web

application is developed. Usage patterns are then retrieved through the application's web logs at the server side. This is followed by deriving two metrics from the user's perspective: the PSL value and the give-up rate. With the help of these two metrics, the usage pattern at the client side is derived; these are the basic inputs to our automated test case generation model, which in future work we will implement using an evolutionary algorithm, namely a genetic algorithm.

6. REFERENCES

[1] J. Krizanic, A. Grguric, M. Mosmondor and P. Lazarevski, "Load Testing and Performance Monitoring Tools in Use with Ajax Based Applications", MIPRO 2010, May 24-28, 2010, Opatija, Croatia.
[2] Leonardo Silva, Adriano Pereira, Wagner Meira Jr., "Reactivity-Based Quality of Service Strategies for Web Applications", 2007 International Symposium on Applications and the Internet (SAINT'07), 2007, p. 4.
[3] A. Pereira, L. Silva, W. Meira Jr., W. Santos, "Assessing the Impact of Reactive Workloads on the Performance of Web Applications", 2006 IEEE International Symposium on Performance Analysis of Systems and Software, pp. 211-220.
[4] Tiantian Gao, Yujia Ge, Gongxin Wu and Jinlong Ni, "A Reactivity-Based Framework of Automated Performance Testing for Web Applications", 9th International Symposium on Distributed Computing and Applications to Business, Engineering and Science.
[5] Mark Guzdial, "Deriving Software Usage Patterns from Log Files", Georgia Institute of Technology, Atlanta, GA.
[6] Nanhay Singh, Achin Jain, Ram Shringar Raw, "Comparison Analysis of Web Usage Mining Using Pattern Recognition Techniques", International Journal of Data Mining & Knowledge Management Process (IJDKP), Vol. 3, No. 4, July 2013.
[7] Sana Siddiqui, Imran Qadri, "Mining Web Log Files for Analytics and Usage Patterns to Improve Organization", International Journal of Advanced Research in Computer Science and Software Engineering, Volume 4, Issue 6, June 2014, ISSN: 2277-128X.
[8] Hamideh Hajiabadi and Mohsen Kahani, "An Automated Model Based Approach to Test Web Applications Using Ontology", IEEE Conference on Open Systems (ICOS 2011), September 25-28, 2011, Langkawi, Malaysia.
[9] A. Jefferson Offutt, "An Integrated Automatic Test Data Generation System", Journal of Systems Integration, Vol. 1, No. 3, November 1991, pp. 391-409, Kluwer Academic Publishers.
[10] Nadia Alshahwan and Mark Harman, "Automated Web Application Testing Using Search Based Software Engineering", ASE 2011, Lawrence, KS, USA.
[11] David A. Turner, Moonju Park, Jaehwan Kim and Jinseok Chae, "An Automated Test Code Generation Method for Web Applications Using an Activity Oriented Approach", California State University San Bernardino and University of Incheon, Korea, 2008.
[12] Yuanyan Gu, Yujia Ge, "Search-Based Performance Testing of Web Applications with Composite Services", 2009.