Performance Engineering for Native Mobile Applications - Strategy, Implementation & Execution




Performance Engineering for Native Mobile Applications - Strategy, Implementation & Execution
13th Annual International Software Testing Conference 2013, Bangalore, 4th-5th December 2013
Arit Kumar Bishwas
Capgemini India Pvt. Ltd., A-1, Technology Park, MIDC, Talwade, Pune-412114, Maharashtra, India

Abstract

Native mobile applications are becoming more and more critical to business, and it is now obligatory for businesses to ensure top-class performance of native mobile applications for end users. Performance engineering of a native mobile application is complex to handle because there is a multi-dimensional matrix of combinations of devices (Samsung series, iPhone, Nokia, BlackBerry, etc.), operating systems (Android, iOS, Windows, etc.), networks (2G, 3G, 4G, WiFi) and different mobile server engagements. In native mobile application performance engineering we need to focus on both sides - the client side (mobile devices) as well as the server side. So we have to work out an intelligent and innovative strategy to optimize performance by inspecting both the client side and the server side of the application. In this paper such an approach and strategy is presented. We discuss how to implement a performance engineering framework that offers a detailed set of solutions for engineering the performance of native mobile applications. As a case study, we investigate and employ the performance engineering of a native Android mobile application.

Keywords: Performance Engineering, Native Mobile Applications

1. Introduction

This is the era of mobile communication. A few years back, when we thought about mobile devices we thought about calling, taking photos and internet browsing only. Now the era is changing so rapidly that mobile devices are dominating the world stunningly. Thanks to technology advancements, today's mobile devices are highly capable: they can handle nearly all the activities that a computer can offer. We no longer need to sit at bulky computer systems for the internet and other tasks. Using a mobile device, one can do everything from online shopping, watching movies and playing video games to online banking and a lot more.
In the current era, time is one of the most important factors of our lives; time is everything! [1] Everyone wants things done as fast as possible, and when we talk about the usage of mobile phones, speed is one of the most important considerations. Nowadays users expect a native mobile application view to open within 4-5 seconds on a mobile device, compared with a response time of 8-10 seconds for web pages on PCs. If your application takes longer than that to respond, you may lose the user forever! So, in this era of high competition in mobile technologies, your mobile application should meet the speed threshold and be as fast as possible. In a survey of an American audience conducted by Soasta, about 88% of respondents said they associate negative feelings with brands that have poorly performing websites and apps.

Figure 1: User reaction survey with an American audience conducted by Soasta

In this paper we will see how we can ensure the optimum performance of a native mobile application. We look at the strategy required to do effective performance engineering of a native mobile application, and at how to implement and execute that strategy. [2, 3]

2. Background

The approach to performance test engineering of a web application is well known. With native mobile applications we need a different approach: the focus is on both sides, the client side as well as the server side.

2.1. Native Mobile Apps

A native mobile application is an application program that has been developed for use on a particular platform or mobile device. Because native apps are written for a specific platform, they can interact with and take advantage of operating system features and other software that is typically installed on that platform. Because a native app is built for a particular device and its operating system, it can use device-specific hardware and software, meaning that native apps can take advantage of the latest technology available on mobile devices. A native mobile application is installed directly on a mobile device, and developers create a separate app version for each mobile device. The native app may be stored on the mobile device out of the box, or it can be downloaded from a public or private app store and installed on the mobile device. Data associated with the native app is also stored on the device, although data can be stored remotely and accessed by the native app. Depending on the nature of the native app, internet connectivity may not be required. [4]

2.2. Hybrid Apps

A hybrid application is one that combines elements of both native and web applications. Hybrid application features can function whether or not the device is connected to the web. They integrate with a device's file system as well as with web-based services, and they have an embedded browser to improve access to dynamic online content. [5]

2.3. Mobile Web Apps

A web application is an application program that is stored on a remote server and delivered over the internet through a browser interface. Web services are web apps by definition, and many websites, although not all, contain web apps. One can open a web application using the mobile browser.

2.4. Web Apps vs. Native Mobile Apps in Terms of Performance Engineering

Table 1: Web App on PC vs. Native Mobile App

Web App on PC:
- Each page should take approximately less than 8-10 seconds.
- Users generally log off the application after use.
- Server-side analysis is the main focus.
- Different networks are generally not a concern.
- The type of PC does not matter much.

Native Mobile App:
- Each view should take approximately less than 4-5 seconds.
- Users generally don't log off the application after use; instead, without logging off, they switch to another application. More user sessions in a particular duration!
- Both server-side and client-side (mobile device) analysis are in focus in terms of performance.
- Each network (2G, 3G, 4G and WiFi), with different network carriers, is under consideration.
- Too many devices with different OS versions.

3. MAPE (Mobile Application Performance Engineering) Process & Framework

3.1. MAPE Process

Figure 2: Mobile Application Performance Engineering (MAPE) Process Architecture

Figure 2 above shows the process architecture for Mobile Application Performance Engineering (MAPE); for more details refer to [6]. It shows the complete flow of processes that we need to follow for delivering effective and efficient performance engineering solutions for native mobile applications. We need to focus on both sides: the server side as well as the client side (the mobile devices).

3.2. MAPE Framework Implementation

Figure 3 below shows the Mobile Application Performance Engineering (MAPE) process framework, which has been designed to provide solutions to native mobile application performance engineering requirements.

Figure 3: Native Mobile App Performance Engineering Environment Setup Architecture

3.2.1. On-Device Monitoring

For native mobile applications we have to monitor the mobile devices and analyse the performance of the application on the device: how fast the application works on the device. The following are some of the most important things to monitor and analyse during performance test executions on the device.

- CPU utilization by the native mobile app: We need to monitor the CPU utilization for each individual Native Mobile Application Performance Test Script (NMAPTS) execution.
- Memory consumed by the native mobile app: We need to monitor the memory consumed for each individual NMAPTS execution.
- Dalvik memory utilization: Dalvik memory utilization is the memory consumed by the Java heap on the device. It is very important to investigate, as this analysis helps in exploring whether there is any memory leakage on the mobile device while the native mobile application is running.
- Network data bytes sent: We need to investigate the data bytes sent by the application during NMAPTS executions.
- Network data bytes received: We need to investigate the data bytes received by the application during NMAPTS executions.
- Battery utilization: Analyse the power consumed from the battery during native mobile application view navigation.
- Method profiling analysis: If the application launch time or any view's response time is higher than expected, or if memory leak issues are observed, we need to investigate the objects and methods that cause the problem on the device. By method profiling on the device we can figure out the root cause of a high launch response time and other problems.

3.2.2. Native Mobile App Launch Time

The native mobile app launch time is the time a native mobile application takes to launch on the device. It should be as low as possible: a higher launch time results in more battery consumption, and it may also give users a negative impression!

3.2.3. Mobile Baseline Testing

Mobile baseline testing is the process of identifying the behaviour of the mobile application with a minimal

user load at the servers. Each MAPTS is executed with a single virtual user. The test is executed for a very small period of time under different mobile network (2G, 3G, 4G and WiFi) conditions. This test is used as the benchmark for further mobile load tests.

3.2.4. Mobile Load Testing

Mobile load testing is the process of identifying the behaviour of the mobile application with a heavy user load at the servers. A high volume of requests with each MAPTS is sent through the different mobile networks (2G, 3G, 4G and WiFi) to the servers, and the behaviour of the servers is investigated during this phase.

3.2.5. Mobile Endurance Testing

Mobile endurance testing is the process of identifying memory leaks at the servers. A high volume of requests with each MAPTS is sent through the different mobile networks (2G, 3G, 4G and WiFi) to the servers, and the test is run for long hours to investigate the reliability of the application.

3.2.6. Server Side Monitoring

We need to monitor and analyse the following (not limited to this list, but these are the most important!) at the server end during the mobile baseline, mobile load and mobile endurance test executions:
1. Application & database server CPU utilization
2. Application & database server memory utilization
3. Application server JVM heap memory utilization
4. Application server connection pool sizing
5. Application server thread pool sizing
6. Database server AWR & ASH (Active Session History) reports
7. Network bandwidth utilization between servers
8. Total number of transactions per hour at the database
9. Total number of Java operations per hour at the JVM, etc.

3.3. Native Mobile App Performance Test Execution Workarounds

Designing a native mobile application performance test execution framework is a complex task and needs a proper approach. We will have to monitor and investigate the health of the servers under heavy load with different network conditions (2G, 3G, 4G and WiFi). The heavy load is generated by different combinations of simulated mobile users (such as Android, iOS and Windows mobile users). We will also have to monitor and investigate the health of the mobile devices while the native mobile application is running on them. Figure 4 below demonstrates the device-side execution and monitoring, and Figure 5 the server-side execution and monitoring phases.

Figure 4: Native Mobile App Performance Engineering Test Execution Phases at Device Side

Figure 5: Native Mobile App Performance Engineering Test Execution Phases at Server Side

3.3.1. On-Device Performance Test Execution Process

1. Develop NMAPTSs (Native Mobile Application Performance Test Scripts) for each business scenario.
2. Execute three to five continuous iterations with each NMAPTS on the mobile devices and monitor the counters mentioned above. In this case we can consider

single or multiple networks, based on the timeline and requirements.

3.3.2. Mobile Baseline, Load & Endurance Test Execution Process

For the mobile baseline test:
1. Simulate the network conditions (2G, 3G, 4G or WiFi) with each test run in the load test tool.
2. Select the mobile-OS-specific (Android, iOS, etc.) MAPTSs (Mobile Application Performance Test Scripts).
3. Execute the test with a single virtual user per MAPTS using the load test tool under the different network conditions.
4. Monitor all the servers during execution.

For the mobile load test:
1. Simulate the network conditions (2G, 3G, 4G or WiFi) with each test run in the load test tool.
2. Select the mobile-OS-specific (Android, iOS, etc.) MAPTSs.
3. Execute the test with a concurrent virtual-user load using the load test tool for a specific duration (as per requirements) with each set of OS-specific MAPTSs under the different network conditions.
4. Monitor all the servers during executions.

For the mobile endurance test:
1. Simulate the network conditions (2G, 3G, 4G or WiFi) with each test run in the load test tool.
2. Select the mobile-OS-specific (Android, iOS, etc.) MAPTSs.
3. Execute the test with a concurrent virtual-user load using the load test tool for a long duration (based on requirements, but at least more than 8 hours) with each set of OS-specific MAPTSs under the different network conditions.
4. Monitor all the servers during executions.

3.3.3. Network Considerations

We have to run separate tests for each of the different network conditions (2G, 3G, 4G and WiFi).

4. Experimental Case Study

Here I present a case study of native mobile application performance engineering with an Android mobile device, which I prepared for one of our esteemed clients. The case study demonstrates the strategy, implementation and execution of the proposed performance engineering solution.

4.1. Experiment Setup

Table 2: Experiment Setup

Client-side setup:
- Device: Samsung S4 [7]
- OS: Android
- Tools used: Little Eye [8] for CPU, memory, network and battery utilization; Android SDK [9] for method profiling; Perfecto Cloud for launch time
- Scope: CPU utilization, memory utilization, network utilization, launch time capturing, method profiling
- Network: WiFi

Server-side setup:
- Application server: IBM WAS 7.0
- Database server: Oracle 11g
- OS: Windows 2008 R, UNIX
- Tools used: NeoLoad [10] for load testing; Windows SCOM [11] for server (application & database) side monitoring; WAS internal tools for application server diagnostics; Oracle AWR reports for database server diagnostics
- Test type: load test with 200 virtual users
- Network: AT&T, Verizon and WiFi

4.2. On-Device Performance Statistics

During the on-device test execution with the Android mobile device, the health of the native mobile application was monitored and investigated. CPU, memory, data in/out through the network, and battery utilization were analysed for each NMAPTS.

4.2.1. CPU Utilization

- CPU utilization by the native mobile app

This is the total CPU utilized by the native mobile application on the device during application view navigation. Figure 6 shows the CPU utilization graph for each NMAPTS execution.

Figure 6: CPU Utilization by the Native Mobile App

4.2.2. Memory Utilization

- Total memory utilization by the device: The total memory consumption by the device during native application view navigation. Figure 7 shows the memory utilization graph for each NMAPTS execution.

Figure 7: Total Memory Utilization by the Device

- Total memory utilization by the native mobile app: The total memory utilization by the native mobile application during application view navigation on the device. Figure 8 shows the utilization graph for each NMAPTS execution.

Figure 8: Memory Utilization by the Native Mobile App

- Memory utilization by Android methods: The memory utilization by the Android application methods during application view navigation on the device. Figure 9 shows the utilization graph for each NMAPTS execution.

Figure 9: Memory Utilization by Android Methods

- Memory utilization by the Android Dalvik VM: The memory consumption by the Android Dalvik VM during application view navigation on the device. Figure 10 shows the utilization graph for each NMAPTS execution.

Figure 10: Memory Utilization by Dalvik

- Garbage collection details on the device during execution

The following Table 3 shows the GC statistics for each business NMAPTS during the execution.

Table 3: GC Statistics during Execution on the Device

Business Flow | Application Running Time | GC Pause Time | Number of Concurrent GCs | Device Dalvik Threshold (MB) | Number of Allocated GCs
Account Statement | 133.66 | 0.49 | 16 | 64 | 12
Recent Activity | 147.58 | 0.53 | 15 | 64 | 13
Make A Payment | 155.57 | 0.58 | 16 | 64 | 14
Send Money | 177.73 | 0.49 | 16 | 64 | 9
Redeem Card | 79.73 | 0.41 | 14 | 64 | 10
Redemption History | 187.34 | 0.91 | 30 | 64 | 21
Quick View | 267.98 | 0.39 | 44 | 64 | 32

4.2.3. Network Utilization

- Network data sent by the native mobile app: The data transferred through the network by the native mobile application during application view navigation. Figure 11 shows the utilization graph.

Figure 11: Network Data Sent by the Native Mobile App

- Network data received by the native mobile app: The data received through the network by the native mobile application during application view navigation. Figure 12 shows the graph for the utilization.

Figure 12: Network Data Received by the Native Mobile App

4.2.4. Battery Utilization

- Power consumption by the CPU: The power consumed by the CPU during native mobile application view navigation. Figure 13 shows the graph for the utilization.

Figure 13: Power Consumption by the CPU during On-Device Native Mobile App Execution

- Power consumption during data usage: The power consumed by the native mobile application for data usage during application view navigation. Figure 14 shows the graph for the utilization.

Figure 14: Power Consumption during Data Usage

- Power consumption by the display: The power consumed by the display panel during native mobile application view navigation. Figure 15 shows the graph for the utilization.

Figure 15: Power Consumption by the Display during On-Device Native Mobile App Execution

4.2.5. Launch Time

This is the average time taken by the native mobile application to launch on the mobile device. Table 3 below shows the application launch times under different network conditions; the average launch time over 10 test iterations has been taken for analysis.

Table 3: Native Mobile App Launch Time

Launch Times | AT&T | WiFi | Verizon
Overall average launch time captured (sec) | 5.8 | 6.9 | 7.9
Highest average launch time captured for each NMAPTS (sec) | 10.3 | 12.2 | 11.1
Highest average launch time captured (sec) | 67.2 | 73 | 66.6

4.3. Server Side Performance Statistics

We executed mobile load tests with three different network conditions. The following subsections describe the monitoring and the investigated results. [12]

4.3.1. Native App View Response Time

Figure 16: Mobile Native App View Response Times with Different Network Conditions

Table 4: Native Mobile App View Response Times

4.3.2. 90th Percentile

The 90th percentile is the value (or score) below which 90 percent of the response-time observations fall. This analysis explores whether the application's response-time SLA is met 90% of the time during the mobile load test. Figure 17 shows the 90th percentile values for all transactions when the load test was run with the different network conditions.
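The 90th percentile described above can be computed directly from the raw response-time observations, for example with a nearest-rank calculation. A minimal sketch with made-up sample data (a load-test tool normally reports this value itself):

```python
# Illustrative 90th-percentile check for transaction response times.
# Sample values are made up; the SLA threshold mirrors the 4-5 s view budget.
def percentile(samples, pct):
    """Nearest-rank percentile: smallest value covering pct% of observations."""
    ordered = sorted(samples)
    rank = max(1, -(-len(ordered) * pct // 100))  # ceiling division
    return ordered[int(rank) - 1]

response_times = [2.1, 2.4, 2.8, 3.0, 3.2, 3.5, 3.9, 4.2, 4.8, 6.7]
p90 = percentile(response_times, 90)
print(p90)         # 4.8
print(p90 <= 5.0)  # SLA met for 90% of observations -> True
```

With this data set the single 6.7 s outlier does not break the SLA, which is exactly what the 90th-percentile view is meant to show.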

Figure 17: Mobile Native App Views 90th Percentile with Different Network Conditions

4.3.3. Standard Deviation (SD)

SD analysis shows how reliable your application is: the lower the standard deviation, the more consistent the response times, and so the more reliable the application. In this case study we found that the SD for most transactions is less than half of the mean values, so the system indicates a high probability of reliability.

4.3.4. Server Side CPU Utilization

Figure 18 shows the CPU utilization of the servers during the mobile load test with different network conditions.

Figure 18: CPU Utilization Trend in Application & Database Servers during Load Test with Different Mobile Network Conditions

4.3.5. Server Side CPU Queue Length

Figure 19 shows the application server side CPU queue length statistics during mobile load testing.

Figure 19: CPU Queue Length Trend in Application Servers during Load Test with Different Mobile Network Conditions

4.3.6. Server Memory Utilization

Figure 20 shows the memory utilization of the servers during the mobile load test with different network conditions.

Figure 20: Memory Utilization Trend in Application & Database Servers during Load Test with Different Mobile Network Conditions

4.3.7. JVM Heap Utilization

Table 5: Heap Memory GC Details for Primary Application Server

Summary for App Server-1:
Forced collection count: 0
Full collections: 1473
Mean garbage collection pause (ms): 1344
Mean interval between collections (minutes): 0.21
Number of collections triggered by allocation failure: 159168
Proportion of time spent in garbage collection pauses (%): 36.7
Proportion of time spent unpaused (%): 63.3
Rate of garbage collection (MB/minutes): 140291
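The "proportion of time spent in garbage collection pauses" reported in Table 5 is a ratio of summed pause time to total observed run time. A minimal sketch of that calculation with illustrative numbers (not the case-study figures):

```python
# Illustrative GC-overhead calculation in the spirit of Table 5.
# Pause durations and run time below are made-up sample values.
def gc_overhead(pause_times_ms, runtime_ms):
    """Percentage of the observed run time spent in GC pauses."""
    return 100.0 * sum(pause_times_ms) / runtime_ms

pauses = [1200, 1500, 1300, 1400]  # individual GC pause durations (ms)
runtime = 60000                    # total observed run time (ms)
print(round(gc_overhead(pauses, runtime), 1))  # 9.0
```

Values in the 36-42% range, as seen in Tables 5 and 7, mean the JVM spends over a third of its time paused, which supports the recommendation below to resize the heap and tune GC cycle frequency.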

Table 6: Used Heap after GC Collection in Primary Application Server

Used heap (after collection), App Server-1:
Mean heap (MB): 550
Minimum heap (MB): 315
Maximum heap (MB): 913

Figure 21: Heap Memory Utilization Trend in Primary Application Servers during Load Test with Different Mobile Network Conditions

Table 7: Heap Memory GC Details for Secondary Application Server

Summary for App Server-2:
Forced collection count: 0
Full collections: 2776
Mean garbage collection pause (ms): 1339
Mean interval between collections (minutes): 0.11
Number of collections triggered by allocation failure: 150546
Proportion of time spent in garbage collection pauses (%): 42.34
Proportion of time spent unpaused (%): 57.66
Rate of garbage collection (MB/minutes): 135052

Table 8: Used Heap after GC Collection in Secondary Application Server

Used heap (after collection), App Server-2:
Mean heap (MB): 575
Minimum heap (MB): 332
Maximum heap (MB): 914

Figure 22: Heap Memory Utilization Trend in Secondary Application Servers during Load Test with Different Mobile Network Conditions

5. Observation & Analysis

Some of the most important observations are presented here with analysis.

5.1. Client Side

1. The native mobile application's average launch time on the device is high. We can see from Table 3 that the application launch time on the device is very high. This is a serious concern and needs to be investigated so that the application launch time on the Android device can be optimized.
2. We can clearly see a continuously increasing pattern of memory utilization in the memory graphs, so this may be a case of memory leaking, and the probability of the application crashing is therefore high. We recommend optimizing the application in terms of memory utilization. Each time a new activity was loaded, the application downloaded some data and forgot to clear it when the user returned to the home screen to start the iteration again, resulting in the leak.
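The "continuously increasing pattern of memory utilization" flagged in observation 2 can also be detected programmatically by fitting a trend line through per-iteration memory samples. A minimal sketch with made-up data (the sample values and the slope threshold are illustrative):

```python
# Illustrative memory-leak indicator: fit a least-squares line through memory
# samples taken after each test iteration; a clearly positive slope suggests
# memory is not being released between iterations. Sample data is made up.
def trend_slope(samples):
    """Least-squares slope of the samples against their iteration index."""
    n = len(samples)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(samples) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, samples))
    var = sum((x - mean_x) ** 2 for x in xs)
    return cov / var

heap_mb = [120, 131, 139, 152, 160, 174]  # used heap (MB) after iterations 1..6
print(trend_slope(heap_mb) > 1.0)  # steadily growing footprint -> True
```

A flat or sawtooth profile gives a slope near zero, while a leak of the kind described above (downloaded data never cleared) keeps the slope persistently positive across iterations.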

3. During some NMAPTS executions the CPU utilization is high. High CPU utilization may cause more battery consumption, so with proper optimization of the application the CPU utilization can be reduced to some extent. This will save battery power.
4. Analysing the graph of power consumption by the display during view navigation, we also noted that power consumption is high. The power consumption can be reduced if low-intensity or dark-coloured images are used; more vibrant colour images consume more battery power.

5.2. Server Side

1. The average response time for the native mobile application view OfferDeal->Extras is more than 9 seconds on average under all network conditions. The expected average response time of a native mobile application view is not more than 5 seconds.
2. The heap utilization graphs show high heap memory utilization for both application servers; the threshold value is touched most of the time. It is recommended to increase the heap memory size and also to optimize the GC cycle frequencies.
3. Analysing the memory utilization graphs, we found probable patterns of memory leaks in the JVMs of both application server machines.

6. Some Tuning Recommendations for Native Mobile Apps

1. Try to optimize the pre-fetch data handling mechanism.
2. Try to spread out computationally intensive jobs.
3. Try to avoid floating-point mathematical calculations where possible.
4. For data-parallel tasks like video/image processing, try to use the GPU.
5. Try to use algorithms that consume fewer CPU cycles, e.g. logarithmic-time algorithms instead of exponential ones.
6. The darker the colour, the lower the battery consumption, so try to use less vibrant colours in the application.
7. Try to reduce the brightness of the app programmatically.
8. Check the following to help ensure a crash-free native mobile application:
8.1. Probability of out-of-memory situations
8.2. Encounters with unexpected conditions
8.3. Unresponsiveness from servers
8.4. Continued existence of unused object references
8.5. Objects that the GC cannot collect properly
9. If you don't need to access an object's fields, make your method static. Invocations will be about 15%-20% faster.

7. Conclusions

We need a different approach for performance testing and engineering in the case of native mobile applications: we have to investigate both sides, the client side as well as the server side. The case study explained in detail how to implement a native mobile application performance testing and engineering framework and how to execute it effectively and efficiently. This paper should help the audience understand the proposed approach for practicing native mobile application performance engineering services.

Appendix A

Terms and descriptions:
- MAPE: Mobile Application Performance Engineering. This is the process architecture for mobile application performance engineering.
- MAPTS: Mobile Application Performance Test Script. These scripts are used for baseline, load and endurance test execution to investigate the health of the servers.
- NMAPTS: Native Mobile Application Performance Test Script. These scripts are used for on-device native mobile app test execution to investigate the health of the mobile devices.

Appendix B

Number of figures: 22
Number of tables: 8

References

[1] Smart Mobs: The Next Social Revolution by Howard Rheingold.
[2] SPE Experience: An Economic Analysis. http://www.perfeng.com/papers/hesselg2.pdf
[3] The Economics of SPE Panel. http://www.perfeng.com/papers/jennings.pdf
[4] Pro Android Scripting with SL4A: Writing Android Native Apps Using Python, Lua, and Beanshell by Paul Ferrill.
[5] Building Hybrid Android Apps with Java and JavaScript by Nizamettin Gok and Nitin Khanna (9 August 2013).
[6] Mobile Application Performance Test Management - A Strategy to Implement & Manage; 8th International Project Management Leadership Conference (PML 2013), June 14th, 2013, Mumbai, India.
[7] http://www.samsung.com/in/consumer/mobile-phone/mobile-phone/smartphone/gt-I9500ZWAINU?subsubtype=android-mobiles
[8] http://www.littleeye.co/
[9] http://developer.android.com/sdk/index.html
[10] http://www.neotys.com/product/overviewneoload.html
[11] http://www.microsoft.com/en-in/servercloud/system-center/operations-manager.aspx
[12] C. U. Smith and L. G. Williams. Performance Solutions: A Practical Guide to Creating Responsive, Scalable Software. Prentice Hall, Upper Saddle River, NY, 2002.

About the Author

Arit Kumar Bishwas has an MS degree from BITS, Pilani. He is currently working with Capgemini India Pvt. Ltd. as a Performance Engineering Consultant. He has 9+ years of total work experience, of which about 7 years are in the software industry and about 2+ years in computer science teaching. He is a Sun Certified Java Programmer (SCJP 1.5) as well as ISTQB certified. Earlier he worked with IBM India Pvt. Ltd., Mindtree Ltd. and Zensar Technologies Ltd. In 2010 his paper was selected as one of the top ten best papers at STC.