Businesses are increasingly realizing the need for wide-ranging functional testing of mobile apps, as well as performance, load and user experience testing, although most enterprises do not test all aspects of mobile applications at this point.
Introduction

A smartphone makes a voice call look really small compared to the dozens of other things it does: taking photos, surfing the web, booking tickets, transferring funds, posting updates to social media, group chats, email, games and enterprise apps. Undoubtedly, smartphones offer enormous potential for digital interactions between businesses and their customers as well. Smartphones and tablets are the first point of interface and, in some cases, the Point of Sale (POS). The stakes are so high that it is mandatory for businesses to assess the performance of a mobile app before releasing it into the market. From a business perspective, the responsiveness of mobile apps is crucial to capturing the market. The exponential rise of mobile internet usage is expected to displace desktops as the primary devices for accessing the internet. Users expect the online mobile experience to be as fast as the desktop experience, but it isn't, and that is the problem.

Traditional vs. Mobile Performance Testing

Any performance testing is intended to determine the following under a given load:
a) Speed
b) Throughput
c) Reliability
d) Scalability

Application developers have long understood the need for load testing conventional desktop web applications to ensure that the application behaves properly under load with the expected number of users. Despite the advent of mobile apps and mobile websites, the principles of load testing have not changed. On the surface, performance testing of mobile apps may appear no different from that of web apps. However, applying the same testing techniques used for traditional web-based systems may produce inaccurate results or fail to expose performance bottlenecks. Mobile apps present unique challenges that must be addressed to ensure performance tests accurately represent production user behavior and load profiles. Mobile performance testing can be broadly classified into two main categories: client-side performance testing and server-side performance testing.
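The four determinations above reduce to simple computations over the raw results of a load-test run. A minimal sketch, using hypothetical sample values (the response times, success flags and run duration below are made up for illustration):

```python
# Hypothetical raw results of one load-test run:
# (response_time_ms, request_succeeded) per request.
samples = [(120, True), (140, True), (90, True), (400, False), (110, True)]
duration_s = 2.0  # wall-clock duration of the run

speed_ms = sum(t for t, _ in samples) / len(samples)             # a) speed
throughput_rps = len(samples) / duration_s                       # b) throughput
reliability = sum(1 for _, ok in samples if ok) / len(samples)   # c) reliability
# d) scalability is judged by repeating the run at increasing load levels
#    and checking that speed and reliability stay within their targets.
```

Scalability has no single-run formula: it falls out of comparing these same metrics across runs at increasing load.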
In this post, I will discuss how to address unique mobile app testing challenges such as scripting for device/OS diversity, mobile user behavior simulation, application structure (browser versus native) and network characteristics (latency, bandwidth, packet loss and jitter).

Server Side Performance Testing

To observe the impact of load on the server, a synthetic load test needs to be developed, executed and analyzed. While this is similar to traditional performance testing of web applications, there are a few additional challenges in mobile performance testing.

Challenge #1

The first challenge is in recording and designing test scripts. Unlike traditional web applications, mobile application test scripts can be difficult to record since they are accessed from mobile devices. Most current load testing tools use the concept of a proxy server for recording and creating test scripts. In order to record the traffic from a mobile device, the proxy must be set up for both the browser and the native app. The proxy configuration varies depending on the platform being used for development. For example, Android configures proxies differently than iOS. Native applications on Android cannot be proxied through configuration alone; custom code is required to instruct the application to pass through the proxy. There are similar challenges when testing performance on BlackBerry, Windows 8 and other platforms.
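As a minimal sketch of the proxy-recording setup, a replay or emulator-driven script can route its traffic through the tool's recording proxy programmatically. The proxy address below is an assumption (many script recorders default to a localhost port such as 8888); adjust it to your tool's actual configuration:

```python
import urllib.request

# Hypothetical address of the recording proxy started by the load-testing
# tool (e.g. a script recorder listening on localhost:8888) -- adjust to
# your tool's actual configuration.
RECORDING_PROXY = "127.0.0.1:8888"

def build_recording_opener(proxy_host_port):
    """Route HTTP and HTTPS traffic through the recording proxy so the
    load-testing tool can capture requests and turn them into a script."""
    proxy_handler = urllib.request.ProxyHandler({
        "http": "http://" + proxy_host_port,
        "https": "http://" + proxy_host_port,
    })
    return urllib.request.build_opener(proxy_handler)

opener = build_recording_opener(RECORDING_PROXY)
# opener.open("http://example.com/") would now pass through the proxy.
```

This is exactly the kind of explicit routing that native Android apps need in custom code, since they do not honor the system proxy settings.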
If a proxy is not a viable option for recording the traffic, the workarounds below could be tried:
a) Use mobile emulators for recording and creating test scripts
b) Use a browser simulation feature to simulate the User-agent corresponding to different platforms

With the above workarounds, scripts recorded on one platform can be used for other platforms as well. The following are sample User-agent strings for some well-known platforms:

Mozilla/5.0 (Windows; U; Windows NT 5.1; en-us) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.53 Safari/525.19
Mozilla/5.0 (Windows NT 6.0) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/14.0.792.0 Safari/535.1
Mozilla/4.0 (Windows; MSIE 6.0; Windows NT 5.1; SV1; .NET CLR 2.0.50727)
Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.5; en-us; rv:1.9.1b3) Gecko/20090305 Firefox/3.1b3 GTB5
Mozilla/5.0 (Macintosh; Intel Mac OS X 10_8_2) AppleWebKit/536.26.14 (KHTML, like Gecko)
Mozilla/5.0 (Linux; U; Android 1.5; de-de; HTC Magic Build/CRB17) AppleWebKit/528.5+ (KHTML, like Gecko) Version/3.1.2 Mobile Safari/525.20.1
Mozilla/4.0 (Mozilla/4.0; MSIE 7.0; Windows NT 5.1; FDM; SV1; .NET CLR 3.0.04506.30)
Mozilla/4.0 (compatible; MSIE 5.0; Windows 98) Opera 5.12 [en]

The User-agent strings can be changed using different methods:
a) Record the script using any default browser and later change the User-agent manually to that of the required wireless device
b) Record the script using the Mozilla Firefox browser with an add-on such as User Agent Switcher

Challenge #2

The second challenge is in designing different workload models. The application can be accessed traditionally using desktop browsers, and the same user can also access it on his or her mobile phone, while a number of concurrent users access it on their tablets.
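Method (a) amounts to setting the User-agent header on each request before replay. A minimal stdlib sketch using one of the sample strings above (the target URL is a placeholder, not a real endpoint from the text):

```python
import urllib.request

# One of the sample User-agent strings listed above (Android Mobile Safari).
ANDROID_UA = (
    "Mozilla/5.0 (Linux; U; Android 1.5; de-de; HTC Magic Build/CRB17) "
    "AppleWebKit/528.5+ (KHTML, like Gecko) Version/3.1.2 Mobile Safari/525.20.1"
)

def as_mobile_request(url, user_agent):
    """Build a request whose User-agent header makes the server treat this
    virtual user as the chosen mobile platform."""
    return urllib.request.Request(url, headers={"User-Agent": user_agent})

req = as_mobile_request("http://example.com/", ANDROID_UA)
```

Swapping the string per virtual user is how a script recorded on one platform is replayed as traffic from many platforms.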
It is important to simulate different models and browsers correctly, as they differ in JS processing engine, rendering engine and the number of parallel threads invoked for executing requests. Though the JS processing engine and rendering engine can't be simulated in a synthetic load test, most load testing tools provide an option for simulating the parallel-thread behavior. For synthetic load testing, different mobile models and browsers are simulated by varying the number of parallel threads used for execution.

Figure 1: Browser Main Components — a browser's high-level structure comprises the User Interface, the Browser Engine, the Rendering Engine, Networking, the JavaScript Interpreter, the UI Backend and Data Storage.
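The parallel-thread knob can be sketched directly with a thread pool: each simulated browser fetches a page's resources with its own connection limit. The per-browser limits below are illustrative assumptions (real limits vary by browser and version), and the fetch is a stub:

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Illustrative parallel-connection limits per simulated browser; real limits
# vary by browser and version, so treat these values as assumptions.
PARALLEL_CONNECTIONS = {"desktop_chrome": 6, "mobile_safari": 4, "feature_phone": 2}

def fetch(resource):
    """Stub for downloading one page resource; a real script would issue
    the HTTP request here."""
    time.sleep(0.01)  # stand-in for network time
    return resource

def load_page(resources, browser):
    """Fetch a page's resources with the browser's parallel-thread count --
    the knob load tools expose for simulating different models and browsers."""
    with ThreadPoolExecutor(max_workers=PARALLEL_CONNECTIONS[browser]) as pool:
        return list(pool.map(fetch, resources))

assets = ["index.html", "app.js", "style.css", "logo.png", "api/session"]
fetched = load_page(assets, "mobile_safari")
```

Running the same resource list under "feature_phone" takes visibly longer than under "desktop_chrome", which is exactly the per-model difference the workload model must capture.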
Challenge #3

The third challenge is to simulate different network bandwidths and user behaviors. When the application is accessed through a desktop on a fixed line, bandwidth is comparatively higher than when the same application is accessed over mobile. Apart from the higher bandwidth, the other difference is that network characteristics don't vary when the application is accessed through a desktop.

Different connection types that need simulation:
1. VoIP/RoIP
2. Satellite
3. Cloud
4. DSL/ADSL/XDSL
5. Microwave
6. OC-3
7. 2G/3G/4G
8. T1/T3/E1/E3
9. WiFi/WiMax
10. GPRS
11. Dial-up

Network conditions that need simulation: busy peak times, long distance, jittery connections, duplication of data, fragmentation of data, limited bandwidth, information corruption, de-sequencing of data, network traffic bursts, and partial or total outage.

In addition to the traditional performance testing scenarios, different user behaviors in the mobile performance scenario include:
- Access from different cities / geographical locations
- Moving from Wi-Fi to 2G/3G connections
- Staying signed on indefinitely and never logging out
- Slow sessions that remain always connected
- Large data transfers between server and client
- Moving through a lift
- Going from a high-signal-strength to a low-signal-strength area
- Connections being dropped and reset
- Unidirectional server updates
- Static resources being served from a CDN or proxies

Ensuring optimal performance for mobile apps is complicated by the complexity of mobile networks. Performance testing not only has to consider the traditional operating environment in which service calls work through your internal networks; it also needs to address a whole new infrastructure through which the mobile device is connected. This can include:
1. Telco networks
2. Wi-Fi/hotspots
3. 3rd-party cloud services
4. Content networks (CDN)
5. ISPs

Figure 2: Schema — phones, tablets and browsers connecting through Wi-Fi, a major ISP, 3rd-party cloud services and content networks (CDN).

A network characteristic is a permutation and combination of latency, bandwidth, packet loss and jitter, and hence, by using a network virtualization tool like Shunra, we can simulate different connection types, network conditions, user behaviors and mobile networks.

Challenge #4

The fourth challenge is to analyze the impact of mobile users on the server. Mobile devices hold connections longer than regular devices. The implication is a rise in the number of concurrently open connections. The application, together with its backend servers and infrastructure, needs to cope with this rise to ensure that connections aren't exhausted and that other users aren't denied service. Data consumption from a mobile device is typically slower than from regular clients, and applications need to ensure that this does not drain system resources. Mobile devices communicate with cells, and as the device changes location, the cell it communicates with changes. This can result in a bandwidth change, and even complete drop-outs, which can affect the way the application behaves.
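The latency/bandwidth/packet-loss/jitter combinations described under Challenge #3 can be sketched in pure software when a network virtualization tool is not at hand. The profile values below are assumptions chosen for illustration, not measured figures:

```python
import random

# Illustrative network profiles -- the numbers are assumptions for this
# sketch, not measured values: latency and jitter in milliseconds,
# loss as a probability per packet.
NETWORK_PROFILES = {
    "wifi": {"latency_ms": 20,  "jitter_ms": 5,   "loss": 0.001},
    "3g":   {"latency_ms": 150, "jitter_ms": 50,  "loss": 0.02},
    "2g":   {"latency_ms": 500, "jitter_ms": 100, "loss": 0.05},
}

def simulated_delay(profile, rng=random.random, jitter=random.uniform):
    """Return a simulated packet delay in ms, or None if the packet is lost.

    rng and jitter are injectable so a test (or deterministic replay)
    can pin down the randomness."""
    p = NETWORK_PROFILES[profile]
    if rng() < p["loss"]:
        return None  # packet dropped
    return p["latency_ms"] + jitter(-p["jitter_ms"], p["jitter_ms"])
```

Applying such a delay before each virtual user's request is the crude software analogue of what a virtualization tool like Shunra does at the network layer.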
Key performance testing metrics that can be monitored from the server side include:
1. Session count
2. Average session length
3. Memory availability
4. Connection queue length

Figure 3: Session Count and Memory Usage — the graphs indicate how session count and memory usage increase due to slow connections.

Client Side Performance

Apart from performance issues on the server side, the page load response time can vary with the smartphone model being used. The response time is affected mainly by rendering and JS processing. Improper caching at the client end can also impact page load time. Rendering, JS processing and caching depend on the CPU and memory of the mobile hardware. High battery consumption is an effect of high CPU utilization, which in turn depends on the rendering engine and JS processing engine. The last but a major factor to consider while conducting performance testing of a mobile application is its data usage: if data usage is high, the application becomes unusable in remote locations, which further reduces its user base.

The unique challenges, grouped by testing activity:

Scripting
- Unique challenges and differences in web, native and hybrid apps
- Recording of traffic over HTTPS in native apps
- Different browsers on different models

Network
- Multiple connection types like 3G/4G, Wi-Fi etc.
- Different network characteristics (latency, bandwidth, jitter, packet loss)
- Different network conditions like network traffic bursts, jitter etc.

User Behaviors
- Connections getting dropped
- Slow sessions and always connected
- Unidirectional server updates
- Large data transfer between server and client
- Traffic from different regions / cities
- Static resources being served from CDNs or proxies

Analysis
- Impact due to a large number of active sessions
- Impact due to long session times
- Impact due to data being held in memory for a longer duration because of slow outbound speed
- Impact of high packet loss

Execution
- Different flows for mobile applications
- Apart from virtual users, GUI users and real users also take part
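The "active sessions" and "long session time" impacts above can be estimated before any test run with Little's Law: the average number of simultaneously open sessions equals the arrival rate times the average session length. The arrival rate and session lengths below are illustrative assumptions:

```python
def concurrent_sessions(arrival_rate_per_s, avg_session_s):
    """Little's Law: average number of simultaneously open sessions
    equals arrival rate times average session length."""
    return arrival_rate_per_s * avg_session_s

# Same arrival rate, but mobile users who stay signed on hold sessions
# ten times longer -- and so need ten times the open connections and
# session memory on the server side.
desktop_sessions = concurrent_sessions(50, 60)   # 50 logins/s, 1-minute sessions
mobile_sessions = concurrent_sessions(50, 600)   # 50 logins/s, 10-minute sessions
```

This is why the session-count and memory graphs climb under slow connections: the same traffic, held open longer, multiplies the server-side footprint.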
Approach

While conducting performance testing of mobile-enabled applications, a hybrid approach can be taken:
a) Stress the server by generating load in the traditional way using tools like HP LoadRunner, JMeter, NeoLoad etc.
b) To ensure response time is calculated and reported realistically, leverage the cloud and generate load from different geographical locations
c) Simulate high-latency scenarios using network virtualization tools like Shunra
d) While the server is stressed, check the performance of the application on multiple devices. This can be done by automating the scenarios using a functional tool (like SeeTest) and capturing the response time metrics over multiple iterations. Performance on multiple devices can also be checked manually using a crowd-testing service like uTest
e) Check the performance of the application on multiple real devices for rendering time, battery consumption, CPU utilization, memory utilization and data usage
f) Run a web performance test with a tool like http://www.webpagetest.org and analyze the waterfall and web page performance score
g) Use mobiReady, a tool that evaluates mobile web page readiness using industry best practices and standards

Figure 4: Framework — Gallop's Performance Analyzer Smart Tool.

Conclusion

Performance is the outcome of collective effort and meticulous testing. The ability to simulate users and generate load in the mobile context is crucial to ensuring performance that not only strengthens the business process but also delivers a great experience to the end user. The increasing adoption of smartphones has ushered in a new era in the global economy. In turn, the rise of mobile applications has spawned a whole new breed of business processes, practices and deployments. In the race to be first in the mobile space, enterprises need to ensure that their application's performance matches the demands of the market.
About Us

Corporate Headquarters
433 E Las Colinas Blvd, #1300, Irving, TX 75039, USA
Email: contact@gallop.net | Phone: (972) 573-3705

630 Freedom Business Center, 3rd Floor, King of Prussia, PA 19406
Email: contact@gallop.net | Phone: +1 610 768 7736