Performance Analysis and Capacity Planning Whitepaper




Contents

Executive Summary ... 3
Overview ... 3
Product Architecture ... 4
Test Environment ... 6
Performance Test Model ... 9
Test Run Results ... 10
Typical Installation - Enterprise Employee App ... 11
Typical Installation - Consumer Mobile App ... 11
Test Considerations ... 12
Appendix A - Test Model Details ... 13
Appendix B - Test Result Details ... 14
  Reference Test Run ... 14
Appendix C - Sample Production Configurations ... 19

Figures

Figure 1 - Test App Screen Examples ... 3
Figure 2 - Verivo's System Architecture ... 4
Figure 3 - Verivo's Component Architecture ... 5
Figure 4 - Performance Test System Configuration ... 6
Figure 5 - Test System Specifications ... 8
Figure 6 - Test Model Parameters ... 9
Figure 7 - Performance Test Results ... 10
Figure 8 - Typical Enterprise Deployment Sizing ... 11
Figure 9 - Typical B2C Deployment Sizing ... 11
Figure 10 - Performance Test Model ... 13
Figure 11 - One Hour Test, Transaction Detail ... 15
Figure 12 - Reference Test, CPU Performance ... 16
Figure 13 - Reference Test, TPS Rate ... 16
Figure 14 - Reference Test, Memory Performance ... 17
Figure 15 - Reference Test, Disk Performance ... 17
Figure 16 - Reference Test, Network Performance ... 18

2012 Verivo Software, Inc. CONFIDENTIAL. Unauthorized use or distribution is prohibited.

Executive Summary

This report contains an in-depth performance and scalability analysis of Verivo's enterprise mobility platform. Our team conducted a series of tests simulating high-volume user loads under typical app usage patterns.

The test results demonstrate that Verivo's platform is capable of supporting large user populations and high concurrency rates with reasonable system sizing. Our app servers are capable of handling sustained activity levels of 15-30 mobile transactions per second (TPS) per CPU under typical app configurations. This capacity is well within the throughput requirements needed to support user populations in the tens of thousands with a typical two-server, dual-CPU load-balanced configuration, and much larger deployments (500,000+ users) by adding CPUs or app servers in a load-balanced server farm. These results demonstrate the platform's readiness for deploying a diverse set of business-critical apps to large employee and partner populations, and for rolling out consumer-facing apps in very large, cross-platform, global deployments. Of course, server capacity does not equate to system reliability; Verivo recommends system redundancy through load balancing for nearly all production-grade mobile deployments.

The report below provides details on the test approach, technical configuration, results, and guidelines on sizing and capacity planning for your requirements. This document is intended for use by technical teams that are evaluating, installing, or designing mobile system architectures built using Verivo's platform.

Overview

Verivo's platform is designed to be highly scalable and reliable across a wide range of employee- and customer-facing apps, and under the stresses applied by large user loads and concurrency levels.
This document analyzes system performance and scalability with three goals:

- Demonstrate the platform's ability to support large user volumes and system loads
- Analyze system performance across typical and peak production load levels
- Guide customers through capacity planning to meet their business requirements

Tests were conducted by driving high volumes of mobile activity through a production-grade Verivo server configuration. Verivo's Elevate Sales demonstration app, shown below, provided the business context for the test, using back-end web and database services typical of both enterprise and consumer apps. Mobile app activity was generated via desktop workstations, simulating load levels designed to approximate large mobile deployments with high activity levels. Server loads and system response times were measured throughout the test, and details are provided below.

Figure 1 - Test App Screen Examples

Product Architecture

Figure 2 - Verivo System Architecture

Verivo's enterprise mobility platform allows customers to build and deploy cross-device mobile apps that are securely integrated with a wide range of public and private data sources. A deployment of Verivo's platform consists of five main components, as shown above and described below.

Client - The native software client runs on iOS, Android, and BlackBerry devices, providing all client-side services including native UI, device integration, caching, encryption, server communication, and many others. Clients request data and services from Verivo app servers via http(s).

Verivo's AppServer - The app server provides mobile-optimized middleware services to support all apps built using Verivo's platform. The server authenticates users, delivers app configurations, and manages all bi-directional data handling via both real-time requests and data synchronization. It integrates with a wide range of data sources via a customizable plugin layer, and provides a range of other services including single sign-on, workflow management, localization, time zone management, compression, logging, and usage tracking. The server is stateless, allowing multiple load-balanced app servers to run in parallel to handle increased loads and provide system redundancy.

Database Server - The database server houses all app and system configuration data, and tracks all runtime activity including logging, audit trail, and usage data.

Verivo's AppStudio - AppStudio is the tool used by app developers to build rich, native, cross-platform mobile apps.

Data Sources - Verivo's platform mobilizes a wide range of enterprise systems including authentication & entitlement services, business apps, web services, and data warehouses. Developers may also mobilize public systems such as LinkedIn, Twitter, news feeds, market data, Facebook, and many others. Data sources are accessible via numerous protocols including SQL, stored procedures, RESTful services, and SOAP.

The system requirements for Verivo v7.3 are detailed in the SystemRequirements7.3 document located on the Verivo FTP site.

These components provide an integrated development- and run-time environment that allows Verivo customers to rapidly build, deploy, and manage enterprise-grade mobile systems. The following diagram details the interactions between the elements of our component architecture:

Figure 3 - Verivo Component Architecture

Verivo's mobile client runs as a native app on each supported client platform (iPhone, iPad, Android, BlackBerry, etc.). The client consumes its configuration at login, defining the user interface, app flow, caching rules, app integration, security policies, and a wide array of other app behavior settings. Mobile apps are built in Verivo's AppStudio with no coding or compilation required by the customer.

The client is installed once, over the air (OTA), using all platform-supported deployment mechanisms including app stores, ad hoc and enterprise distribution, and delivery via third-party mobile device management (MDM) products. All communications are made using http/https over the carrier network; additional encryption mechanisms are typically employed, including BES encryption or mobile VPNs. Large responses are compressed prior to transport.

Test Environment

Figure 4 - Performance Test System Configuration

This performance test utilized a typical Verivo production installation. The server environment consisted of two dedicated, load-balanced Verivo app servers supported by a dedicated database server. Data services were provided by a proprietary authentication web service, business data stored in a SQL database, and real-time news feeds via the NY Times API.

Mobile app activity was generated using Apache JMeter to drive simulated production loads from a master test console and a set of 5 test slaves. The master manages a JMeter test plan designed to simulate typical high-volume app usage patterns, as described in Appendix A - Test Model Details. The master distributes this test plan to all slave machines, which then execute the plan in parallel, producing load on the server. These slaves execute the same http requests made by Verivo's client; the app server does not know whether the requestor is a mobile device or a JMeter slave, so the response, load, and measurement of server performance are accurate simulations of mobile activity. This approach allows our team to test server scalability without the need to use hundreds of individual mobile devices.

The system components used in this test were configured as follows:

Web App Servers (2)
- Dell PowerEdge R610 Server
- 2 x 2.5 GHz Intel Xeon 5620 CPUs (quad core)
- 16 GB RAM
- 8 x 300 GB 10K RPM SAS drives in RAID 5 array
- Microsoft IIS 7.0
- Windows Server 2008 R2 Standard Edition, 64-bit
- Approximate hardware cost: $4,850 USD

Database Server
- Dell PowerEdge R610 Server
- 2 x 3.06 GHz Intel Xeon 5667 CPUs (quad core)
- 16 GB RAM
- 8 x 146 GB 15K RPM SAS drives in RAID 5 array
- Intel quad-port Gigabit Ethernet NIC
- Microsoft SQL Server 2008 R2
- Windows Server 2008 R2 Standard Edition, 64-bit
- Approximate hardware cost: $7,000 USD

Client Test Agents
- Various Dell professional-grade Windows 7 laptops and Apple MacBook Pros
- Apache JMeter v2.5

Load Balancer
- Microsoft Application Request Routing
- Virtual machine with 2 x 2.53 GHz Intel Xeon E5540 vCPUs
- 2 GB RAM
- 1 virtual E1000 NIC
- Windows Server 2008 R2 Standard Edition, 64-bit

Network
- 1 Gb LAN

Data Sources
- Proprietary authentication web service
- Verivo Elevate sales demonstration DB (SQL Server 2008 R2)
- New York Times API

Verivo Software's Mobility Platform
- Verivo v7.3.9

Figure 5 - Test System Specifications

Performance Test Model

This performance analysis simulated a production enterprise mobile app in use at a variety of user volumes and concurrency rates. The app used as the basis for the simulation is Verivo's Elevate Sales app, a typical enterprise-style app that exercises a wide range of server-side authentication, data synchronization, real-time data retrieval & updates, and various other services. The test was run on dedicated physical servers, but could also be deployed in a virtual environment.

The primary objective of this model is to ensure that the expected mobile Transactions per Second (TPS) rates fall well within a production server farm's capacity at estimated volume levels, activity rates, and usage profiles. The model is built on a single user profile that reflects the type and frequency of server calls generated by a typical production user (i.e., Expected Calls/Day). The details of the test model, assumptions, expected TPS rates, and other key data are available in Appendix A - Test Model Details.
The model's parameters and their default values are as follows:

Total # of Users: 10,000 - The total available user population for an app (additive for multiple apps). Drives transaction volumes in each deployment scenario.

Active User Rate: 60% - The number of users expected to use the app on any given day, as a percentage of the total user count.

Hours per Day: 10 - The number of hours in a typical work day. Adjust according to your business, considering global app use.

Peak Factor / Super Peak Factor: 1.5x / 2.5x - Load accelerators that represent maximum expected loads under very heavy usage, peak periods, or high-concurrency conditions. Adjust to reflect the peak conditions of your business and ensure necessary capacity.

Cache Factor: 10% - Reduces server activity by the estimated rate of requests fulfilled by data already available on the device, via caching or data synchronization. Apps that make extensive use of caching and data sync should increase the Cache Factor; those that rely on real-time data should reduce it.

Transaction Volume: as shown = Calls per Day * Cache Factor * Peak Factor * Active User Rate. The transaction counts shown in each cell of the model are calculated using this formula; required TPS rates are derived from these counts by dividing by the time within the designated business day.

Figure 6 - Test Model Parameters
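The model's arithmetic can be sketched in a few lines of code. This is an illustrative reconstruction, not Verivo's actual spreadsheet: the calls-per-day figure is an assumed placeholder, and because the model text says the Cache Factor "reduces server activity", the sketch applies it as a (1 - factor) reduction; adjust if your reading of the model differs.

```python
# Illustrative sketch of the capacity-planning model above.
# calls_per_day is an assumed placeholder; substitute your own app profile.

def required_tps(total_users, active_rate, calls_per_day,
                 cache_factor, peak_factor, hours_per_day):
    """Estimate the peak server TPS implied by the test model parameters.

    cache_factor is applied as a (1 - factor) reduction: the share of
    requests served from on-device cache/sync never reaches the server.
    """
    daily_calls = total_users * active_rate * calls_per_day * (1 - cache_factor)
    peak_calls = daily_calls * peak_factor
    return peak_calls / (hours_per_day * 3600)

# Example: the whitepaper's default parameters with an assumed
# 250 calls per user per day.
tps = required_tps(total_users=10_000, active_rate=0.60, calls_per_day=250,
                   cache_factor=0.10, peak_factor=1.5, hours_per_day=10)
print(round(tps, 2))  # 56.25
```

With these assumed inputs the model lands inside the 34-84 TPS band of the enterprise example later in this document; swapping in the 2.5x Super Peak Factor raises the estimate accordingly.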

Test Run Results

The test profiles described above were run more than 50 times under various user counts, durations, ramp-up times, IIS settings, concurrency rates, and other variables. Test runs ranged from short 5-minute tests to long multi-hour runs. From these many iterations, this analysis focuses on a one-hour reference run that processed over half a million transactions. The details of this run, and of typical results across the many test iterations, are shown below.

                                     Reference Run     Typical Results Range
Duration                             57:11             5 min - 5.5 hrs
Total Transactions                   514,830           55,000 - 2,056,080
Transactions per Second (avg / max)  151 / 228         102-220 / 205-300
Average Response Time                1.4 sec           0.9 - 1.6 sec
App Server CPU Load (avg / max)      37% / 44%         35-40% / 40-56%
DB Server CPU Load (avg / max)       16% / 22%         14-18% / 20-31%

Figure 7 - Performance Test Results

These results show that a typical Verivo production configuration of two moderately sized, load-balanced app servers can easily support the concurrency rates and throughput demands of large, highly active mobile deployments. The 151 TPS sustained rate of the reference test is the expected active load for a total user population of 220,000 users under Peak load conditions using our model (see Appendix A - Test Model Details). At this sustained rate, all server resources (CPU, memory, disk, and network) are well within reasonable average load levels, with neither sustained nor spiky activity near maximum levels.

These results demonstrate a sustained average rate of 38 TPS/CPU and a peak rate of 57 TPS/CPU (using four quad-core CPUs: two CPUs in each load-balanced app server). Verivo recommends that customers plan for 15-30 TPS/CPU when sizing production systems, using a capacity plan that takes into account specific business requirements, environment variables, unplanned factors, and the Test Considerations noted below.
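The per-CPU rates quoted above follow directly from the reference run figures. A quick back-of-the-envelope check (not part of the original test harness):

```python
# Back-of-the-envelope check of the reference run figures quoted above.
duration_s = 57 * 60 + 11           # 57:11 reference run = 3,431 seconds
transactions = 514_830
cpus = 4                            # two load-balanced app servers, 2 CPUs each

avg_tps = transactions / duration_s
print(round(avg_tps, 1))            # 150.1, close to the reported 151 TPS
                                    # average (likely computed per-interval)

tps_per_cpu_sustained = 151 / cpus  # 37.75 -> the reported ~38 TPS/CPU
tps_per_cpu_peak = 228 / cpus       # 57.0  -> the reported 57 TPS/CPU
print(round(tps_per_cpu_sustained), round(tps_per_cpu_peak))
```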

Typical Installation - Enterprise Employee App

The configuration shown here represents a typical enterprise deployment model. This example models an enterprise mobile app available to 10,000 employees with highly active users and standard peak usage rates. The model predicts concurrent volumes between 34 and 84 TPS. A two-server, dual-CPU load-balanced configuration, similar to the test configuration, is recommended to achieve peak load TPS rates with sufficient headroom for growth, and to meet typical enterprise availability and redundancy requirements.

Figure 8 - Typical Enterprise Deployment Sizing

Typical Installation - Consumer Mobile App

The configuration shown here represents a typical B2C deployment model. This example models a consumer-facing mobile app available to 500,000 end users with moderate activity levels and standard peak rates. The model predicts concurrent volumes between 135 and 337 TPS. A load-balanced server farm of 4-5 dual-CPU servers is recommended to achieve peak load TPS rates with sufficient headroom for growth, and to meet typical availability and failover requirements.

Figure 9 - Typical B2C Deployment Sizing
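The sizing behind both examples can be approximated with a small helper. This is an illustrative sketch, not Verivo's sizing tool: it defaults to the roughly 38 TPS/CPU sustained rate observed in the reference test, and its result is a raw minimum before the headroom and redundancy that the recommendations above add.

```python
import math

def min_dual_cpu_servers(peak_tps, tps_per_cpu=38, cpus_per_server=2):
    """Raw minimum count of dual-CPU app servers needed to sustain peak_tps.

    tps_per_cpu defaults to the ~38 TPS/CPU sustained rate observed in the
    reference test; for conservative planning, use the recommended 15-30
    TPS/CPU range instead. Add servers on top for headroom and redundancy.
    """
    return math.ceil(peak_tps / (tps_per_cpu * cpus_per_server))

print(min_dual_cpu_servers(84))                   # enterprise peak: 2 servers
print(min_dual_cpu_servers(337))                  # B2C peak: 5 servers
print(min_dual_cpu_servers(337, tps_per_cpu=15))  # conservative end: 12 servers
```

At the observed sustained rate this reproduces the two-server enterprise recommendation and the top of the 4-5 server B2C range; planning at the conservative 15-30 TPS/CPU band sizes the farm larger.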

Test Considerations

There are many factors that may affect the performance and sizing results in other environments. Factors to consider include:

- Customers should assess their pattern of peak load activity and size accordingly. Common scenarios where volumes may spike abnormally include Monday mornings, end of month, periods of major market activity, new app updates, and other business events where unusually high concurrency rates may occur. Adequate TPS capacity should be in place to handle such spikes and avoid reduced throughput and timeouts.
- Mobile applications face a challenging set of real-world limitations that may reduce throughput, including relatively slow and sporadic wireless networks, network latency, and reduced device processing power. Such factors are not considered in this analysis.
- Customers running multiple applications on a single Verivo app server farm must consider the combined load of all applications and back-end services when conducting capacity planning.
- Customers may increase their server capacity by maximizing the number of CPU cores on their servers. These tests were run on 4-core machines; additional cores will yield increased throughput at a relatively small incremental hardware cost and no increase in Verivo's platform license fees.
- Transport data encryption will reduce system throughput. Customers using VPN clients, SSL, or 3rd-party transport encryption schemes may experience moderate throughput reductions.
- Customers should optimize their use of data synchronization and on-device data caching to balance their requirements between offline convenience, application response time, and security. Applications that make efficient use of local data will reduce overall server activity (increasing their Cache Factor).
- Customers utilizing data synchronization must take care to ensure data sync sizes are not overly large. These tests were conducted with data sync sizes in the 250 KB - 5 MB range. Larger data sync sizes require increased server processing and reduce overall system throughput.
- These tests were conducted on dedicated physical servers. Comparably configured virtual servers are expected to yield similar results. Verivo recommends dedicated physical hardware for high-volume production systems.
- Customers should ensure "Cache config" is enabled in their production environment to reduce disk I/O during configuration requests. See the Server tab on Application Properties in AppStudio.
- Verivo logging was set at "Errors Only" for this test, the recommended level for production systems under normal operation. Higher logging levels generate increased database activity and reduce overall system throughput. See the Server tab on Application Properties in AppStudio.
- These tests were run using standard Verivo plugins. Customers using customized plugins, executing complex joins or data filtering within their plugins or data objects, performing complex data manipulation or processing in their back-end services, or returning very large server responses can expect longer response times and reduced throughput.

Appendix A - Test Model Details

Shown here is the model used as the foundation for this test. It shows the sizing assumptions and app profile used to structure the test. A version of this model is attached, allowing you to simulate your own app and derive similar capacity planning results.

The test plan was built using this model as a basis. The plan provides the structure for executing the requests described below through a series of modeled loops and weightings, according to the targeted load level, run time, and concurrency rate of each test. The model exercises a variety of request types (data reads, inserts, updates), data interfaces (SQL, stored procedure, RESTful web service), and device types (iOS, Android, BlackBerry). These requests are made using the same URL structure that Verivo's client invokes, ensuring that test simulations accurately reflect real mobile device activity from the server's perspective.

Our testing team used Apache JMeter, an open source load testing tool, to conduct these tests. We added a performance monitoring plugin to capture CPU, disk, memory, and network activity on the app and database servers. JMeter test plans are configured via the JMeter UI and stored as .jmx files. The test plan attached here is the plan used by our testing team for this analysis. It is provided for illustration purposes, to show the general structure and approach our team used. Customers may use this plan as a structural starting point, pointing to your server environment, inserting URLs for your apps, and building control flow according to your usage patterns.

Figure 10 - Performance Test Model

Attachments: performance test model spreadsheet; JMeter performance test plan
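The load-generation idea behind the JMeter plan can be illustrated with a toy stand-in: parallel worker threads replaying the HTTP requests a client would make, with throughput counted on the server side. Everything here is hypothetical scaffolding; the local stub server and the /app/config path are invented to keep the sketch self-contained, whereas a real test points JMeter slaves at the actual app servers.

```python
# Toy stand-in for the JMeter master/slave setup: parallel workers replay
# the same HTTP requests a mobile client would make, and throughput is
# counted on the server side. The stub server and /app/config path are
# hypothetical; they exist only to make this sketch self-contained.
import threading
import time
import urllib.request
from http.server import ThreadingHTTPServer, BaseHTTPRequestHandler

hits = 0
hits_lock = threading.Lock()

class StubAppServer(BaseHTTPRequestHandler):
    def do_GET(self):
        global hits
        with hits_lock:
            hits += 1
        body = b"ok"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request console logging
        pass

server = ThreadingHTTPServer(("127.0.0.1", 0), StubAppServer)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

def worker(requests_per_worker):
    # Each worker plays the role of one slave thread in the test plan.
    for _ in range(requests_per_worker):
        with urllib.request.urlopen(f"http://127.0.0.1:{port}/app/config") as r:
            r.read()

start = time.time()
workers = [threading.Thread(target=worker, args=(50,)) for _ in range(5)]
for w in workers:
    w.start()
for w in workers:
    w.join()
elapsed = time.time() - start
server.shutdown()

print(f"{hits} requests in {elapsed:.2f}s ({hits / elapsed:.0f} TPS)")
```

The same principle scales out in JMeter's distributed mode, where the master distributes the .jmx plan to slave machines that run it in parallel.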

Appendix B - Test Result Details

The following charts provide detailed analysis of test activity and server performance during the reference test run, according to the Test Environment and test models described above. This run is representative of system performance under heavy concurrent load over a one-hour period. The charts below show mobile transaction throughput (TPS) along with CPU, memory, disk, and network response on each of the two test app servers (Athena & Ares) and the database server (Tango). In all cases, system loads were well within the physical capacity of the app and database servers under these sustained, high-concurrency loads.

Reference Test Run

The results below are for the reference high-volume performance test run described in Test Run Results, above. This test ran for 57 min 11 sec and executed 514,830 transactions at an average rate of 151 TPS, with an average transaction response time of 1.4 seconds. As depicted in the detailed charts below, all servers operated well within capacity on all key system measures.

- Transaction Detail: table showing the execution count of each type of request made; average/min/max response times in ms; request error rates; TPS rates (Throughput); and data sizes and flow rates
- CPU Performance: the measured, aggregate CPU load on both app servers and the database server
- TPS Rate: actual mobile transactions-per-second throughput rates during the test run
- Memory Performance: server memory utilization throughout the test run (note: the x10 factor on App Server 2 is a scaling artifact of the charting engine; memory use on both app servers was steady between 1.8 and 2.0 GB during all tests)
- Disk Performance: disk I/O read and write activity across all servers (note the x100 and x1000 scaling adjustments on several series)
- Network Performance: all data sent and received on each server during the test run

Figure 11 - One Hour Test, Transaction Detail

CPU activity levels remained steady throughout the reference test on both app servers and the supporting database server. Both sustained and peak CPU loads were well within machine limits: the app servers ran at roughly 37% CPU and the database server at roughly 16%.

Figure 12 - Reference Test, CPU Performance

Transaction levels showed peak activity in the early and late phases of the reference test, with steady sustained levels of around 140 TPS throughout the core of the run. These early and late spikes were consistent across both short and long test runs and are an artifact of the test execution flow. Ignoring the endpoints yields strong, sustained TPS throughput and a sustained 1.4-second average response time.

Figure 13 - Reference Test, TPS Rate

Figure 14 - Reference Test, Memory Performance

Figure 15 - Reference Test, Disk Performance

Figure 16 - Reference Test, Network Performance

Appendix C Sample Production Configurations

The following system configurations are examples of the types of production deployments Verivo customers use to meet their varied volume demands, response time requirements, and availability needs. The configurations shown below may be deployed physically or virtually. Customers may tune their systems by varying memory, CPUs and cores, disk technology, network topology, and other factors to reach optimal results. These configurations are provided as a guideline only; customers should analyze their specific business, technical, and regulatory requirements and deploy development, test, and production systems accordingly. All tests in this document used a Typical Configuration.

Small Configuration

This entry-level configuration is designed for deployments with relatively small user populations (< 1,000) and low concurrency demands. It relies on a single dual-CPU server, offering no redundancy or load balancing. Customers should consider a configuration like this when hardware cost is a key concern, transaction volumes are relatively low and predictable, and some risk of system downtime is tolerable to the business (i.e., no server redundancy).

Typical Configuration

This configuration is the most common production system setup, designed for deployments with moderate to large user populations (thousands to tens of thousands, depending on activity levels) and above-average concurrency. It uses two (or more) load-balanced servers, providing a ready ability to scale and offering a significant level of redundancy and increased availability. Customers should consider a load-balanced configuration like this for most production deployments, in order to both meet growing mobile demands and maximize system uptime.
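The choice between these configurations can be framed as a back-of-the-envelope capacity calculation. The sketch below is a hypothetical sizing aid, not part of Verivo's methodology; the per-server throughput constant and the example inputs are assumptions, and customers should substitute figures measured against their own apps.

```python
import math

# Assumed sustainable throughput per app server. Illustrative only: the
# reference test sustained ~150 TPS across two servers well under capacity.
TPS_PER_APP_SERVER = 75

def app_servers_needed(users: int, concurrency: float,
                       requests_per_user_per_s: float,
                       headroom: float = 0.5) -> int:
    """Estimate app server count for a target load, leaving spare capacity.

    concurrency: fraction of users active at once (e.g. 0.05 = 5%)
    headroom: fraction of per-server capacity reserved for spikes/failover
    """
    peak_tps = users * concurrency * requests_per_user_per_s
    usable_tps = TPS_PER_APP_SERVER * (1 - headroom)
    # At least two servers, so the estimate never drops load balancing
    # and redundancy, per the Typical Configuration guidance.
    return max(2, math.ceil(peak_tps / usable_tps))

# Example: 10,000 users, 5% concurrent, one request every 10 s per active user
# -> peak of 50 TPS, which two load-balanced servers cover with headroom.
```

A calculation like this only bounds the app tier; database capacity, response time targets, and regulatory constraints still need their own analysis, as noted above.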

Large Configuration

The Large Configuration is designed for sizable deployments with large and growing user populations and high concurrency requirements (> 100,000 users). It uses four (or more) multi-processing, load-balanced servers, providing deep app server processing power and throughput, with high redundancy. Customers should consider a configuration like this for very large employee and partner app deployments, or for B2C apps where user counts or activity levels are high to start and expected to grow steadily.

About Verivo Software

A leading provider of enterprise mobility software, Verivo Software helps companies accelerate their business results. Its unique technology empowers teams to centrally build, deploy, manage, and update their mobile apps rapidly, securely, and across multiple devices. Hundreds of companies in numerous industries around the world rely on Verivo's platform to drive their mobility initiatives. To learn more, visit www.verivo.com.