AppDynamics Lite Performance Benchmark
For KonaKart E-commerce Server (Tomcat/JSP/Struts)




At AppDynamics, we constantly run performance overhead tests and benchmarks with all kinds of Java/J2EE applications to measure the overhead of the AppDynamics agent on a production application. In this article, we publish the results of benchmark tests we performed recently with AppDynamics Lite on a well-known E-commerce application suite called KonaKart (http://www.konakart.com/). We picked KonaKart for our performance benchmarks because it is a complex business application used in real-world, revenue-critical environments, and it is more representative of how our customers use AppDynamics than simple demo applications such as MedRec and PetStore. KonaKart is also available to download for free, so anyone can run the same performance benchmark tests we did and replicate the results.

Test Application

The KonaKart application uses Sun Java 1.6, the Tomcat servlet container, and a MySQL database. It is built on a standard JSP/Struts-based web programming stack. Here is the architecture of the application:

[Architecture diagram]

Test Goals

The goals of the benchmark tests are to measure the following:

- Response time (latency) overhead of the AppDynamics agent, measured as the change in response time for different transactions
- Throughput overhead of the AppDynamics agent, measured as the change in throughput for different transactions
- CPU, network I/O and disk I/O overhead of the AppDynamics agent, measured as the change in CPU utilization and network/disk activity on the application server machine

Test Environment

We used three machines for this benchmark test:

Machine 1: Tomcat application server
Machine 2: MySQL database
Machine 3: Apache JMeter load generator scripts

All three machines run Ubuntu Linux and are Dell PowerEdge SC 1425 boxes with a quad-core Intel Xeon 2.8 GHz CPU and 4 GB RAM.

Test Methodology

The tests were done using JMeter to generate load. We generated concurrent user load using 15 JMeter threads. The JMeter scripts and AppDynamics agent thresholds were configured to produce a realistic mix of normal, slow and very slow requests, with the goal of simulating a high-volume production environment that has some performance problems and an AppDynamics solution configured to collect all the diagnostic data needed to troubleshoot those problems. All the JMeter scripts are available on request, and these tests can be reproduced by anyone by downloading a) the KonaKart server (http://www.konakart.com/downloads), b) AppDynamics Lite (http://www.appdynamics.com/free), and c) the JMeter scripts and instructions.

Response Time Overhead Test

To measure response time overhead, we used JMeter to generate load on a populated KonaKart application and captured the JMeter results on response time (Average, Median, 90% Line, Min and Max) with and without the AppDynamics agent. The test was run for 6 hours in both cases, and JMeter was configured to generate load representative of high-volume production environments, about 120K requests per hour.
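As an illustration of how the per-run statistics and the relative overhead can be derived from the two JMeter result files, here is a minimal sketch. It assumes the results were saved as CSV with a header row containing an "elapsed" column (response time in milliseconds), uses hypothetical file names, and is written against a current JDK rather than the Java 1.6 used in the test environment:

    // Minimal sketch: response-time statistics and overhead from JMeter CSV results.
    // Assumes a header row with an "elapsed" column (response time in ms);
    // the file names below are hypothetical.
    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.Arrays;
    import java.util.List;

    public class ResponseTimeStats {

        static double[] sortedElapsedMillis(String csvPath) throws IOException {
            List<String> lines = Files.readAllLines(Paths.get(csvPath));
            int col = Arrays.asList(lines.get(0).split(",")).indexOf("elapsed");
            return lines.stream()
                    .skip(1)                                   // skip the header row
                    .mapToDouble(l -> Double.parseDouble(l.split(",")[col]))
                    .sorted()
                    .toArray();
        }

        static double avg(double[] v) {
            return Arrays.stream(v).average().orElse(0);
        }

        static void report(String label, double[] s) {
            System.out.printf("%s: avg=%.1f  median=%.1f  90%%=%.1f  min=%.1f  max=%.1f (ms)%n",
                    label, avg(s), s[s.length / 2], s[(int) (s.length * 0.90)],
                    s[0], s[s.length - 1]);
        }

        public static void main(String[] args) throws IOException {
            double[] baseline  = sortedElapsedMillis("results-no-agent.csv");
            double[] withAgent = sortedElapsedMillis("results-with-agent.csv");
            report("Without agent", baseline);
            report("With agent", withAgent);
            // Overhead = relative increase in the average response time.
            System.out.printf("Avg response time overhead: %.2f%%%n",
                    (avg(withAgent) - avg(baseline)) / avg(baseline) * 100.0);
        }
    }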

Measured overhead on average response time over the 6-hour period: 1.68%

Throughput Overhead Test

To measure throughput overhead, we used JMeter to generate load on a populated KonaKart application and captured the JMeter results with and without the AppDynamics agent. The test was run for 6 hours in both cases, and JMeter was configured to generate load representative of high-volume production environments, about 120K requests per hour.

Measured average overhead on throughput over the 6-hour period: 0.06%
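For reference, the throughput figure is the relative drop in the number of requests completed over the run. With illustrative round numbers consistent with the reported request rate (not the raw measurements):

    \text{Throughput overhead} = \frac{R_{\text{baseline}} - R_{\text{agent}}}{R_{\text{baseline}}} \times 100\%
        \approx \frac{120\,000 - 119\,928}{120\,000} \times 100\% \approx 0.06\%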

CPU, Network I/O & Disk I/O Overhead Test To measure CPU, Network I/O & Disk I/O overhead, we used JMeter to generate load on a populated KonaKart application. We captured resource utilization metrics using the Linux Collectl tool (http://collectl.sourceforge.net) with and without the AppDynamics Agent. The test was run for 6 hours in both the cases, and JMeter load generation was configured to generate load representative of high- volume production environments, of about 120K requests per hour. Measured Avg Overhead for the period of 6 hours: CPU Utilization Network In MB/sec Network Out MB/sec Disk Writes KB/sec 2.29% 0.91% 0.78% 0.74% 4

Summary

The results of the above benchmark tests, measuring the overhead of AppDynamics Lite on a high-volume KonaKart E-commerce server application, are:

Response Time Overhead      1.68%
Throughput Decrease         0.06%
CPU Utilization Increase    2.29%
Network Reads Increase      0.91%
Network Writes Increase     0.78%
Disk Writes Increase        0.74%

Copyright 2010, AppDynamics, Inc. All rights reserved.