Web Services Performance: Comparing Java 2™ Enterprise Edition (J2EE™ platform) and the Microsoft .NET Framework




A Response to Sun Microsystems' Benchmark
Microsoft Corporation
July 2004

Introduction

In June 2004, Sun Microsystems published a benchmark showing the relative performance of Microsoft .NET vs. Sun's Java Web Services Developer Pack (JWSDP) on a series of simple Web Service tests. The original Sun benchmark report and results can be found at http://java.sun.com/performance/reference/whitepapers/ws_test-1_.pdf. Sun's report shows Java outperforming .NET in their WSTest benchmark suite by a wide margin. Sun did not publish the source code used for the tests when they published the paper (as of July 14th, 2004), so customers could not examine the test suite or replicate and verify the results at that time. In their paper, however, they describe the functional specification for the Web Services tested, the benchmark driver program, the tuning used for both Java and .NET, and the software/hardware platform used in their testing. This information provided enough detail for Microsoft to create an implementation of both the Java and .NET benchmark suites, and to verify the test results using the latest platform components, including Sun's J2SDK 1.4.2_05, Tomcat 5.0, and JWSDP 1.4.

In this paper we describe the results of testing these re-created implementations. Microsoft is making all source code available for both our Sun JWSDP and .NET implementations, as well as our driver program, so that customers and other vendors can download the code and verify the results on their own hardware platforms (to download the code, visit http://www.theserverside.net/articles/showarticle.tss?id=sunbenchmarkresponse). In our tests, we used the recently released Tomcat 5.0 and Sun's JWSDP version 1.4, slightly newer releases of these software components than Sun used. We have also created several additional tests of Web Services on each platform to illustrate the relative performance when the backend Web Services are required to do more work. These additional tests are more realistic than Sun's tests, and show the relative performance when the Web Service message payload is increased.

Summary of Results

In the four tests that replicate the original Sun tests, we got similar results for the performance of the Java Web Services as reported by Sun. Our results were slightly better than Sun's results for Java, a difference that can be attributed to the fact that our backend server had CPUs that were slightly faster (3 GHz vs. 2.6 GHz) than the CPUs Sun used. However, we also found that Sun seriously misrepresented .NET performance, with our .NET results significantly outperforming the .NET results as reported by Sun. In short, the .NET results are actually more than two to three times better than Sun reported. In our tests, .NET roughly matched or slightly exceeded J2EE performance for Sun's four original tests. Furthermore, in the additional, more realistic tests involving higher Web Service message payloads, we found .NET to significantly outperform Java.

Issues with Sun's Benchmark

Most benchmarks are simplifications of real-world scenarios designed to show the performance of one part of an overall system. Sun's WSTest is such a simplification.

However, the tests performed in Sun's WSTest suite are so simplistic that we do not believe they represent enough information for customers to make balanced decisions on the relative performance of .NET and J2EE for Web Services. For example, Sun's tests do not include any database interactions, nor do they include the use of core features of J2EE such as Enterprise JavaBeans. Furthermore, Sun did not use a major commercial application server platform such as IBM WebSphere or BEA WebLogic in their testing, choosing instead to use Tomcat, a lightweight application server that does not support core J2EE features such as EJBs. For these reasons, we encourage customers to also read the more thorough Doculabs Web Service benchmark report, which can be found at http://www.theserverside.net/articles/content/doculabswebservicescalability/doculabswebservicescalability.pdf. This benchmark reports the relative performance of .NET and J2EE Web Services when used against a variety of backend databases, is inclusive of EJB and Servlet testing, and includes results for J2EE on three different application servers, including JBoss and two major commercial J2EE application servers.

Notwithstanding these issues, customers should examine the .NET results in this report, as they are significantly different from Sun's findings, and also examine the results for tests involving Web Services that perform more realistic work on the backend, where .NET significantly outperforms Sun's JWSDP. Finally, customers can download, examine, and execute the code for the .NET implementation and the Java implementation, as well as the common driver program we used in the benchmark, to fully replicate the results and perform further testing.

Test Description

As described by Sun, WSTest simulates a multi-threaded server program that makes multiple Web Service calls in parallel. To avoid the effects of other platform components, the Web Service methods perform no business logic but simply return the parameters that were passed in. Sun's WSTest benchmark and Microsoft's implementation of WSTest include the following four Web Services, as described in Sun's benchmark paper:

echovoid -- Sends and receives an empty message.

echostruct -- Sends and receives an array of size listsize; each element is a structure composed of one element each of an integer, float and string datatype.

echolist -- Sends and receives a linked list of size listsize; each element is a Struct as defined in echostruct.

echosynthetic -- Sends and receives multiple parameters of different types: string, struct and a byte array of size 1K.
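To make the shape of these operations concrete, the following is a minimal Java sketch of a JAX-RPC style service endpoint interface for the four operations. The type and method names here are illustrative assumptions, not the published benchmark code, and each type would live in its own source file; getters and setters are omitted.

import java.rmi.Remote;
import java.rmi.RemoteException;

// Illustrative struct: one int, one float, one String, as described above.
public class SimpleStruct implements java.io.Serializable {
    public int intValue;
    public float floatValue;
    public String stringValue;
}

// Illustrative linked-list node whose payload is a SimpleStruct.
public class ListNode implements java.io.Serializable {
    public SimpleStruct value;
    public ListNode next;
}

// Illustrative service endpoint interface: each method simply echoes its input.
public interface WSTestService extends Remote {
    void echoVoid() throws RemoteException;
    SimpleStruct[] echoStruct(SimpleStruct[] items) throws RemoteException;
    ListNode echoList(ListNode head) throws RemoteException;
    // Return shape simplified; the real operation echoes all parameters back.
    SimpleStruct echoSynthetic(String s, SimpleStruct struct, byte[] bytes) throws RemoteException;
}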

Microsoft also extended WSTest to additionally test the following new Web Services that perform more work on the backend and are hence more realistic:

echolist with list size 100 -- Sends and receives a linked list of 100 elements; each element is a Struct as defined in echostruct.

echolist with list size 200 -- Sends and receives a linked list of 200 elements; each element is a Struct as defined in echostruct.

echostruct with list size 100 -- Sends and receives an array of size 100; each element is a structure composed of one element each of an integer, float and string datatype.

echostruct with list size 200 -- Sends and receives an array of size 200; each element is a structure composed of one element each of an integer, float and string datatype.

getorder -- Accepts two integers as parameters, creates a customer order object, and returns it to the client. The customer order object simulates a real order with XML/SOAP data types, including structures representing customer information such as bill-to and ship-to addresses, an order header, and a randomly generated number of line items per order.
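As a rough illustration of the kind of payload getorder returns, the order could be modeled along the following lines in Java. The field and type names are illustrative assumptions, not the published benchmark types; each class would live in its own source file.

// Illustrative sketch of a getorder-style return payload.
public class CustomerOrder implements java.io.Serializable {
    public OrderHeader header;     // order-level data such as ids, dates and totals
    public Address billTo;         // customer bill-to address
    public Address shipTo;         // customer ship-to address
    public LineItem[] lineItems;   // randomly sized array of line items
}

public class Address implements java.io.Serializable {
    public String street;
    public String city;
    public String state;
    public String zip;
}

public class OrderHeader implements java.io.Serializable {
    public int orderId;
    public int customerId;
    public String orderDate;
    public float orderTotal;
}

public class LineItem implements java.io.Serializable {
    public int itemId;
    public int quantity;
    public float unitPrice;
}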

Like the Sun benchmark, the Microsoft implementation of WSTest reports throughput as the average number of Web Service operations executed per second, as well as response times measured at peak throughput. In all cases care was taken to ensure the client driver was not the bottleneck and that the number of agent threads was enough to saturate each server, so that the numbers reported accurately reflect peak throughput on the hardware used. We found that a single 2-proc client machine was unable to completely saturate the Web Service host machine for several of the Web Services tested. Without saturating the server, peak throughput is not achieved, so it is critical to run enough clients to ensure full saturation of the server. Hence, we used two client machines in our tests and ensured that the Web Service host machine was saturated to 100% CPU load for both the Java and .NET implementations tested. It should be noted that with two client driver machines in use, we were able to saturate both the Java and .NET implementations to above 97% CPU load.

Test Details

As described by Sun, our implementation of WSTest can be configured with the following parameters, specified in an initialization file:

Agents -- the number of client threads; set to maximize CPU utilization and system throughput.
RampUp -- time allotted to system warmup.
SteadyState -- time allotted to collecting data.
RampDown -- time allotted for rampdown.
EchoVoidMix -- % of operations that are echovoid.
EchoStructMix -- % of operations that are echostruct.
EchoListMix -- % of operations that are echolist.
GetOrderMix -- % of operations that are getorder.
EchoSyntheticMix -- % of operations that are echosynthetic.
ListSize -- size of the list for echolist and echostruct.
NumBytes -- size of the byte array for echosynthetic.

To replicate Sun's testing, no think time is used in the client driver. The throughput is reported by the clients at the end of the run. When running two clients, throughput is aggregated across the client machines, and response times, if they differ, are averaged across the clients.

System Configuration

WSTest was run on the following system configuration, with the same hardware and Windows Server 2003 software used for both J2EE and .NET. The common client driver was run on two machines separate from the Web Service host. Systems were connected via a Gigabit Ethernet link.

Web Service host machine
Dell PowerEdge 2650, 2 x 3.06GHz with 2GB RAM
Windows Server 2003 Enterprise Edition with .NET Framework 1.1 enabled
J2SE 1.4.2_05 SDK (uses 1GB heap)
JWSDP 1.4 with Tomcat 5.0

Benchmark client driver machines (2)
Unisys, 4 x 3.0GHz with 8GB RAM
Windows Server 2003 Enterprise Edition
J2SE 1.4.2_05 SDK (uses 1GB heap)
JWSDP 1.4

The Java Web Services Developer Pack Version 1.4 was used for testing the JAX-RPC implementation. This pack includes Tomcat 5.0 as the Web server. Windows Server 2003 includes .NET Framework 1.1 and IIS 6.0 as the Web server.

WSTest Configuration

For the results reported, WSTest was run with the following parameters set, with the mix changed to 100% for each of the Web Services tested:

Machines the client driver was run on = 2

Config for each client machine:
Agents = 8
RampUp = 3
SteadyState = 3
RampDown = 1
NumBytes = 1024
ListSize = 20, 100 and 200 to test three different message sizes
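To make concrete how the Agents, RampUp, SteadyState and RampDown parameters fit together, here is a minimal Java sketch of how a driver loop of this kind can be structured: each agent thread invokes the Web Service back-to-back with no think time, and only operations completed during the steady-state window count toward throughput. The class, the timing values, and the callEchoVoid stub are illustrative assumptions, not the published driver code.

import java.util.concurrent.atomic.AtomicLong;

public class WSTestDriverSketch {
    static final AtomicLong operations = new AtomicLong();     // calls completed in steady state
    static final AtomicLong totalLatencyMs = new AtomicLong(); // summed response time in steady state
    static volatile boolean measuring = false;
    static volatile boolean running = true;

    public static void main(String[] args) throws InterruptedException {
        int agents = 8;                                                    // "Agents" parameter
        long rampUpMs = 30000, steadyStateMs = 300000, rampDownMs = 10000; // illustrative durations

        Thread[] threads = new Thread[agents];
        for (int i = 0; i < agents; i++) {
            threads[i] = new Thread(() -> {
                while (running) {
                    long start = System.currentTimeMillis();
                    callEchoVoid();                                        // no think time between calls
                    long elapsed = System.currentTimeMillis() - start;
                    if (measuring) {
                        operations.incrementAndGet();
                        totalLatencyMs.addAndGet(elapsed);
                    }
                }
            });
            threads[i].start();
        }

        Thread.sleep(rampUpMs);        // warm up
        measuring = true;
        Thread.sleep(steadyStateMs);   // collect data
        measuring = false;
        Thread.sleep(rampDownMs);      // ramp down
        running = false;
        for (Thread t : threads) t.join();

        double throughput = operations.get() / (steadyStateMs / 1000.0);
        double avgResponseMs = operations.get() == 0 ? 0 : (double) totalLatencyMs.get() / operations.get();
        System.out.printf("Throughput: %.1f ops/sec, average response time: %.1f ms%n", throughput, avgResponseMs);
    }

    // Stub standing in for an actual Web Service invocation (e.g. echovoid).
    static void callEchoVoid() { }
}

A real run would substitute a generated service stub for callEchoVoid and, as described above, sum throughput and average response times across the two client machines.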

Windows Tuning

We did not find any of the following tuning parameters to have a material impact on either the Java or the .NET results; the Windows tuning performed by Sun, listed below, proved unnecessary, but we replicated it anyway to be consistent with Sun's tests:

Disabling the following services: Alerter, ClipBook, Computer Browser, DHCP Client, DHCP Server, Fax Service, File Replication, Infrared Monitor, Internet Connection Sharing, Messenger, NetMeeting Remote Desktop Sharing, Network DDE, Network DDE DSDM, NWLink NetBIOS, NWLink IPX/SPX, Print Spooler, TCP/IP NetBIOS Helper Service, Telephony, Telnet, Uninterruptible Power Supply.

Setting the following registry keys under HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters:
TcpWindowSize = 0xffff
TcpMaxSendFree = 6424

.NET Tuning

The table below shows the .NET tuning applied by Sun and the tuning applied by Microsoft in our replicated tests. Key differences:

We found no impact from changing the Garbage Collection mode for .NET and chose to leave it at its default setting.

It is unnecessary in this test on a 2-proc machine to increase the number of worker processes.

If Sun used a .NET client driver program rather than a Java driver program for their tests (their report does not say), they should have increased MaxNetworkConnections in machine.config on their client machine. The default setting of 2 throttles the number of outbound network connections the client can make for Web Service calls, and should have been set to at least equal the number of client threads being run. Since we used a common Java driver program for the benchmark, this client-side setting did not need to be applied in our tests.
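For reference, the client-side connection limit mentioned above lives in the .NET machine.config file; in the .NET Framework 1.1 it is typically raised through the connectionManagement element, roughly as sketched below. The value shown is illustrative and should be at least the number of agent threads being run.

<configuration>
  <system.net>
    <connectionManagement>
      <!-- Raise the default of 2 outbound connections per host so that
           client agent threads are not throttled when calling the service. -->
      <add address="*" maxconnection="80" />
    </connectionManagement>
  </system.net>
</configuration>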

Element Tuned                       | Sun's Setting                                                                | Microsoft's Setting
Garbage Collection Mode             | Server                                                                       | Default (User)
IIS Logging                         | Disabled                                                                     | Disabled
Worker Process Recycling            | Disabled                                                                     | Disabled
Pinging and Rapid Fail Protection   | Disabled                                                                     | Disabled
Number of Kernel Requests           | Unlimited                                                                    | Unlimited
Number of Worker Processes          | 4                                                                            | 1 (default value)
ASP.NET Session State               | Disabled                                                                     | Disabled
ASP.NET Authentication Mode         | None                                                                         | None
Debug Compilation                   | False                                                                        | False
IIS Web Server Authentication Mode  | Not specified (presumably the default of Integrated Windows Authentication) | Anonymous (integrated authentication disabled to match Tomcat authentication behavior)

Java Tuning

We precisely matched Sun's tuning as follows:

1. The following parameters were changed from their default values for Tomcat in <JWSDP_HOME>\conf\server.xml:

The access log Valve was commented out to disable access log writing:
<!--
<Valve className="org.apache.catalina.valves.AccessLogValve"
       directory="logs" prefix="access_log." suffix=".txt"
       resolveHosts="false"/>
-->

On the non-SSL Connector, lookups were disabled and the minimum number of processors was set:
enableLookups="false"
minProcessors="8"

2. The JVM options for the Catalina server in <JWSDP_HOME>\jwsdp-shared\bin\launcher.xml were set as follows:

<launch classname="org.apache.catalina.startup.Bootstrap" ... >
  <jvmarg value="-server" />
  <jvmarg value="-Xms256m" />
  <jvmarg value="-Xmx256m" />

3. On the client side, the JVM options were changed in <WSTest_HOME>\java\src\build.xml:

-Dhttp.maxConnections=256 -Xms3M -Xmx3M

Performance Results

The measured throughput and response times obtained are shown graphically below, with throughput measured in transactions per second (higher is better) and response times measured in milliseconds (lower is better). In the four simple tests Sun performed, .NET performed 2-3 times better than Sun reported across the board, beating the Java results in both the echostruct and echolist tests even with a small list size of 20 (as Sun tested). As the list size was increased in these tests, thereby increasing the XML SOAP message size, .NET exceeded Java performance by a wider and wider margin. For example, with a list size of 100, .NET performed 72% better in the echolist test, while at a list size of 200, .NET performed 82% better than Java. .NET also exceeded Java performance in the new getorder test.

[Charts omitted: throughput (operations per second) and response time (milliseconds) graphs comparing .NET and Java for echovoid; echostruct with list sizes 20, 100 and 200; echolist with list sizes 20, 100 and 200; echosynthetic; and getorder.]

Conclusion

In this paper, we respond to Sun's representation of .NET vs. J2EE relative Web Service performance with corrected results for .NET, and expand on Sun's tests to show relative .NET vs. J2EE performance for more realistic Web Services that do more work. We provide all the source code, including the .NET implementation, the Java implementation, and the driver program, so customers can replicate the results. We believe we have accurately re-created Sun's original tests, given that the results we achieve for Sun's four original tests very closely match Sun's reported results when taking into account the slightly faster server hardware we used in our testing (3 GHz vs. 2.6 GHz CPUs). However, we find the .NET results to be 2-3 times better than Sun reports. Finally, we find .NET to significantly exceed the performance of Sun's JWSDP 1.4 in tests involving larger message sizes. We encourage customers to download our benchmark kit and test the platforms for themselves, and also to examine the more comprehensive Doculabs @Bench Web Services benchmark, which can be downloaded from http://www.theserverside.net/articles/content/doculabswebservicescalability/doculabswebservicescalability.pdf.

© 2004 Microsoft Corporation. All rights reserved. Microsoft, Windows, and Windows Server are either registered trademarks or trademarks of Microsoft Corporation in the United States and/or other countries. The names of actual companies and products mentioned herein may be the trademarks of their respective owners.