Business white paper
HP Process Automation Version 7.0: Server performance




Table of contents

Summary of results
Benchmark profile
Benchmark environment
Performance metrics
    Process throughput
    Autonomy Process Automation server processor utilization
    Database server processor utilization
Notes and summation
    Comparison with earlier performance results
    Virtual users and real users
    Building your Autonomy Process Automation server cluster

As the leading provider of world-class business process management integrated with enterprise search, Autonomy delivers high-performance solutions that exceed our customers' expectations. Autonomy Process Automation benchmarks demonstrate performance characteristics for a range of processing volumes on a specific platform configuration. Customers and prospects can use this information to plan the resource allocations necessary to support their processing requirements. The primary objective of our benchmarking effort is to provide as much data as possible to support this important decision.

Summary of results

Virtual users: 400
APA Server nodes: 4
Process completions per second: 1.26
APA Server processor utilization: 46%
Database server processor utilization: 96%

Benchmark profile

In the spring of 2011, Autonomy conducted benchmark testing in Vista, CA, USA to gather performance metrics for Autonomy Process Automation 7. Four Dell PowerEdge SC1425 dual Intel Xeon processor systems were used as Autonomy Process Automation Servers in a single-tier clustered environment, backed by a Microsoft SQL Server 2008 R2 database server on a Dell PowerEdge 6850 quad Intel Xeon processor system. The benchmark measured the same metric set across APA Server configurations with 1, 2, and 4 member nodes, using a representative financial application process of approximately 100 non-parallel tasks in total (including 3 sub-processes embedded within the parent process). On average, 30 to 45 of the tasks in each process execution were completed, 90% of submitted applications went through to the screening process, and 65% of submitted applications made it to the funding process. The test process also involved fairly complex server-side JavaScript, e-mail notifications, multiple database exports, and LDAP user administration.

Apache JMeter was used as the load driver, simulating concurrent real users (referred to as virtual users in this document). The load driver simulated user form submissions based on the representative financial application process. The process used in this benchmark has three distinct user profiles: applicant, ranking approver, and funding approver. The three user profiles were distributed among the virtual users in a manner reflecting a typical production environment. Each test run was scheduled to last 30 minutes, with virtual users starting at a rate of 1 per second in a staggered manner based on the user profile. Each virtual user performed repeated executions of the test script with appropriate pauses to simulate real user behavior (such as time taken to fill in forms for submission). The testing was conducted in a controlled environment with no other applications running.

The goal of this benchmark was to obtain measurements at each of the node levels to provide baseline expectations for cluster scalability and performance in an Autonomy Process Automation production environment.

Figure 1. Application process
Figure 2. Technical screening sub-process
Figure 3. Technical funding sub-process
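The staggered 1-user-per-second ramp-up can be sketched in a few lines of Python. This is an illustrative model only: the 60/25/15 split of virtual users across the three profiles is an assumption made for the example, since the white paper does not publish the exact proportions.

```python
# Sketch of the virtual-user ramp-up: one new user starts per second,
# staggered round-robin across the three user profiles.
# The 60/25/15 profile split is hypothetical, for illustration only.

def ramp_schedule(total_users=400, split=(0.60, 0.25, 0.15)):
    profiles = ("applicant", "ranking approver", "funding approver")
    remaining = {p: round(total_users * s) for p, s in zip(profiles, split)}
    schedule = []  # list of (start_second, profile)
    t = 0
    while any(remaining.values()):
        for p in profiles:
            if remaining[p]:
                schedule.append((t, p))
                remaining[p] -= 1
                t += 1
    return schedule

sched = ramp_schedule()
print(len(sched))   # 400 virtual users in total
print(sched[0])     # (0, 'applicant') - first user starts at t=0
print(sched[-1][0]) # 399 - last user starts 399 seconds in
```

In a real run this schedule would be expressed as JMeter thread groups with per-profile ramp-up periods rather than hand-rolled code.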

Benchmark environment

APA Servers (1 to 4x Dell PowerEdge SC1425)
- Dual Intel Xeon EM64T 2.8GHz processors, 1MB cache, 800MHz FSB
- 4GB DDR2 400MHz (4x 1GB Dual Ranked DIMMs)
- Microsoft Windows Server 2003 Enterprise Edition SP2
- Oracle Java SE 6 Update 24
- Autonomy Process Automation 7
- 80GB 7200 rpm SATA hard drive
- Intel Pro 1000MT Ethernet port on private network
- Maximum and minimum Java VM heap sizes set to 1024 MB
- Tomcat HTTP connector MaxThreads set to 500
- Tomcat HTTP connector AcceptCount set to 100
- Tomcat AJP connector disabled
- Process Server database connection pool (MaxPoolSize) increased to 80
- Process Engine database connection pool increased to 30
- Process Engine thread pool increased to 42

Primary database server (Dell PowerEdge 6850)
- Quad Intel Xeon EM64T 3.16GHz processors, 1MB cache
- 8GB DDR2 400MHz (16x 512MB Single Ranked DIMMs)
- Microsoft Windows Server 2003 Enterprise x64 Edition SP2
- Microsoft SQL Server 2008 R2
- 73GB Ultra320 SCSI drive (10k rpm) for system and binaries
- 4x 73GB Ultra320 SCSI drives (10k rpm) in RAID 0 for data
- Intel Pro 1000MT Ethernet port on private network

File server (Dell PowerEdge 6850)
- Quad Intel Xeon EM64T 3.16GHz processors, 1MB cache
- 8GB DDR2 400MHz (16x 512MB Single Ranked DIMMs)
- Microsoft Windows Server 2003 Enterprise x64 Edition SP2
- 73GB Ultra320 SCSI drive (10k rpm) for system and binaries
- 4x 73GB Ultra320 SCSI drives (10k rpm) in RAID 0 for data
- Intel Pro 1000MT Ethernet port on private network

Load balancer
- Dual Intel Xeon 2.4GHz processors
- 1GB memory
- Red Hat Enterprise Linux ES release 3 (Taroon Update 5)
- Apache httpd 2.2.13 configured with mod_proxy_balancer
- 100BaseT Ethernet port on public network
- 1000BaseT Ethernet port on private network

Load generator (Dell PowerEdge SC1425)
- Dual Intel Xeon EM64T 2.8GHz processors, 1MB cache, 800MHz FSB
- 4GB DDR2 400MHz (4x 1GB Dual Ranked DIMMs)
- Microsoft Windows Server 2003 Enterprise Edition SP2
- Oracle Java SE 6 Update 24
- Apache JMeter 2.3.4
- 80GB 7200 rpm SATA hard drive
- Intel Pro 1000MT Ethernet port on public network

Figure 4. Performance cluster (load generator and load balancer on the public network; APA Server nodes 1-4 behind a cluster switch on the private network, alongside the database, file share, and LDAP servers)
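For readers reproducing the APA Server tuning, the Tomcat connector settings above would typically be expressed in Tomcat's server.xml. The fragment below is a hypothetical sketch: the attribute names are standard Tomcat HTTP connector settings, but the port numbers, file location, and surrounding elements in an actual APA installation may differ.

```xml
<!-- Hypothetical server.xml fragment illustrating the tuning above.
     Ports and surrounding elements are assumptions for illustration. -->
<Connector port="8080" protocol="HTTP/1.1"
           maxThreads="500"
           acceptCount="100"
           connectionTimeout="20000" />
<!-- AJP connector disabled: the default
     <Connector port="8009" protocol="AJP/1.3" ... />
     element is removed or commented out. -->
```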

Performance metrics

Process throughput
This is by far the most informative metric gathered. Increasing the number of member servers in the Autonomy Process Automation cluster increases the peak throughput of the system. The peak on each line is the theoretical maximum throughput for that configuration. Tests run with virtual user counts higher than the peak for a configuration show a performance degradation that is remedied by increasing the number of server nodes in the Autonomy Process Automation cluster. These points are good indicators of when to add server nodes (that is, scale the Autonomy Process Automation environment) based on expected throughput requirements (process completions per second).

Figure 5. Process throughput (process completions per second versus number of virtual users, for the 1-, 2-, and 4-node configurations)

Autonomy Process Automation server processor utilization
This chart demonstrates that increasing the cluster node count within a reasonable range decreases the average processor utilization of the Autonomy Process Automation Servers. Autonomy Process Automation Server processor utilization is clearly not a limiting factor for throughput capacity once the cluster contains 4 server nodes. The standalone and 2-node configurations both show a rapid increase in average processor time per server as the number of concurrent users grows, while the 4-node configuration shows a more gradual increase.

Figure 6. Node CPU (average APA Server processor utilization versus number of virtual users, for the 1-, 2-, and 4-node configurations)

Database server processor utilization
This chart demonstrates that increasing cluster size results in higher database server processor utilization, and that database server performance is the primary limiting factor for larger Autonomy Process Automation clusters. The limiting factors for the database are processor utilization first, followed by disk I/O operations. Additional measurements of APA Server local file I/O, network file share I/O, and network traffic taken during each test run demonstrated that these resources are not limiting factors. Note that the primary database server is not configured to manage the archive database as well; placing the main and archive databases on the same server has a severe performance impact, due to the mixing of very different workloads.

Figure 7. Database CPU (database server processor utilization versus number of virtual users, for the 1-, 2-, and 4-node configurations)
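The sizing guidance above - add nodes until the measured peak throughput meets the expected demand - can be sketched as a small capacity-planning helper. Only the 4-node figure (1.26 process completions per second) is reported numerically in this paper; the 1- and 2-node peaks below are placeholder values, since those results appear only graphically in Figure 5.

```python
# Illustrative capacity-planning sketch: pick the smallest measured
# cluster size whose peak throughput covers the required rate.
# Only the 4-node value is from the paper; 1- and 2-node peaks
# are placeholders standing in for the Figure 5 curves.
MEASURED_PEAKS = {1: 0.40, 2: 0.75, 4: 1.26}  # nodes -> completions/sec

def nodes_needed(required_cps):
    """Return the smallest benchmarked node count meeting the target,
    or None if the target exceeds the measured range."""
    for nodes in sorted(MEASURED_PEAKS):
        if MEASURED_PEAKS[nodes] >= required_cps:
            return nodes
    return None  # beyond measured range; re-benchmark at larger scale

print(nodes_needed(1.0))  # 4 - only the 4-node peak covers 1.0/sec
print(nodes_needed(2.0))  # None - beyond the benchmarked range
```

Beyond the measured range, the database-bound scaling seen in Figure 7 means extrapolation is unsafe; a fresh benchmark is the honest answer.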

Notes and summation

Comparison with earlier performance results
Due to changes in methodology, the results in this performance white paper are not directly comparable with results from earlier performance white papers.

Virtual users and real users
This test used virtual users managed by Apache JMeter as the load driver. Virtual users are not real-world users, but the scripted instructions for these virtual users, the processing of work generated by each user, and the measured responses of the client/server environment are meaningful tests of Autonomy Process Automation performance. Each test was scripted to account for the real time and effort that live users put into process execution, whether a form-based submission or a form-based decision. It is appropriate to examine the metrics in this report at the virtual user levels specified and to treat the virtual user count as equivalent to the number of live users simultaneously using the Autonomy Process Automation system (opening, filling, submitting, and approving form-based processes).

Building your Autonomy Process Automation server cluster
The number of workflow completions in our 30-minute test runs improved significantly when servers were added to the existing cluster. Of specific note is the test involving 400 virtual users in a 4-node Autonomy Process Automation Server cluster. Extrapolating from the test results, the 4-node cluster is capable of completing over 36,000 workflows in an 8-hour workday, or more than 700,000 workflows in a month of 20 workdays. Note that the size, complexity, and task execution percentages of your workflows contribute significantly to the throughput potential of the target production system. The process used in this examination was of moderate-to-high complexity; smaller or less complex workflows should be expected to return a higher overall throughput rate.

Note that during the execution of this benchmark test, network, file share, and file I/O were not limiting factors on maximum throughput. The Autonomy Process Automation 7 system is scalable and provides tremendous benefit to users with increasing user counts and processing demands. The requirements of your specific production environment can be determined most accurately by applying an approach similar to the one in this white paper, using your own workflows and forms in a representative test environment.

About Autonomy
Autonomy, an HP Company, is a global leader in software that processes human information, or unstructured data, including social media, email, video, audio, text, and web pages. Autonomy's powerful management and analytic tools for structured information, together with its ability to extract meaning in real time from all forms of information regardless of format, make it a powerful choice for companies seeking to get the most out of their data. Autonomy's product portfolio helps power companies through enterprise search analytics, business process management, and OEM operations. Autonomy also offers information governance solutions in areas such as ediscovery, content management, and compliance, as well as marketing solutions that help companies grow revenue, such as web content management, online marketing optimization, and rich media management. Please visit autonomy.com to find out more.

About HP
HP creates new possibilities for technology to have a meaningful impact on people, businesses, governments, and society. The world's largest technology company, HP brings together a portfolio that spans printing, personal computing, software, services, and IT infrastructure to solve customer problems. More information about HP (NYSE: HPQ) is available at hp.com.
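The workday and monthly figures quoted above follow directly from the measured rate of 1.26 process completions per second. The arithmetic can be checked in a few lines:

```python
# Reproducing the white paper's throughput extrapolation from the
# measured 1.26 process completions/second (400 virtual users,
# 4-node cluster). The inputs are the paper's own figures.
rate_per_second = 1.26
workday_seconds = 8 * 3600      # 8-hour workday
workdays_per_month = 20

per_day = rate_per_second * workday_seconds
per_month = per_day * workdays_per_month

print(round(per_day))    # 36288 - "over 36,000 workflows" per day
print(round(per_month))  # 725760 - "more than 700,000" per month
```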

Get the insider view on tech trends, alerts, and HP solutions for better business outcomes: autonomy.com

Copyright 2012 Autonomy Inc., an HP Company. All rights reserved. Other trademarks are the property of their respective owners.

20120606_PI_WP_HP_APA_Server_Performance