SOLUTION BRIEF: SLCM R12.8 PERFORMANCE TEST RESULTS, JANUARY 2013
Submit and Approval Phase Results

Table of Contents

Executive Summary
Test Environment
  Server Topology
  CA Service Catalog Settings
  Database Settings
  CA Process Automation Settings
  Transaction Volume
Test Scenarios
  Requester Scenario
  Approver Scenario
  Refresher Scenario
Test Results
  Transaction Statistics
  Request Placed & Approved Statistics
  Percentile Chart
  Hits/Second
  Throughput Rate
  Response Times
  Custom Response Times
Conclusion

Executive Summary

Challenge

IT is under increasing pressure to reduce operational costs and demonstrate the value of the services it delivers. These pressures often result from a disconnect between what IT provides and what business users believe they are requesting. That perception frequently stems from services delivered through IT silos that are costly, labor intensive, and prone to error. It is also difficult to understand the usage and cost of delivered services in terms of the business's perception of service value.

Opportunity

With CA Service Catalog, you can define and manage a top-down service catalog that provides a complete view of service offerings and exposes them by tenancy and role: where, when, and how they are needed. This removes the disconnect between what IT provides and the perception of what is delivered. CA Service Catalog thus enables your IT organization to set expectations, to replace costly and labor-intensive manual fulfillment methods, and to manage service costs and consumption.

Benefits

CA Service Catalog helps reduce the cost of service delivery with consistent, repeatable services that are managed as a portfolio of offerings. Service offerings are defined from a top-down, business perspective to meet the needs of the enterprise. You can communicate service offerings in rich, descriptive business language to increase satisfaction and elevate the service experience of business and IT users. With CA Service Catalog, the entire request life cycle is managed and automated, from service request through approval to fulfillment, accelerating time to value.

For large organizations, depending on the number of end users and the types of catalog items offered, request submission, approval, and fulfillment activity can be substantial each day. The CA Service Catalog application therefore needs to scale to support a large volume of requests. CA Technologies performed a series of load tests on CA Service Catalog Release 12.8 based on data and inputs received from large enterprises. The results demonstrate that CA Service Catalog Release 12.8 is a scalable, enterprise-ready solution for any client who needs to provide self-service capabilities in their enterprise.

The 1-hour test exercised repeated execution of the self-service request life cycle, from submission through approval, at a rate of 15 requests per minute, or 900 requests per hour. This request life cycle activity occurred while 70 concurrent users were refreshing a CA Service Catalog screen, for a total of 420 refreshes in 1 hour. The objective of this refresher activity was to generate additional concurrent user load on CA Service Catalog. In addition to these multiple 1-hour runs, 8-hour and 40-hour runs were also executed and showed similar performance. The average transaction success rate during the tests was measured at 100%.

Please note that these results are based on a test environment built on physical machines; they may not be comparable to a virtual environment.

The remainder of this document provides details of the test environment, scenarios, and results.

Test Environment

Multiple test servers were used for the performance test environment. HP LoadRunner 11.0 was used to create and run the scripts that exercised the functionality under test. The database was pre-populated with data obtained from large enterprise customers.

Server Topology

The load test environment was configured as clustered setups for CA Service Catalog and the integrated products. To simulate a production environment, physical machines were used. Separate database machines were used for CA Service Catalog and CA Process Automation (formerly known as CA IT PAM).

[Figure 1: Server topology diagram]

There were a total of 7 machines in the performance test environment, with the following configurations:

Service Catalog (Primary) + Load balancer
  Software: CA Service Catalog Release 12.8; Apache HTTP Server 2.2
  Machine:  Dell OptiPlex 790, 8 GB RAM, Intel Core i3-2120 (quad-core) @ 3.30 GHz, Windows Server 2008 R2 Enterprise, 64-bit

Service Catalog (Secondary)
  Software: CA Service Catalog Release 12.8
  Machine:  Dell OptiPlex 790, 8 GB RAM, Intel Core i3-2120 (quad-core) @ 3.30 GHz, Windows Server 2008 R2 Enterprise (SP1), 64-bit

Database (MDB)
  Software: MS SQL Server 2008 R2 (SP1) (x64)
  Machine:  Dell OptiPlex 790, 8 GB RAM, Intel Core i3-2120 (quad-core) @ 3.30 GHz, Windows Server 2008 R2 Enterprise, 64-bit

CA PAM Node 1 + Load balancer + Database
  Software: CA PAM R4.0 SP1; Apache HTTP Server 2.2; MS SQL Server 2008 R2 (x64)
  Machine:  Dell OptiPlex 790, 8 GB RAM, Intel Core i3-2120 (quad-core) @ 3.30 GHz, Windows Server 2008 R2 Enterprise, 64-bit

CA PAM Node 2
  Software: CA PAM R4.0 SP1
  Machine:  Dell OptiPlex 760, 4 GB RAM, Intel E7400 (dual-core) @ 2.80 GHz, Windows Server 2008 R2 Enterprise (SP1), 64-bit

CA EEM
  Software: CA EEM R12.0
  Machine:  Dell OptiPlex 780, 4 GB RAM, Intel E7400 (dual-core) @ 2.93 GHz, Windows Server 2008 R2 Enterprise, 64-bit

LoadRunner Controller
  Software: HP LoadRunner 11.0
  Machine:  Virtual machine, 4 GB RAM, Intel Xeon E5645 @ 2.40 GHz (2 processors), Windows Server 2008 R2 Enterprise (SP1), 64-bit
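The brief does not state how the Apache HTTP Server 2.2 load balancers in this topology were configured. For illustration only, a minimal Apache 2.2 mod_proxy_balancer stanza fronting the two CA Service Catalog nodes might look like the following; the host names, port, and sticky-session cookie are hypothetical assumptions, not taken from this brief:

    # Illustrative Apache 2.2 reverse-proxy/balancer configuration
    # (httpd.conf). Member hosts and ports are placeholders.
    ProxyRequests Off
    <Proxy balancer://catalog-cluster>
        BalancerMember http://catalog-primary:8080
        BalancerMember http://catalog-secondary:8080
    </Proxy>
    # Route all traffic through the balancer and pin each user's
    # session to one node via the servlet session cookie.
    ProxyPass / balancer://catalog-cluster/ stickysession=JSESSIONID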

CA Service Catalog Settings

The following CA Service Catalog configuration settings were adjusted to promote optimal performance (a sketch of the corresponding Tomcat and JVM parameters appears at the end of this section):

  Item                                Setting     Unit
  Tomcat threads                      800         number
  Tomcat connection timeout           1,000,000   milliseconds
  Java initial heap size              512         MB
  Java maximum heap size              1,025       MB
  Max database connection pool size   200         number

Database Settings

No changes were made to the default settings. The transaction log for MS SQL Server was set to Full mode (the default).

CA Process Automation Settings

The following CA Process Automation configuration settings were adjusted for optimal performance:

  Item                     Setting    Unit
  Initial Java heap size   1,024      MB
  Maximum Java heap size   2,048      MB
  clientleaseperiod        100,000    milliseconds
  SupportsFailover         True       -
  SupportsLoadBalancing    True       -
  reconnectinterval        25         seconds
  DLQMaxResent             10         times

Transaction Volume

As highlighted earlier, the test data was obtained from different large enterprise customers. The following table summarizes the transaction data; the counts were captured just before the execution of the first load test:

  Record type             Count
  No. of Requests         100,000
  No. of Services         1,800
  No. of Business Units   100
  No. of Users            20,000
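Returning to the CA Service Catalog settings above: the Tomcat and Java items map to standard Tomcat connector attributes and JVM heap flags. A minimal sketch follows, assuming a stock Tomcat layout; the brief does not identify the exact configuration files in a CA Service Catalog installation, so treat the placement as an assumption:

    <!-- server.xml HTTP connector (illustrative): maxThreads and
         connectionTimeout are standard Tomcat attributes, shown here
         with the values from the settings table above. -->
    <Connector port="8080" protocol="HTTP/1.1"
               maxThreads="800"
               connectionTimeout="1000000" />

    JVM heap flags (e.g., passed via JAVA_OPTS): -Xms512m -Xmx1025m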

Test Scenarios

The test scripts simulated the submit and approval process of the CA Service Catalog request life cycle. In addition, the test scripts simulated further CA Service Catalog activity by refreshing the logged-in user's profile. This activity is unrelated to the request life cycle; however, it generates application load. The details of the performance test scripts are given below; a minimal sketch of how such a script is typically structured follows the scenario descriptions. In all scripts, a pause (or think time) between user actions simulated real user behavior; the think time was recorded with the script. The service used for the test was the predefined Create Email service, which is available out of the box.

Requester Scenario

In the requester scenario, a user in the Catalog End User role logs in to CA Service Catalog, selects a service to add to his cart, submits his request for approval, and logs out of CA Service Catalog. This test simulates the catalog entry selection and request submission phases of the CA Service Catalog request life cycle. The script results in an average of 900 requests created over a 1-hour period. The steps in the script are as follows:

1. 15 requesters log in to CA Service Catalog at an interval of 5 seconds each, ramping up the load on the application.
2. Each of these 15 users selects a service from the CA Service Catalog to add to his cart.
3. The shopping cart is displayed and the user submits the cart.
4. The user logs out.

When the user submits the cart, CA Service Catalog initiates an associated CA Process Automation process. The process instance determines the manager of the requester and assigns an approval task to that manager (the approver).

Approver Scenario

The approver scenario tests the condition where a user in the Request Manager role logs in to CA Service Catalog, refreshes his pending actions list until a request for approval appears, selects a request from the displayed list, approves it, and logs out of CA Service Catalog. This test simulates the approval phase of the CA Service Catalog request life cycle. The steps in the script are as follows:

1. 15 approvers log in to CA Service Catalog at an interval of 5 seconds.
2. Each of these 15 users checks for more than 0 pending actions; otherwise the user logs out and waits 90 seconds before logging in again.
3. The user goes to the pending actions page, selects a request for approval, and opens the approval page for that request.
4. The approval page is displayed for the selected request.
5. The request is approved.
6. The user logs out.

The performance test suite was configured with a 3-minute lag for the approver scripts so that a few requests are already placed before the approver scripts start. When the approver approves a request, CA Service Catalog initiates an associated CA IT PAM process that changes the status of the request to Pending Fulfillment.

Refresher Scenario

In the refresher scenario, a user with the Catalog End User role is logged in to CA Service Catalog and continually refreshes the User Profile page. This test simulates background CA Service Catalog activity while the primary request life cycle test is being conducted. The steps in the script are as follows:

1. All 70 refreshers log in to CA Service Catalog at a rate of 5 users every 10 seconds.
2. Each of these 70 refreshers reloads the User Profile page once every 10 minutes.
3. The user logs out of CA Service Catalog.

This is repeated for the entire duration of the test to maintain a consistent load of 100 virtual users (15 requesters, 15 approvers, and 70 refreshers).
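As referenced above, the following is a minimal sketch of how the requester flow could be expressed as an HP LoadRunner C Vuser Action. The transaction names come from the custom timer list reported later in this brief; the URLs, form fields, and parameter names are placeholders, not taken from the brief, and the real scripts were recorded against the application rather than hand-written:

    /* Requester scenario sketch (VuGen C Vuser). Endpoints and form
       fields are hypothetical; timer names match the custom timers
       listed in the Response Times section. */
    Action()
    {
        lr_start_transaction("LoadLoginPage");
        web_url("login_page", "URL=http://catalog-primary:8080/login", LAST);
        lr_end_transaction("LoadLoginPage", LR_AUTO);

        lr_think_time(5);  /* recorded think time between user actions */

        lr_start_transaction("LoginSubmit");
        web_submit_data("do_login",
            "Action=http://catalog-primary:8080/do_login",
            "Method=POST",
            ITEMDATA,
            "Name=userid",   "Value={pRequester}", ENDITEM,
            "Name=password", "Value={pPassword}",  ENDITEM,
            LAST);
        lr_end_transaction("LoginSubmit", LR_AUTO);

        /* SelectEmailFolder, SelectCreateEmailAcc, and
           SelectItemCheckout would wrap the catalog-browsing and
           add-to-cart steps in the same way. */

        lr_start_transaction("SaveSubmitCart");
        web_url("submit_cart", "URL=http://catalog-primary:8080/cart/submit", LAST);
        lr_end_transaction("SaveSubmitCart", LR_AUTO);

        lr_start_transaction("LogoutSubmit");
        web_url("logout", "URL=http://catalog-primary:8080/logout", LAST);
        lr_end_transaction("LogoutSubmit", LR_AUTO);

        return 0;
    }

The approver script would follow the same pattern, with a polling loop around the pending-actions page and the 90-second wait when no pending actions are found, as described above.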

Test Results

Transaction Statistics

Performance test results were obtained as the average of 3 runs (1 hour each), summarized in the following table:

  Test     Successful Transactions   Failed Transactions   Success Rate
  Test 1   15,582                    0                     100.0%
  Test 2   15,588                    0                     100.0%
  Test 3   14,582                    0                     100.0%

The average transaction success rate for the 3 tests was 100%. There were 0 failed transactions and no exceptions in the application logs.

Request Placed & Approved Statistics

  Test     Requests Placed   Requests Approved   Requests Processed per Minute
  Test 1   900               900                 15
  Test 2   900               900                 15
  Test 3   900               900                 15

Note: The count of requests placed was deliberately controlled and restricted to 15 requests per minute, hence the total of 900 requests in 1 hour.

Percentile Chart

The percentile chart shows the transaction times sorted in ascending order. The x-axis shows the percentiles, and the y-axis shows the response time in seconds at each percentile. The intersection of a percentile and a response time conveys the response time experienced by that percentile of users.

[Charts: Test 1 - Requestor, Approver; Test 2 - Requestor, Approver]

[Charts: Test 3 - Requestor, Approver]
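The percentile values behind such charts can be reproduced from raw timer data. Below is a minimal sketch in C using the nearest-rank method; the brief does not state which percentile definition HP LoadRunner's analysis applies, so treat this as one common convention rather than the tool's exact algorithm:

    /* Nearest-rank percentile over response times sorted ascending.
       Compile with: cc percentile.c -lm */
    #include <math.h>
    #include <stdio.h>

    double percentile(const double *sorted, int n, double p)
    {
        int rank = (int)ceil(p / 100.0 * n);  /* nearest-rank index */
        if (rank < 1) rank = 1;
        return sorted[rank - 1];
    }

    int main(void)
    {
        /* Hypothetical transaction times in seconds, sorted ascending. */
        double times[] = {0.8, 1.1, 1.3, 1.9, 2.4, 3.0, 3.2, 4.1, 5.0, 7.6};
        printf("90th percentile: %.1f s\n", percentile(times, 10, 90.0));
        return 0;
    }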

Hits/Second

The charts below show the number of transactions, across all scripts, over time. Average transactions per second is the total number of transactions divided by the total test time; in Test 1, for example, 15,582 transactions over the 1-hour run correspond to an average of roughly 4.3 transactions per second.

[Charts: Test 1, Test 2, Test 3]


Throughput Rate

The charts below show the throughput from all scripts over time for each test. The high throughput* values indicate that the application scales to handle a large volume of data transactions. For historical comparison, during load testing of CA Service Catalog r12.7 the average throughput* was recorded as 640.20 KB/sec, versus an average throughput of 119.64 KB/sec in the previous load test conducted for CA Service Catalog r12.5.

[Charts: Total Throughput (MB) for Test 1, Test 2, and Test 3; Average Throughput (KB/second) for Test 1, Test 2, and Test 3]

* Throughput is the amount of data (in bytes) that the Vusers received from the web server at any given second during the load test. This measure helps you evaluate the amount of load the Vusers generate in terms of server throughput.

Response Times

Custom Response Times

The charts below show the average response times for the custom timers used for various user actions in the performance test scripts. Each value is the average time taken by a virtual user to perform the specified action. The custom timers in each script are as follows:

Requestor: LoadLoginPage, LoginSubmit, SelectEmailFolder, SelectCreateEmailAcc, SelectItemCheckout, SaveSubmitCart, LogoutSubmit

Approver: LoadLoginPage, LoginSubmit, GotPendingRequestsPage, OpenRequest, ClickApproveRequest, SaveRequest, LogoutSubmit

Refresher: LoadLoginHome, LoginSubmit, ViewUserProfile, DoneViewing, Logout

Requestor Script

[Chart: Requester Average Response Time in seconds, Test 1, Test 2, Test 3]

Approver Script

[Chart: Approver Average Response Time in seconds, Test 1, Test 2, Test 3]

Refresher Script

[Chart: Refresher Average Response Time in seconds, Test 1, Test 2, Test 3]

Conclusion

Load testing of CA Service Catalog Release 12.8 was successfully conducted covering 3 scenarios (requester, approver, and refresher) at a 100 VU (virtual user) load, in a test environment primed with a predefined record volume. During these load tests an average transaction success rate of 100% was recorded. These load test results provide reliable supporting evidence that CA Service Catalog Release 12.8 can provide the enterprise with a self-service request capability that meets the high standards of reliability and availability that large enterprises require.