AgencyPortal v5.1 Performance Test Summary

Table of Contents

1. Testing Approach
2. Server Profiles
3. Software Profiles
4. Server Benchmark Summary
   4.1 Account Template
       4.1.1 Response Time
       4.1.2 Throughput
       4.1.3 CPU and Memory Utilization
   4.2 Personal Auto Template
       4.2.1 Response Time
       4.2.2 Throughput
       4.2.3 CPU and Memory Utilization
   4.3 Workers Compensation Template
       4.3.1 Response Time
       4.3.2 Throughput
       4.3.3 CPU and Memory Utilization
   4.4 Commercial Auto Template
       4.4.1 Response Time
       4.4.2 Throughput
       4.4.3 CPU and Memory Utilization
5. Comparison with v5.0
6. Client Stress Testing
1. Testing Approach

The stress tests ran various numbers of virtual users from a single client machine using JMeter 2.9. The test scripts recorded earlier for the AgencyPortal v5.0 Account, Workers Compensation, Personal Auto, and Commercial Auto LOB Templates were reused; no new scripts were created for AgencyPortal v5.1. During the testing, the auto save, timeline, and CSRF functions were disabled for the AgencyPortal application, and application logging was set to SEVERE.

The stress tests were spread over a two-week period, from the first week of May 2015 through the end of the second week. A total of 57 tests were run across the virtual user/scenario/database matrix, totaling 960,749 measured HTTP requests. The aggregate testing time for all 57 tests was 31.75 hours; the average test lasted 50.13 minutes, with the shortest taking 7 minutes and the longest 3.38 hours.

Request response times are measured between entry to and exit from the SecurityFilter, and therefore include time spent in the FrontServlet, IntraPageDTRServlet, and all other servlets. These server-side response times are captured using the product's performance object collection feature.

Wait time simulation

To simulate the length of time a user lingers on a screen before initiating a submission or an AJAX call, JMeter's Uniform Random Timer was used, configured with a 3-second constant delay and a 20-second maximum random delay. In other words, wait times between HTTP requests within a user session vary uniformly between a minimum of 3 seconds and a maximum of 23 seconds.
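As an illustration of the timer behavior described above (a simplified stand-in for JMeter's Uniform Random Timer, not the actual test harness), the per-request think time is a 3-second constant delay plus a uniformly distributed random delay of up to 20 seconds:

```python
import random

CONSTANT_DELAY_S = 3.0     # JMeter constant delay
RANDOM_DELAY_MAX_S = 20.0  # JMeter maximum random delay

def think_time() -> float:
    """Simulated per-request user wait time in seconds: uniform on [3, 23]."""
    return CONSTANT_DELAY_S + random.uniform(0.0, RANDOM_DELAY_MAX_S)

samples = [think_time() for _ in range(10_000)]
print(f"min={min(samples):.2f}s  max={max(samples):.2f}s  "
      f"mean={sum(samples) / len(samples):.2f}s")  # mean settles near 13 s
```

With this configuration, the long-run average think time is 13 seconds per request, which is the pacing assumption behind all throughput figures reported below.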
2. Server Profiles

Web/Application Server:
  Intel Xeon 4 CPU 2.80GHz, 16 GB physical RAM, 64-bit
Database Server:
  Intel Xeon CPU X7350 2.93GHz, 4 GB physical RAM, 64-bit

3. Software Profiles

Operating System:
  Red Hat Enterprise Linux Server release 5.9 (Tikanga), Linux version 2.6.18-348.4.1.el5
Web/Application Server:
  Apache Tomcat 7.0.54; DB connection pool size for all tests: 260 connections
JVM:
  1.7.0_60-b19, 64-bit; heap size -Xms1g -Xmx2g
Database Servers:
  Oracle 12c Enterprise Edition Release v12.1.0.1.0, 64-bit
  SQL Server 2012 v11.0.2100.60 (X64) Build 7601: Service Pack 1
  IBM DB2 10.5 v10.5.300.125, Fix Pack #3
Client Stress Testing:
  Selenium for Firefox
  TestNG Framework 1.5
  Agencyport's Selenium Testing Framework v1.0
  Internet Explorer 11.0.09600.17420
AgencyPortal (all components built May 7):
  SDK 5.1.0-SNAPSHOT
  Template Core 5.1
  Account 5.1
  Workers Comp 5.1
  Personal Auto 5.0
  Commercial Auto 5.0
4. Server Benchmark Summary

All response times and throughput figures are plotted on the y-axis; response times are always reported in milliseconds, and the throughput graphs report how many HTTP requests were processed on average per minute. Virtual user counts are plotted on the x-axis. The legend on each response time and throughput graph denotes the database used in that particular test.

4.1 Account Template

Accounts created in the testing have 2 Locations and 2 Contacts. Virtual users were ramped up at 1 user/second.

4.1.1 Response Time

[Figure: Account Average Response Time — request processing time (ms) vs. number of concurrent users (100, 200, 400), with one series each for SQL Server, DB2, and Oracle]
Average response time (ms) with various numbers of virtual users and databases
4.1.2 Throughput

[Figure: Account Throughput — requests processed/min vs. number of concurrent users (100, 200, 400), with one series each for SQL Server, DB2, and Oracle]
Average throughput (requests processed per minute) with various numbers of virtual users and databases

4.1.3 CPU and Memory Utilization

[Figure: CPU and memory utilizations with 400 virtual users and DB2]
[Figure: CPU and memory utilizations with 400 virtual users and Oracle]
[Figure: CPU and memory utilizations with 400 virtual users and SQL Server]
4.2 Personal Auto Template

Personal auto work items created in the testing have the following components: 2 Locations, 2 Drivers, 2 Vehicles, and 2 Vehicle/Driver Assignments. Virtual users were ramped up at 1 user/second.

4.2.1 Response Time

[Figure: Personal Auto Average Response Time — request processing time (ms) vs. number of concurrent users (100, 200, 400, 600), with one series each for SQL Server, DB2, and Oracle]
Average response time (ms) with various numbers of virtual users and databases
4.2.2 Throughput

[Figure: Personal Auto Throughput — requests processed/min vs. number of concurrent users (100, 200, 400, 600), with one series each for SQL Server, DB2, and Oracle]
Average throughput (requests processed per minute) with various numbers of virtual users and databases

4.2.3 CPU and Memory Utilization

[Figure: CPU and memory utilizations with 600 virtual users and DB2]
[Figure: CPU and memory utilizations with 600 virtual users and Oracle]
[Figure: CPU and memory utilizations with 600 virtual users and SQL Server]
4.3 Workers Compensation Template

Workers Compensation work items created in the testing have the following components: 4 Locations, 4 Rating Classifications, 2 Individuals Included/Excluded, and 2 Prior Carrier Information/Loss History. Virtual users were ramped up at 1 user/second.

4.3.1 Response Time

[Figure: Workers Comp Average Response Time — request processing time (ms) vs. number of concurrent users (100, 200, 400), with one series each for SQL Server, DB2, and Oracle]
Average response time (ms) with various numbers of virtual users and databases
4.3.2 Throughput

[Figure: Workers Comp Throughput — requests processed/min vs. number of concurrent users (100, 200, 400), with one series each for SQL Server, DB2, and Oracle]
Average throughput (requests processed per minute) with various numbers of virtual users and databases

4.3.3 CPU and Memory Utilization

[Figure: CPU and memory utilizations with 200 virtual users and DB2]
[Figure: CPU and memory utilizations with 400 virtual users and DB2]
[Figure: CPU and memory utilizations with 200 virtual users and Oracle]
[Figure: CPU and memory utilizations with 400 virtual users and Oracle]
[Figure: CPU and memory utilizations with 200 virtual users and SQL Server]
[Figure: CPU and memory utilizations with 400 virtual users and SQL Server]
4.4 Commercial Auto Template

Commercial auto work items created in the testing have the following components: 2 Locations, 10, 25, or 100 Vehicles, 1 Optional Coverage, 1 Premium Modification, 10, 25, or 100 Drivers, 2 Additional Interests, 2 Prior Carriers, and 2 Loss History. Tests were done with 10 Drivers/10 Vehicles (CA_10), 25 Drivers/25 Vehicles (CA_25), and 100 Drivers/100 Vehicles (CA_100).

Virtual users were ramped up as follows:

  CA_10 (10 Drivers/10 Vehicles):   67 users/80 seconds, 125 users/150 seconds, 250 users/300 seconds
  CA_25 (25 Drivers/25 Vehicles):   67 users/80 seconds, 125 users/150 seconds, 250 users/300 seconds
  CA_100 (100 Drivers/100 Vehicles): 25 users/30 seconds, 50 users/60 seconds, 100 users/120 seconds
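A detail worth noting in the ramp-up schedule above: every commercial auto run ramps at roughly the same rate, so the CA_100 scenario targets fewer total users rather than a slower ramp. A quick check (illustrative only, using the figures transcribed from the table):

```python
# (users added, ramp duration in seconds) for each commercial auto run
ramp_schedules = {
    "CA_10":  [(67, 80), (125, 150), (250, 300)],
    "CA_25":  [(67, 80), (125, 150), (250, 300)],
    "CA_100": [(25, 30), (50, 60), (100, 120)],
}

for scenario, runs in ramp_schedules.items():
    rates = [users / seconds for users, seconds in runs]
    # Every run works out to roughly 0.83 users/second
    print(scenario, [f"{r:.2f} users/s" for r in rates])
```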
4.4.1 Response Time

[Figure: Commercial Auto Average Response Time — request processing time (ms) vs. number of concurrent users, with one series each for SQL Server, DB2, and Oracle]
Average response time (ms) with various numbers of virtual users and databases
4.4.2 Throughput

[Figure: Commercial Auto Throughput — requests processed/min vs. number of concurrent users, with one series each for SQL Server, DB2, and Oracle]
Average throughput (requests processed per minute) with various numbers of virtual users and databases

4.4.3 CPU and Memory Utilization

[Figure: CPU and memory utilizations with CA_10, 250 virtual users and DB2]
[Figure: CPU and memory utilizations with CA_25, 125 virtual users and DB2]
[Figure: CPU and memory utilizations with CA_25, 250 virtual users and DB2]
[Figure: CPU and memory utilizations with CA_10, 250 virtual users and Oracle]
[Figure: CPU and memory utilizations with CA_25, 125 virtual users and Oracle]
[Figure: CPU and memory utilizations with CA_25, 250 virtual users and Oracle]
[Figure: CPU and memory utilizations with CA_10, 250 virtual users and SQL Server]
[Figure: CPU and memory utilizations with CA_25, 125 virtual users and SQL Server]
[Figure: CPU and memory utilizations with CA_25, 250 virtual users and SQL Server]
[Figure: CPU and memory utilizations with CA_100, 100 virtual users and DB2]
[Figure: CPU and memory utilizations with CA_100, 100 virtual users and Oracle]
[Figure: CPU and memory utilizations with CA_100, 100 virtual users and SQL Server]
5. Comparison with v5.0

A comparative analysis was done between the v5.1 performance test results and those published toward the end of 2014 for the 5.0 version. Note that the 5.0 tests were not rerun on the 5.1 platform; the goal was to run the v5.1 tests using the same hardware profile, business scenarios, JVM settings, and database connection pool size as were used for v5.0.

v5.1 average response times were 24% faster than those for v5.0, with a modest 3.1% increase in throughput. The following two graphs show, respectively, the overall average response time of one HTTP request and the average throughput per minute across all 57 tests. The improvement in performance can be attributed to setting the transaction_file_manager_mode property to cache; had v5.0 been run with this setting, its performance metrics would likely have been comparable to v5.1's.

[Figure: Overall Average Response Time — Portal 5.1: 291.40 ms, Portal 5.0: 383.61 ms]
[Figure: Overall Average Throughput — Portal 5.1: 679.70 requests/min, Portal 5.0: 659.04 requests/min]
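The headline percentages follow directly from the values plotted in the two comparison charts; recomputing them:

```python
# Values from the two comparison charts above
rt_v51, rt_v50 = 291.40, 383.61  # overall average response time, ms
tp_v51, tp_v50 = 679.70, 659.04  # overall average throughput, requests/min

rt_improvement = (rt_v50 - rt_v51) / rt_v50 * 100  # lower is better
tp_improvement = (tp_v51 - tp_v50) / tp_v50 * 100  # higher is better

print(f"response time: {rt_improvement:.1f}% faster")  # ~24.0%
print(f"throughput:    {tp_improvement:.1f}% higher")  # ~3.1%
```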
6. Client Stress Testing

The same Selenium recordings used for functional regression testing of AgencyPortal were repurposed to repeatedly create account, personal auto, workers comp, and commercial auto work items. These runs were used to determine any exposure to memory leaks under Internet Explorer 11.0, caused in part by JavaScript and/or AJAX. The test ran for several hours while Windows perfmon.exe monitored the private bytes utilization of the Internet Explorer process. The following graph shows the client memory during a 3-hour window of that run; the results do not show any evidence of memory leaks.

[Figure: Client memory usage (Internet Explorer private bytes) over the 3-hour period]
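The "no leak" judgment above rests on the private-bytes trace staying flat over the run. One way to make that judgment mechanical (a hypothetical post-processing step, not part of the original test) is to fit a least-squares line to the sampled memory values and inspect the slope:

```python
def leak_slope(samples_mb, interval_s=60.0):
    """Least-squares slope (MB/hour) of a memory trace sampled at a fixed interval."""
    n = len(samples_mb)
    xs = [i * interval_s / 3600.0 for i in range(n)]  # sample times in hours
    mean_x = sum(xs) / n
    mean_y = sum(samples_mb) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, samples_mb))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

# Synthetic traces: 3 hours of samples at 60-second intervals.
# A flat trace (noise around 180 MB) yields a slope near zero,
# while a steadily climbing trace indicates a leak.
flat = [180 + (i % 5) for i in range(180)]
leaky = [180 + 0.5 * i for i in range(180)]  # +0.5 MB per sample
print(f"flat:  {leak_slope(flat):.1f} MB/h")
print(f"leaky: {leak_slope(leaky):.1f} MB/h")
```

The memory values and the "MB/hour" threshold here are illustrative; in practice the perfmon CSV export of the Private Bytes counter would supply the samples.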