Agility Database Scalability Testing




Agility Database Scalability Testing
V1.6, November 11, 2012
Prepared by Fusion Systems on behalf of Agility Multichannel

Table of Contents

1 Introduction
  1.1 Brief
2 Scope
3 Test Approach
4 Test environment setup
5 Test data used
6 Test Schedule
  6.1 Baseline PIM Sessions against the server, 250,000 SKUs
    6.1.1 Test 1
    6.1.2 Test 2
    6.1.3 Test 3
  6.2 Baseline PIM Sessions against the server, 2.5 Million SKUs
    6.2.1 Test 4
    6.2.2 Test 5
    6.2.3 Test 6
7 Results Summary
  7.1 250k SKU database End User Summary
  7.2 2.5 Million SKU database End User Summary
8 Conclusion
9 Summary
10 Appendix A
  10.1 WebLogic Service Monitor Descriptions
  10.2 SQL Advanced Service Monitoring Descriptions
11 Results - Detailed
  11.1 Test 1
  11.2 Test 2
  11.3 Test 3
  11.4 Test 4
  11.5 Test 5
  11.6 Test 6

Agility Scalability Testing prepared on behalf of Agility Multichannel

Document Scope

This document provides information on the scalability testing of the PIM environment developed by Agility Multichannel. It is written for readers with an intermediate knowledge of IT who understand the concepts of environments and performance statistics. The document covers:

Introduction - why and what we are testing.
Test Approach - how we are going to test.
Test Schedule - timelines of the tests run.
Results - the data captured and analysed from each test run within the schedule.
Conclusion - what the information tells us.

1 Introduction

1.1 Brief

Agility Multichannel is a software company that develops a product information management (PIM) system. It approached Fusion Systems to help it benchmark and document the scalability of its software, as detailed in section 3. Fusion Systems was brought in as a consultant to provide an unbiased third-party view of the testing requirements and results.

Fusion Systems is a technology company providing support and consultancy to third-party companies. It has been involved in full-cycle application testing within the broadcasting, insurance and financial business sectors.

2 Scope

Of the many different types of testing, this exercise focuses on scalability testing:

Functional Testing
Non-Functional Testing
Performance Testing
Scalability Testing
Tuning
Penetration Testing

The scalability testing is focused on a non-customer-specific data set and infrastructure, for the following reasons:

The Agility application operates with different business processes, different infrastructure, and different data and data models for each customer.

The platforms used by each customer can vary in a number of ways. Example permutations:

WebLogic + Oracle
WebLogic + SQL Server
Apache + WebSphere + Oracle
Apache + WebSphere + SQL Server
IBM HTTP Server + WebSphere + Oracle
IBM HTTP Server + WebSphere + SQL Server

In addition, different subversions of these components may be in use, as well as a mixture of operating systems and environments on both the server and client sides.

Within each customer organisation there can also be different network configurations, including load balancing, server locations and different network standards (leased lines / MPLS).

Each customer can also have a different data model and a different amount of data.

Finally, each customer has different numbers of users, with different roles, responsibilities, use cases and usage patterns.

With the above in mind, we planned a generic subset of tests. These are not representative of a specific client environment, but show the scalability of the Agility system against an internally defined data set of 250,000 SKU records with at least 40 attributes each (10 million attribute instances), ramping up to 2.5 million SKU records with at least 100 attributes each (250 million attribute instances).
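The attribute-instance figures above are straightforward arithmetic. As an illustrative check (not part of the original test tooling):

```python
# Attribute instances = SKU records x attributes per SKU; each attribute
# value stored against a SKU is one record in the attribute-value area
# of the Agility database.

def attribute_instances(skus, attrs_per_sku):
    return skus * attrs_per_sku

baseline = attribute_instances(250_000, 40)      # -> 10_000_000
ramped_up = attribute_instances(2_500_000, 100)  # -> 250_000_000
```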

3 Test Approach

The test comprises a content management (PIM) harness representing business-as-usual traffic for managing product information through the web user interface. The PIM harnesses will be run to show performance against the database, at the following load levels:

10 PIM Sessions
30 PIM Sessions
50 PIM Sessions

The above tests will be run against two different data sets:

250,000 SKUs x 40 attributes = 10 million attribute instances
2.5 million SKUs x 100 attributes = 250 million attribute instances

The attribute instance is a key unit of measure for data in Agility, as it equates to one record in the area of the database used to store attribute values.

Timing points will be taken from the PIM user harness. Performance statistics will be collected on the servers using Performance Monitor, covering:

CPU
Memory
Network
WebLogic stats
SQL stats

The test harnesses report the transaction and response times that an end user would experience. Before each test the environment is rolled back to its original settings and database content, so that each run starts from a clean environment and individual results are not skewed.

All test harnesses include random wait times, designed to mimic random end-user delays during normal use and give varying load patterns; without them, the scenarios would be less representative of a production environment. In the opinion of Agility Multichannel, one PIM test session in the harness equates to more than the load from one real-world user, because the tests perform actions more frequently. It is hard to put a specific value on that; for example, one test harness session might be equivalent to the load from two real-world users.
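The harness pattern described above (timed tasks per session, random think time, minimum/average/maximum reporting) can be sketched as follows. This is an illustrative outline only, not the actual Agility harness, and all names are ours:

```python
import random
import time
from statistics import mean

def run_session(tasks, min_wait=0.0, max_wait=0.01):
    """Run one simulated PIM session.

    `tasks` is a list of (name, callable) pairs, e.g. a create, delete or
    search action against the application; returns response times per name.
    """
    timings = {}
    for name, action in tasks:
        start = time.perf_counter()
        action()  # the real harness would drive the web UI here
        timings.setdefault(name, []).append(time.perf_counter() - start)
        time.sleep(random.uniform(min_wait, max_wait))  # random think time
    return timings

def summarise(timings):
    """Produce the Min/Average/Max figures used in the results tables."""
    return {name: (min(t), mean(t), max(t)) for name, t in timings.items()}
```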

4 Test environment setup

The following details the environment setup:

Application Server: HP ProLiant DL360 G7
  Processor: Xeon E5506 quad-core, 2.13GHz
  Memory: 72GB (capped at 4GB)
  Drive configuration: 2 x 250GB SATA (5400 rpm) in a mirror

Database Server: HP ProLiant DL380 G7
  Processor: Xeon E5620 quad-core, 3.40GHz
  Memory: 72GB (capped at 4GB)
  Drive configuration: 4 x 500GB SAS (7200 rpm) in RAID 5

Both the application and database servers run within a virtual environment using VMware ESXi 4, configured to use all the processing power of each machine for test purposes. The network interface is bound to each server to simulate full network interface usage. Memory is capped at 4GB within the VM environment and will be increased if and when required.

Both systems run Windows Server 2008 R2 64-bit, patched up to 1 November 2012. The application server runs WebLogic 10.3.5; the database server runs SQL Server 2008 R2 SP2.

The above configuration allows for hardware changes if performance proves inadequate during testing, giving a baseline minimum hardware requirement. The network path between the environments consists of a gigabit switch; each environment is hosted on a dedicated network interface connected directly to the switch.

5 Test data used

The initial 250,000 SKU data set was based on a customer database and consists of rich data with an average of 40 attributes associated with each SKU. The second data set consists of 2.5 million records with 100 attributes against each SKU, providing adequate test data; it also includes the previous 250,000 SKU data set.
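As a side note on the database server's drive configuration, RAID 5 uses one disk's worth of capacity for distributed parity, so usable space is one disk fewer than the array size. A quick illustrative check:

```python
def raid5_usable_gb(disks, disk_gb):
    # RAID 5 stripes data with one disk's worth of distributed parity,
    # leaving (disks - 1) * disk_gb usable.
    return (disks - 1) * disk_gb

usable = raid5_usable_gb(4, 500)  # -> 1500 GB on the 4 x 500GB array
```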

6 Test Schedule

The following details the test schedule completed during the testing phase. All tests include random waits of between 3 and 10 seconds.

6.1 Baseline PIM Sessions against the server, 250,000 SKUs

6.1.1 Test 1
10 sessions run concurrently on a clean environment. Each connection runs a sequence of 10 tasks randomly selected from a predefined set consisting of:
- Create object and attributes, then unlink
- Create object and attributes, then delete
- Create object with no attributes, then unlink
- Create object with no attributes, then delete
- Global search

6.1.2 Test 2
30 sessions run concurrently on a clean environment, each running the same randomly selected task sequence as Test 1.

6.1.3 Test 3
50 sessions run concurrently on a clean environment, each running the same randomly selected task sequence as Test 1.

6.2 Baseline PIM Sessions against the server, 2.5 Million SKUs

6.2.1 Test 4
10 sessions run concurrently on a clean environment, each running the same randomly selected task sequence as Test 1.

6.2.2 Test 5
30 sessions run concurrently on a clean environment, each running the same randomly selected task sequence as Test 1.

6.2.3 Test 6
50 sessions run concurrently on a clean environment, each running the same randomly selected task sequence as Test 1.
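Each test above draws a random sequence of 10 tasks from the same five-task set. An illustrative sketch of that selection (the function and its parameters are ours, not Agility's):

```python
import random

# The five task types each connection draws from, per the schedule above.
TASK_SET = [
    "create object and attributes then unlink",
    "create object and attributes then delete",
    "create object with no attributes then unlink",
    "create object with no attributes then delete",
    "global search",
]

def build_task_sequence(length=10, rng=None):
    """Randomly select `length` tasks (with repetition) from the predefined
    set, as each harness connection does under the schedule above."""
    rng = rng or random.Random()
    return [rng.choice(TASK_SET) for _ in range(length)]
```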

7 Results Summary

The results show the following end-user times for each trigger, using the two data sets (250k SKUs and 2.5 million SKUs). The vertical scale of each chart is seconds.

7.1 250k SKU database End User Summary

[Chart: end-user times in seconds (0-8) for 10, 30 and 50 sessions against the 250k SKU database]

7.2 2.5 Million SKU database End User Summary

[Chart: end-user times in seconds (0-12) for 10, 30 and 50 sessions against the 2.5 million SKU database]

8 Conclusion

All results and statistics have been stored and can be accessed on request to validate the findings. Only a subset of the reports used is shown within this document, to reduce its length; the rest can be provided on request.

The end-user experience profile clearly changes as more load is introduced to the system; this is normal and expected in any environment. What is noticeable in the tests is that search performance does degrade as the data sets are expanded, as expected, but not by as much as we had anticipated.

During the test phase using the 2.5 million SKU data set there are obvious peaks in the performance graphs. We believe these peaks were caused by the agent used to monitor statistics. The agent did not materially affect the testing, as the end-user experience did not change dramatically during each of the tests; the tests were run multiple times to confirm this.

Comparing Test 1 and Test 4 shows that the database completed more work with the second data set; however, the environment ran as expected with no major issues, and showed adequate health when running up to 50 concurrent sessions against either data set.

9 Summary

This document provides PIM user results for scalability testing based on a generic environment. With the information gathered, we can state that on our test environment the Agility SQL database shows little performance degradation against a database of 2.5 million SKUs.

10 Appendix A

10.1 WebLogic Service Monitor Descriptions

Per Connection Pool
The following metrics are collected per connection pool on a WebLogic server:
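The "little degradation" statement can be quantified from the averages reported in sections 11.1 and 11.4 (Test 1 vs Test 4, both at 10 sessions). A small illustrative calculation using those published figures:

```python
# Average response times in seconds, copied from sections 11.1 and 11.4.
test1_avg = {"CREATE": 0.586, "UNLINK": 0.102, "DELETE": 0.425,
             "ADD_ATTR": 0.514, "SEARCH": 3.009}   # 250k SKUs
test4_avg = {"CREATE": 0.898, "UNLINK": 0.241, "DELETE": 0.902,
             "ADD_ATTR": 1.325, "SEARCH": 6.811}   # 2.5M SKUs

# Growth factor per operation for a 10x SKU (25x attribute-instance) increase.
growth = {op: round(test4_avg[op] / test1_avg[op], 2) for op in test1_avg}
# SEARCH grows ~2.26x, i.e. well below linear with data volume.
```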

FailuresToReconnectCount (JDBCConnectionPoolRuntime) - The number of times that the connection pool failed to reconnect to a data store.
ActiveConnectionsCurrentCount (JDBCConnectionPoolRuntime) - The current number of active connections in a JDBC connection pool.
ActiveConnectionsHighCount (JDBCConnectionPoolRuntime) - The highest number of active connections in a JDBC connection pool.
LeakedConnectionsCount (JDBCConnectionPoolRuntime) - The total number of connections that have been checked out of, but not returned to, the connection pool.
CurrCapacity (JDBCConnectionPoolRuntime) - The current number of database connections in the JDBC connection pool.
NumAvailable (JDBCConnectionPoolRuntime) - The number of available sessions in the pool that are not currently being used.
WaitingForConnectionCurrentCount (JDBCConnectionPoolRuntime) - The current number of requests waiting for a connection from the connection pool.

Per EJB
The following metrics are collected per Enterprise JavaBean (EJB) on a WebLogic server:

AccessTotalCount (StatelessEJBRuntime) - The total number of times an attempt was made to get an EJB instance from the free pool.
BeansInCurrentUseCount (StatelessEJBRuntime) - The number of EJB instances in the free pool which are currently in use.
CachedBeansCurrentCount (StatefulEJBRuntime) - The total number of EJBs that are in the execution cache.
ActivationCount (StatefulEJBRuntime) - The number of EJBs that have been activated.
PassivatedCount (EJBCacheRuntimeMBean) - The number of EJBs that have been passivated.
cacheAccessCount (EJBCacheRuntimeMBean) - The number of attempts made to access an EJB in the cache.
cacheMissCount (EJBCacheRuntimeMBean) - The number of times that an attempt to access an EJB in the cache failed.

Other
The following metrics are collected from the WebLogic system, the Java Virtual Machine, or the thread pool:

HeapSizeCurrent (JVMRuntime) - The amount of memory, in bytes, in the WebLogic server's Java Virtual Machine (JVM) heap.
HeapFreeCurrent (JVMRuntime) - The current amount of free memory, in bytes, in the WebLogic server's JVM heap.
OpenSocketsCurrentCount (ServerRuntime) - The current number of sockets on the server that are open and receiving requests.
AcceptBacklog (Server) - The number of requests that are waiting for a TCP connection.
ExecuteThreadCurrentIdleCount (ExecuteQueueRuntime) - The number of threads in the server's execute queue that are idle or not being used to process data.
PendingRequestCurrentCount (ExecuteQueueRuntime) - The number of pending requests in the server's execute queue.
TransactionCommittedTotalCount (TransactionResourceRuntime) - The total number of transactions processed by the WebLogic server.
TransactionRolledBackTotalCount (TransactionResourceRuntime) - The total number of transactions that have been rolled back.
InvocationTotalCount (ServletRuntime) - The total number of times that a servlet running on the WebLogic server was invoked.
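Several of the counters above are most useful when combined into ratios. Two illustrative helpers (ours, not part of the monitoring agent):

```python
def pool_utilisation(active_current, curr_capacity):
    """ActiveConnectionsCurrentCount / CurrCapacity: the fraction of the
    JDBC pool's current capacity that is checked out."""
    return active_current / curr_capacity if curr_capacity else 0.0

def ejb_cache_hit_ratio(access_count, miss_count):
    """Hit ratio derived from cacheAccessCount and cacheMissCount."""
    return (access_count - miss_count) / access_count if access_count else 0.0
```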

10.2 SQL Advanced Service Monitoring Descriptions

Lock Wait / Sec. (SQLServer:Locks) - The amount of time, in seconds, that the server waits for a database lock.
Lock Requests / Sec. (SQLServer:Locks) - The number of new database locks and lock conversions requested from the lock manager every second.
Average Wait Time (SQLServer:Locks) - The average time, in milliseconds, for database locks to clear.
User Connections (SQLServer:General Statistics) - The number of user connections to the database that are allowed.
Transactions / Sec. (SQLServer:Databases) - The number of transactions started per second across the databases on the host.
Data File(s) Size / KB (SQLServer:Databases) - The cumulative size of all the files in all of the databases on the host system.
Total Latch Wait Time (SQLServer:Latches) - The total time, in milliseconds, taken to complete the latch requests that were waiting over the last second.
Latch Waits / Sec. (SQLServer:Latches) - The number of latch requests that were not immediately granted and waited before being granted.
Average Latch Wait Time (SQLServer:Latches) - The average time, in milliseconds, that latch requests waited before being granted.
Maximum Workspace Memory (SQLServer:Memory Manager) - The maximum amount of memory, in kilobytes, available to execute processes such as sort, bulk copy, hash and index creation.
Connection Memory (SQLServer:Memory Manager) - The total amount of dynamic memory, in kilobytes, that the server is using to maintain connections.
SQL Cache Memory (SQLServer:Memory Manager) - The amount of memory, in kilobytes, that the server is using for the dynamic SQL cache.
Total Server Memory (SQLServer:Memory Manager) - The total amount of committed memory from the buffer pool, in kilobytes, that the server is using.

11 Results - Detailed

Please see Appendix A for graph descriptions. All statistics for each description are available on request for each test.

11.1 Test 1

The end user timings (minimum, average and maximum, in seconds) were as follows:

Description   Min (s)   Avg (s)   Max (s)
CREATE        0.198     0.586     12.304
UNLINK        0.027     0.102     3.212
DELETE        0.212     0.425     3.961
ADD_ATTR      0.098     0.514     2.145
SEARCH        0.116     3.009     28.74

The environment statistics were as follows:


11.2 Test 2

The end user timings (minimum, average and maximum, in seconds) were as follows:

Description   Min (s)   Avg (s)   Max (s)
CREATE        0.202     1.627     25.87
UNLINK        0.028     0.162     2.107
DELETE        0.26      0.709     5.454
ADD_ATTR      0.106     1.799     10.861
SEARCH        0.115     7.138     67.223

The environment statistics were as follows:


11.3 Test 3

The end user timings (minimum, average and maximum, in seconds) were as follows:

Description   Min (s)   Avg (s)   Max (s)
CREATE        0.193     2.106     28.06
UNLINK        0.031     0.217     4.431
DELETE        0.262     0.804     8.032
ADD_ATTR      1.007     3.827     20.445
SEARCH        0.267     7.25      64.158

The environment statistics were as follows:


11.4 Test 4

The end user timings (minimum, average and maximum, in seconds) were as follows:

Description   Min (s)   Avg (s)   Max (s)
CREATE        0.194     0.898     5.73
UNLINK        0.028     0.241     1.597
DELETE        0.293     0.902     3.649
ADD_ATTR      0.105     1.325     8.976
SEARCH        0.131     6.811     39.199


11.5 Test 5

The end user timings (minimum, average and maximum, in seconds) were as follows:

Description   Min (s)   Avg (s)   Max (s)
CREATE        0.2       2.047     23.723
UNLINK        0.028     0.499     7.027
DELETE        0.32      1.12      10.007
ADD_ATTR      0.1       3.546     22.369
SEARCH        0.192     6.707     52.607


11.6 Test 6

The end user timings (minimum, average and maximum, in seconds) were as follows:

Description   Min (s)   Avg (s)   Max (s)
CREATE        0.192     3.035     30.225
UNLINK        0.0284    1.17      23.01
DELETE        0.319     2.036     23.424
ADD_ATTR      0.1       7.456     50.61
SEARCH        0.182     11.067    73.92
