White Paper
Table of Contents
Summary
Client Performance Recommendations
Test Environments
Web Server (TLWEBPERF02)
SQL Server (TLPERFDB01)
Client Machine (Offline Web Client, UI Timings)
"Medium" Performance Database Specs
Testing Methodology
Load Balancing
User Interface Timing Results
Sage SalesLogix Web Client Scenario Timings - IE9: SLX 7.5.4 vs. SLX 8.0
Sage SalesLogix Web Client Scenario Timings - SLX 7.5.4 IE8 vs. SLX 8.0 IE9
Sage SalesLogix Offline Web Client Scenario Timings - SLX 8.0 IE9
Sage SalesLogix Mobile 2.0 Client - SLX 8.0
SData SDK Timings - SLX 8.0 vs. SLX 7.5.4
Load Testing Results
Sage SalesLogix Web Client - SLX 8.0 Average Page Duration Comparison (seconds)
Sage SalesLogix Web Client - SLX 8.0 Bandwidth Comparison (Kbytes Total/Sec)
Sage SalesLogix Web Client - SLX 8.0 - Web Server CPU Usage Comparison
Sage SalesLogix Web Client - SLX 8.0 - SQL Server CPU Usage Comparison
Sage SalesLogix Web Client - SLX 8.0 - Web Server - Memory, Available MBytes Comparison
Sage SalesLogix Web Client - SLX 8.0 - SQL Server - Memory, Available MBytes Comparison
Sage SalesLogix Mobile 2.0 - SLX 8.0 1,000 Users Average Page Durations
Sage SalesLogix Mobile 2.0 - SLX 8.0 1,000 Users CPU and Memory Comparison
Sage SalesLogix Mobile 2.0 - SLX 8.0 1,000 Users Bandwidth Comparison (Kbytes Total/Sec)
Summary

This document presents the results of performance testing conducted against the Sage SalesLogix version 8.0 release. Performance testing is a standard Sage SalesLogix pre-release protocol that compares the new release against prior versions to validate that a consistent or improved level of performance has been achieved across releases. The performance tests are scenario-based, and timings are collected by scenario. In each instance, the scenarios consist of multiple steps and are designed to mimic the experience of a real-world user working in the product. The tests simulate stresses that can reasonably be anticipated in both a normal and a throttled environment.

Client Performance Recommendations

Performance testing indicated that the largest variances in performance can be attributed to differences in hardware and software. When working in the Web Client, two main factors determine how quickly the UI loads and renders. The first factor is the performance of the JavaScript engine in the selected browser. Newer browsers typically have better JavaScript engines and therefore better JavaScript rendering times; for example, the JavaScript engine in IE9 is significantly faster than the one in IE8. Using the latest supported version of your browser is therefore strongly recommended when UI performance is a top priority. The second factor is the client hardware on which the browser runs. This is tied to the first factor, because a browser's JavaScript engine can be CPU intensive while rendering the UI; running the Web Client on an under-powered machine can slow down the UI performance of the Sage SalesLogix Web Client. Please see the Sage SalesLogix Compatibility Checklist for the latest supported browser information and for more detailed hardware recommendations.

In some instances, these tests were performed using hardware with a higher specification than the Compatibility Checklist recommends for Sage SalesLogix v8. This was done to accommodate the volume of records in the test database. Customers with large implementations are encouraged to consider scaling their hardware accordingly.

Test Environments

The following details describe the components that comprised the environments used for these performance tests.

Web Server (TLWEBPERF02)
Machine: HP ProLiant DL160 G6
OS: Windows 2008 R2 x64 SP1
CPU: Intel Xeon X5670 (2.93GHz/6-core/12MB/95W)
Memory: 8GB (DDR3-1333)
Hard Drive: 2 x 300GB SATA drives, 15K RPM, in a RAID 0 configuration
SQL Server (TLPERFDB01)
Machine: HP ProLiant DL380 G7
OS: Windows 2008 R2 x64 SP1
SQL: SQL Server 2008 R2 x64
CPU: Dual Intel Xeon E5645 (2.40GHz/6-core/12MB/80W)
Memory: 48GB (DDR3-1333)
Hard Drive: 72GB 6G SAS 15K RPM SFF (2.5-inch)
Database: "Medium" Performance Database

Client Machine (Offline Web Client, UI Timings)
Machine: Dell Mobile Precision M4500
Processor: Intel Core i7-840QM Quad Core 1.86GHz 8MB
Memory: 8GB, DDR3-1333 SDRAM
Hard Drive: 128GB SSD
OS: Windows 7 Pro 64-bit
SQL: SQL Express 2008 R2

"Medium" Performance Database Specs
Accounts: 200,468
Activities: 257,741
Addresses: 1,200,786
Contacts: 998,250
Contracts: 10,018
Defects: 335
History: 234,956
Leads: 37,898
Opportunity: 301,117
Packages: 48
Products: 5,035
Returns: 217
Tickets: 10,072
Users: 1,036
Database Size: 8 GB

Testing Methodology

Visual Studio 2012 Ultimate was used for the load and performance testing. The Sage SalesLogix environment was set up with SSL enabled and a Unicode-converted database. For the UI timings, the Visual Studio network emulation functionality was used to throttle the client machines where the timings were taken. The setting used for these UI timings was IntracontinentalWAN, which throttles speed to 1,500 kbps with 50 ms latency.
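To put the IntracontinentalWAN throttle in perspective, the following minimal Python sketch estimates the best-case time needed just to deliver a page over a 1,500 kbps link with 50 ms latency. The page sizes used are hypothetical illustrations, not measurements taken from these tests.

# Rough best-case transfer time under the IntracontinentalWAN emulation profile
# (1,500 kbps bandwidth, 50 ms latency). Page sizes are hypothetical examples.

BANDWIDTH_KBPS = 1500        # kilobits per second
LATENCY_SECONDS = 0.050      # 50 ms

def min_transfer_time(page_size_kb):
    """Best-case seconds to deliver a page of page_size_kb kilobytes."""
    bits = page_size_kb * 1024 * 8          # kilobytes -> bits
    bandwidth_bps = BANDWIDTH_KBPS * 1000   # kilobits/s -> bits/s
    return LATENCY_SECONDS + bits / bandwidth_bps

for size_kb in (50, 200, 500):
    print(f"{size_kb:>4} KB page: at least {min_transfer_time(size_kb):.2f} s on the throttled link")

Under these assumptions, a 200 KB page needs roughly 1.1 seconds of transfer time alone, before any server processing or browser rendering time is counted, which is why the throttled UI timings should be read as conservative figures.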
Load Balancing

Any load balancing in this cycle of performance testing was achieved using Microsoft load-balancing principles, which should work with any solution. For more information, see Getting Started with Network Load Balancing, available at http://technet.microsoft.com/en-us/library/cc731499.aspx. This information is also available in the Microsoft Windows help system.

In this instance, SalesLogix load balancing is defined as sharing users' access across multiple servers (but not an individual user's session across multiple servers). Once a user logs on to one of the servers, any processing specific to that user stays on that server for the duration of the session, until he or she logs off. Any load-balancing solution that supports sticky sessions should be equally successful. No specific load-balancing solution is supported or recommended.

User Interface Timing Results

The interface timings show the total time to complete a given task by following a scenario in the Sage SalesLogix interface. Scenarios are multi-step and may vary in complexity depending on the actions required to complete the task. When considering the timings, take into account that the forms containing the largest number of fields (for example, those for creating a new Account or a new Ticket) require multiple actions on the part of the user before the form is saved to the database. The time the user takes to complete these fields is reflected in the performance timings, so longer times do not necessarily indicate poor or slow response time.
Sage SalesLogix Web Client Scenario Timings - IE9: SLX 7.5.4 vs. SLX 8.0

The following chart compares scenario response times for the 7.5.4 Web Client (green bar) and the 8.0 Web Client (blue bar). In all instances, response time for 8.0 was equal or improved, with adding notes improving by approximately 50% and creating activities improving by approximately 10%.

Note: Although IE9 is not a supported browser for Sage SalesLogix v7.5.4 and its use may produce idiosyncratic results, this test is provided for comparison purposes.

Timings are shown in seconds.
Sage SalesLogix Web Client Scenario Timings - SLX 7.5.4 IE8 vs. SLX 8.0 IE9

The following table uses the same set of scenarios to quantify the improvements realized when using the Sage SalesLogix Web Client with one of the most recently qualified browsers (Internet Explorer 9). Notice the significant improvement in the scenarios that expand Contact City filters, which show gains in the region of 650%. Improvement in this high-traffic area is likely to resonate positively with users.

Timings are shown in seconds.
Sage SalesLogix Offline Web Client Scenario Timings - SLX 8.0 IE9

The following table shows response times for the same set of scenarios on the Sage SalesLogix Offline Web Client with one of the most recently qualified browsers (Internet Explorer 9). Timings run from the point when the user calls the form until all data has been entered and the form saved. Findings are closely comparable to those for the online Web Client, indicating that the Offline Web Client shows no decline in performance.

Timings are shown in seconds.
Sage SalesLogix Mobile 2.0 Client - SLX 8.0

The following table shows response times for the Sage SalesLogix Mobile Client running comparable scenarios against a Sage SalesLogix v8.0 database. Response time for 75% of the scenarios is 1.5 seconds or lower. The user in this scenario has access to approximately 50,000 history records. All Mobile timings apply to the first time the information is accessed, before any of it has been cached; after caching, load times were too small to be measured meaningfully.

Timings are shown in seconds.
SData SDK Timings - SLX 8.0 vs. SLX 7.5.4

The following table shows response times for features that use SData calls. SData (Sage Data) is the communication protocol that serves as a common language for interaction between Sage products worldwide. Sage products use SData to generate and consume feeds of information, similar to RSS feeds. The Mobile Client uses SData exclusively, and the Sage SalesLogix v8.0 Web Client leverages multiple aspects of SData, specifically for lists. For the purpose of this test, touch points were grouped and averaged. The response time for 8.0 is equal to or better than in 7.5.4, with significant improvements for Products and Packages and, to a lesser extent, for Owners.

Timings are shown in seconds and milliseconds.
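For readers unfamiliar with SData, the following Python sketch shows what a typical feed request looks like from client code. It is illustrative only: the host name and credentials are placeholders, and the resource path, the format and count query arguments, and the $resources/AccountName names follow common SData 1.x conventions but should be verified against the SData and Sage SalesLogix documentation for a given installation.

# Illustrative SData feed request. Host, credentials, and resource names are
# placeholders; check the SData documentation for the URLs a given site exposes.
import requests

BASE_URL = "https://slxserver.example.com/sdata/slx/dynamic/-"  # placeholder host

response = requests.get(
    f"{BASE_URL}/accounts",
    params={"format": "json", "count": 10},   # request a JSON feed, first 10 entries
    auth=("admin", "password"),               # placeholder credentials
    timeout=30,
)
response.raise_for_status()
feed = response.json()
for entry in feed.get("$resources", []):      # feed entries, per SData JSON convention
    print(entry.get("AccountName"))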
Load Testing Results

The graphs in this section of the performance analysis white paper focus on product response time as the number of individuals accessing Sage SalesLogix increases.

Sage SalesLogix Web Client - SLX 8.0 Average Page Duration Comparison (seconds)

The following graph shows the load on the server by timing roundtrip page requests, from the time the user submits the request to the time the page displays. Tests were conducted on a single server. The run time for the test was 1 hour, 40 minutes. Average page duration remained consistent even as the number of users doubled, indicating that response times hold steady regardless of the volume of users.

X axis shows time elapsed during the testing cycle. Y axis shows response times in seconds.
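As a simplified illustration of what "average page duration" measures, the following Python sketch times full request/response roundtrips and averages the samples. This is not the Visual Studio 2012 load-test harness used to produce these results, and the page URL is a placeholder.

# Simplified illustration of average page duration: time each full roundtrip
# from submitting the request to receiving the response, then average.
import time
import requests

PAGE_URL = "https://slxserver.example.com/SlxClient/Account.aspx"  # placeholder URL

def timed_request(url):
    """Seconds from submitting the request to receiving the full response."""
    start = time.perf_counter()
    response = requests.get(url, timeout=60)
    response.raise_for_status()
    return time.perf_counter() - start

samples = [timed_request(PAGE_URL) for _ in range(20)]
print(f"average page duration: {sum(samples) / len(samples):.2f} s over {len(samples)} samples")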
Sage SalesLogix Web Client - SLX 8.0 Bandwidth Comparison (Kbytes Total/Sec)

The following graph shows the impact on Web Server bandwidth as the volume of users increases. The graph indicates a linear progression as the volume increases and decreases. The Web Server can be scaled out to meet greater demand by splitting processing across multiple servers using a load-balancing solution that supports sticky sessions; see the Load Balancing section earlier in this document and Getting Started with Network Load Balancing at http://technet.microsoft.com/en-us/library/cc731499.aspx.

X axis shows time elapsed during the testing cycle. Y axis shows bandwidth consumed in Kbytes per second.
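The scale-out model mentioned above relies on the sticky sessions described in the Load Balancing section. As a purely conceptual sketch, not the Sage SalesLogix or Microsoft Network Load Balancing implementation, the following Python fragment shows how a sticky-session router keeps each user's session pinned to one server; the server names are placeholders.

# Conceptual sketch of sticky-session ("session affinity") routing.
# Not the Sage SalesLogix or Microsoft NLB implementation; names are placeholders.
import hashlib

SERVERS = ["webserver01", "webserver02", "webserver03"]

def pick_server(session_id):
    """Map a session id to the same server on every request for that session."""
    digest = hashlib.sha256(session_id.encode("utf-8")).hexdigest()
    return SERVERS[int(digest, 16) % len(SERVERS)]

# Every request carrying the same session id resolves to the same server, so
# per-user processing created after logon never has to move between servers.
assert pick_server("user-42-session") == pick_server("user-42-session")
print(pick_server("user-42-session"))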
Sage SalesLogix Web Client - SLX 8.0 - Web Server CPU Usage Comparison

The following chart shows the percentage of CPU used on the Sage SalesLogix Web Server as the volume of users increases. The graph indicates a linear progression as the volume increases and decreases. As noted above, the Web Server can be scaled out to meet greater demand by splitting processing across multiple servers; see the Load Balancing section earlier in this document.

X axis shows time elapsed during the testing cycle. Y axis shows percentage of CPU consumed.
Sage SalesLogix Web Client - SLX 8.0 - SQL Server CPU Usage Comparison

The following chart shows the percentage of CPU used on the Sage SalesLogix database server as the volume of users increases. The graph indicates a linear progression as the volume increases and decreases. The database used was Microsoft SQL Server.

X axis shows time elapsed during the testing cycle. Y axis shows percentage of CPU consumed as the volume of users changes.
Sage SalesLogix Web Client - SLX 8.0 - Web Server - Memory, Available MBytes Comparison

The following chart shows available memory on the Sage SalesLogix Web Server as the volume of users increases. The graph indicates that memory utilization is being managed effectively.

X axis shows time elapsed during the testing cycle. Y axis shows memory available in megabytes as the volume of users changes.
Sage SalesLogix Web Client - SLX 8.0 - SQL Server - Memory, Available MBytes Comparison

The following chart shows available memory on the Sage SalesLogix database server as the volume of users increases. The graph indicates that memory utilization is being managed effectively. The database used was Microsoft SQL Server.

X axis shows time elapsed during the testing cycle. Y axis shows memory available in megabytes as the volume of users changes.
Sage SalesLogix Mobile 2.0 - SLX 8.0 1,000 Users Average Page Durations

The following chart shows average page response times in seconds for the pages featured in the testing scenarios. Results indicate that the most time-intensive area is History Lookup (the user in this scenario has access to approximately 50,000 history records). All Mobile timings are for the first time the information is accessed, before it has been cached. The graph indicates that most responses complete in under one second, with the majority under 500 milliseconds.

X axis shows time elapsed during the testing cycle. Y axis shows page response time in seconds as the volume of users changes.
Sage SalesLogix Mobile 2.0 - SLX 8.0 1,000 Users CPU and Memory Comparison

The following charts show the impact on CPU and memory as the volume of traffic increases. The charts indicate both low CPU utilization and low memory utilization.

CPU Usage
X axis shows time elapsed during the testing cycle. Y axis shows percentage of CPU consumed on the Sage SalesLogix Web Server machine and the SQL Server machine as the volume of users accessing the Mobile Client changes.

Memory Usage
X axis shows time elapsed during the testing cycle. Y axis shows the amount of memory consumed on the Sage SalesLogix Web Server machine and the SQL Server machine as the volume of users accessing the Mobile Client changes.
Sage SalesLogix Mobile 2.0 - SLX 8.0 1,000 Users Bandwidth Comparison (Kbytes Total/Sec)

The bandwidth comparison chart shows the total bandwidth consumed by the traffic generated directly by the load test engines throughout the test, relative to the elapsed test time (sample period). This number scales linearly with the applied load (number of users).

X axis shows time elapsed during the testing cycle. Y axis shows bandwidth consumed in Kbytes per second as the volume of users changes.
© 2013 Sage Software, Inc. All rights reserved. Sage, the Sage logos, and the Sage product and service names mentioned herein are registered trademarks or trademarks of Sage Software, Inc., or its affiliated entities. All other trademarks are the property of their respective owners.

Sage
8800 N. Gainey Center Dr., Suite 200
Scottsdale, AZ 85258