Cloud Analysis: Performance Benchmarks of Linux & Windows Environments
Benchmarking comparable offerings from HOSTING, Amazon EC2, and Rackspace Public Cloud
By Cloud Spectator
July 2013
Table of Contents

- Introduction
- Executive Summary
- System Performance
- CPU Performance
- Internal Network Performance
- Disk Performance
- Appendix
  - Methodology
  - Terms & Definitions
  - About Cloud Spectator
Introduction

Performance Testing

Cloud Spectator monitors the CPU, RAM, storage, and internal network performance of over 20 of the world's most well-known IaaS services to understand important aspects of virtual server performance. Tests are run at least three times per day, 365 days per year, to capture variability in addition to performance level. Tests are chosen based on reliability and practicality. The goal is to provide an indication of where certain providers perform well relative to others. This can help consumers identify which services would best suit their application(s) by showing the performance of the provider resources most critical to that application. Singular benchmarks alone should not be the only deciding factor in the provider selection process. Feature sets, configuration matches, pricing, and ancillary services such as security, compliance, and disaster recovery should always factor into any vendor selection process. However, performance is a very important piece of the puzzle.

The Comparison

For the purpose of generating this document, Cloud Spectator measured the performance of HOSTING's virtual machines against comparable offerings from Amazon and Rackspace. Over a period of twenty days, Cloud Spectator ran benchmark tests across HOSTING Cloud Enterprise, Amazon AWS, and Rackspace OpenCloud. Each test was run to understand the unique performance capabilities of each provider's CPU, internal network, and disk I/O. Cloud Spectator accounted for both performance capability and stability for each provider to understand the value each one delivers to its users. Tests were run on 4GB servers for HOSTING and Rackspace, and 3.75GB servers for Amazon (for more details, see the Methodology section in the Appendix).
While other factors can sway the perception of provider comparisons, including value-added services, features, support, security, etc., Cloud Spectator chose to focus objectively on performance to derive numerical relationships between the levels of service delivered by each provider. Performance testing and benchmarking of cloud computing platforms is a complex task, compounded by the differences between providers and the varied use cases of cloud infrastructure users. IaaS services are utilized by a large variety of industries, and performance cannot be completely understood by representing it with a single value. When selecting a cloud computing provider, IT professionals consider many factors: feature sets, cost, security, location, and more. However, performance is a key issue that drives many others, including cost. In many cases, three primary resources affect overall server performance: the central processing unit (CPU), disk, and internal network. These three resources are the focus of this comparison.
Executive Summary

Overall System Performance

In the general server test, using UnixBench in a Linux environment, HOSTING outperforms both Amazon and Rackspace across the 14-day testing period. The results reflect HOSTING's overall superior performance compared to Amazon and Rackspace for the specific server sizes in this report. Specifically, HOSTING outperformed Amazon and Rackspace in 66% of the Windows tests and in more than 70% of the Linux tests shown in this report.

CPU Performance

HOSTING offers the fastest CPUs on average, as well as greater performance predictability. HOSTING outperforms Rackspace by an average of 10%, and Amazon by 49%, in virtual machines (VMs) running Linux environments. HOSTING performs an even higher 26% better than Rackspace and 81% better than Amazon in Windows-based VMs. The stability of CPU performance was not an issue for HOSTING or Amazon: the coefficients of variation (CV; see Terms and Definitions in the Appendix) of their CPU tests are all relatively low, staying below 7%. Rackspace's CPUs, on the other hand, measured CVs of up to 31% in the tests.

Network Performance

HOSTING offers superior network performance across the board, as shown by the results of the internal network tests. HOSTING outperformed Amazon and Rackspace in every benchmark (7 out of 7 network tests). Compared to Rackspace and Amazon, HOSTING has higher speed, lower latency, and more stable performance. Rackspace showed relatively stable throughput speeds, with a 15.6% CV in the Iperf test and an average 11% CV in the five remaining network speed tests. Amazon had the least stable throughput, with a 37% CV in the Iperf test and an average 29% CV in the five remaining network speed tests.
HOSTING showed the most stable speeds, with a 4.2% CV in the Iperf test and an average 8% CV in the five remaining network speed tests. Amazon's network speed is at least 100 Mibits/sec, as is HOSTING's, but Rackspace's speed appears to be throttled to around 90 Mibits/sec. Both Amazon and Rackspace have high network latency, at 1,757 microseconds and 1,353 microseconds respectively. HOSTING's average network latency of 474 microseconds is about 70% less than either provider. Both providers also have a problem with the stability of their latency, with Rackspace showing an 88% CV on the Ping test and Amazon a smaller 39% CV. End users could face lag and unreliable network performance when transferring large amounts of data between multiple VMs on either of these providers.

Disk Performance

HOSTING offers superior disk write speed and disk performance stability. HOSTING's disk performance is about equal to or greater than Amazon's in Windows environments, but varies greatly compared to Rackspace's. HOSTING's disk performance in Linux environments is slightly below Rackspace's in the Dbench and File Copy tests, but significantly outperforms it in the throughput and IOPS tests for write speed. HOSTING has a smaller CV than Rackspace and Amazon in five out of eight disk performance tests. Rackspace offers high performance and a stable sustained disk read rate, as shown in the H2bench benchmark results of 199 MiBytes/sec with a 5.7% CV. Comparably, HOSTING averages 55 MiBytes/sec and Amazon averages 59 MiBytes/sec. However, the File Copy test in Windows shows HOSTING outperforming Amazon and Rackspace. While HOSTING may not be as stable as Amazon and Rackspace in the File Copy test, it makes up for it with higher overall performance. In the Dbench and File Copy tests in Linux, Rackspace is highest overall, but unstable. HOSTING performs at 90% of Rackspace's level in both tests, but offers greater stability.
Amazon also performed with greater stability than Rackspace (but less than HOSTING). The File System tests show that HOSTING performs well in disk write speed, but does not match Rackspace in disk read or overall disk performance. Similarly, Amazon outperforms HOSTING in disk read speed, but not disk write speed.
System Performance Results

HOSTING performs 67% better than Rackspace, and 167% better than Amazon, in overall system performance. Amazon has the most predictable performance, with a CV of 3.1%, followed by HOSTING at 5.9% and Rackspace at 20.5%. HOSTING's higher overall performance outweighs its minimal performance unpredictability compared to Amazon.

[Graph: UnixBench (Linux) System Performance Test Scores: HOSTING vs. Rackspace vs. Amazon]

The graph compares general server performance on a Linux server between HOSTING, Rackspace, and Amazon. It shows the score of the UnixBench benchmark for each provider over a period of 14 days, with two data points shown for each day. HOSTING consistently outperforms both Rackspace and Amazon over the entire test period.

PROVIDER     AVERAGE    STANDARD DEVIATION    CV       14-DAY HIGH    14-DAY LOW
HOSTING      n/a        n/a                   5.9%     n/a            n/a
Rackspace    n/a        n/a                   20.5%    n/a            n/a
Amazon       n/a        n/a                   3.1%     n/a            n/a

Test Description: See Appendix B.
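The stability comparison above rests on the coefficient of variation, defined later in the Terms and Definitions section as the standard deviation divided by the average, expressed as a percentage. A minimal sketch of that computation in Python; the scores below are hypothetical, and the population form of the standard deviation is an assumption, since the report does not say which form it used:

```python
from statistics import mean, pstdev

def coefficient_of_variation(scores):
    """CV (%) = standard deviation / average * 100.

    A lower CV indicates more stable benchmark performance.
    pstdev (population standard deviation) is an assumption here;
    the report does not state which form it used.
    """
    return pstdev(scores) / mean(scores) * 100.0

# Hypothetical daily UnixBench-style scores: the steadier series
# yields the lower CV, i.e. the more predictable server.
steady = [980, 1000, 1010, 995, 1005]
variable = [700, 1200, 950, 1400, 800]
```

With series like these, `coefficient_of_variation(steady)` comes out far below `coefficient_of_variation(variable)`, mirroring the Amazon-versus-Rackspace contrast in the table.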
CPU Performance

Why CPU Matters

The CPU is the most important component of overall system performance. The CPU is the brain of the system: it pulls data out of storage, fills the RAM, and determines how fast things can happen. Gaming, video editing, encryption, and graphics rendering are just a few activities that rely on CPU performance. The increasing presence of HD video, high-quality gaming, and other engaging content adds to the need for high CPU performance. Security concerns will lead to more complex encryption methods, which also rely on the CPU. The big data revolution we are seeing is possible because of the immense power of modern CPUs. In a big data scenario, superior CPU performance can significantly shrink the time it takes to crunch massive amounts of data into meaningful results. The amount of data being calculated, consumed, compressed, and encrypted is growing fast. Massive processing power is more of a need now than ever.

CPU Results

Windows Server Results

The tests of CPU performance on Windows systems show HOSTING outperforming Rackspace and Amazon in the majority of benchmarks. The average overall PassMark CPU score of HOSTING is 27% higher than Rackspace's, and 82% higher than Amazon's. Predictability of performance is not an issue for any of the three providers, since the CVs are all 2.1% or lower.

[Graph: PassMark (Windows) CPU Performance Test Scores: HOSTING vs. Rackspace vs. Amazon]

The graph compares CPU performance on a Windows server between HOSTING, Rackspace, and Amazon. It shows the scores from the PassMark benchmark test for each provider over a period of 14 days, with two data points shown for each day. HOSTING consistently outperforms both Rackspace and Amazon over the entire test period.
Linux Server Results

The aggregate performance differentials between the three providers on Linux systems show HOSTING's CPUs outperforming Rackspace's by 10.1% and Amazon's by 49%. HOSTING also provides the benefits of stable systems, with coefficients of variation (CVs) less than one-third those of Rackspace. However, Amazon is the most stable, with an average CV of 3.1%, compared to HOSTING's 6.8% and Rackspace's 19.3%. HOSTING's CPUs do exceptionally well in the Timed Apache Compilation test, outperforming Rackspace and Amazon by 54% and 66%, respectively. HOSTING also shows superiority in the Timed Linux Kernel Compilation test, being 14% and 52% faster than Rackspace and Amazon, respectively.
Comparing only HOSTING and Rackspace, HOSTING outperforms Rackspace by a large margin in one of the benchmarks, the Timed Apache Compilation, while outperforming it by a smaller margin in the Timed Linux Kernel test. HOSTING slightly underperforms in the 7zip Compression, x264, and LAME tests on average. However, HOSTING offers slightly greater performance for the majority of the test period, except for a brief period where Rackspace's servers perform at a high enough level to affect the averages for the entire test period.

[Graph: CPU Test Result % Comparison (Linux): HOSTING's % Performance Above/Below Rackspace (+/-), showing 7zip, Linux Kernel, Apache, x264, and LAME]

The graph compares CPU performance on a Linux server between HOSTING and Rackspace. The aggregate results of five separate benchmarks are shown using different colored trend lines. Rackspace was used as the baseline (0% on the y-axis) for comparison against HOSTING. HOSTING outperforms Rackspace on most tests for the majority of the testing period.

Performance
TEST                  HOSTING AVG    RACKSPACE AVG    HOSTING VS RAX
7zip                  3160 MIPS      3417 MIPS        -7.5%
Timed Linux Kernel    391 seconds    453 seconds      13.7%
Timed Apache          56 seconds     121 seconds      53.7%
x264                  n/a Fps        26 Fps           -3.8%
LAME MP3              33 seconds     35 seconds       -5.7%
AVERAGE                                               10.1%

Variability
TEST                  HOSTING CV    RACKSPACE CV
7zip                  6.8%          28.5%
Timed Linux Kernel    6.4%          17.0%
Timed Apache          6.4%          14.5%
x264                  n/a           31.1%
LAME MP3              4.7%          5.5%
AVERAGE               6.2%          19.3%
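The HOSTING-vs-Rackspace column mixes two kinds of metrics: for 7zip (MIPS) and x264 (Fps), higher is better, while for the timed compilation and encoding tests, fewer seconds is better. A small sketch of how both kinds reduce to a signed percent-above-baseline figure; the function name and structure are illustrative, not taken from the report:

```python
def pct_vs_baseline(value, baseline, higher_is_better=True):
    """Signed % by which `value` beats (+) or trails (-) `baseline`.

    For lower-is-better metrics such as elapsed seconds, the sign is
    flipped so that a faster time still reads as a positive advantage.
    """
    if higher_is_better:
        return (value - baseline) / baseline * 100.0
    return (baseline - value) / baseline * 100.0

# Two rows from the table above (Rackspace as the baseline):
seven_zip = pct_vs_baseline(3160, 3417)                     # MIPS, higher is better
kernel = pct_vs_baseline(391, 453, higher_is_better=False)  # seconds, lower is better
```

Rounded to one decimal place, these reproduce the -7.5% and 13.7% figures in the table.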
Comparing HOSTING and Amazon, HOSTING outperforms Amazon by a very large margin in four out of five Linux CPU benchmarks. HOSTING only underperforms in the LAME MP3 Encoding test, by a small margin.

[Graph: CPU Test Result % Comparison (Linux): HOSTING's Performance Above/Below Amazon, showing 7zip, Linux Kernel, Apache, x264, and LAME]

The graph compares CPU performance on a Linux server between HOSTING and Amazon. The aggregate results of five separate benchmarks are shown using different colored trend lines. Amazon was used as the baseline (0% on the y-axis) for comparison against HOSTING for each of the tests. HOSTING outperforms Amazon on most tests, by 50-80% on average, except for the LAME test, in which it is 10% slower.

Performance
TEST                  HOSTING AVG    AMAZON AVG     HOSTING VS AMZN
7zip                  3160 MIPS      1969 MIPS      60.5%
Timed Linux Kernel    391 seconds    935 seconds    58.2%
Timed Apache          56 seconds     191 seconds    70.7%
x264                  n/a Fps        15 Fps         66.7%
LAME MP3              33 seconds     30 seconds     -10.0%
AVERAGE                                             49.2%

Variability
TEST                  HOSTING CV    AMAZON CV
7zip                  6.8%          2.8%
Timed Linux Kernel    6.4%          5.5%
Timed Apache          6.4%          2.9%
x264                  n/a           2.2%
LAME MP3              4.7%          2.0%
AVERAGE               6.2%          3.1%

Test Descriptions: See Appendix C.
Internal Network Performance

Why Internal Network Performance Matters

The internal network is used to transmit data between different servers within a data center. In cloud computing, data is stored and processed in virtual environments spread across multiple physical servers. It is important that the data is accessible from anywhere within the data center, as quickly as possible. Applications that split their workload across multiple machines can be limited by bottlenecks in the internal network. When using a SAN, I/O capability can also be limited by a slow internal network. Even if a physical server is running top-of-the-line hardware, and the virtualization environment is appropriately abstracted, a slow network negates the benefit of having high-performance machines.

Results

HOSTING's network throughput for Linux systems is on average 888 Mibits/sec, which is more than 2.5x that of Amazon and nearly 9x that of Rackspace. HOSTING also provides the most stable network throughput, with a CV of 4.2%, compared to 15.9% for Rackspace and 37% for Amazon.

[Graph: Iperf (Linux) Network Throughput Results: HOSTING vs. Rackspace vs. Amazon]

The graph compares internal network performance between Linux servers for HOSTING, Rackspace, and Amazon. It shows the network throughput results from the Iperf benchmark for each provider over a period of 14 days, with two data points shown for each day. HOSTING outperforms Amazon by about 2.5x and Rackspace by about 8.5x.
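The 2.5x and roughly 8.5-9x multipliers quoted above follow directly from the average Iperf throughputs of 888, 343, and 101 Mibits/sec; a quick sanity check:

```python
# Average Iperf throughput in Mibits/sec, from the results in this section.
hosting, amazon, rackspace = 888, 343, 101

vs_amazon = hosting / amazon        # a bit over 2.5x
vs_rackspace = hosting / rackspace  # close to 9x
```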
PROVIDER     AVERAGE           STANDARD DEVIATION    CV       14-DAY HIGH       14-DAY LOW
HOSTING      888 Mibits/sec    37 Mibits/sec         4.2%     912 Mibits/sec    760 Mibits/sec
Rackspace    101 Mibits/sec    16 Mibits/sec         15.7%    148 Mibits/sec    85 Mibits/sec
Amazon       343 Mibits/sec    126 Mibits/sec        37%      720 Mibits/sec    286 Mibits/sec

The results of the Ping benchmark show the stability of HOSTING throughout the 14-day test period, while Rackspace's and Amazon's performance was erratic and unstable over the same duration. HOSTING has the lowest average latency, at 474 microseconds. Except for higher latency recorded on one day, HOSTING has the most stable latency for most of the 14-day test period.

[Graph: Ping (Linux) Internal Network Results: HOSTING vs. Rackspace vs. Amazon]

The graph compares internal network latency between Linux servers for HOSTING, Rackspace, and Amazon, showing the results from the Ping benchmark for each provider over a period of 14 days, with two data points shown for each day. HOSTING has stable latency throughout most of the testing period, compared to Rackspace and Amazon, whose latency is highly variable and elevated over several days.

Copyright Cloud Spectator, LLC. All rights reserved.
PROVIDER     AVERAGE    STANDARD DEVIATION    CV       14-DAY HIGH    14-DAY LOW
HOSTING      474 µs     466 µs                98.3%    2844 µs        324 µs
Rackspace    1353 µs    1210 µs               89.4%    4858 µs        411 µs
Amazon       1757 µs    712 µs                40.5%    3880 µs        1178 µs

In the remaining five network tests, HOSTING outperformed Rackspace by an average of 633%. Against Amazon, HOSTING outperformed by 111% on average (see chart on the following page). Comparing HOSTING to Rackspace, HOSTING dramatically outperforms Rackspace in a majority of the tests. In the Network FTP RAM, wget/apache, and wget/nginx tests, HOSTING is over 800% faster in each test. In the Network SCP test, HOSTING outperforms Rackspace by a significant 175%.

[Graph: Internal Network Test Result % Comparison (Linux): HOSTING's % Performance Above/Below Rackspace (+/-), showing FTP RAM, FTP Disk, Apache, nginx, and SCP]

The graph compares internal network performance on a Linux server between HOSTING and Rackspace. The aggregate results of five separate benchmarks are shown using different colored trend lines. Rackspace was used as the baseline (0% on the y-axis) for comparison to HOSTING for each of the tests. HOSTING outperforms Rackspace on each of the tests, by up to 1,000% and more.

Performance
TEST                HOSTING AVG       RACKSPACE AVG    HOSTING VS RAX
Network FTP RAM     834 Mibits/sec    92 Mibits/sec    806%
Network FTP Disk    522 Mibits/sec    90 Mibits/sec    480%
wget/apache         864 Mibits/sec    91 Mibits/sec    849%
wget/nginx          868 Mibits/sec    91 Mibits/sec    854%
Network SCP         239 Mibits/sec    87 Mibits/sec    175%
AVERAGE                                                633%

Variability
TEST                HOSTING CV    RACKSPACE CV
Network FTP RAM     7.8%          9.0%
Network FTP Disk    12.9%         12.8%
wget/apache         4.5%          11.2%
wget/nginx          4.2%          10.4%
Network SCP         8.3%          10.7%
AVERAGE             7.5%          10.8%

Comparing HOSTING to Amazon, HOSTING outperforms Amazon by at least 2x in 60% of the internal network tests. Performing best in the wget/apache and wget/nginx tests, HOSTING's internal transfer speed between web servers is higher than Amazon's by about 125%.
Overall, HOSTING performs better than Amazon across all the internal network tests, and its average CV across those tests is about 74% lower than Amazon's.
[Graph: Internal Network Test Result % Comparison (Linux): HOSTING's % Performance Above/Below Amazon (+/-), showing FTP RAM, FTP Disk, Apache, nginx, and SCP]

The graph compares internal network performance on a Linux server between HOSTING and Amazon. The aggregate results of five separate benchmarks are shown using different colored trend lines. Amazon was used as the baseline (0% on the y-axis) for comparison to HOSTING for each of the tests. HOSTING outperformed Amazon in each of the tests, and also shows greater stability in most of the tests by comparison.

Performance
TEST                HOSTING AVG       AMAZON AVG        HOSTING VS AMZN
Network FTP RAM     834 Mibits/sec    380 Mibits/sec    119%
Network FTP Disk    522 Mibits/sec    272 Mibits/sec    92%
wget/apache         864 Mibits/sec    382 Mibits/sec    126%
wget/nginx          868 Mibits/sec    387 Mibits/sec    124%
Network SCP         239 Mibits/sec    122 Mibits/sec    96%
AVERAGE                                                 111%

Variability
TEST                HOSTING CV    AMAZON CV
Network FTP RAM     7.8%          43.0%
Network FTP Disk    12.9%         3.5%
wget/apache         4.5%          44.6%
wget/nginx          4.2%          46.0%
Network SCP         8.3%          8.0%
AVERAGE             7.5%          29.2%

Test Descriptions: See Appendix D.
Disk Performance

Why Disk Performance Matters

The disk is where it all starts. All the data has to go somewhere, and the disk is where it is stored. Whether they use traditional spinning hard drives or newer solid-state drives (SSDs), service vendors need to provide customers with reliable and efficient means of storing their data. The disk needs to quickly and reliably read and write the data it receives from, or that is requested by, the other components of the server. The most common measurement of storage system performance is input/output operations per second (IOPS), which measures how many read and write operations the storage system can complete each second. As the quantity of data stored and analyzed scales upward, every IOPS a disk can handle carries greater significance.

Results

Windows Server

The results of the H2bench test clearly show Rackspace with the highest sustained disk read rate of the three providers. Rackspace averages 199 MiBytes/sec, compared to Amazon at 59 MiBytes/sec and HOSTING in a close third place at 55 MiBytes/sec. Amazon leads in performance stability, with a standard deviation of 2 MiBytes/sec and a CV of 3%. Rackspace is also relatively stable, with only an 11 MiBytes/sec standard deviation and a 6% CV. HOSTING is third, with a standard deviation of 5 MiBytes/sec and a 9% CV.

[Graph: H2bench (Windows) Sustained Disk Read Rate: HOSTING vs. Rackspace vs. Amazon]

The graph compares disk performance on a Windows server between HOSTING, Rackspace, and Amazon, showing the sustained disk read rates from the H2bench benchmark for each provider over a period of 14 days, with two data points shown for each day. Rackspace outperforms both HOSTING and Amazon over the course of the testing period, while HOSTING and Amazon perform almost on par with each other.
PROVIDER     AVERAGE            STANDARD DEVIATION    CV      14-DAY HIGH        14-DAY LOW
HOSTING      55 MiBytes/sec     5 MiBytes/sec         8.6%    60 MiBytes/sec     45 MiBytes/sec
Rackspace    199 MiBytes/sec    12 MiBytes/sec        5.7%    216 MiBytes/sec    171 MiBytes/sec
Amazon       59 MiBytes/sec     2 MiBytes/sec         2.9%    61 MiBytes/sec     54 MiBytes/sec

The File Copy test measures the read/write ability of the disk by copying a large file. HOSTING's disk performs the copy an average of 3x faster than Rackspace and 6x faster than Amazon. In terms of performance reliability, Rackspace and Amazon offer a more consistent disk speed. However, HOSTING's values at its lowest points are higher than the majority of the highest values for Rackspace and Amazon. When processing large amounts of data, slow and/or inconsistent read/write speeds can result in poor performance, affecting end users.
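As a rough illustration of what a file-copy benchmark does, the sketch below writes a file, copies it, and reports the observed MiB/sec. The file size and method are illustrative assumptions, not the actual File Copy test's parameters, and at a small size like this the OS page cache, not the disk, dominates the result:

```python
import os
import shutil
import tempfile
import time

def file_copy_mib_per_sec(size_mib=16):
    """Time copying a size_mib MiB file and return the observed MiB/sec.

    size_mib is an illustrative assumption; the real benchmark's file
    size is not stated in the report. Small files mostly exercise the
    OS page cache rather than sustained disk speed.
    """
    with tempfile.TemporaryDirectory() as tmp:
        src = os.path.join(tmp, "src.bin")
        dst = os.path.join(tmp, "dst.bin")
        with open(src, "wb") as f:
            f.write(os.urandom(1024 * 1024) * size_mib)  # size_mib MiB of data
        start = time.perf_counter()
        shutil.copyfile(src, dst)
        elapsed = time.perf_counter() - start
        return size_mib / elapsed
```

A production benchmark would use a file much larger than RAM, repeat the copy, and flush or bypass caches to measure the disk itself rather than memory.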
[Graph: File Copy (Windows) Test Results: HOSTING vs. Rackspace vs. Amazon]

The graph compares disk performance on a Windows server between HOSTING, Rackspace, and Amazon, showing the file copy speed from the File Copy benchmark for each provider over a period of 14 days, with two data points shown for each day. HOSTING outperforms Rackspace by nearly 3x and Amazon by nearly 6x throughout most of the testing period.

PROVIDER     AVERAGE            STANDARD DEVIATION    CV       14-DAY HIGH        14-DAY LOW
HOSTING      172 MiBytes/sec    14 MiBytes/sec        8.3%     190 MiBytes/sec    131 MiBytes/sec
Rackspace    62 MiBytes/sec     7 MiBytes/sec         10.6%    71 MiBytes/sec     47 MiBytes/sec
Amazon       30 MiBytes/sec     2 MiBytes/sec         8.2%     33 MiBytes/sec     25 MiBytes/sec

Linux Server

Rackspace outperforms both HOSTING and Amazon in the Dbench and File Copy tests, recording the highest average results of 227 MiBytes/sec on Dbench and 72 MiBytes/sec on File Copy. HOSTING nearly matched Rackspace's performance, with 208 MiBytes/sec on Dbench and 48 MiBytes/sec on File Copy. HOSTING and Amazon match up in stability on the Dbench test, but Rackspace varies across the entire testing period. In the File Copy test, Rackspace again offers the least predictable performance, with a CV of 24.7%.

[Graph: Dbench & File Copy (Linux) Test Results: HOSTING vs. Rackspace vs. Amazon]

The graph compares disk performance on a Linux server between HOSTING, Rackspace, and Amazon, showing the aggregate results from the Dbench and File Copy benchmark tests for each provider over a period of 14 days, with two data points shown for each day. Rackspace performs best on both tests, followed closely by HOSTING. Amazon performs around half as well as the other two providers.
PROVIDER               AVERAGE            STANDARD DEVIATION    CV       14-DAY HIGH        14-DAY LOW
HOSTING Dbench         208 MiBytes/sec    5 MiBytes/sec         2.2%     218 MiBytes/sec    198 MiBytes/sec
Rackspace Dbench       227 MiBytes/sec    60 MiBytes/sec        26.4%    311 MiBytes/sec    123 MiBytes/sec
Amazon Dbench          140 MiBytes/sec    4 MiBytes/sec         3.0%     150 MiBytes/sec    131 MiBytes/sec
HOSTING File Copy      48 MiBytes/sec     6 MiBytes/sec         12.3%    60 MiBytes/sec     32 MiBytes/sec
Rackspace File Copy    72 MiBytes/sec     18 MiBytes/sec        24.7%    92 MiBytes/sec     36 MiBytes/sec
Amazon File Copy       27 MiBytes/sec     1 MiBytes/sec         3.4%     28 MiBytes/sec     24 MiBytes/sec

HOSTING outperforms Rackspace by a large margin on file system write IOPS and throughput, by 144% and 128% respectively. However, HOSTING performs 65% and 74% slower than Rackspace in file system read throughput and IOPS, respectively.
[Graph: Disk Test Result % Comparison (Linux): HOSTING's % Performance Above/Below Rackspace (+/-), showing Throughput-Write, IOPS-Write, Throughput-Read, and IOPS-Read]

The graph compares disk performance on a Linux server between HOSTING and Rackspace. The aggregate results of four separate benchmarks are shown using different colored trend lines. Rackspace was used as the baseline (0% on the y-axis) for comparison to HOSTING for each of the tests. HOSTING outperforms Rackspace in the disk write tests, and Rackspace outperforms HOSTING in the disk read tests.

Performance
TEST                    HOSTING AVG    RACKSPACE AVG    HOSTING VS RAX
FIO Throughput Write    n/a Kb/sec     n/a Kb/sec       128%
FIO IOPS Write          743 Ops/sec    305 Ops/sec      144%
FIO Throughput Read     n/a Kb/sec     n/a Kb/sec       -65%
FIO IOPS Read           204 Ops/sec    784 Ops/sec      -74%

Variability
TEST                    HOSTING CV    RACKSPACE CV
FIO Throughput Write    23.2%         39.0%
FIO IOPS Write          25.8%         39.1%
FIO Throughput Read     10.5%         52.2%
FIO IOPS Read           10.5%         52.3%
AVERAGE                 17.5%         45.7%

HOSTING outperforms Amazon by a respective 102% and 116% on the throughput and IOPS file system write tests. However, Amazon outperforms HOSTING by a similar 68% on both the throughput and IOPS file system read tests.

[Graph: Disk Test Result % Comparison (Linux): HOSTING's % Performance Above/Below Amazon (+/-), showing Throughput-Write, IOPS-Write, Throughput-Read, and IOPS-Read]

The graph compares disk performance on a Linux server between HOSTING and Amazon. The aggregate results of four separate benchmarks are shown using different colored trend lines. Amazon was used as the baseline (0% on the y-axis) for comparison to HOSTING for each of the tests. HOSTING outperforms Amazon in the disk write tests, and Amazon outperforms HOSTING in the disk read tests.
Performance
TEST                    HOSTING AVG    AMAZON AVG     HOSTING VS AWS
FIO Throughput Write    n/a Kb/sec     n/a Kb/sec     102%
FIO IOPS Write          743 Ops/sec    344 Ops/sec    116%
FIO Throughput Read     n/a Kb/sec     n/a Kb/sec     -68%
FIO IOPS Read           204 Ops/sec    646 Ops/sec    -68%

Variability
TEST                    HOSTING CV    AMAZON CV
FIO Throughput Write    23.2%         38.7%
FIO IOPS Write          25.8%         38.8%
FIO Throughput Read     10.5%         13.1%
FIO IOPS Read           10.5%         13.1%
AVERAGE                 17.5%         25.9%

Test Descriptions: See Appendix E.
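FIO reports throughput and IOPS for the same runs, and for a fixed block size the two are tied together: throughput equals IOPS multiplied by the block size. A sketch of that relationship; the 4 KiB block size is an assumption, since the report does not state the block size used in its FIO runs:

```python
def throughput_kib_per_sec(iops, block_size_kib=4):
    """Throughput implied by an IOPS figure at a fixed block size.

    block_size_kib=4 is an assumption; the report does not state
    the block size used in its FIO runs.
    """
    return iops * block_size_kib

# HOSTING's measured write and read IOPS from the tables above:
write_kib = throughput_kib_per_sec(743)  # implied write throughput, KiB/sec
read_kib = throughput_kib_per_sec(204)   # implied read throughput, KiB/sec
```

This is why the IOPS and throughput rows in each table move together: a provider that wins on write IOPS at a given block size necessarily wins on write throughput by the same ratio.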
Appendix

Appendix A: Methodology

Server Setup

Two servers are set up on each provider, matching the following configurations as closely as possible. The Primary Server runs all tests specific to CPU and storage. The Secondary Server is used in conjunction with the Primary Server to test the internal network.

Primary Server: 2 vCPU, 4 GB RAM, 50 GB storage
Secondary Server: 1 vCPU, 1 GB RAM, 50 GB disk

The following server configurations were set up on each provider. The pre-configured VMs that characterize Amazon and Rackspace offerings make it impossible to match the target configurations exactly.

Amazon AWS EC2
Primary: Availability zone East-1a; OS: Ubuntu (Linux test), Windows Server 2008 R2 (Windows test); vCPUs: 1 (with 2 ECU); RAM: 3.75GB; Disk: 410GB
Secondary: Availability zone East-1a; OS: Ubuntu (Linux test), Windows Server 2008 R2 (Windows test); vCPUs: 1 (with 2 ECU); RAM: 3.75GB; Disk: 410GB

HOSTING Public Cloud (Cloud Enterprise)
Primary: Data center East-1; OS: Red Hat Enterprise Linux 5 (Linux test), Windows Server 2008 R2 (Windows test); vCPUs: 2; RAM: 4GB; Disk: 50GB
Secondary: Data center East-1; OS: Red Hat Enterprise Linux 5 (Linux test), Windows Server 2008 R2 (Windows test); vCPUs: 2; RAM: 4GB; Disk: 50GB

Rackspace OpenStack Cloud
Primary: Data center Dallas-Fort Worth (DFW); OS: Ubuntu (Linux test), Windows Server 2008 R2 (Windows test); vCPUs: 2; RAM: 4GB; Disk: 160GB
Secondary: Data center Dallas-Fort Worth (DFW); OS: Ubuntu (Linux test), Windows Server 2008 R2 (Windows test); vCPUs: 2; RAM: 4GB; Disk: 50GB

Tests Used

The benchmarks used to test the providers in this report are derived from a number of sources. Many of the Linux tests use benchmarks from the Phoronix Test Suite, a compilation of varying types of open-source benchmarks. Most of the Windows tests use benchmarks from the CPU and disk tests of PassMark Software. For descriptions of the tests used, please see the Test Descriptions below.
Tests by Category

System (Linux): UnixBench

CPU (Linux): 7zip Compression, Timed Linux Kernel Compilation, Timed Apache Compilation, X264, LAME MP3 Encoding
CPU (Windows): PassMark

Internal Network (Linux): Iperf, Ping, Network FTP RAM, Network FTP Disk, Wget/Apache, Wget/Nginx, Network SCP

Storage (Linux): DBench, File Copy, FIO Throughput Write, FIO Throughput Read, FIO IOPS Write, FIO IOPS Read
Storage (Windows): H2Bench, File Copy

Timeframe

The test period for Linux systems ranges from 5/23/2013 to 6/5/2013. The test period for Windows systems ranges from 5/16/2013 to 5/29/2013.
Data Collection

Performance data for each test was collected three times per day, every day, throughout the above testing periods. The data shown in the graphs and tables takes only the highest and lowest point of each day into account. Rather than showing all the data points collected, showing only the daily highs and lows lets customers determine the range of best-to-worst performance they could expect from a virtual server on each provider. Cloud Spectator obtains cloud servers by purchasing the server space directly from the providers, as any user would. For certain providers, the client may reimburse Cloud Spectator for the server space needed for data collection relevant to that active project. Cloud Spectator collects and compiles the data into the CloudSpecs database and translates it into a visual display.

Terms and Definitions

For the purpose of understanding the relational values of the data, all numbers within the tables below each graph are expressed as whole numbers except the percentages, which are expressed to the tenth decimal place. Percentages are expressed in that manner to account for instances when the coefficient of variation (CV), which is expressed as a percentage, falls below 1%, indicating a high degree of performance predictability.

Average

When describing averages, Cloud Spectator refers to the average numerical value over a period of x days from y until z (14 days; from 5/23/2013 to 6/5/2013 and from 5/16/2013 to 5/29/2013). Average provider scores can be found inside the tables underneath each graph within this document. The average is used to summarize the data from the charts in a simplified overview.

Standard Deviation

The standard deviation is calculated over a period of x days from y until z (14 days; from 5/23/2013 to 6/5/2013 and from 5/16/2013 to 5/29/2013). The standard deviation can be found inside the tables underneath each graph within this document.
The standard deviation is used to understand the amount of variation from the average benchmark score of a provider; i.e., how predictable a provider's server performance is for that test. The standard deviation can only be used to understand the amount of variation within a certain provider, and cannot be used to compare among providers.

Coefficient of Variation (CV)
The coefficient of variation is expressed as a percentage. The CV can be found inside the tables underneath each graph within this document. The CV is a measure of precision: it normalizes the standard deviation as a percentage of the average, which can be compared across providers. A lower CV means more stable performance.

CV = [(Standard Deviation) / (Average)] * 100

14-Day Highs and Lows
From the tested periods (5/23/2013 to 6/5/2013 and 5/16/2013 to 5/29/2013), Cloud Spectator extracts and presents the highest and lowest scores achieved by each provider in the tables underneath the graphs within this document.

Appendix B Test Descriptions
UnixBench
The purpose of UnixBench is to provide a basic indicator of the performance of a Unix-like system; hence, multiple tests are used to exercise various aspects of the system's performance. These test results are then compared to the scores from a baseline system to produce an index value, which is generally easier to handle than the raw scores. The entire set of index values is then combined to make an overall index for the system.

Appendix C Test Descriptions
Passmark (Windows)
The Passmark suite is comprised of nine separate CPU tests that are all aggregated into a final score:
- Integer Math Test: measures how fast the CPU can perform mathematical operations with integers
- Compression Test: measures the speed at which the CPU can compress blocks of data into smaller blocks (kilobytes/sec)
- Prime Number Test: measures how fast the CPU can search for prime numbers (operations/sec)
- Encryption Test: encrypts blocks of random data with different encryption methods
- Floating Point Math Test: performs the same operations as the Integer Math Test, but with numbers with fractional parts
- Multimedia Instructions: measures the SSE capabilities (instructions for processing data at higher speeds) of a CPU
- String Sorting Test: uses the qsort algorithm to see how fast the CPU can sort strings
- Physics Test: uses the Tokamak Physics Engine to measure how fast the CPU can calculate the physics interactions of hundreds of colliding objects
- Single Core Test: an aggregate of the floating point, string sorting, and data compression tests

Source: cpubenchmark.net

7-zip Compression
A benchmark found within the Phoronix Test Suite. The test measures how many instructions a CPU can complete per second, expressed as millions of instructions per second (MIPS). The test compresses a file of random data with the 7-zip program, then divides the number of CPU instructions executed during the compression by the number of seconds elapsed. The result is then divided by one million to yield the value in MIPS.

Timed Linux Kernel Compilation
A test to determine the amount of time needed to build the Linux kernel.

Timed Apache Compilation
A test to determine the amount of time needed to build the Apache HTTP server.

X264
A benchmark found within the Phoronix Test Suite. Tests the performance of the CPU by converting an uncompressed video to MPEG-4/H.264 format. Scoring is based upon the number of frames converted per second.

LAME MP3 Encoding
Measures the amount of time required to convert a WAV file to MP3 format.
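The MIPS arithmetic described for the 7-zip test reduces to a single division; as a sketch (the instruction count and runtime below are hypothetical, not measured values):

```python
def compression_mips(instructions_executed: float, elapsed_seconds: float) -> float:
    """MIPS as computed for the 7-zip test: instructions executed during the
    compression, divided by elapsed seconds, divided by one million."""
    return instructions_executed / elapsed_seconds / 1_000_000

# e.g. 9.2e10 instructions retired over a 12.5-second compression run
print(compression_mips(9.2e10, 12.5))  # 7360.0
```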
Appendix D Test Descriptions
Iperf
A frequently used network testing tool that can create TCP and UDP data streams to measure the throughput of the network carrying them. Cloud Spectator's test transfers as much data as possible through the local network for 120 seconds over a TCP port.

Ping
In contrast to throughput testing, ping commands can be used to measure the latency between nodes on a private network. High latency severely degrades application performance in a distributed computing environment and can adversely impact user experience, especially for time-sensitive applications.

Other Benchmarks:
Network FTP RAM: transfer of a 1 GB file from ramdisk to ramdisk between two machines hosted by the same provider, using the lftp client and vsftpd server.
Network FTP Disk: transfer of a 5 GB file from disk to disk between two machines hosted by the same provider, using the lftp client and vsftpd server.
wget/Apache: transfer of a 1 GB file between two machines hosted by the same provider, using the Apache web server and wget client.
wget/nginx: transfer of a 1 GB file between two machines hosted by the same provider, using the nginx web server and wget client.
Network SCP: transfer of a 5 GB file from disk to disk between two machines hosted by the same provider, using scp.

Appendix E Test Descriptions
H2bench (Windows)
Measures the average sustained disk read rate.

File Copy (Linux/Windows)
Copies a 10 GB file to the same disk.

Dbench
Dbench is a benchmark designed by the Samba project as a free alternative to Netbench; Dbench contains only the file-system calls, for testing disk performance.

FIO
Cloud Spectator used Fio to measure file system IO. Fio is a benchmark tool used to measure IO for hardware verification. For this set of tests, Cloud Spectator runs a random workload with a Pareto distribution over a 10 GB file, with 1 thread and with 8 concurrent threads, to understand performance as the system is scaled. The data is reduced to values of operations per second. File system IO is important in understanding and predicting the performance of production cloud servers, such as database servers.

About Cloud Spectator
Cloud Spectator is the premier international cloud analyst group focused on infrastructure pricing and server performance. Since 2011, Cloud Spectator has monitored the cloud infrastructure industry on a global scale and continues to produce research reports that help businesses make informed purchase decisions, leveraging its CloudSpecs utility, an application that automates live server performance tests 3 times a day, 365 days a year, using open-source benchmark tests. Currently, the CloudSpecs system actively tracks 20 of the top IaaS providers around the world.

Cloud Spectator
485 Massachusetts Avenue, Suite 300
Cambridge, MA
Website:
Phone: (USA)
