Cloud Computing Performance Benchmarking Report
Comparing ProfitBricks and Amazon EC2 using the standard open source tools UnixBench, DBENCH and Iperf
October 2014
TABLE OF CONTENTS
The Cloud Computing Performance Benchmark report is divided into several sections:
  Introduction
  Executive Summary
  Benchmark Testing Methodology
  UnixBench Results
  DBench Results
  Iperf Results
  Next Steps
We invite you to contact one of our cloud service engineers by emailing inbound-us@profitbricks.com or calling 866-852-5229. We offer a 14-day, no-obligation trial that does not require a credit card. Please visit http://www.profitbricks.com to get started.
Introduction
At ProfitBricks, cloud computing performance is a primary focus for our teams of engineers. We take a comprehensive view of IaaS performance benchmarking: we contract with 3rd parties; we explore, research and develop new testing methodologies in-house; and we discover the bugs and limitations of traditional and new benchmarking tools. As a cloud computing service provider, we realize that we are an edge case. Our responsibility for high performance extends from hardware and network architecture to the virtualization layer and the software we develop to manage the environment. Cloud computing's multi-tenant architecture requires that we maintain performance levels that exceed our customers' expectations, and we strive to remain the highest-performance cloud available in the market. To this end, the ProfitBricks performance engineering team continually tests the performance of other cloud platforms and services. Performance is our passion, and the team is dedicated to publishing accurate and repeatable results.
Benchmark testing dissimilar cloud computing environments requires a thorough understanding of the respective environments and the effects that each component may have on performance. It's also essential that each benchmark test is run on top of a similar stack and software configuration. We recommend downloading our Workload-Specific Benchmarks report, in which we explore the performance of specific applications and databases like MySQL and Apache.
ProfitBricks strives to share our expertise with the community and to engage in an open dialogue about cloud performance. We welcome questions about our methodologies.
Executive Summary
In this report, the ProfitBricks performance engineering team presents our latest series of standardized benchmark testing results (UnixBench, DBENCH and Iperf), comparing ProfitBricks virtual data centers and instances with Amazon's EC2 instances. We strive to create an apples-to-apples comparison of virtual servers/instances at both ProfitBricks and Amazon EC2, and have presented our methodologies with full transparency (see the next section, Benchmark Testing Methodology).
In January 2014, Amazon introduced a new line of instance sizes based on new CPUs and a new architecture. ProfitBricks engineers spent eight weeks configuring and testing these new EC2 m3 instance types and compared them to similar ProfitBricks instances, alongside Amazon's legacy m1 instances. The results show that ProfitBricks continues to be the performance leader in the cloud on all tested configurations, using all three standard benchmarks, with results that are at least twice the performance of Amazon. In some cases the performance difference between ProfitBricks and Amazon EC2 is an astounding 17x (Iperf).
Benchmark Testing Methodology
All benchmarking tools were compiled on Ubuntu 12.04 (64-bit) servers running a 3.2.0-58 Linux kernel, and each test was performed three times on three separate days, with the results averaged. By taking advantage of ProfitBricks' granular scaling features, we were able to compare Amazon's latest EC2 offerings to ProfitBricks servers with similar resource specifications. For example, we contrasted the performance of an EC2 m3.large with 2 vCPUs and 7.5GB of RAM against a ProfitBricks server containing 2 dedicated cores and 7.5GB of dedicated RAM (and equivalent InfiniBand-powered, double-redundant storage devices). In all cases, these are standard configurations at both ProfitBricks and Amazon; no special or optional services were added. For each report, three test runs were completed on each instance over three days. Note: we continually see performance variations (sometimes large) on Amazon EC2 services that vary by instance size, by hour, and by data center.
This is the main reason for multiple runs: we want to ensure that we've determined a representative composite result.
Server Configuration: Amazon virtual instances are sold with a pre-packaged quantity of vCPUs, RAM and temporary storage. ProfitBricks virtual instances are not pre-packaged, so every instance can be uniquely configured. We've done our best to match ProfitBricks instances to equivalent Amazon instances on a hardware basis, not a cost basis.
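The "three runs, averaged" aggregation described above can be sketched as a small shell helper. This is a hypothetical illustration (the report does not publish its aggregation script), and the three example scores are made-up inputs, not measured results:

```shell
#!/bin/sh
# Average benchmark scores from several runs (one value per line on stdin).
# Hypothetical helper; the report averages three runs over three days.
average() {
    awk '{ sum += $1; n++ } END { if (n) printf "%.2f\n", sum / n }'
}

# Example with three made-up UnixBench scores from three separate runs:
printf '%s\n' 1210.10 1214.30 1215.08 | average
# -> 1213.16
```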
OS Configuration:
Operating System: Ubuntu 12.04 (64-bit) with 3.2.0-58 Linux kernel
Each instance had the following packages installed if the provider did not include them with their images: make, GCC, automake, libpopt-dev, zlib1g-dev, g++, git.
All benchmark software used in these tests is open source, and we have provided the information you would need (including configuration details) to run these tests yourself. You can download the tools from these URLs:
UnixBench: https://byte-unixbench.googlecode.com/files/unixbench5.1.3.tgz
DBENCH: Use git to create a local branch of the official git repository and download the tree:
git clone git://git.samba.org/sahlberg/dbench.git dbench
Iperf: http://downloads.sourceforge.net/project/iperf/iperf-2.0.5.tar.gz
For DBENCH and UnixBench, both of which test storage performance, a separate block device (EBS on EC2 vs. ProfitBricks block storage) was formatted with ext4 and mounted:
mkfs.ext4 /dev/second_device
mount /dev/second_device /bench
UnixBench Results
UnixBench is one of the most popular open source benchmarking tools; it combines multiple tests to assess various aspects of a system's performance on a Unix-like system. ProfitBricks performance engineers ran these tests multiple times over numerous days. The entire set of index values is then combined to calculate an overall performance index for the system.
In this UnixBench benchmark test, ProfitBricks exceeds the performance of Amazon's new m3 instances by 2.0 to 4.4x, and Amazon's m1 instances by 2.2 to 3.1x, depending on the instance size. These tests were run over three separate days; the results and test configuration can be found in the UnixBench test results on the next two pages.
UnixBench test configuration:
Set the HZ environment variable to what is configured as "CONFIG_HZ_?" in the kernel configuration:
grep 'CONFIG_HZ_' /boot/config-3.2.0-58-virtual
In the case of Ubuntu 12.04 LTS this is 250 HZ.
Edit the Makefile and change the following line:
from: OPTON = -O2 -fomit-frame-pointer -fforce-addr -ffast-math -Wall
to:   OPTON = -march=native -O2 -fomit-frame-pointer -fforce-addr -ffast-math -Wall
Compile UnixBench:
HZ="250" make
Run UnixBench:
HZ="250" ./Run -c NR_OF_CORES
(NR_OF_CORES equals the number of virtual cores available to each VM.)
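The Makefile edit described above can also be scripted. A minimal sketch using GNU sed, demonstrated on a stand-in file (the OPTON line matches UnixBench 5.1.3; the sed invocation itself is our own illustration, not part of the report's procedure):

```shell
# Patch the UnixBench Makefile to prepend -march=native, as described above.
# Demonstrated on a stand-in file; point sed at UnixBench's real Makefile.
cat > Makefile.demo <<'EOF'
OPTON = -O2 -fomit-frame-pointer -fforce-addr -ffast-math -Wall
EOF

sed -i 's/^OPTON = -O2/OPTON = -march=native -O2/' Makefile.demo
cat Makefile.demo
# -> OPTON = -march=native -O2 -fomit-frame-pointer -fforce-addr -ffast-math -Wall
```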
UnixBench Results: ProfitBricks Cloud Server Instances vs. Amazon AWS EC2 Instances (m3 type)

Instance size (EC2 / ProfitBricks equivalent)   ProfitBricks score   AWS EC2 score   Performance Advantage
m3.medium (1 CPU core / 3.75GB RAM)             1213.16              274.31          4.4x
m3.large  (2 CPU cores / 7.5GB RAM)             1766.58              759.50          2.3x
m3.xlarge (4 CPU cores / 15GB RAM)              2475.73              1218.80         2.0x
UnixBench Results: ProfitBricks Cloud Server Instances vs. Amazon AWS EC2 Instances (m1 type)

Instance size (EC2 / ProfitBricks equivalent)   ProfitBricks score   AWS EC2 score   Performance Advantage
m1.medium (1 CPU core / 3.75GB RAM)             1213.16              389.10          3.1x
m1.large  (2 CPU cores / 7.5GB RAM)             1766.58              762.82          2.3x
m1.xlarge (4 CPU cores / 15GB RAM)              2475.73              1103.38         2.2x
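The Performance Advantage column in the tables above is simply the ratio of the two scores, rounded to one decimal place. A one-line helper reproduces it:

```shell
# Compute the performance-advantage ratio from two benchmark scores,
# rounded to one decimal place as in the result tables above.
advantage() {
    awk -v a="$1" -v b="$2" 'BEGIN { printf "%.1fx\n", a / b }'
}

advantage 1213.16 274.31    # m3.medium -> 4.4x
advantage 2475.73 1218.80   # m3.xlarge -> 2.0x
```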
DBench Results
DBENCH is a popular open source performance testing tool that generates I/O workloads for a file system or a networked CIFS or NFS server. It is used to stress a file system to determine at which workload it becomes saturated.
In this DBENCH benchmark test, ProfitBricks exceeds the performance of standard EBS volumes on Amazon's new m3 instances by 10.5 to 16.2x, and Amazon's m1 instances by 9.5 to 12.7x, depending on instance size. These tests were run over three separate days; the results and test configuration can be found in the DBENCH test results on the next two pages.
DBENCH test configuration:
Compile DBENCH:
CFLAGS="-march=native" ./autogen.sh
CFLAGS="-march=native" ./configure
make
Run DBENCH:
./dbench --backend=fileio -t 60 -D /benchmark/bench --loadfile=loadfiles/client.txt 48
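dbench reports its aggregate result on a final "Throughput" summary line; the MB/s figures in the tables that follow come from that line. A sketch of extracting the number with awk (the sample line below is illustrative of dbench's output format, not a recorded test run):

```shell
# Extract the MB/s figure from dbench's final "Throughput" summary line.
# The sample line is illustrative output, not data from an actual run.
sample='Throughput 344.85 MB/sec  48 clients  48 procs  max_latency=12.521 ms'
echo "$sample" | awk '/^Throughput/ { print $2 }'
# -> 344.85
```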
DBENCH Results: ProfitBricks Cloud Server Instances vs. Amazon AWS EC2 Instances (m3 type)

Instance size (EC2 / ProfitBricks equivalent)   ProfitBricks DBENCH (MB/s)   AWS EC2 DBENCH (MB/s)   Performance Advantage
m3.medium (1 CPU core / 3.75GB RAM)             344.85                       32.87                   10.5x
m3.large  (2 CPU cores / 7.5GB RAM)             505.05                       31.13                   16.2x
m3.xlarge (4 CPU cores / 15GB RAM)              641.31                       48.63                   13.2x
DBENCH Results: ProfitBricks Cloud Server Instances vs. Amazon AWS EC2 Instances (m1 type)

Instance size (EC2 / ProfitBricks equivalent)   ProfitBricks DBENCH (MB/s)   AWS EC2 DBENCH (MB/s)   Performance Advantage
m1.medium (1 CPU core / 3.75GB RAM)             344.85                       27.13                   12.7x
m1.large  (2 CPU cores / 7.5GB RAM)             505.05                       41.74                   12.1x
m1.xlarge (4 CPU cores / 15GB RAM)              641.31                       67.80                   9.5x
Iperf Results
Iperf is a popular open source network testing tool that creates TCP and UDP data streams and measures the throughput of the network that it is running on.
In this Iperf benchmark test, ProfitBricks exceeds the performance of Amazon's new m3 instances by 7.3 to 17x, and Amazon's m1 instances by 4.9 to 7.2x, depending on the instance size. These tests were run over three separate days; the results and test configuration can be found in the Iperf test results on the next two pages.
Iperf test configuration:
VMs were configured on both providers for both the client and the server. The instances appeared to run on separate hardware, ensuring that a local bridge was not used in these tests.
Compile iperf:
./configure
make
Run iperf:
Server: ./src/iperf -s
Client: ./src/iperf -c IP_OF_SERVER -f m -t 60 -P 4
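With four parallel streams (-P 4) and megabit formatting (-f m), the iperf 2 client prints a [SUM] line carrying the combined bandwidth; the Mbit/s figures in the following tables come from that line. A sketch of pulling the number out with awk (the sample line below is illustrative of iperf's output format, not a recorded test run):

```shell
# Pull the aggregate bandwidth from an iperf 2 client run with -P 4:
# the [SUM] line carries the combined Mbit/s figure (second-to-last field).
# The sample line is illustrative output, not data from an actual run.
sample='[SUM]  0.0-60.0 sec  36600 MBytes  5113 Mbits/sec'
echo "$sample" | awk '/\[SUM\]/ { print $(NF-1) }'
# -> 5113
```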
Iperf Results: ProfitBricks Cloud Server Instances vs. Amazon AWS EC2 Instances (m3 type)

Instance size (EC2 / ProfitBricks equivalent)   ProfitBricks Iperf (Mbit/s)   AWS EC2 Iperf (Mbit/s)   Performance Advantage
m3.medium (1 CPU core / 3.75GB RAM)             5112.67                       300.11                   17.0x
m3.large  (2 CPU cores / 7.5GB RAM)             5189.67                       678.78                   7.6x
m3.xlarge (4 CPU cores / 15GB RAM)              5660.89                       778.89                   7.3x
Iperf Results: ProfitBricks Cloud Server Instances vs. Amazon AWS EC2 Instances (m1 type)

Instance size (EC2 / ProfitBricks equivalent)   ProfitBricks Iperf (Mbit/s)   AWS EC2 Iperf (Mbit/s)   Performance Advantage
m1.medium (1 CPU core / 3.75GB RAM)             5112.67                       1033.56                  4.9x
m1.large  (2 CPU cores / 7.5GB RAM)             5189.67                       717.67                   7.2x
m1.xlarge (4 CPU cores / 15GB RAM)              5660.89                       1120.89                  5.1x
Next Steps
ProfitBricks is here to help you evaluate us as a cloud provider, from benchmark testing to pricing questions. Our team of knowledgeable cloud service engineers is available by emailing inbound-us@profitbricks.com or calling 866-852-5229. ProfitBricks offers a 14-day, no-obligation trial account that does not require a credit card for activation. Visit http://www.profitbricks.com/trial today.
Download our Workload-Specific Benchmarks report at http://info.profitbricks.com/rs/profitbricks/images/cloud-computing-performance-workloadbenchmarks-aws-.pdf.
Finally, for up-to-the-minute information about our products and services, be sure to check out our Resource Center at http://www.profitbricks.com/press-and-info-center.

ProfitBricks Inc.
15900 La Cantera Pkwy Ste. 19210
San Antonio, TX 78256
Phone: +1 866 852 5229
Fax: +1 888 620 3376
Email: info-us@profitbricks.com
http://www.profitbricks.com
twitter.com/profitbricksusa
blog.profitbricks.com

2014 ProfitBricks Inc. All rights reserved. ProfitBricks, the ProfitBricks logo and Data Center Designer are trademarks of ProfitBricks Inc. All other trademarks are the property of their respective owners. ProfitBricks reserves the right to make changes without further notice.