Broadcom FCoE/iSCSI and IP Networking Adapter Evaluation (Demartek, June 2012)


June 2012

FCoE/iSCSI and IP Networking Adapter Evaluation
Evaluation report prepared under contract with Broadcom Corporation

Introduction

Enterprises are moving toward 10 Gigabit networking infrastructures and are examining the uses for multiprotocol adapters, especially as server virtualization continues to be deployed. With an increasing number of operating systems and applications installed in today's servers, network and storage adapters that can support multiple protocols become more important. IT administrators are also turning to 10 Gigabit Ethernet (10GbE) technologies as a cost-effective and flexible way to address growing network traffic demands.

A key component of the 10GbE network is a network adapter with network protocol processing offload capabilities, for greater host processor efficiency. 10GbE networking adapters by Broadcom allow servers to fully leverage the available 10GbE bandwidth while reducing processor utilization. As a result, 10GbE end-to-end performance is comparable to more specialized and costlier cloud-based data center server-to-fabric interconnects. With 10GbE, organizations can expand application capabilities, increase scalability and improve responsiveness to address dynamic business environments.

Broadcom has introduced its BCM957810S dual-port Converged NIC, which supports 10 Gbps IP networking with network protocol offload capabilities, Fibre Channel over Ethernet (FCoE) with full hardware offload, and iSCSI with full hardware offload. This adapter fits the description of a Converged Network Adapter (CNA). Broadcom commissioned Demartek to evaluate this adapter and compare its performance to other competitive 10 Gbps adapters.

Evaluation Environment

For the storage protocol offload testing, the evaluation environment consisted of a Windows Server 2008 R2 SP1 platform with an Intel Xeon E5-2690 processor and 32GB of RAM, using various 10 Gbps adapters communicating with two different DRAM-based storage targets. The FCoE storage target consisted of 32 DRAM LUNs and the iSCSI storage target consisted of 24 DRAM LUNs. The open-source I/O load generator IOmeter was used to drive the test workloads. As is customary for storage performance tests, throughput results are measured in megabytes per second (MBPS).

For the IP networking tests, the evaluation environment consisted of a server with the 10GbE adapters installed, connected to a 10GbE/1GbE network environment with 16 clients, and, separately, the same server and 10GbE adapters connected to specialized networking hardware. As is customary for network performance tests, throughput results are measured in megabits per second (Mbps).

The 10 Gbps adapters shown in the table below were tested. The most current drivers and firmware were installed for each of the adapters for these tests.

Vendor     Adapter Model               Hardware Offload
Broadcom   BCM957810S                  Yes
Emulex     OCe11102-F & OCe11102-I     Yes
Intel      X520                        No
QLogic     QLE8242                     Yes

Evaluation Summary

Storage Offload Performance
We found that the Broadcom BCM957810S adapter outperformed the competitive adapters. We were able to achieve approximately 2.5 million IOPS for FCoE random reads and more than 1.5 million IOPS for iSCSI random reads using the BCM957810S adapter. Broadcom demonstrated very strong performance and good CPU efficiency when compared to both full-hardware-offload adapters and non-hardware-offload adapters.

IP Networking Performance
We found that the Broadcom BCM957810 10GbE adapter outperformed the equivalent competitive adapters in our tests, for the transmit and receive tests as well as the small packet tests.
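To relate the two storage metrics quoted above and used throughout this report, note that throughput in megabytes per second is simply IOPS multiplied by the I/O block size. The following is a minimal illustrative sketch of ours, not part of the Demartek test tooling:

    # Illustrative only: relates the IOPS and MBPS metrics used in this report.
    def mbps_from_iops(iops: float, block_bytes: int) -> float:
        """Storage throughput in megabytes per second for a given IOPS rate."""
        return iops * block_bytes / 1_000_000

    # 2.5 million 512-byte random reads per second is only ~1,280 MB/s, which
    # is why small-block tests stress IOPS rather than bandwidth.
    print(mbps_from_iops(2_500_000, 512))   # 1280.0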

1 FCoE Performance: Full Hardware Offload Comparisons

As enterprises review their long-term server, networking and storage plans, consideration must be given to converged networking technologies such as Fibre Channel over Ethernet (FCoE). The long-term economics of fewer adapters, cables and disparate switches can be a compelling reason to explore the use of FCoE.

The Broadcom adapter significantly outperformed the competitive adapters at the smaller block sizes in terms of I/Os per second (IOPS).

[Chart: IOPS - RndRead - FCoE (Broadcom, Emulex, QLogic). Callout: 25% higher than Emulex, 814% higher than QLogic at 512 bytes.]

[Chart: IOPS - RndWrite - FCoE (Broadcom, Emulex, QLogic).]

The Broadcom adapter achieved line rate at smaller block sizes than the competitive adapters in terms of bandwidth throughput. At 8K block sizes and above, one of the competitive adapters matched the performance of the Broadcom adapter for FCoE random reads; in the graph, the two data lines overlap for those data points.

[Chart: MBPS - RndRead - FCoE (Broadcom, Emulex, QLogic).]

[Chart: MBPS - RndWrite - FCoE (Broadcom, Emulex, QLogic).]
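As a back-of-the-envelope check (our sketch, not from the report), the IOPS needed to saturate a single 10GbE port falls quickly as block size grows, which is why the bandwidth curves of competing adapters converge at the larger block sizes:

    # Raw 10GbE line rate, before Ethernet/FCoE protocol overhead.
    LINE_RATE_MBPS = 10_000_000_000 / 8 / 1_000_000   # 1,250 MB/s per port

    for block in (512, 4096, 8192, 65536):
        iops_needed = LINE_RATE_MBPS * 1_000_000 / block
        print(f"{block:>6}-byte blocks: ~{iops_needed:,.0f} IOPS to reach line rate")
    # 512-byte blocks need ~2.4 million IOPS per port; 8K blocks need only ~153,000.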

2 FCoE Performance: Non-Hardware Offload Comparisons

These comparison charts show the Broadcom adapter's performance compared to the non-hardware-offload (Intel) adapter. At the smaller block sizes, the Broadcom adapter significantly outperformed the Intel adapter for FCoE random reads. At the larger block sizes, the two adapters performed approximately the same, so there is some overlap in the lines on the graph.

[Chart: IOPS - RndRead - FCoE (Broadcom, Intel). Callout: 1% higher than Intel at 512 bytes.]

[Chart: MBPS - RndRead - FCoE (Broadcom, Intel).]

For FCoE random reads, the Broadcom adapter was better in terms of CPU effectiveness, delivering more IOPS and more MBPS for each percentage point of CPU utilization.

[Chart: IOPS Per CPU % - RndRead - FCoE (Broadcom, Intel). Callout: 1% higher than Intel at 2K.]

[Chart: MBPS Per CPU % - RndRead - FCoE (Broadcom, Intel).]
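The efficiency metric charted here divides delivered work by its host CPU cost. A minimal sketch of ours (the numbers below are made up purely for illustration):

    def per_cpu_percent(rate: float, cpu_util_percent: float) -> float:
        """IOPS (or MBPS) delivered per percentage point of CPU utilization."""
        return rate / cpu_util_percent

    # Hypothetical: 1,000,000 IOPS at 40% CPU beats 1,100,000 IOPS at 80% CPU
    # on this metric, even though its raw IOPS figure is lower.
    print(per_cpu_percent(1_000_000, 40.0))   # 25000.0
    print(per_cpu_percent(1_100_000, 80.0))   # 13750.0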

3 iSCSI Performance: Full Hardware Offload Comparisons

As enterprises consider deploying iSCSI in 10GbE environments, iSCSI adapter performance can be a critical factor in the overall iSCSI solution. Broadcom demonstrated very strong iSCSI performance, as shown by the results below. These comparison charts show the Broadcom adapter's performance compared to the iSCSI hardware-offload (Emulex and QLogic) adapters. Again, the Broadcom adapter significantly outperformed the competitive adapters at the smaller block sizes in terms of I/Os per second (IOPS).

[Chart: IOPS - RndRead - iSCSI (Broadcom, Emulex, QLogic). Callout: 1% higher than Emulex, 15% higher than QLogic at 512 bytes.]

[Chart: IOPS - RndWrite - iSCSI (Broadcom, Emulex, QLogic).]

The Broadcom adapter achieved line rate at the block sizes shown on the charts for iSCSI offload, while the competitive hardware-offload adapters were unable to achieve line rate at these block sizes.

[Chart: MBPS - RndRead - iSCSI (Broadcom, Emulex, QLogic).]

[Chart: MBPS - RndWrite - iSCSI (Broadcom, Emulex, QLogic).]

For iSCSI hardware-offload random reads, the Broadcom adapter was better in terms of CPU effectiveness for IOPS.

[Chart: IOPS Per CPU % - RndRead - iSCSI (Broadcom, Emulex, QLogic).]

4 iSCSI Performance: Non-Hardware Offload Comparisons

These comparison charts show the Broadcom adapter's performance compared to the non-hardware-offload iSCSI (Intel) adapter. For iSCSI random read operations, the Broadcom adapter significantly outperformed the competitive adapter, especially at many of the smaller block sizes.

[Chart: IOPS - RndRead - iSCSI (Broadcom, Intel). Callout: 119% higher than Intel at 512 bytes.]

[Chart: MBPS - RndRead - iSCSI (Broadcom, Intel).]

For iSCSI random reads, the Broadcom adapter was better in terms of CPU effectiveness, delivering more IOPS and more MBPS for each percentage point of CPU utilization.

[Chart: IOPS Per CPU % - RndRead - iSCSI (Broadcom, Intel). Callout: 361% higher than Intel at 4K.]

[Chart: MBPS Per CPU % - RndRead - iSCSI (Broadcom, Intel).]

5 IP Networking Performance: Transmit and Receive Tests - Linux

As public and private cloud providers move to 10GbE infrastructure, the choice of 10GbE adapters in host servers plays an important role. Many cloud environments use Linux hosts because of the low cost of the operating system.

IXIA IxChariot
For these tests, dual-port 10GbE adapters were tested using IXIA IxChariot, a popular network test tool that simulates real-world applications to predict device and system performance under realistic load conditions. Comprising the IxChariot Console, Performance Endpoints and IxProfile, the IxChariot product family offers thorough network performance assessment and device testing by simulating protocols across Ethernet network endpoints. IxChariot can assess the performance characteristics of any application running on wired and wireless networks.

Three Tests
Three tests were run, measuring throughput performance and CPU utilization:
- Transmit/Receive (TX/RX) Test
- Receive-only (RX) Test
- Transmit-only (TX) Test

For these tests, the system under test (SUT) was configured as shown in the diagram. Each 10GbE port was connected to a different 10GbE switch. These switches have a combination of 10GbE and 1GbE ports. Network traffic was sent from the server to each of 16 clients. Each client had two 1GbE adapters, one connected to the first 10GbE switch and the other connected to the second switch. The goal was to ensure there was enough traffic flowing to and from the clients to keep the 10GbE adapter ports on the SUT fully busy.

For these tests, both of the 10Gb ports on the adapters were connected and tested together. For the Transmit/Receive test, the maximum combined theoretical line rate is 40Gb per second, because traffic is flowing in both directions on both ports simultaneously. For the receive-only and transmit-only tests, traffic flows in one direction only, resulting in a maximum combined line rate of 20Gb per second. In all cases, traffic must flow through the adapter, its driver and the full TCP/IP stack on the host server; these tests measure the throughput performance and CPU utilization of the complete solution.
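For readers who want to reproduce the flavor of these measurements without IxChariot, the sketch below (ours; the host, port and timings are arbitrary) pushes bytes through the full TCP/IP stack for a fixed interval and reports megabits per second, looped back on localhost:

    import socket, threading, time

    BLOCK = 64 * 1024                    # transfer block size, e.g. 64K as in the charts
    DURATION = 3.0                       # seconds of transmit time
    HOST, PORT = "127.0.0.1", 5201       # arbitrary loopback endpoint

    def receiver(listener):
        conn, _ = listener.accept()
        total = 0
        with conn:
            while True:
                data = conn.recv(BLOCK)
                if not data:             # sender closed the connection
                    break
                total += len(data)
        print(f"~{total * 8 / DURATION / 1e6:,.0f} Mbps received")

    listener = socket.socket()
    listener.bind((HOST, PORT))
    listener.listen(1)
    threading.Thread(target=receiver, args=(listener,), daemon=True).start()

    payload = b"\0" * BLOCK
    deadline = time.time() + DURATION
    with socket.create_connection((HOST, PORT)) as tx:
        while time.time() < deadline:
            tx.sendall(payload)          # transmit through the full TCP/IP stack
    time.sleep(0.5)                      # let the receiver drain and print

A real test, like the ones in this section, runs the endpoints on separate machines and also samples CPU utilization.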

Transmit/Receive (TX/RX) Test
The first set of tests combined transmit and receive. For these tests, traffic flowed in both directions on both of the 10GbE ports on the adapters, and different block sizes were tested. In these combined transmit and receive tests, the Broadcom adapter had higher performance.

[Chart: 10GbE Transmit/Receive Test, throughput (Mbps) at 64K, 32K, 16K, 8K and 4K block sizes (BRCM-957810, ELX-OCe11102, QLGC-QLE8242). Callout: 3% higher than Emulex, 19% higher than QLogic at 64K.]

Receive (RX) Test
For the receive-only test, traffic flows in one direction only. Because the throughput values were quite close, the chart scale is adjusted to highlight the differences.

[Chart: 10GbE Receive Test, throughput (Mbps) at 64K, 32K, 16K, 8K and 4K block sizes (BRCM-957810, ELX-OCe11102, QLGC-QLE8242).]

Transmit (TX) Test
For the transmit-only test, traffic flows in one direction only. Because the throughput values were very close, the chart scale is adjusted to highlight the differences. The results were almost identical at the larger block sizes, but the Broadcom adapter had a performance advantage at the 4K block size.

[Chart: 10GbE Transmit Test, throughput (Mbps) at 64K, 32K, 16K, 8K and 4K block sizes (BRCM-957810, ELX-OCe11102, QLGC-QLE8242).]

6 IP Networking Performance: Small Packet Performance

For the small packet performance tests, we used the IXIA IxAutomate application test tool to configure and automate test scenarios and to analyze results for back-to-back, one-to-many and many-to-one throughput. Using the capabilities of Ixia test hardware, such as wire-speed traffic generation, filtering, capturing and statistics collection, we ran IxAutomate's suite of pre-built tests based on industry-standard RFC requirements. These tests route packets between the test hardware and the adapter installed in the host server; they focus on small packets and exercise the adapter, its driver and the IP portion of the TCP/IP stack.

Single-threaded Packet Routing
For these tests, the Broadcom adapter was able to handle more frames per second than the Emulex and QLogic adapters. These results are for both active ports.

[Chart: Single-threaded Packet Routing, Frames per Second, dual port, at 64-, 128-, 256- and 512-byte frame sizes (BRCM-957810, ELX-OCe11102, QLGC-QLE8242). Callout: 24% higher than Emulex, 1% higher than QLogic at 64 bytes.]

[Chart: Single-threaded Packet Routing, Percent of dual-port 10GbE Line Rate, at 64-, 128-, 256- and 512-byte frame sizes (BRCM-957810, ELX-OCe11102, QLGC-QLE8242).]
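The theoretical ceiling for these small-packet tests can be computed directly (our calculation, not from the report): on the wire, every Ethernet frame also carries a 7-byte preamble, a 1-byte start-of-frame delimiter and a 12-byte minimum inter-frame gap.

    LINE_RATE_BPS = 10_000_000_000       # one 10GbE port
    OVERHEAD_BYTES = 7 + 1 + 12          # preamble + SFD + inter-frame gap

    for frame in (64, 128, 256, 512):
        fps = LINE_RATE_BPS / ((frame + OVERHEAD_BYTES) * 8)
        print(f"{frame:>4}-byte frames: {fps:,.0f} frames/sec per port")
    # 64-byte frames top out at ~14.88 million frames/sec per port, so even
    # several million frames/sec remains a fraction of dual-port line rate,
    # as the percent-of-line-rate charts illustrate.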

Multi-threaded Packet Routing
For these tests, the Broadcom adapter was able to handle more frames per second than the Emulex and QLogic adapters. These results are for both active ports.

[Chart: Multi-threaded Packet Routing, Frames per Second, at 64-, 128-, 256- and 512-byte frame sizes (BRCM-957810, ELX-OCe11102, QLGC-QLE8242). Callout: 264% higher than Emulex, 2% higher than QLogic at 64 bytes.]

[Chart: Multi-threaded Packet Routing, Percent of 10GbE Line Rate, at 64-, 128-, 256- and 512-byte frame sizes (BRCM-957810, ELX-OCe11102, QLGC-QLE8242).]

7 IP Networking Performance: Transmit and Receive Tests - Windows

We also conducted transmit and receive tests on Windows, and found that the Broadcom adapter outperformed the other adapters in almost every case.

[Chart: 10GbE TX/RX Test - Windows - L2, throughput (Mbps) at 64K, 32K, 16K, 8K, 4K, 2K and 1K block sizes (BRCM-957810, ELX-OCe11102, QLGC-QLE8242).]

TCP/IP Offload Engine (TOE) support in Microsoft Windows operating systems allows TCP/IP stack processing to be offloaded to a capable adapter. The QLogic card does not support TOE mode, so it is not represented on the chart below.

[Chart: 10GbE TX/RX Test - Windows - TOE, throughput (Mbps) at 64K, 32K, 16K, 8K, 4K, 2K and 1K block sizes (BRCM-957810, ELX-OCe11102).]
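As a practical aside (our addition, not from the report): on Windows Server 2008 R2, the mechanism that hands TCP connections to a TOE-capable adapter is TCP Chimney Offload, which can be inspected and enabled from an elevated command prompt:

    netsh int tcp show global
    netsh int tcp set global chimney=enabled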

Summary and Conclusion

For IT managers considering a Converged Network Adapter (CNA) with storage-over-Ethernet capabilities as well as high-performance IP networking, we recommend evaluating the Broadcom BCM957810S adapter. In these tests, we found that the Broadcom adapter provided very strong performance across the board for IP networking and for storage protocol offloads (FCoE and iSCSI), and it would be an excellent choice for 10GbE deployment in enterprise data center and cloud environments.

Appendix - Evaluation Environment

Host Server
- Dell PowerEdge R720
- Intel Xeon E5-2690, 2.9 GHz, 8 cores, 16 logical processors
- 32GB RAM
- Windows Server 2008 R2 SP1, version 6.1.7601

Adapters
- Broadcom BCM957810S, driver version 7.2.8.0
- Emulex OCe11102, FCoE driver version 7.2.5.7, iSCSI driver version 4.0.317.0
- Intel X520, driver version 2.9.71.0
- QLogic QLE8242, FCoE driver version 9.1.9.39, iSCSI driver version 2.1.5.27

Storage Targets
- FCoE: SANBlaze DRAM targets, 32 total targets connected to a Cisco Nexus 5020 switch
- iSCSI: SANBlaze DRAM targets, 24 total targets connected to an HP ProCurve 10 Gbps switch

IOmeter Parameters
- FCoE: 16 workers, 2 physical drives per worker, queue depths 1 through 64
- iSCSI: 12 workers, 2 physical drives per worker, queue depths 1 through 64
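A quick check of the I/O concurrency these IOmeter parameters imply (our arithmetic): outstanding I/Os = workers x drives per worker x queue depth.

    def outstanding_ios(workers: int, drives_per_worker: int, queue_depth: int) -> int:
        return workers * drives_per_worker * queue_depth

    print(outstanding_ios(16, 2, 64))   # FCoE at QD 64: 2,048 I/Os in flight
    print(outstanding_ios(12, 2, 64))   # iSCSI at QD 64: 1,536 I/Os in flight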

The original version of this document is available at http://www.demartek.com/Demartek_Broadcom_BCM957810_FCoE-iSCSI_Adapter_Evaluation_2012-06.html.

Broadcom, the Connecting everything logo and NetXtreme are trademarks or registered trademarks of Broadcom Corporation and/or its affiliates in the United States and certain other countries. Emulex is a registered trademark and OneConnect is a trademark of Emulex Corporation. Intel and Xeon are registered trademarks of Intel Corporation. QLogic and the QLogic logo are registered trademarks of QLogic Corporation. Demartek is a registered trademark of Demartek, LLC. All other trademarks are the property of their respective owners.

© 2012 Demartek | www.demartek.com | Email: info@demartek.com