FCoE Deployment in a Virtualized Data Center
Satheesh Nanniyur (satheesh.nanniyur@qlogic.com)
Sr. Staff Product Marketing Manager, QLogic Corporation
All opinions expressed in this presentation are those of the author. They do not indicate any product plans or endorsements from his employer.
February 2010
Agenda
- Trends in Virtualized Data Centers
- FCoE CNA Compatibility with FC
- IO Virtualization with FCoE CNAs
- FCoE Phased Deployment
- Software FCoE vs. FCoE CNA Deployment
- Mobility and QoS with FCoE CNA
- Brief Case Study of FCoE Deployment
Drivers for Virtualization
- IDC survey suggests virtualization is driven by 4 key factors:
  - Server consolidation
  - Business continuity
  - Test and development
  - Resource pooling and sharing, leading to dynamic provisioning
- Industry experiencing rapid growth of virtualization (WW Software Revenue Forecast)
Source: IDC Virtualization Tracker Survey, 2008
Virtualized Data Center Trends
- Multi-core CPUs allow bigger and multiple workloads, increasing server consolidation
- Virtual Machine IO performance nears Physical Machine performance, requiring more bandwidth per server (IO virtualization, hardware assists, offloads)
- Reduced hardware component count lowers power and cooling cost (nearly 60% of data center cost comes from power and cooling)
- Simplified management and dynamic provisioning
FCoE Overview
- It is just Fibre Channel encapsulated in Ethernet; FC frames are not modified en route
- FCoE goal: enable server I/O consolidation and network convergence
- FC Frame:    FC Header | Payload
- FCoE Packet: Ethernet Header | FCoE Header (SOF) | FC Header | Payload | CRC (EOF)
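The encapsulation above can be sketched in a few lines of Python. This is an illustrative model of the FC-BB-5 framing, not driver code: the 13-byte version/reserved field, the SOF/EOF code points, and the helper name are assumptions for demonstration; the FCoE EtherType 0x8906 is the IEEE-assigned value. Note the FC frame itself passes through untouched, which is the whole point of the slide.

```python
import struct

FCOE_ETHERTYPE = 0x8906  # IEEE-assigned EtherType for FCoE

def encapsulate_fcoe(dst_mac, src_mac, fc_frame, sof=0x2E, eof=0x41):
    """Wrap an unmodified FC frame (FC header + payload + FC CRC) in an
    FCoE packet: Ethernet header, FCoE header carrying the SOF code,
    the FC frame as-is, then an EOF trailer.  The sof/eof values shown
    are illustrative code points; the Ethernet FCS is added by the NIC."""
    eth_hdr = struct.pack("!6s6sH", dst_mac, src_mac, FCOE_ETHERTYPE)
    # FCoE header: 4-bit version + reserved bits (13 bytes), then 1-byte SOF
    fcoe_hdr = bytes(13) + struct.pack("!B", sof)
    # Trailer: 1-byte EOF + 3 reserved bytes
    trailer = struct.pack("!B", eof) + bytes(3)
    return eth_hdr + fcoe_hdr + fc_frame + trailer
```

Because the FC frame bytes are concatenated verbatim, a receiving FCoE switch or CNA can strip the Ethernet wrapper and forward a native FC frame without recomputing anything inside it.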
FCoE CNA Software Compatibility

FC and NIC driver model:
  Operating System -> Network Stack -> NIC Driver
  Operating System -> Storage Stack -> FC Driver

FCoE and NIC driver model:
  Operating System -> Network Stack -> NIC Driver
  Operating System -> Storage Stack -> FC/FCoE Driver

- Reliability from field-proven driver stack
- 100% FC driver application transparency
- Investment protection and smooth transition
Operating System View of FCoE CNA
(Screenshot: the OS presents the CNA as two separate devices, Storage and Ethernet)
FCoE CNA Management Compatibility
- Deploy common SAN practices such as FC zoning, LUN masking, etc.
- Transparent Layer 2 FCoE-to-FC address mapping
- Reuse proven SAN & LAN management practices
- Reuse familiar SAN & LAN management tools
- Build failover solutions using multi-pathing software
- Reuse in-house SAN & LAN admin expertise
- Same driver interface as FC
- Troubleshoot LAN/SAN issues
- Unique MAC address and EtherType for FCoE data traffic
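The "transparent Layer 2 FCoE-to-FC address mapping" bullet can be made concrete. In FC-BB-5, a Fabric-Provided MAC Address (FPMA) is formed by prefixing the fabric's 24-bit FC-MAP (default 0x0EFC00) to the port's 24-bit FC_ID, so the FC address is recoverable directly from the Ethernet MAC. The function names below are illustrative; the FC-MAP default and the FPMA layout are from the standard.

```python
DEFAULT_FC_MAP = 0x0EFC00  # default FC-MAP prefix defined by FC-BB-5

def fcid_to_fpma(fc_id, fc_map=DEFAULT_FC_MAP):
    """Build the Fabric-Provided MAC Address for an FCoE port:
    upper 24 bits = FC-MAP, lower 24 bits = FC_ID."""
    mac = (fc_map << 24) | (fc_id & 0xFFFFFF)
    return ":".join(f"{(mac >> s) & 0xFF:02x}" for s in range(40, -1, -8))

def fpma_to_fcid(mac_str):
    """Recover the 24-bit FC_ID from an FPMA MAC string — this is why
    the mapping is 'transparent': no lookup table is needed."""
    octets = [int(o, 16) for o in mac_str.split(":")]
    return (octets[3] << 16) | (octets[4] << 8) | octets[5]
```

This is also what makes LAN-side troubleshooting of SAN traffic practical: a packet capture filtered on the FC-MAP prefix or the FCoE EtherType isolates storage frames immediately.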
IO Virtualization with FCoE CNA
(Diagram: I/O virtualization models side by side — SW-based IOV, where VMs reach the CNA through a Virtual Intermediary (VI); IOV NIC assist; and SR-IOV, where per-VM VF drivers bind directly to Virtual Functions VF0/VF1 exposed over Physical Functions PF0/PF1 on the CNA)
FCoE Phased Deployment
- TODAY: Dual adapters and networks for LAN and SAN
- PHASE 1: CNA reduces power, enables efficient use of real estate in blades
- PHASE 2: Converged Network consolidates resources while coexisting with current networks
- PHASE 3: Simplified data center with server and network consolidation reduces IT cost
- NO NEED for infrastructure overhaul; deploy in phases
- NO NEED for rip and replace; use existing LAN & SAN
- NO NEED for two convergence technologies; one network fits ALL
Software FCoE vs. FCoE CNA Deployment

Software FCoE stack:
  Management -> Virtual SCSI Layer -> Block Devices (File System, LVM) -> SCSI Layer -> FCoE Layer -> Network Device Layer -> Ethernet NIC driver -> Ethernet NIC

FCoE CNA stack:
  Management -> Virtual SCSI Layer -> Block Devices (File System, LVM) -> SCSI Layer -> FC/FCoE Adapter Driver -> Converged Network Adapter

Converged Network Adapter:
- Provides superior IO performance
- Increases server consolidation; frees up CPU from protocol processing
- Provides a field-tested solution across operating systems and platforms
Why not Software FCoE?
Performance disclaimer from an FCoE Initiator Functional Specification*: "This is a software FCoE solution; there are code paths which are CPU intensive (e.g. FC-CRC calculation). Depending on the CPU power, when I/O traffic through the FCoE ports is heavy, CPU utilization could go very high."
* Source: Solaris Open Storage FCoE Initiator Functional Specification, v1.00, Jul 30, 2008
QoS Configuration with FCoE CNA
- SAN and LAN bandwidth allocation on the 10Gb CEE link using ETS (e.g. across vWWN1/NPIV1, vWWN2/NPIV2)
- The Enhanced Transmission Selection (ETS) protocol allows bandwidth-percentage allocation across FCoE SAN and networking traffic
- Further, FC-level QoS can be configured within the bandwidth allocated to FC
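A minimal sketch of what an ETS allocation means in practice, assuming an illustrative 60/40 SAN/LAN split (the slide does not specify one): each traffic class gets a minimum guaranteed share of the 10Gb link, expressed as a percentage.

```python
LINK_GBPS = 10  # 10Gb CEE link from the slide

def ets_allocation(percentages, link_gbps=LINK_GBPS):
    """Map ETS bandwidth percentages per traffic class to minimum
    guaranteed rates in Gb/s.  ETS shares are minimums, not caps:
    a class may borrow bandwidth when other classes are idle."""
    assert sum(percentages.values()) == 100, "ETS shares must total 100%"
    return {tc: link_gbps * pct / 100 for tc, pct in percentages.items()}

# Hypothetical split between the FCoE SAN and LAN traffic classes
shares = ets_allocation({"FCoE SAN": 60, "LAN": 40})
```

The slide's second point then applies inside the SAN share: FC-level QoS subdivides the FCoE class's guaranteed bandwidth among virtual ports.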
Mobility with FCoE CNA
(Diagram: VM-1/VM-2 on one host with NPIV1/NPIV2, VM-1/VM-3 on another with NPIV1/NPIV3; each virtual port keeps its vWWN/NPIV identity and vMAC through the Virtual SCSI Layer and FC/FCoE Adapter Driver, down to the CNA and the FCoE fabric)
- FCoE supports N_Port ID Virtualization (NPIV) similar to FC
- FC zoning configuration migrates with NPIV
- All other virtual port properties migrate to the new host along with NPIV
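The migration claim can be illustrated with a small model, assuming hypothetical class and variable names: because fabric zoning is keyed to the virtual port's WWN rather than to the physical host, re-creating the NPIV port on the destination host leaves the zoning untouched.

```python
class VirtualPort:
    """A VM's NPIV identity: virtual WWN plus virtual MAC."""
    def __init__(self, vwwn, vmac):
        self.vwwn, self.vmac = vwwn, vmac

class Host:
    def __init__(self, name):
        self.name, self.vports = name, {}

    def attach(self, vm, vport):
        self.vports[vm] = vport      # NPIV login with the same vWWN

    def detach(self, vm):
        return self.vports.pop(vm)   # logout on the source host

def migrate(vm, src, dst):
    """Move a VM: its vWWN/vMAC identity travels with it, so zoning
    entries keyed to the vWWN require no fabric reconfiguration."""
    dst.attach(vm, src.detach(vm))

zoning = {"vwwn1": ["storage-lun-0"]}   # zone keyed to vWWN, not to a host
host_a, host_b = Host("A"), Host("B")
host_a.attach("VM-1", VirtualPort("vwwn1", "vmac1"))
migrate("VM-1", host_a, host_b)
```

After `migrate`, VM-1's port on host B presents the same vWWN, so the `zoning` table still grants it the same LUN access.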
Case Study: Before CNA Deployment
26 servers; access layer with Cisco Catalyst and MDS switches

             Ethernet   FC    Total
  Adapters      26      26      52
  Switches       2       2       4
  Cables        56      56     112
  Uplinks        4       4       8
Case Study: After CNA Deployment
26 servers; access layer with Cisco Nexus 5020 + expansion modules

               CNA    Total   Savings
  Adapters      26      26      50%
  Switches       2       2      50%
  Cables        60      60      46%
  Uplinks        8       8       0%
Cost Savings from FCoE CNA Deployment (per server)

  Component                 FC and Ethernet    FCoE
  Cable                          $300          $200
  FC HBA (2-port)               $1200            -
  1GbE NIC (2-port)              $800            -
  10GbE FCoE CNA (2-port)          -           $1800
  FC switch                     $2400            -
  1GbE switch                   $2800            -
  FCoE switch                      -           $4000
  Total (per server)            $7500          $6000
  Total (26 servers)          $195,000       $156,000   (20% savings)
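The savings figures above check out arithmetically; the short script below recomputes them from the per-component prices (variable names are my own):

```python
# Per-server component costs from the case study table
before = {"cable": 300, "fc_hba": 1200, "gbe_nic": 800,
          "fc_switch": 2400, "gbe_switch": 2800}
after = {"cable": 200, "fcoe_cna": 1800, "fcoe_switch": 4000}

per_server_before = sum(before.values())    # dual-network build
per_server_after = sum(after.values())      # converged FCoE build

SERVERS = 26
fleet_before = SERVERS * per_server_before
fleet_after = SERVERS * per_server_after
savings_pct = 100 * (fleet_before - fleet_after) / fleet_before
```

Note the single FCoE switch port ($4000) costs less than the FC + 1GbE switch ports it replaces ($5200); most of the remaining savings come from consolidating two adapters into one CNA.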
What to Look for When You Buy an FCoE CNA
- Full FCoE protocol offload
- Full 10GbE NIC functionality with stateless offloads
- Certified across operating systems and platforms
- Single-ASIC solution with low power dissipation
- Leverages years of FC expertise and field-proven drivers and management apps
- End-to-end ecosystem support
Q&A