StorMagic SvSAN on VMware vSphere 6.0 with Cisco UCS Mini


Deployment Guide: StorMagic SvSAN on VMware vSphere 6.0 with Cisco UCS Mini, October 2015

Introduction

Virtualization in today's compute infrastructure has created a distinct and prominent need for storage that is accessible to multiple servers simultaneously. Concurrently shared storage enables capabilities such as vMotion, High Availability, and replication, which provide stability and application availability. In large data centers this need is typically satisfied through large-scale SAN devices. These solutions frequently leave smaller deployment models, such as Remote Office/Branch Office (ROBO), unaddressed, because SAN networking is costly to extend and suffers from the latency involved. This deployment guide provides an integrated infrastructure solution that enables shared storage at the remote data center, using capacity attached directly to the compute layer, for a self-contained application delivery platform. StorMagic SvSAN combined with Cisco UCS Mini provides compute, networking, and storage in even the most remote location. Combined in a proven and validated architecture, compute and storage supporting hundreds of virtual desktops as well as the necessary infrastructure VMs can be reliably deployed. This joint solution allows the edge enterprise to deploy a data storage infrastructure that suits its multi-site nature. This document provides a reference architecture and deployment guide for StorMagic SvSAN on VMware vSphere 6.0 running in a Cisco UCS Mini environment.
Table of Contents

Introduction
Technology Overview
  Cisco UCS C-Series Rack Servers
  Cisco UCS Manager
  Cisco UCS 6324UP Fabric Interconnect
  Cisco UCS B200 M4 Blade Servers
  Cisco VIC 1340
  VMware vSphere 6.0
  StorMagic SvSAN
StorMagic SvSAN with Cisco UCS Mini Architecture
  Compute
  Networking
  Storage
Deployment of StorMagic SvSAN for VMware vSphere on Cisco UCS Environment
  Cisco UCS Configuration
    Configuring the Cisco UCS Mini Fabric Interconnects
    Configure UCS Manager
    Synchronize Cisco UCS to NTP
    Enable QoS in Cisco UCS Fabric
    Create QoS Policies
    Enable Uplink Ports
    Create VLANs
    Create Local Disk Configuration Policy
    Create Boot Policy
    Create BIOS Policy
    Configure Server Pool and Qualifying Policy
    Create UUID Suffix Pools
    Create MAC Pool
    Create an IP Pool for KVM Access
    Create vNIC Templates and LAN Connectivity Policies
    Create a Service Profile Template
    Create Service Profiles from Templates
    Create Virtual Drives
  Install VMware ESXi 6.0
  Prepare for SvSAN Deployment
  Configure Host Networking
  vCenter Server
  Installing the SvSAN Software on a vCenter Server
  Installing StorMagic PowerShell Toolkit
  Deploy SvSAN VSAs
Recommended SvSAN Mirror Configurations
  Overview
  Mirrored Targets
  Cisco UCS Blade Server Expansion and SvSAN Target Migration
  Datastore Spanning Using VMware Extents
Creating a Shared Datastore with Mirrored Storage
Manage VSAs
Configure Jumbo Frames on SvSAN Network Interfaces
Manage Shared Datastores

Cisco and/or its affiliates. All rights reserved.

Technology Overview

The following hardware and software components are used for this deployment guide:
- Cisco Unified Computing System (UCS) Mini
- Cisco UCS Manager 3.0(2c)
- Cisco UCS B200 M4 Server
- Cisco VIC 1340 Adapter
- VMware vSphere 6.0.0b
- StorMagic SvSAN 5.2

Cisco UCS Mini

Cisco UCS, originally designed for the data center, is now optimized for branch and remote offices, point-of-sale, and smaller IT environments with Cisco UCS Mini. UCS Mini is for customers who need fewer servers but still want the robust management capabilities provided by UCS Manager. This solution delivers servers, storage, and 10 Gigabit networking in an easy-to-deploy, compact 6-rack-unit form factor. Cisco UCS Mini provides a total computing solution with the proven management simplicity of the award-winning Cisco UCS Manager.

Figure 1. Cisco UCS Mini (Cisco UCS 6324 Fabric Interconnects plug into the back of the Cisco UCS 5100 Series Blade Chassis, with 10 Gigabit Ethernet uplinks to Cisco Nexus Series switches on the LAN)

Cisco UCS Manager 3.0

Cisco Unified Computing System (UCS) Manager provides unified, embedded management of all software and hardware components of the Cisco UCS through a choice of an intuitive GUI, a command-line interface (CLI), a Microsoft PowerShell module, or an XML API. Cisco UCS Manager provides a unified management domain with centralized management capabilities and controls multiple chassis and thousands of virtual machines.

The Cisco UCS 6324 Fabric Interconnect hosts and runs Cisco UCS Manager in a highly available configuration, enabling the fabric interconnects to fully manage all Cisco UCS elements. The Cisco UCS 6324 Fabric Interconnects support out-of-band management through dedicated 10/100/1000-Mbps Ethernet management ports. Cisco UCS Manager is typically deployed in a clustered active-passive configuration with two Cisco UCS 6324 Fabric Interconnects connected through the cluster interconnect built into the chassis.

Cisco UCS Manager 3.0 supports the 6324 Fabric Interconnect, which integrates the FI into the UCS chassis and provides an integrated solution for smaller deployment environments. Cisco UCS Mini simplifies system management and reduces cost for smaller-scale deployments. The hardware and software components support Cisco unified fabric, which runs multiple types of data center traffic over a single converged network adapter.

Cisco UCS 6324UP Fabric Interconnect

The Cisco UCS 6324 Fabric Interconnect provides management, LAN, and storage connectivity for the Cisco UCS 5108 Blade Server Chassis and direct-connect rack-mount servers. It provides the same full-featured Cisco UCS management capabilities and XML API as the full-scale Cisco UCS solution, in addition to integrating with Cisco UCS Central Software and Cisco UCS Director. From a networking perspective, the Cisco UCS 6324 Fabric Interconnect uses a cut-through architecture supporting deterministic, low-latency, line-rate 10 Gigabit Ethernet on all ports, with a switching capacity of up to 500 Gbps independent of packet size and enabled services. Sixteen 10-Gbps links connect to the servers, providing a 20-Gbps link from each Cisco UCS 6324 Fabric Interconnect to each server. The product family supports Cisco low-latency, lossless 10 Gigabit Ethernet unified network fabric capabilities that increase the reliability, efficiency, and scalability of Ethernet networks.
The Fabric Interconnect supports multiple traffic classes over a lossless Ethernet fabric from the blade through the interconnect. Significant TCO savings come from a Fibre Channel over Ethernet (FCoE)-optimized server design in which network interface cards (NICs), host bus adapters (HBAs), cables, and switches can be consolidated. The Cisco UCS 6324 Fabric Interconnect is a 10 Gigabit Ethernet, FCoE, and Fibre Channel switch offering up to 500 Gbps throughput, up to four unified ports, and one scalability port.

Cisco UCS B200 M4 Blade Servers

The enterprise-class Cisco UCS B200 M4 blade server extends the capabilities of Cisco's Unified Computing System portfolio in a half-width blade form factor. The Cisco UCS B200 M4 uses the power of the latest Intel Xeon E5-2600 v3 Series processor family CPUs, with up to 1536 GB of RAM (using 64 GB DIMMs), two solid-state drives (SSDs) or hard disk drives (HDDs), and up to 80 Gbps throughput connectivity. The UCS B200 M4 Blade Server mounts in a Cisco UCS 5100 Series blade server chassis or UCS Mini blade server chassis. It has 24 total slots for registered ECC DIMMs (RDIMMs) or load-reduced DIMMs (LR-DIMMs), for up to 1536 GB total memory capacity (B200 M4 configured with two CPUs using 64 GB DIMMs). It supports one connector for Cisco's VIC 1340 or 1240 adapter, which provides Ethernet and Fibre Channel over Ethernet (FCoE).

Cisco VIC 1340

The Cisco UCS Virtual Interface Card (VIC) 1340 is a 2-port 40-Gbps Ethernet or dual 4 x 10-Gbps Ethernet, Fibre Channel over Ethernet (FCoE)-capable modular LAN on motherboard (mLOM) designed exclusively for the M4 generation of Cisco UCS B-Series Blade Servers. When used in combination with an optional port expander, the Cisco UCS VIC 1340 is enabled for two ports of 40-Gbps Ethernet.
The Cisco UCS VIC 1340 enables a policy-based, stateless, agile server infrastructure that can present over 256 PCIe standards-compliant interfaces to the host, which can be dynamically configured as either network interface cards (NICs) or host bus adapters (HBAs). In addition, the Cisco UCS VIC 1340 supports Cisco Data Center Virtual Machine Fabric Extender (VM-FEX) technology, which extends the Cisco UCS fabric interconnect ports to virtual machines, simplifying server virtualization deployment and management.

VMware vSphere 6.0

VMware vSphere 6.0, the industry-leading virtualization platform, empowers users to virtualize any application with confidence, redefines availability, and simplifies the virtual data center. The result is a highly available, resilient, on-demand infrastructure that is the ideal foundation for any cloud environment. This release contains many new features and enhancements, several of which are industry firsts.

StorMagic SvSAN

StorMagic SvSAN is a software storage solution that enables enterprises to eliminate downtime of business-critical applications at remote sites, where disruption directly equates to a loss in service and revenue. StorMagic SvSAN ensures high availability through a virtualized shared storage platform, so that these business-critical applications remain operational. This is achieved by leveraging direct-attached or internal, cost-effective server storage, including solid-state disks, and presenting it as a virtual SAN. StorMagic SvSAN supports the industry-leading hypervisors, VMware vSphere and Microsoft Hyper-V. It is installed as a Virtual Storage Appliance (VSA) requiring minimal server resources to provide the shared storage necessary to enable advanced hypervisor features such as High Availability/Failover Clustering, vMotion/Live Migration, and Distributed Resource Scheduler (DRS)/Dynamic Optimization. StorMagic SvSAN can be deployed as a simple 2-node cluster; however, the flexibility of the architecture enables the virtual infrastructure performance and/or capacity to be scaled to meet changing business needs without impacting service availability. This is achieved by adding capacity to existing servers or by growing the StorMagic SvSAN cluster.

StorMagic SvSAN with Cisco UCS Mini Architecture

The architecture for this deployment guide is simple and provides a highly available environment using a combination of features from StorMagic SvSAN, VMware vSphere, and Cisco UCS Mini, utilizing the local storage available on the B200 M4 blade servers.

Compute

A UCS Mini chassis with four B200 M4 blades is used for this deployment guide. Each B200 M4 blade server has Intel Xeon E5-2600 v3 CPUs, 128 GB RAM, 2 x 32 GB FlexFlash SD cards, 2 x 1.2 TB 10K rpm SAS drives, and a VIC 1340.

Figure 2. Physical Connectivity Diagram (UCS Mini chassis with two Cisco UCS 6324 Fabric Interconnects; 1GE management network connections, 10GE infrastructure uplinks to the LAN, 1 x 40G QSFP+ licensed server port, and 4 x 10G SFP+ unified ports supporting Ethernet/FC/FCoE)

Networking

The management ports on both Cisco UCS 6324 Fabric Interconnects housed inside the UCS Mini chassis are connected to the management network, and at least one 1G or 10G port from each side of the fabric is used as an uplink port to connect to the LAN through a pair of upstream switches, as shown in Figure 2 above. In this example we created five vNICs in the service profile associated with the B200 M4 blade servers; these are used to configure the networking as explained below.

SvSAN uses three different logical interface traffic types:
- Management: used for accessing the web GUI, plug-in, or CLI. At least one must be present.
- iSCSI: listens for incoming connection requests from iSCSI initiators. At least one must be present.
- Mirror: VSA nodes communicate data and metadata associated with mirrored volumes.
StorMagic recommends four or more physical NICs between hosts and SvSAN Virtual Storage Appliances (VSAs) to provide the recommended level of redundancy and load balancing.
- vSwitch0 connects directly to the customer's production network, and it is presumed this is where production VMs will reside. The VSA also has an interface on vSwitch0, for management, administration, and monitoring purposes only.
- vSwitch1 is an isolated network, either by VLAN or IP. This network should not be used for any traffic apart from VSA iSCSI and mirror traffic, to prevent contention and provide the best storage throughput possible.
- vSwitch2 is an isolated network, either by VLAN or IP. Like vSwitch1, it should carry only VSA iSCSI and mirror traffic.
- vSwitch3 is an isolated network dedicated to vMotion traffic.

Figure 3. Logical Network Connectivity Diagram (two hypervisors, each hosting VMs and a VSA, with SvSAN synchronous mirroring between the VSAs over the iSCSI/mirror networks, plus vMotion and production networks)

Storage

In this deployment guide, each blade has 2 x 1.2 TB 6-Gbps SAS disks and 2 x 32 GB Cisco FlexFlash SD cards. The 2 x 1.2 TB SAS disks are configured as a RAID0 stripe and presented as two virtual drives (VDs). One VD is fixed at 2 TB and the remaining capacity is configured as the second VD.

Figure 4. Disk Drives in Cisco UCS B200 M4 Server

Figure 5. RAID0 Group with Two Virtual Drives in Cisco UCS B200 M4 Server

The 2 x 32 GB Cisco FlexFlash SD cards are configured for Mirror (RAID1) in the UCS service profile's local disk configuration policy. The ESXi OS is installed on the Cisco FlexFlash SD cards. The VSA deployment utilizes the 185 GB virtual drive, which must be configured as persistent storage on the host. The second virtual drive is allocated explicitly to the VSA as a Raw Device Mapping. The VSA sees the virtual disk as a pool of storage from which iSCSI target(s) can be carved. These iSCSI target(s) can be mirrored with other VSAs to create highly available storage.

For this deployment guide we created four mirrored volumes, two between each pair of SvSAN VSAs, as shown in Figure 6. All the iSCSI mirrored volumes are presented to all the hosts, but mirroring occurs only between a pair of SvSAN VSAs.

A mirrored target is one that is mirrored between a pair of VSAs. This means that there are two identical copies of the target data when running normally, one on each VSA. Any data sent to one plex is automatically copied to the other. This is done synchronously, meaning that if an initiator sends data to one plex, the VSA does not acknowledge the request until the data is present on both plexes. A synchronous mirror can be used to provide highly available storage, in that an initiator can access any plex at any time; if one plex fails, the initiator can continue without interruption by using the other plex.

Figure 6. StorMagic SvSAN Mirror Target with Four Cisco UCS B200 M4 Servers (each server boots from a 2 x 32 GB SD card hypervisor-partition mirror; its 2 x 1.2 TB RAID0 group is split into a 185 GB VD for the VSA and a 2 TB VD backing the mirrored shared VMFS datastores)

Table 1 lists the hardware and software components used for this deployment guide.

Table 1. Hardware and Software Components

Component        Description
Cisco UCS        1 x Cisco UCS Mini Chassis
                 2 x Cisco UCS 6324 Fabric Interconnects (UCSM 3.0(2c))
                 4 x Cisco UCS B200 M4 Blade Servers
                 Per blade server: 2 x Intel Xeon E5-2600 v3 CPUs; 8 x 16 GB DDR4-2133-MHz RDIMM (dual rank/x4/1.2v); 2 x 1.2 TB SAS 10K RPM SFF HDD; 1 x UCSB-MRAID12G; 1 x Cisco UCS VIC 1340 (UCSB-MLOM-40G-03); 2 x 32 GB Cisco Flexible Flash cards (UCS-SD-32G-S)
VMware vSphere   VMware vCenter 6.0.0b; VMware ESXi 6.0.0b
StorMagic        SvSAN 5.2

Deployment of StorMagic SvSAN for VMware vSphere on Cisco UCS Environment

This flow chart describes the high-level steps to deploy StorMagic SvSAN for VMware vSphere on a Cisco UCS environment.

Figure 7. High-Level Steps for Deploying SvSAN for VMware on Cisco UCS Mini with B200 M4 Servers (UCS Mini cable connectivity; configure Cisco UCS Mini Fabric Interconnects; configure Cisco UCSM QoS, ports/uplinks, VLANs, pools and policies, vNIC templates, service profile templates, and service profiles; create virtual drives; install ESXi 6.0; pre-requisites and host networking; install StorMagic SvSAN on vCenter; install StorMagic PowerShell Toolkit; prepare for SvSAN deployment; deploy SvSAN VSAs; create a shared datastore with mirrored storage; manage SvSAN VSAs; manage shared datastore storage)

Cisco UCS Configuration

The following section provides a high-level procedure to configure the Cisco UCS Mini environment to deploy StorMagic SvSAN for VMware vSphere.

Configuring the Cisco UCS Mini Fabric Interconnects

In this section, perform the initial setup of the Cisco UCS 6324 fabric interconnects in a cluster configuration. Refer to UCSM_GUI_User_Guide_3_0_chapter_0101.html for step-by-step instructions to perform an initial system setup for a cluster configuration.

Configure UCS Manager

Log in to UCS Manager and configure the pools, policies, and service profiles as listed in the sections that follow.

Synchronize Cisco UCS to NTP

In UCS Manager, navigate to Admin > Time Zone Management. Select a time zone and add an NTP server address to synchronize Cisco UCS with an NTP server.

Enable QoS in Cisco UCS Fabric

In UCS Manager, navigate to LAN tab > LAN Cloud > QoS System Class to enable the priorities and enter the MTU size as shown below.

Figure 8. UCS Manager - Enable QoS System Class

Create QoS Policies

Navigate to LAN tab > root > Policies > QoS Policies and create three policies using the reference table below:

Table 2. QoS Policies for Different Traffic

QoS Policy Name    Priority Selection    Other Parameters
StorMagicISCSI     Gold                  Default
StorMagicVM        Platinum              Default
StorMagicVMotion   Bronze                Default
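For administrators who prefer the UCS Manager CLI over the GUI, the NTP server can also be added from an SSH session to the fabric interconnect cluster IP. This is a hedged sketch: the NTP server address (192.0.2.10) is a placeholder, and the scope names follow the standard UCS Manager CLI hierarchy.

```shell
# UCS Manager CLI (SSH session to the cluster IP); 192.0.2.10 is a placeholder NTP server
scope system
scope services
create ntp-server 192.0.2.10
commit-buffer
# verify the NTP server was added
show ntp-server
```

The equivalent GUI workflow is the Admin > Time Zone Management step described above; either method results in the same configuration being committed to both fabric interconnects.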

Figure 9. UCS Manager - Create QoS Policy

Enable Uplink Ports

In UCS Manager, navigate to Equipment tab > Fabric Interconnects > Fabric Interconnect A > Fixed Module > Ethernet Ports and configure the ports connected to the upstream switch as Uplink ports. Repeat the same step on Fabric Interconnect B.

Figure 10. UCS Manager - Enable Uplink Ports

Create VLANs

Navigate to LAN tab > LAN > LAN Cloud and create one VLAN in each of Fabric A (StorMagicISCSI-A VLAN) and Fabric B (StorMagicISCSI-B VLAN) for the iSCSI storage traffic, as shown in Figure 11.

Figure 11. UCS Manager - Create VLANs for iSCSI Storage Traffic

Navigate to LAN tab > LAN > VLANs and create the global VLANs for your environment as shown in Figure 12. For this guide we created two VLANs, one for management and the other for vMotion traffic.

Figure 12. Create Global VLANs for Management and vMotion Traffic
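The same VLANs can be created from the UCS Manager CLI. The sketch below uses hypothetical VLAN IDs (61/62 for the fabric-local iSCSI VLANs, 10/50 for the global management and vMotion VLANs) and a placeholder management VLAN name; substitute the values used in your environment.

```shell
# UCS Manager CLI: fabric-local iSCSI VLANs (IDs 61/62 are placeholders)
scope eth-uplink
scope fabric a
create vlan StorMagicISCSI-A 61
exit
exit
scope fabric b
create vlan StorMagicISCSI-B 62
exit
exit
# global VLANs, visible on both fabrics (names/IDs are placeholders)
create vlan Mgmt 10
exit
create vlan StorMagicVMotion 50
commit-buffer
```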

Create Local Disk Configuration Policy

Navigate to Servers tab > Policies > root > Local Disk Configuration Policy and create a Local Disk Configuration Policy. Select RAID 0 Striped for Mode and enable both the FlexFlash State and the FlexFlash RAID Reporting State, as shown in Figure 13.

Figure 13. Create Local Disk Configuration Policy

Create Boot Policy

Navigate to Servers tab > Policies > root > Boot Policies and create a Boot Policy by adding CD/DVD and SD Card to the boot order, as shown in Figure 14.

Figure 14. Create Boot Policy

Create BIOS Policy

Navigate to Servers tab > Policies > root > BIOS Policies and create a BIOS Policy with the settings shown in Figure 15.

Figure 15. Create BIOS Policy

Configure Server Pool and Qualifying Policy

In this section you create a Server Pool, Server Pool Policy Qualifications, and a Server Pool Policy for auto-population of servers depending on the configuration.

Navigate to Servers tab > Pools > root > Server Pools and right-click to create a Server Pool. Provide a name and click Finish without adding any servers to the pool.

Navigate to Servers tab > Policies > root > Server Pool Policy Qualifications. Right-click to create a Server Pool Policy Qualification with Server PID Qualifications as the criteria, and enter UCSB-B200-M4 for the model, as shown in Figure 16.

Figure 16. Create Server Pool Policy Qualifications

Navigate to Servers tab > Policies > root > Server Pool Policies and right-click to create a Server Pool Policy. Provide a name and select the Target Pool and Qualification created in the previous section from the drop-down lists.

Figure 17. Create Server Pool Policy

Create UUID Suffix Pools

Navigate to Servers tab > Pools > root > UUID Suffix Pools and right-click to create a UUID Suffix Pool, then add a block of UUIDs to it.

Figure 18. Create UUID Suffix Pools

Create MAC Pool

Navigate to LAN tab > Pools > root > MAC Pools and right-click to create a MAC Pool, then add a block of MAC addresses to it.

Figure 19. Create MAC Pools

Create an IP Pool for KVM Access

Navigate to LAN tab > Pools > root > IP Pools > IP Pool ext-mgmt and right-click to create a block of IPv4 addresses for KVM access.

Figure 20. Create IP Pool for KVM Access

Create vNIC Templates and LAN Connectivity Policies

Navigate to Servers tab > Policies > root > vNIC Templates and right-click to create vNIC Templates using the reference table below.

Table 3. vNIC Templates

vNIC Template  Fabric ID  Enable Failover  Template Type  VLANs Allowed     Native VLAN       MTU   MAC Pool      QoS Policy
eth0           A          No               Updating       Mgm               Mgm               1500  StorMagicMAC  StorMagicVM
eth1           B          No               Updating       Mgm               Mgm               1500  StorMagicMAC  StorMagicVM
eth2           A          No               Updating       StorMagicISCSI-A  StorMagicISCSI-A  9000  StorMagicMAC  StorMagicISCSI
eth3           B          No               Updating       StorMagicISCSI-B  StorMagicISCSI-B  9000  StorMagicMAC  StorMagicISCSI
eth4           B          Yes              Updating       StorMagicVMotion  StorMagicVMotion  9000  StorMagicMAC  StorMagicVMotion

17 Figure 21. Create vnic Template Create a LAN Connectivity Policy using the above vnic templates and selecting VMware as the adapter policy for Adapter Performance Profile. Figure 22. Create LAN Connectivity Policy Cisco and/or its affiliates. All rights reserved.

Create a Service Profile Template

Create a Service Profile Template under the Servers tab using all the pools and policies created in the sections above.

Figure 23. Create Service Profile Template

Create Service Profiles from Templates

Create Service Profiles from a Template under the Servers tab. Provide a naming prefix, a starting number, and the number of instances, and select the service profile template created in the step above.

Figure 24. Create Service Profiles from Template

This creates four service profiles and automatically associates them with the blade servers matching the selection criteria chosen in the Server Pool Policy.

Figure 25. Service Profiles Associated to Blade Servers

Create Virtual Drives

Select a Service Profile, click Boot Server, and launch the KVM console. Enter the Cisco FlexStorage Controller BIOS Configuration Utility (Ctrl+R) to configure a RAID 0 group with the two 1.2 TB drives and create two virtual drives. Complete this RAID configuration on all four B200 M4 blade servers, as shown in Figure 26.

Figure 26. Create RAID Group and Virtual Drives

Install VMware ESXi 6.0

The procedure in this section needs to be carried out on all Cisco UCS B200 M4 Blade Servers.

1. Download the Cisco custom image for VMware ESXi 6.0.
2. In UCS Manager, select a blade server and launch a KVM console session. From the KVM console session's Virtual Media tab, select Map CD/DVD and mount the Cisco custom ISO image downloaded in the previous step.
3. Select the CiscoVD Hypervisor partition as shown in Figure 27.

Figure 27. Installation Storage Device Selection

4. Once the installation is complete, reboot the system and configure the management IP address.

Prepare for SvSAN Deployment

1. Log in to the host using the vSphere Client, navigate to Configuration > Storage, and click Add Storage to add persistent storage to the host for the SvSAN VSA deployment.
2. Select Disk/LUN as the Storage Type and create a datastore using the 185 GB LUN as shown in Figure 28.

Figure 28. Add Persistent Storage to Host

3. Upload the patches and device drivers to this newly created datastore.
4. Put the host into Maintenance Mode: vim-cmd hostsvc/maintenance_mode_enter
5. From the ESXi shell, install the latest patch bundle: esxcli software vib update -d /vmfs/volumes/temp/<filename>.zip

Figure 29. ESXi Upgrade

6. Download the Cisco UCS B200 M4 device drivers image ucs-bxxx-drivers.3.0.2a.iso and upgrade the drivers for the NIC, storage, and other components as shown in Figure 30: esxcli software vib install --no-sig-check -d /vmfs/volumes/temp/<filename>.zip

Figure 30. Device Driver Upgrade

7. Create a SATP rule to ensure that ESXi uses the Round Robin path policy with an IO operation limit of 1 for all StorMagic iSCSI volumes: esxcli storage nmp satp rule add --satp VMW_SATP_ALUA --psp VMW_PSP_RR -O iops=1 --vendor="StorMagc" --model="iSCSI Volume"

Figure 31. Create SATP Rule

This ensures even distribution of IO across mirrored volumes and is a best practice for StorMagic on the UCS Mini architecture. The following command can be used to check that the setting has taken effect: esxcli storage nmp device list

Figure 32. Verify Path Policy Setting

The screenshot above shows device MirrorBA. The two fields that identify the changes are:
- Path Selection Policy: VMW_PSP_RR
- Path Selection Policy Device Config: (policy=rr,iops=1)

8. Log in to vCenter and add all the hosts to the managing vCenter Server where the StorMagic software is installed.
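The patch, driver, and multipathing steps above can be scripted per host from the ESXi shell. This is a hedged sketch: the bundle filenames are placeholders, the datastore path follows the guide's /vmfs/volumes/temp example, and the SATP rule is applied to devices claimed after the reboot.

```shell
# place the host in maintenance mode before patching
vim-cmd hostsvc/maintenance_mode_enter

# apply the ESXi patch bundle and the Cisco driver bundle (filenames are placeholders)
esxcli software vib update -d /vmfs/volumes/temp/ESXi600-patch-bundle.zip
esxcli software vib install --no-sig-check -d /vmfs/volumes/temp/ucs-b200m4-driver-bundle.zip

# Round Robin path policy with an IO operation limit of 1 for StorMagic iSCSI volumes
esxcli storage nmp satp rule add --satp VMW_SATP_ALUA --psp VMW_PSP_RR \
    -O iops=1 --vendor="StorMagc" --model="iSCSI Volume"

reboot

# after the host returns, confirm the policy on the mirrored volumes, then exit maintenance mode
esxcli storage nmp device list
vim-cmd hostsvc/maintenance_mode_exit
```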

9. Create a cluster under the Datacenter and add all the hosts on which StorMagic SvSAN will be deployed, as shown in Figure 33.

Figure 33. vCenter Hosts and Clusters

10. Select a host, navigate to Configuration > Storage Adapters, and click Add to add the Software iSCSI Adapter.

Figure 34. Add Software iSCSI Adapter

Note: the following must be in place before VSA deployment:
- SAN connectivity configured for the host iSCSI software adapter
- iSCSI port (TCP port 3260) enabled in the VMware firewall
- iSCSI software initiator enabled on the hosts
- SSH enabled on the hosts to allow VSAs to be deployed

The following network ports are used by StorMagic SvSAN.
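The initiator, firewall, and SSH prerequisites in the note above can also be satisfied from the ESXi shell rather than the vSphere Client; a sketch assuming ESXi 6.0 command forms:

```shell
# enable the software iSCSI initiator
esxcli iscsi software set --enabled=true
# open the iSCSI ruleset (TCP 3260) in the ESXi firewall
esxcli network firewall ruleset set --ruleset-id=iSCSI --enabled=true
# enable and start SSH so the SvSAN plug-in can deploy the VSA
vim-cmd hostsvc/enable_ssh
vim-cmd hostsvc/start_ssh
```

SSH can be stopped again once VSA deployment completes, as noted later in the deployment wizard section.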

Table 4. Network Ports

Component/Service           Protocol/Port
Discovery                   UDP 4174
XMLRPC Server               TCP 8000
Inter-VSA communications    TCP 4174
SMDXMLRPC                   TCP
Web Access                  TCP 80, 443
Management Services         TCP 8990
vSphere Client Plug-in      TCP 80, 443
Misc VMware                 TCP 16961, 16962

Additional port numbers used by SvSAN can be found in the StorMagic documentation.

Configure Host Networking

1. Create standard virtual switches on all the hosts as follows:
- vSwitch0: VMkernel port: Management Network; virtual machine port group: production virtual machines and SvSAN management
- vSwitch1: VMkernel port: iSCSI traffic; virtual machine port group: SvSAN iSCSI and mirror traffic
- vSwitch2: VMkernel port: iSCSI traffic; virtual machine port group: SvSAN iSCSI and mirror traffic
- vSwitch3: VMkernel port: vMotion

2. Edit all the vSwitches and VMkernel ports used for iSCSI, mirror, and vMotion traffic and set the MTU size to 9000, as shown in Figures 35 and 36.

Figure 35. MTU Settings for vSwitch

Figure 36. MTU Settings for VMkernel Port

Figure 37 shows all the vSwitches and their VMkernel and virtual machine port groups created on all the hosts.

Figure 37. Networking Configuration
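The vSwitch layout and MTU changes above can also be applied from the ESXi shell. A hedged sketch for one iSCSI vSwitch; the uplink name (vmnic2), port group name (iSCSI1), and IP addresses are assumptions for illustration only:

```shell
# create the iSCSI vSwitch, attach an uplink, and raise the switch MTU to 9000
esxcli network vswitch standard add --vswitch-name=vSwitch1
esxcli network vswitch standard uplink add --vswitch-name=vSwitch1 --uplink-name=vmnic2
esxcli network vswitch standard set --vswitch-name=vSwitch1 --mtu=9000

# VMkernel port for host iSCSI traffic, also at MTU 9000
esxcli network vswitch standard portgroup add --portgroup-name=iSCSI1 --vswitch-name=vSwitch1
esxcli network ip interface add --interface-name=vmk1 --portgroup-name=iSCSI1 --mtu=9000
esxcli network ip interface ipv4 set --interface-name=vmk1 --ipv4=10.10.1.11 \
    --netmask=255.255.255.0 --type=static

# verify jumbo frames end to end: 8972 = 9000 minus 28 bytes of IP/ICMP headers
vmkping -d -s 8972 10.10.1.12
```

The vmkping check with the don't-fragment flag is a quick way to confirm that every hop on the iSCSI/mirror path, including the upstream switches, actually passes 9000-byte frames.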

vCenter Server

StorMagic's SvSAN management components are installed directly onto the VMware vCenter Server. This package includes a number of services that are required to orchestrate VSA deployment and storage provisioning. A UCS Director module is available that includes workflows to automate VSA deployment and storage provisioning. Finally, a PowerShell module is available to allow administrators to script deployments, provisioning, and other management tasks.

vCenter StorMagic Software Installation Prerequisites

A vCenter Server on Windows or the vCenter Server Appliance (VCSA) is required; see Table 5 for version compatibility.

Table 5. vCenter and StorMagic SvSAN Version Compatibility

StorMagic SvSAN 5.2 is compatible with:
- VMware vCenter Server 5.1a, 5.1b, 5.1 Update 1, 5.1 Update 1a, 5.1 Update 1b, 5.1 Update 1c, 5.1 Update 2, 5.1 Update 2a, 5.1 Update 3, 5.1 Update 3a
- VMware vCenter 5.5, 5.5.0a, 5.5.0b, 5.5.0c, 5.5 Update 1, 5.5 Update 1a, 5.5 Update 1b, 5.5 Update 1c, 5.5 Update 2, 5.5 Update 2d, 5.5 Update 2e
- VMware vCenter Server 6.0

Visual C++ Runtime Libraries and .NET Framework 3.5 SP1 are required. These are installed automatically (if not already present) when the SvSAN setup executable (setup.exe) is run. Note: when using Windows 2012, this operation must be performed manually through Server Manager > Add Roles and Features.

Installing the SvSAN Software on a vCenter Server

1. Run the setup.exe file provided in the downloaded ZIP file, selecting Run as Administrator.
2. Click Next. The End-User License Agreement opens. Note: there are different versions for the USA and the rest of the world; scroll down to find the appropriate agreement.
3. To accept the agreement, check the Accept check box and then click Next.
4. Click Typical. This installs the Neutral Storage Service on the vCenter Server, together with the plug-in components.
5. Once the installation has begun you must provide a valid vCenter username and password. If a user credentials prompt does not appear, right-click setup.exe and select Run as Administrator. If the installation appears to hang, check that the pop-up box has not been hidden behind another window. Finish the setup after the installation completes.

Installing StorMagic PowerShell Toolkit

There are no strict requirements: the StorMagic PowerShell Toolkit can be installed on the vCenter server or on a scripting server.

1. To install the StorMagic PowerShell Toolkit, run the file: setupstormagicpowershelltoolkit.exe
2. The StorMagic PowerShell Toolkit Setup wizard opens.
3. Complete the wizard. There is only one component to install.
4. You can uninstall the StorMagic PowerShell Toolkit from Windows Control Panel > Programs and Features in the usual way.

In the vSphere Web Client, a StorMagic management tab is visible when you select vCenter > Hosts and Clusters and then a datacenter object in the inventory tree.

Deploy SvSAN VSAs

Repeat the following steps for each VSA to be deployed.

1. Open the plug-in: in the vSphere Web Client, select your datacenter, then click Manage > StorMagic.

Figure 38. vSphere Web Client with StorMagic

2. Click Deploy a VSA onto a host. The deployment wizard opens. Click Next.
3. Select a host to deploy a VSA to. Click Next.
4. Read and accept the Terms and Conditions. Click Next.
5. Enter a hostname for the VSA (unique on your network), the domain name, and the datastore the VSA VM is to reside on. The datastore should be on internal or direct-attached server storage. Two VMDKs are created on this datastore: a 512 MB boot disk and a 20 GB journal disk.

Figure 39. Deploy SvSAN VSA

6. Choose the desired storage allocation technique. Raw device mappings offer the best performance; however, SSH must be enabled on the host, but only for the duration of the deployment (SSH can be disabled immediately after deployment). WARNING: If you select an RDM device that has data stored on it, that data will be permanently deleted during deployment.

Figure 40. Deploy VSA - Storage

7. If you wish to allocate multiple RDMs as a pool, check the Advanced options check box. If the pool you want is listed, select it. Otherwise, click Add. The Create Storage Pool window opens.
8. Click Next, and click the Skip cache allocation radio button in the Caching configuration window.
9. The Networking page can be left to acquire IP addresses from DHCP, or they can be set statically. Select the network interface and click Configure to select the interface traffic types. In our setup, the VM Network interface is configured for Management traffic, and the iSCSI1 and iSCSI2 interfaces are configured for iSCSI and Mirroring traffic, as shown in figures 41 and 42.

Figure 41. Deploy SvSAN VSA Networking

Figure 42. Deploy VSA Networking Configuration

Note: Multiple networks are shown if there are multiple vSwitches configured on the host. The VSA creates an interface on all vSwitches by default. If you do not wish the VSA to create an interface on specific vSwitches, clear the box associated with the virtual machine port group. For each interface you can choose its traffic type.

10. Enter the VSA license information and click Next. Note: During deployment, the VSA attempts to connect to StorMagic's license server to validate the license. If it needs to go through a proxy server to do this, supply the proxy server details. If the VSA does not have internet connectivity, license it later using the offline activation mechanism; see the StorMagic licensing documentation for further information.
11. Enter the VSA management password, then click Next. Summary information about the VSA is displayed.
12. Click Finish to deploy the VSA.
13. The deployment process can be monitored in the VMware Recent Tasks window. The StorMagic deploy OVF task provides an overall percentage of the deployment. When this task has completed, the VSA will have booted and be operational.
14. Once the VSA has been deployed, it is available for creating datastores.

Note: The VSA requires at least one network interface that is routable to vCenter, or the VSA deployment will fail. This is so that once the VSA VM has booted, it can communicate back to the vCenter server where the SvSAN deployment task manager resides.
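The routability requirement in the note above can be sanity-checked before starting a deployment. The following is an illustrative Python sketch (not a StorMagic tool; the hostname shown is a placeholder) that tests whether a TCP connection can be opened to the vCenter server's HTTPS port from the network the VSA management interface will use:

```python
import socket

def reachable(host, port=443, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example: check the vCenter address before deploying the VSA.
# The hostname below is a placeholder for your environment.
print(reachable("vcenter.example.com"))
```

If this returns False from the host that will run the VSA, fix routing or firewall rules before deploying, rather than letting the deployment fail at the hand-back step.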

Recommended SvSAN Mirror Configurations

Overview

To allow the VSA storage to scale dynamically with odd numbers of UCS blades, it is recommended that each VSA divide its pooled storage into two iSCSI mirrors. This configuration also adheres to VMware's default storage heartbeat policy, which requires a minimum of two datastores when deploying a two-blade configuration.

Mirrored Targets

A mirrored target is one that is mirrored between a pair of VSAs, with each VSA hosting one side of the mirror (called a mirror plex). This means that there are two identical copies of the target data when running normally, one on each VSA. Any data sent to one plex is automatically copied to the other. This is done synchronously, meaning that if an initiator sends data to one plex, the VSA does not acknowledge the request until the data is present on both plexes. A synchronous mirror can be used to provide highly available storage, in that an initiator can access either plex at any time, and if one plex fails it can continue without interruption by using the other plex.

As previously mentioned, it is recommended that the pool of storage on each VSA be divided into two iSCSI mirrors. This allows dynamic storage migration when adding additional blades and VSAs to the cluster. This configuration further improves the reliability of the solution should there be multiple blade failures, and reduces the possibility of storage contention by spreading multiple VMs across the mirrored targets.

Figure 43. Four Blade Example Mirror Configuration
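The synchronous write semantics described above can be modeled in a few lines. This is an illustrative Python sketch only; the class and method names are invented for the example and are not StorMagic APIs. A write is acknowledged only once every online plex holds the data, and reads fail over to the surviving plex:

```python
class Plex:
    """One side of a mirrored target, hosted by a single VSA."""

    def __init__(self, name):
        self.name = name
        self.blocks = {}      # lba -> data
        self.online = True


class SyncMirror:
    """Toy synchronous mirror: a write is acknowledged only after every
    online plex holds the data; reads are served from any online plex."""

    def __init__(self, plex_a, plex_b):
        self.plexes = [plex_a, plex_b]

    def write(self, lba, data):
        online = [p for p in self.plexes if p.online]
        if not online:
            raise IOError("no online plex")
        for plex in online:           # synchronous copy to every live plex
            plex.blocks[lba] = data
        return "ack"                  # ack only after all copies have landed

    def read(self, lba):
        for plex in self.plexes:      # fail over to the surviving plex
            if plex.online:
                return plex.blocks.get(lba)
        raise IOError("no online plex")


mirror = SyncMirror(Plex("VSA1"), Plex("VSA2"))
mirror.write(0, b"vm data")
mirror.plexes[0].online = False       # simulate a plex failure
print(mirror.read(0))                 # b'vm data' (served by the other VSA)
```

The model omits resynchronization: in SvSAN, a plex that comes back online is resynchronized before it serves reads again.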

Cisco UCS Blade Server Expansion and SvSAN Target Migration

Implementing two mirrors across the blade servers allows dynamic target migration when additional blades are added, for example when adding a fifth blade to the four-blade configuration discussed previously. The SvSAN management tools can be used to automate this process, migrating plexes from one blade to another. This ensures that storage I/O is spread evenly across all of the Cisco UCS blade servers.

Figure 44. Blade Server Expansion and SvSAN Target Migration (the mirror plex on VSA2 will be migrated to the new blade's VSA, VSA3)

The mirror plex on VSA2 housing VMFS1 will be migrated to the newly introduced blade's VSA. The storage remains online throughout this operation, but additional I/O will be observed as the plex data is written to the new VSA.

Figure 45. Five Blade Servers Example Mirror Configuration after Migration (the plex is migrated from VSA2 to VSA3, and a new mirror is created between VSA3 and VSA2)

Once the plex migration has completed, a new mirror is created between the new VSA and the VSA the plex was migrated from, balancing the storage evenly across the three blades. The other two blades in the environment remain unchanged.

Figure 46. Six Blade Servers Example Mirror Configuration (the mirror plex on VSA2 is migrated to the new VSA4, the plex on VSA3 is migrated to VSA2, and a new mirror plex is created between VSA3 and VSA4)

When adding a sixth blade to the solution, the same procedure can be applied without taking the storage offline. This process can be applied to any number of blades introduced to the solution.
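The placement pattern in figures 43 through 46 can be summarized as: blades are grouped into pairs with two mirrors per pair, and when the blade count is odd, one group of three VSAs shares a ring of three mirrors; either way, every VSA hosts exactly two mirror plexes. The following Python sketch of that placement is illustrative only (the function names are invented, and at least two VSAs are assumed); the real migration is driven by the SvSAN management tools:

```python
def mirror_layout(vsas):
    """Partition VSAs into pairs (two mirrors per pair); if the count is
    odd, the last three VSAs form a ring of three mirrors instead.
    Either way, every VSA hosts exactly two mirror plexes.
    Assumes len(vsas) >= 2."""
    mirrors = []
    n = len(vsas)
    end = n if n % 2 == 0 else n - 3
    for i in range(0, end, 2):
        a, b = vsas[i], vsas[i + 1]
        mirrors += [(a, b), (a, b)]          # two mirrors between the pair
    if n % 2:
        a, b, c = vsas[-3:]
        mirrors += [(a, b), (b, c), (c, a)]  # ring of three for the odd group
    return mirrors


def plexes_per_vsa(layout):
    """Count how many mirror plexes each VSA hosts in a layout."""
    counts = {}
    for a, b in layout:
        counts[a] = counts.get(a, 0) + 1
        counts[b] = counts.get(b, 0) + 1
    return counts


# Growing from four to five blades converts one pair into a ring of three;
# the remaining pair is untouched, matching figures 43 and 45.
print(mirror_layout(["VSA1", "VSA2", "VSA3", "VSA4"]))
print(mirror_layout(["VSA1", "VSA2", "VSA3", "VSA4", "VSA5"]))
```

The invariant worth noting is the two-plexes-per-VSA count: it is what keeps storage I/O spread evenly no matter how many blades are introduced.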

Datastore Spanning Using VMware Extents

It is possible to aggregate multiple iSCSI mirrored volumes into a single datastore using VMware extents. The StorMagic UCSD module, vSphere plug-in, and PowerShell scripting module allow this process to be automated. This allows the administrator to see all the mirrored iSCSI volumes across any number of blades as a single datastore. This may be required if VMs are larger than the capacity of a single mirror, but it should be noted that this configuration may introduce storage contention. As a performance best practice, it is recommended that datastores be provisioned on single mirrors rather than on extents.

Creating a shared datastore with mirrored storage

1. In the vSphere Web Client, select your datacenter and click Manage > StorMagic. The plug-in opens.
2. Click Create a shared datastore. The Create Datastore wizard opens. Click Next.
3. Name the datastore.
4. Set the provisioned size. To use all of the available space, check Use all. To divide this storage equally across the two mirrors, set the Size so that the available 1.99 TB is split equally between them.
5. To create mirrored storage, select two VSAs.
6. By default, the pool to be used on each VSA is selected automatically. To specify the pool yourself, check the Advanced options check box.

Figure 47. Create Shared Datastore with Mirrored Storage

7. Click Next.
8. Select the neutral storage host for mirroring. Select one of the other VSAs running on another blade. The Advanced option lets you select the Up isolation policy, which does not require an NSH but is not best practice; see the StorMagic documentation for more information on isolation policies. Click Next.
9. Optionally, enable caching on the datastore. This is only available if your license includes caching and a cache device was enabled when the VSA was deployed. If you enable caching, default cache settings are used; these can be modified at a later time using the web GUI. Click Next.
10. Select the hosts that are to have access to the SvSAN datastore. Click Next.

Figure 48. Hosts Selection to Mount Datastore

11. The first time you create a datastore, the VSAs must authenticate with their hosts if the credentials were not already saved during deployment. For each host, provide the host administrative password (the root password that was set when ESXi was installed). Click Next.
12. Click Finish to start the datastore creation. The datastore creation process can be monitored in the VMware Recent Tasks window. Once this task has completed, the datastore is available on the hosts.
13. When a mirror is created, a full synchronization is performed to ensure that both sides of the mirror are synchronized.
14. Repeat the steps in this section to create a second datastore on the same VSAs, or another datastore using the other two SvSAN nodes, as shown in figure 49.

Figure 49. Create Mirrored Storage
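The equal split in step 4 is simple arithmetic. A hedged example, assuming the plug-in reports the 1.99 TB pool in binary-style units (1 TB = 1024 GB here; adjust if your environment reports decimal units):

```python
pool_gb = 1.99 * 1024          # 1.99 TB of pooled capacity, expressed in GB
per_mirror_gb = pool_gb / 2    # equal split across the two mirrors
print(round(per_mirror_gb))    # 1019
```

Entering roughly this value as the Size leaves the second half of the pool free for the second mirrored datastore created in step 14.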

Manage VSAs

To check the status of your VSAs, in the plug-in click Manage VSAs.

Figure 50. Manage VSAs

This lists the VSAs with their IP addresses and system status. Select a VSA to see more information about it, including the amount of pool storage used, the amount available, the system serial number, and the firmware version.

You can monitor the mirror synchronization status by managing the VSAs directly. Through the StorMagic plug-in, click Manage VSAs, select the VSA you wish to manage, and click the Management URL. This launches a new tab connecting directly to the VSA's management interface. The default username is admin, and the password is the password supplied at VSA deployment.

In addition, the VSA dashboard can be used to monitor VSAs in real time. The VSA state reflects the current system status of the VSA. For example, if a mirror is unsynchronized, the VSA system status shows a warning, as the mirror is currently degraded. When the mirror synchronizes, the VSA system status changes back to healthy, as the VSA is in an optimal running state.

Figure 51. Manage VSAs

The dashboard can be accessed from the vSphere Web Client home page.

Figure 52. StorMagic Dashboard in vSphere Web Client

Configure Jumbo Frames on SvSAN Network Interfaces

Repeat the following steps on each SvSAN VSA to configure jumbo frames on the VSA logical NICs.

1. Log in to the vSphere Web Client and select Hosts and Clusters.
2. Select the datacenter where the VSA(s) are deployed. Click Manage > StorMagic > Manage VSAs.
3. Highlight the VSA and a Management URL will appear. Click this URL.
4. A new tab opens, connecting directly to the VSA. The default username is admin; supply the password entered at VSA deployment.
5. Click Network and then select the network device you want to edit.
6. From Actions, click Edit.
7. Edit the MTU size to match the rest of the environment's network configuration, then click Apply.

Figure 53. Setting MTU
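After changing the MTU, it is worth verifying that jumbo frames actually pass end to end. A common check is an unfragmented ping from the ESXi host, for example `vmkping -d -s 8972 <vsa-iscsi-ip>` (the target IP is a placeholder for your VSA's iSCSI address), where 8972 is the 9000-byte MTU minus the 20-byte IPv4 header and 8-byte ICMP header. The payload arithmetic as a small sketch:

```python
def icmp_payload_for_mtu(mtu, ip_header=20, icmp_header=8):
    """Largest ICMP echo payload that fits in one unfragmented IPv4 frame."""
    return mtu - ip_header - icmp_header

print(icmp_payload_for_mtu(9000))  # 8972 (jumbo frames)
print(icmp_payload_for_mtu(1500))  # 1472 (standard MTU)
```

If the 8972-byte ping fails while a 1472-byte ping succeeds, some device in the path (vSwitch, fabric interconnect, or upstream switch) is still at the default MTU.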

Manage shared datastores

To manage shared datastores, in the plug-in click Manage Shared Datastores.

Figure 54. Manage Shared Datastores

The Manage Shared Datastores page displays all the VSA-hosted datastores, including the datastore name, the hosts using the datastore, the number of paths to the datastore, and the datastore status.

Resources

chapter_010.html

Americas Headquarters: Cisco Systems, Inc., San Jose, CA
Asia Pacific Headquarters: Cisco Systems (USA) Pte. Ltd., Singapore
Europe Headquarters: Cisco Systems International BV, Amsterdam, The Netherlands

Cisco has more than 200 offices worldwide. Addresses, phone numbers, and fax numbers are listed on the Cisco website.

Cisco and the Cisco logo are trademarks or registered trademarks of Cisco and/or its affiliates in the U.S. and other countries. To view a list of Cisco trademarks, go to this URL: www.cisco.com/go/trademarks. Third-party trademarks mentioned are the property of their respective owners. The use of the word partner does not imply a partnership relationship between Cisco and any other company. (1110R) C /15


More information

Deploying and updating VMware vsphere 5.0 on HP ProLiant Servers

Deploying and updating VMware vsphere 5.0 on HP ProLiant Servers Deploying and updating VMware vsphere 5.0 on HP ProLiant Servers Integration Note Introduction... 2 Deployment... 2 ESXi 5.0 deployment location options... 2 ESXi 5.0 image options... 2 VMware ESXi Image

More information

Installing and Administering VMware vsphere Update Manager

Installing and Administering VMware vsphere Update Manager Installing and Administering VMware vsphere Update Manager Update 1 vsphere Update Manager 5.1 This document supports the version of each product listed and supports all subsequent versions until the document

More information

Cisco UCS Integrated Infrastructure for Big Data with Splunk Enterprise

Cisco UCS Integrated Infrastructure for Big Data with Splunk Enterprise Cisco UCS Integrated Infrastructure for Big Data with Splunk Enterprise With Cluster Mode for High Availability and Optional Data Archival Last Updated: June 8, 2015 Building Architectures to Solve Business

More information

Installing and Configuring vcloud Connector

Installing and Configuring vcloud Connector Installing and Configuring vcloud Connector vcloud Connector 2.0.0 This document supports the version of each product listed and supports all subsequent versions until the document is replaced by a new

More information

POD INSTALLATION AND CONFIGURATION GUIDE. EMC CIS Series 1

POD INSTALLATION AND CONFIGURATION GUIDE. EMC CIS Series 1 POD INSTALLATION AND CONFIGURATION GUIDE EMC CIS Series 1 Document Version: 2015-01-26 Installation of EMC CIS Series 1 virtual pods as described this guide, requires that your NETLAB+ system is equipped

More information

Building a Virtual Desktop Infrastructure A recipe utilizing the Intel Modular Server and VMware View

Building a Virtual Desktop Infrastructure A recipe utilizing the Intel Modular Server and VMware View Building a Virtual Desktop Infrastructure A recipe utilizing the Intel Modular Server and VMware View December 4, 2009 Prepared by: David L. Endicott NeoTech Solutions, Inc. 2816 South Main St. Joplin,

More information

vsphere Networking vsphere 5.5 ESXi 5.5 vcenter Server 5.5 EN-001074-02

vsphere Networking vsphere 5.5 ESXi 5.5 vcenter Server 5.5 EN-001074-02 vsphere 5.5 ESXi 5.5 vcenter Server 5.5 This document supports the version of each product listed and supports all subsequent versions until the document is replaced by a new edition. To check for more

More information

Delivering Unprecedented Innovation to Create Flexible Virtual Environments

Delivering Unprecedented Innovation to Create Flexible Virtual Environments Delivering Unprecedented Innovation to Create Flexible Virtual Environments Cisco and Vmware Virtualizing the Data Center Maximize IT Productivity while Lowering Capital and Operating Costs 2010 Cisco

More information

VMware for Bosch VMS. en Software Manual

VMware for Bosch VMS. en Software Manual VMware for Bosch VMS en Software Manual VMware for Bosch VMS Table of Contents en 3 Table of contents 1 Introduction 4 1.1 Restrictions 4 2 Overview 5 3 Installing and configuring ESXi server 6 3.1 Installing

More information

Introduction to VMware EVO: RAIL. White Paper

Introduction to VMware EVO: RAIL. White Paper Introduction to VMware EVO: RAIL White Paper Table of Contents Introducing VMware EVO: RAIL.... 3 Hardware.................................................................... 4 Appliance...............................................................

More information

Dell PowerEdge Blades Outperform Cisco UCS in East-West Network Performance

Dell PowerEdge Blades Outperform Cisco UCS in East-West Network Performance Dell PowerEdge Blades Outperform Cisco UCS in East-West Network Performance This white paper compares the performance of blade-to-blade network traffic between two enterprise blade solutions: the Dell

More information

QuickStart Guide vcenter Server Heartbeat 5.5 Update 2

QuickStart Guide vcenter Server Heartbeat 5.5 Update 2 vcenter Server Heartbeat 5.5 Update 2 This document supports the version of each product listed and supports all subsequent versions until the document is replaced by a new edition. To check for more recent

More information

Virtual Appliance Setup Guide

Virtual Appliance Setup Guide Virtual Appliance Setup Guide 2015 Bomgar Corporation. All rights reserved worldwide. BOMGAR and the BOMGAR logo are trademarks of Bomgar Corporation; other trademarks shown are the property of their respective

More information

Exinda How to Guide: Virtual Appliance. Exinda ExOS Version 6.3 2012 Exinda, Inc

Exinda How to Guide: Virtual Appliance. Exinda ExOS Version 6.3 2012 Exinda, Inc Exinda How to Guide: Virtual Appliance Exinda ExOS Version 6.3 2 Virtual Appliance Table of Contents Part I Introduction 4 1 Using... this Guide 4 Part II Overview 6 Part III Deployment Options 8 Part

More information

Installing and Configuring vcenter Support Assistant

Installing and Configuring vcenter Support Assistant Installing and Configuring vcenter Support Assistant vcenter Support Assistant 5.5 This document supports the version of each product listed and supports all subsequent versions until the document is replaced

More information

Get More Scalability and Flexibility for Big Data

Get More Scalability and Flexibility for Big Data Solution Overview LexisNexis High-Performance Computing Cluster Systems Platform Get More Scalability and Flexibility for What You Will Learn Modern enterprises are challenged with the need to store and

More information

Cisco UCS B200 M3 Blade Server

Cisco UCS B200 M3 Blade Server Data Sheet Cisco UCS B200 M3 Blade Server Product Overview The Cisco Unified Computing System (Cisco UCS ) combines Cisco UCS B-Series Blade Servers and C-Series Rack Servers with networking and storage

More information

Configuration Maximums VMware vsphere 4.0

Configuration Maximums VMware vsphere 4.0 Topic Configuration s VMware vsphere 4.0 When you select and configure your virtual and physical equipment, you must stay at or below the maximums supported by vsphere 4.0. The limits presented in the

More information

Cisco UCS: Unified Infrastructure Management That HP OneView Still Can t Match

Cisco UCS: Unified Infrastructure Management That HP OneView Still Can t Match Cisco UCS: Unified Infrastructure Management That HP OneView Still Can t Match Solution Brief October 2015 Highlights Greater Efficiency and Simplicity Cisco Unified Computing System (Cisco UCS ) provides

More information

Thinspace deskcloud. Quick Start Guide

Thinspace deskcloud. Quick Start Guide Thinspace deskcloud Quick Start Guide Version 1.2 Published: SEP-2014 Updated: 16-SEP-2014 2014 Thinspace Technology Ltd. All rights reserved. The information contained in this document represents the

More information

Cisco UCS Business Advantage Delivered: Data Center Capacity Planning and Refresh

Cisco UCS Business Advantage Delivered: Data Center Capacity Planning and Refresh Solution Brief November 2011 Highlights Consolidate More Effectively Cisco Unified Computing System (Cisco UCS ) delivers comprehensive infrastructure density that reduces the cost per rack unit at the

More information

Storage Sync for Hyper-V. Installation Guide for Microsoft Hyper-V

Storage Sync for Hyper-V. Installation Guide for Microsoft Hyper-V Installation Guide for Microsoft Hyper-V Egnyte Inc. 1890 N. Shoreline Blvd. Mountain View, CA 94043, USA Phone: 877-7EGNYTE (877-734-6983) www.egnyte.com 2013 by Egnyte Inc. All rights reserved. Revised

More information

Improving IT Operational Efficiency with a VMware vsphere Private Cloud on Lenovo Servers and Lenovo Storage SAN S3200

Improving IT Operational Efficiency with a VMware vsphere Private Cloud on Lenovo Servers and Lenovo Storage SAN S3200 Improving IT Operational Efficiency with a VMware vsphere Private Cloud on Lenovo Servers and Lenovo Storage SAN S3200 Most organizations routinely utilize a server virtualization infrastructure to benefit

More information

COMPLEXITY AND COST COMPARISON: CISCO UCS VS. IBM FLEX SYSTEM (REVISED)

COMPLEXITY AND COST COMPARISON: CISCO UCS VS. IBM FLEX SYSTEM (REVISED) COMPLEXITY AND COST COMPARISON: CISCO UCS VS. IBM FLEX SYSTEM (REVISED) Not all IT architectures are created equal. Whether you are updating your existing infrastructure or building from the ground up,

More information

How to Configure Intel Ethernet Converged Network Adapter-Enabled Virtual Functions on VMware* ESXi* 5.1

How to Configure Intel Ethernet Converged Network Adapter-Enabled Virtual Functions on VMware* ESXi* 5.1 How to Configure Intel Ethernet Converged Network Adapter-Enabled Virtual Functions on VMware* ESXi* 5.1 Technical Brief v1.0 February 2013 Legal Lines and Disclaimers INFORMATION IN THIS DOCUMENT IS PROVIDED

More information

RSA Authentication Manager 8.1 Virtual Appliance Getting Started

RSA Authentication Manager 8.1 Virtual Appliance Getting Started RSA Authentication Manager 8.1 Virtual Appliance Getting Started Thank you for purchasing RSA Authentication Manager 8.1, the world s leading two-factor authentication solution. This document provides

More information

VMware vsphere 5.0 Evaluation Guide

VMware vsphere 5.0 Evaluation Guide VMware vsphere 5.0 Evaluation Guide Advanced Networking Features TECHNICAL WHITE PAPER Table of Contents About This Guide.... 4 System Requirements... 4 Hardware Requirements.... 4 Servers.... 4 Storage....

More information

SonicWALL SRA Virtual Appliance Getting Started Guide

SonicWALL SRA Virtual Appliance Getting Started Guide COMPREHENSIVE INTERNET SECURITY SonicWALL Secure Remote Access Appliances SonicWALL SRA Virtual Appliance Getting Started Guide SonicWALL SRA Virtual Appliance5.0 Getting Started Guide This Getting Started

More information

RSA Security Analytics Virtual Appliance Setup Guide

RSA Security Analytics Virtual Appliance Setup Guide RSA Security Analytics Virtual Appliance Setup Guide Copyright 2010-2015 RSA, the Security Division of EMC. All rights reserved. Trademarks RSA, the RSA Logo and EMC are either registered trademarks or

More information

PHD Virtual Backup for Hyper-V

PHD Virtual Backup for Hyper-V PHD Virtual Backup for Hyper-V version 7.0 Installation & Getting Started Guide Document Release Date: December 18, 2013 www.phdvirtual.com PHDVB v7 for Hyper-V Legal Notices PHD Virtual Backup for Hyper-V

More information

Citrix XenDesktop: Best Practices with Cisco UCS

Citrix XenDesktop: Best Practices with Cisco UCS Global Alliance Architects Citrix XenDesktop: Best Practices with Cisco UCS 2 Contents Overview...3 An Overview of Cisco UCS...3 Design Considerations...5 Prerequisites...6 Pool Design...7 Management IP

More information

Cisco UCS C220 M3 Server

Cisco UCS C220 M3 Server Data Sheet Cisco UCS C220 M3 Rack Server Product Overview The Cisco Unified Computing System (Cisco UCS) combines Cisco UCS C-Series Rack Servers and B-Series Blade Servers with networking and storage

More information

FASTER AND MORE EFFICIENT SYSTEM MANAGEMENT WITH LENOVO XCLARITY ADMINISTRATOR

FASTER AND MORE EFFICIENT SYSTEM MANAGEMENT WITH LENOVO XCLARITY ADMINISTRATOR sa FASTER AND MORE EFFICIENT SYSTEM MANAGEMENT WITH LENOVO XCLARITY ADMINISTRATOR Technology that can simplify management processes in the datacenter through automation, such as Lenovo XClarity Administrator,

More information

Management of VMware ESXi. on HP ProLiant Servers

Management of VMware ESXi. on HP ProLiant Servers Management of VMware ESXi on W H I T E P A P E R Table of Contents Introduction................................................................ 3 HP Systems Insight Manager.................................................

More information

Data Centre of the Future

Data Centre of the Future Data Centre of the Future Vblock Infrastructure Packages: Accelerating Deployment of the Private Cloud Andrew Smallridge DC Technology Solutions Architect asmallri@cisco.com 1 IT is undergoing a transformation

More information

Cisco, Citrix, Microsoft, and NetApp Deliver Simplified High-Performance Infrastructure for Virtual Desktops

Cisco, Citrix, Microsoft, and NetApp Deliver Simplified High-Performance Infrastructure for Virtual Desktops Cisco, Citrix, Microsoft, and NetApp Deliver Simplified High-Performance Infrastructure for Virtual Desktops Greater Efficiency and Performance from the Industry Leaders Citrix XenDesktop with Microsoft

More information

QUICK START GUIDE Cisco M380 and Cisco M680 Content Security Management Appliance

QUICK START GUIDE Cisco M380 and Cisco M680 Content Security Management Appliance QUICK START GUIDE Cisco M380 and Cisco M680 Content Security Management Appliance 1 Welcome 2 Before You Begin 3 Document Network Settings 4 Plan the Installation 5 Install the Appliance in a Rack 6 Plug

More information

Configuration Maximums

Configuration Maximums Topic Configuration s VMware vsphere 5.1 When you select and configure your virtual and physical equipment, you must stay at or below the maximums supported by vsphere 5.1. The limits presented in the

More information

RED HAT ENTERPRISE VIRTUALIZATION FOR SERVERS: COMPETITIVE FEATURES

RED HAT ENTERPRISE VIRTUALIZATION FOR SERVERS: COMPETITIVE FEATURES RED HAT ENTERPRISE VIRTUALIZATION FOR SERVERS: COMPETITIVE FEATURES RED HAT ENTERPRISE VIRTUALIZATION FOR SERVERS Server virtualization offers tremendous benefits for enterprise IT organizations server

More information

EMC Business Continuity for VMware View Enabled by EMC SRDF/S and VMware vcenter Site Recovery Manager

EMC Business Continuity for VMware View Enabled by EMC SRDF/S and VMware vcenter Site Recovery Manager EMC Business Continuity for VMware View Enabled by EMC SRDF/S and VMware vcenter Site Recovery Manager A Detailed Review Abstract This white paper demonstrates that business continuity can be enhanced

More information

VMware vsphere Data Protection Evaluation Guide REVISED APRIL 2015

VMware vsphere Data Protection Evaluation Guide REVISED APRIL 2015 VMware vsphere Data Protection REVISED APRIL 2015 Table of Contents Introduction.... 3 Features and Benefits of vsphere Data Protection... 3 Requirements.... 4 Evaluation Workflow... 5 Overview.... 5 Evaluation

More information

Acano solution. Virtualized Deployment R1.1 Installation Guide. Acano. February 2014 76-1025-03-B

Acano solution. Virtualized Deployment R1.1 Installation Guide. Acano. February 2014 76-1025-03-B Acano solution Virtualized Deployment R1.1 Installation Guide Acano February 2014 76-1025-03-B Contents Contents 1 Introduction... 3 1.1 Before You Start... 3 1.1.1 About the Acano virtualized solution...

More information

Configuring VMware vsphere 5.1 with Oracle ZFS Storage Appliance and Oracle Fabric Interconnect

Configuring VMware vsphere 5.1 with Oracle ZFS Storage Appliance and Oracle Fabric Interconnect An Oracle Technical White Paper October 2013 Configuring VMware vsphere 5.1 with Oracle ZFS Storage Appliance and Oracle Fabric Interconnect An IP over InfiniBand configuration overview for VMware vsphere

More information

Setup for Failover Clustering and Microsoft Cluster Service

Setup for Failover Clustering and Microsoft Cluster Service Setup for Failover Clustering and Microsoft Cluster Service Update 1 ESXi 6.0 vcenter Server 6.0 This document supports the version of each product listed and supports all subsequent versions until the

More information