Hadoop Virtualization
Courtney Webster
Hadoop Virtualization
by Courtney Webster

Copyright © 2015 O'Reilly Media, Inc. All rights reserved. Printed in the United States of America.

Published by O'Reilly Media, Inc., 1005 Gravenstein Highway North, Sebastopol, CA.

O'Reilly books may be purchased for educational, business, or sales promotional use. Online editions are also available for most titles. For more information, contact our corporate/institutional sales department: [email protected].

Editors: Julie Steele and Jenn Webb
Illustrator: Rebecca Demarest

February 2015: First Edition
Revision History for the First Edition: First release

The O'Reilly logo is a registered trademark of O'Reilly Media, Inc. Hadoop Virtualization, the cover image, and related trade dress are trademarks of O'Reilly Media, Inc.

While the publisher and the author have used good faith efforts to ensure that the information and instructions contained in this work are accurate, the publisher and the author disclaim all responsibility for errors or omissions, including without limitation responsibility for damages resulting from the use of or reliance on this work. Use of the information and instructions contained in this work is at your own risk. If any code samples or other technology this work contains or describes is subject to open source licenses or the intellectual property rights of others, it is your responsibility to ensure that your use thereof complies with such licenses and/or rights.

ISBN: [LSI]
Table of Contents

The Benefits of Deploying Hadoop in a Private Cloud
  Abstract
  Introduction
    MapReduce
    Hadoop
    Virtualizing Hadoop
    Another Form of Virtualization: Aggregation
  Benefits of Hadoop in a Private Cloud
    Agility and Operational Simplicity with Competitive Performance
    Improved Efficiency
    Flexibility
  Conclusions
    Apply the Resources and Best Practices You Already Know
    Benefits of Virtualizing Hadoop
The Benefits of Deploying Hadoop in a Private Cloud

Abstract

Hadoop is a popular framework used for nimble, cost-effective analysis of unstructured data. The global Hadoop market, valued at $1.5 billion in 2012, is estimated to reach $50 billion by 2020. 1 Companies can now choose to deploy a Hadoop cluster in a physical server environment, a private cloud environment, or in the public cloud. We have yet to see which deployment model will predominate during this growth period; however, the security and granular control offered by private clouds may lead this model to dominate for medium to large enterprises. When compared to other deployment models, a private cloud Hadoop cluster offers unique benefits:

- A cluster can be set up in minutes
- It can flexibly use a variety of hardware (DAS, SAN, NAS)
- It is cost effective (lower capital expenses than physical deployment and lower operating expenses than public cloud deployment)
- Streamlined management tools lower the complexity of initial configuration and maintenance
- High availability and fault tolerance increase uptime

This report reviews the benefits of running Hadoop on a virtualized or aggregated (container-based) private cloud and provides an overview of best practices to maximize performance.
Introduction

Today, we are capable of collecting more data (and more varied forms of data) than ever before. 2 It may be the most valuable intangible asset of our time. The sheer volume ("big data") and the need for flexible, low-latency analysis can overwhelm traditional management systems like structured relational databases. As a result, new tools have emerged to store and mine large collections of unstructured data.

MapReduce

In 2004, Google engineers described a scalable programming model for processing large, distributed datasets. 3 This model, MapReduce, abstracts computation away from more complicated tasks like data distribution, failure handling, and parallelization. Developers specify a processing ("map") function that behaves as an independent, modular operation on blocks of local data. The resulting analyses can then be consolidated (or "reduced") to provide an aggregate result. This model of local computation is particularly useful for big data, where the transfer time required to move the data to a centralized computing module is limiting.

Hadoop

Doug Cutting and others at Yahoo! combined the computational power of MapReduce with a distributed filesystem prototyped by Google. This evolved into Hadoop, an open source system made of MapReduce and the Hadoop Distributed File System (HDFS). HDFS makes several replica copies of the data blocks for resilience against server failure and is best used on high I/O bandwidth storage devices. In Hadoop 1.0, two master roles (the JobTracker and the Namenode) direct MapReduce and HDFS, respectively.

Hadoop was originally built to use local data storage on a dedicated group of commodity hardware. In a Hadoop cluster, each server is considered a node. A master node runs either the JobTracker of MapReduce or the Namenode of HDFS (although in a small cluster, as shown in Figure 1, one master node could host both). The remaining servers ("worker" nodes) store blocks of data and run local computation on that data.
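The map and reduce roles described above can be illustrated with a minimal, single-process word-count sketch in Python. This is only a toy model of the programming pattern; in a real cluster the map calls run in parallel on each worker's local blocks, and the framework handles distribution and failure handling:

```python
from collections import defaultdict
from itertools import chain

def map_block(block: str):
    # "Map": an independent, modular operation on one block of local data.
    # Emits (key, value) pairs -- here, (word, 1) for each word seen.
    for word in block.split():
        yield word.lower(), 1

def reduce_pairs(pairs):
    # "Reduce": consolidate the per-block results into an aggregate answer.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

# Two "blocks" as they might be stored on two different worker nodes.
blocks = ["the quick brown fox", "the lazy dog and the fox"]
result = reduce_pairs(chain.from_iterable(map_block(b) for b in blocks))
print(result["the"])  # 3
```

Because each `map_block` call touches only its own block, the expensive step (scanning the data) needs no data movement; only the small intermediate pairs travel to the reduce step.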
Figure 1. A simplified overview of Hadoop 5

The JobTracker directs low-latency, high-bandwidth computational jobs (TaskTrackers) on local data. The Namenode, the lead storage directory of HDFS, provides rack awareness: the system's knowledge of where file blocks are stored among the array of workers (DataNodes). It does this by mapping HDFS file names to their constituent data blocks, and then further mapping those data blocks to DataNode processes. This knowledge is responsible for HDFS's reliability, as it ensures nonredundant locations of data replicas.

Hadoop 2.0

In the newest version of Hadoop, the JobTracker is no longer solely responsible for managing the MapReduce programming framework. A new central scheduler, the ResourceManager, acts as its key replacement, while developers construct ApplicationMasters to encapsulate any knowledge of the programming framework, such as MapReduce. In order to run their tasks, ApplicationMasters request resources from the ResourceManager. This architectural redesign improves scalability and efficiency, bypassing some of the limitations in Hadoop 1.0.

Virtualizing Hadoop

As physically deployed Hadoop clusters grew in size, developers asked a familiar question: can we virtualize it? Like other enterprise (and Java-based) applications, development efforts moved to virtualization as Hadoop matured. A virtualized private cloud uses a group of hardware on the same hypervisor (such as vSphere [by VMware], XenServer [by Citrix], KVM [by Red Hat], or Hyper-V [by Microsoft]). Instead of individual servers, nodes are virtual machines (VMs) designated with master or worker roles. Each VM is allocated specific computing and storage resources from the physical host, and as a result, a Hadoop cluster can be consolidated onto far fewer physical servers. There is an up-front cost for virtualization licenses and supported or enterprise-level software, but this can be offset by the cluster's decreased operating expenses over time.

Virtualizing Hadoop created the infrastructure required to run Hadoop in the cloud, leading major players to offer web-service Hadoop. The first, Amazon Web Services, began beta testing its Elastic MapReduce service as early as 2009. Though public cloud deployment is not the focus of this review, it's worth noting that it can be useful for ad hoc or batch processing, especially if your data is already stored in the cloud. For a stable, live cluster, a company might find that building its own private cloud is more cost effective. Additionally, regulated industries may prefer the security of a private hosting facility.

In 2012, VMware released Project Serengeti, an open source management and deployment platform on vSphere for private cloud environments. Soon thereafter, they released Big Data Extensions (BDE), the advanced commercial version of Project Serengeti (run on vSphere Enterprise Edition).
Other offerings, like OpenStack's Project Sahara on KVM (formerly called Project Savanna), were also released in the past two years.
Though these programs run on vendor-specific virtualization platforms, they support most (if not all) Hadoop distributions (Apache Hadoop [1.x and 2.x] and commercial distributions like Cloudera, Hortonworks, MapR, and Pivotal). They can also manage coordinating applications (like Hive and Pig) that are typically built on top of a Hadoop cluster to satisfy analytical needs.

Case Study: Hadoop on a Public Versus Private Cloud

A company providing enterprise business solutions initially turned to the public cloud for its analytics applications. Ad hoc use of a Hadoop cluster of 200 VMs cost about $40k a month. When their developers needed consistent access to Hadoop, the bills would spike by an additional $20-40k. For $80k, they decided to build their own 225 TB, 30-node virtualized Hadoop cluster. Flash-based SAN and server-based flash cards were used to enhance performance for 2-3 TB of very active data. Using Project Serengeti, it took about 10 minutes to deploy their cluster.

Another Form of Virtualization: Aggregation

Cloud computing without virtualization

Thus far, "virtualization" has referred to using a hypervisor and VMs to isolate and allocate resources in a private cloud environment. For clarity, "virtualization" will continue to be used in this context. But building a private cloud environment isn't limited to virtualization. Aggregation (as a complement to or on top of virtualization) became a useful alternative for cloud computing (see B in Figure 2), especially as applications like Hadoop grew in size.
Figure 2. Strategies for cloud computing

Virtualization partitions servers into isolated virtual machines, while aggregation consolidates servers to create a common pool of resources (like CPU, RAM, and storage) that applications can share. System containers can run a full OS, like a VM, while application containers hold a single process or application. This allows multiple applications to access the consolidated resources without interfering with each other, and resources can be dynamically allocated to different applications as their loads change.

In an initial study by IBM, Linux containers (LXC) and control groups (cgroups) allowed for isolation and resource control in an aggregated environment with less overhead than a KVM hypervisor. 6 The potential overhead advantages should be weighed against some limitations of LXC, such as the restriction to run only on Linux and the fact that, currently, containers offer less performance isolation than VMs.
If an industry has already invested in virtualization licenses, aggregation can be used on a virtualized environment to provide one "super" VM (see C in Figure 2). Unless otherwise specified, however, the terms "aggregation" and "containers" here imply use on a bare-metal (nonvirtualized) environment.

Cluster managers

Cluster managers and management tools work at the application level to manage containers and schedule tasks in an aggregated environment. Many cluster managers, like Apache Mesos (backed by Mesosphere) and StackIQ, are designed to support analytics (like Hadoop) alongside other services.

Hadoop on Mesos

Apache Mesos provides a foundational layer for running a variety of distributed applications by pooling resources. Mesos allows a cluster to elastically provision resources to multiple applications (including more than one Hadoop cluster). Mesosphere aims to commercialize Mesos for Hadoop and other enterprise applications while building add-on frameworks (like distributed schedulers) along the way. In 2013, they released Elastic Mesos to easily provision a Mesos cluster on Amazon Web Services, allowing companies to run Hadoop 1.0 on Mesos in bare-metal, virtualized, and now public cloud environments.

Benefits of Hadoop in a Private Cloud

In addition to cost-effective setup and operation, private cloud deployment offers additive value by streamlining maintenance, increasing hardware utilization, and providing configurational flexibility to enhance the performance of a cluster.
Agility and Operational Simplicity with Competitive Performance

Deploy a Scalable, High-Performance Cluster with a Simplified Management Interface

- Benchmarking tools indicate that the performance of a virtual cluster is comparable to a physical cluster
- Built-in workflows lower initial configuration complexity and time to deployment
- Streamlined monitoring consoles provide quick performance read-outs and easy-to-use management tools
- Nodes can be easily added and removed for facile scaling

Competitive performance

Since a hypervisor demands some amount of computational resources, initial concerns about virtual Hadoop focused on performance. The virtualization layer requires some CPU, memory, and other resources in order to manage its hosted VMs, 7 though the impact depends on the characteristics of the hypervisor used. Over the past 5 to 10 years, however, the performance of VMs has significantly improved (especially for Java-based applications). Many independent reports show that when using best practices, a virtual Hadoop cluster performs competitively with a physical system. 8,9 Increasing the number of VMs per host can even lead to enhanced performance (up to 13%). Container-based clusters (like Linux-VServer, OpenVZ, and LXC) can also provide near-native performance on Hadoop benchmarking tests like WordCount and TeraSuite. 10 With such results, performance concerns are generally outweighed by the numerous other benefits provided by a private-cloud deployment.

Rapid deployment

To deploy a cluster, Hadoop administrators must navigate a complicated setup and configuration procedure. Clusters can be composed of tens to hundreds of nodes; in a physical deployment, each node must be individually configured. With a virtualized cluster, an administrator can speed up initial configuration by cloning worker VM nodes. VMs can be easily copied to expand the size of the cluster, and problematic nodes can be removed and then restored from backup images. Some virtualized Hadoop offerings, like BDE, can entirely automate installation and network configuration. Using containers instead of VMs offers deployment advantages as well: it takes hours to provision bare metal, minutes to provision VMs, but just seconds to provision containers. Like BDE, cluster managers can also automate installation and configuration (including networking software, OS software, and hardware parameters, among others).

Improved management and monitoring

A Hadoop cluster must be carefully monitored to meet the demands of 24/7 accessibility, and a variety of management tools exist to help watch the cluster. Some come with your Hadoop distribution (like Cloudera Manager and Pivotal's Command Center), while others are open source (like Apache Ambari) or commercial (like Zettaset Orchestrator). Virtualization-aware customers already use hypervisor management interfaces (like vCenter or XenCenter) to simplify resource and lifecycle management, and a virtualized Hadoop cluster integrates as just another monitored workload.

These simplified provisioning and management tools enable Hadoop-as-a-service. Some platforms allow an administrator to hand off preconfigured templates, leaving users to customize the environment to suit their individual needs. More sophisticated cloud management tools automate the deployment and management of Hadoop, so companies can offer Hadoop clusters without users managing any configurational details.

Scalability

Modifying a physical cluster (removing or adding physical nodes) requires a reshuffling of the data within the entire system.
Load balancing (ensuring that all worker nodes store approximately the same amount of data) is one of the most important tasks when scaling and maintaining a cluster. Some hypervisors, like vSphere Enterprise Edition, include distributed resource schedulers that can perform automatic load balancing. To scale an aggregated system, cluster managers just need to be installed on new nodes. When the cluster scheduler is made aware of a new node, it will automatically absorb the offered resources and begin scheduling tasks on it.

Improved Efficiency

Create a Robust, High-Utilization Cluster

- Rather than monopolizing dedicated hardware, a private cloud cluster allows for mixed workflows and higher utilization.
- High availability and fault tolerance increase the uptime of a cluster during unanticipated outages and failures or routine maintenance.

Higher resource utilization

A physical deployment model monopolizes its dedicated hardware. Physical Hadoop clusters are often over-engineered: they are built to handle an estimated peak capacity, but left underutilized the rest of the time. Any complementary application (like a NoSQL or SQL database) requires its own dedicated hardware as well. In a virtual deployment, resources like CPU and RAM are partitioned for the Hadoop cluster, freeing up resting resources for other tasks. Co-locating VMs running Hadoop roles (like MapReduce jobs) with VMs running other workloads (such as Hive queries on HBase) can balance the use of a system. 5 Multiple workloads can be run concurrently on the same hardware with a minimal effect on results (less than a 10% difference when compared to running separate, independent workloads on a standalone cluster). 11

An aggregated cloud also offers higher utilization. Though isolated from one another, all applications access the same pool of resources, and the system can elastically scale resources for each application. Theoretically, a high-load application could use the entirety of aggregated resources (like CPU, RAM, and storage) until loads on other applications increase.
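The elastic sharing described above can be sketched as a toy resource pool. This is a deliberately simplified model (class, application names, and numbers are all illustrative, not any cluster manager's real API): applications draw from one shared pool, and capacity freed by one workload immediately becomes available to another.

```python
class ResourcePool:
    """Toy model of an aggregated cluster's shared CPU pool."""

    def __init__(self, total_cpus):
        self.total = total_cpus
        self.allocations = {}  # app name -> CPUs currently held

    def available(self):
        return self.total - sum(self.allocations.values())

    def request(self, app, cpus):
        # Grant as much of the request as the pool can cover right now.
        granted = min(cpus, self.available())
        self.allocations[app] = self.allocations.get(app, 0) + granted
        return granted

    def release(self, app, cpus):
        held = self.allocations.get(app, 0)
        self.allocations[app] = max(0, held - cpus)

pool = ResourcePool(total_cpus=64)
pool.request("hadoop", 64)        # at low load elsewhere, Hadoop can take it all
pool.release("hadoop", 16)        # as load rises on another application...
print(pool.request("hbase", 16))  # ...the freed capacity shifts to it: 16
```

A real cluster manager adds isolation, fairness policies, and preemption on top of this basic accounting, but the utilization argument is the same: idle capacity is never stranded on dedicated hardware.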
Minimizing downtime with high availability and fault tolerance

High availability (HA) protects a cluster during planned and unplanned downtime. Failovers can be deliberately triggered for maintenance, or are automatically triggered in the event of failures or unresponsive service. Virtualized HA solutions monitor hosts and VMs to detect hardware and guest operating system (OS) failures. If a server outage or failed network connection is detected, VMs from the failed host are restarted on new hosts without manual intervention (see Figure 3). 12 In the case of an OS failure, VMs are automatically restarted. In aggregated environments, failed workloads automatically fail over to a new node with available resources.

HA in a Hadoop cluster can protect against the single-point failure of a master node (the Namenode or JobTracker). If desired, the entire cluster (master nodes and worker nodes) can be uniformly managed and configured for HA. 12

Figure 3. High availability monitoring 12

In a virtualized environment, fault tolerance (FT) provides continuous availability by creating a live, up-to-date shadow (a secondary instance) of a VM. Though an FT system is not able to detect if the application fails or a guest OS hangs, 13 it triggers a failover to the secondary VM if a VM stops due to a hardware outage or loss of network connectivity. This helps prevent data loss and decreases downtime. Combining HA and FT can create maximum availability for a virtualized Hadoop cluster.

Flexibility

Utilize Options for Flexible Configuration

- A cluster can be built using DAS, SAN/NAS, or a hybrid combination of storage.
- Configurational options create elastic scalability to address fluctuating demands.

Hardware flexibility

By using commodity hardware and built-in failure protection, Hadoop was designed for flexibility. Virtualization takes this a step further by abstracting away from the hardware completely. A private cloud can use direct-attached storage (DAS), a storage area network (SAN), or network-attached storage (NAS). SAN/NAS storage can be more costly but offers enhanced scalability, performance, and data isolation. If a company has already invested in nonlocal storage, its Hadoop cluster can strategically employ both direct- and network-attached devices. The storage or VMDK files for the Namenode and JobTracker can be placed on SAN for maximum reliability (as they are memory- but not storage-intensive) while worker nodes store their data on DAS. 5 The temporary data generated during MapReduce can be stored wherever I/O bandwidth is maximized.

Case Study: Hadoop on NAS

A major shipping company uses a virtualized Hadoop cluster to perform web log analysis (detecting mobile devices accessing the website), ZIP code analysis (which ZIP codes are the highest source or destination for shipments), and shipment analysis (to determine patterns that may delay a package). Their cluster is hosted on EMC Isilon NAS storage. From their perspective, Isilon helps drive down the total cost of ownership by eliminating the "triple replicate penalty" of data storage. Additionally, they've found that a fast enough network can perform competitively with DAS (leveling the playing field in terms of data locality).

Configurational flexibility

As previously described, organizing computation tasks to run on blocks of local data (data locality) is the key to Hadoop's performance. In a physical deployment, this necessitates that worker nodes host data and compute roles in a fixed 1:1 ratio (a "combined" model). For this model to be mimicked in a virtual cluster, each hypervisor server would host one or more VMs that contain both data and compute processes (see A in Figure 4). These configurations are valid, but difficult to scale under practical circumstances. Since each VM stores data, the ease of adding or removing nodes (simply copying from a template or using live-migrate capabilities) would be offset by the need to rebalance the cluster.

If instead compute and data roles on the same hypervisor server were placed in separate VMs (see B in Figure 4), compute operations could be scaled according to demand without redistributing any data. 14 Likewise, in an aggregated cloud, Apache Mesos only spins up TaskTracker nodes as a job runs. When the task is complete, the TaskTrackers are killed, and their capacity is placed back in the consolidated pool of resources.

This "separated" model is fairly intuitive (since the TaskTracker controls MapReduce and the Namenode controls data storage), and the flexibility of virtualized or aggregated clusters makes it relatively simple to configure. In addition to creating an elastic system (where compute processes can be easily cloned or launched to increase throughput), the separated model also allows you to build a multi-tenant system where multiple, isolated compute clusters can operate on the same data. The same cloud could host a production Hadoop cluster as well as a development and QA environment.
Figure 4. Configurational flexibility with compute and data processes (C: compute; D: data). Figure modified from VMware's "Deploying Virtualized Hadoop Systems with VMware vSphere Big Data Extensions (BDE)." 5

Virtual rack awareness

Using the separated model does carry an added complication. If compute processes can scale on demand, the cluster's topology is dynamic (compared to the fixed structure of a physical deployment). In a virtualized cluster, a compute and a data node on the same hypervisor server communicate over a virtual network in memory (without suffering physical network latencies). It is important that the system maintain virtual "rack awareness" to preserve the performance advantage of data locality.

To accommodate this need, VMware contributed a tool called Hadoop Virtualization Extensions (HVE) to Hadoop. 15 HVE accounts for the virtualization layer by grouping all VMs on the same host into a new domain. For best performance, the cluster can use this domain to direct computation to the same hypervisor server hosting the data. It can also intelligently place data replicas on separate hosts to provide maximum protection in the event of a hardware failure.
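The host-level grouping HVE introduces can be pictured with a small topology-resolver sketch. This is only an illustration of the idea (the IP addresses, rack names, and inventory table are hypothetical; a real deployment configures topology through Hadoop's pluggable topology-script mechanism): each VM resolves to a path whose last component identifies its hypervisor host, so the scheduler can treat co-located VMs as "local" to each other.

```python
# Hypothetical inventory: which rack and physical host (hypervisor) each VM lives on.
VM_PLACEMENT = {
    "10.0.0.11": ("rack1", "host-a"),  # compute VM
    "10.0.0.12": ("rack1", "host-a"),  # data VM on the same hypervisor
    "10.0.0.21": ("rack1", "host-b"),
    "10.0.0.31": ("rack2", "host-c"),
}

def resolve(ip):
    # HVE-style topology path: /rack/node-group, where the node group
    # identifies the hypervisor host shared by co-located VMs.
    rack, host = VM_PLACEMENT[ip]
    return f"/{rack}/{host}"

def same_node_group(ip_a, ip_b):
    # Co-located VMs exchange data over the in-memory virtual network,
    # so the scheduler should prefer them for local tasks; replicas,
    # conversely, should land in different node groups.
    return resolve(ip_a) == resolve(ip_b)

print(same_node_group("10.0.0.11", "10.0.0.12"))  # True: same hypervisor
print(same_node_group("10.0.0.11", "10.0.0.21"))  # False: same rack, different host
```

The same path comparison, run in reverse, drives replica placement: by putting copies in different node groups, the cluster avoids losing multiple replicas to a single hardware failure.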
Separately scaled data and compute processes present a similar challenge in an aggregated environment. If an application can request tasks and be offered resources throughout the entire cloud, how does it know which nodes store the data it requires? Adding constraints allows an application to reject resource offers that don't meet its requirements. 16 A technique called "delay scheduling," in which an application waits briefly if it cannot launch a local task, can result in nearly optimal data locality. 17

Conclusions

Apply the Resources and Best Practices You Already Know

Whether planning your first cluster or a deployment overhaul, it is important to consider the following:

- Current data storage
- Estimated data growth
- The amount of temporary data that will be stored during MapReduce processing
- Throughput and bandwidth needs
- Performance needs
- The resources (hardware and software) you have available to dedicate to the cluster
- The resources (hardware and software) you'd need to purchase to dedicate to the cluster

It would be difficult to specify an ideal architecture for every Hadoop cluster, as analytical demands and resource needs vary widely. But planning a private cloud cluster doesn't necessitate a large learning curve either. Many companies can utilize resources (e.g., virtualization licenses, cluster managers, DAS, or SAN/NAS storage) they already have. Additionally, many of the best practices an IT department already has in place (like avoiding VM contention and optimizing I/O bandwidth) translate well to configuring a high-performance Hadoop cluster.
Benefits of Virtualizing Hadoop

Whether Hadoop is deployed in a physical system or in a private or public cloud, the goals of a well-established infrastructure are the same. In any implementation, Hadoop should provide:

- Cost-effective and high-performance data analysis
- Failure-tolerant data storage
- Scalable capability for future growth
- Minimal downtime

For Hadoop administrators and users, a private cloud offers unique benefits with comparable (even improved) performance. Rapid deployment and built-in workflows ease initial configuration complexity, to the point where developers can use built-in tools to configure their own test environments without involving IT. Management tools make it easier to monitor and analyze performance, and features like high availability and fault tolerance decrease downtime.

The need for low-latency data management systems will grow in coming years, and enterprise applications continue to move from physical systems into cloud computing infrastructures. Using a private cloud can create a scalable, streamlined Hadoop cluster built to accommodate data science's evolving landscape.
Notes

1. "Global Hadoop Market (Hardware, Software, Services, HaaS, End User Application and Geography): Industry Growth, Size, Share, Insights, Analysis, Research, Report, Opportunities, Trends and Forecasts Through 2020."
2. "Big Data: The Next Big Thing." Nasscom, New Delhi.
3. "MapReduce: Simplified Data Processing on Large Clusters."
4. Hadoop: The Definitive Guide.
5. VMware, Inc. Deploying Virtualized Hadoop Systems with VMware vSphere Big Data Extensions.
6. Felter, W., A. Ferreira, R. Rajamony, and J. Rubio. "An Updated Performance Comparison of Virtual Machines and Linux Containers." IBM Research Report.
7. Microsoft IT Big Data Program. Performance of Hadoop in Hyper-V. MSIT SES Enterprise Data Architect Team.
8. Buell, J. A Benchmarking Case Study of Virtualized Hadoop Performance on VMware vSphere.
9. Buell, J. Virtualized Hadoop Performance with VMware vSphere 5.1. VMware, Inc.
10. Xavier, M.G., M.V. Neves, and A.F. De Rose. "Performance Comparison of Container-Based Virtualization MapReduce Clusters." IEEE.
11. Intel, VMware, and Dell. Scaling the Deployment of Multiple Hadoop Workloads on a Virtualized Infrastructure.
12. Hortonworks and VMware. Apache Hadoop 1.0 High Availability Solution on VMware vSphere.
13. Buell, J. Protecting Hadoop with VMware vSphere 5 Fault Tolerance. VMware, Inc.
14. Magdon-Ismail, T., et al. "Toward an Elastic Elephant: Enabling Hadoop for the Cloud." VMware Technical Journal 2, no. 2 (December 2013).
15. VMware, Inc. Hadoop Virtualization Extensions on VMware vSphere.
16. Hindman, B., et al. "Mesos: A Platform for Fine-Grained Resource Sharing in the Data Center." UC Berkeley.
17. Zaharia, M., et al. "Delay Scheduling: A Simple Technique for Achieving Locality and Fairness in Cluster Scheduling." Proceedings of the 5th European Conference on Computer Systems.
About the Author

Courtney Webster is a freelance writer with professional experience in laboratory automation, automated data analysis, and the application of mobile technology to clinical research.
CA ARCserve Replication and High Availability Deployment Options for Hyper-V
Solution Brief: CA ARCserve R16.5 Complexity ate my budget CA ARCserve Replication and High Availability Deployment Options for Hyper-V Adding value to your Hyper-V environment Overview Server virtualization
EMC ISILON OneFS OPERATING SYSTEM Powering scale-out storage for the new world of Big Data in the enterprise
EMC ISILON OneFS OPERATING SYSTEM Powering scale-out storage for the new world of Big Data in the enterprise ESSENTIALS Easy-to-use, single volume, single file system architecture Highly scalable with
OPTIMIZING SERVER VIRTUALIZATION
OPTIMIZING SERVER VIRTUALIZATION HP MULTI-PORT SERVER ADAPTERS BASED ON INTEL ETHERNET TECHNOLOGY As enterprise-class server infrastructures adopt virtualization to improve total cost of ownership (TCO)
Deployment Options for Microsoft Hyper-V Server
CA ARCserve Replication and CA ARCserve High Availability r16 CA ARCserve Replication and CA ARCserve High Availability Deployment Options for Microsoft Hyper-V Server TYPICALLY, IT COST REDUCTION INITIATIVES
Introduction to Cloud Computing
Introduction to Cloud Computing Cloud Computing I (intro) 15 319, spring 2010 2 nd Lecture, Jan 14 th Majd F. Sakr Lecture Motivation General overview on cloud computing What is cloud computing Services
Virtualizing Exchange
Virtualizing Exchange Simplifying and Optimizing Management of Microsoft Exchange Server Using Virtualization Technologies By Anil Desai Microsoft MVP September, 2008 An Alternative to Hosted Exchange
The Incremental Advantage:
The Incremental Advantage: MIGRATE TRADITIONAL APPLICATIONS FROM YOUR ON-PREMISES VMWARE ENVIRONMENT TO THE HYBRID CLOUD IN FIVE STEPS CONTENTS Introduction..................... 2 Five Steps to the Hybrid
Answering the Requirements of Flash-Based SSDs in the Virtualized Data Center
White Paper Answering the Requirements of Flash-Based SSDs in the Virtualized Data Center Provide accelerated data access and an immediate performance boost of businesscritical applications with caching
Windows Server 2008 R2 Hyper-V Live Migration
Windows Server 2008 R2 Hyper-V Live Migration White Paper Published: August 09 This is a preliminary document and may be changed substantially prior to final commercial release of the software described
Cloudera Enterprise Reference Architecture for Google Cloud Platform Deployments
Cloudera Enterprise Reference Architecture for Google Cloud Platform Deployments Important Notice 2010-2016 Cloudera, Inc. All rights reserved. Cloudera, the Cloudera logo, Cloudera Impala, Impala, and
Introduction to Hadoop. New York Oracle User Group Vikas Sawhney
Introduction to Hadoop New York Oracle User Group Vikas Sawhney GENERAL AGENDA Driving Factors behind BIG-DATA NOSQL Database 2014 Database Landscape Hadoop Architecture Map/Reduce Hadoop Eco-system Hadoop
CA Technologies Big Data Infrastructure Management Unified Management and Visibility of Big Data
Research Report CA Technologies Big Data Infrastructure Management Executive Summary CA Technologies recently exhibited new technology innovations, marking its entry into the Big Data marketplace with
SQL Server Virtualization
The Essential Guide to SQL Server Virtualization S p o n s o r e d b y Virtualization in the Enterprise Today most organizations understand the importance of implementing virtualization. Virtualization
RED HAT ENTERPRISE VIRTUALIZATION FOR SERVERS: COMPETITIVE FEATURES
RED HAT ENTERPRISE VIRTUALIZATION FOR SERVERS: COMPETITIVE FEATURES RED HAT ENTERPRISE VIRTUALIZATION FOR SERVERS Server virtualization offers tremendous benefits for enterprise IT organizations server
The Benefits of Virtualizing
T E C H N I C A L B R I E F The Benefits of Virtualizing Aciduisismodo Microsoft SQL Dolore Server Eolore in Dionseq Hitachi Storage Uatummy Environments Odolorem Vel Leveraging Microsoft Hyper-V By Heidi
Veritas Cluster Server from Symantec
Delivers high availability and disaster recovery for your critical applications Data Sheet: High Availability Overview protects your most important applications from planned and unplanned downtime. Cluster
Simplifying Server Workload Migrations
Simplifying Server Workload Migrations This document describes the migration challenges that organizations face and how the Acronis AnyData Engine supports physical-to-physical (P2P), physical-to-virtual
Hadoop as a Service. VMware vcloud Automation Center & Big Data Extension
Hadoop as a Service VMware vcloud Automation Center & Big Data Extension Table of Contents 1. Introduction... 2 1.1 How it works... 2 2. System Pre-requisites... 2 3. Set up... 2 3.1 Request the Service
Deploying Hadoop with Manager
Deploying Hadoop with Manager SUSE Big Data Made Easier Peter Linnell / Sales Engineer [email protected] Alejandro Bonilla / Sales Engineer [email protected] 2 Hadoop Core Components 3 Typical Hadoop Distribution
Red Hat Satellite Management and automation of your Red Hat Enterprise Linux environment
Red Hat Satellite Management and automation of your Red Hat Enterprise Linux environment WHAT IS IT? Red Hat Satellite server is an easy-to-use, advanced systems management platform for your Linux infrastructure.
Red Hat Network Satellite Management and automation of your Red Hat Enterprise Linux environment
Red Hat Network Satellite Management and automation of your Red Hat Enterprise Linux environment WHAT IS IT? Red Hat Network (RHN) Satellite server is an easy-to-use, advanced systems management platform
Why is the V3 appliance so effective as a physical desktop replacement?
V3 Appliance FAQ Why is the V3 appliance so effective as a physical desktop replacement? The V3 appliance leverages local solid-state storage in the appliance. This design allows V3 to dramatically reduce
Cloud Optimize Your IT
Cloud Optimize Your IT Windows Server 2012 The information contained in this presentation relates to a pre-release product which may be substantially modified before it is commercially released. This pre-release
Building the Virtual Information Infrastructure
Technology Concepts and Business Considerations Abstract A virtual information infrastructure allows organizations to make the most of their data center environment by sharing computing, network, and storage
Apache Hadoop. Alexandru Costan
1 Apache Hadoop Alexandru Costan Big Data Landscape No one-size-fits-all solution: SQL, NoSQL, MapReduce, No standard, except Hadoop 2 Outline What is Hadoop? Who uses it? Architecture HDFS MapReduce Open
How Customers Are Cutting Costs and Building Value with Microsoft Virtualization
How Customers Are Cutting Costs and Building Value with Microsoft Virtualization Introduction The majority of organizations are incorporating virtualization into their IT infrastructures because of the
HADOOP SOLUTION USING EMC ISILON AND CLOUDERA ENTERPRISE Efficient, Flexible In-Place Hadoop Analytics
HADOOP SOLUTION USING EMC ISILON AND CLOUDERA ENTERPRISE Efficient, Flexible In-Place Hadoop Analytics ESSENTIALS EMC ISILON Use the industry's first and only scale-out NAS solution with native Hadoop
Understanding Data Locality in VMware Virtual SAN
Understanding Data Locality in VMware Virtual SAN July 2014 Edition T E C H N I C A L M A R K E T I N G D O C U M E N T A T I O N Table of Contents Introduction... 2 Virtual SAN Design Goals... 3 Data
How To Connect Virtual Fibre Channel To A Virtual Box On A Hyperv Virtual Machine
Virtual Fibre Channel for Hyper-V Virtual Fibre Channel for Hyper-V, a new technology available in Microsoft Windows Server 2012, allows direct access to Fibre Channel (FC) shared storage by multiple guest
MaxDeploy Hyper- Converged Reference Architecture Solution Brief
MaxDeploy Hyper- Converged Reference Architecture Solution Brief MaxDeploy Reference Architecture solutions are configured and tested for support with Maxta software- defined storage and with industry
Power Management for Server Virtualization
Power Management for Server Virtualization Strategies for preserving business continuity and data integrity in virtualized data centers By Hervé Tardy Vice President and General Manager Distributed Power
EMC Virtual Infrastructure for Microsoft Applications Data Center Solution
EMC Virtual Infrastructure for Microsoft Applications Data Center Solution Enabled by EMC Symmetrix V-Max and Reference Architecture EMC Global Solutions Copyright and Trademark Information Copyright 2009
Proact whitepaper on Big Data
Proact whitepaper on Big Data Summary Big Data is not a definite term. Even if it sounds like just another buzz word, it manifests some interesting opportunities for organisations with the skill, resources
Citrix XenServer Industry-leading open source platform for cost-effective cloud, server and desktop virtualization. citrix.com
Citrix XenServer Industry-leading open source platform for cost-effective cloud, server and desktop virtualization. 2 While the core server virtualization market has matured, virtualization itself is seeing
Windows Server 2008 R2 Hyper-V Live Migration
Windows Server 2008 R2 Hyper-V Live Migration Table of Contents Overview of Windows Server 2008 R2 Hyper-V Features... 3 Dynamic VM storage... 3 Enhanced Processor Support... 3 Enhanced Networking Support...
Managing Application Performance and Availability in a Virtual Environment
The recognized leader in proven and affordable load balancing and application delivery solutions White Paper Managing Application Performance and Availability in a Virtual Environment by James Puchbauer
Hadoop implementation of MapReduce computational model. Ján Vaňo
Hadoop implementation of MapReduce computational model Ján Vaňo What is MapReduce? A computational model published in a paper by Google in 2004 Based on distributed computation Complements Google s distributed
EMC PERFORMANCE OPTIMIZATION FOR MICROSOFT FAST SEARCH SERVER 2010 FOR SHAREPOINT
Reference Architecture EMC PERFORMANCE OPTIMIZATION FOR MICROSOFT FAST SEARCH SERVER 2010 FOR SHAREPOINT Optimize scalability and performance of FAST Search Server 2010 for SharePoint Validate virtualization
Hadoop: Embracing future hardware
Hadoop: Embracing future hardware Suresh Srinivas @suresh_m_s Page 1 About Me Architect & Founder at Hortonworks Long time Apache Hadoop committer and PMC member Designed and developed many key Hadoop
The Virtualization Practice
The Virtualization Practice White Paper: Managing Applications in Docker Containers Bernd Harzog Analyst Virtualization and Cloud Performance Management October 2014 Abstract Docker has captured the attention
Dell and Nutanix deliver a Cost Effective Web-scale Hyper-converged Appliance with Microsoft Private Cloud
Dell and Nutanix deliver a Cost Effective Web-scale Hyper-converged Appliance with Microsoft Private Cloud Published by: Value Prism Consulting Sponsored by: Microsoft Corporation Publish date: May 2015
Whitepaper. NexentaConnect for VMware Virtual SAN. Full Featured File services for Virtual SAN
Whitepaper NexentaConnect for VMware Virtual SAN Full Featured File services for Virtual SAN Table of Contents Introduction... 1 Next Generation Storage and Compute... 1 VMware Virtual SAN... 2 Highlights
Cloud Server. Parallels. An Introduction to Operating System Virtualization and Parallels Cloud Server. White Paper. www.parallels.
Parallels Cloud Server White Paper An Introduction to Operating System Virtualization and Parallels Cloud Server www.parallels.com Table of Contents Introduction... 3 Hardware Virtualization... 3 Operating
Microsoft SharePoint 2010 on VMware Availability and Recovery Options. Microsoft SharePoint 2010 on VMware Availability and Recovery Options
This product is protected by U.S. and international copyright and intellectual property laws. This product is covered by one or more patents listed at http://www.vmware.com/download/patents.html. VMware
Why Choose VMware vsphere for Desktop Virtualization? WHITE PAPER
Why Choose VMware vsphere for Desktop Virtualization? WHITE PAPER Table of Contents Thin, Legacy-Free, Purpose-Built Hypervisor.... 3 More Secure with Smaller Footprint.... 4 Less Downtime Caused by Patches...
Ubuntu and Hadoop: the perfect match
WHITE PAPER Ubuntu and Hadoop: the perfect match February 2012 Copyright Canonical 2012 www.canonical.com Executive introduction In many fields of IT, there are always stand-out technologies. This is definitely
EMC BACKUP-AS-A-SERVICE
Reference Architecture EMC BACKUP-AS-A-SERVICE EMC AVAMAR, EMC DATA PROTECTION ADVISOR, AND EMC HOMEBASE Deliver backup services for cloud and traditional hosted environments Reduce storage space and increase
Big data management with IBM General Parallel File System
Big data management with IBM General Parallel File System Optimize storage management and boost your return on investment Highlights Handles the explosive growth of structured and unstructured data Offers
Cisco Application-Centric Infrastructure (ACI) and Linux Containers
White Paper Cisco Application-Centric Infrastructure (ACI) and Linux Containers What You Will Learn Linux containers are quickly gaining traction as a new way of building, deploying, and managing applications
Increasing Storage Performance, Reducing Cost and Simplifying Management for VDI Deployments
Increasing Storage Performance, Reducing Cost and Simplifying Management for VDI Deployments Table of Contents Introduction.......................................3 Benefits of VDI.....................................4
Hadoop on OpenStack Cloud. Dmitry Mescheryakov Software Engineer, @MirantisIT
Hadoop on OpenStack Cloud Dmitry Mescheryakov Software Engineer, @MirantisIT Agenda OpenStack Sahara Demo Hadoop Performance on Cloud Conclusion OpenStack Open source cloud computing platform 17,209 commits
The Production Cloud
The Production Cloud The cloud is not just for backup storage, development projects and other low-risk applications. In this document, we look at the characteristics of a public cloud environment that
Benchmarking Hadoop & HBase on Violin
Technical White Paper Report Technical Report Benchmarking Hadoop & HBase on Violin Harnessing Big Data Analytics at the Speed of Memory Version 1.0 Abstract The purpose of benchmarking is to show advantages
CA Cloud Overview Benefits of the Hyper-V Cloud
Benefits of the Hyper-V Cloud For more information, please contact: Email: [email protected] Ph: 888-821-7888 Canadian Web Hosting (www.canadianwebhosting.com) is an independent company, hereinafter
Elasticsearch on Cisco Unified Computing System: Optimizing your UCS infrastructure for Elasticsearch s analytics software stack
Elasticsearch on Cisco Unified Computing System: Optimizing your UCS infrastructure for Elasticsearch s analytics software stack HIGHLIGHTS Real-Time Results Elasticsearch on Cisco UCS enables a deeper
Big Data and Apache Hadoop Adoption:
Expert Reference Series of White Papers Big Data and Apache Hadoop Adoption: Key Challenges and Rewards 1-800-COURSES www.globalknowledge.com Big Data and Apache Hadoop Adoption: Key Challenges and Rewards
Nutanix Solution Note
Nutanix Solution Note Version 1.0 April 2015 2 Copyright 2015 Nutanix, Inc. All rights reserved. This product is protected by U.S. and international copyright and intellectual property laws. Nutanix is
With Red Hat Enterprise Virtualization, you can: Take advantage of existing people skills and investments
RED HAT ENTERPRISE VIRTUALIZATION DATASHEET RED HAT ENTERPRISE VIRTUALIZATION AT A GLANCE Provides a complete end-toend enterprise virtualization solution for servers and desktop Provides an on-ramp to
Intro to Virtualization
Cloud@Ceid Seminars Intro to Virtualization Christos Alexakos Computer Engineer, MSc, PhD C. Sysadmin at Pattern Recognition Lab 1 st Seminar 19/3/2014 Contents What is virtualization How it works Hypervisor
An Oracle White Paper August 2011. Oracle VM 3: Server Pool Deployment Planning Considerations for Scalability and Availability
An Oracle White Paper August 2011 Oracle VM 3: Server Pool Deployment Planning Considerations for Scalability and Availability Note This whitepaper discusses a number of considerations to be made when
High-Availability Fault Tolerant Computing for Remote and Branch Offices HA/FT solutions for Cisco UCS E-Series servers and VMware vsphere
Table of Contents UCS E-Series Availability and Fault Tolerance... 3 Solid hardware... 3 Consistent management... 3 VMware vsphere HA and FT... 3 Storage High Availability and Fault Tolerance... 4 Quick-start
Enabling Database-as-a-Service (DBaaS) within Enterprises or Cloud Offerings
Solution Brief Enabling Database-as-a-Service (DBaaS) within Enterprises or Cloud Offerings Introduction Accelerating time to market, increasing IT agility to enable business strategies, and improving
Cisco Unified Data Center
Solution Overview Cisco Unified Data Center Simplified, Efficient, and Agile Infrastructure for the Data Center What You Will Learn The data center is critical to the way that IT generates and delivers
Business Process Desktop: Acronis backup & Recovery 11.5 Deployment Guide
WHITE Deployment PAPERGuide Business Process Desktop: Acronis backup & Recovery 11.5 Deployment Guide An Acronis White Paper Copyright Acronis, Inc., 2000 2011 Deployment Guide Table of contents About
... ... PEPPERDATA OVERVIEW AND DIFFERENTIATORS ... ... ... ... ...
..................................... WHITEPAPER PEPPERDATA OVERVIEW AND DIFFERENTIATORS INTRODUCTION Prospective customers will often pose the question, How is Pepperdata different from tools like Ganglia,
HRG Assessment: Stratus everrun Enterprise
HRG Assessment: Stratus everrun Enterprise Today IT executive decision makers and their technology recommenders are faced with escalating demands for more effective technology based solutions while at
Symantec NetBackup 7.1 What s New and Version Comparison Matrix
Symantec 7.1 What s New and Version Comparison Matrix Symantec 7 allows customers to standardize backup and recovery operations across physical and virtual environments with fewer resources and less risk
Symantec Cluster Server powered by Veritas
Delivers high availability and disaster recovery for your critical applications Data Sheet: High Availability Overview protects your most important applications from planned and unplanned downtime. Cluster
Protecting Big Data Data Protection Solutions for the Business Data Lake
White Paper Protecting Big Data Data Protection Solutions for the Business Data Lake Abstract Big Data use cases are maturing and customers are using Big Data to improve top and bottom line revenues. With
Enabling High performance Big Data platform with RDMA
Enabling High performance Big Data platform with RDMA Tong Liu HPC Advisory Council Oct 7 th, 2014 Shortcomings of Hadoop Administration tooling Performance Reliability SQL support Backup and recovery
VMware vsphere 5.0 Boot Camp
VMware vsphere 5.0 Boot Camp This powerful 5-day 10hr/day class is an intensive introduction to VMware vsphere 5.0 including VMware ESX 5.0 and vcenter. Assuming no prior virtualization experience, this
Virtualization Essentials
Virtualization Essentials Table of Contents Introduction What is Virtualization?.... 3 How Does Virtualization Work?... 4 Chapter 1 Delivering Real Business Benefits.... 5 Reduced Complexity....5 Dramatically
Welcome to the unit of Hadoop Fundamentals on Hadoop architecture. I will begin with a terminology review and then cover the major components
Welcome to the unit of Hadoop Fundamentals on Hadoop architecture. I will begin with a terminology review and then cover the major components of Hadoop. We will see what types of nodes can exist in a Hadoop
Real-time Protection for Hyper-V
1-888-674-9495 www.doubletake.com Real-time Protection for Hyper-V Real-Time Protection for Hyper-V Computer virtualization has come a long way in a very short time, triggered primarily by the rapid rate
WHITE PAPER: Egenera Cloud Suite for EMC VSPEX. The Proven Solution For Building Cloud Services
WHITE PAPER: Egenera Cloud Suite for EMC VSPEX The Proven Solution For Building Cloud Services Build, Manage and Protect Your Cloud with the VSPEX Certified Egenera Cloud Suite Today, organizations are
Benefits of Consolidating and Virtualizing Microsoft Exchange and SharePoint in a Private Cloud Environment
. The Radicati Group, Inc. 1900 Embarcadero Road, Suite 206 Palo Alto, CA 94303 Phone 650-322-8059 Fax 650-322-8061 http://www.radicati.com THE RADICATI GROUP, INC. Benefits of Consolidating and Virtualizing
Solving I/O Bottlenecks to Enable Superior Cloud Efficiency
WHITE PAPER Solving I/O Bottlenecks to Enable Superior Cloud Efficiency Overview...1 Mellanox I/O Virtualization Features and Benefits...2 Summary...6 Overview We already have 8 or even 16 cores on one
WHITE PAPER. www.fusionstorm.com. The Double-Edged Sword of Virtualization:
WHiTE PaPEr: Easing the Way to the cloud: 1 WHITE PAPER The Double-Edged Sword of Virtualization: Solutions and Strategies for minimizing the challenges and reaping the rewards of Disaster recovery in
Dell Reference Configuration for Hortonworks Data Platform
Dell Reference Configuration for Hortonworks Data Platform A Quick Reference Configuration Guide Armando Acosta Hadoop Product Manager Dell Revolutionary Cloud and Big Data Group Kris Applegate Solution
Veritas InfoScale Availability
Veritas InfoScale Availability Delivers high availability and disaster recovery for your critical applications Overview protects your most important applications from planned and unplanned downtime. InfoScale
Mesos: A Platform for Fine- Grained Resource Sharing in Data Centers (II)
UC BERKELEY Mesos: A Platform for Fine- Grained Resource Sharing in Data Centers (II) Anthony D. Joseph LASER Summer School September 2013 My Talks at LASER 2013 1. AMP Lab introduction 2. The Datacenter
New Features in PSP2 for SANsymphony -V10 Software-defined Storage Platform and DataCore Virtual SAN
New Features in PSP2 for SANsymphony -V10 Software-defined Storage Platform and DataCore Virtual SAN Updated: May 19, 2015 Contents Introduction... 1 Cloud Integration... 1 OpenStack Support... 1 Expanded
PARALLELS CLOUD SERVER
PARALLELS CLOUD SERVER An Introduction to Operating System Virtualization and Parallels Cloud Server 1 Table of Contents Introduction... 3 Hardware Virtualization... 3 Operating System Virtualization...
Top 10 Reasons to Virtualize VMware Zimbra Collaboration Server with VMware vsphere. white PAPER
Top 10 Reasons to Virtualize VMware Zimbra Collaboration Server with VMware vsphere white PAPER Email outages disrupt a company s ability to conduct business. Issues as diverse as scheduled downtime, human
