WHITE PAPER

Dedupe-Centric Storage

Hugo Patterson, Chief Architect, Data Domain
Deduplication Storage
September 2007
www.datadomain.com
Contents

Introduction
Deduplication: The Post-Snapshot Revolution in Storage
Building a Better Snapshot
Deduplication as a Point Solution
Dedupe as a Storage Fundamental
About Data Domain
People, by their nature, tend to build on what is already there to improve it, repurpose it, or just to keep it up to date. They also tend to share the product of their work with other people. Such activity, when applied to the digital world, generates multiple successive versions of files, multiple copies of those file versions and, in general, proliferates essentially similar data and so fills corporate data centers. And people like to keep all those versions and copies around because they serve different purposes, or protect against human or computer errors. Proliferating and preserving all these versions and copies drives much of the tremendous data growth most companies are experiencing. IT administrators are left to deal with the consequences.

Deduplication is the process of recognizing common elements in the many versions and copies of data and eliminating the redundant copies of those common elements. Deduplication reduces total data size and so simplifies the data management problem. There is less data to store, less to protect, less to replicate, less to index and search, and less to preserve for compliance. For IT administrators, this means there are fewer and/or smaller storage systems to manage, smaller data centers to run, fewer tapes to handle, fewer tape pickups, smaller network pipes, and cheaper WAN links. People tend to proliferate similar data, and deduplication makes it easier to manage the data when they do.

With effective deduplication, people can make and preserve all the versions and copies they'd like with less concern for storage space and cost. Because it directly addresses one of the key engines of data growth, deduplication should be at the heart of any data management strategy; it should be baked into the fundamental design of the system. When deduplication is done well, people can create, share, access, protect, and manage their data in new and easier ways, and IT administrators are not struggling to keep up.

Deduplication: The Post-Snapshot Revolution in Storage

The revolution to come with deduplication will eclipse the snapshot revolution of the early 1990s. Since at least the 1980s, systems had been saving versions of individual files or snapshots of whole disk volumes or file systems. Technologies existed for creating snapshots reasonably efficiently. But, in most production systems, snapshots were expensive, slow, or limited in number, and so had limited use and deployment.

About 15 years ago, Network Appliance (NetApp) leveraged the existing technologies and added some of their own innovations to build a file system that could create and present multiple snapshots of the entire system with little performance impact and without making a complete new copy of all the data for every snapshot. It did not use deduplication to find and eliminate redundant data, but at least it didn't duplicate unchanged data just to create a new logical view of the same data in a new snapshot. Even this was a significant advance.
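The core mechanism defined above, recognizing common elements across versions and copies and storing each only once, can be sketched in a few lines. This is a toy illustration under assumed names, not Data Domain's implementation; in particular, the fixed 4KB segmenting here is only the simplest possible scheme:

```python
import hashlib

class DedupeStore:
    """Toy illustration: store each unique segment once, keyed by fingerprint."""

    def __init__(self):
        self.segments = {}   # fingerprint -> segment bytes (stored once)
        self.files = {}      # filename -> ordered list of fingerprints ("recipe")

    def write(self, name, data, segment_size=4096):
        recipe = []
        for i in range(0, len(data), segment_size):
            seg = data[i:i + segment_size]
            fp = hashlib.sha256(seg).hexdigest()
            self.segments.setdefault(fp, seg)   # a duplicate segment is not stored again
            recipe.append(fp)
        self.files[name] = recipe

    def read(self, name):
        return b"".join(self.segments[fp] for fp in self.files[name])

    def physical_size(self):
        return sum(len(s) for s in self.segments.values())
```

Writing the same data under two names stores one set of unique segments; the logical size can be many times the physical footprint, which is the leverage the paper describes.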
By not requiring a complete copy for each snapshot, it became much cheaper to store snapshots. And, by not moving the old version of data preserved in a snapshot just to store a new version of data, there was no performance impact to creating snapshots. The efficiency of their approach meant there was no reason not to create snapshots and keep a bunch around. Users responded by routinely creating several snapshots a day and often preserving some snapshots for several days.

The relative abundance of such snapshots revolutionized data protection. Users could browse snapshots and restore earlier versions of files to recover from mistakes immediately, all by themselves. Instead of being shut down to run backups, databases could be quiesced briefly to create a consistent snapshot and then freed to run again while backups proceeded in the background. Such consistent snapshots could also serve as starting points for database recovery in the event of a database corruption, thereby avoiding the need to go to tape for recovery.

Building a Better Snapshot

Despite the many benefits and commercial success of space- and performance-efficient snapshots, it was over a decade before the first competitors built comparable technology. Most competitors' snapshots still do not compare. Many added a snapshot feature that delivers space efficiency, but few matched the simplicity, performance, scalability, and ultimately the utility of NetApp's design. Why?

The answer is that creating and managing the multiple virtual views of a file system captured in snapshots is challenging. By far the easiest and most widely adopted approach is to read the old version of data and copy it to a safe place before writing new data. This is known as copy-on-write and works especially well for block storage systems. Unfortunately, the copy operation imposes a severe performance penalty because of the extra read and write operations it requires. But doing something more sophisticated, writing new data to a new location without the extra read and write, requires all new kinds of data structures and completely changes the model for how a storage system works. Further, block-level snapshots by themselves do not provide access to the file versions captured in snapshots; you need a file system that integrates those versions into the name space.

Faced with the choice of completely rebuilding their storage system or simply bolting on a copy-on-write snapshot, most vendors choose the easy path that will get a check-box snapshot feature to market soonest. Very few are willing to invest the years and dollars to start over for what they view as just a single feature. By not investing, their snapshots were doomed to be second rate: they don't cross the hurdle that makes snapshots easy, cheap, and most useful, and they don't deliver on the snapshot revolution.

Deduplication as a Point Solution

On its surface, deduplication is a simple concept: find and eliminate redundant copies of data. But computing systems are complex, supporting many different kinds of applications. And data sets are vast, putting a premium on scalability. Ideally, a deduplication technology would be able to find and eliminate redundant data no matter which application writes it, even if different applications have written the same data.
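The copy-on-write approach described above, reading the old block and copying it to a safe place before overwriting it, can be sketched as follows. This is a toy model with invented names, intended only to show where the extra read and write come from:

```python
class CowVolume:
    """Toy copy-on-write block volume (illustration only)."""

    def __init__(self, nblocks):
        self.blocks = [b"\x00"] * nblocks
        self.snapshots = []              # each snapshot: {block index: saved old contents}

    def snapshot(self):
        self.snapshots.append({})        # creating a snapshot is cheap...

    def write(self, i, data):
        for snap in self.snapshots:
            if i not in snap:            # ...but the first overwrite of a block after a
                snap[i] = self.blocks[i] # snapshot costs an extra read and an extra write
        self.blocks[i] = data

    def read_snapshot(self, n, i):
        # a block never saved aside is unchanged since snapshot n was taken
        return self.snapshots[n].get(i, self.blocks[i])
```

Every first overwrite after a snapshot pays the copy penalty; redirect-on-write designs avoid it, but, as the paper notes, only at the cost of rethinking the storage system's data structures.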
Deduplication should be effective and scalable so that it can find a small segment of matching data no matter where it is stored in a large system. Deduplication also needs to be efficient, or the memory and computing overhead of finding duplicates in a large system could negate any benefit. Finally, deduplication needs to be simple and automatic, or the management burden of scheduling, tuning, or generally managing the deduplication process could again negate any benefits of the data reduction. All of these requirements make it very challenging to build a storage system that actually delivers the full potential of deduplication.

Early efforts at deduplication did not attempt a comprehensive solution. One example is email systems that store only one copy of an attachment or a message even when it is sent to many recipients. Such email blasts are a common way for people to share their work, and early implementations created a separate copy of the email and attachment for every recipient. The first innovation was to store just a single copy on the server and maintain multiple references to it. More sophisticated systems might detect when the same attachment is forwarded to further recipients and create additional references to the same data instead of storing a new copy. Nevertheless, when users download attachments to their desktops, the corporate environment still ends up with many copies to store, manage, and protect. At the end of the day, an email system that eliminates duplicate attachments is a point solution. It helps that one application, but it doesn't keep many additional duplicates, even of the very same attachments, from proliferating through the environment.

The power of an efficient snapshot mechanism in a storage system is its generality. Databases could all build their own snapshot mechanism, but when the storage provides them, they don't have to. They and any other application can benefit from snapshots for efficient, and independent, data protection. NetApp was dedicated to building efficient snapshots so all the applications didn't have to. Can vendors who try to bolt on deduplication as merely a feature of their existing system ever deliver?

Dedupe as a Storage Fundamental

Data Domain is dedicated to building deduplication storage systems with a deduplication engine powerful enough to become a platform that general applications can leverage. That engine can find duplicates in data written by many different applications at every stage of the lifecycle of a file, can scale to store many hundreds of terabytes of data and find duplicates wherever they exist, can reduce data by a factor of 20 or more, can do so at high speed without huge memory or disk resources, and does so automatically while taking snapshots, and so requires a minimum of administrator attention.
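The single-instance email store described above can be sketched as a reference-counted attachment table. Hypothetical names throughout; real mail servers differ in many details:

```python
import hashlib

class MailStore:
    """Toy single-instance attachment store with reference counts."""

    def __init__(self):
        self.attachments = {}    # digest -> [payload, reference count]
        self.mailboxes = {}      # recipient -> list of attachment digests

    def deliver(self, recipients, payload):
        digest = hashlib.sha256(payload).hexdigest()
        entry = self.attachments.setdefault(digest, [payload, 0])
        entry[1] += len(recipients)       # one copy on the server, many references
        for user in recipients:
            self.mailboxes.setdefault(user, []).append(digest)

    def fetch(self, user, n):
        return self.attachments[self.mailboxes[user][n]][0]
```

As the paper argues, this helps the mail server but remains a point solution: the moment a recipient saves the attachment to a desktop or file share, the duplicate reappears outside the mail system's reach.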
Such an engine is only possible with an unconventional storage system architecture that breaks with traditional thinking. Here are some of the important features needed to build such an engine.

Variable-Length Duplicates

Not all data changes are exactly 4KB in size and 4KB aligned. Sometimes people just replace one word with a longer word. Such a simple, small replacement shifts all the rest of the data by some small amount. To a deduplication engine built on fixed-size blocks, the entire rest of the file would seem to be new, unique data even though it really contains lots of duplicates. Further, that file may exist elsewhere in the system, say in an email folder, at a random offset in a larger file. Again, to an engine organized around fixed blocks, the data would all seem to be new. Conventional storage systems, whether NAS or SAN, store fixed-size blocks. Such a system, attempting to bolt on deduplication as an afterthought, will only be able to look for identical fixed-size blocks. Clearly, such an approach will never be as effective, comprehensive, and general purpose as a system that can handle small replacements and recognize and eliminate duplicate data no matter where in a file it appears.

Local Compression

Deduplication across all the data stored in a system is necessary, but it should be complemented with local compression, which typically reduces data size by another factor of 2. This may not be as significant a factor as deduplication itself, but no system that takes data reduction seriously can afford to ignore it.

Format Agnostic

Data comes in many formats generated by many different applications. But embedded in those different formats is often the same duplicate data. A document may appear as an individual file generated by a word processor, or in an email, saved in a folder by an email reader, or in the database of the email server, or embedded in a backup image, or squirreled away by an archive application. A deduplication system that relies on parsing the formats generated by all these different applications can never be a general-purpose storage platform. There are too many such formats, and they change too quickly for a storage vendor to support them all. Even if they could, such an approach would end up handcuffing application writers trying to innovate on top of the platform. Until their new format is supported, they'd gain no benefit from the platform. They would be better off sticking with the same old formats and not creating anything new. Storage platforms should unleash creativity, not squelch it. Thus, the deduplication engine must be data agnostic and find and eliminate duplicates in data no matter how it is packaged and stored to the system.
Multi-Protocol

There are many standard protocols in use in storage systems today, from NFS and CIFS to blocks and VTL. For maximum flexibility, storage systems should support all these protocols, since different protocols are needed for different applications at the same time. User home directories may be in NAS. The Exchange server may need to run on blocks. And backups may prefer VTL. Over the course of its lifecycle, the same data may be stored with all of these protocols. A presentation that starts in a home directory may be emailed to a colleague and stored in blocks by the email server, then archived in NAS by an archive application, and backed up from the home directory, the email server, and the archive application to VTL. Deduplication should be able to find and eliminate redundant data no matter how it is stored.

CPU-Centric vs. Disk-Intensive Algorithms

Over the last two decades, CPU performance has increased 2,000,000x*. In that time, disk performance has only increased 11x* (*Seagate Technology Paper, Economies of Capacity and Speed, May 2004). Today, CPU performance is taking another leap with every doubling of the number of cores in a chip. Clearly, algorithms developed today for deduplication should leverage the growth in CPU performance instead of being tied to disk performance.

Some systems rely on a disk access to find every piece of duplicate data. In some systems, the disk access is to look up a segment fingerprint. Other systems can't find duplicate data except by reading the old data to compare it to the new. In either case, the rate at which duplicate data can be found and eliminated is bounded by the speed of these disk accesses. To go faster, such systems need to add more disks. But the whole idea of deduplication is to reduce storage, not grow it. The Data Domain SISL (Stream-Informed Segment Layout) technology does not have to rely on reading lots of data from disk to find duplicates; it organizes the segment fingerprints on disk in such a way that only a small number of accesses are needed to find thousands of duplicates. With SISL, backend disks only need to deliver a few megabytes of data to deduplicate hundreds of megabytes of incoming data.

Deduplicated Replication

Data is protected from disaster only when a copy of it is safely at a remote location. Replication has long been used for high-value data, but without deduplication, replication is too expensive for the other 90% of the data. Deduplication should happen immediately, inline, and its benefits applied to replication in real time, so the lag until data is safely off site is as small as possible. Only systems designed for deduplication can run fast enough to deduplicate and replicate data right away. Systems which have bolted on deduplication as merely a feature at best impose unnecessary delays in replication and at worst don't deliver the benefits of deduplicated replication at all.
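Deduplicated replication can be sketched as a fingerprint-first protocol: the source checks which segment fingerprints the destination already knows and ships bytes only for the rest. A minimal sketch with invented names; real WAN replication adds batching, ordering, and verification:

```python
import hashlib

def replicate(segments, remote_index, send_segment):
    """Ship only segments whose fingerprints are unknown at the remote site."""
    bytes_sent = 0
    for seg in segments:
        fp = hashlib.sha256(seg).hexdigest()
        if fp not in remote_index:       # duplicate segments cost no WAN bandwidth
            send_segment(fp, seg)
            remote_index[fp] = seg
            bytes_sent += len(seg)
    return bytes_sent
```

If most of a backup's segments already exist at the remote site, the WAN carries only the small unique remainder, which is what makes replicating the "other 90%" of data affordable.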
Deduplicated Snapshots

Snapshots are very helpful for capturing different versions of data, and deduplication can store all those versions much more compactly. Both are fundamental to simplifying data management. Yet many systems that are cobbled together as a set of features either can't create snapshots efficiently, or they can't deduplicate snapshots, so users need to be careful when they create snapshots so as not to lock duplicates in. Such restrictions and limitations complicate data management, not simplify it.

Conclusion

Deduplication has leverage across the storage infrastructure for reducing data, improving data protection and, in general, simplifying data management. But deduplication is hard to implement in a way that runs fast with low overhead across a full range of protocols and application environments. Storage system vendors who treat deduplication merely as a feature will check off a box on a feature list, but are likely to fall short of delivering the benefits deduplication promises.

About Data Domain

Data Domain is the leading provider of deduplication storage systems for disk backup and network-based disaster recovery. Over 1,000 companies worldwide have deployed Data Domain's market-leading protection storage systems to significantly reduce backup data volume, lower their backup costs, and simplify data recovery. Data Domain delivers the performance, reliability, and scalability to address the data protection needs of enterprises of all sizes. Data Domain's products integrate into existing customer infrastructures and are compatible with leading enterprise backup software products. To find out more about Data Domain, visit www.datadomain.com. Data Domain is headquartered at 2300 Central Expressway, Santa Clara, CA, and can be contacted by phone or by email at [email protected].

Copyright 2007 Data Domain, Inc. All Rights Reserved. Data Domain, Inc. believes information in this publication is accurate as of its publication date.
This publication could include technical inaccuracies or typographical errors. The information is subject to change without notice. Changes are periodically added to the information herein; these changes will be incorporated in new editions of the publication. Data Domain, Inc. may make improvements and/or changes in the product(s) and/or the program(s) described in this publication at any time. Reproduction of this publication without prior written permission is forbidden. The information in this publication is provided "as is." Data Domain, Inc. makes no representations or warranties of any kind with respect to the information in this publication, and specifically disclaims implied warranties of merchantability or fitness for a particular purpose. Data Domain and Global Compression are trademarks of Data Domain, Inc. All other brands, products, service names, trademarks, or registered service marks are used to identify the products or services of their respective owners. WP-DCS-0907
Data Domain
Dedupe-Centric Storage

Data Domain
2300 Central Expressway
Santa Clara, CA
866-WE-DDUPE
Using HP StoreOnce Backup systems for Oracle database backups
Technical white paper Using HP StoreOnce Backup systems for Oracle database backups Table of contents Introduction 2 Technology overview 2 HP StoreOnce Backup systems key features and benefits 2 HP StoreOnce
Protect Data... in the Cloud
QUASICOM Private Cloud Backups with ExaGrid Deduplication Disk Arrays Martin Lui Senior Solution Consultant Quasicom Systems Limited Protect Data...... in the Cloud 1 Mobile Computing Users work with their
EMC Disk Library with EMC Data Domain Deployment Scenario
EMC Disk Library with EMC Data Domain Deployment Scenario Best Practices Planning Abstract This white paper is an overview of the EMC Disk Library with EMC Data Domain deduplication storage system deployment
OmniCube. SimpliVity OmniCube and Multi Federation ROBO Reference Architecture. White Paper. Authors: Bob Gropman
OmniCube SimpliVity OmniCube and Multi Federation ROBO Reference Architecture White Paper Authors: Bob Gropman Date: April 13, 2015 SimpliVity and OmniCube are trademarks of SimpliVity Corporation. All
VMware Data Backup and Recovery Data Domain Deduplication Storage Best Practices Guide
White Paper VMware Data Backup and Recovery Data Domain Deduplication Storage Best Practices Guide Abstract VMware offers extraordinary benefits, but it can come at the cost of extra storage, backup resources
EMC DATA DOMAIN OVERVIEW. Copyright 2011 EMC Corporation. All rights reserved.
EMC DATA DOMAIN OVERVIEW 1 2 With Data Domain Deduplication Storage Systems, You Can WAN Retain longer Keep backups onsite longer with less disk for fast, reliable restores, and eliminate the use of tape
Using HP StoreOnce Backup Systems for NDMP backups with Symantec NetBackup
Technical white paper Using HP StoreOnce Backup Systems for NDMP backups with Symantec NetBackup Table of contents Executive summary... 2 Introduction... 2 What is NDMP?... 2 Technology overview... 3 HP
WHY SECURE MULTI-TENANCY WITH DATA DOMAIN SYSTEMS?
Why Data Domain Series WHY SECURE MULTI-TENANCY WITH DATA DOMAIN SYSTEMS? Why you should take the time to read this paper Provide data isolation by tenant (Secure logical data isolation for each tenant
Consolidate and Virtualize Your Windows Environment with NetApp and VMware
White Paper Consolidate and Virtualize Your Windows Environment with NetApp and VMware Sachin Chheda, NetApp and Gaetan Castelein, VMware October 2009 WP-7086-1009 TABLE OF CONTENTS 1 EXECUTIVE SUMMARY...
EMC Data Domain Boost for Oracle Recovery Manager (RMAN)
White Paper EMC Data Domain Boost for Oracle Recovery Manager (RMAN) Abstract EMC delivers Database Administrators (DBAs) complete control of Oracle backup, recovery, and offsite disaster recovery with
Universal Backup Device with
Universal Backup Device with Fibre Channel Disk to Disk Backup with Affordable Deduplication and Replication for IBM Power Systems Executive Overview Copyright (c)2015 Electronic Storage Corporation Universal
INCREASING EFFICIENCY WITH EASY AND COMPREHENSIVE STORAGE MANAGEMENT
INCREASING EFFICIENCY WITH EASY AND COMPREHENSIVE STORAGE MANAGEMENT UNPRECEDENTED OBSERVABILITY, COST-SAVING PERFORMANCE ACCELERATION, AND SUPERIOR DATA PROTECTION KEY FEATURES Unprecedented observability
Backup and Recovery: The Benefits of Multiple Deduplication Policies
Backup and Recovery: The Benefits of Multiple Deduplication Policies NOTICE This White Paper may contain proprietary information protected by copyright. Information in this White Paper is subject to change
ABOUT DISK BACKUP WITH DEDUPLICATION
Disk Backup with Data Deduplication ABOUT DISK BACKUP WITH DEDUPLICATION www.exagrid.com What appears to be simple & straightforward Built for Backup is often more complex & risky than you think. 2 Agenda
EMC Data de-duplication not ONLY for IBM i
EMC Data de-duplication not ONLY for IBM i Maciej Mianowski EMC BRS Advisory TC May 2011 1 EMC is a TECHNOLOGY company EMC s focus is IT Infrastructure 2 EMC Portfolio Information Security Authentica Network
Business Benefits of Data Footprint Reduction
Business Benefits of Data Footprint Reduction Why and how reducing your data footprint provides a positive benefit to your business and application service objectives By Greg Schulz Founder and Senior
Cloud, Appliance, or Software? How to Decide Which Backup Solution Is Best for Your Small or Midsize Organization.
WHITE PAPER: CLOUD, APPLIANCE, OR SOFTWARE?........................................ Cloud, Appliance, or Software? How to Decide Which Backup Solution Is Best for Your Small or Midsize Who should read
BEST PRACTICES FOR PROTECTING MICROSOFT EXCHANGE DATA
BEST PRACTICES FOR PROTECTING MICROSOFT EXCHANGE DATA Bill Webster September 25, 2003 VERITAS ARCHITECT NETWORK TABLE OF CONTENTS Introduction... 3 Exchange Data Protection Best Practices... 3 Application
EMC Integrated Infrastructure for VMware
EMC Integrated Infrastructure for VMware Enabled by Celerra Reference Architecture EMC Global Solutions Centers EMC Corporation Corporate Headquarters Hopkinton MA 01748-9103 1.508.435.1000 www.emc.com
Cost Effective Backup with Deduplication. Copyright 2009 EMC Corporation. All rights reserved.
Cost Effective Backup with Deduplication Agenda Today s Backup Challenges Benefits of Deduplication Source and Target Deduplication Introduction to EMC Backup Solutions Avamar, Disk Library, and NetWorker
EMC NETWORKER SNAPSHOT MANAGEMENT
White Paper Abstract This white paper describes the benefits of NetWorker Snapshot Management for EMC Arrays. It also explains the value of using EMC NetWorker for snapshots and backup. June 2013 Copyright
Long term retention and archiving the challenges and the solution
Long term retention and archiving the challenges and the solution NAME: Yoel Ben-Ari TITLE: VP Business Development, GH Israel 1 Archive Before Backup EMC recommended practice 2 1 Backup/recovery process
Business-Centric Storage FUJITSU Storage ETERNUS CS800 Data Protection Appliance
Intel Xeon processor Business-Centric Storage FUJITSU Storage ETERNUS CS800 Data Protection Appliance The easy solution for backup to disk with deduplication Intel Inside. Powerful Solution Outside. If
SLOW BACKUPS GOT YOU DOWN?
Why Data Domain Series SLOW BACKUPS GOT YOU DOWN? Why you should take the time to read this paper Speed up backups by 50% (Finish backups within backup windows with breathing room for data growth. With
SYMANTEC NETBACKUP APPLIANCE FAMILY OVERVIEW BROCHURE. When you can do it simply, you can do it all.
SYMANTEC NETBACKUP APPLIANCE FAMILY OVERVIEW BROCHURE When you can do it simply, you can do it all. SYMANTEC NETBACKUP APPLIANCES Symantec understands the shifting needs of the data center and offers NetBackup
EMC VNXe File Deduplication and Compression
White Paper EMC VNXe File Deduplication and Compression Overview Abstract This white paper describes EMC VNXe File Deduplication and Compression, a VNXe system feature that increases the efficiency with
Archiving, Backup, and Recovery for Complete the Promise of Virtualization
Archiving, Backup, and Recovery for Complete the Promise of Virtualization Unified information management for enterprise Windows environments The explosion of unstructured information It is estimated that
Symantec NetBackup OpenStorage Solutions Guide for Disk
Symantec NetBackup OpenStorage Solutions Guide for Disk UNIX, Windows, Linux Release 7.6 Symantec NetBackup OpenStorage Solutions Guide for Disk The software described in this book is furnished under a
Future-Proofed Backup For A Virtualized World!
! Future-Proofed Backup For A Virtualized World! Prepared by: Colm Keegan, Senior Analyst! Prepared: January 2014 Future-Proofed Backup For A Virtualized World Like death and taxes, growing backup windows
NETAPP WHITE PAPER USING A NETWORK APPLIANCE SAN WITH VMWARE INFRASTRUCTURE 3 TO FACILITATE SERVER AND STORAGE CONSOLIDATION
NETAPP WHITE PAPER USING A NETWORK APPLIANCE SAN WITH VMWARE INFRASTRUCTURE 3 TO FACILITATE SERVER AND STORAGE CONSOLIDATION Network Appliance, Inc. March 2007 TABLE OF CONTENTS 1 INTRODUCTION... 3 2 BACKGROUND...
Deduplication and Beyond: Optimizing Performance for Backup and Recovery
Beyond: Optimizing Gartner clients using deduplication for backups typically report seven times to 25 times the reductions (7:1 to 25:1) in the size of their data, and sometimes higher than 100:1 for file
EMC PERSPECTIVE. An EMC Perspective on Data De-Duplication for Backup
EMC PERSPECTIVE An EMC Perspective on Data De-Duplication for Backup Abstract This paper explores the factors that are driving the need for de-duplication and the benefits of data de-duplication as a feature
Maximize Your Virtual Environment Investment with EMC Avamar. Rob Emsley Senior Director, Product Marketing
1 Maximize Your Virtual Environment Investment with EMC Avamar Rob Emsley Senior Director, Product Marketing 2 Private Cloud is the Vision Virtualized Data Center Internal Cloud Trusted Flexible Control
Using Microsoft Active Directory (AD) with HA3969U in Windows Server
Using Microsoft Active Directory (AD) with HA3969U in Windows Server Application Note Abstract This application note describes how to use Microsoft Active Directory (AD) service with HA3969U systems in
Top Ten Questions. to Ask Your Primary Storage Provider About Their Data Efficiency. May 2014. Copyright 2014 Permabit Technology Corporation
Top Ten Questions to Ask Your Primary Storage Provider About Their Data Efficiency May 2014 Copyright 2014 Permabit Technology Corporation Introduction The value of data efficiency technologies, namely
Data Protection with IBM TotalStorage NAS and NSI Double- Take Data Replication Software
Data Protection with IBM TotalStorage NAS and NSI Double- Take Data Replication September 2002 IBM Storage Products Division Raleigh, NC http://www.storage.ibm.com Table of contents Introduction... 3 Key
WHITE PAPER THE BENEFITS OF CONTINUOUS DATA PROTECTION. SYMANTEC Backup Exec 10d Continuous Protection Server
WHITE PAPER THE BENEFITS OF CONTINUOUS DATA PROTECTION SYMANTEC Backup Exec 10d Continuous Protection Server 1 TABLE OF CONTENTS EXECUTIVE SUMMARY...3 Current Situation...3 The New Opportunity...3 The
EMC AVAMAR INTEGRATION WITH EMC DATA DOMAIN SYSTEMS
EMC AVAMAR INTEGRATION WITH EMC DATA DOMAIN SYSTEMS A Detailed Review ABSTRACT This white paper highlights integration features implemented in EMC Avamar with EMC Data Domain deduplication storage systems
IBM Tivoli Storage Manager Suite for Unified Recovery
IBM Tivoli Storage Manager Suite for Unified Recovery Comprehensive data protection software with a broad choice of licensing plans Highlights Optimize data protection for virtual servers, core applications
