Reference Guide

WindSpring Data Management Technology (DMT)
Solving Today's Storage Optimization Challenges

September 2011
Table of Contents

The Enterprise and Mobile Storage Landscapes
Increased Storage Capacity with Optimized, Data-Specific Access
Data Traffic Drives Next-Generation Data Management Systems
Compression
Error Detection and Data Deduplication
Forward Error Correction and Erasure Codes
The Intelligent Compression Management Solution
Metadata
Block
CODEC
Error Detection
Error Correction
Security Fingerprinting
Compression Optimization
Dedupe Enhancement
Erasure Codes
Cloud Data
DMT Optimizes Compression
CODECs
Dedupe
Erasure Codes
Additional Information
The Enterprise and Mobile Storage Landscapes

The volume of digital information being created is skyrocketing as rich multimedia becomes ubiquitous, regulatory requirements force long-term retention of data, and the move to cloud computing brings more content to the edges of the network. IDC predicts that by 2020 nearly 40 million petabytes of data will be created annually, while available storage is predicted to grow to just over 20 million petabytes; over the same period, IDC expects enterprise storage to grow into a $50 billion industry. That digital storage gap is forcing enterprises and mobile operators to become more effective at managing the complexities of storing and retrieving data. Even with the rapidly decreasing cost per megabyte of storage, online storage remains one of the biggest expense elements in IT budgets today.

This Reference Guide describes the challenges that are driving the need for storage optimization in enterprise and mobile applications, and how WindSpring Data Management Technology (DMT) overcomes them.
Increased Storage Capacity with Optimized, Data-Specific Access

WindSpring DMT is an advanced, intelligent software compression management system that optimizes both data compression and compressed data management for enterprise and mobile applications. DMT is built on a flexible architecture that simplifies compression management and includes multiple selectable lossless CODECs for storage, backup and dedupe, delivering increased storage capacity and optimized, data-specific access. The DMT integrated data management suite also includes enhanced error detection, recovery and protection.

WindSpring DMT is application sensitive, employing real-time, storage-optimized and network-optimized CODECs based on policies or parametric automation. DMT's metadata provides block-level error detection and error correction using multiple algorithms. Built-in monitoring systems ensure optimal integration into mobile and enterprise deployments.

Data Traffic Drives Next-Generation Data Management Systems

The explosion of data communications traffic is challenging network providers in both response times and capacity. While managing this data is essential to modern data systems, not all data is the same; storage requirements vary widely depending on how the data is used. The application using the compression should be the key driver when implementing an effective compression management system:

- Real-time data residing on primary storage devices, including user files in Word, Excel, mail and IM, requires real-time access. This makes the speed at which a file is compressed and decompressed the most critical issue.
- Online systems require optimized communications that balance compression and speed, while addressing the limited resources of mobile devices.
- Backup and archival systems (stored company records and data required for regulatory compliance, for example) require maximum compression to reduce both CAPEX and OPEX, making file size a higher priority than speed.
- For embedded systems, the method used to access compressed data is the key driver.

These requirements make compression and data deduplication technologies essential to efficiently managing today's advanced storage systems.
Compression

Compression reduces the overall size of stored or transmitted data, typically using industry-standard CODECs such as LZO and GZIP. These compression algorithms ignore the type of data and the nature of the application whose data is being compressed. The method used to implement a CODEC, the type of CODEC used and the type of data being compressed (binary code or text, for example) all affect compression and decompression rates. Compression system optimization is achieved by employing deduplication and a balance of compression and speed, combined with the optimal CODEC for the way a particular application accesses its data.

Error Detection and Data Deduplication

Error or change detection is used for data deduplication and to determine whether changes have occurred in the original stored data. Error detection is based on algorithms that create a unique code, or fingerprint, that identifies the contents of a particular block of data. These algorithms produce checksum, CRC (cyclic redundancy check) or hash (e.g., SHA) values based on the contents of the data. For error checking, fingerprints are used to determine whether a change (presumably an error) has occurred in the dataset, whether it resides in primary, backup or archive storage. Data deduplication uses the same fingerprints to identify identical blocks of data, replacing new data with a pointer to the location of the original data.

The effectiveness of this deduplication scheme depends on the uniqueness of each fingerprint: if two different data blocks have the same fingerprint, a collision occurs. The probability of a collision depends on the algorithm used. SHA-384 provides the lowest collision probability, but requires the longest time to calculate and uses the most memory; CRC, on the other hand, has a high probability of collision but is fast and uses little memory.
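The CRC-versus-hash trade-off can be sketched in a few lines. This is an illustrative example only (DMT itself ships as C libraries), using Python's standard zlib and hashlib modules:

```python
import hashlib
import zlib

def fingerprint(block: bytes) -> dict:
    """Compute two fingerprints for a data block: a fast CRC32
    (cheap to compute, collision-prone) and a SHA-256 digest
    (slower, but collisions are practically impossible)."""
    return {
        "crc32": zlib.crc32(block) & 0xFFFFFFFF,
        "sha256": hashlib.sha256(block).hexdigest(),
    }

a = b"x" * 4096          # two identical 4 KB blocks...
b = b"x" * 4096
c = b"y" * 4096          # ...and one different block

assert fingerprint(a) == fingerprint(b)   # identical data -> identical fingerprints
assert fingerprint(a) != fingerprint(c)   # different data -> different fingerprints
```

Identical blocks always produce identical fingerprints; the design question is only how much collision risk is tolerable for a given speed and memory budget.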
Deduplication requires that hash values be stored per data block so that new block hash values can be compared against existing ones. Smaller block sizes require more hash values per file, increasing memory usage but yielding better deduplication. Variable-block deduplication achieves the best deduplication compression, but does so at the expense of memory and speed. Deduplication can occur as data is written to the primary storage system or as a post-process task running on the storage subsystem. The choice depends on the application: a real-time database focuses on speed, while an archival storage system focuses on maximum deduplication. Ultimately, optimizing deduplication systems demands a balance between speed and memory usage, driven by the applications and data usage involved.

Forward Error Correction and Erasure Codes

Compression and deduplication critically impact the reliability of data storage systems: the introduction of errors in a compressed backup file may result in substantial and unrecoverable loss of backup data, and the loss of a primary deduplicated block could cause all dependent files to be lost or corrupted. Error recovery is therefore essential in compressed data systems, whether they are based on CODECs, deduplication or both.

Erasure codes increase the reliability of data storage systems that use compression and data deduplication. While erasure codes make it possible for erased data to be recovered by storing additional metadata with the original data, they also require additional storage; the benefits of combining compression and deduplication with erasure codes are realized only if the total storage required is still less than that required for the original data. Unlike error-correcting codes, erasure codes assume the location of the error is known. By adjusting their parameters, erasure codes can provide varying levels of reliability and redundancy.
Erasure codes are generated using a number of different algorithms that affect the speed and effectiveness of recovery. The type of data dictates the priorities: the value placed on recovery requirements for a stored Web page is typically set at a lower threshold than for a Sarbanes-Oxley document set. The Reed-Solomon, Cauchy, Tornado, Raptor and Typhoon erasure code algorithms differ in the way their encoding and decoding matrices are generated.
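As a minimal illustration of the erasure principle (deliberately simpler than the matrix-based algorithms named above), a single XOR parity block can recover any one erased data block, precisely because an erasure, unlike a silent error, has a known location:

```python
def xor_parity(blocks):
    """Compute a parity block as the byte-wise XOR of all data blocks
    (all blocks are assumed to be the same length)."""
    parity = bytearray(len(blocks[0]))
    for block in blocks:
        for i, byte in enumerate(block):
            parity[i] ^= byte
    return bytes(parity)

def recover(blocks, parity, lost_index):
    """Recover one erased block: XOR the parity with every surviving
    block. XOR-ing a block with itself cancels it out, so only the
    lost block remains."""
    survivors = [b for i, b in enumerate(blocks) if i != lost_index]
    return xor_parity(survivors + [parity])

data = [b"blk0" * 4, b"blk1" * 4, b"blk2" * 4]
p = xor_parity(data)
assert recover(data, p, lost_index=1) == data[1]   # the erased block comes back
```

One parity block protects against one erasure; the Reed-Solomon-family codes generalize this to any configurable number of simultaneous erasures, at higher encoding cost.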
The Intelligent Compression Management Solution

DMT was designed specifically for storage management systems and architected to address the challenges that dominate the management of data in compressed data systems. DMT's standard C libraries enable storage management software to compress data from multiple sources using multiple CODECs, driven automatically or by policy, to multiple destinations. By providing direct data access and configurable block sizes, DMT gives storage software complete control over compressed data, whether it is located on primary or secondary storage. DMT also makes it possible for compression to be configured at the file or block level and, as part of its direct data access, includes metadata that enables the use of multiple industry-standard CODECs. DMT includes WindSpring's own QC0 CODEC, which enables byte-level access to compressed data without rehydration, as well as direct edit and search of compressed data.

DMT also includes metadata that allows the selection of multiple block- or file-level hashing algorithms such as SHA256 or CRC, so data deduplication can be handled using multiple levels of hash code matching. The reliability of compressed data is maintained with erasure codes that employ industry-standard libraries and a choice of erasure code algorithms. DMT is cross-platform compatible, with standard C/POSIX library interfaces for systems based on Windows, Linux and most embedded operating systems.

Metadata

DMT manages another critical aspect of compression management (the application's interaction with the compressed data) using metadata that is included in every compressed file, regardless of CODEC. By managing this metadata, DMT enables applications to directly access the data at the block, sub-block or byte level, as determined by the selected CODEC, without decompressing the file. The metadata is completely configurable to address all critical file data.

Block

The file compression block size can be set from 4 KB to 1 MB.
Smaller block sizes may result in faster access speeds, but may not optimize compression. Larger block sizes increase compression but, depending on the access pattern, may result in lower performance. Because access patterns are critical in determining the correct block size, the performance analysis tools within DMT execute real-time analysis of file access patterns, making it possible to optimize the selected parameters.
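The block-size trade-off can be observed directly with any block-oriented codec. This sketch (illustrative only, using Python's zlib rather than a DMT CODEC) compresses the same data as independent 4 KB and 64 KB blocks, as a block-addressable store must:

```python
import zlib

def blockwise_compressed_size(data: bytes, block_size: int) -> int:
    """Compress each block independently and return the total
    compressed size. Independent blocks are what make random
    access possible, at some cost in compression ratio."""
    return sum(
        len(zlib.compress(data[i:i + block_size]))
        for i in range(0, len(data), block_size)
    )

data = b"a repetitive sample payload " * 10000   # ~280 KB of compressible text
small = blockwise_compressed_size(data, 4 * 1024)
large = blockwise_compressed_size(data, 64 * 1024)
assert large <= small   # larger blocks give the codec more context, so better compression
```

Smaller blocks mean less data to decompress per random read, but each block carries its own codec overhead and sees less redundancy, which is exactly the trade-off the paragraph above describes.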
CODEC

Because the optimal CODEC for one region of a file with mixed data types may be completely wrong for another region of the same file, DMT also makes it possible for the application to select the CODEC type on a block-by-block basis. For example, a database file may contain textual data for indexes alongside embedded pictures and videos stored as objects. At the file level, the CODEC can be selected by a policy contained in a configuration file; at the block level, CODEC selection can be automated by setting API parameters.

Security Fingerprinting

The block-level metadata provides a fingerprint of the data in the file. Combinations of the CRC, CRC+metadata and source or compressed hash values allow security systems to calculate a unique identity for each file.

Error Detection

Compression can affect the reliability of compressed data in backup systems, with the error rate multiplied by at least the compression ratio. Because error detection needs to be relevant to the data type, DMT enables the error detection methods to be selected for both the blocks and the overall file data, and the codes can be recorded for both uncompressed and compressed data. For data deduplication systems, hash calculations are determined by the final deduplication architecture and can be included at the file or block level. Deduplication systems can use CRC, CRC+metadata or hash values for block matching; CRC and CRC+ can be included by default, and DMT can include either source (uncompressed) or compressed hash values.

Error Correction

For high-reliability data systems, erasure codes ensure that files with errors can be recovered. Erasure codes are generated by different algorithms, each with different characteristics. With DMT, the file-level erasure code algorithm is selected using the file metadata.

Compression Optimization

DMT's metadata allows direct access to compressed data at the block and sub-block level. Working at the block level, DMT accelerates search and retrieval, while its multiple CODECs manage storage by dynamically selecting the best compression system for the data type in use. Data compression can be optimized for access speed, compression rate or a balance of the two. CODEC selection can be based on policy, with compression selected by file type, or it can be automatic, with block-level API control of the data CODEC and decision metrics. DMT provides simple interfaces, including file-by-file and directory compression, extensive APIs for application-level control and standard POSIX file I/O of compressed and rehydrated data.
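Per-block, data-driven codec selection might look like the following sketch. This is not DMT's API: the `looks_textual` heuristic and the zlib compression levels are hypothetical stand-ins for a real policy table keyed by file type or block content.

```python
import zlib

def looks_textual(block: bytes, threshold: float = 0.85) -> bool:
    """Cheap data-type probe: the fraction of printable ASCII bytes
    (plus tab/newline/carriage return) in the block."""
    if not block:
        return True
    printable = sum(32 <= b < 127 or b in (9, 10, 13) for b in block)
    return printable / len(block) >= threshold

def encode_block(block: bytes):
    """Per-block selection sketch: text compresses well, so spend more
    CPU on it (zlib level 9); binary-looking data gets a fast, light
    pass (zlib level 1). A real system would record the choice in the
    block's metadata so the right decoder is used on read."""
    level = 9 if looks_textual(block) else 1
    return level, zlib.compress(block, level)

text = b"the quick brown fox " * 200
level, packed = encode_block(text)
assert level == 9
assert zlib.decompress(packed) == text     # lossless round trip
```

The same dispatch structure works whether the decision metric is a content probe, as here, or a policy entry looked up by file extension.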
Dedupe Enhancement

WindSpring DMT enhances deduplication systems by storing configurable metadata with the compressed data, optimized for speed, reliability or a balance of the two. For every block it encodes, DMT computes metadata related to either the original data or the encoded data, providing both block information and error detection. DMT has been integrated into both Opendedup and the Solaris ZFS system for compression and deduplication.

Erasure Codes

Files that are compressed with DMT include erasure coding at the file level, using the Jerasure library. Erasure codes can be selected from the options available in the Jerasure library, including Reed-Solomon, Cauchy, Liberation and Blaum-Roth. Other algorithms, such as Tornado, Raptor and Typhoon, can also be integrated with the appropriate licensing from the relevant patent holders. DMT maintains the reliability of compressed data with erasure codes that offer industry-standard libraries and a choice of erasure code algorithms.

Error detection algorithms are used extensively in deduplication systems to search for identical files, blocks or regions of data. These algorithms are based on checksums such as CRC16 (a 16-bit cyclic redundancy check). While DMT defaults to 16-bit CRC algorithms to check the encoded data, CRC32 and Adler32 are available options that provide stronger detection. Message digest algorithms are based on codes such as the MD and SHA families.

DMT's implementation of erasure codes is extensible. At the file level, DMT maintains maximum access speed while providing erasure code reliability: errors detected in the base compression data can be corrected using the embedded erasure codes. At the cloud level, erasure codes can be included with compressed chunk data packets, increasing the reliability of the overall system, while the operating system provides overall erasure code protection for a distributed file system.

In compression-only systems, both error detection and the speed of the algorithm are important; CRC16 and Adler32 are faster than CRC32 while still delivering effective levels of error detection. In data deduplication, the probability of a collision is the most important consideration: CRC16 and Adler32 have very high probabilities of collision, while CRC32 has a lower probability but is slower. In general, hash codes are required for final verification, but simpler algorithms can be used to eliminate candidates that will not match. As an intermediate step, DMT uses a combination of its CRC codes and other metadata to reduce the probability of a collision for CRC-based deduplication. DMT also allows the selection of multiple block- or file-level hashing algorithms, from SHA1 to SHA384; using these multiple levels of hash code matching, data deduplication is handled with ease.
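The cheap-prefilter-then-strong-verify pattern described above can be sketched as follows. `DedupeIndex` is a hypothetical illustration of the technique, not a DMT interface:

```python
import hashlib
import zlib

class DedupeIndex:
    """Two-stage duplicate detection: a fast CRC32 prefilter eliminates
    most non-matches cheaply, and SHA-256 gives final verification so a
    CRC collision can never cause a false deduplication."""

    def __init__(self):
        self.by_crc = {}   # crc32 -> list of (sha256 digest, block_id)

    def lookup_or_add(self, block: bytes, block_id: int) -> int:
        """Return the id of an existing identical block, or record this
        block under block_id and return block_id."""
        crc = zlib.crc32(block) & 0xFFFFFFFF
        candidates = self.by_crc.setdefault(crc, [])
        sha = None
        if candidates:                          # CRC hit: verify with strong hash
            sha = hashlib.sha256(block).digest()
            for known_sha, known_id in candidates:
                if known_sha == sha:
                    return known_id             # verified true duplicate
        # New block: the strong hash is computed lazily, only when a
        # CRC collision forces verification or the block is recorded.
        sha = sha or hashlib.sha256(block).digest()
        candidates.append((sha, block_id))
        return block_id

idx = DedupeIndex()
assert idx.lookup_or_add(b"hello" * 100, 0) == 0   # first occurrence
assert idx.lookup_or_add(b"world" * 100, 1) == 1   # unique block
assert idx.lookup_or_add(b"hello" * 100, 2) == 0   # duplicate maps to block 0
```

Most lookups never reach the SHA-256 stage: a CRC miss is proof the block is new, so the expensive hash is paid only on CRC hits and insertions.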
Cloud Data

DMT is written using standard C/POSIX-style APIs and can be integrated at the file, system or application level. That integration point drives the implementation of DMT applications in the cloud.

DMT Optimizes Compression

CODECs

DMT was tested in a standard test environment, using an i7/8 GB Nexenta appliance with an internal SATA drive and the modern, data-specific Silesia Corpus. This corpus is a mixture of six textual files (texts, XML, HTML and log data) and six binary files (executables, binary databases, images), totalling 250 MB, with individual file sizes ranging from 2 MB to 50 MB.

The chart at right illustrates the results of testing DMT's CODECs on the Silesia Corpus, highlighting the trade-off between encode speed, decode speed and effective compression. The chart's right axis shows the estimated effective size of a standard 1 TB drive after compression. While QC2 is clearly the fastest CODEC, its effective size is just over 2 TB; QC1 results in an effective size of more than 3.5 TB, but is poorly suited for real-time access.

When compared with standard CODECs in a straight decode operation, DMT excels again, driven by its block architecture: DMT is 20% faster than LZO, 50% faster than GZIP and 80% faster than LZMA. These figures do not take into account random-access performance, where DMT's direct access provides further improvements in speed. Actual results vary depending on data type.

Dedupe

WindSpring DMT's dedupe capabilities were also tested in a standard network test environment, using the Silesia Corpus on a Nexenta i7/8 GB appliance with an internal SATA drive configured with the Solaris OS, ZFS and a napp-it console. As illustrated in the charts below, source compression has strong downstream multipliers, so the time it takes to transfer and deduplicate DMT data is much less than for native or ZFS compression. The results are a combination of the effect of compression at the source, deduplication on a smaller (compressed) dataset and ZFS compression performance. Deduplication is very effective on DMT-compressed data:

- Time to copy/deduplicate DMT data is about 2x the time it takes to copy one dataset.
- Time to copy/deduplicate native data is nearly 3x the time it takes to copy one dataset.
- Time to copy/deduplicate ZFS-compressed data is nearly 2.5x the time it takes to copy one dataset.

Erasure Codes

The chart below shows the effect of two different checksum algorithms on CODEC speed. Two factors influence the overall impact: as the block size is reduced, the effect of the checksum algorithm increases; and as the speed of the CODEC increases, so does the effect of the checksum algorithm. Real-time systems demand fast CODECs, requiring both small block sizes and high speed. DMT allows the application to optimize the checksum algorithm at the file or block level, enabling speed and compression to be balanced for the desired system performance.

Additional Information

WindSpring DMT is a proven solution that can be easily integrated into enterprise and mobile storage deployments, providing optimized compression and a highly evolved compression management system. By making it possible to select the optimal lossless CODEC for each data type and application, DMT delivers increased storage capacity and optimized data-specific access, while DMT's integrated data management suite provides enhanced error detection, recovery and protection.

To learn more about WindSpring DMT, please visit the WindSpring website. If you would like to discuss how DMT can make a difference in your business, please contact WindSpring at info@windspring.com.
More informationBest Practices for Deploying Citrix XenDesktop on NexentaStor Open Storage
Best Practices for Deploying Citrix XenDesktop on NexentaStor Open Storage White Paper July, 2011 Deploying Citrix XenDesktop on NexentaStor Open Storage Table of Contents The Challenges of VDI Storage
More informationData Compression and Deduplication. LOC 2010 2010 Cisco Systems, Inc. All rights reserved.
Data Compression and Deduplication LOC 2010 2010 Systems, Inc. All rights reserved. 1 Data Redundancy Elimination Landscape VMWARE DeDE IBM DDE for Tank Solaris ZFS Hosts (Inline and Offline) MDS + Network
More informationReclaiming Primary Storage with Managed Server HSM
White Paper Reclaiming Primary Storage with Managed Server HSM November, 2013 RECLAIMING PRIMARY STORAGE According to Forrester Research Inc., the total amount of data warehoused by enterprises is doubling
More informationData Deduplication: An Essential Component of your Data Protection Strategy
WHITE PAPER: THE EVOLUTION OF DATA DEDUPLICATION Data Deduplication: An Essential Component of your Data Protection Strategy JULY 2010 Andy Brewerton CA TECHNOLOGIES RECOVERY MANAGEMENT AND DATA MODELLING
More informationSpeeding Up Cloud/Server Applications Using Flash Memory
Speeding Up Cloud/Server Applications Using Flash Memory Sudipta Sengupta Microsoft Research, Redmond, WA, USA Contains work that is joint with B. Debnath (Univ. of Minnesota) and J. Li (Microsoft Research,
More informationDisaster Recovery Strategies: Business Continuity through Remote Backup Replication
W H I T E P A P E R S O L U T I O N : D I S A S T E R R E C O V E R Y T E C H N O L O G Y : R E M O T E R E P L I C A T I O N Disaster Recovery Strategies: Business Continuity through Remote Backup Replication
More informationExaGrid Product Description. Cost-Effective Disk-Based Backup with Data Deduplication
ExaGrid Product Description Cost-Effective Disk-Based Backup with Data Deduplication 1 Contents Introduction... 3 Considerations When Examining Disk-Based Backup Approaches... 3 ExaGrid A Disk-Based Backup
More informationSawmill Log Analyzer Best Practices!! Page 1 of 6. Sawmill Log Analyzer Best Practices
Sawmill Log Analyzer Best Practices!! Page 1 of 6 Sawmill Log Analyzer Best Practices! Sawmill Log Analyzer Best Practices!! Page 2 of 6 This document describes best practices for the Sawmill universal
More informationIntroduction to VMware vsphere Data Protection TECHNICAL WHITE PAPER
Introduction to VMware vsphere Data Protection TECHNICAL WHITE PAPER Table of Contents Introduction.... 3 Architectural Overview... 3 Deployment and Configuration.... 5 Administration.... 5 Backup....
More information<Insert Picture Here> Refreshing Your Data Protection Environment with Next-Generation Architectures
1 Refreshing Your Data Protection Environment with Next-Generation Architectures Dale Rhine, Principal Sales Consultant Kelly Boeckman, Product Marketing Analyst Program Agenda Storage
More informationEvery organization has critical data that it can t live without. When a disaster strikes, how long can your business survive without access to its
DISASTER RECOVERY STRATEGIES: BUSINESS CONTINUITY THROUGH REMOTE BACKUP REPLICATION Every organization has critical data that it can t live without. When a disaster strikes, how long can your business
More informationSymantec Backup Appliances
Symantec Backup Appliances End-to-end Protection for your backup environment Stefan Redtzer Sales Manager Backup Appliances, Nordics 1 Today s IT Challenges: Why Better Backup is needed? Accelerated Data
More informationVodacom Managed Hosted Backups
Vodacom Managed Hosted Backups Robust Data Protection for your Business Critical Data Enterprise class Backup and Recovery and Data Management on Diverse Platforms Vodacom s Managed Hosted Backup offers
More informationData Deduplication HTBackup
Data Deduplication HTBackup HTBackup and it s Deduplication technology is touted as one of the best ways to manage today's explosive data growth. If you're new to the technology, these key facts will help
More informationDon t Get Duped By Dedupe or Dedupe Vendors
Don t Get Duped By Dedupe or Dedupe Vendors Whitepaper www.unitrends.com Don t Get Duped By Dedupe or Dedupe Vendors: Introducing Adaptive Deduplication The purpose of deduplication is to provide more
More information09'Linux Plumbers Conference
09'Linux Plumbers Conference Data de duplication Mingming Cao IBM Linux Technology Center cmm@us.ibm.com 2009 09 25 Current storage challenges Our world is facing data explosion. Data is growing in a amazing
More informationCisco WAAS Express. Product Overview. Cisco WAAS Express Benefits. The Cisco WAAS Express Advantage
Data Sheet Cisco WAAS Express Product Overview Organizations today face several unique WAN challenges: the need to provide employees with constant access to centrally located information at the corporate
More informationEMC Data Domain Boost for Oracle Recovery Manager (RMAN)
White Paper EMC Data Domain Boost for Oracle Recovery Manager (RMAN) Abstract EMC delivers Database Administrators (DBAs) complete control of Oracle backup, recovery, and offsite disaster recovery with
More informationRethinking Backup in a Virtualized World
Rethinking Backup in a Virtualized World 800-283-6387 Greg Church, Systems Consultant Server, Storage and Backup gchurch@datanetworks.com The Public Sector Experts o Enterprise solutions exclusively for
More informationEMC VNXe File Deduplication and Compression
White Paper EMC VNXe File Deduplication and Compression Overview Abstract This white paper describes EMC VNXe File Deduplication and Compression, a VNXe system feature that increases the efficiency with
More information(Formerly Double-Take Backup)
(Formerly Double-Take Backup) An up-to-the-minute copy of branch office data and applications can keep a bad day from getting worse. Double-Take RecoverNow for Windows (formerly known as Double-Take Backup)
More informationEight Considerations for Evaluating Disk-Based Backup Solutions
Eight Considerations for Evaluating Disk-Based Backup Solutions 1 Introduction The movement from tape-based to disk-based backup is well underway. Disk eliminates all the problems of tape backup. Backing
More informationHP StoreOnce D2D. Understanding the challenges associated with NetApp s deduplication. Business white paper
HP StoreOnce D2D Understanding the challenges associated with NetApp s deduplication Business white paper Table of contents Challenge #1: Primary deduplication: Understanding the tradeoffs...4 Not all
More informationSYMANTEC NETBACKUP APPLIANCE FAMILY OVERVIEW BROCHURE. When you can do it simply, you can do it all.
SYMANTEC NETBACKUP APPLIANCE FAMILY OVERVIEW BROCHURE When you can do it simply, you can do it all. SYMANTEC NETBACKUP APPLIANCES Symantec understands the shifting needs of the data center and offers NetBackup
More informationAvailability Digest. www.availabilitydigest.com. Data Deduplication February 2011
the Availability Digest Data Deduplication February 2011 What is Data Deduplication? Data deduplication is a technology that can reduce disk storage-capacity requirements and replication bandwidth requirements
More informationThe Modern Virtualized Data Center
WHITEPAPER The Modern Virtualized Data Center Data center resources have traditionally been underutilized while drawing enormous amounts of power and taking up valuable floorspace. Virtualization has been
More informationABOUT DISK BACKUP WITH DEDUPLICATION
Disk Backup with Data Deduplication ABOUT DISK BACKUP WITH DEDUPLICATION www.exagrid.com What appears to be simple & straightforward Built for Backup is often more complex & risky than you think. 2 Agenda
More informationEffective Planning and Use of TSM V6 Deduplication
Effective Planning and Use of IBM Tivoli Storage Manager V6 Deduplication 08/17/12 1.0 Authors: Jason Basler Dan Wolfe Page 1 of 42 Document Location This is a snapshot of an on-line document. Paper copies
More informationMaximize Your Virtual Environment Investment with EMC Avamar. Rob Emsley Senior Director, Product Marketing
1 Maximize Your Virtual Environment Investment with EMC Avamar Rob Emsley Senior Director, Product Marketing 2 Private Cloud is the Vision Virtualized Data Center Internal Cloud Trusted Flexible Control
More informationWHITE PAPER. How Deduplication Benefits Companies of All Sizes An Acronis White Paper
How Deduplication Benefits Companies of All Sizes An Acronis White Paper Copyright Acronis, Inc., 2000 2009 Table of contents Executive Summary... 3 What is deduplication?... 4 File-level deduplication
More informationwww.basho.com Technical Overview Simple, Scalable, Object Storage Software
www.basho.com Technical Overview Simple, Scalable, Object Storage Software Table of Contents Table of Contents... 1 Introduction & Overview... 1 Architecture... 2 How it Works... 2 APIs and Interfaces...
More informationWOS Cloud. ddn.com. Personal Storage for the Enterprise. DDN Solution Brief
DDN Solution Brief Personal Storage for the Enterprise WOS Cloud Secure, Shared Drop-in File Access for Enterprise Users, Anytime and Anywhere 2011 DataDirect Networks. All Rights Reserved DDN WOS Cloud
More informationBackup and Recovery: The Benefits of Multiple Deduplication Policies
Backup and Recovery: The Benefits of Multiple Deduplication Policies NOTICE This White Paper may contain proprietary information protected by copyright. Information in this White Paper is subject to change
More informationCloud-integrated Storage What & Why
Cloud-integrated Storage What & Why Table of Contents Overview...3 CiS architecture...3 Enterprise-class storage platform...4 Enterprise tier 2 SAN storage...4 Activity-based storage tiering and data ranking...5
More informationEMC Backup and Recovery for Microsoft SQL Server 2008 Enabled by EMC Celerra Unified Storage
EMC Backup and Recovery for Microsoft SQL Server 2008 Enabled by EMC Celerra Unified Storage Applied Technology Abstract This white paper describes various backup and recovery solutions available for SQL
More informationCONFIGURATION GUIDELINES: EMC STORAGE FOR PHYSICAL SECURITY
White Paper CONFIGURATION GUIDELINES: EMC STORAGE FOR PHYSICAL SECURITY DVTel Latitude NVMS performance using EMC Isilon storage arrays Correct sizing for storage in a DVTel Latitude physical security
More informationTurbo Charge Your Data Protection Strategy
Turbo Charge Your Data Protection Strategy Data protection for the hybrid cloud 1 WAVES OF CHANGE! Data GROWTH User EXPECTATIONS Do It YOURSELF Can t Keep Up Reliability and Visibility New Choices and
More informationCreating a Cloud Backup Service. Deon George
Creating a Cloud Backup Service Deon George Agenda TSM Cloud Service features Cloud Service Customer, providing a internal backup service Internal Backup Cloud Service Service Provider, providing a backup
More informationOptimizing Backup and Data Protection in Virtualized Environments. January 2009
Optimizing Backup and Data Protection in Virtualized Environments January 2009 Introduction The promise of maximizing IT investments while minimizing complexity has resulted in widespread adoption of server
More informationUpdated November 30, 2010. Version 4.1
Updated November 30, 2010 Version 4.1 Table of Contents Introduction... 3 Replicator Performance and Scalability Features... 5 Replicator Multi-Engine Deployment... 7 Multi-Threaded Replication Queue Architecture...
More informationRose Business Technologies
Primary Storage Data Reduction Data reduction on primary storage is a reality today and with the unchecked growth of data, it will undoubtedly become a key part of storage efficiency. Standard in many
More informationDeduplication Best Practices With Microsoft Windows Server 2012 and Veeam Backup & Replication 6.5
Deduplication Best Practices With Microsoft Windows Server 2012 and Veeam Backup & Replication 6.5 Joep Piscaer, VMware vexpert, VCDX #101 j.piscaer@virtuallifestyle.nl @jpiscaer Agenda Introduction Use
More informationWhitepaper: Back Up SAP HANA and SUSE Linux Enterprise Server with SEP sesam. info@sepusa.com www.sepusa.com Copyright 2014 SEP
Whitepaper: Back Up SAP HANA and SUSE Linux Enterprise Server with SEP sesam info@sepusa.com www.sepusa.com Table of Contents INTRODUCTION AND OVERVIEW... 3 SOLUTION COMPONENTS... 4-5 SAP HANA... 6 SEP
More informationZFS Backup Platform. ZFS Backup Platform. Senior Systems Analyst TalkTalk Group. http://milek.blogspot.com. Robert Milkowski.
ZFS Backup Platform Senior Systems Analyst TalkTalk Group http://milek.blogspot.com The Problem Needed to add 100's new clients to backup But already run out of client licenses No spare capacity left (tapes,
More informationProtect Microsoft Exchange databases, achieve long-term data retention
Technical white paper Protect Microsoft Exchange databases, achieve long-term data retention HP StoreOnce Backup systems, HP StoreOnce Catalyst, and Symantec NetBackup OpenStorage Table of contents Introduction...
More informationIBM TSM DISASTER RECOVERY BEST PRACTICES WITH EMC DATA DOMAIN DEDUPLICATION STORAGE
White Paper IBM TSM DISASTER RECOVERY BEST PRACTICES WITH EMC DATA DOMAIN DEDUPLICATION STORAGE Abstract This white paper focuses on recovery of an IBM Tivoli Storage Manager (TSM) server and explores
More informationTandberg Data AccuVault RDX
Tandberg Data AccuVault RDX Binary Testing conducts an independent evaluation and performance test of Tandberg Data s latest small business backup appliance. Data backup is essential to their survival
More informationCloud-integrated Enterprise Storage. Cloud-integrated Storage What & Why. Marc Farley
Cloud-integrated Enterprise Storage Cloud-integrated Storage What & Why Marc Farley Table of Contents Overview... 3 CiS architecture... 3 Enterprise-class storage platform... 4 Enterprise tier 2 SAN storage...
More informationVMware vsphere Data Protection 6.1
VMware vsphere Data Protection 6.1 Technical Overview Revised August 10, 2015 Contents Introduction... 3 Architecture... 3 Deployment and Configuration... 5 Backup... 6 Application Backup... 6 Backup Data
More informationManaged File Transfer
Managed File Transfer How do most organizations move files today? FTP Typically File Transfer Protocol (FTP) is combined with writing and maintaining homegrown code to address its limitations Limited Reliability
More informationProtecting enterprise servers with StoreOnce and CommVault Simpana
Technical white paper Protecting enterprise servers with StoreOnce and CommVault Simpana HP StoreOnce Backup systems Table of contents Introduction 2 Technology overview 2 HP StoreOnce Backup systems key
More informationVeritas Backup Exec 15: Deduplication Option
Veritas Backup Exec 15: Deduplication Option Who should read this paper Technical White Papers are designed to introduce IT professionals to key technologies and technical concepts that are associated
More informationVeeam Backup & Replication for VMware
Veeam Backup & Replication for VMware Version 6.x Best Practices for Deployment & Configuration March, 2013 Tom Sightler Solutions Architect, Core Products Veeam Software 2013 Veeam Software. All rights
More informationConfiguring Backup Settings. Copyright 2009, Oracle. All rights reserved.
Configuring Backup Settings Objectives After completing this lesson, you should be able to: Use Enterprise Manager to configure backup settings Enable control file autobackup Configure backup destinations
More information