Storage Switzerland White Paper Storage Infrastructures for Big Data Workflows
Sponsored by: Quantum. Prepared by: Eric Slack, Sr. Analyst. May 2012
Introduction

Big Data is a term used to describe data sets that have grown so large that traditional storage infrastructures are ineffective at capturing, managing, accessing and retaining them in an acceptable time frame. What separates Big Data from simply a large archive is the need to process these data sets, or to provide file access to multiple users, quickly. Some Big Data use cases involve analytics, the computer-based analysis of large numbers of relatively small data objects for the purpose of pulling business value from that information. Many of these involve files supporting transaction analysis or automated event processing, such as database or web analytics, which won't be addressed in this white paper. Instead, this paper will deal with another form of Big Data that supports file processing workflows, often sequential in nature, where large files are shared by knowledge workers to create digital products, support research and perform analysis to increase productivity. Also considered will be Big Data supporting large file analytics, in which files are shared by large, high performance compute clusters to support complex analysis and drive business decisions.

Big Data File Processing Workflows

Some of the industries using large file, Big Data sets in these two use cases include Media and Entertainment, Life Sciences, Healthcare, Defense/Intelligence and Oil and Gas. One of the things that makes this topic compelling for infrastructure suppliers is that extracting value from the collected data usually involves a time constraint: Big Data is often being ingested so quickly that it can't simply be put into a large, traditional backup repository or archive. In addition to data protection, these infrastructures must have the performance to provide fast access and throughput to satisfy users' just-in-time needs for files in a collaborative workflow.
Another compelling aspect comes from the word big. Very large data sets require some foresight in the design of their infrastructures, since petabytes of data can't easily be manipulated or moved to facilitate a major change in storage or data handling systems. Like a construction project, once the foundation has been poured, the building is very difficult, if not impossible, to move. Unknowns created by future requirements can mean risk, since structural modifications are very difficult, so Big Data systems must include flexibility in order to address those risks.
Infrastructure Requirements

The very nature of Big Data, the realities of its size and the analysis and workflows it must support, puts many demands on the storage infrastructure as well. Capacity and performance efficiency must be maintained in order to keep the costs of storing and handling such large amounts of data under control. Also, Big Data can include a record of events, such as surveillance video, or involve costly data acquisition, such as oil & gas seismic exploration, and may need to be kept for long periods of time, bringing a need for longevity of the storage system and long-term data integrity. These and other requirements of a Big Data storage infrastructure will be examined in the rest of this report. In addition, this white paper will look at Quantum's StorNext File System and Storage Management software and how it can form the core of a Big Data storage infrastructure that addresses these requirements.

Flexibility

One of the challenges that Big Data brings is the requirement to support many different data types, or at least to have that ability. Tasked with finding ways to pull business value out of a given data set, users are creating more ways to cross-reference those data. Mergers and acquisitions, as well as pure financial motives, may drive an organization to purchase storage or applications from one vendor one day and a different vendor in the future. This can mean combining data sets stored on systems from different vendors and sharing the resulting file pool with clients using a variety of applications running on different operating systems. Similarly, the applications used by the industries that generate Big Data are evolving, bringing new file types with them, even new platforms, and usually a need for more capacity. Advancements such as the use of 3D video in the Media & Entertainment industry and 3D images to be analyzed by geophysicists are examples.
When a new application is implemented it should be able to access the current file system, so that existing files can be processed while the current applications are still supported. Whether it's new applications, different file types, multiple storage platforms or something else entirely, the task of maintaining a Big Data storage infrastructure into a future where the requirements are largely unknown carries some significant risks. In order to reduce those risks, this infrastructure must be extremely flexible.

Heterogeneous Environments

The StorNext file system supports truly heterogeneous environments by connecting Linux, Windows, Mac, and even UNIX hosts to the same files via a SAN or LAN. This enables the widest possible compatibility between applications or users and the files
that they're required to process today. This flexibility also reduces the risk that a future application, data type or compute platform won't be supported. Big Data's volume may quickly outgrow existing storage, causing purchasing organizations to look for affordable capacity wherever they can. This can lead to the acquisition of storage systems from different manufacturers and a need to combine the capacity on diverse platforms. Ideally, the organization may want to explore many different storage options in order to keep up with Big Data's appetite for capacity. Unfortunately, its scope and scale don't lend themselves to nimbleness, and data migration is usually out of the question for data sets this large.

Storage Virtualization

StorNext has a virtualization layer that abstracts the physical location of storage, keeping those details hidden from users and applications on the front end and enabling capacity to be added on the back end transparently. This could be SAN arrays of high-performance tier-one disk, economical high-capacity arrays, or even storage systems or NAS filers that were decommissioned after the last refresh cycle. This kind of flexibility is ideal for staying abreast of the inevitable changes in applications, storage platforms and workflows that Big Data infrastructures will see. It also supports the continual need for affordable capacity and non-disruptive upgrades that a storage system will experience as it's kept active for decades. Big Data environments may also evolve. For example, a relatively simple file sharing workflow between clients using a single platform may grow into one that includes long-term archiving and data protection with multiple OSs. In these situations the file system infrastructure must support the additional data services needed.
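The storage virtualization described above can be illustrated in miniature. The sketch below is a toy model, not StorNext's actual interfaces; the class and pool names are invented for the example. The point it demonstrates is that clients resolve a logical path the same way before and after new back-end capacity is added.

```python
# Toy sketch of storage virtualization: clients see one namespace while
# capacity is added on the back end transparently. All names are
# hypothetical, invented for illustration only.

class VirtualNamespace:
    """Maps logical file paths to whichever backend pool holds them."""

    def __init__(self):
        self.pools = []       # backend storage systems, any vendor
        self.placement = {}   # logical path -> pool name

    def add_pool(self, name, capacity_tb):
        # New capacity appears without any client-visible change.
        self.pools.append({"name": name, "capacity_tb": capacity_tb})

    def write(self, path):
        # Place new files on the most recently added pool; a real system
        # would balance by utilization and tiering policy.
        pool = self.pools[-1]["name"]
        self.placement[path] = pool
        return pool

    def locate(self, path):
        # Clients resolve the same logical path regardless of backend.
        return self.placement[path]


ns = VirtualNamespace()
ns.add_pool("tier1-san", capacity_tb=100)
ns.write("/projects/filmA/scene1.dpx")
ns.add_pool("capacity-nas", capacity_tb=500)   # non-disruptive expansion
ns.write("/projects/filmA/scene2.dpx")
```

Note that files written before the expansion still resolve to their original pool; only the placement of new data changes, which is what lets older files be migrated (or not) on the system's own schedule.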
StorNext provides multiple technology choices for protecting data, such as tiering, archiving (to NAS and tape as well as bulk block storage), deduplication and replication for offsite data protection.

File Sharing and Collaboration

In addition to storage virtualization, a Big Data infrastructure must be able to support file sharing across operating systems on the front end and across storage systems on the back end. This can include workflows in which multiple users, doing diverse tasks on different platforms with different software, need to share the same files, often concurrently. A Big Data file system that excludes one application or platform can force manual workarounds to the data flow process. These sneaker-net types of solutions can result in reduced productivity and an increased potential for error or data loss. To prevent this, StorNext enables clients, regardless of OS, to access the same files. Currently, these include multiple variants of Windows and Linux, plus UNIX and Mac OS X.
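StorNext arbitrates concurrent access internally, but the coordination problem any shared file system must solve can be shown with plain POSIX advisory locks. This is a generic sketch of the problem, not StorNext's mechanism: two handles contend for the same file, and the second is refused until the first releases.

```python
# Generic illustration (POSIX-only) of coordinating concurrent access to
# a shared file with advisory locks. This is NOT how StorNext implements
# arbitration; it simply shows the contention a shared file system resolves.
import fcntl
import os
import tempfile

# Create a stand-in for a shared project file.
fd, path = tempfile.mkstemp()
os.write(fd, b"shared media file")
os.close(fd)

def try_exclusive_lock(fileobj):
    """Attempt a non-blocking exclusive lock; return True if acquired."""
    try:
        fcntl.flock(fileobj, fcntl.LOCK_EX | fcntl.LOCK_NB)
        return True
    except OSError:
        return False

writer = open(path, "r+b")   # first editor opens the file
other = open(path, "rb")     # a second editor tries concurrently

got_first = try_exclusive_lock(writer)    # first claimant succeeds
got_second = try_exclusive_lock(other)    # second is refused while held

fcntl.flock(writer, fcntl.LOCK_UN)        # first editor finishes
got_after_release = try_exclusive_lock(other)

writer.close()
other.close()
os.remove(path)
```

In a sequential workflow this hand-off is exactly the "as soon as one person is finished, another may need to start" pattern described below; the file system's job is to make it safe without manual copying.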
In sequential file processing use cases, like those used for video editing, productivity can be directly related to the storage system's ability to access and transfer a set of shared files quickly. Time is money, and as soon as one person is finished, another may need to start work on the same file or set of files. In environments where users are running applications like 3D editing on high definition video, the storage system may not have the horsepower to stream these very large files fast enough without dropping frames, or to support multiple workflows. Because StorNext enables performance of up to 90-95% of the underlying SAN or LAN infrastructure, it has built strong proof points of success in the Media & Entertainment industry, where these high throughput workflows are common.

Performance

Traditional NAS devices, which run CIFS and NFS protocols over an IP network, can be sufficient for regular types of files but may be inadequate in these kinds of high performance environments. As a SAN file system, StorNext provides Fibre Channel performance (often hundreds of MB/s per single stream) to server clients. Also, a StorNext LAN client protocol built for large block transfers through StorNext LAN gateways allows access to these same files over the LAN at near Gigabit Ethernet speeds, significantly faster than NAS storage systems running on IP-based protocols. StorNext helps maintain performance as the infrastructure grows and storage capacities expand by separating the file system metadata controller from the data movement function that provides access to storage archive tiers. By dividing this process across multiple dedicated servers, StorNext allows the system to maintain performance and better match changing workloads. StorNext's architecture enables another option, called the Distributed Data Mover (DDM), which can improve performance as well.
DDM offloads the data movement operation from the metadata controller to an alternate, dedicated compute engine. This allows for faster file retrievals during periods of heavy system activity and frees up cycles on the metadata controller which can be applied to other operations.

Scalability

Big Data environments can mean constant data growth and a requirement for the storage infrastructure to expand to what may seem like unlimited capacity, and to do so easily. In industries that deal with imagery or other visual data, resolution is continually increasing, driving file sizes up in the process. As an example, a typical 3D movie can involve multi-TB sized files. But the storage requirement for these projects doesn't just include the finished product; it may also include the copies made during the interim production steps. These must be made available to operators at multiple processing stations along the way and then archived after the project is complete.
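A back-of-envelope calculation makes the point that interim copies, not the finished product, dominate the capacity requirement. Every figure below is hypothetical, chosen only to show the shape of the arithmetic.

```python
# Illustrative capacity estimate for a production workflow. All numbers
# are invented for the example, not measurements from any real project.

finished_product_tb = 2.0       # the final deliverable
stations = 5                    # e.g. editorial, VFX, color, audio, mastering
copies_per_station = 3          # working versions kept while a station is active

interim_tb = finished_product_tb * stations * copies_per_station
total_tb = finished_product_tb + interim_tb

# With these assumptions the interim copies need 15x the capacity of the
# finished product, and all of it must be online during production.
```

Even modest assumptions about stations and working versions multiply the footprint by an order of magnitude, which is why the archive step at project completion matters so much.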
In addition to providing the storage capacity to support these Big Data applications, the infrastructure must also support extremely large numbers of files. This requires an expandable file space which can be laid across multiple physical storage devices and extended seamlessly as data grows. StorNext's global namespace can support file systems in the multiple-petabyte range and expand dynamically to file counts in the hundreds of millions.

Cost Control

Cost is always an issue when storing large data sets, especially ones that can grow almost without limit, and this is certainly true with Big Data. One large genomic sequencing vendor with PBs of data studied usage patterns and found that 40% of its data files had not been accessed in over eighteen months, and 60% in eight months. One way to minimize costs is to leverage policy-based tiering features which move only the files needed for current projects onto the fastest (and most expensive) storage areas, so they're accessible to support sequential workflow processes. Other methods include data reduction technologies, like deduplication, or using a high density, low cost recording medium like tape. With extremely large file sizes, as is common in satellite imaging or genomics applications, storage costs can be reduced by truncating files and storing only a portion on a high performance tier, with the remainder resident on an archive disk tier, or even tape. In this way, the user or application requesting the file can get started with this initial segment and have the rest of the file streamed up to the performance tier concurrently. Since StorNext controls both the file system and the data management function, it can accomplish this process transparently.

Tiered Storage

A tiered storage architecture is an effective strategy for creating affordable capacity and increasing the scalability of storage systems.
It enables higher capacity systems to be added, like arrays with multi-TB SATA drives, and can include tape as well. In order to integrate these different storage platforms into a common data pool, the Big Data storage infrastructure needs a mechanism that can move files between storage tiers based on predefined policies, while maintaining a single namespace for its users. This enables the system to keep the most active data sets on the highest performing storage assets and move the rest off to lower cost capacity or an archive. In Big Data environments that support file sharing workflows, a tiering mechanism can move the files associated with a project off to an archive tier when their access levels indicate the project is complete, saving that premium space. File movement policies are set at the directory level to accommodate scheduled workloads. StorNext's
distributed architecture helps maintain file system performance throughout these data movement operations.

Tape

These capacity tiers can also include tape, the most cost effective storage medium available. The current generation, LTO-5, can store over 1.5 TB per cartridge uncompressed, giving Big Data infrastructures the density to archive enormous numbers of very large files in a relatively small data center footprint. Compared to even the most cost effective disk arrays, the use of tape archives can translate into lower operational costs, since idle tapes draw no power and don't require cooling. In addition, tape makes an excellent deep archive tier, since it remains viable for longer periods of time than magnetic disk technology does.

Long-Term Viability

Big Data archives will often have to be stored for long periods of time, maybe indefinitely. Given the amount of money and other resources that can be put into applications like genome sequencing or feature films, Big Data storage infrastructures can represent a significant investment which will need to remain viable.

Data Protection

A Big Data infrastructure should provide data protection assurance so that this investment is appropriately cared for. Obviously, a traditional backup process can be impractical, since making weekly, or even daily, backup copies of large numbers of very large files could take too long. StorNext provides this protection with a process that makes file copies off of primary storage continuously. When file accesses have stopped, a data protection copy is made (actually, up to 4 copies) and moved to a secondary storage tier, like tape. Then, after a certain number of days, a second policy marks that primary copy as a candidate for truncation. The actual truncation of the file from the primary disk tier is carried out when capacity thresholds on primary storage are reached.
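The copy-then-truncate policy sequence can be sketched as a simple decision function. The threshold values and action names below are illustrative placeholders, not StorNext defaults; the structure is what matters: protection copies happen as soon as a file goes idle, but reclaiming primary disk waits for both an age policy and a capacity trigger.

```python
# Sketch of a tiered data-protection policy: copy on inactivity, mark for
# truncation after a holding period, truncate only under capacity pressure.
# Threshold values and action names are illustrative, not product defaults.

COPY_AFTER_IDLE_DAYS = 0        # copy as soon as accesses stop
TRUNCATE_CANDIDATE_DAYS = 30    # mark candidates after this many idle days
CAPACITY_THRESHOLD = 0.85       # truncate only when primary is this full

def protection_actions(idle_days, primary_utilization, copies_made):
    """Return the policy actions currently due for one file."""
    actions = []
    if idle_days > COPY_AFTER_IDLE_DAYS and copies_made == 0:
        actions.append("copy_to_archive")          # up to 4 copies possible
    if idle_days >= TRUNCATE_CANDIDATE_DAYS:
        actions.append("mark_truncation_candidate")
        if primary_utilization >= CAPACITY_THRESHOLD:
            actions.append("truncate_primary")     # archive copy remains
    return actions
```

Separating the age policy from the capacity trigger is the key design choice: a file can sit as a truncation candidate indefinitely, still serving reads at disk speed, until primary storage actually needs the space back.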
The result is at least one full-time, full-size copy of each file stored on long-term archive media as soon as that file has become inactive. StorNext maintains that copy on primary storage as long as possible and can restore it from the archive copy automatically when it is accessed again.

Data Integrity

StorNext provides a data integrity feature that embeds checksums into archived data to help maintain viability. Quantum's AEL Archive technology also includes tape integrity
checking that regularly tests each cartridge for wear, to confirm that its data can be reproduced reliably. When a piece of media is found to contain an excessive number of data errors, it can be automatically copied to a new cartridge so it won't degrade further and risk data loss or fail during use. Highly scalable storage products are coming out from new vendors on a regular basis, so the storage infrastructure must be able to support platform replacement if it becomes necessary. StorNext's comprehensive archive and data management functions can support the use of multiple vendors' platforms and the data handling this requires. From an industry perspective, StorNext has become a standard SAN file system solution in the Big Data space, with over 60,000 file system client deployments and 500 PB of data under license.

Summary

Big Data creates challenges for the infrastructure it's stored on and the people who manage it. Obviously, to support growth, capacity must be available, but so must performance, in order to meet the access requirements of users and applications. In use cases like those supporting file processing workflows, for example, throughput is essential to deliver large files to users in an acceptable time frame. The file system and archive infrastructure must provide this performance and capacity expansion without creating a burden on the IT staff. These infrastructures must also be able to maintain very large data sets for a very long time, and do so cost effectively. This means providing assurance that data integrity is maintained while the physical infrastructure is upgraded, updated and expanded. It also means design and operational efficiency, so that costs, especially future costs, are controlled. Big Data storage infrastructures must also be flexible, so that they can support multiple file types and client-side operating systems on the front end and multiple storage platforms on the back end.
In reality, these large infrastructures can eventually become a consolidation of many disparate storage devices as they accumulate more diverse data sets. The StorNext SAN and LAN File System and Storage Management suite is designed to meet these considerable requirements of Big Data today, to evolve to meet the requirements of tomorrow, and to do so cost effectively.

This white paper is sponsored by Quantum.
Extended Data Life Management:
TECHNOLOGY BRIEF Extended Data Life Management: Protecting Data Over Long Periods of Time NOTICE This Technology Brief may contain proprietary information protected by copyright. Information in this Technology
Selling Compellent NAS: File & Block Level in the Same System Chad Thibodeau
Selling Compellent NAS: File & Block Level in the Same System Chad Thibodeau Agenda Session Objectives Feature Overview Technology Overview Compellent Differentiators Competition Available Resources Questions
With DDN Big Data Storage
DDN Solution Brief Accelerate > ISR With DDN Big Data Storage The Way to Capture and Analyze the Growing Amount of Data Created by New Technologies 2012 DataDirect Networks. All Rights Reserved. The Big
Disaster Recovery Strategies: Business Continuity through Remote Backup Replication
W H I T E P A P E R S O L U T I O N : D I S A S T E R R E C O V E R Y T E C H N O L O G Y : R E M O T E R E P L I C A T I O N Disaster Recovery Strategies: Business Continuity through Remote Backup Replication
Building a Successful Strategy To Manage Data Growth
Building a Successful Strategy To Manage Data Growth Abstract In organizations that have requirements for a minimum of 30 terabytes to multiple petabytes of storage the go to technology for a successful
EMC ISILON OneFS OPERATING SYSTEM Powering scale-out storage for the new world of Big Data in the enterprise
EMC ISILON OneFS OPERATING SYSTEM Powering scale-out storage for the new world of Big Data in the enterprise ESSENTIALS Easy-to-use, single volume, single file system architecture Highly scalable with
EMC BACKUP MEETS BIG DATA
EMC BACKUP MEETS BIG DATA Strategies To Protect Greenplum, Isilon And Teradata Systems 1 Agenda Big Data: Overview, Backup and Recovery EMC Big Data Backup Strategy EMC Backup and Recovery Solutions for
Long term retention and archiving the challenges and the solution
Long term retention and archiving the challenges and the solution NAME: Yoel Ben-Ari TITLE: VP Business Development, GH Israel 1 Archive Before Backup EMC recommended practice 2 1 Backup/recovery process
Solution Brief: Creating Avid Project Archives
Solution Brief: Creating Avid Project Archives Marquis Project Parking running on a XenData Archive Server provides Fast and Reliable Archiving to LTO or Sony Optical Disc Archive Cartridges Summary Avid
Every organization has critical data that it can t live without. When a disaster strikes, how long can your business survive without access to its
DISASTER RECOVERY STRATEGIES: BUSINESS CONTINUITY THROUGH REMOTE BACKUP REPLICATION Every organization has critical data that it can t live without. When a disaster strikes, how long can your business
Protecting Big Data Data Protection Solutions for the Business Data Lake
White Paper Protecting Big Data Data Protection Solutions for the Business Data Lake Abstract Big Data use cases are maturing and customers are using Big Data to improve top and bottom line revenues. With
巨 量 資 料 分 層 儲 存 解 決 方 案
巨 量 資 料 分 層 儲 存 解 決 方 案 Lower Costs and Improve Efficiencies Cano Lei Senior Sales Consulting Manager Oracle Systems Oracle Confidential Internal/Restricted/Highly Restricted Agenda 1 2 3 Why Tiered Storage?
Got Files? Get Cloud!
I D C V E N D O R S P O T L I G H T Got Files? Get Cloud! November 2010 Adapted from State of File-Based Storage Use in Organizations by Richard Villars, IDC #221138 Sponsored by F5 Networks The explosion
DAS, NAS or SAN: Choosing the Right Storage Technology for Your Organization
DAS, NAS or SAN: Choosing the Right Storage Technology for Your Organization New Drivers in Information Storage Data is unquestionably the lifeblood of today s digital organization. Storage solutions remain
IBM TSM DISASTER RECOVERY BEST PRACTICES WITH EMC DATA DOMAIN DEDUPLICATION STORAGE
White Paper IBM TSM DISASTER RECOVERY BEST PRACTICES WITH EMC DATA DOMAIN DEDUPLICATION STORAGE Abstract This white paper focuses on recovery of an IBM Tivoli Storage Manager (TSM) server and explores
WHY DO I NEED FALCONSTOR OPTIMIZED BACKUP & DEDUPLICATION?
WHAT IS FALCONSTOR? FalconStor Optimized Backup and Deduplication is the industry s market-leading virtual tape and LAN-based deduplication solution, unmatched in performance and scalability. With virtual
WHITE PAPER. BIG DATA: Managing Explosive Growth. The Importance of Tiered Storage
WHITE PAPER BIG DATA: Managing Explosive Growth The Importance of Tiered Storage CONTENTS Introduction............................................................... 3 An Eye Toward Archiving.....................................................
Implementing an Automated Digital Video Archive Based on the Video Edition of XenData Software
Implementing an Automated Digital Video Archive Based on the Video Edition of XenData Software The Video Edition of XenData Archive Series software manages one or more automated data tape libraries on
Online Storage Replacement Strategy/Solution
I. Current Storage Environment Online Storage Replacement Strategy/Solution ISS currently maintains a substantial online storage infrastructure that provides centralized network-accessible storage for
How To Use An Npm On A Network Device
WHITE PAPER: CA ARCserve Backup Network Data Management Protocol (NDMP) Network Attached Storage (NAS) Option: Integrated Protection for Heterogeneous NAS Environments CA ARCserve Backup: Protecting heterogeneous
Workflow. Connectivity. Expansion. Workflow. Connectivity. Performance. Project and Bin Sharing. New! ShareBrowser Desktop Client
Workflow Connectivity Performance Expansion Project sharing, bin sharing, file sharing, SAN & NAS for professional media applications. Enough Ethernet and Fibre Channel ports to directly connect every
ntier Verde Simply Affordable File Storage
ntier Verde Simply Affordable File Storage Current Market Problems Data Growth Continues Data Retention Increases By 2020 the Digital Universe will hold 40 Zettabytes The Market is Missing: An easy to
Application Brief: Using Titan for MS SQL
Application Brief: Using Titan for MS Abstract Businesses rely heavily on databases for day-today transactions and for business decision systems. In today s information age, databases form the critical
Diagram 1: Islands of storage across a digital broadcast workflow
XOR MEDIA CLOUD AQUA Big Data and Traditional Storage The era of big data imposes new challenges on the storage technology industry. As companies accumulate massive amounts of data from video, sound, database,
Object Oriented Storage and the End of File-Level Restores
Object Oriented Storage and the End of File-Level Restores Stacy Schwarz-Gardner Spectra Logic Agenda Data Management Challenges Data Protection Data Recovery Data Archive Why Object Based Storage? The
Simplify Data Management and Reduce Storage Costs with File Virtualization
What s Inside: 2 Freedom from File Storage Constraints 2 Simplifying File Access with File Virtualization 3 Simplifying Data Management with Automated Management Policies 4 True Heterogeneity 5 Data Protection
WHITE PAPER. Effectiveness of Variable-block vs Fixedblock Deduplication on Data Reduction: A Technical Analysis
WHITE PAPER Effectiveness of Variable-block vs Fixedblock Deduplication on Data Reduction: A Technical Analysis CONTENTS Executive Summary... 3 Fixed vs. Variable-block Deduplication... 3 Test Configuration...
Protect Microsoft Exchange databases, achieve long-term data retention
Technical white paper Protect Microsoft Exchange databases, achieve long-term data retention HP StoreOnce Backup systems, HP StoreOnce Catalyst, and Symantec NetBackup OpenStorage Table of contents Introduction...
EMC arhiviranje. Lilijana Pelko Primož Golob. Sarajevo, 16.10.2008. Copyright 2008 EMC Corporation. All rights reserved.
EMC arhiviranje Lilijana Pelko Primož Golob Sarajevo, 16.10.2008 1 Agenda EMC Today Reasons to archive EMC Centera EMC EmailXtender EMC DiskXtender Use cases 2 EMC Strategic Acquisitions: Strengthen and
Implementing Offline Digital Video Storage using XenData Software
using XenData Software XenData software manages data tape drives, optionally combined with a tape library, on a Windows Server 2003 platform to create an attractive offline storage solution for professional
StorageX 7.5 Case Study
StorageX 7.5 Case Study This document will cover how StorageX 7.5 helps to transform a legacy Microsoft DFS environment into a modern, domain-based DFS environment The Challenge Microsoft has officially
Amazon Cloud Storage Options
Amazon Cloud Storage Options Table of Contents 1. Overview of AWS Storage Options 02 2. Why you should use the AWS Storage 02 3. How to get Data into the AWS.03 4. Types of AWS Storage Options.03 5. Object
Tiered Data Protection Strategy Data Deduplication. Thomas Störr Sales Director Central Europe November 8, 2007
Tiered Data Protection Strategy Data Deduplication Thomas Störr Sales Director Central Europe November 8, 2007 Overland Storage Tiered Data Protection = Good = Better = Best! NEO / ARCvault REO w/ expansion
