Technical Case Study: CERN, the European Organization for Nuclear Research
How CERN helps physicists unlock the secrets of the universe, running critical operations on a foundation of Oracle databases on NetApp storage.

Following the Data to Knowledge and Discovery

The physicists at CERN strive to expand humankind's general understanding of our world, pushing beyond the boundaries of knowledge to fathom the secrets of the universe. Driven by curiosity and a quest for pure knowledge, CERN's scientific community conducts fundamental research that follows the data wherever it leads in the search for clues and discoveries about how the universe works.

But that does not mean that CERN research is without practical, and often revolutionary, application in everyday life. In 1989, for example, Tim Berners-Lee, a scientist at CERN, invented the World Wide Web, conceived and developed to meet the demand for automatic information sharing among the global high-energy physics community. CERN also served as the incubator for capacitive touch screens, invented in 1973 by Bent Stumpe and colleagues and originally put to use in the control room of the CERN SPS accelerator. Those innovations (applied research spin-offs, if you will) have transformed modern communications.

The work at CERN

In addition to seeking answers to questions about the universe, the CERN community works to:
- Encourage global collaboration, bringing nations together through science.
- Educate, providing advanced worker training and building enthusiasm for physics among the next generation of scientists.
- Advance the frontiers of technology, cooperating with industry to bring forward new technologies.
Researching the building blocks of the universe

CERN provides some of the world's most technologically advanced facilities for researching the basic building blocks of the universe. Facilities include particle accelerators and specialized machines to help prove the existence of exotic forms of matter. Research at CERN facilities falls into three major areas of study:

- The origin of mass. Research in this area includes searching for the Higgs particle, a hypothetical elementary particle predicted by the Standard Model (SM) of particle physics. The Higgs particle belongs to a class of particles known as bosons and is considered the key to explaining why particles have mass.
- Dark matter. Galaxies behave as if they have more mass than can be observed. Theories suggest that there is a partner to every existing particle in the SM. Called supersymmetric particles, these particles could be the unseen dark matter.
- The Big Bang. What happened just after the beginning of the universe? Theorizing that the early universe contained a hot, dense mixture of quarks and gluons (called quark-gluon plasma), scientists want to recreate similar conditions to analyze the properties of that mixture.

About the Large Hadron Collider

The CERN complex hosts a succession of particle accelerators, each able to reach increasingly higher energies. The latest addition to the complex is the Large Hadron Collider (LHC), the world's largest and most powerful particle accelerator. The CERN Control Centre near Geneva, Switzerland, houses all of the controls for the accelerator, its services, and technical infrastructure.

"Our biggest challenge is handling the volume and rate of data growth."
- Frédéric Hemmer, Head, IT Department, CERN

The LHC, launched in 2008 and installed about 100 meters underground, forms a 27-kilometer circle that spans the border between France and Switzerland. The ring consists of superconducting magnets with a number of accelerating structures that boost the energy of particles.
Traveling in opposite directions in separate pipes, beams inside the LHC are guided around the accelerator by a magnetic field produced by superconducting magnets, which are pre-cooled with liquid nitrogen and then filled with liquid helium to reach a colder-than-outer-space temperature of about -271°C. Beams are directed to collide around the ring at points coinciding with the locations of the LHC particle detectors. International collaborations currently run four distinct big experiments, each characterized by its unique particle detector, to study LHC collisions and the properties of matter produced in those collisions.

The LHC creates 600 million collisions per second, producing raw data at the rate of 1 million gigabytes per second. Software converts that raw data to readable data objects for later event analysis. Current experiments produce more than 20PB of new data annually, helping CERN scientists push knowledge forward and answer questions about the fundamental laws of nature.

Information Technology department role

CERN's Information Technology department manages the IT support infrastructure for a staff of about 2,500 and a global research community of more than 10,000 scientists and students representing 608 universities and 113 nationalities. Responsibilities of the CERN scientific and technical staff include designing, building, and ensuring the smooth operation of particle accelerators as well as preparing, running, analyzing, and interpreting data gathered from scientific experiments.
The department provides access to a broad array of IT services and data to a demanding scientific community that comprises nearly half of the world's particle physicists. "They will turn the knob until it breaks," remarks Frédéric Hemmer, head of CERN's IT department. "But addressing the challenges our users present is part of what makes life here at CERN so enjoyable. We're constantly adapting IT, even on a weekly basis, to facilitate collaboration and communication and to handle the increasing rate and scale of incoming experimental data."

Balancing Demands for Performance, Scalability, and Reliability with Cost Constraints

The big science being done at CERN introduces equivalently big data management challenges. IT has to anticipate the needs of inventive users conducting experiments with often-unpredictable requirements. To keep pace, Hemmer and his team must be innovators themselves, rapidly and efficiently delivering IT solutions that empower the CERN research community. CERN IT delivers this functionality while facing the universal challenge of providing more services with limited funding and the same or decreasing data center and administrative resources. In choosing foundational elements of the IT infrastructure technology stack, CERN continually balances technical demands for performance, reliability, and scalability with the constancy of financial constraints.

Within the IT team, Database Services owns responsibility for both the foundational database and the associated storage technologies. CERN first began utilizing Oracle databases and tools decades ago. Today, Oracle technology is used throughout the organization and plays a critical role in accelerator control systems, engineering and administrative applications, and LHC experiments. Oracle technology delivers requisite functionality, including high availability, scalability, and performance, with comprehensive tools for data distribution, protection, and manageability.
On the data storage side, essential requirements include manageability, availability, and scalability to respond to fast-changing or unexpected requirements. For example, heavy lead ions cause especially complicated collisions that can make estimating data rates an inexact science. In one case, incoming data rates were five times higher than predicted. Hemmer quantifies further: "Data can come into our computer center at rates up to 6GB per second; that's equivalent to the contents of two DVDs every three seconds. Our job is to ensure that that data is readable and permanently available to our community of physicists. Data is our existence. Our biggest challenge is handling the volume and rate of data growth."

Delivering an agile data infrastructure that is intelligent, immortal, and infinite

Hemmer's team must build an agile data infrastructure that can: 1) deliver rapid impact through intelligent data management; 2) deliver immortal data availability, including nondisruptive upgrades that leverage technology advances without introducing downtime to the CERN instruments and scientific activities that run 24/7/365; and 3) provide nearly infinite scaling that enables storage performance and capacity to grow in lockstep with CERN's research requirements and databases.

In 2007, after a public tender process, CERN selected NetApp technology for the LHC logging database, built on an Oracle database with Real Application Clusters (RAC) technology. Since that time, CERN has unified its entire Oracle infrastructure on NetApp and today stores 99% of all Oracle data on NetApp solutions. NetApp's affordable cost of entry, with linearly scalable performance and capacity, has enabled CERN to grow its storage footprint at the pace of research demand.
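The figures Hemmer quotes are easy to sanity-check. The short script below works through the arithmetic, assuming decimal units (GB = 10^9 bytes, PB = 10^15 bytes) and a dual-layer DVD capacity of about 8.5 GB; those unit assumptions are ours, not stated in the case study.

```python
# Sanity-check of the quoted ingest figures. Assumes decimal units
# (GB = 1e9 bytes, PB = 1e15 bytes) and dual-layer DVDs of ~8.5 GB.
PB = 1e15

ingest_rate_gb_s = 6                    # peak rate into the computer center
data_per_3s_gb = ingest_rate_gb_s * 3   # 18 GB arrives every three seconds
dvds_per_3s = data_per_3s_gb / 8.5      # ~2 dual-layer DVDs, as quoted
print(f"{data_per_3s_gb} GB every 3 s ~ {dvds_per_3s:.1f} dual-layer DVDs")

# 20 PB of new data per year corresponds to an average sustained rate of:
avg_rate_mb_s = 20 * PB / (365 * 24 * 3600) / 1e6
print(f"average sustained rate: {avg_rate_mb_s:.0f} MB/s")
```

So the 6GB/s figure is a peak; the 20PB annual total implies a much lower sustained average of roughly 630MB/s, which illustrates how bursty the experimental workload is.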
Eric Grancher, database services architect within CERN IT, says that NetApp delivers enabling functionality to the Oracle environment: "NetApp's certification with Oracle RAC over NFS is an asset. NetApp also offers distinct functionality, including 10-Gigabit Ethernet [10GbE] support, low-impact snapshot and cloning, the ability to deliver required performance and capacity at an affordable price point [utilizing NetApp Flash Cache intelligent caching with high-capacity SATA disk drives], support for large files [up to 16TB], and, most recently, Data ONTAP operating in Cluster-Mode for more efficient data mobility. We welcome Data ONTAP Cluster-Mode, which lets us move data for load balancing, for moving less-used or inactive data to lower-cost drives, or for technology updates, without having to stop the application."

Oracle on NetApp across the organization

Today, nearly 100 Oracle databases run on NetApp storage. The CERN IT department provides Oracle services for:
- LHC control and logging operations
- Online experiments
- Offline experiments
- Administration, including payroll services
- Engineering services

Grancher emphasizes the critical nature of CERN's Oracle databases running on NetApp: "Our Oracle on NetApp infrastructure underpins both physics and business operations at CERN. CERN relies on Oracle databases to keep the LHC online and to maintain availability of our administrative databases; if those systems go down, it impacts the work of hundreds of people. One of the key decisions we made in building a highly reliable infrastructure was to deploy storage that we trust, that is simple to manage, and then layer on top of that. We take care of our storage and count on it to provide a stable service on which to build database and application services."
Figure 1) CERN's LHC and experiment operations. [Diagram: raw data from the LHC experiments flows through the experiment online and offline databases and the CASTOR (CERN Advanced STORage Manager) system, replicated via Oracle Streams to Tier-1 centers; the IT/DB group also manages the LHC operations, accelerator (ACC), administrative, IT, and engineering databases.]
Meeting the Technical Challenges of a Superscale Environment

CERN IT infrastructure services must be continuously available and must be superscalable to keep pace with prodigious data growth.

CERN science never sleeps: keeping the LHC online

Any problem receiving or managing data can bring the system down, stopping the particle beams in the LHC. The powerful tools that monitor and control the LHC are built on Oracle databases running on a NetApp data infrastructure.

- Controlling database (ACCCON). This database stores accelerator settings and controls. CERN operators monitor the accelerator 24/7, inputting required configuration changes into the database via control-room screens. Should this database become unavailable for even a few minutes, operators would be unable to control the accelerator and would have to dump the beam, that is, extract the beam into huge graphite blocks that diffuse the beam's energy, to protect the multi-billion-dollar LHC. Out-of-range temperatures, for example, could damage magnets that cost upwards of $1,000,000 each, and complicated repairs could take operations offline for weeks or even months.
- Logging database (ACCLOG). This database records input from thousands of sensors in the LHC, maintaining long-term logs of the status of thousands of magnets and all moving parts, including collimators that protect the beams by scraping off-track particles. This largest and fastest growing of the CERN Oracle databases currently contains 4.1 trillion rows of data (126TB) and, because it contains calibration data, is also essential to keeping the LHC online.

Finding a needle in 20 million haystacks

Another key challenge in providing access to CERN's massive stores of experimental data is delivering sufficient performance to Oracle index databases.
Oracle databases running on NetApp manage the metadata that tracks and enables access to raw research data stored in flat files on the CERN Advanced STORage Manager (CASTOR) hierarchical storage management system. CASTOR commodity disk farms and tape silos today provide 40PB of capacity. Over each year of the LHC's operation, the four giant detectors observing trillions of elementary particle collisions will accumulate more than 10 million gigabytes of data, equivalent to the contents of about 20 million CD-ROMs. At current recording rates, the CERN physics experiments will generate more than 20PB of new data annually that must be managed by the Oracle databases. CERN's advances in big data analytics help researchers derive maximum and rapid value from these enormous datasets and will ultimately find application in industry, helping to enhance business outcomes through predictive analyses.
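The ACCLOG figures quoted earlier (4.1 trillion rows in 126TB) give a feel for the shape of this sensor-logging workload. A quick calculation, assuming decimal terabytes, shows why row counts rather than raw capacity dominate the indexing challenge:

```python
# Rough per-row footprint of the ACCLOG logging database, from the
# figures in the case study (decimal units assumed: TB = 1e12 bytes).
rows = 4.1e12            # 4.1 trillion rows
size_bytes = 126e12      # 126 TB

bytes_per_row = size_bytes / rows
print(f"~{bytes_per_row:.0f} bytes per row on average")

# At ~50 TB of growth per year, that density implies roughly this many
# new rows annually that the Oracle indexes must keep searchable:
rows_per_year = 50e12 / bytes_per_row
print(f"~{rows_per_year / 1e12:.1f} trillion new rows per year")
```

At roughly 31 bytes per row, these are tiny, append-heavy sensor records: the performance problem is index lookups across trillions of entries, not the raw capacity.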
Keeping up

CERN's IT department also must enable database and storage systems to keep up with staggering data growth. Across CERN today, NetApp provides 901TB of capacity to Oracle databases, and CERN database staff expects capacity requirements to grow rapidly. Accelerator databases are expected to grow by 50TB each year. Such rapid growth demands unprecedented scalability and efficiencies in the CERN database and storage technology stack.

Key enabling technologies to achieve balance for Oracle environments

Grancher says deploying Oracle databases on NetApp enables the Database Services team to balance requirements for efficiency with necessary stability, performance, and scalability. He cites vital functionality:

- 10GbE offers a known growth path to greater bandwidth plus the cost efficiencies of a widely adopted mainstream technology. Leveraging 10GbE also allows CERN to use the same switches and networking that serve the rest of the lab. That means CERN IT can reduce costs by handing off administration to the networking team that is already staffed to provide 24/7 management and support.
- Oracle Direct NFS (dNFS) enables multiple paths to storage. This technology contributes to scalability and, because it bypasses the server operating system's NFS client, typically doubles the performance of traditional NFS. But just as importantly, dNFS takes Oracle over NFS from simple to extremely simple: the CERN IT staff does not have to worry about how to configure NFS because Oracle generates NFS requests directly from the database.
- SATA plus NetApp Flash Cache software makes it possible to achieve performance comparable to FC drives at a much lower price point. An FC solution would have been price-prohibitive at CERN's performance requirements and growth rates.
- NetApp FlexClone software enables efficient creation of temporary, writable copies.
CERN required space-efficient Snapshot copies and writable copies of large databases, but also needed to make sure that replication processes did not impact performance. The CERN tender actually specified the maximum impact that creating a specific number of Snapshot copies could have on given workloads.
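As an illustration of the dNFS simplicity described above, Direct NFS multipathing is driven by a small `oranfstab` file rather than OS-level NFS client tuning. The sketch below shows the general shape of such an entry; the server name, IP addresses, and paths are hypothetical examples, not CERN's actual configuration:

```
# Hypothetical /etc/oranfstab entry: two network paths to one NFS server.
# Oracle dNFS load-balances and fails over across the listed paths itself,
# with no operating-system NFS mount options to tune.
server: filer1
path: 192.168.10.1
path: 192.168.20.1
export: /vol/oradata  mount: /u02/oradata
```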
NetApp Data ONTAP 8 operating in Cluster-Mode makes it possible to maintain peak application performance and storage efficiency by adding storage and moving data without disrupting ongoing operations. In CERN's environment, no application can be stopped, so the infrastructure must deliver continuous availability with nondisruptive upgrades and other administrative operations. Grancher says that Cluster-Mode works particularly well with Oracle over NFS to give CERN needed agility.

How NetApp Participated in Furthering CERN's Mission for Research

Hemmer suggests that the most successful technology deployments occur in the presence of a strong partnership: "We count on our providers to be innovative and proactive, helping to increase our cost effectiveness and use of resources."

Grancher offers an example: "With the rapid growth of the LHC logging database (it expands at 50TB annually), we needed an alternative to our costly FC solution. Moving to SATA would have solved our capacity and cost issues, but we expected performance problems. NetApp's recommendation to put Flash Cache in front let us keep performance at parity."

Figure 2) CERN's NAS-based storage infrastructure. [Diagram: Oracle RAC databases connected over a storage interconnect to NetApp FAS storage systems and NetApp disk shelves.]

Making Oracle Database 11g better

A member since 2005 of the OakTable Network for Oracle scientists, Grancher understands and emphasizes the importance of implementing a storage foundation that enhances database environments. NetApp delivers a single, integrated platform for an agile data infrastructure that is:

Intelligent. Management simplicity helps the CERN IT team more quickly deliver infrastructure to facilitate research. For example, CERN utilizes NetApp FlexVol virtual volumes to simplify provisioning and achieve efficiencies with thin-provisioned volumes. NetApp OnCommand management software also enables automation that reduces human errors.
Says Grancher: "Oracle over NFS to NetApp storage has simplified how we access and manage data. With the time our database team saves, we're able to offer more services to more users. NetApp has smart tools, and we make good use of them."
Grancher says that Oracle VM server virtualization on NFS is simple, extensible, and stable. In collaboration with Oracle, NetApp developed a Storage Connect plug-in for Oracle VM 3.0. The plug-in simplifies and centralizes management of Oracle Database and application environments by integrating advanced NetApp storage functionality (deduplication and thin-provisioning capabilities, for example) with Oracle VM 3.0.

NetApp technology also enables more efficient data protection and recoverability. Specifically, NetApp lets CERN protect data while avoiding data duplication, provide multiuse datasets without copying, and eliminate duplicate copies of data. "Without NetApp SnapRestore technology," Grancher states, "we'd need weeks to recover just one multiterabyte Oracle Database. NetApp also makes the size of the database irrelevant: we can copy a 1- or 10-terabyte database in seconds and restore it in minutes or hours. It used to take 28 days to restore a 100TB Oracle Database; now it takes 15 minutes. Used in conjunction with Oracle Real Application Testing, SnapRestore technology also lets us quickly replay a workload for testing."

Immortal. Stability of storage is a big asset to the stability of the CERN database workloads on top. NetApp RAID-DP technology, redundant components and high-availability-pair controller configurations, and the latest Data ONTAP Cluster-Mode functionality contribute to CERN's ability to build a no-downtime, no-data-loss storage foundation. Grancher points out that NetApp technology has let CERN evolve its Oracle Database solutions with zero downtime: "CERN has never had a downtime outage of SATA drives. Moving from FC SAN to SATA NAS, we've maintained exactly the same level of reliability."
"Since first deploying NetApp storage in 2007, CERN has not lost a single data block on NetApp. We can't overemphasize the importance of this: if CERN databases don't run, the accelerator doesn't run, and physics doesn't function."

Infinite. NetApp has also allowed CERN IT to deliver affordable performance. When the capacity requirements of large-scale Oracle databases made FC-based storage no longer a viable option financially, CERN was able to combine more affordable SATA drives with NetApp Flash Cache to deliver needed capacity without sacrificing performance. Grancher says, "Using Flash Cache with SATA, we're achieving 35,000 IOPS over Ethernet; that's the equivalent performance of 250 disks. If a big part of your workload fits into the cache, response time can drop into the sub-millisecond range, versus the multiple milliseconds that would be the standard for SATA alone. We also have flexibility to specify what to cache (for example, we don't cache archive redo logs), and the cache automatically adapts to workloads. That saves time and minimizes errors."

With the pace and scope of data growth at CERN, scalable storage capacity and performance are fundamental. States Grancher, "CERN is much like any other organization managing an OLTP or big data environment. Our IT infrastructure has to be adaptable, reliable, scalable, and efficient, and our staff has to be proactive in integrating technologies and making effective use of limited resources even as we deal with massive data growth. From an affordable cost of entry to just-in-time storage expansion, NetApp has allowed us to grow our storage infrastructure in step with our ever-expanding data and research requirements."

Storage-efficiency technologies also help CERN achieve its keep-forever data strategy. Hemmer says, "When data comes into our computer center, it must be stored permanently. Researchers may want to access data years after it was collected, so we never, never throw away data."
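Grancher's "equivalent performance of 250 disks" claim can be unpacked with a little arithmetic; the per-disk figure it implies is our inference, not a number stated in the case study:

```python
# Back-of-the-envelope check of the Flash Cache claim above:
# 35,000 IOPS described as "the equivalent performance of 250 disks".
cached_iops = 35_000
equivalent_disks = 250

iops_per_disk = cached_iops / equivalent_disks
print(f"{iops_per_disk:.0f} IOPS per disk-equivalent")
# ~140 IOPS is in the range of a fast rotational drive, so serving the
# hot working set from cache stands in for racks of spinning spindles.
```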
A Reliable and Extensible Foundation for Research

Grancher comments on the larger impact of the Oracle on NetApp infrastructure: "Most rewarding for our Database Services team is being able to build something stable, an architecture that's satisfying in terms of results and that's not a one-time solution, but rather a flexible foundation for growth. Our customers, CERN's global community of physicists, students, and staff, can rely on this infrastructure to deliver dependable data access, enable seamless collaboration, and ensure responsive services."

IT footprint: 2X less space, power, cooling (SATA vs. SAS)

Hemmer adds, "We've received a number of spontaneous plaudits from scientists for the way in which our computing infrastructure has contributed to the delivery of physics results. By giving them the tools and data access they need for research, we're helping physicists find those breakthrough clues and make the big discoveries that will have an impact far beyond the bounds of our organization."
About CERN

CERN, the European Organization for Nuclear Research, is one of the world's largest and most respected centers for scientific research. Its business is fundamental physics: finding out what the universe is made of and how it works. At CERN, the world's largest and most complex scientific instruments are used to study the basic constituents of matter, the fundamental particles. By studying what happens when these particles collide, physicists learn about the laws of nature. Founded in 1954, the CERN Laboratory sits astride the Franco-Swiss border near Geneva, Switzerland. It was one of Europe's first joint ventures and now has 20 Member States.

About NetApp

NetApp creates innovative storage and data management solutions that deliver outstanding cost efficiency and accelerate business breakthroughs. Discover our passion for helping companies around the world go further, faster.

Key Products and Technologies

NetApp:
- NetApp FAS storage systems
- DS4243 disk shelves
- 3TB SATA, 2TB SATA
- 512GB Flash Cache
- Data ONTAP 8
- FlexVol
- FlexClone
- Snapshot technology
- SnapRestore
- OnCommand software
- Thin provisioning
- Large aggregates
- NVRAM
- NFS/CIFS

Oracle:
- Oracle Database 11g Enterprise Edition with Real Application Clusters technology and partitioning options
- Oracle Direct NFS
- Oracle Streams
- Oracle VM

Other:
- HP ProCurve 10Gb/s Ethernet switches
- IBM Tivoli TSM tape system and TDPO library
- Servers from multiple vendors, all equipped with 10Gb Ethernet

2012 NetApp, Inc. All rights reserved. No portions of this document may be reproduced without prior written consent of NetApp, Inc. Specifications are subject to change without notice. NetApp, the NetApp logo, Go further, faster, Data ONTAP, FlexClone, FlexVol, OnCommand, RAID-DP, SnapRestore, and Snapshot are trademarks or registered trademarks of NetApp, Inc. in the United States and/or other countries.
All other brands or products are trademarks or registered trademarks of their respective holders and should be treated as such.
More informationLeveraging EMC Fully Automated Storage Tiering (FAST) and FAST Cache for SQL Server Enterprise Deployments
Leveraging EMC Fully Automated Storage Tiering (FAST) and FAST Cache for SQL Server Enterprise Deployments Applied Technology Abstract This white paper introduces EMC s latest groundbreaking technologies,
More informationREDUCING DATA CENTER POWER CONSUMPTION THROUGH EFFICIENT STORAGE
NETAPP WHITE PAPER REDUCING DATA CENTER POWER CONSUMPTION THROUGH EFFICIENT STORAGE Brett Battles, Cathy Belleville, Susan Grabau, Judith Maurier, Network Appliance, Inc. February 2007 WP-7010-0207 AN
More informationREDUCING DATA CENTER POWER CONSUMPTION THROUGH EFFICIENT STORAGE
NETAPP VISION SERIES REDUCING DATA CENTER POWER CONSUMPTION THROUGH EFFICIENT STORAGE Brett Battles, Cathy Belleville, Susan Grabau, Judith Maurier February 2007 WP-7010-0207 AN EIGHT-POINT PLAN FOR FIGHTING
More informationCisco, Citrix, Microsoft, and NetApp Deliver Simplified High-Performance Infrastructure for Virtual Desktops
Cisco, Citrix, Microsoft, and NetApp Deliver Simplified High-Performance Infrastructure for Virtual Desktops Greater Efficiency and Performance from the Industry Leaders Citrix XenDesktop with Microsoft
More informationExtend the Benefits of VMware vsphere with NetApp Storage
Extend the Benefits of VMware vsphere with NetApp Storage NetApp Suisse Romande Christophe Danjou Systems Engineer danjou@netapp.com $4B $3B $2B $1B Global Leadership FY10: $3.93 Billion 04 05 06 07 08
More information(Scale Out NAS System)
For Unlimited Capacity & Performance Clustered NAS System (Scale Out NAS System) Copyright 2010 by Netclips, Ltd. All rights reserved -0- 1 2 3 4 5 NAS Storage Trend Scale-Out NAS Solution Scaleway Advantages
More informationSTORAGE CENTER. The Industry s Only SAN with Automated Tiered Storage STORAGE CENTER
STORAGE CENTER DATASHEET STORAGE CENTER Go Beyond the Boundaries of Traditional Storage Systems Today s storage vendors promise to reduce the amount of time and money companies spend on storage but instead
More informationDirect NFS - Design considerations for next-gen NAS appliances optimized for database workloads Akshay Shah Gurmeet Goindi Oracle
Direct NFS - Design considerations for next-gen NAS appliances optimized for database workloads Akshay Shah Gurmeet Goindi Oracle Agenda Introduction Database Architecture Direct NFS Client NFS Server
More informationDeploying Flash in the Enterprise Choices to Optimize Performance and Cost
White Paper Deploying Flash in the Enterprise Choices to Optimize Performance and Cost Paul Feresten, Mohit Bhatnagar, Manish Agarwal, and Rip Wilson, NetApp April 2013 WP-7182 Executive Summary Flash
More informationJune 2009. Blade.org 2009 ALL RIGHTS RESERVED
Contributions for this vendor neutral technology paper have been provided by Blade.org members including NetApp, BLADE Network Technologies, and Double-Take Software. June 2009 Blade.org 2009 ALL RIGHTS
More informationLab Evaluation of NetApp Hybrid Array with Flash Pool Technology
Lab Evaluation of NetApp Hybrid Array with Flash Pool Technology Evaluation report prepared under contract with NetApp Introduction As flash storage options proliferate and become accepted in the enterprise,
More informationEntry level solutions: - FAS 22x0 series - Ontap Edge. Christophe Danjou Technical Partner Manager
Entry level solutions: - FAS 22x0 series - Ontap Edge Christophe Danjou Technical Partner Manager FAS2200 Series More powerful, affordable, and flexible systems for midsized organizations and distributed
More informationA Passion for Innovation
Technical Case Study A Passion for Innovation How Tiscali Reinvented Itself as a Cloud Service Provider and Opened Up New Market Opportunities By Andrea Stefano Sardu, Storage Infrastructure Manager, Tiscali
More informationFlash Memory Technology in Enterprise Storage
NETAPP WHITE PAPER Flash Memory Technology in Enterprise Storage Flexible Choices to Optimize Performance Mark Woods and Amit Shah, NetApp November 2008 WP-7061-1008 EXECUTIVE SUMMARY Solid state drives
More informationSilverton Consulting, Inc. StorInt Briefing Introduction Enterprise challenges
Silverton Consulting, Inc. StorInt Briefing Introduction In today s enterprise, IT staff often work under an unrelenting flood of data and a much more demanding business environment. To survive these challenges,
More informationNetApp for Oracle Database
NetApp Verified Architecture NetApp for Oracle Database Enterprise Ecosystem Team, NetApp November 2012 NVA-0002 Version 2.0 Status: Final TABLE OF CONTENTS 1 NetApp Verified Architecture... 4 2 NetApp
More informationNetApp SnapMirror. Protect Your Business at a 60% lower TCO. Title. Name
NetApp SnapMirror Protect Your Business at a 60% lower TCO Name Title Disaster Recovery Market Trends Providing disaster recovery remains critical Top 10 business initiative #2 area for storage investment
More informationPanasas High Performance Storage Powers the First Petaflop Supercomputer at Los Alamos National Laboratory
Customer Success Story Los Alamos National Laboratory Panasas High Performance Storage Powers the First Petaflop Supercomputer at Los Alamos National Laboratory June 2010 Highlights First Petaflop Supercomputer
More informationThe Revival of Direct Attached Storage for Oracle Databases
The Revival of Direct Attached Storage for Oracle Databases Revival of DAS in the IT Infrastructure Introduction Why is it that the industry needed SANs to get more than a few hundred disks attached to
More informationOptimizing Storage for Better TCO in Oracle Environments. Part 1: Management INFOSTOR. Executive Brief
Optimizing Storage for Better TCO in Oracle Environments INFOSTOR Executive Brief a QuinStreet Excutive Brief. 2012 To the casual observer, and even to business decision makers who don t work in information
More informationTHESUMMARY. ARKSERIES - pg. 3. ULTRASERIES - pg. 5. EXTREMESERIES - pg. 9
PRODUCT CATALOG THESUMMARY ARKSERIES - pg. 3 ULTRASERIES - pg. 5 EXTREMESERIES - pg. 9 ARKSERIES THE HIGH DENSITY STORAGE FOR ARCHIVE AND BACKUP Unlimited scalability Painless Disaster Recovery The ARK
More informationEMC SOLUTIONS TO OPTIMIZE EMR INFRASTRUCTURE FOR CERNER
EMC SOLUTIONS TO OPTIMIZE EMR INFRASTRUCTURE FOR CERNER ESSENTIALS Mitigate project risk with the proven leader, many of largest EHR sites run on EMC storage Reduce overall storage costs with automated
More informationSimplifying Storage Operations By David Strom (published 3.15 by VMware) Introduction
Simplifying Storage Operations By David Strom (published 3.15 by VMware) Introduction There are tectonic changes to storage technology that the IT industry hasn t seen for many years. Storage has been
More informationEMC VFCACHE ACCELERATES ORACLE
White Paper EMC VFCACHE ACCELERATES ORACLE VFCache extends Flash to the server FAST Suite automates storage placement in the array VNX protects data EMC Solutions Group Abstract This white paper describes
More informationMake the Most of Big Data to Drive Innovation Through Reseach
White Paper Make the Most of Big Data to Drive Innovation Through Reseach Bob Burwell, NetApp November 2012 WP-7172 Abstract Monumental data growth is a fact of life in research universities. The ability
More informationSymantec Enterprise Vault And NetApp Better Together
Symantec Enterprise Vault And NetApp Better Together John Martin, Consulting Systems Engineer Information Archival with Symantec and NetApp Today s Customer Headaches Data is growing exponentially Scaling
More informationEMC XTREMIO EXECUTIVE OVERVIEW
EMC XTREMIO EXECUTIVE OVERVIEW COMPANY BACKGROUND XtremIO develops enterprise data storage systems based completely on random access media such as flash solid-state drives (SSDs). By leveraging the underlying
More informationThe Future of Data Management
The Future of Data Management with Hadoop and the Enterprise Data Hub Amr Awadallah (@awadallah) Cofounder and CTO Cloudera Snapshot Founded 2008, by former employees of Employees Today ~ 800 World Class
More informationKaminario K2 All-Flash Array
Kaminario K2 All-Flash Array The Kaminario K2 all-flash storage array delivers predictable performance, cost, scale, resiliency and simplicity so organizations can handle ever-changing and unforeseen business
More informationSolidFire and NetApp All-Flash FAS Architectural Comparison
SolidFire and NetApp All-Flash FAS Architectural Comparison JUNE 2015 This document provides an overview of NetApp s All-Flash FAS architecture as it compares to SolidFire. Not intended to be exhaustive,
More informationNimble Storage VDI Solution for VMware Horizon (with View)
BEST PRACTICES GUIDE Nimble Storage VDI Solution for VMware Horizon (with View) B E S T P R A C T I C E S G U I D E : N I M B L E S T O R A G E V D I F O R V M W A R E H O R I Z O N ( w i t h V I E W )
More informationHow To Speed Up A Flash Flash Storage System With The Hyperq Memory Router
HyperQ Hybrid Flash Storage Made Easy White Paper Parsec Labs, LLC. 7101 Northland Circle North, Suite 105 Brooklyn Park, MN 55428 USA 1-763-219-8811 www.parseclabs.com info@parseclabs.com sales@parseclabs.com
More informationWHITE PAPER The Storage Holy Grail: Decoupling Performance from Capacity
WHITE PAPER The Storage Holy Grail: Decoupling Performance from Capacity Technical White Paper 1 The Role of a Flash Hypervisor in Today s Virtual Data Center Virtualization has been the biggest trend
More informationAll-Flash Arrays Weren t Built for Dynamic Environments. Here s Why... This whitepaper is based on content originally posted at www.frankdenneman.
WHITE PAPER All-Flash Arrays Weren t Built for Dynamic Environments. Here s Why... This whitepaper is based on content originally posted at www.frankdenneman.nl 1 Monolithic shared storage architectures
More informationScala Storage Scale-Out Clustered Storage White Paper
White Paper Scala Storage Scale-Out Clustered Storage White Paper Chapter 1 Introduction... 3 Capacity - Explosive Growth of Unstructured Data... 3 Performance - Cluster Computing... 3 Chapter 2 Current
More informationCASS COUNTY GOVERNMENT. Data Storage Project Request for Proposal
CASS COUNTY GOVERNMENT Data Storage Project Request for Proposal Contents I. PROJECT OVERVIEW... 3 II. INSTRUCTIONS FOR RFP S... 5 1 VENDOR... 7 2 PROPOSED SOLUTION ARCHITECTURE... 9 3 DATA PROTECTION...
More informationHigh Availability Databases based on Oracle 10g RAC on Linux
High Availability Databases based on Oracle 10g RAC on Linux WLCG Tier2 Tutorials, CERN, June 2006 Luca Canali, CERN IT Outline Goals Architecture of an HA DB Service Deployment at the CERN Physics Database
More informationA Best Practice Guide to Archiving Persistent Data: How archiving is a vital tool as part of a data centre cost savings exercise
A Best Practice Guide to Archiving Persistent Data: How archiving is a vital tool as part of a data centre cost savings exercise NOTICE This White Paper may contain proprietary information protected by
More informationSolution Brief Network Design Considerations to Enable the Benefits of Flash Storage
Solution Brief Network Design Considerations to Enable the Benefits of Flash Storage Flash memory has been used to transform consumer devices such as smartphones, tablets, and ultranotebooks, and now it
More informationPricing - overview of available configurations
Pricing - overview of available configurations Bundle No System Heads Disks Disk Type Software End User EUR* Token ID Config Name Bundle 1 FAS2040 Single 6 x 1TB SATA Base 4.185 R809196-2040 EEM FAS2040
More informationRFP-MM-1213-11067 Enterprise Storage Addendum 1
Purchasing Department August 16, 2012 RFP-MM-1213-11067 Enterprise Storage Addendum 1 A. SPECIFICATION CLARIFICATIONS / REVISIONS NONE B. REQUESTS FOR INFORMATION Oracle: 1) What version of Oracle is in
More informationTOP FIVE REASONS WHY CUSTOMERS USE EMC AND VMWARE TO VIRTUALIZE ORACLE ENVIRONMENTS
TOP FIVE REASONS WHY CUSTOMERS USE EMC AND VMWARE TO VIRTUALIZE ORACLE ENVIRONMENTS Leverage EMC and VMware To Improve The Return On Your Oracle Investment ESSENTIALS Better Performance At Lower Cost Run
More informationBest Practices for Implementing iscsi Storage in a Virtual Server Environment
white paper Best Practices for Implementing iscsi Storage in a Virtual Server Environment Server virtualization is becoming a no-brainer for any that runs more than one application on servers. Nowadays,
More informationTop 10 Biggest Storage Downgrades in 2014
SOFTWARE-DEFINED STORAGE IN ACTION Ben Treiber Director Strategic Systems Engineering ben.treiber@datacore.com Richard Drewelow Sales Director, Western Region richard.drewelow@datacore.com Copyright 2014
More informationMaxDeploy Hyper- Converged Reference Architecture Solution Brief
MaxDeploy Hyper- Converged Reference Architecture Solution Brief MaxDeploy Reference Architecture solutions are configured and tested for support with Maxta software- defined storage and with industry
More informationINCREASING EFFICIENCY WITH EASY AND COMPREHENSIVE STORAGE MANAGEMENT
INCREASING EFFICIENCY WITH EASY AND COMPREHENSIVE STORAGE MANAGEMENT UNPRECEDENTED OBSERVABILITY, COST-SAVING PERFORMANCE ACCELERATION, AND SUPERIOR DATA PROTECTION KEY FEATURES Unprecedented observability
More informationOmniCube. SimpliVity OmniCube and Multi Federation ROBO Reference Architecture. White Paper. Authors: Bob Gropman
OmniCube SimpliVity OmniCube and Multi Federation ROBO Reference Architecture White Paper Authors: Bob Gropman Date: April 13, 2015 SimpliVity and OmniCube are trademarks of SimpliVity Corporation. All
More informationMaxta Storage Platform Enterprise Storage Re-defined
Maxta Storage Platform Enterprise Storage Re-defined WHITE PAPER Software-Defined Data Center The Software-Defined Data Center (SDDC) is a unified data center platform that delivers converged computing,
More informationSolving Agencies Big Data Challenges: PED for On-the-Fly Decisions
White Paper Solving Agencies Big Data Challenges: PED for On-the-Fly Decisions Carina Veksler, NetApp March 2012 WP-7158 ABSTRACT With the growing volumes of rich sensor data and imagery used today to
More informationENABLING VIRTUALIZED GRIDS WITH ORACLE AND NETAPP
NETAPP AND ORACLE WHITE PAPER ENABLING VIRTUALIZED GRIDS WITH ORACLE AND NETAPP Generosa Litton, Network Appliance, Inc. Monica Kumar, Frank Martin, Don Nalezyty, Oracle March 2008 WP-7037-0208 EXECUTIVE
More informationServer Virtualization: Avoiding the I/O Trap
Server Virtualization: Avoiding the I/O Trap How flash memory arrays and NFS caching helps balance increasing I/O loads of virtualized servers November 2010 2 Introduction Many companies see dramatic improvements
More informationDeploying Flash- Accelerated Hadoop with InfiniFlash from SanDisk
WHITE PAPER Deploying Flash- Accelerated Hadoop with InfiniFlash from SanDisk 951 SanDisk Drive, Milpitas, CA 95035 2015 SanDisk Corporation. All rights reserved. www.sandisk.com Table of Contents Introduction
More informationTaking the Plunge: Desktop Infrastructure. Rich Clifton
Taking the Plunge: Effectively Deploying Virtualized Desktop Infrastructure Rich Clifton SVP & GM Technology Enablement & Solutions Two Worlds Virtual Desktops Desktop Virtualization Software Data Center
More informationEfficient Storage Strategies for Virtualized Data Centers
Efficient Storage Strategies for Virtualized Data Centers Contents Abstract. 1 Data Center Virtualization Status Report. 2 Dell EqualLogic Virtualized iscsi SAN Solutions. 2 Seamless, non-disruptive scalability.
More informationTop Ten Questions. to Ask Your Primary Storage Provider About Their Data Efficiency. May 2014. Copyright 2014 Permabit Technology Corporation
Top Ten Questions to Ask Your Primary Storage Provider About Their Data Efficiency May 2014 Copyright 2014 Permabit Technology Corporation Introduction The value of data efficiency technologies, namely
More informationRealizing the True Potential of Software-Defined Storage
Realizing the True Potential of Software-Defined Storage Who should read this paper Technology leaders, architects, and application owners who are looking at transforming their organization s storage infrastructure
More informationPivot3 Desktop Virtualization Appliances. vstac VDI Technology Overview
Pivot3 Desktop Virtualization Appliances vstac VDI Technology Overview February 2012 Pivot3 Desktop Virtualization Technology Overview Table of Contents Executive Summary... 3 The Pivot3 VDI Appliance...
More informationAdvanced Data Mobility To Power Your Hybrid Cloud
Advanced Data Mobility To Power Your Hybrid Cloud Jacint Juhasz Systems Engineer South Eastern Europe 1 Market Leading Portfolio of Innovation From data center to the hybrid cloud Shared Dedicated FlashRay
More informationHow To Protect Your Data With Netapp Storevault
StoreVault Advanced Protection Architecture NetApp technologies working together Advanced data protection Advanced system protection Introduction Advanced Data Protection Overview NetApp Snapshot Technology
More informationEMC Backup and Recovery for Microsoft SQL Server
EMC Backup and Recovery for Microsoft SQL Server Enabled by Quest LiteSpeed Copyright 2010 EMC Corporation. All rights reserved. Published February, 2010 EMC believes the information in this publication
More informationAutomated Data-Aware Tiering
Automated Data-Aware Tiering White Paper Drobo s revolutionary new breakthrough technology automates the provisioning, deployment, and performance acceleration for a fast tier of SSD storage in the Drobo
More informationFlexPod for VMware The Journey to Virtualization and the Cloud
FlexPod for VMware The Journey to Virtualization and the Cloud Presented Jointly by Simac Technik ČR with Cisco, NetApp, and VMware 2010 NetApp, Cisco, and VMware. All Rights Reserved. C97-633489-00 One
More informationFAQ. NetApp MAT4Shift. March 2015
i FAQ NetApp MAT4Shift March 2015 TABLE OF CONTENTS 1 General... 3 1.1 Solution Overview...3 What is NetApp MAT4Shift?... 3 What business needs does this solution address?... 3 What is the value of the
More informationIOmark- VDI. Nimbus Data Gemini Test Report: VDI- 130906- a Test Report Date: 6, September 2013. www.iomark.org
IOmark- VDI Nimbus Data Gemini Test Report: VDI- 130906- a Test Copyright 2010-2013 Evaluator Group, Inc. All rights reserved. IOmark- VDI, IOmark- VDI, VDI- IOmark, and IOmark are trademarks of Evaluator
More informationEMC SOLUTION FOR SPLUNK
EMC SOLUTION FOR SPLUNK Splunk validation using all-flash EMC XtremIO and EMC Isilon scale-out NAS ABSTRACT This white paper provides details on the validation of functionality and performance of Splunk
More informationWHITE PAPER Guide to 50% Faster VMs No Hardware Required
WHITE PAPER Guide to 50% Faster VMs No Hardware Required Think Faster. Visit us at Condusiv.com GUIDE TO 50% FASTER VMS NO HARDWARE REQUIRED 2 Executive Summary As much as everyone has bought into the
More informationEMC Backup and Recovery for Microsoft SQL Server
EMC Backup and Recovery for Microsoft SQL Server Enabled by EMC NetWorker Module for Microsoft SQL Server Copyright 2010 EMC Corporation. All rights reserved. Published February, 2010 EMC believes the
More informationEMC Virtual Infrastructure for Microsoft Applications Data Center Solution
EMC Virtual Infrastructure for Microsoft Applications Data Center Solution Enabled by EMC Symmetrix V-Max and Reference Architecture EMC Global Solutions Copyright and Trademark Information Copyright 2009
More informationHyperQ Storage Tiering White Paper
HyperQ Storage Tiering White Paper An Easy Way to Deal with Data Growth Parsec Labs, LLC. 7101 Northland Circle North, Suite 105 Brooklyn Park, MN 55428 USA 1-763-219-8811 www.parseclabs.com info@parseclabs.com
More information