Make the Most of Big Data to Drive Innovation Through Research
White Paper

Make the Most of Big Data to Drive Innovation Through Research

Bob Burwell, NetApp
November 2012 | WP-7172

Abstract

Monumental data growth is a fact of life in research universities. The ability to rapidly access and process large datasets has enabled new breakthroughs in research projects across the sciences, but it is taxing the resources of IT organizations. A new infrastructure is needed to accommodate the increasing demands of research projects that generate big data: one that simplifies the IT complexity of high-performance computing in today's data-driven world.
TABLE OF CONTENTS

1 The Importance of University Research
  1.1 Funding and Budgets
  1.2 People
  1.3 Support for New Projects
2 Technology Requirements for Today's Research
  2.1 Research Creates Big Data
  2.2 Capitalize on Research Grants to Better Utilize Existing Assets
  2.3 Information Sharing with Broad Access to Data
3 Time to Think Differently, Not Historically
  3.1 Cost Management
  3.2 Staff Productivity
  3.3 Operational Productivity
4 NetApp High-Performance Storage Rack
5 NetApp and University Research
6 Summary

LIST OF FIGURES

Figure 1) Research universities rely on funding from a variety of sources.
Figure 2) Where is your infrastructure breaking?
Figure 3) Internet2 extends shared services and collaboration across research institutions.
Figure 4) High-performance computing benefits.
Figure 5) Data growth impact on IT.
Figure 6) Custom deployment versus NetApp HPS Rack.
1 The Importance of University Research

When was the last time you thought about how innovation in medicine, technology, energy, and science is developed? Although not all research takes place at universities, higher education contributes a growing percentage of all the basic research that drives innovation, passing the results to the private sector to enable long-term economic growth, creating jobs and improving living standards in the process. Universities offer a broad spectrum of basic and applied research activities across the sciences, including medicine, engineering, agriculture, the natural sciences, and the humanities and social sciences. Without research universities, many new innovations and breakthroughs in science might never happen. And today's technology advancements play a key role in the outcome of successful research projects.

1.1 Funding and Budgets

A university's reputation is often a critical factor in attracting faculty and students, as well as in securing funding for research projects. Faced with continued pressure from a fragile economy, research universities rely on a variety of sources, including government, private, and industry sponsors, to fund new research initiatives. However, in light of ongoing proposed budget cuts, research departments must continually evaluate alternative technology approaches that drive increased efficiency from reduced IT budgets while maintaining a high quality of research.

Figure 1) Research universities rely on funding from a variety of sources: endowments, annual giving, government grants, and business/industry grants.

A growing source of research funding now comes from private industry. Early-stage research performed at universities is key to innovation that the private sector can then leverage to develop practical solutions, such as new medical procedures, drugs, technology, green initiatives, and other innovations that improve how people live and work.
1.2 People

Research is often a critical factor in ranking the top educational institutions, and the best research universities are able to recruit top students worldwide to pursue an education at their institution. In addition, faculty distinction (published research, faculty awards), the number of doctorates awarded, the number of postdoctoral appointments supported, and the number of patents held all contribute to how a university's research program is regarded. However, to maintain and grow successful research programs, faculty and students require access to state-of-the-art research laboratories that provide technology for efficient access, collaboration, and analysis of data.

1.3 Support for New Projects

Advancements in technology now make it possible to accelerate the pace of research. However, to maximize the funding from new research grants, university IT organizations must be able to quickly allocate resources to support new projects while managing the explosion of data with even fewer resources. In short, IT can make or break the opportunity to capitalize on new grants.
2 Technology Requirements for Today's Research

The right IT infrastructure provides the foundation for effective research programs. With access to the many advancements in technology, university research has entered a new era of scale in which the amount of data collected, processed, and stored is taxing today's IT architectures. To manage these ever-increasing datasets and meet key research objectives, an underlying infrastructure must be in place that enables IT to store and manage the growing volumes of data while allowing researchers to quickly retrieve and easily share that data.

2.1 Research Creates Big Data

Not only is the volume of data increasing, but the data objects themselves are getting bigger. In addition, analytic data processing has become a time-consuming, compute-intensive exercise. Many departments are now reaching multiple terabytes of data or even billions of files, putting enormous scale pressure on existing infrastructures, especially the storage platform.

Big data is breaking today's storage infrastructure along three major axes, as illustrated in Figure 2 (Where is your infrastructure breaking?):

- Complexity. Data is no longer just about text and numbers; it's about real-time events and shared infrastructure. The information is now linked, it is high fidelity, and it consists of multiple data types. Applying conventional algorithms for search, storage, and categorization is becoming much more complex and inefficient.
- Speed. How fast is the data coming in? High-definition video, streaming media delivered over the Internet to player devices, and slow-motion video for surveillance all have very high ingestion rates. Researchers have to keep up with the data flow to make the information useful, and they have to keep pace with ingestion rates to drive faster analysis.
- Volume. All collected data must be stored in a location that is secure and always available.
With such high volumes of data, IT teams have to decide how much data is too much. For example, they might flush all data each week and start over the following week. But for many applications this is not an option, so more data must be stored for longer without increasing operational complexity. This can cause the infrastructure to quickly break on the volume axis.

2.2 Capitalize on Research Grants to Better Utilize Existing Assets

Research grants are a fundamental part of university life, with individual grants generally running for two to three years. Legacy architectures can have as much as 10 times the storage actually needed, even when factoring in the extra capacity required to support anticipated growth across the university. Moving away from silo-based architectures enables universities to be more flexible, with the ability to quickly reallocate compute resources as new grants are awarded. Combining both campus and research IT in the same data center enables IT consolidation and the potential for massive cost savings by running multiple workloads on the same hardware. A shared services model increases utilization by quickly repurposing assets to new projects, allowing higher-education institutions to reap long-term benefits by redirecting grant-funded equipment to new work.

2.3 Information Sharing with Broad Access to Data

Ready access to big data, as well as collaboration with remote team members, requires a high-speed network with low latency and high throughput. Internet2 provides a unique set of global capabilities to member organizations, and it is specifically designed to meet the needs of researchers and educators. This includes a 100-gigabit-per-second network that not only delivers reliable production services for high-performance needs but also creates a powerful experimental platform for the development of new applications. Unconstrained bandwidth enables widespread application development and delivers:

- A deeply programmable environment in which compute, storage, visualization, and transport capabilities can all be driven by applications
- Solutions that overcome traditional bottlenecks, passing high-bandwidth traffic and allowing performance monitoring

Figure 3) Internet2 extends shared services and collaboration across research institutions.

3 Time to Think Differently, Not Historically

The remainder of this paper focuses on the advances in technology that have made high-performance computing (HPC) more affordable and extended its benefits beyond traditional scientific applications to a broad set of university and commercial HPC applications. To capitalize on the information that can be derived from the tremendous volume of data generated by research, more institutions are turning to high-performance storage solutions to effectively analyze this data and solve complex research problems.
However, the resource demands of high-performance computing often exceed the expertise, staff availability, and process knowledge of many users, resulting in:

- A shortage of time, talent, and resources to create HPC storage configurations (months to architect, design, prove the concept, and install)
- A lack of single-system management (monitor, manage, analyze)
- The overhead of ongoing maintenance and support

Figure 4) High-performance computing benefits. Performance: get bigger jobs done in less time, and get more jobs done in the same amount of time. Reliability: minimize system downtime, and ensure data integrity and availability. Efficiencies: density (performance, capacity, and power) and scale of data management.

Deploying storage systems to support HPC environments can be challenging. Many in the high-performance computing world depend on parallel file systems such as Lustre to deliver the necessary bandwidth and capacity; however, designing, testing, and deploying such a solution is complex and time consuming, and so is ongoing management. Many more HPC users would likely employ parallel file systems if the barriers to deployment were lower.

Figure 5) Data growth impact on IT. If you were storing 100TB of online data in 2010, you will store 1.1PB in 2016 (11x), 2.5PB in 2018 (25x), and 5.8PB in 2020 (58x), based on an industry-average 50% annual growth rate.

3.1 Cost Management

Operating Expense

While the upfront cost of building your own high-performance computing storage solution may seem attractive, when you add together the increase in time to results; the operating expenses due to complicated management, maintenance, and support; and the potential impact of lost productivity due to poor availability, operating expenses overshadow the initial capital costs.
Scalability

When it comes time to add performance and/or capacity, a custom design may not deliver the results you want. Depending on the building blocks you choose, it can be difficult to scale in small increments, making scaling an expensive proposition. It may also be impossible to scale performance and capacity independently.

3.2 Staff Productivity

Time to Deploy

Although a custom parallel file system deployment may seem like the best option, the time and resources it takes to deploy the file system and achieve measurable results may simply be too long in many cases. The typical commercial HPC user may not have the time, talent, or resources to create a balanced HPC storage configuration. It can take months to architect, design, and perform proof-of-concept testing. Then it may take several more months to procure the necessary hardware and weeks to install, configure, test, and deploy. That's a long time to wait to start getting results.

Complex Management

A parallel file system deployment has a significant number of physical components, including metadata and object servers, storage systems, disks, interconnects, and networks, not to mention the file system software running on servers and clients. When you build your own solution, there are many separate components to monitor and manage, and you will likely need different tools for each infrastructure element. This makes analyzing performance and troubleshooting problems much more difficult. As a result, you spend more time, and it costs more, to manage your storage.

3.3 Operational Productivity

Reliability, Availability, and Data Integrity

When you architect your own solution, you have to be careful to eliminate possible points of failure and choose the right components; otherwise, reliability, availability, and data integrity may suffer. Poor availability slows results and can impact time to market.
Maintenance and Support

With a custom solution, maintenance and support mean dealing with multiple vendors, which adds to the cost and complexity of your storage system. Some vendors may not be able to provide round-the-clock support, potentially impacting availability. In addition, working with multiple vendors to resolve complex problems can prolong resolution.

4 NetApp High-Performance Storage Rack

Because university research is conducted by scientists, not computer science engineers, students and faculty need access to technology that lets them get their work done without having to learn new tools. NetApp has introduced the High-Performance Storage Rack (HPS Rack) to address the storage challenges of today's research universities. The HPS Rack is a fully integrated HPC storage solution designed to deliver performance, scalability, reliability, and ease of management. Its file system is purpose-built for high-performance computing workflows. The solution leverages data collected over time to enable accurate planning, understand usage by user, and optimize overall system performance.
Built on proven NetApp E-Series storage, the HPS Rack integrates all the components of a successful Lustre file system deployment so that you can have the HPC storage you need up and running in less than a day. Scaling is highly granular, delivering predictable increases in performance and/or capacity. A worldwide service and support network enables you to get the help you need when you need it. IT benefits from the ability to provide high-performance computing without the complexities of traditional HPC deployments. The HPS Rack protects your investment in research applications with a storage system tuned for HPC workloads, and your existing IT staff can easily maintain the solution without requiring expertise in high-performance computing.

Figure 6) Custom deployment versus NetApp HPS Rack. Building your own file system with a fast block array takes 6-9 months (architect and design, proof of concept, procure, install and configure, test and deploy, create and continuously iterate tools), with unpredictable drops and high manual overhead. The High-Performance Storage Rack takes 1-3 months (proof of concept, procure, deploy and provision, optimize for applications), with faster time to results and tools to manage, optimize, and automate.

5 NetApp and University Research

As part of its innovation strategy, NetApp supports innovative research in the academic community. Within the CTO organization, the NetApp Advanced Technology Group (ATG) is responsible for maintaining many academic research relationships through sponsorships, consortium memberships, and direct collaborations.

NetApp Faculty Fellowship Program

The ATG established the NetApp Faculty Fellowship (NFF) Program to fund innovative research on data storage and related topics. The goals of this program are to encourage leading-edge research in storage and data management and to foster relationships among academic researchers and the engineers and researchers at NetApp.
NetApp Faculty Fellowships are one-time grants, typically covering a year of funding for a graduate student working with the principal investigator on the proposed research. Grants are not restricted to this format, however, and NetApp ATG will consider proposals for different situations and durations.
NetApp Academic Alliance Program

To help prepare college graduates for the fast-changing IT landscape, NetApp has launched the NetApp Academic Alliance Program. Through this unique program, NetApp collaborates with some of the nation's leading colleges and universities to provide a rich library of teaching materials and resources that faculty can use to help students develop highly marketable storage-related IT skills.

NetApp Education Donation Program

NetApp offers an Education Donation Program that gives higher-education and K-12 schools the opportunity to receive a new NetApp FAS3100 storage system. This program enables NetApp to partner with schools across the country to help them achieve the efficiency and flexibility required to maximize their IT infrastructure and deliver on their educational objectives. The donations are part of a long-term investment to improve education IT.

6 Summary

NetApp helps research universities streamline high-performance computing deployments with a preconfigured, pretested storage solution. The NetApp HPS Rack solution eliminates the barriers to entry, allowing universities to benefit from the power of high-performance computing to solve complex research problems.

Maximize budget. The NetApp solution leverages real-time data collection to enable application-driven tuning. You benefit from an infrastructure that delivers maximum storage capacity with excellent performance, enabling you to achieve faster results, improve efficiency, and reduce operational costs.

Maximize people. With its data collection capabilities, the HPS Rack simplifies management by combining different views that enable administrators to track metrics of the fast data storage services, including bandwidth usage, capacity consumption, client node access, and per-job resource usage.
With this data, IT can better understand and optimize the storage infrastructure and perform data-driven capacity, throughput, and utilization analysis; scheduling optimization; and complete system management.

Fast-track support for new research projects. The HPS Rack delivers an integrated storage solution for high-performance computing that is easy to deploy and manage and makes it easy to analyze performance and capacity. The prepackaged, preconfigured solution scales as storage requirements grow, resulting in faster time to results with lower TCO.
NetApp provides no representations or warranties regarding the accuracy, reliability, or serviceability of any information or recommendations provided in this publication, or with respect to any results that may be obtained by the use of the information or observance of any recommendations provided herein. The information in this document is distributed AS IS, and the use of this information or the implementation of any recommendations or techniques herein is a customer's responsibility and depends on the customer's ability to evaluate and integrate them into the customer's operational environment. This document and the information contained herein may be used solely in connection with the NetApp products discussed in this document.

© 2012 NetApp, Inc. All rights reserved. No portions of this document may be reproduced without prior written consent of NetApp, Inc. Specifications are subject to change without notice. NetApp, the NetApp logo, and Go further, faster are trademarks or registered trademarks of NetApp, Inc. in the United States and/or other countries. All other brands or products are trademarks or registered trademarks of their respective holders and should be treated as such. WP-7172
IBM Analytics Just the facts: Four critical concepts for planning the logical data warehouse 1 2 3 4 5 6 Introduction Complexity Speed is businessfriendly Cost reduction is crucial Analytics: The key to
T a c k l i ng Big Data w i th High-Performance
Worldwide Headquarters: 211 North Union Street, Suite 105, Alexandria, VA 22314, USA P.571.296.8060 F.508.988.7881 www.idc-gi.com T a c k l i ng Big Data w i th High-Performance Computing W H I T E P A
Essential Elements of an IoT Core Platform
Essential Elements of an IoT Core Platform Judith Hurwitz President and CEO Daniel Kirsch Principal Analyst and Vice President Sponsored by Hitachi Introduction The maturation of the enterprise cloud,
Advanced Core Operating System (ACOS): Experience the Performance
WHITE PAPER Advanced Core Operating System (ACOS): Experience the Performance Table of Contents Trends Affecting Application Networking...3 The Era of Multicore...3 Multicore System Design Challenges...3
Accenture Human Capital Management Solutions. Transforming people and process to achieve high performance
Accenture Human Capital Management Solutions Transforming people and process to achieve high performance The sophistication of our products and services requires the expertise of a special and talented
Proactive Performance Management for Enterprise Databases
Proactive Performance Management for Enterprise Databases Abstract DBAs today need to do more than react to performance issues; they must be proactive in their database management activities. Proactive
SQL Server 2012 Parallel Data Warehouse. Solution Brief
SQL Server 2012 Parallel Data Warehouse Solution Brief Published February 22, 2013 Contents Introduction... 1 Microsoft Platform: Windows Server and SQL Server... 2 SQL Server 2012 Parallel Data Warehouse...
BUSINESS INTELLIGENCE ANALYTICS
SOLUTION BRIEF > > CONNECTIVITY BUSINESS SOLUTIONS FOR INTELLIGENCE FINANCIAL SERVICES ANALYTICS 1 INTRODUCTION It s no secret that the banking and financial services institutions of today are driven by
Choosing the Right Project and Portfolio Management Solution
Choosing the Right Project and Portfolio Management Solution Executive Summary In too many organizations today, innovation isn t happening fast enough. Within these businesses, skills are siloed and resources
How To Use Hp Vertica Ondemand
Data sheet HP Vertica OnDemand Enterprise-class Big Data analytics in the cloud Enterprise-class Big Data analytics for any size organization Vertica OnDemand Organizations today are experiencing a greater
Product Brief SysTrack VMP
for VMware View Product Brief SysTrack VMP Benefits Optimize VMware View desktop and server virtualization and terminal server projects Anticipate and handle problems in the planning stage instead of postimplementation
SAP HANA PLATFORM Top Ten Questions for Choosing In-Memory Databases. Start Here
PLATFORM Top Ten Questions for Choosing In-Memory Databases Start Here PLATFORM Top Ten Questions for Choosing In-Memory Databases. Are my applications accelerated without manual intervention and tuning?.
Application Visibility and Monitoring >
White Paper Application Visibility and Monitoring > An integrated approach to application delivery Application performance drives business performance Every business today depends on secure, reliable information
solution brief September 2011 Can You Effectively Plan For The Migration And Management of Systems And Applications on Vblock Platforms?
solution brief September 2011 Can You Effectively Plan For The Migration And Management of Systems And Applications on Vblock Platforms? CA Capacity Management and Reporting Suite for Vblock Platforms
Scala Storage Scale-Out Clustered Storage White Paper
White Paper Scala Storage Scale-Out Clustered Storage White Paper Chapter 1 Introduction... 3 Capacity - Explosive Growth of Unstructured Data... 3 Performance - Cluster Computing... 3 Chapter 2 Current
WHITE PAPER. www.fusionstorm.com. Easing the Way to the Cloud:
WHITE PAPER: Easing the Way to the Cloud: 1 WHITE PAPER Easing the Way to the Cloud: The Value of Using a Reference Architecture in Private Cloud Deployments for Microsoft Applications and Server Platforms
Consolidate and Virtualize Your Windows Environment with NetApp and VMware
White Paper Consolidate and Virtualize Your Windows Environment with NetApp and VMware Sachin Chheda, NetApp and Gaetan Castelein, VMware October 2009 WP-7086-1009 TABLE OF CONTENTS 1 EXECUTIVE SUMMARY...
CA Service Desk On-Demand
PRODUCT BRIEF: CA SERVICE DESK ON DEMAND -Demand Demand is a versatile, ready-to-use IT support solution delivered On Demand to help you build a superior Request, Incident, Change and Problem solving system.
The Business Case for Using Big Data in Healthcare
SAP Thought Leadership Paper Healthcare and Big Data The Business Case for Using Big Data in Healthcare Exploring How Big Data and Analytics Can Help You Achieve Quality, Value-Based Care Table of Contents
NetApp High-Performance Computing Solution for Lustre: Solution Guide
Technical Report NetApp High-Performance Computing Solution for Lustre: Solution Guide Robert Lai, NetApp August 2012 TR-3997 TABLE OF CONTENTS 1 Introduction... 5 1.1 NetApp HPC Solution for Lustre Introduction...5
EMC XtremSF: Delivering Next Generation Performance for Oracle Database
White Paper EMC XtremSF: Delivering Next Generation Performance for Oracle Database Abstract This white paper addresses the challenges currently facing business executives to store and process the growing
Benefits of an ITIL Help Desk in the Cloud
SOLUTION WHITE PAPER Benefits of an ITIL Help Desk in the Cloud A New ITIL Solution for Small-to-Medium Businesses Contents Introduction 1 Help Desk Needs in Smaller Environments 1 Power in the Cloud 3
How To Use All Flash Storage In Education
When Flash Makes All the Difference: A Look at 3 Common Use Cases in Education WHITE PAPER Table of Contents Use Case No. 1: Virtual Desktop Infrastructure...2 Use Case No. 2: Application Performance...3
Taming Big Data Storage with Crossroads Systems StrongBox
BRAD JOHNS CONSULTING L.L.C Taming Big Data Storage with Crossroads Systems StrongBox Sponsored by Crossroads Systems 2013 Brad Johns Consulting L.L.C Table of Contents Taming Big Data Storage with Crossroads
Netapp HPC Solution for Lustre. Rich Fenton ([email protected]) UK Solutions Architect
Netapp HPC Solution for Lustre Rich Fenton ([email protected]) UK Solutions Architect Agenda NetApp Introduction Introducing the E-Series Platform Why E-Series for Lustre? Modular Scale-out Capacity Density
Tap into Big Data at the Speed of Business
SAP Brief SAP Technology SAP Sybase IQ Objectives Tap into Big Data at the Speed of Business A simpler, more affordable approach to Big Data analytics A simpler, more affordable approach to Big Data analytics
Brocade Network Monitoring Service (NMS) Helps Maximize Network Uptime and Efficiency
WHITE PAPER SERVICES Brocade Network Monitoring Service (NMS) Helps Maximize Network Uptime and Efficiency Brocade monitoring service delivers business intelligence to help IT organizations meet SLAs,
RAID for the 21st Century. A White Paper Prepared for Panasas October 2007
A White Paper Prepared for Panasas October 2007 Table of Contents RAID in the 21 st Century...1 RAID 5 and RAID 6...1 Penalties Associated with RAID 5 and RAID 6...1 How the Vendors Compensate...2 EMA
BlueArc unified network storage systems 7th TF-Storage Meeting. Scale Bigger, Store Smarter, Accelerate Everything
BlueArc unified network storage systems 7th TF-Storage Meeting Scale Bigger, Store Smarter, Accelerate Everything BlueArc s Heritage Private Company, founded in 1998 Headquarters in San Jose, CA Highest
Using In-Memory Computing to Simplify Big Data Analytics
SCALEOUT SOFTWARE Using In-Memory Computing to Simplify Big Data Analytics by Dr. William Bain, ScaleOut Software, Inc. 2012 ScaleOut Software, Inc. 12/27/2012 T he big data revolution is upon us, fed
NEC s Carrier-Grade Cloud Platform
NEC s Carrier-Grade Cloud Platform Deploying Virtualized Network Functions in Cloud INDEX 1. Paving the way to Telecom Network Function Virtualization P.3 2. Open Carrier-grade Hypervisor P.3 Latency and
How To Improve Your Communication With An Informatica Ultra Messaging Streaming Edition
Messaging High Performance Peer-to-Peer Messaging Middleware brochure Can You Grow Your Business Without Growing Your Infrastructure? The speed and efficiency of your messaging middleware is often a limiting
Maximum performance, minimal risk for data warehousing
SYSTEM X SERVERS SOLUTION BRIEF Maximum performance, minimal risk for data warehousing Microsoft Data Warehouse Fast Track for SQL Server 2014 on System x3850 X6 (95TB) The rapid growth of technology has
How To Protect Data On Network Attached Storage (Nas) From Disaster
White Paper EMC FOR NETWORK ATTACHED STORAGE (NAS) BACKUP AND RECOVERY Abstract This white paper provides an overview of EMC s industry leading backup and recovery solutions for NAS systems. It also explains
Use product solutions from IBM Tivoli software to align with the best practices of the Information Technology Infrastructure Library (ITIL).
ITIL-aligned solutions White paper Use product solutions from IBM Tivoli software to align with the best practices of the Information Technology Infrastructure Library (ITIL). January 2005 2 Contents 2
