WINTERCORP EXECUTIVE REPORT

BIG DATA: BUSINESS OPPORTUNITIES, REQUIREMENTS AND ORACLE'S APPROACH

Richard Winter, December 2011

SUMMARY

New sources of data and distinctive types of data analysis are enabling a new set of business opportunities. Dubbed Big Data, this new arena is characterized primarily by large and rapidly growing data volumes; varied data structures; and new or newly intensified data analysis requirements. As enterprises harness big data, they are discovering opportunities to better understand and predict the interests and behavior of their customers, especially in connection with e-commerce and social networking. In engineering and manufacturing, companies are finding new opportunities to predict maintenance problems, enhance manufacturing quality and manage costs via big data. And in healthcare, there are new opportunities to predict and/or more rapidly react to critical clinical events, yielding better health outcomes and more effective cost management.

This WinterCorp Executive Report provides an introduction to big data. The Report describes what big data really is and why big data is enabling new business opportunities. The Report then reviews requirements and describes the approach adopted by Oracle Corporation to provide its customers with enterprise-class products for big data. The development of this report by the independent data management expert WinterCorp has been sponsored by Oracle. A sidebar describes the research methodology used.

SPONSORED RESEARCH PROGRAM

WHAT IS BIG DATA?

The term Big Data first surfaced about five years ago in connection with the data management and analytical needs of extremely large internet businesses such as Google and Yahoo. These companies needed an economical and highly scalable architecture for the analysis of certain files, such as their logs of web activity.
Their data and analytical requirements did not fit closely with the relational database products used in most commercial systems. And, because their data volumes were so extremely large and fast growing (between ten and one hundred times the size of the largest relational databases), they decided to create their own solutions. As these techniques spread, principally via the use of open source licenses, it turned out that similar requirements existed in many enterprises.
About WinterCorp

WinterCorp is an independent consulting firm that specializes in the scalability of terabyte- and petabyte-scale data management systems and solutions, providing services to assist at every stage of the lifecycle. Since our founding in 1992, we have architected solutions to some of the largest scale and most demanding requirements worldwide. Our consulting services help enterprises define their requirements; architect their solutions; select their platforms; engineer their implementations; and manage their growth to optimize business value. Our seminars and structured workshops help client teams establish a shared foundation of knowledge and move forward to meet their challenges in database scalability, performance and availability. Our expertise encompasses key products in commercial data warehousing as well as open source platforms such as Hadoop and other emerging technologies. With decades of experience in large scale data management implementations, and in-depth knowledge of database products and technology, we deliver unmatched insight into the issues that impede performance and the technologies and solutions that enable business success.

245 First Street, Suite 1800, Cambridge, MA

The usual big data characteristics are:

1. Volume: there is a lot of data to be analyzed and/or the analysis is extremely intense. Either way, a lot of hardware is needed.
2. Variety: the data is not organized into simple, regular patterns as in a table; rather, text, images and highly varied structures, or structures unknown in advance, are typical.
3. Velocity: the data comes into the data management system rapidly and often requires quick analysis or decision making.

In addition, Oracle offers the idea that big data often has low value density. That is, most of the data in its originally received form may be of low value.
Analytical processing may be required in order to transform the data into usable form or derive the usable portion. For example, it may be impossible to derive much business value from the logs of a website prior to sessionization. That is, the logs must first be organized into segments, each of which describes the actions of one user making one website visit. Only after sessionization can the user behavior be analyzed for patterns meaningful for many types of business decision making. Note that low value density does not mean low value. The opposite is true: you may have to analyze a lot of data to find what you want, but you do it because there is very high value to be found. Taken together, these four characteristics (volume, variety, velocity and low value density) are viewed by Oracle as the defining characteristics of big data.

BIG DATA EXAMPLES

E-Commerce and Consumer Marketing. An example of a big data problem is storing, managing and acting on the sentiment expressed by consumers concerning a brand such as Toyota automobiles or children's Tylenol. In the course of a day or a week, consumers can have millions of electronic interactions concerning a brand. While some of this is private, much of the interaction may be directly with the company via its websites, call centers and stores; via email to the company; or via open social media, such as Twitter or Facebook. When something important is happening (for example, consumers reacting to an economic, safety or environmental incident, or to changes in price, product, supply or competition), the company needs to quickly understand how best to react to the situation. A slow reaction can greatly increase the cost or difficulty of solving a problem or cause an important opportunity to be missed. This has long been a challenge, but because of the volume, variety and speed of electronic communication today, the challenge is now often much greater and can require even faster and more decisive action.
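The sessionization step described earlier in this section can be sketched in miniature. The following is a minimal illustration, not a production log processor: the record layout and the 30-minute inactivity timeout are assumptions chosen for the example.

```python
from collections import defaultdict

# Hypothetical log record layout: (user_id, timestamp_in_seconds, url)
SESSION_TIMEOUT = 30 * 60  # an inactivity gap over 30 minutes starts a new session

def sessionize(log_records):
    """Group raw web-log records into per-user sessions ordered by time."""
    by_user = defaultdict(list)
    for user_id, ts, url in log_records:
        by_user[user_id].append((ts, url))

    sessions = []
    for user_id, events in by_user.items():
        events.sort()  # order each user's events by timestamp
        current, last_ts = [], None
        for ts, url in events:
            if last_ts is not None and ts - last_ts > SESSION_TIMEOUT:
                sessions.append((user_id, current))  # close the previous visit
                current = []
            current.append((ts, url))
            last_ts = ts
        if current:
            sessions.append((user_id, current))
    return sessions

logs = [
    ("u1", 0, "/home"), ("u1", 120, "/product"),  # one visit
    ("u1", 7200, "/home"),                        # two hours later: a new visit
    ("u2", 60, "/home"),
]
print(sessionize(logs))  # three sessions: two for u1, one for u2
```

Only after this grouping can per-visit behavior (pages viewed, visit length, abandonment) be analyzed.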
But how to be decisive in the face of a tidal wave of unstructured consumer communications? How to understand what people are really saying, what they are reacting to, what the problem really is, what the solutions are? The unexpectedly large and negative consumer reaction to new fees announced by large US banks in late 2011 is a case in point. Hundreds of thousands of accounts were lost in a few days (the equivalent of a year of losses under typical market conditions) by the time consumer sentiment was fully understood and corrective action could be taken.
Engineering and Manufacturing. Electronic communication by consumers is only one of many types of data which are either entirely new or are now present in much larger volumes than just a few years ago. For example, the many sensors and data capture devices in consumer and industrial products, in engineering and testing processes, and in manufacturing processes now collectively generate truly enormous volumes of data. As a result of the declining cost of automatically collecting and delivering data, the physical world and the daily activities of virtually every enterprise and its customers are being measured in ways we could scarcely have imagined a few years ago. There are hundreds of sensors in every car manufactured today, each generating data many times each second about some component or system of the vehicle when it is in use (e.g., brakes, accelerator, steering, transmission, lubrication, cooling, engine). This data is stored in the car; captured when the car is in for maintenance; and then analyzed to diagnose immediate problems, but also for insight into engineering, manufacturing and maintenance issues. Many commercial fleets feature devices in every vehicle to record every significant action regarding safety, efficiency and other factors, often generating many records per second per vehicle for wireless transmission and immediate collection and analysis. Recall that there are commercial fleets of all kinds: cars, trucks, helicopters, construction equipment, road equipment and a vast array of specialty vehicles. The more costly or critical the vehicle, the more extensive the use of sensors and electronic devices to capture and emit data for safety, operator training, fleet maintenance and management.
Whatever the collection device (whether in manufacturing processes, engineering processes, or end consumer or industrial products), there is only one certainty: the number of such devices will only increase every year, and the amount of data each can generate will do likewise. The result is a rapid and sustained increase in the volume of data to be analyzed in the optimization of most of the large scale manufacturing, product design, engineering and maintenance processes in existence.

Healthcare. Perhaps the ultimate application for intelligent sensors, and a source of both a great volume of data and great value, is our own bodies and the medical devices that can increase and sustain our health, safety and mobility. Intelligent electronic devices (some used by people at home and some that travel with them as they go about their day) now capture and transmit data for analysis in managing chronic diseases and conditions; for dealing with sleep disorders; for monitoring exercise; and for a rapidly growing array of health and medical uses.

Purpose and Methodology for this Report

This WinterCorp Executive Report describes the phenomenon known as Big Data and the approach to it recently introduced by Oracle Corporation. In developing this report, WinterCorp drew on its own independent research and experience; interviewed Oracle employees; and reviewed Oracle product materials. Oracle was provided an opportunity to comment on the paper with respect to facts, in its capacity as the sponsor of this research. WinterCorp has final editorial control over the content of this publication and is solely responsible for any opinions expressed.

In general, more frequent data about what is actually happening with the heart, the breathing process, the blood sugar or the blood pressure as the patient goes about daily life greatly increases the ability to make good clinical decisions.
This is a large improvement over depending only on information that the doctor can gather in a quarterly examination and interview of the patient. Thus, a cardiologist may get data every day about every ambulatory patient (roughly one hundred times as often as he or she could get the information via quarterly office visits), thereby sometimes getting an early warning of a problem and preventing a heart attack.
The data gathered by such medical devices is voluminous and growing rapidly; it calls for intensive and complex analysis, both to enhance clinical decisions and to guide research on better practices. While a relatively new area for big data, healthcare promises to yield large scale and valuable opportunities in the years to come.

Common Characteristics. These three big data examples have some things in common. In each case, there is a high volume of data to be analyzed. In all three cases, the data arrives for analysis at high velocity. In all three cases, there is a lot of data variety. The consumer sentiment comes in mostly unstructured messages and text. In the other cases, there are multiple sources, some structured, some not. And, in all three cases, there is low value density. When the analysis shows that a thousand customers have voiced the same sentiment; that a thousand patients have exhibited a similar pattern prior to heart attack; that a consistent pattern of engineering measurements correlates with a structural failure, then there can be value that has not been found elsewhere. To some extent, all three of these examples can be attacked by storing and analyzing data in a relational data warehouse. But a different type of data analysis environment, one designed for big data, one that features lower cost and higher flexibility, can provide a better fit for the purpose.

NEW OPPORTUNITIES

In addition to similar technical requirements, these three examples have something else in common: an extraordinarily high return on investment based on a solution that was previously infeasible. In the past, it would have either cost too much or taken too long to get the analytical result, particularly in a situation where the outcome was uncertain and many different analytical techniques might have to be tried before the value is found.
In addition, note that these are three examples of a broad and far reaching set of opportunities. For instance, sentiment analysis applies equally to consumer and industrial products. The sensors described in the second example are just one of many types of electronic devices in pervasive use to automatically and remotely capture data for analysis in engineering, manufacturing and maintenance. Also in widespread use are audio and optical devices; location sensing devices; and many other types of measurement devices. Big data is thus linked with an enormous and immensely valuable set of new business, scientific and social opportunities in the years ahead.

NEW TECHNOLOGIES FOR BIG DATA

The new opportunities in front of us promise tremendous advances in commerce, engineering, manufacturing and healthcare, to name just a few examples. But they require the solution to a tough problem: how do you handle these immense volumes of data? How do you deal with data that doesn't fit well structurally in a relational data warehouse? How can you afford to process and store data that arrives constantly, 10 to 100 times faster than your transactional data? How can you accomplish the needed intensive, frequent and sometimes immediate analysis, analysis that is difficult or impossible to express in SQL? And, how do you deal with the economics of such large volumes of data, particularly when you may not know in advance whether it contains information of value? The engineers at Google, Yahoo and other companies working with them, confronting such a problem, decided a new approach was needed. Their approach has some principles in common with the highly parallel relational data warehouse and some principles that are different. First, as in the relational data warehouse, they envisioned large clusters of servers sharing the work to be done.
And, they pictured distributing the data to be analyzed over the servers, so that each server could work on its own piece of the data at the same time, shortening the time to completion. This principle, too, is in common with the relational database. Second, from the start, they knew they wanted a solution that would work for very large clusters of servers: thousands, tens of thousands, even hundreds of thousands of servers. These components would have to be very compact and inexpensive. Otherwise, the solution would become unaffordable as it grew. Third, they focused on how to simplify the writing of application programs that would operate on data distributed across the servers of a large cluster. Thus, they relieved the application programmer of concern with the distributed architecture of the cluster, but chose not to provide certain services we take for granted in a relational database, such as system management and
enforcement of data definitions or system optimization of complex non-procedural queries. From these developmental efforts, two core database capabilities were created: a system called Cassandra for so-called NoSQL operational data management; and a system called Hadoop for highly parallel, intensive data analytics. These systems are now available as open source products from the Apache Foundation. The following brief descriptions of Hadoop and Cassandra are from the Apache Foundation website:

Hadoop. The Apache Hadoop software library is a framework that allows for the distributed processing of large data sets across clusters of computers using a simple programming model. It is designed to scale up from single servers to thousands of machines, each offering local computation and storage. Rather than rely on hardware to deliver high-availability, the library itself is designed to detect and handle failures at the application layer, so delivering a highly-available service on top of a cluster of computers, each of which may be prone to failures. Hadoop supports MapReduce, a software framework for distributed processing of large data sets on compute clusters.

Cassandra. The Apache Cassandra Project develops a highly scalable second-generation distributed database, bringing together Dynamo's fully distributed design and Bigtable's Column Family-based data model. Cassandra was open sourced by Facebook in 2008, and is now developed by Apache committers and contributors from many companies.

Hundreds of companies have initiated programs to evaluate, experiment with and, in some cases, adopt these open source products for big data. But they face new challenges. For example, using a relatively new open source product to manage and analyze large volumes of data as part of a high value business process is a difficult challenge for most companies.
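The MapReduce programming model referred to above can be illustrated in miniature. The sketch below is plain Python, not the Hadoop API; it shows the three phases (map, shuffle, reduce) that Hadoop would distribute across the servers of a cluster, here run in a single process on a toy word-count problem.

```python
from collections import defaultdict

# Map phase: each worker emits (key, value) pairs from its slice of the input.
def map_phase(line):
    for word in line.split():
        yield (word.lower(), 1)

# Shuffle phase: the framework groups emitted values by key across all workers.
def shuffle(pairs):
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

# Reduce phase: each worker aggregates the values for its share of the keys.
def reduce_phase(key, values):
    return (key, sum(values))

lines = ["big data big value", "data velocity"]
pairs = [p for line in lines for p in map_phase(line)]
result = dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())
print(result)  # {'big': 2, 'data': 2, 'value': 1, 'velocity': 1}
```

The programmer writes only the map and reduce functions; the framework handles distribution, grouping and failure recovery, which is exactly the simplification the cluster designers were after.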
ENTERPRISE REQUIREMENTS FOR BIG DATA

Now, what if you want these big data capabilities in your company, which is in retailing, financial services, manufacturing or some other industry? You are going to install a new, large scale infrastructure. You will use it to manage important data to support a business purpose. You will create and maintain analytic software (Java programs employing MapReduce, for example) to use with the data. And, you will want to have a sustainable, adaptable resource that integrates into your existing IT environment and produces actionable, replicable results. You need a way to create, use, manage, adapt and sustain the system over a period of time, as individual employees come and go. You also need to be able to operate such a system reliably; avoid the loss of critical data; readily change the system as new technology and new requirements develop; and apply the system to new problems as they arise. And, of course, the big data solution must be easy to deploy and integrate into your existing IT environment, and also be manageable, supportable and cost effective.

There are other enterprise requirements. Consider a project intended to increase customer retention in a credit card business. The project begins with sentiment analysis to understand what customers like and don't like about the current card, services, pricing, etc. This is performed in the big data environment, in Hadoop, and identifies customers who are expressing some concern. Information on those customers and their issues is then transferred to the data warehouse environment, where it can be combined with other customer data, transaction data and credit scores. From this combined analysis, incentive offers can be developed to retain the most valuable and creditworthy customers from the group that is unhappy but still enrolled. A campaign can then be conducted, measuring the impact of these actions.
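The combined-analysis step above (sentiment results from the big data environment joined with structured warehouse data) can be sketched as follows. All customer records, sentiment scores and thresholds here are hypothetical, invented for the illustration.

```python
# Hypothetical output of the sentiment-analysis step in the big data
# environment: customers expressing concern, with a negative-sentiment score.
at_risk = {"c1": 0.9, "c2": 0.4, "c3": 0.8}

# Hypothetical structured data from the warehouse: spend and credit score.
warehouse = {
    "c1": {"annual_spend": 42000, "credit_score": 780},
    "c2": {"annual_spend": 900,   "credit_score": 610},
    "c3": {"annual_spend": 18000, "credit_score": 720},
}

def retention_targets(at_risk, warehouse,
                      min_sentiment=0.5, min_spend=10000, min_credit=700):
    """Target retention offers at valuable, creditworthy customers whose
    negative sentiment exceeds a threshold."""
    targets = []
    for cust_id, sentiment in at_risk.items():
        profile = warehouse.get(cust_id)
        if (profile is not None
                and sentiment >= min_sentiment
                and profile["annual_spend"] >= min_spend
                and profile["credit_score"] >= min_credit):
            targets.append(cust_id)
    return targets

print(retention_targets(at_risk, warehouse))  # ['c1', 'c3']
```

In practice the join would be expressed in SQL inside the warehouse, but the logic is the same: the big data environment supplies the derived sentiment, and the warehouse supplies the structured attributes that make it actionable.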
The campaign can be assessed partly by the customers' spending behavior and partly by the sentiment they express in email, website visits, etc. Thus, the complete customer retention project involves a dozen or more steps in which data is used in and moved between three different environments: the big data environment, where there are facilities for capturing and analyzing customer sentiment; the data warehouse, where there is structured data on customers, transactions and credit scores; and the business intelligence environment, where there are tools for query, reporting and structured analysis. So, an important part of the enterprise requirement is the need to carry out complete projects that deliver business
value and that apply the data and resources spanning big data, data warehousing and business intelligence.

ORACLE'S ENGINEERED SYSTEMS FOR DATA MANAGEMENT

Oracle addresses the enterprise requirements for big data via its overall approach to data management, including its Engineered Systems, as depicted in Figure 1. Oracle's Engineered Systems for data management include Oracle Exadata Database Machine, Oracle Exalytics and Oracle Big Data Appliance. Each of these consists of integrated hardware and software, preconfigured, pre-tested and engineered for a certain class of data management requirements. Exadata Database Machine is Oracle's solution to typical relational database requirements, both transaction processing and data warehousing. Exalytics In-Memory Machine is aimed at high performance business intelligence requirements. And Oracle Big Data Appliance is Oracle's answer to the big data requirements that are the focus of this report. Note that Oracle's data management software remains available as separate products. This is true for all the software components in the new Oracle Big Data Appliance, as well as for the software components of the other Oracle Engineered Systems for data management.

ORACLE'S BIG DATA APPLIANCE

Oracle approaches big data as it does other data management requirements, believing that many customers reject the burden of integrating hundreds or thousands of software and hardware components to create a complete, scalable data management solution. By relieving the customer of the system integration, system maintenance and system support costs, and by adopting an enterprise approach to customer requirements, Oracle delivers significant value with this approach.
Oracle's key goals for Big Data Appliance are to enable its customers to jump start their big data projects by delivering a comprehensive and pre-integrated solution that is well integrated with Exadata and that the customer can readily deploy and support. Oracle Big Data Appliance comes in a full rack configuration with 864 GB of main memory and 432 TB of storage. The major hardware components are:

- 18 nodes, each a Sun server with 2 CPUs (6-core Intel processors)
- 48 GB memory per node (upgradable to 96 GB or 144 GB)
- 12 x 2 TB disks per node
- InfiniBand networking
- 10 GbE connectivity

Figure 1: Oracle's Engineered Systems for Data Management. Source: Oracle Corporation
Figure 2: Oracle Big Data Appliance. Source: Oracle Corporation

Oracle Big Data Appliance includes a combination of open source components, packaged as system software with the appliance, and software developed by Oracle, packaged as Big Data Connectors. Key components include:

- Open source distribution of Apache Hadoop, the framework for distributed data analysis with programs written in Java or other procedural languages and interfacing to the data in Hadoop via MapReduce;
- Open source distribution of R, a programming language and software environment for statistical computing and graphics; R is part of the GNU project;
- Oracle NoSQL Database, a distributed, scalable, key-value database with a simple programming model, designed for ready deployment and management in an enterprise environment; the principal role for Oracle NoSQL Database here is low latency data capture/update and fast processing of simple operational queries;
- Oracle Data Integrator Application Adaptor for Hadoop, which simplifies the integration of data from Hadoop and data from Oracle databases; once the adaptor has been used to make Hadoop data accessible in the database, end users can access it with SQL and with business intelligence tools;
- Oracle Loader for Hadoop, which supports the use of MapReduce to load data from Hadoop into Oracle Database 11g. Unlike other Hadoop loaders, it generates Oracle internal formats to load data faster and use fewer database system resources;
- Java HotSpot Virtual Machine, a version of the Java virtual machine featuring just-in-time compilation and adaptive optimization, based in part on the automatic identification of hot spots in the executing Java program;
- Oracle Enterprise Linux, the version of Red Hat Linux enhanced by Oracle for enterprise use; certified to work with Oracle middleware; and employed as the operating system on Oracle's Engineered Systems.
To the extent that Oracle can deliver on its aims, Oracle Big Data Appliance will represent a sharp contrast to the customer experience of self-integrating, operating and managing a similar capability built out of open source components and independently sourced commercial hardware and software.

FINDINGS AND RECOMMENDATIONS

The much discussed phenomenon of Big Data, while over promoted in some quarters, is about a very real set of business opportunities with outstanding upside potential. Big data implementations are already a significant business force in hundreds of enterprises. They are likely to have a much greater impact in the coming years. Oracle has outlined an approach to big data, focused on its Engineered Systems, aimed at making big data initiatives practical for the enterprise. The Oracle approach has three key strengths. First, the customer is relieved of the integration involved in assembling a suitable set of hardware and software components to create a big data architecture. When Oracle Big Data Appliance is delivered and connected
appropriately to the customer's existing infrastructure, the customer receives a working hardware and software infrastructure for big data: pre-configured, pre-tested and ready to use. Second, commercial quality support is available and the entire system is supported by a single vendor. This is vastly different from the situation in which the customer must isolate the software and hardware components involved in any given problem and interact with multiple vendors, or perhaps with an open source support community, in order to diagnose and correct the problem. Third, and more significantly for most companies already using Oracle Database: the new big data environment connects to the existing Oracle database environment at the data management software level. So, Oracle Big Data Appliance can be used for staging and ETL processes that need to occur upstream of the data warehouse. The included Oracle Loader for Hadoop can then be used to transfer the data in parallel into Oracle Exadata. Oracle Exadata can readily use data that comes from an Oracle Big Data Appliance in conjunction with data already in the data warehouse. For Oracle customers with an investment in Oracle software and hardware, it is a major benefit to be able to initiate work in big data, knowing that the data warehouse environment will be able to exchange and integrate data with the big data environment. Note that all the software present in Oracle Big Data Appliance is also available separately for customers who prefer to custom configure their hardware and software systems.

SUMMING UP

Oracle Big Data Appliance, in connection with Oracle Exadata Database Machine and Oracle Exalytics In-Memory Machine, offers a new route for Oracle customers to leverage the business opportunities associated with Big Data. Big Data Appliance is aimed at addressing the technical issues of data variety, data velocity and data volume.
At the same time, the Oracle approach to Big Data is focused on addressing the key enterprise requirements of integration, manageability and supportability. As with any new product, customers must evaluate with care and test rigorously to ensure that their requirements for performance, scale, reliability and manageability are in fact satisfied by the products under consideration. But, bearing the product's maturity in mind, WinterCorp recommends that Oracle customers with an interest in big data take a look at Oracle Big Data Appliance.
ORACLE BIG DATA APPLIANCE X3-2 BIG DATA FOR THE ENTERPRISE KEY FEATURES Massively scalable infrastructure to store and manage big data Big Data Connectors delivers load rates of up to 12TB per hour between
Are You Ready for Big Data? Jim Gallo National Director, Business Analytics February 11, 2013 Agenda What is Big Data? How do you leverage Big Data in your company? How do you prepare for a Big Data initiative?
1 Oracle Big Data Appliance Releases 2.5 and 3.0 Ralf Lange Global ISV & OEM Sales Agenda Quick Overview on BDA and its Positioning Product Details and Updates Security and Encryption New Hadoop Versions
Technology Insight Paper Big Data Maximizing the Flow By John Webster August 15, 2012 Enabling you to make the best technology decisions Big Data Maximizing the Flow 1 Big Data Maximizing the Flow 2 The
ORACLG Oracle Press Oracle Big Data Handbook Tom Plunkett Brian Macdonald Bruce Nelson Helen Sun Khader Mohiuddin Debra L. Harding David Segleau Gokula Mishra Mark F. Hornick Robert Stackowiak Keith Laker
An Integrated Analytics & Big Data Infrastructure September 21, 2012 Robert Stackowiak, Vice President Data Systems Architecture Oracle Enterprise Solutions Group The following is intended to outline our
Powered by Vertica Solution Series in conjunction with: hmetrix Revolutionizing Healthcare Analytics with Vertica & Tableau The cost of healthcare in the US continues to escalate. Consumers, employers,
Big Data Technologies Compared June 2014 Agenda What is Big Data Big Data Technology Comparison Summary Other Big Data Technologies Questions 2 What is Big Data by Example The SKA Telescope is a new development
David Chappell SELLING PROJECTS ON THE MICROSOFT BUSINESS ANALYTICS PLATFORM A PERSPECTIVE FOR SYSTEMS INTEGRATORS Sponsored by Microsoft Corporation Copyright 2014 Chappell & Associates Contents Business
A REVIEW PAPER ON THE HADOOP DISTRIBUTED FILE SYSTEM Sneha D.Borkar 1, Prof.Chaitali S.Surtakar 2 Student of B.E., Information Technology, J.D.I.E.T, firstname.lastname@example.org Assistant Professor, Information
White Paper Big Data Executive Overview WP-BD-10312014-01 By Jafar Shunnar & Dan Raver Page 1 Last Updated 11-10-2014 Table of Contents Section 01 Big Data Facts Page 3-4 Section 02 What is Big Data? Page
WHITEPAPER OPEN MODERN DATA ARCHITECTURE FOR FINANCIAL SERVICES RISK MANAGEMENT A top-tier global bank s end-of-day risk analysis jobs didn t complete in time for the next start of trading day. To solve
The 3 questions to ask yourself about BIG DATA Do you have a big data problem? Companies looking to tackle big data problems are embarking on a journey that is full of hype, buzz, confusion, and misinformation.
Big Data Explained An introduction to Big Data Science. 1 Presentation Agenda What is Big Data Why learn Big Data Who is it for How to start learning Big Data When to learn it Objective and Benefits of
IBM System x reference architecture solutions for big data Easy-to-implement hardware, software and services for analyzing data at rest and data in motion Highlights Accelerates time-to-value with scalable,
Dell s SAP HANA Appliance SAP HANA is the next generation of SAP in-memory computing technology. Dell and SAP have partnered to deliver an SAP HANA appliance that provides multipurpose, data source-agnostic,
An Integrated Big Data & Analytics Infrastructure June 14, 2012 Robert Stackowiak, VP ESG Data Systems Architecture Big Data & Analytics as a Service Components Unstructured Data / Sparse Data of Value
Business Analytics and Big Data Computerworld March 2012 Survey Results Purpose & Methodology Survey Sample Survey Method Field Work 1/24/12-2/21/12 Audience Base Computerworld Online Collection Number
E-PAPER March 2014 Big Data & the Cloud: The Sum Is Greater Than the Parts Learn how to accelerate your move to the cloud and use big data to discover new hidden value for your business and your users.
While a number of technologies fall under the Big Data label, Hadoop is the Big Data mascot. Remember it stands front and center in the discussion of how to implement a big data strategy. Early adopters
Cisco for SAP HANA Scale-Out Solution Solution Brief December 2014 With Intelligent Intel Xeon Processors Highlights Scale SAP HANA on Demand Scale-out capabilities, combined with high-performance NetApp
III Big Data Technologies Today, new technologies make it possible to realize value from Big Data. Big data technologies can replace highly customized, expensive legacy systems with a standard solution
Big Data Success Step 1: Get the Technology Right TOM MATIJEVIC Director, Business Development ANDY MCNALIS Director, Data Management & Integration MetaScale is a subsidiary of Sears Holdings Corporation
SEPTEMBER 2013 Buyer s Guide to Big Data Integration Sponsored by Contents Introduction 1 Challenges of Big Data Integration: New and Old 1 What You Need for Big Data Integration 3 Preferred Technology
Tap into Hadoop and Other No SQL Sources Presented by: Trishla Maru What is Big Data really? The Three Vs of Big Data According to Gartner Volume Volume Orders of magnitude bigger than conventional data
An Oracle White Paper November 2010 Leveraging Massively Parallel Processing in an Oracle Environment for Big Data Analytics 1 Introduction New applications such as web searches, recommendation engines,
NoSQL for SQL Professionals William McKnight Session Code BD03 About your Speaker, William McKnight President, McKnight Consulting Group Frequent keynote speaker and trainer internationally Consulted to
Safe Harbor Statement The following is intended to outline our general product direction. It is intended for information purposes only, and may not be incorporated into any contract. It is not a commitment
In-Memory Analytics for Big Data Game-changing technology for faster, better insights WHITE PAPER SAS White Paper Table of Contents Introduction: A New Breed of Analytics... 1 SAS In-Memory Overview...
The big data revolution Friso van Vollenhoven (Xebia) Enterprise NoSQL Recently, there has been a lot of buzz about the NoSQL movement, a collection of related technologies mostly concerned with storing
Big Data Integration Challenges and Opportunities in Accessing and Using Today s Information Research Report Executive Summary Sponsored by Copyright Ventana Research 2013 Do Not Redistribute Without Permission
perspective Data Virtualization A Potential Antidote for Big Data Growing Pains Atul Shrivastava Abstract Enterprises are already facing challenges around data consolidation, heterogeneity, quality, and
White Paper: SAS and Apache Hadoop For Government Unlocking Higher Value From Business Analytics to Further the Mission Inside: Using SAS and Hadoop Together Design Considerations for Your SAS and Hadoop
BIG DATA IS MESSY PARTNER WITH SCALABLE SCALABLE SYSTEMS HADOOP SOLUTION WHAT IS BIG DATA? Each day human beings create 2.5 quintillion bytes of data. In the last two years alone over 90% of the data on
What is Big Data Outline What is Big data and where they come from? How we deal with Big data? Big Data Everywhere! As a human, we generate a lot of data during our everyday activity. When you buy something,
Solution Overview LexisNexis High-Performance Computing Cluster Systems Platform Get More Scalability and Flexibility for What You Will Learn Modern enterprises are challenged with the need to store and
Big Data Are You Ready? Jorge Plascencia Solution Architect Manager Big Data: The Datafication Of Everything Thoughts Devices Processes Thoughts Things Processes Run the Business Organize data to do something
Benchmarking Large-Scale Data Management Insights from Presentation Confidentiality Statement The materials in this presentation are protected under the confidential agreement and/or are copyrighted materials
Architecting the Future of Big Data Whitepaper Apache Hadoop: The Big Data Refinery Introduction Big data has become an extremely popular term, due to the well-documented explosion in the amount of data
Create and Drive Big Data Success Don t Get Left Behind The performance boost from MapR not only means we have lower hardware requirements, but also enables us to deliver faster analytics for our users.
IDC ExpertROI SPOTLIGHT University of Kentucky Leveraging SAP HANA to Lead the Way in Use of Analytics in Higher Education Sponsored by: SAP Matthew Marden April 2014 Randy Perry Overview Founded in 1865
Bringing Big Data into the Enterprise Overview When evaluating Big Data applications in enterprise computing, one often-asked question is how does Big Data compare to the Enterprise Data Warehouse (EDW)?
White Paper Low Cost High Availability Clustering for the Enterprise Jointly published by Winchester Systems Inc. and Red Hat Inc. Linux Clustering Moves Into the Enterprise Mention clustering and Linux
Constructing a Data Lake: Hadoop and Oracle Database United! Sharon Sophia Stephen Big Data PreSales Consultant February 21, 2015 Safe Harbor The following is intended to outline our general product direction.
Cloud Integration and the Big Data Journey - Common Use-Case Patterns A White Paper August, 2014 Corporate Technologies Business Intelligence Group OVERVIEW The advent of cloud and hybrid architectures
Big data: Unlocking strategic dimensions By Teresa de Onis and Lisa Waddell Dell Inc. New technologies help decision makers gain insights from all types of data from traditional databases to high-visibility