Addressing Big Data Issues in Scientific Data Infrastructure
1 Addressing Big Data Issues in Scientific Data Infrastructure
Yuri Demchenko, SNE Group, University of Amsterdam
21 May 2013, BDDAC2013 Workshop, CTS2013, San Diego
2 Outline
- Background to Big Data research at SNE/UvA
- Big Data definition: the 5 V's of Big Data (Volume, Velocity, Variety, Value, Veracity)
- Use case: High Volume Low Value (HVLV) data for financial market feeds
- Will the Big Data term and definition change/evolve?
- Big Data technologies landscape
- Big Data and Data-Intensive e-Science
- Scientific Data Lifecycle Management
- Scientific Data Infrastructure (SDI) for Big Data
- Cloud and Intercloud based platform for SDI
- Infrastructure research on SDI for Big Data
- Discussion
3 Background to Research on Big Data
- System and Network Engineering (SNE) Group at the University of Amsterdam: optical and high performance networks; e-Science and collaborative applications; computer Grids and Cloud Computing; security, access control, trust management; future Scientific Data Infrastructure
- Fusion of industry Big Data and Data-Intensive Science
- Big Data requires infrastructure: high performance computing and networks, access infrastructure and a collaborative environment
- Clouds (and Grids) are not a full answer
4 Big Data and Data-Intensive Science & Technology: the next technology focus
- Big Data is becoming the next buzzword in scientific research and e-Science; there is not yet much academic research and few papers, so one has to dive into blogs and tweets
- Based on the e-Science concept and on digitising all information and artifacts
- Requires new information and semantic models for information structuring and presentation
- Requires new research methods using large data sets and data mining; methods will evolve and results will improve
- Changes the way modern research is done (in e-Science): secondary research, data re-purposing, linking data and publications
- Big Data requires infrastructure to support both distributed data (collection, storage, processing) and metadata/discovery services
- Demand for a trusted/trustworthy infrastructure
- Clouds provide just the right technology for (data supporting) infrastructure virtualisation
5 Big Data Challenges and Initiatives
- A Vision for Global Research Data Infrastructure: final roadmap report published
- Peta- and exa-scale problems in storage, computing, and transfer/network: International Exascale Software Project
- International initiative: the Research Data Alliance (RDA), launched 2-3 October 2012 in Washington, "to accelerate international data-driven innovation and discovery by facilitating research data sharing and exchange, use and re-use, standards harmonization, and discoverability"
- RDA consolidated previous initiatives: the Data Web Forum (DWF) initiated by NSF, and the Data Access and Interoperability Task Force (DAITF) initiated by the EUDAT project
- First meeting and official RDA launch: Gothenburg, March 2013; next meeting: September 2013, Washington
6 Big Data Definition (1)
- IDC definition (a conservative and strict approach): "A new generation of technologies and architectures designed to economically extract value from very large volumes of a wide variety of data by enabling high-velocity capture, discovery, and/or analysis"
- Big Data: a massive volume of both structured and unstructured data that is so large that it is difficult to process using traditional database and software techniques (from "The Big Data Long Tail" blog post by Jason Bloomberg, 17 January 2013)
- "Big data is data that exceeds the processing capacity of conventional database systems. The data is too big, moves too fast, or doesn't fit the structures of your database architectures. To gain value from this data, you must choose an alternative way to process it." (Ed Dumbill, program chair for the O'Reilly Strata Conference)
- Termed the Fourth Paradigm*: "The techniques and technologies for such data-intensive science are so different that it is worth distinguishing data-intensive science from computational science as a new, fourth paradigm for scientific exploration." (Jim Gray, computer scientist)
*) The Fourth Paradigm: Data-Intensive Scientific Discovery. Edited by Tony Hey, Stewart Tansley, and Kristin Tolle. Microsoft Research, 2009.
7 Big Data Definition (2): Motivation
- Why is Big Data based decision-making support so important? Read the book Thinking, Fast and Slow (2011) by Daniel Kahneman, winner of the Nobel Memorial Prize in Economics: our thinking is systematically mistaken when it comes to (subconscious) statistical data assessment (Dutch translation: Ons feilbare denken, "Our fallible thinking")
- Blog article by Mike Gualtieri from Forrester: "Firms increasingly realize that [big data] must use predictive and descriptive analytics to find nonobvious information to discover value in the data. Advanced analytics uses advanced statistical, data mining and machine learning algorithms to dig deeper to find patterns that you can't see using traditional BI (Business Intelligence) tools, simple queries, or rules."
8 The 5 V's of Big Data
- Volume: terabytes; records/archives; transactions; tables, files
- Velocity: batch; real/near-time; processes; streams
- Variety: structured; unstructured; multi-factor; probabilistic
- Value: statistical; events; correlations; hypothetical
- Veracity: trustworthiness; authenticity; origin, reputation; availability; accountability
(Volume, Velocity and Variety are the commonly accepted 3 V's of Big Data.)
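The 5 V's above can be read as five dimensions describing any dataset. A minimal sketch of that idea as a record type (the class, field names and example values are illustrative, not part of the talk):

```python
from dataclasses import dataclass

@dataclass
class DatasetProfile:
    """Characterise a dataset along the 5 V's (names are illustrative)."""
    volume_tb: float   # Volume: size in terabytes
    velocity: str      # Velocity: "batch", "near-time", "real-time", "stream"
    variety: str       # Variety: "structured", "semi-structured", "unstructured"
    value: str         # Value: e.g. "statistical", "events", "correlations"
    veracity: float    # Veracity: a trust score in [0, 1]

# Hypothetical profile of a detector data stream
detector = DatasetProfile(volume_tb=5000.0, velocity="stream",
                          variety="structured", value="statistical",
                          veracity=0.99)
print(detector.velocity)  # stream
```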
9 Volume, Velocity, Variety: Examples in e-Science
- Volume (terabytes of records, transactions, tables, files): the LHC produces about 5 PB a month (the collider is currently shut down for upgrades); LOFAR produces about 5 PB every hour and requires processing as soon as possible to discard non-informative data; other astronomy research; genome research; Earth, climate and weather data
- Velocity (batch, near-time, real-time, streams): the LHC ATLAS detector collects terabytes of data from its sensors during a collision time of about 1 ms; which other research areas require high-speed data?
- Variety (structured, unstructured, semi-structured, and all of the above in a mix): biological, medical, and facial-recognition research; human, psychology and behaviour research; history, archaeology and artifacts
10 Volume, Velocity, Variety: Examples in Industry
- Volume: a Boeing jet engine produces 10 TB of operational data for every 30 minutes it runs; hence a 4-engine jumbo jet can create 640 TB on one Atlantic crossing. Multiply that by the 25,000 flights flown each day and we get the picture
- Velocity: today's on-line ad serving requires about 40 ms to respond with a decision; financial services (e.g., stock quote feeds) need about 1 ms to calculate customer scoring probabilities; stream data, such as movies, need to travel at high speed for proper rendering
- Variety: Walmart processes 1M customer transactions per hour and feeds the information into a database estimated at 2.5 PB (petabytes); there are old and new data sources like RFID, sensors, mobile payments, in-vehicle tracking, etc.
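The Boeing figures above can be checked directly: 10 TB per engine per half hour, times 4 engines, over a crossing of roughly 8 hours, gives the quoted 640 TB (the crossing duration is my assumption; the slide does not state it):

```python
tb_per_engine_per_half_hour = 10
engines = 4
crossing_hours = 8                    # assumed duration of an Atlantic crossing
half_hours = crossing_hours * 2

total_tb = tb_per_engine_per_half_hour * engines * half_hours
print(total_tb)                       # 640 TB, matching the slide

# Scale to the quoted global fleet of 25,000 flights per day
daily_pb = total_tb * 25_000 / 1024   # terabytes -> petabytes
print(daily_pb)                       # 15625.0 PB per day
```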
11 Big Data Veracity
- Trustworthiness and reputation -> integrity
- Origin, authenticity and identification: identification of both data and source; source linkage to system/domain and author (for complex hierarchical data, data provenance)
- Availability: timeliness; mobility (mobile/remote access; roaming from another domain; federation)
- Accountability: a kind of pro-active measure to ensure data veracity
12 Big Data Challenges: Technological
- How to scale up and down (grow or shrink)? Primarily a database issue: SQL scales up easily but does not easily scale down when demand decreases; NoSQL ("Not only SQL") can partly address this issue
- SQL has complex syntax, strong schema typing, and predictable performance; NoSQL is more flexible in adapting to new business processes
- NoSQL offers primarily, but not only, key-value and document-based structures and data models, to respond to specific use cases and operations over data
- Data mining / data intelligence algorithms to handle and discover new data structures and multi-type data relations; human/behavioural/social targeted data analysis (i.e., fuzzy and biased data)
- Infrastructure support for storing and moving data and for on-demand processing: is Cloud Computing the right technology? Any alternative? High-speed network infrastructure, on-demand provisioning
- Big Data security, trustworthiness and data-centric security
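The schema-flexibility point above can be illustrated with a toy document collection: unlike a fixed relational schema, each record may carry different fields, and queries must tolerate missing keys. This is only a sketch of the data model; real document stores add indexing, sharding and replication on top.

```python
# Toy "document store": records with heterogeneous schemas in one collection,
# e.g. mixed message types from a financial market feed
collection = [
    {"id": 1, "type": "trade", "symbol": "XYZ", "price": 101.5},
    {"id": 2, "type": "quote", "symbol": "XYZ", "bid": 101.4, "ask": 101.6},
    {"id": 3, "type": "news",  "symbol": "XYZ", "headline": "Earnings beat"},
]

# A query tolerates missing keys instead of relying on a fixed schema
prices = [doc["price"] for doc in collection if "price" in doc]
print(prices)  # [101.5]

# Adding a record with a new field requires no schema migration
collection.append({"id": 4, "type": "trade", "symbol": "ABC",
                   "price": 99.0, "venue": "dark-pool"})
print(len(collection))  # 4
```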
13 Big Data Challenges: Socio-Technological
- Extending the Big Data outreach/perimeter: the technology will boom if there is a sufficient customer and user base; currently the majority of Big Data consumers are big companies, although we all contribute by feeding in our activity and usage log data
- Moving big data from big companies to users and homes: smart homes, sensors and devices; without sensors and devices humans can neither create nor use big data; smart visualisation can solve the problem of using and acting on big data
- Lowering the entry level for using Big Data: you should not have to be a data expert to use Big Data; need for scalable, configurable tools
- Big Data and privacy issues: digital footprint and re-identification
15 Big Data Infrastructure Components
- Cloud based infrastructure services for data-centric applications (storage, compute, infrastructure/VM management)
- Cluster services; Hadoop related services and tools
- Specialised data analytics tools (logs, events, data mining, etc.)
- Databases/servers: SQL, NoSQL; MPP (Massively Parallel Processing) databases
- Big Data management: registries, indexing/search, semantics, namespaces
- Security infrastructure (access control, policy enforcement, confidentiality, trust, availability, privacy)
- Collaborative environment (group management)
16 Will the Big Data Term Sustain?
- Other names: Big Data Analytics, Data Analytics, Intelligent Analytics, proposed to avoid the term itself becoming a fetish; but these names miss the infrastructure component
- Big Data vs Data-Intensive Science: e-Science is based on, and involves, wide cooperation between researchers
- New concepts related to Big Data: disposable data, in contrast to data that are supposed to be stored; the non-deterministic nature of the objects of scientific study, from natural science to economics and social science
17 Big Data and Data-Intensive Science
18 e-Science Features
- Automation of all e-Science processes, including data collection, storing, classification, indexing and the other components of general data curation and provenance
- Transformation of all processes, events and products into digital form by means of multi-dimensional, multi-faceted measurements, monitoring and control; digitising of existing artifacts and other content
- Possibility to re-use the initial and published research data, with possible data re-purposing for secondary research
- Global data availability and access over the network for cooperative groups of researchers, including wide public access to scientific data
- Existence of the necessary infrastructure components and management tools that allow fast composition, adaptation and on-demand provisioning of infrastructures and services for specific research projects and tasks
- Advanced security and access control technologies that ensure secure operation of complex research infrastructures and scientific instruments, and allow creating a trusted, secure environment for cooperating groups and individual researchers
19 Scientific Data Types
- The EC Open Access Initiative requires data linking at all levels and stages: raw data, structured data, published data, and publications with linked data
- Raw data: collected from observation and experiment (according to an initial research model)
- Structured data: data and datasets that went through data filtering and processing (supporting some particular formal model)
- Published data: data that support one or another scientific hypothesis, research result or statement
- Data linked to publications: to support wide research consolidation, integration, and openness
20 Traditional Data Lifecycle Model
- Flow: project/experiment planning -> data collection -> data integration and processing -> publishing research results -> discussion and publications -> archiving; at the end of the project the data are discarded or archived
- Data types involved: raw data, experimental data, structured scientific data; open public use after publication
- Problem: lack of preservation of initial and intermediate data, and no linking of data to publications
21 Scientific Data Lifecycle Management (SDLM) Model
- Data lifecycle model in e-Science: project/experiment planning -> data collection and filtering -> data analysis -> data sharing/publishing -> end of project; followed by data linkage to papers, data archiving, data re-purposing and recycling, data discovery, and data curation (including retirement and clean-up)
- Data types: raw data, experimental data, structured scientific data; open public use supported by metadata and data management
- Data linkage issues: Persistent Identifiers (PID); ORCID (Open Researcher and Contributor ID); linked data; data re-purposing; data clean-up and retirement; data ownership and authority; data retention links
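The SDLM flow above can be sketched as an ordered pipeline with an explicit post-project branch. The stage names paraphrase the slide; the transition helper is my illustration, not part of the model itself.

```python
# Core project stages of the SDLM model, in order (paraphrased)
STAGES = ["planning", "collection_and_filtering", "analysis",
          "sharing_publishing", "end_of_project"]

# After the project ends, data may follow several post-project paths
POST_PROJECT = {"linkage_to_papers", "archiving", "re_purpose", "retirement"}

def next_stage(stage: str) -> str:
    """Return the next core stage, or raise once the project has ended."""
    i = STAGES.index(stage)
    if i == len(STAGES) - 1:
        raise ValueError("project ended; choose a post-project path")
    return STAGES[i + 1]

print(next_stage("analysis"))            # sharing_publishing
print("archiving" in POST_PROJECT)       # True
```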
22 General Requirements on SDI for Emerging Big Data Science
- Support for long-running experiments and large data volumes generated at high speed
- Multi-tier, inter-linked data distribution and replication
- On-demand infrastructure provisioning to support data sets and scientific workflows; mobility of data-centric scientific applications
- Support for virtual scientific communities: dynamic user group creation and management, federated identity management
- Support for the whole data lifecycle, including metadata and data source linkage
- Trusted environment for data storage and processing: researchers need to trust the SDI before putting all their data on it
- Support for data integrity, confidentiality and accountability
- Policy binding to data to protect privacy, confidentiality and IPR
23 Defining an Architecture Framework for SDI and Security
- Scientific Data Lifecycle Management (SDLM) model
- e-SDI multi-layer architecture model
- RORA (Resource-Ownership-Role-Actor) model, defining the relationships between resources, owners, managers and users; initially defined for the telecom domain
- New actors in SDI (and Big Data infrastructure): the subject of the data (e.g. a patient, or a scientific object/paper); the data manager (doctor, seller)
- Security and Access Control and Accounting Infrastructure (ACAI): trust management infrastructure; authentication, authorisation and accounting, supported by a logging service; extended to support data access control and operations on data
24 SDI Architecture Model
- Layer B6: User/Scientific Applications Layer, with scientific (specialist) applications, scientific data sets, user portals and library resources
- Layer B5: Access and Federation Layer, with policy and collaborative groups support and federated identity management (eduGAIN, REFEDS, VOMS, InCommon)
- Layer B4: Scientific Platform and Instruments Layer: shared scientific platforms and instruments specific to the scientific area, also Grid based (e.g. PRACE/DEISA)
- Layer B3: Infrastructure Virtualisation Layer: Cloud/Grid infrastructure virtualisation and management middleware, including middleware security
- Layer B2: Data Center and Computing Facility Layer: Grid/Cloud compute resources, storage resources, sensors and devices, clouds
- Layer B1: Network Infrastructure Layer: optical network infrastructure (e.g. AutoBAHN, eduroam)
- Cross-layer functions: Operation Support and Management Service (OSMS), Security and AAI, Metadata and Data Lifecycle Management
25 SDI Architecture Layers
- Layer D1: Network infrastructure layer, represented by the general purpose Internet infrastructure and dedicated network infrastructure
- Layer D2: Data centers and computing resources/facilities, including sensor networks
- Layer D3: Infrastructure virtualisation layer, represented by the Cloud/Grid infrastructure services and middleware supporting deployment and operation of specialised scientific platforms
- Layer D4: (Shared) scientific platforms and instruments specific to different research areas
- Layer D5: Access infrastructure layer: federation infrastructure components, including policy and collaborative user group support functionality
- Layer D6: Scientific applications and user portals/clients
26 SDI Move to Clouds
- Cloud technologies allow infrastructure virtualisation and its profiling for specific data structures or for specific scientific workflows; clouds provide just the right technology for infrastructure virtualisation to support data sets
- Complex distributed data require infrastructure, creating demand for inter-cloud infrastructure
- Clouds can provide infrastructure on demand to support project-related scientific workflows: similar to Grid, but with the benefit of full on-demand infrastructure provisioning
- Software Defined Infrastructure Services: a wider notion than the currently emerging SDN (Software Defined Networks)
- Distributed Hadoop clusters for HPC and MPP
27 General Use Case for Infrastructure Provisioning: Workflow => Logical (Cloud) Infrastructure
[Diagram: an enterprise/scientific workflow (data input, storage, instruments, filtering, special processing 1 and 2, archive, visual presentation) is mapped onto virtual resources VR1-VR7 distributed over Cloud 1 (IaaS) and Cloud 2 (PaaS), connected to Campus A and Campus B and their visualisation facilities, and used by user groups A and B; together these form an enterprise/project based intercloud infrastructure delivered by cloud IaaS and PaaS providers.]
29 General Use Case for Infrastructure Provisioning: Logical Infrastructure => Network Infrastructure
[Diagram: the logical intercloud infrastructure of the previous slide is mapped onto resource and cloud provider domains: the Campus A and Campus B infrastructures and the Cloud 1 (IaaS) and Cloud 2 (PaaS) provider domains hosting VR1-VR7, interconnected by cloud carrier network infrastructure. The mapping is defined as the InterCloud Architecture Framework (ICAF).]
30 Big Data and Cloud/Intercloud Research Topics
- Mapping from scientific workflows to inter-cloud structures and the supporting infrastructure
- Cloud infrastructure support for Big Data security and trustworthiness (for generically distributed scenarios)
- Data-centric security models: authenticity, authorisation, delegation; trust, trustworthy systems, validity; accounting, auditing
- Privacy: will it change with the new technologies?
31 Questions and Discussion
32 Additional Materials
- Big Data technologies and foreseen innovations
- Trust and trustworthiness
33 Big Data: Existing Technologies (1)
- Big Data storage and database solutions: Google File System, BigTable; new types of databases?
- Big Data computing: MapReduce, Google's Dremel; MPI and computer Grids, to be made capable of running in a distributed environment, closer to the data stores and sources
- Networking for Big Data: capacity, yes; but new protocols? Routing, streaming, load balancing, etc. optimised for big data, workflows and data structures
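The MapReduce model mentioned above can be illustrated with a single-process word count: map emits (word, 1) pairs, a shuffle groups them by key, and reduce sums each group. This sketches only the programming model; real MapReduce distributes these phases across a cluster.

```python
from collections import defaultdict

def map_phase(line):
    """Map: emit a (word, 1) pair for every word in an input line."""
    return [(word, 1) for word in line.split()]

def shuffle(pairs):
    """Shuffle: group all emitted values by key."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: sum the values collected for each key."""
    return {key: sum(values) for key, values in groups.items()}

lines = ["big data needs big infrastructure", "big data moves fast"]
pairs = [pair for line in lines for pair in map_phase(line)]
counts = reduce_phase(shuffle(pairs))
print(counts["big"])   # 3
print(counts["data"])  # 2
```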
34 Big Data: Existing Technologies (2)
- Social networks and human driven/centric collaboration and sharing environments: Facebook is a heuristic implementation of the human craving to share, socialise and expand; it is not perfect, and will evolve or be overtaken by new, more open technologies supporting human socialising activity
- Current new developments at Facebook: from "Like" to Open Graph as a behavioural technology; a payment system
- Smartphones, tablets and BYOD (Bring Your Own Device), to be supported by employer IT; the smartphone as a hub for human/body sensors and wearable devices
35 Foreseen Big Data Innovations in 2013
- Cloud-based Big Data solutions: Amazon's Elastic MapReduce (EMR) is a market leader; new innovative Big Data and Cloud solutions are expected
- Real-time Hadoop: Google Dremel-like solutions that will allow real-time queries on Big Data and be open source
- Distributed machine learning: Mahout, with iterative, scalable, distributed machine learning and data mining algorithms; new algorithms: Jubatus, HogWild
- Big Data appliances (also for the home): Raspberry Pi and home-made GPU clusters; hardware vendors (Dell, HP, etc.) pack mobile ARM processors into server boxes; Adapteva's Parallella will put a 16-core supercomputer into a $99 board
- Easier Big Data tools: open source, easy-to-use drag-and-drop tools for Big Data analytics to facilitate adoption; commercial examples: Radoop (= RapidMiner + Mahout), Tableau, Datameer, etc.
36 Predictions for 2013 and as Far as 2020
- Main technologies and areas: smartphones and tablets; e-medicine and healthcare; Big Data; Cloud: shift from private to public; Big Data and Cloud infographics
- "Big Data in 2013" by Mike Gualtieri, Forrester:
- Firms realize that Big Data means all of their data. Big Data is the frontier of a firm's ability to store, process, and access (SPA) all the data it needs to operate effectively, make decisions, reduce risks, and create better customer experiences. In 2013, all data is Big Data.
- The algorithm wars begin. Advanced analytics uses advanced statistical, data mining, and machine learning algorithms to dig deeper to find patterns that you can't see using traditional BI tools, simple queries, or rules.
- Real-time architectures will swing to prominence. Firms will seek out streaming, event processing, and in-memory data technologies to provide real-time analytics and run predictive models. Mobile is a key driver, because hyper-connected consumers and employees will require architectures that can quickly process incoming data from all digital channels to make business decisions and deliver engaging customer experiences in real time.
- Big Data may be disruptive. This is the year that Time magazine names Big Data its 2013 person of the year.
37 Trust Model Transposition to the ICT/SDI Domain
- Classic ("humanised") trust definition: A trusts B to do something (typically an action, operation, or assertion)
- Trust definition transposed to trustworthy ICT: user A trusts service B to do an operation OP (or make an assertion) with data D on a platform P, where the data are associated with a data handling policy
- The platform is multi-tier, multi-domain and multi-provider, and includes computing, storage and network
- Data may be stored by a third party and belong to a domain different from the user's
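The transposed trust statement above (user A trusts service B to do operation OP with data D on platform P, under a data handling policy) can be written down as a record, which makes the policy check explicit. This is an illustrative encoding of my own, not a formal model from the talk.

```python
from dataclasses import dataclass

@dataclass
class TrustAssertion:
    user: str        # A: the trusting party
    service: str     # B: the trusted service
    operation: str   # OP: what B is trusted to do
    data: str        # D: the data the operation acts on
    platform: str    # P: where the operation runs
    policy: set      # operations the data handling policy permits

    def permitted(self) -> bool:
        """The assertion only holds if the attached policy allows OP."""
        return self.operation in self.policy

t = TrustAssertion(user="A", service="B", operation="process",
                   data="D", platform="P", policy={"store", "process"})
print(t.permitted())  # True
```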
38 Components of Trusted/Trustworthy ICT
[Diagram: trust between domains A and B. In user trust domain A, data are stored at the user's facilities and handled by the user client/system; in provider trust domain B, data are stored in the provider domain and operated on by an application running on the provider's platform/system. The user client trusts the provider's system/platform.]
- The user and the service provider are the two actors concerned with their own data/content security and with each other's system/platform trustworthiness
- Data may be stored separately but typically belong to the user's trust domain; in clouds this needs to be extended to a distributed, multi-domain and multi-provider environment
- Two other aspects of security and trust: data stored vs data accessed/processed; system idle vs active (running a user session)
39 Trustworthiness and Trustworthy Systems
The definition and analysis of a trustworthy infrastructure/system must include more factors:
- Two or more parties involved in the trust relations, typically users (or user client/application) and provider (or server or service/application)
- Data, with their ownership, privacy, confidentiality level, and applied/attached data handling policy
- Computing platform and storage (which can be associated with the general platform)
- Underlying communication infrastructure (which can also be associated with the general platform)
- Possible inter-domain issues related to different policies and assurance/provisions
40 Security in a Virtualised Environment (2): Platform, OS/VM, Application and Network
[Diagram: two virtualisation platforms (clouds), each hosting an OS/VM, an application container and applications, require multilayer security: VPN based transport network security between platforms, TLS/SSL security between containers, and end-to-end security between applications; supported by a metadata information/registry service, session credentials/context together with policy, user/system credentials and secrets, and a platform trust anchor (TPM, hardware secret key).]
- Requires multilayer security and trust
- Security and access control policies may be defined for each layer
- Data-related components defining access control: metadata, encapsulated policies
Are You Ready for Big Data?
Are You Ready for Big Data? Jim Gallo National Director, Business Analytics April 10, 2013 Agenda What is Big Data? How do you leverage Big Data in your company? How do you prepare for a Big Data initiative?
Datenverwaltung im Wandel - Building an Enterprise Data Hub with
Datenverwaltung im Wandel - Building an Enterprise Data Hub with Cloudera Bernard Doering Regional Director, Central EMEA, Cloudera Cloudera Your Hadoop Experts Founded 2008, by former employees of Employees
Hortonworks & SAS. Analytics everywhere. Page 1. Hortonworks Inc. 2011 2014. All Rights Reserved
Hortonworks & SAS Analytics everywhere. Page 1 A change in focus. A shift in Advertising From mass branding A shift in Financial Services From Educated Investing A shift in Healthcare From mass treatment
Analytics in the Cloud. Peter Sirota, GM Elastic MapReduce
Analytics in the Cloud Peter Sirota, GM Elastic MapReduce Data-Driven Decision Making Data is the new raw material for any business on par with capital, people, and labor. What is Big Data? Terabytes of
COMP9321 Web Application Engineering
COMP9321 Web Application Engineering Semester 2, 2015 Dr. Amin Beheshti Service Oriented Computing Group, CSE, UNSW Australia Week 11 (Part II) http://webapps.cse.unsw.edu.au/webcms2/course/index.php?cid=2411
Big Data and Analytics: Challenges and Opportunities
Big Data and Analytics: Challenges and Opportunities Dr. Amin Beheshti Lecturer and Senior Research Associate University of New South Wales, Australia (Service Oriented Computing Group, CSE) Talk: Sharif
Workprogramme 2014-15
Workprogramme 2014-15 e-infrastructures DCH-RP final conference 22 September 2014 Wim Jansen einfrastructure DG CONNECT European Commission DEVELOPMENT AND DEPLOYMENT OF E-INFRASTRUCTURES AND SERVICES
Data Warehousing in the Age of Big Data
Data Warehousing in the Age of Big Data Krish Krishnan AMSTERDAM BOSTON HEIDELBERG LONDON NEW YORK OXFORD * PARIS SAN DIEGO SAN FRANCISCO SINGAPORE SYDNEY TOKYO Morgan Kaufmann is an imprint of Elsevier
1 st Symposium on Colossal Data and Networking (CDAN-2016) March 18-19, 2016 Medicaps Group of Institutions, Indore, India
1 st Symposium on Colossal Data and Networking (CDAN-2016) March 18-19, 2016 Medicaps Group of Institutions, Indore, India Call for Papers Colossal Data Analysis and Networking has emerged as a de facto
3rd International Symposium on Big Data and Cloud Computing Challenges (ISBCC-2016) March 10-11, 2016 VIT University, Chennai, India
3rd International Symposium on Big Data and Cloud Computing Challenges (ISBCC-2016) March 10-11, 2016 VIT University, Chennai, India Call for Papers Cloud computing has emerged as a de facto computing
So What s the Big Deal?
So What s the Big Deal? Presentation Agenda Introduction What is Big Data? So What is the Big Deal? Big Data Technologies Identifying Big Data Opportunities Conducting a Big Data Proof of Concept Big Data
Impact of Big Data in Oil & Gas Industry. Pranaya Sangvai Reliance Industries Limited 04 Feb 15, DEJ, Mumbai, India.
Impact of Big Data in Oil & Gas Industry Pranaya Sangvai Reliance Industries Limited 04 Feb 15, DEJ, Mumbai, India. New Age Information 2.92 billions Internet Users in 2014 Twitter processes 7 terabytes
Keywords Big Data, NoSQL, Relational Databases, Decision Making using Big Data, Hadoop
Volume 4, Issue 1, January 2014 ISSN: 2277 128X International Journal of Advanced Research in Computer Science and Software Engineering Research Paper Available online at: www.ijarcsse.com Transitioning
Deploying Big Data to the Cloud: Roadmap for Success
Deploying Big Data to the Cloud: Roadmap for Success James Kobielus Chair, CSCC Big Data in the Cloud Working Group IBM Big Data Evangelist. IBM Data Magazine, Editor-in- Chief. IBM Senior Program Director,
Integrating a Big Data Platform into Government:
Integrating a Big Data Platform into Government: Drive Better Decisions for Policy and Program Outcomes John Haddad, Senior Director Product Marketing, Informatica Digital Government Institute s Government
Big Data Technologies Compared June 2014
Big Data Technologies Compared June 2014 Agenda What is Big Data Big Data Technology Comparison Summary Other Big Data Technologies Questions 2 What is Big Data by Example The SKA Telescope is a new development
How to use Big Data in Industry 4.0 implementations. LAURI ILISON, PhD Head of Big Data and Machine Learning
How to use Big Data in Industry 4.0 implementations LAURI ILISON, PhD Head of Big Data and Machine Learning Big Data definition? Big Data is about structured vs unstructured data Big Data is about Volume
Trends and Research Opportunities in Spatial Big Data Analytics and Cloud Computing NCSU GeoSpatial Forum
Trends and Research Opportunities in Spatial Big Data Analytics and Cloud Computing NCSU GeoSpatial Forum Siva Ravada Senior Director of Development Oracle Spatial and MapViewer 2 Evolving Technology Platforms
Chapter 7. Using Hadoop Cluster and MapReduce
Chapter 7 Using Hadoop Cluster and MapReduce Modeling and Prototyping of RMS for QoS Oriented Grid Page 152 7. Using Hadoop Cluster and MapReduce for Big Data Problems The size of the databases used in
Chukwa, Hadoop subproject, 37, 131 Cloud enabled big data, 4 Codd s 12 rules, 1 Column-oriented databases, 18, 52 Compression pattern, 83 84
Index A Amazon Web Services (AWS), 50, 58 Analytics engine, 21 22 Apache Kafka, 38, 131 Apache S4, 38, 131 Apache Sqoop, 37, 131 Appliance pattern, 104 105 Application architecture, big data analytics
VIEWPOINT. High Performance Analytics. Industry Context and Trends
VIEWPOINT High Performance Analytics Industry Context and Trends In the digital age of social media and connected devices, enterprises have a plethora of data that they can mine, to discover hidden correlations
The 3 questions to ask yourself about BIG DATA
The 3 questions to ask yourself about BIG DATA Do you have a big data problem? Companies looking to tackle big data problems are embarking on a journey that is full of hype, buzz, confusion, and misinformation.
Cluster, Grid, Cloud Concepts
Cluster, Grid, Cloud Concepts Kalaiselvan.K Contents Section 1: Cluster Section 2: Grid Section 3: Cloud Cluster An Overview Need for a Cluster Cluster categorizations A computer cluster is a group of
Ganzheitliches Datenmanagement
Ganzheitliches Datenmanagement für Hadoop Michael Kohs, Senior Sales Consultant @mikchaos The Problem with Big Data Projects in 2016 Relational, Mainframe Documents and Emails Data Modeler Data Scientist
Big Data Executive Survey
Big Data Executive Full Questionnaire Big Date Executive Full Questionnaire Appendix B Questionnaire Welcome The survey has been designed to provide a benchmark for enterprises seeking to understand the
Microsoft Big Data. Solution Brief
Microsoft Big Data Solution Brief Contents Introduction... 2 The Microsoft Big Data Solution... 3 Key Benefits... 3 Immersive Insight, Wherever You Are... 3 Connecting with the World s Data... 3 Any Data,
5 Keys to Unlocking the Big Data Analytics Puzzle. Anurag Tandon Director, Product Marketing March 26, 2014
5 Keys to Unlocking the Big Data Analytics Puzzle Anurag Tandon Director, Product Marketing March 26, 2014 1 A Little About Us A global footprint. A proven innovator. A leader in enterprise analytics for
HDP Hadoop From concept to deployment.
HDP Hadoop From concept to deployment. Ankur Gupta Senior Solutions Engineer Rackspace: Page 41 27 th Jan 2015 Where are you in your Hadoop Journey? A. Researching our options B. Currently evaluating some
EDISON Education for Data Intensive Science to Open New science frontiers
H2020 INFRASUPP-4 CSA Project EDISON Education for Data Intensive Science to Open New science frontiers Yuri Demchenko University of Amsterdam Outline Consortium members EDISON Project Concept and Objectives
Big Data and Data Intensive Science:
Big Data and Data Intensive Science: Infrastructure and Other Challenges Yuri Demchenko SNE Group, University of Amsterdam EENET Conference 4 October 2013, Tartu, Outline Big Data and Data Intensive Science
NoSQL for SQL Professionals William McKnight
NoSQL for SQL Professionals William McKnight Session Code BD03 About your Speaker, William McKnight President, McKnight Consulting Group Frequent keynote speaker and trainer internationally Consulted to
Oracle Big Data SQL Technical Update
Oracle Big Data SQL Technical Update Jean-Pierre Dijcks Oracle Redwood City, CA, USA Keywords: Big Data, Hadoop, NoSQL Databases, Relational Databases, SQL, Security, Performance Introduction This technical
Oracle Big Data Strategy Simplified Infrastrcuture
Big Data Oracle Big Data Strategy Simplified Infrastrcuture Selim Burduroğlu Global Innovation Evangelist & Architect Education & Research Industry Business Unit Oracle Confidential Internal/Restricted/Highly
Detecting Anomalous Behavior with the Business Data Lake. Reference Architecture and Enterprise Approaches.
Detecting Anomalous Behavior with the Business Data Lake Reference Architecture and Enterprise Approaches. 2 Detecting Anomalous Behavior with the Business Data Lake Pivotal the way we see it Reference
Reference Architecture, Requirements, Gaps, Roles
Reference Architecture, Requirements, Gaps, Roles The contents of this document are an excerpt from the brainstorming document M0014. The purpose is to show how a detailed Big Data Reference Architecture
BIG DATA Alignment of Supply & Demand Nuria de Lama Representative of Atos Research &
BIG DATA Alignment of Supply & Demand Nuria de Lama Representative of Atos Research & Innovation 04-08-2011 to the EC 8 th February, Luxembourg Your Atos business Research technologists. and Innovation
An Oracle White Paper November 2010. Leveraging Massively Parallel Processing in an Oracle Environment for Big Data Analytics
An Oracle White Paper November 2010 Leveraging Massively Parallel Processing in an Oracle Environment for Big Data Analytics 1 Introduction New applications such as web searches, recommendation engines,
BIG DATA-AS-A-SERVICE
White Paper BIG DATA-AS-A-SERVICE What Big Data is about What service providers can do with Big Data What EMC can do to help EMC Solutions Group Abstract This white paper looks at what service providers
and Deployment Roadmap for Satellite Ground Systems
A Cloud-Based Reference Model and Deployment Roadmap for Satellite Ground Systems 2012 Ground System Architectures Workshop February 29, 2012 Dr. Craig A. Lee The Aerospace Corporation The Aerospace Corporation
The Impact of PaaS on Business Transformation
The Impact of PaaS on Business Transformation September 2014 Chris McCarthy Sr. Vice President Information Technology 1 Legacy Technology Silos Opportunities Business units Infrastructure Provisioning
NIH Commons Overview, Framework & Pilots - Version 1. The NIH Commons
The NIH Commons Summary The Commons is a shared virtual space where scientists can work with the digital objects of biomedical research, i.e. it is a system that will allow investigators to find, manage,
Challenges for Data Driven Systems
Challenges for Data Driven Systems Eiko Yoneki University of Cambridge Computer Laboratory Quick History of Data Management 4000 B C Manual recording From tablets to papyrus to paper A. Payberah 2014 2
Data Virtualization for Agile Business Intelligence Systems and Virtual MDM. To View This Presentation as a Video Click Here
Data Virtualization for Agile Business Intelligence Systems and Virtual MDM To View This Presentation as a Video Click Here Agenda Data Virtualization New Capabilities New Challenges in Data Integration
Transforming the Telecoms Business using Big Data and Analytics
Transforming the Telecoms Business using Big Data and Analytics Event: ICT Forum for HR Professionals Venue: Meikles Hotel, Harare, Zimbabwe Date: 19 th 21 st August 2015 AFRALTI 1 Objectives Describe
On Establishing Big Data Breakwaters
On Establishing Big Data Breakwaters with Analytics Dr. - Ing. Morris Riedel Head of Research Group High Productivity Data Processing, Juelich Supercomputing Centre, Germany Adjunct Associated Professor,
A New Era Of Analytic
Penang egovernment Seminar 2014 A New Era Of Analytic Megat Anuar Idris Head, Project Delivery, Business Analytics & Big Data Agenda Overview of Big Data Case Studies on Big Data Big Data Technology Readiness
High Performance Data Management Use of Standards in Commercial Product Development
v2 High Performance Data Management Use of Standards in Commercial Product Development Jay Hollingsworth: Director Oil & Gas Business Unit Standards Leadership Council Forum 28 June 2012 1 The following
Big Data Challenges in Bioinformatics
Big Data Challenges in Bioinformatics BARCELONA SUPERCOMPUTING CENTER COMPUTER SCIENCE DEPARTMENT Autonomic Systems and ebusiness Pla?orms Jordi Torres [email protected] Talk outline! We talk about Petabyte?
Analyzing Big Data with AWS
Analyzing Big Data with AWS Peter Sirota, General Manager, Amazon Elastic MapReduce @petersirota What is Big Data? Computer generated data Application server logs (web sites, games) Sensor data (weather,
Outline. What is Big data and where they come from? How we deal with Big data?
What is Big Data Outline What is Big data and where they come from? How we deal with Big data? Big Data Everywhere! As a human, we generate a lot of data during our everyday activity. When you buy something,
Executive Summary... 2 Introduction... 3. Defining Big Data... 3. The Importance of Big Data... 4 Building a Big Data Platform...
Executive Summary... 2 Introduction... 3 Defining Big Data... 3 The Importance of Big Data... 4 Building a Big Data Platform... 5 Infrastructure Requirements... 5 Solution Spectrum... 6 Oracle s Big Data
Information Architecture
The Bloor Group Actian and The Big Data Information Architecture WHITE PAPER The Actian Big Data Information Architecture Actian and The Big Data Information Architecture Originally founded in 2005 to
Big Data Analytics. Prof. Dr. Lars Schmidt-Thieme
Big Data Analytics Prof. Dr. Lars Schmidt-Thieme Information Systems and Machine Learning Lab (ISMLL) Institute of Computer Science University of Hildesheim, Germany 33. Sitzung des Arbeitskreises Informationstechnologie,
Exploiting Data at Rest and Data in Motion with a Big Data Platform
Exploiting Data at Rest and Data in Motion with a Big Data Platform Sarah Brader, [email protected] What is Big Data? Where does it come from? 12+ TBs of tweet data every day 30 billion RFID tags
BIG DATA What it is and how to use?
BIG DATA What it is and how to use? Lauri Ilison, PhD Data Scientist 21.11.2014 Big Data definition? There is no clear definition for BIG DATA BIG DATA is more of a concept than precise term 1 21.11.14
TUT NoSQL Seminar (Oracle) Big Data
Timo Raitalaakso +358 40 848 0148 [email protected] TUT NoSQL Seminar (Oracle) Big Data 11.12.2012 Timo Raitalaakso MSc 2000 Work: Solita since 2001 Senior Database Specialist Oracle ACE 2012 Blog: http://rafudb.blogspot.com
Parallel Data Warehouse
MICROSOFT S ANALYTICS SOLUTIONS WITH PARALLEL DATA WAREHOUSE Parallel Data Warehouse Stefan Cronjaeger Microsoft May 2013 AGENDA PDW overview Columnstore and Big Data Business Intellignece Project Ability
Data Virtualization A Potential Antidote for Big Data Growing Pains
perspective Data Virtualization A Potential Antidote for Big Data Growing Pains Atul Shrivastava Abstract Enterprises are already facing challenges around data consolidation, heterogeneity, quality, and
Big Analytics: A Next Generation Roadmap
Big Analytics: A Next Generation Roadmap Cloud Developers Summit & Expo: October 1, 2014 Neil Fox, CTO: SoftServe, Inc. 2014 SoftServe, Inc. Remember Life Before The Web? 1994 Even Revolutions Take Time
