Addressing Big Data Issues in Scientific Data Infrastructure




Addressing Big Data Issues in Scientific Data Infrastructure
Yuri Demchenko, SNE Group, University of Amsterdam
21 May 2013, BDDAC2013 Workshop, CTS2013, San Diego

Outline
- Background to Big Data research at SNE/UvA
- Big Data definition: the 5 V's of Big Data (Volume, Velocity, Variety, Value, Veracity)
- Use case: High Volume Low Value (HVLV) data for financial market feeds
- Will the Big Data term and definition change/evolve?
- Big Data technologies landscape
- Big Data and Data-Intensive e-Science; Scientific Data Lifecycle Management
- Scientific Data Infrastructure (SDI) for Big Data
- Cloud and Intercloud based platform for SDI
- Infrastructure research on SDI for Big Data
- Discussion
21 May 2013, BDDAC2013, Big Data & Sci Data Infrastructure (Slide 2)

Background to Research on Big Data
- System and Network Engineering Group at the University of Amsterdam: optical and high performance networks; e-Science and collaborative applications; computer Grids and Cloud Computing; security, access control, trust management
- Future Scientific Data Infrastructure: fusion of industry Big Data and Data-Intensive Science
- Big Data requires infrastructure: high performance computing and networks; access infrastructure and a collaborative environment
- Clouds (and Grids) are not a full answer

Big Data and Data-Intensive Science & Technology - The Next Technology Focus in Scientific Research and e-Science
- Big Data is becoming the next buzzword; there is not much academic research or many papers yet, so one has to dive into blogs and tweets
- Based on the e-Science concept and the digitising of all information and artifacts
- Requires new information and semantic models for information structuring and presentation
- Requires new research methods using large data sets and data mining; methods will evolve and results will improve
- Changes the way modern research is done (in e-Science): secondary research, data re-purposing, linking data and publications
- Big Data requires infrastructure to support both distributed data (collection, storage, processing) and metadata/discovery services; demand for a trusted/trustworthy infrastructure
- Clouds provide just the right technology for (data supporting) infrastructure virtualisation

Big Data Challenges and Initiatives
- A Vision for Global Research Data Infrastructure (http://www.grdi2020.eu/): Final Roadmap Report published
- Peta- and Exa-scale problems: storage, computing, transfer/network; International Exascale Software Project (http://www.exascale.org/)
- International initiative: Research Data Alliance (RDA), http://www.rd-alliance.org/, launched 2-3 Oct 2012, Washington
  - Aim: to accelerate international data-driven innovation and discovery by facilitating research data sharing and exchange, use and re-use, standards harmonization, and discoverability
  - Consolidated previous initiatives: the Data Web Forum (DWF), initiated by NSF, and the Data Access and Interoperability Task Force (DAITF), initiated by the EUDAT project
  - First meeting: RDA Official Launch, Gothenburg, 18-20 March 2013; next meeting: 15-17 September 2013, Washington

Big Data Definition (1)
- IDC definition (a conservative and strict approach) of Big Data: "A new generation of technologies and architectures designed to economically extract value from very large volumes of a wide variety of data by enabling high-velocity capture, discovery, and/or analysis"
- Big Data: a massive volume of both structured and unstructured data that is so large that it is difficult to process using traditional database and software techniques. (From "The Big Data Long Tail" blog post by Jason Bloomberg, Jan 17, 2013. http://www.devx.com/blog/the-big-data-long-tail.html)
- "Data that exceeds the processing capacity of conventional database systems. The data is too big, moves too fast, or doesn't fit the structures of your database architectures. To gain value from this data, you must choose an alternative way to process it." (Ed Dumbill, program chair for the O'Reilly Strata Conference)
- Termed the Fourth Paradigm *): "The techniques and technologies for such data-intensive science are so different that it is worth distinguishing data-intensive science from computational science as a new, fourth paradigm for scientific exploration." (Jim Gray, computer scientist)
*) The Fourth Paradigm: Data-Intensive Scientific Discovery. Edited by Tony Hey, Stewart Tansley, and Kristin Tolle. Microsoft, 2009.

Big Data Definition (2) - Motivation
- Why is Big Data based decision making support so important? Read the book "Thinking, Fast and Slow" (2011) by Daniel Kahneman, winner of the Nobel Memorial Prize in Economics: our thinking is systematically mistaken when it comes to (subconscious) statistical data assessment. (Dutch translation: "Ons feilbare denken", "Our mistaken thinking".)
- Blog article by Mike Gualtieri from Forrester: "Firms increasingly realize that [big data] must use predictive and descriptive analytics to find nonobvious information to discover value in the data. Advanced analytics uses advanced statistical, data mining and machine learning algorithms to dig deeper to find patterns that you can't see using traditional BI (Business Intelligence) tools, simple queries, or rules."

5 V's of Big Data
- Volume: terabytes; records/archives; transactions; tables, files
- Velocity: batch; real/near-time; processes; streams
- Variety: structured; unstructured; multi-factor; probabilistic
(Volume, Velocity and Variety are the commonly accepted 3 V's of Big Data)
- Value: statistical; events; correlations; hypothetical
- Veracity: trustworthiness; authenticity; origin, reputation; availability; accountability

Volume, Velocity, Variety: Examples from e-Science
- Volume (terabytes of records, transactions, tables, files): LHC produces 5 PB a month (the accelerator is now under re-construction); LOFAR produces 5 PB every hour and requires processing as soon as possible to discard non-informative data; other astronomy research; genome research; Earth, climate and weather data
- Velocity (batch, near-time, real-time, streams): the LHC ATLAS detector collects about x TB of data from X sensors during the collision time of about 1? ms; which other research areas require high-speed data?
- Variety (structured, unstructured, semi-structured, and all of the above in a mix): biological and medical research, facial research; human, psychology and behaviour research; history, archaeology and artifacts

Volume, Velocity, Variety: Examples from Industry
- Volume (terabytes of records, transactions, tables, files): a Boeing jet engine produces 10 TB of operational data for every 30 minutes of operation; hence a 4-engine jumbo jet can create 640 TB on one Atlantic crossing; multiply that by the 25,000 flights flown each day and we get the picture
- Velocity (batch, near-time, real-time, streams): today's online ad serving requires a decision within 40 ms; financial services (e.g., stock quote feeds) need about 1 ms to calculate customer scoring probabilities; stream data, such as movies, need to travel at high speed for proper rendering
- Variety (structured, unstructured, semi-structured, and all of the above in a mix): WalMart processes 1M customer transactions per hour and feeds the information into a database estimated at 2.5 PB (petabytes); there are old and new data sources like RFID, sensors, mobile payments, in-vehicle tracking, etc.
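The jet-engine figures above can be sanity-checked with a few lines of arithmetic. The crossing duration of 8 hours is an assumption (it is not stated on the slide), chosen because it reproduces the quoted 640 TB:

```python
# Sanity check of the slide's jet-engine data volumes.
TB_PER_ENGINE_PER_30_MIN = 10
ENGINES = 4
CROSSING_HOURS = 8          # assumed duration of an Atlantic crossing
FLIGHTS_PER_DAY = 25_000

half_hours = CROSSING_HOURS * 60 // 30
tb_per_crossing = TB_PER_ENGINE_PER_30_MIN * ENGINES * half_hours
print(tb_per_crossing)      # 640 TB, matching the slide

# Upper bound if every daily flight were a full crossing:
print(tb_per_crossing * FLIGHTS_PER_DAY)  # 16,000,000 TB/day
```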

Big Data Veracity
- Trustworthiness and reputation -> integrity
- Origin, authenticity and identification: identification of both data and source; source: system/domain and author; data linkage (for complex hierarchical data, data provenance)
- Availability: timeliness; mobility (mobile/remote access; roaming from another domain; federation)
- Accountability: a kind of pro-active measure to ensure data veracity
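One concrete way to support the integrity, origin and provenance aspects listed above is to attach a hash-chained provenance record to each processing stage of a dataset. The sketch below is illustrative only (the record fields and the `provenance_record` helper are invented for this example, not part of the original slides):

```python
import hashlib

def provenance_record(data: bytes, source: str, author: str,
                      prev_hash: str = "") -> dict:
    """Bind a dataset to its source and author, and chain it to the
    previous record so tampering anywhere in the history is detectable."""
    digest = hashlib.sha256(prev_hash.encode() + data).hexdigest()
    return {"source": source, "author": author,
            "prev": prev_hash, "hash": digest}

raw = provenance_record(b"sensor readings", source="instrument/node-12",
                        author="j.doe")
processed = provenance_record(b"filtered readings", "pipeline/stage-1",
                              "j.doe", prev_hash=raw["hash"])

# Verification: recompute the digest and compare with the stored one.
expected = hashlib.sha256(raw["hash"].encode() + b"filtered readings").hexdigest()
print(processed["hash"] == expected)  # True
```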

Big Data Challenges - Technological
- How to scale up and down (grow or shrink)? Primarily a database issue: SQL scales up easily but does not scale down easily when demand decreases; SQL has complex syntax, strong schema typing and performance constraints
- NoSQL (Not only SQL) can partly address this issue: NoSQL is more flexible in adapting to new business processes
- NoSQL is primarily, but not only, key-value or document-based structures and data models: it responds to specific use cases and operations over data, data mining/data intelligence algorithms, the handling/discovery of new data structures and multi-type data relations, and human/behavioural/social targeted data analysis (which means fuzzy/biased data)
- Infrastructure support for storing and moving data and for on-demand processing: is Cloud Computing the right technology? Any alternative? High speed network infrastructure, on-demand provisioning
- Big Data security, trustworthiness and data centric security
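The schema-flexibility point about key-value/document models can be illustrated with a toy in-memory store: records of different shapes coexist without a schema migration, at the cost of pushing validation into application code. This is a minimal sketch, not the API of any specific NoSQL product:

```python
# Minimal in-memory document store: key -> schemaless document (dict).
store: dict[str, dict] = {}

# Documents with different shapes coexist; no ALTER TABLE needed
# when a new business process adds a field.
store["tx:1"] = {"amount": 9.99, "currency": "EUR"}
store["tx:2"] = {"amount": 4.50, "currency": "USD",
                 "tags": ["mobile", "rfid"]}   # new field, old records untouched

# The price of flexibility: the application must handle missing fields itself.
tagged = [key for key, doc in store.items() if "tags" in doc]
print(tagged)  # ['tx:2']
```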

Big Data Challenges - Socio-technological
- Extending the big data outreach/perimeter: a technology will boom only if there is a sufficient customer and user base; currently the majority of Big Data consumers are big companies, although we all contribute by feeding in our activity/usage log data
- Moving big data from big companies to users and homes: smart homes, sensors and devices; without sensors and devices humans can not create or use big data; smart visualisation can solve the problem of using and acting on big data
- Lowering the entry level for using Big Data: you should not have to be a data expert to use Big Data; need for scalable, configurable tools
- Big Data and privacy issues: digital footprint and re-identification

A chart of the Big Data ecosystem: http://mattturck.com/2012/10/15/a-chart-of-the-big-data-ecosystem-take-2/

Big Data Infrastructure Components
- Cloud based infrastructure services for data centric applications (storage, compute, infrastructure/VM management)
- Cluster services
- Hadoop related services and tools
- Specialised data analytics tools (logs, events, data mining, etc.)
- Databases/servers: SQL, NoSQL; MPP (Massively Parallel Processing) databases
- Big Data Management: registries, indexing/search, semantics, namespaces
- Security infrastructure (access control, policy enforcement, confidentiality, trust, availability, privacy)
- Collaborative environment (group management)
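The Hadoop-related services above are built around the MapReduce pattern: a map phase emits key-value pairs, a shuffle groups them by key, and a reduce phase aggregates each group. A toy single-process word count (illustrative only, not the Hadoop API) looks like this:

```python
from collections import defaultdict

def map_phase(lines):
    # Map: each input line emits (word, 1) pairs.
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    # Reduce: sum the counts per key. In Hadoop, pairs arrive
    # already grouped by key after the shuffle; here we group ourselves.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

result = reduce_phase(map_phase(["big data", "big infrastructure"]))
print(result)  # {'big': 2, 'data': 1, 'infrastructure': 1}
```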

Will the Big Data Term Sustain?
- Other names: Big Data Analytics, Big Analytics
- To avoid the term itself becoming a fetish: Data Analytics, Intelligent Analytics; but these miss the infrastructure component
- Big Data vs Data-Intensive Science: e-Science is based on data and involves wide cooperation between researchers
- New concepts related to Big Data: disposable data, in contrast to data that is supposed to be stored; the non-deterministic nature of the objects of scientific study; from natural science to economics and social science

Big Data and Data-Intensive Science

e-Science Features
- Automation of all e-Science processes, including data collection, storing, classification and indexing, and other components of general data curation and provenance
- Transformation of all processes, events and products into digital form by means of multi-dimensional, multi-faceted measurements, monitoring and control; digitising of existing artifacts and other content
- Possibility to re-use the initial and published research data, with possible data re-purposing for secondary research
- Global data availability and access over the network for cooperative groups of researchers, including wide public access to scientific data
- Existence of the necessary infrastructure components and management tools that allow fast composition, adaptation and on-demand provisioning of infrastructures and services for specific research projects and tasks
- Advanced security and access control technologies that ensure secure operation of complex research infrastructures and scientific instruments, and allow creating a trusted secure environment for cooperating groups and individual researchers

Scientific Data Types
- The EC Open Access Initiative requires data linking at all levels and stages: publications and linked data; published data; structured data; raw data
- Raw data: collected from observation and from experiment (according to an initial research model)
- Structured data: data and datasets that went through data filtering and processing (supporting some particular formal model)
- Published data: data that supports one or another scientific hypothesis, research result or statement
- Data linked to publications: to support wide research consolidation, integration, and openness

Traditional Data Lifecycle Model (diagram)
- Stages: project/experiment planning -> data collection (raw and experimental data) -> data integration and processing (structured scientific data) -> publishing research results -> discussion and publications -> end of project: discard or archive data
- Actors: researcher; open public use
- Shortcoming: lack of initial and intermediate data preservation, and no linking of data to publications

Scientific Data Lifecycle Management (SDLM) Model - Data Lifecycle Model in e-Science (diagram)
- Stages: project/experiment planning -> data collection and filtering (raw and experimental data) -> data analysis (structured scientific data) -> data sharing/publishing (with linkage to papers) -> end of project -> data archiving; open public use; metadata and data management
- Supporting processes: data discovery; data curation (including retirement and clean up); data recycling and re-purposing
- Data linkage issues: Persistent Identifiers (PID); ORCID (Open Researcher and Contributor ID); Linked Data; data re-purposing; data clean up and retirement; ownership and authority; data detainment links

General Requirements on an SDI for Emerging Big Data Science
- Support for long running experiments and large data volumes generated at high speed
- Multi-tier inter-linked data distribution and replication
- On-demand infrastructure provisioning to support data sets and scientific workflows; mobility of data-centric scientific applications
- Support for virtual scientific communities, addressing dynamic user group creation and management, and federated identity management
- Support for the whole data lifecycle, including metadata and data source linkage
- Trusted environment for data storage and processing: researchers need to trust the SDI before they put all their data on it; support for data integrity, confidentiality and accountability; policy binding to data to protect privacy, confidentiality and IPR

Defining an Architecture Framework for SDI and Security
- Scientific Data Lifecycle Management (SDLM) model
- e-SDI multi-layer architecture model
- RORA (Resource-Ownership-Role-Actor) model to define the relationships between resources and actors: it defines the relationships between resources, owners, managers and users; initially defined for the telecom domain
- New actors in SDI (and Big Data infrastructure): the subject of the data (e.g. a patient, or a scientific object/paper); the data manager (doctor, seller)
- Security and Access Control and Accounting Infrastructure (ACAI): trust management infrastructure; authentication, authorisation and accounting, supported by a logging service; extended to support data access control and operations on data

SDI Architecture Model (diagram)
- Layer B6: User/Scientific Applications Layer: scientific applications, scientific datasets, user portals, scientific specialist applications, library resources
- Layer B5: Access and Federation Layer: policy and distributed collaborative group support; federated identity management (eduGAIN, REFEDS, VOMS, InCommon)
- Layer B4: Scientific Platform and Instruments Layer: shared scientific platforms and instruments (specific to the scientific area, also Grid based), e.g. PRACE/DEISA
- Layer B3: Infrastructure Virtualisation Layer: Cloud/Grid infrastructure virtualisation and management middleware; middleware security
- Layer B2: Datacenter and Computing Facility Layer: Grid/Cloud compute resources, storage resources, sensors and devices, clouds
- Layer B1: Network Infrastructure Layer: optical network infrastructure; AutoBAHN, eduroam
- Cross-layer functions: Operation Support and Management Service (OSMS); security and AAI; metadata and data lifecycle management

SDI Architecture Layers
- Layer D1: Network infrastructure layer, represented by the general purpose Internet infrastructure and dedicated network infrastructure
- Layer D2: Datacenters and computing resources/facilities, including sensor networks
- Layer D3: Infrastructure virtualisation layer, represented by the Cloud/Grid infrastructure services and middleware supporting specialised scientific platform deployment and operation
- Layer D4: (Shared) scientific platforms and instruments, specific to different research areas
- Layer D5: Access infrastructure layer: federation infrastructure components, including policy and collaborative user group support functionality
- Layer D6: Scientific applications and user portals/clients

SDI Move to Clouds
- Cloud technologies allow infrastructure virtualisation and its profiling for specific data structures or to support specific scientific workflows: clouds provide just the right technology for infrastructure virtualisation to support data sets; complex distributed data require infrastructure; demand for inter-cloud infrastructure
- Clouds can provide infrastructure on demand to support project related scientific workflows: similar to Grid, but with the benefit of full on-demand infrastructure provisioning
- Software Defined Infrastructure Services: broader than the currently emerging SDN (Software Defined Networks)
- Distributed Hadoop clusters for HPC and MPP

General Use Case for Infrastructure Provisioning: Workflow => Logical (Cloud) Infrastructure (diagram)
- An enterprise/scientific workflow (data input -> data storage -> instrument/data filtering -> special processing 1 -> special processing 2 -> data archive -> visualisation/presentation) is mapped onto virtual resources VR1-VR7
- The virtual resources are provisioned across Cloud 1 (IaaS) and Cloud 2 (PaaS) and connected to compute elements and visualisation facilities at Campus A and Campus B, serving user groups A and B
- The result is an enterprise/project based intercloud infrastructure spanning the cloud IaaS and PaaS providers and the resource/service providers


General Use Case for Infrastructure Provisioning: Logical Infrastructure => Network Infrastructure (diagram)
- The logical infrastructure (VR1-VR7 across Cloud 1 IaaS and Cloud 2 PaaS, together with the Campus A and Campus B infrastructures) is mapped onto the resource and cloud provider domains, interconnected by the cloud carrier network infrastructure
- This mapping is defined in the InterCloud Architecture Framework (ICAF)
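The workflow-to-infrastructure mapping on these two slides can be written down as a simple placement table assigning each workflow stage to a virtual resource and a provider domain. The concrete pairings below (which stage lands on which cloud) are illustrative assumptions, not defined by the slides or by ICAF:

```python
# Hypothetical placement of workflow stages onto virtual resources
# and provider domains, following the slide's VR1-VR7 naming.
workflow = ["input", "storage", "filtering", "special_proc_1",
            "special_proc_2", "archive", "visualisation"]

placement = {  # stage -> (virtual resource, provider domain)
    "input":          ("VR1", "cloud1-iaas"),
    "storage":        ("VR2", "cloud1-iaas"),
    "filtering":      ("VR3", "cloud1-iaas"),
    "special_proc_1": ("VR4", "cloud2-paas"),
    "special_proc_2": ("VR5", "cloud2-paas"),
    "archive":        ("VR6", "cloud1-iaas"),
    "visualisation":  ("VR7", "campus-a"),
}

# Every workflow stage must be placed before the intercloud
# infrastructure can be provisioned.
assert all(stage in placement for stage in workflow)
providers = {domain for _, domain in placement.values()}
print(sorted(providers))  # ['campus-a', 'cloud1-iaas', 'cloud2-paas']
```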

Big Data and Cloud/Intercloud Research Topics
- Mapping from scientific workflows to intercloud structures and the supporting infrastructure
- Cloud infrastructure support for Big Data security and trustworthiness (for generically distributed scenarios)
- Data centric security models: authenticity, authorisation, delegation; trust, trustworthy systems, validity; accounting, auditing
- Privacy: will it change with the new technologies?

Questions and Discussion

Additional Materials
- Big Data technologies and foreseen innovations
- Trust and trustworthiness

Big Data: Existing Technologies (1)
- Big Data storage and database solutions: Google File System, BigTable; new types of databases?
- Big Data computing: MapReduce, Google's Dremel; MPI and computer Grids; needs to be capable of running in a distributed environment, close to the data stores and sources
- Networks for Big Data: capacity - yes; new protocols? Routing, streaming, load balancing, etc., optimised for big data, workflows and data structures

Big Data: Existing Technologies (2)
- Social networks and human driven/centric collaboration and sharing environments: Facebook is a heuristic implementation of the human craving to share, socialise and expand; it is not perfect, and will evolve or be overtaken by new, more open technologies supporting human socialising activity
- Current new developments at Facebook: from "Like" to Open Graph as a behavioural technology; a payment system
- Smartphones, tablets and BYOD (Bring Your Own Device), to be supported by employer IT: a hub for human/body sensors and wearable devices

Foreseen Big Data Innovations in 2013
- Cloud-based Big Data solutions: Amazon's Elastic MapReduce (EMR) is a market leader; new innovative Big Data and Cloud solutions expected
- Real-time Hadoop: Google Dremel-like solutions that will allow real-time queries on Big Data and be open source
- Distributed machine learning: Mahout (iterative, scalable, distributed backpropagation machine learning and data mining algorithms); new algorithms: Jubatus, HogWild
- Big Data appliances (also for the home): Raspberry Pi and home-made GPU clusters; hardware vendors (Dell, HP, etc.) pack mobile ARM processors into server boxes; Adapteva's Parallella will put a 16-core supercomputer into a $99 board
- Easier Big Data tools: open source and easy-to-use drag-and-drop tools for Big Data analytics to facilitate Big Data adoption; commercial examples: Radoop (= RapidMiner + Mahout), Tableau, Datameer, etc.

Predictions for 2013 and as Far as 2020
- Main technologies and areas: smartphones and tablets; e-Medicine and healthcare; Big Data; Cloud: a shift from private to public
- Big Data and Cloud infographic: http://www.cloudtweaks.com/2013/01/cloud-infographic-big-data-universe/
- Big Data in 2013, by Mike Gualtieri, Forrester:
  - Firms realize that Big Data means all of their data. Big Data is the frontier of a firm's ability to store, process, and access (SPA) all the data it needs to operate effectively, make decisions, reduce risks, and create better customer experiences. In 2013, all data is Big Data.
  - The algorithm wars begin. Advanced analytics uses advanced statistical, data mining, and machine learning algorithms to dig deeper to find patterns that you can't see using traditional BI tools, simple queries, or rules.
  - Real-time architectures will swing to prominence. Firms will seek out streaming, event processing, and in-memory data technologies to provide real-time analytics and run predictive models. Mobile is a key driver, because hyper-connected consumers and employees will require architectures that can quickly process incoming data from all digital channels to make business decisions and deliver engaging customer experiences in real time.
  - Big Data may be disruptive. This is the year that Time magazine names Big Data its 2013 person of the year.

Trust Model: Transposition to the ICT/SDI Domain
- Classic trust definition ("humanised"): A trusts B to do something (typically an action, operation, or assertion)
- Trust definition transposed to trustworthy ICT: user A trusts service B to do an operation OP (or make an assertion) with data D on a platform P
- The data are associated with a data handling policy
- The platform is multi-tier, multi-domain and multi-provider; the platform includes computing, storage and network
- The data may be stored by a 3rd party and belong to a domain different from the user's
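The transposed definition ("user A trusts service B to do operation OP with data D on platform P, under D's handling policy") can be written down directly to make the quantifiers explicit. The class and function names below are invented for illustration; the check only covers the two conditions the slide states, namely the data handling policy and trust in the platform:

```python
from dataclasses import dataclass, field

@dataclass
class Data:
    owner: str
    policy: set[str] = field(default_factory=set)  # operations the handling policy allows

@dataclass
class TrustAssertion:
    user: str       # A
    service: str    # B
    operation: str  # OP
    platform: str   # P

def permitted(t: TrustAssertion, d: Data, trusted_platforms: set[str]) -> bool:
    """A's trust in B covers OP on D only if the data's handling policy
    allows OP and the (possibly multi-provider) platform is itself trusted."""
    return t.operation in d.policy and t.platform in trusted_platforms

d = Data(owner="userA", policy={"read", "aggregate"})
t = TrustAssertion("userA", "serviceB", "aggregate", "cloud1")
print(permitted(t, d, trusted_platforms={"cloud1"}))  # True
```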

Components of the Trusted/Trustworthy ICT (diagram)
- Trust between domains A and B: user trust domain A, with data stored at user facilities; provider trust domain B, with data stored in the provider domain
- The user client trusts the provider's system/platform (the application/operation runs on the provider's platform/system)
- The user and the service provider are the two actors concerned with their own data/content security and with each other's system/platform trustworthiness
- Data may be stored separately but typically belong to the user trust domain; in clouds this needs to be extended to a distributed, multi-domain and multi-provider environment
- Two other aspects of data security and trust: data stored vs data accessed/processed; system idle vs active (running user session)

Trustworthiness and Trustworthy Systems
The definition and analysis of a trustworthy infrastructure/system must include more factors:
- Two or more parties involved in the trust relations, typically users (or a user client/application) and a provider (or a server or service/application)
- Data, with their ownership, privacy, confidentiality level, and applied/attached data handling policy
- The computing platform and storage (which can, however, be associated with the general platform)
- The underlying communication infrastructure (which can, however, be associated with the general platform)
- Possible inter-domain issues related to different policies and assurance/provisions

Security in a Virtualised Environment (2): Platform, OS/VM, Application and Network (diagram)
- Multilayer security on each virtualisation platform (cloud): platform; OS/VM; application container; application
- Between platforms: VPN based transport network security; inter-container TLS/SSL security; inter-application end-to-end security
- Supporting components: metadata information/registry service; session credentials/context together with policy; user/system credentials and secrets; platform trust anchor (TPM, hardware secret key)
- Requires multilayer security and trust: security and access control policies may be defined for each layer; data related components defining access control: metadata, encapsulated policies