Euro-BioImaging European Research Infrastructure for Imaging Technologies in Biological and Biomedical Sciences
WP11 Data Storage and Analysis
Task 11.1 Coordination
Deliverable 11.2: Community Needs of Current Solutions for Biomedical Image Storage and Remote Access Services
Task leaders: UNIVDUN
Additional Task Contributors:
October
1. Report Summary

WP11 Objectives: To define a roadmap towards the construction of a European Biomedical Imaging Data Storage and Analysis Infrastructure. The key objectives of this infrastructure will be:

- to support efficient and standardized storage for, and access to, curated biological and medical image data;
- to support open-source software for biological and medical image analysis through coordination of community efforts, provision of an actively maintained repository of state-of-the-art validated algorithms for quantitative image analysis, and thorough training;
- to interface with high-performance computing facilities for high-throughput and/or computation-intensive image analysis;
- to provide seamless collaboration and access to other relevant computing and data resources in ESFRI and in European and national infrastructures.

Digital imaging is now routinely used across the life and biomedical sciences and has become an essential tool for all aspects of research, training and clinical practice. As image data volumes and complexity grow, it becomes increasingly difficult to access, view, share and analyse datasets using standard desktop-based solutions. In addition, the need for interdisciplinary collaboration, in which datasets are shared with consortia of scientists with expertise in experimental biology, data analysis, image processing, modelling, physiology and/or medicine, has also grown. In this new era, processing, analysis and sharing of image data based on conventional desktop-based solutions is simply no longer feasible. Multi-dimensional images from unique clinical cohorts must be securely shared among defined collaborations to enable full analysis and query, while ensuring that identifiable data are never made publicly available.
In biological imaging, technologies such as multi-dimensional fluorescence or high-content screening are becoming standard approaches to reveal fundamental biological mechanisms that explain human physiology and disease. Datasets produced by these technologies are routinely tens to hundreds of GBs and, in some cases, many TBs. To deliver the potential of these data, and the ambition and potential of European science, technology that enables access to image data regardless of where it is produced has become a critical scientific need, and one which the Euro-BioImaging infrastructure must address. In this deliverable we explore the technological requirements for remote-access image systems, and in particular examine the image databases and central repositories that are the foundation for the next generation of biological and biomedical science.
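The data volumes quoted above can be checked with simple arithmetic. The acquisition parameters in the sketch below (image dimensions, channels, z-slices, time points) are illustrative assumptions, not figures from this deliverable:

```python
# Back-of-envelope size of an uncompressed multi-dimensional fluorescence
# dataset. All acquisition parameters are illustrative assumptions.
def dataset_size_bytes(x, y, z, channels, timepoints, bytes_per_pixel):
    """Uncompressed size of an XYZCT image stack in bytes."""
    return x * y * z * channels * timepoints * bytes_per_pixel

# A single 2048x2048, 16-bit, 4-channel time-lapse with 25 z-slices
# and 200 time points:
size = dataset_size_bytes(2048, 2048, 25, 4, 200, 2)
print(f"{size / 1e9:.1f} GB")  # 167.8 GB -- hundreds of GBs per experiment
```

A single experiment at these settings already reaches the hundreds-of-GBs range; a high-content screen repeats such acquisitions across thousands of wells, which is how TB-scale datasets arise.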
2. Image databases and repositories

2.1 Common use cases of image databases

The volume of data now routinely collected in laboratories, and the complexity of the experimental metadata and analytic results generated by imaging experiments, have driven the development of databases for imaging data. There are currently a number of open-source and commercial products that provide remote-access systems for biological and medical images. Almost all combine some ability to access and view an image remotely, e.g. through a web browser, with capabilities for storing, analysing, processing, annotating and searching for images. These systems are therefore referred to as image databases or image management systems. They can be used for many different applications and at many different scales. For example, a laboratory with a single microscope may require an application for managing the data it produces for a few users, while a department with an imaging facility will require a resource to manage and distribute the data collected in its multi-system facility to >100 users. An institution may also require resources for managing the data associated with the manuscripts published by its scientists, to achieve compliance with data publication requirements from national funding organisations and/or journals. Software applications are currently available to manage each of these cases, and as these resources have matured they have increasingly become an important part of the software ecology for image processing.

2.2 Technical capabilities of existing research image databases

To enable remote image access and processing, image databases must deliver access to the images they hold. Most commonly this is through an Application Programming Interface (API). This interface provides standardised access to the data held in the system and is a standard concept in software.
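The API concept described above can be illustrated with a minimal sketch of an image store exposing read, write and annotation operations. All class and method names here are invented for this example; real systems such as OMERO expose far richer interfaces:

```python
# Minimal sketch of an image-database API: standardized read/write access
# to stored images and their metadata. All names are illustrative only.
class ImageStore:
    def __init__(self):
        self._images = {}   # image_id -> (pixels, metadata)
        self._next_id = 1

    def write_image(self, pixels, metadata):
        """Store pixel data with its metadata; return a new image id."""
        image_id = self._next_id
        self._next_id += 1
        self._images[image_id] = (pixels, dict(metadata))
        return image_id

    def read_image(self, image_id):
        """Return (pixels, metadata) for a stored image."""
        return self._images[image_id]

    def annotate(self, image_id, key, value):
        """Attach an annotation to an existing image."""
        self._images[image_id][1][key] = value

store = ImageStore()
img = store.write_image(b"\x00" * 16, {"channels": 2, "stain": "DAPI"})
store.annotate(img, "phenotype", "mitotic arrest")
pixels, meta = store.read_image(img)
print(meta["phenotype"])  # mitotic arrest
```

Because analysis applications talk only to such an interface rather than to the underlying files, the same API can front a single laboratory's store or a facility serving hundreds of users.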
Typically these interfaces allow image processing applications to read data from and write data into the image database. If constructed correctly, the APIs and databases can serve very large numbers of users and many different scientific applications simultaneously, and can even serve data from many different types of imaging modalities. Currently there are several open-source projects providing image database and repository functionality. Two examples are the Open Microscopy Environment (OME) and BISQUE. OME releases OMERO, an open-source image data management platform. OMERO is used in thousands of sites worldwide and is the basis for several large public repositories. BISQUE is an open-source data management system built at UCSB. It is used by a major plant bioimaging project and several image processing groups in the USA. Both tools allow web-based remote access to large image datasets, subject to defined permissions specifications that control who can and cannot access any particular dataset. For example, a laboratory might want to privately hold certain datasets within a specific group of lab members, yet publish other datasets online in support of published manuscripts. Both OMERO and BISQUE provide this capability in the form of layered permissions control for their users. They also allow integration of analysis functions. OMERO's API supports integration of MATLAB functions, Python scripts and plugins from the ImageJ2 project (see D11.6). Going forward, submitting data directly from these applications to larger public repositories (see below) will be a key piece of functionality.

2.3 Comparing Image Databases and Rich File Formats

Image databases are necessarily complex applications that require substantial expertise to develop, deploy, and maintain. Facilities that collect large datasets need access to staff with this expertise, which further increases the costs of building and running their data infrastructure. In some cases, user data can be stored using file formats that include provision for storing rich metadata describing the experiment (e.g., OME-TIFF), or even analytic results (e.g., HDF5-based formats). Several examples of this latter approach have now been proposed and in some cases built, e.g., CellH5 for time-lapse siRNA screens. These formats work well for single sites, but do not incorporate the complex permissions structures or transaction safety present in bona fide databases. In addition, provision for remote access still has to be included to make these data useful for remote Users.
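The idea behind rich file formats — pixel data and structured experimental metadata travelling together in a single file — can be sketched with a toy container. The layout below, a ZIP archive holding raw pixels plus an XML metadata block, is an illustration of the principle only, not the actual OME-TIFF or HDF5 structure:

```python
import io
import xml.etree.ElementTree as ET
import zipfile

def write_rich_file(buffer, pixels, metadata):
    """Bundle raw pixel data with an XML metadata block in one container."""
    root = ET.Element("Image")
    for key, value in metadata.items():
        ET.SubElement(root, "Attribute", name=key).text = str(value)
    with zipfile.ZipFile(buffer, "w") as zf:
        zf.writestr("pixels.raw", pixels)
        zf.writestr("metadata.xml", ET.tostring(root, encoding="unicode"))

def read_metadata(buffer):
    """Recover the experimental metadata without touching the pixel data."""
    with zipfile.ZipFile(buffer) as zf:
        root = ET.fromstring(zf.read("metadata.xml"))
    return {el.get("name"): el.text for el in root.iter("Attribute")}

buf = io.BytesIO()
write_rich_file(buf, b"\x00" * 64, {"objective": "63x", "exposure_ms": "120"})
buf.seek(0)
print(read_metadata(buf))  # {'objective': '63x', 'exposure_ms': '120'}
```

Note what the sketch lacks: there is no notion of users, groups or permissions, and no transactional guarantees if two writers touch the file at once — exactly the gap between rich file formats and bona fide databases described above.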
While rich file formats may provide some utility, e.g., for transferring datasets to Users for their own processing and analysis, they may not provide all the functionality necessary for the management of, and access to, data needed at Euro-BioImaging Nodes.

2.4 Comparing Image Databases and PACS

Picture archiving and communication systems (PACS) are well established in medical imaging. These systems provide archiving and storage of multi-dimensional medical images, usually in DICOM format, and provide remote access, often through a web browser, with provision for multi-dimensional image viewing. PACS can be considered image databases, as they store image data and metadata, and enable simple search functionality. Open-source PACS (e.g., OsiriX) are currently available, but in general these systems do not have the flexibility or scalability that OMERO or BISQUE provide. In particular, current PACS do not use the federated image data standard Bio-Formats and thus cannot support the range of image data types allowed by OMERO and BISQUE (see D11.6 for more details) or the range of heterogeneous data types supported by OMERO (e.g., biobanking data).

2.5 Public Image Data Repositories

As noted in Deliverable 11.1, there are now several online image data repositories. Together, these provide many tens of TBs of image data, annotations, and links to other resources to the biological community. The JCB DataViewer, the ASCB Cell Image Library, the Web Microscope and the EMDataBank are examples of such resources that each provide online public access to large (e.g. >200,000 images at the JCB DataViewer), annotated, multi-dimensional image datasets. They are heavily used, and accessed by many thousands of unique users per month. Thus the technology to build remotely accessed image data repositories exists and can be deployed. The existing resources can be considered strong proofs of principle that serve as a foundation for the next steps in building the resources needed for the Euro-BioImaging community. Looking forward, a critical next step for delivering value to the community from these resources will be the integration of the various resources to allow searching across different datasets used by larger user groups and communities, so-called reference datasets. Searching for 'tubulin' should return results from as many different resources that contain standardized image data as possible. In addition, submission and deposition of new datasets that conform to the established standards should be enabled. An example of such an important next step beyond the state of the art is the linking of current and future phenotypic genome-wide screens. These imaging-based high-throughput datasets (often referred to as high-content screens; HCS) systematically sample genome-wide perturbations using siRNA gene-product knockdown in cultured cells or tissues and then record the effects on cell phenotypes using imaging.
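The cross-resource search described above — a query for 'tubulin' returning hits from every repository holding standardized data — can be sketched as a simple federated query. The repository names and records below are invented for illustration:

```python
# Sketch of a federated search across several image repositories.
# Repository names and records are invented for illustration only.
repositories = {
    "repo-A": [
        {"id": "A-1", "annotations": ["tubulin", "mitosis"]},
        {"id": "A-2", "annotations": ["actin"]},
    ],
    "repo-B": [
        {"id": "B-7", "annotations": ["tubulin", "siRNA screen"]},
    ],
}

def federated_search(term):
    """Query every repository and merge hits, tagged with their source."""
    hits = []
    for name, records in repositories.items():
        for record in records:
            if term in record["annotations"]:
                hits.append((name, record["id"]))
    return hits

print(federated_search("tubulin"))  # [('repo-A', 'A-1'), ('repo-B', 'B-7')]
```

The sketch only works because both repositories annotate with the same vocabulary; this is why standardized annotation and deposition formats are a precondition for integration across reference datasets.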
Currently, at least 12 HCS datasets are published online, with more coming, and the linking, cross-indexing and systematic, standardized annotation of these and future datasets is currently being developed in the framework of the SystemsMicroscopy NoE. Addressing the critical need for resource integration and data deposition, by delivering a community resource with value similar to the well-established genomic and microarray databases, is a key priority of the Euro-BioImaging data infrastructure. This resource, a Euro-BioImaging Image Data Repository (EB-IDR), represents the next step in the development of online image repositories, demonstrating the importance and value of data submission, cross-referencing and standardised annotation for indexing and search. In addition, it would cross-link to the available biomolecular databases hosted by the EBI and ELIXIR wherever images represent a molecular measurement or provide spatial or temporal context for biomolecular information, which is increasingly the case. This effort is supported by a close collaboration and a common image data policy developed jointly by Euro-BioImaging and ELIXIR. Once established, this EB-IDR could serve as the public access point for these and other datasets as the results generated from them are published. It could also serve as a data hosting service before public release of data, to foster collaboration between consortia of data producers and image processing and data analysis experts. Eventually, the EB-IDR may emerge as the de facto point of deposition of image data associated with published biological and biomedical research in the EU, much as the Protein Data Bank (PDB) has emerged as the acknowledged community repository for 3D structural data of biological macromolecules.

3. Cloud-based resources for image databases

As discussed in D11.5, cloud-based resources can be used for bioimage data storage and analysis, but the current cost structures of commercial cloud providers make them especially expensive for handling processing and analysis of large datasets. However, over the next 3-5 years, it is likely that national and trans-national research cloud infrastructures will become available, such as those currently being prepared in the Helix Nebula project, in which Euro-BioImaging is represented through its coordinating institution EMBL, and at least some (based on current trends, we estimate up to 35%) of Euro-BioImaging's data may be stored and processed on these resources. (Note: Australia has deployed its NeCTAR platform for this purpose.)
Tools that allow on-demand deployment of image database resources in cloud environments would be very useful for imaging facilities serving data to remote users for analysis and computation. Image datasets used by individuals or small user groups could go online in Euro-BioImaging's remote-access resources when needed, and then potentially go offline once analysis or mining is completed and the data are archived locally, reducing the need for Euro-BioImaging to invest in permanent, large-scale storage and computational resources. Reference image datasets used by larger user groups over long periods could be made available through the EB-IDR, which would offer remote compute services on these data at the repository. If larger compute resources were required,
these reference datasets could be deployed on permanent storage linked to cloud-based HPC resources. An example of this kind of technology is the deployment of Embassy at the European Bioinformatics Institute (EMBL-EBI). Based on ESX technology from VMware, users are able to request a web-accessible portal into a desktop that includes a defined set of data analysis applications, linked to a large data repository. This technology is currently used by the EBI to support defined, limited collaborations with external groups and is not yet scalable to an open public resource. However, Embassy can be considered a current state-of-the-art concept that might be expanded into a community resource, given appropriate resourcing and technical development. Perhaps the most challenging problem in this type of architecture will be access to the very large datasets stored in the EB-IDR or elsewhere. In general, moving TB-scale datasets between arbitrary locations on demand is not practical. Euro-BioImaging will need to leverage investments and developments in several other transnational projects, in particular those under development by ELIXIR, to enable general access to these datasets, and especially to enable users to link cloud-based high-performance computing resources to these data resources.

4. Conclusion

Open-source and commercial products that provide remote-access systems for biological and medical images are an important part of the software ecology for image access and processing. Application Programming Interfaces (APIs) and databases can serve very large numbers of users and many different scientific applications simultaneously, and can even serve data from many different types of imaging applications.
However, image databases are necessarily complex applications that require substantial expertise to develop, deploy, and maintain; facilities that collect large datasets therefore need access to staff with this expertise as an integral part of running their data infrastructure. Image database technology will be a critical resource for Euro-BioImaging Nodes needing to share data and analysis tools with Users in remote locations. In the future, Euro-BioImaging infrastructure tools that allow on-demand deployment of image datasets in the database resources of its cloud environment may reduce the need for Euro-BioImaging Nodes to invest in permanent, large-scale storage and computational resources. Development of a Euro-BioImaging Image Data Repository (EB-IDR) that integrates standardized image datasets for cross-referencing, searching and new data deposition is a critical data infrastructure task. Ultimately, the EB-IDR should become the de facto point of deposition of image data associated with published biological and biomedical research in the EU. The EB-IDR will serve reference image datasets to large user communities over long periods, will offer remote compute services for user analysis of the reference data, and may deploy some of its datasets on demand at collaborating high-performance compute centres when specialized HPC analysis is needed by its users.
Data Publishing Workflows with Dataverse
Data Publishing Workflows with Dataverse Mercè Crosas, Ph.D. Twitter: @mercecrosas Director of Data Science Institute for Quantitative Social Science, Harvard University MIT, May 6, 2014 Intro to our Data
Data Analytics at NERSC. Joaquin Correa [email protected] NERSC Data and Analytics Services
Data Analytics at NERSC Joaquin Correa [email protected] NERSC Data and Analytics Services NERSC User Meeting August, 2015 Data analytics at NERSC Science Applications Climate, Cosmology, Kbase, Materials,
National eresearch Collaboration Tools and Resources nectar.org.au
National eresearch Collaboration Tools and Resources nectar.org.au NeCTAR is an Australian Government project conducted as part of the Super Science initiative and financed by the Education Investment
Virginia Commonwealth University Rice Rivers Center Data Management Plan
Virginia Commonwealth University Rice Rivers Center Data Management Plan Table of Contents Objectives... 2 VCU Rice Rivers Center Research Protocol... 2 VCU Rice Rivers Center Data Management Plan... 3
Global Alliance. Ewan Birney Associate Director EMBL-EBI
Global Alliance Ewan Birney Associate Director EMBL-EBI Our world is changing Research to Medical Research English as language Lightweight legal Identical/similar systems Open data Publications Grant-funding
Manjrasoft Market Oriented Cloud Computing Platform
Manjrasoft Market Oriented Cloud Computing Platform Aneka Aneka is a market oriented Cloud development and management platform with rapid application development and workload distribution capabilities.
Image Data, RDA and Practical Policies
Image Data, RDA and Practical Policies Rainer Stotzka and many others KIT University of the State of Baden-Württemberg and National Laboratory of the Helmholtz Association www.kit.edu Data Life Cycle Lab
Genevestigator Training
Genevestigator Training Gent, 6 November 2012 Philip Zimmermann, Nebion AG Goals Get to know Genevestigator What Genevestigator is for For who Genevestigator was created How to use Genevestigator for your
