The THREDDS Data Repository: for Long Term Data Storage and Access
8B.7 The THREDDS Data Repository: for Long Term Data Storage and Access

Anne Wilson, Thomas Baltzer, John Caron
Unidata Program Center, UCAR, Boulder, CO

1 INTRODUCTION

In order to better manage ever increasing volumes of scientific data, infrastructure in the form of intelligent data servers is needed. Large data volumes mean that server-side subsetting and aggregation are essential server requirements. But simply being able to serve data is no longer sufficient: data must also be discoverable, queryable, and browsable. Thus these servers must also provide a cataloging system and tools to manage and access metadata.

Data generators and providers want to be able to publish both transient and archival data to a reliable and secure location that allows access to a community of data consumers. These providers want protection from time-consuming housekeeping tasks such as locating storage, moving the data, and updating metadata catalogs. They also want to be able to easily configure their data space in a meaningful manner. Similarly, data consumers want to be free from issues related to locating and acquiring data, as well as from format incompatibility issues. Such chores are often a time-consuming distraction from the actual task to be performed. Indeed, this level of automation is required if the repository is to be used by an automated data generator or consumer, such as a data processing system that acquires data for inputs and generates data as outputs.

The Unidata THREDDS Data Repository (TDR), currently under development, provides long term data storage with metadata management support in the form of THREDDS catalogs. Integrated with the THREDDS Data Server (TDS), the two together provide automated, high level data and metadata storage and access.

* Corresponding author address: Dr. Anne Wilson, PO Box 3000, Boulder, Colorado, 80307; [email protected]
Through use cases provided by two TDR client projects, this paper discusses TDR requirements and the functionality designed to meet them.

2 TDR CLIENT STORIES

Two projects are steering the design requirements for the TDR.

2.1 The LEAD Project

Linked Environments for Atmospheric Discovery (LEAD) is a multi-institutional Large Information Technology Research (ITR) project funded by the National Science Foundation (NSF). The goal of LEAD is to create a framework based on Grid and web services to support mesoscale meteorology research and education via provision of capabilities such as running a forecast model over a spatial and temporal domain specified by the user, data mining for meteorological phenomena, and dynamic orchestrations that automatically reconfigure themselves in response to changing weather. LEAD accomplishes these tasks via orchestration software that invokes and manages component services. ("Workflow" is another term for orchestration.)

LEAD presents unique challenges in managing and storing large data volumes from real-time observational systems as well as data that are dynamically created during the execution of adaptive workflows. Furthermore, LEAD users want one-stop shopping to visualize data and store them and their associated metadata in a location where they can be made private or be accessed by
others, both individuals as well as services in the orchestration. LEAD orchestrations must retrieve the data required for their use. They also generate intermediate and final data products to be stored in the user's space. An orchestration can also acquire and generate metadata for these generated products, such as user information (including security tokens such as certificates or encoded passwords), data format, data size, provenance information, etc. LEAD metadata is organized around individual users, whose space is structured as projects that contain experiments.

2.2 The STORM Case Study Archive

The Unidata STORM Case Study Archive is a repository for case studies. This resource will provide community access to case studies compiled and annotated by case study authors. The archive will store data, metadata, and other resources such as notes, links to other resources, IDV bundles (a collection of data references and their rendering created by the Unidata Integrated Data Viewer, or IDV), etc.

Case study authors need to be able to move data into the repository, provide metadata, and ensure that the resources placed in the repository, both data and metadata, are accessible to the community they are intended to serve. They will need to be able to update the metadata, and they will want to organize resources in the repository, moving datasets and associated metadata around in the case study hierarchy. For STORM, the metadata is structured around case studies, such as Katrina and Rita. Multiple case study authors may be authorized to add or change information in a case study. Users of the case study repository will be able to access, visualize, and acquire these resources in usable formats.
2.3 Commonalities

Both of these projects require support in the areas of:

- Long term data and metadata storage
- Data movement into and out of the repository
- Metadata generation and management
- High level data access functionality, such as server-side subsetting

3 THE THREDDS DATA SERVER (TDS)

The Unidata THREDDS Data Server addresses several of these concerns. The TDS is designed to serve a moving window of recent scientific data and currently does so via the protocols HTTP, OPeNDAP, and WCS (for gridded data). It can provide the data in the format in which it is stored on disk, and work is underway toward generating netCDF files from a variety of data formats through the Unidata Common Data Model (CDM). (The list of CDM-supported data formats currently includes NetCDF, Level II and Level III radar, GRIB, GINI, and HDF5.) The TDS also supports metadata maintained in the form of a hierarchy of catalogs that contain information such as data format (e.g., NetCDF, GRIB, HDF5), type (e.g., grid, image, point, radial), data size, spatial and temporal coverage, variables (if applicable), descriptions, creators, publishers, rights, etc.

Although Unidata does maintain some data archives, an important TDS goal is for users to easily implement and maintain their own data servers. For this reason, the TDS is an integrated, easy-to-install package consisting of THREDDS cataloging software, an OPeNDAP server, a WCS server, and other supporting software.

4 THE THREDDS DATA REPOSITORY (TDR)

While the TDS was designed around serving transitory data, the TDR is intended to fill in the long term storage aspect of the TDS. It provides data movement into a long term repository and ensures data access from the repository through integration with a co-residing TDS.
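The catalog hierarchy described above can be illustrated with a minimal THREDDS catalog fragment. The dataset names, paths, and metadata values here are invented for illustration and are not taken from an actual TDS installation:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<catalog name="Katrina Case Study"
    xmlns="http://www.unidata.ucar.edu/namespaces/thredds/InvCatalog/v1.0">
  <!-- A service that child datasets inherit via the metadata element -->
  <service name="odap" serviceType="OPENDAP" base="/thredds/dodsC/"/>
  <dataset name="Level II Radar" ID="Katrina/data/radar/level2">
    <metadata inherited="true">
      <serviceName>odap</serviceName>
      <dataType>Radial</dataType>
    </metadata>
    <!-- An individual data file, addressable through the service above -->
    <dataset name="KLIX 2005-08-29 12:01Z"
             urlPath="katrina/radar/level2/KLIX_20050829_1201"/>
  </dataset>
</catalog>
```

Nested dataset elements give the hierarchical structure described in the text, with inherited metadata applying to everything beneath it.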
In addition to providing storage for the data, it supports the incorporation of preexisting metadata and the generation of new metadata into a metadata hierarchy that, in later versions, will be editable through a catalog and metadata management interface.
Indeed, the basis of the TDR is metadata management; the handle to the actual data can be viewed as simply another metadata element. The information needed by the TDR to put data into the repository includes:

- information about a user, used in authentication, authorization, and elsewhere,
- a reference to the data to be stored, either a local file or a URL,
- any preexisting metadata, and
- a logical name, such as Katrina/data/radar/level2, used to position the metadata in the metadata hierarchy.

The successful completion of a request to put data into the repository returns a unique ID, a non-semantic sequence of characters that is a handle to the data. Either this ID or the original user information and logical name can be used to retrieve the data.

The use cases dictate that the TDR provide two types of interfaces: one via a web form where users can enter information to update the repository, and one via programmatic calls from one web service to another. The TDR is implemented as a Java servlet, which meets both of these goals. For the STORM project, case study authors are provided with a web form that allows users to enter a URL or browse for a local file to be a data source. The user can also enter metadata, such as the data type and format, if it is known. Users also provide a project name and a logical name for the data being stored via this form. The programmatic interface provides the same information via Java networked I/O.

Once the TDR receives a request, either via the web interface or programmatically, it performs the following sequence of actions:

1. Locate storage for the data.
2. Generate a unique ID for the data.
3. Copy the data to the storage.
4. Generate additional metadata.
5. Incorporate the metadata into the existing metadata hierarchy.

We will examine each of these tasks individually.
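The five-step put sequence just described can be sketched as follows. This is a minimal illustration of the control flow, not the actual TDR implementation; the `repository` object and every name on it are hypothetical stand-ins for the specialized task managers:

```python
import uuid

# Illustrative sketch of the TDR "put" sequence. All class, method, and
# parameter names here are hypothetical, not the actual TDR (Java) API.

def put_data(user, source, preexisting_metadata, logical_name, repository):
    """Store data in the repository and return its unique ID handle."""
    # 1. Locate storage, checking the user's authorization for that space.
    target = repository.locate_storage(user, size_hint=source.size)
    # 2. Generate a unique, non-semantic ID to serve as the data handle.
    data_id = uuid.uuid4().hex
    # 3. Copy the data to the chosen storage location.
    repository.copy(source, target, data_id)
    # 4. Generate additional metadata, e.g. where the data now resides.
    metadata = dict(preexisting_metadata)
    metadata["physicalLocation"] = f"{target}/{data_id}"
    # 5. Incorporate the metadata into the catalog hierarchy at logical_name.
    repository.catalog(logical_name, data_id, metadata)
    return data_id
```

The returned ID, or equivalently the (user, logical name) pair, would then serve as the retrieval key described above.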
These jobs are each performed by a respective specialized task manager: the Storage Locater, the Unique ID Generator, the Data Mover, the Metadata Generator, and the Cataloger.

4.1 Locate Storage

Locating storage for the data entails knowing where there is space available to put the data, so the Storage Locater must track space availability. It is the job of the Storage Locater to ensure that the user is authorized to use a space and that, by doing so, they will not violate any space management policies. Depending on authorizations, the available space could include a variety of storage devices, such as space on a mass storage device.

4.2 Generate a Unique ID

This ID will be used as a handle to the data. It must be guaranteed to be unique within the domain in which it exists. For the TDR serving both LEAD and the STORM Case Study Archive, the ID must be unique within all the catalogs served by the TDS. (Algorithms exist that generate IDs with such an extremely low likelihood of duplication that they can be considered unique.) The Unique ID Generator performs this task.

4.3 Copy the Data to the Storage

The Data Mover is responsible for copying the data to the target destination. The Data Mover makes decisions about how to move the data based in part on the protocol portion of the URL of the source file. HTTP service is assumed to be available at the data source, but if the protocol indicates a better option, such as GridFTP, that will be tried.

4.4 Generate Additional Metadata

The new metadata to be generated includes a record of where the data now physically resides. (This mapping from unique ID to physical location could become complex if replication were involved; in such a case, name mapping could be achieved via a separate Name Resolution object or service.) Perhaps the most important additional metadata generated is the set of access URLs for the data. In order to create these, an attempt must be made to determine the data
format. The Metadata Generator currently uses the following logic to determine how the data can be served:

1) All files can be served via HTTP.
2) If there is a GridFTP server present, all files can be served via GridFTP.
3) If the file can be successfully opened by the Common Data Model file accessor, then it can be served via OPeNDAP.
4) If 3 above is true and the file is known to be a grid, then it can also be served via WCS.

4.5 Incorporate the Metadata into the Metadata Hierarchy

The TDR updates the appropriate catalogs with the new metadata, which involves the creation of new datasets in the existing catalogs and may also involve the creation of new catalogs. The logical name is used to determine where in the metadata hierarchy to place the new dataset; reflecting the logical path may involve the generation of nested datasets. The TDR then signals the TDS to reread the catalogs in order to incorporate the changes. Now the data and metadata can be served via the TDS.

5 FUTURE DIRECTIONS

A prototype implementation of the TDR is available on the Unidata LEAD test bed, where it is currently handling input for both clients. The next important functionality to be added is support for handling a collection of files in the form of a compressed zip file. It is assumed that these files are of the same data type, so that a single chunk of metadata can be applied to the entire set. Also, general repository editing capabilities need to be provided, such as the ability to add and edit metadata and to remove data and its associated metadata.
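The Metadata Generator's four serving rules (section 4.4) can be sketched as a small decision function. The predicate arguments stand in for a real probe of the file with the Common Data Model; they are assumptions for illustration, not the actual CDM API:

```python
# Sketch of the Metadata Generator's logic for deciding which access
# protocols apply to a stored file. cdm_can_open and is_grid are
# caller-supplied predicates standing in for a real CDM file probe.

def access_protocols(path, gridftp_available, cdm_can_open, is_grid):
    """Return the list of protocols by which the file can be served."""
    protocols = ["http"]                 # 1) HTTP always works
    if gridftp_available:
        protocols.append("gridftp")      # 2) GridFTP if a server is present
    if cdm_can_open(path):
        protocols.append("opendap")      # 3) OPeNDAP if the CDM can open it
        if is_grid(path):
            protocols.append("wcs")      # 4) WCS additionally for grids
    return protocols
```

An access URL would then be generated for each protocol in the returned list.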
6 ACKNOWLEDGEMENTS

LEAD is a Large Information Technology Research (ITR) Grant funded by the National Science Foundation under the following Cooperative Agreements: ATM (University of Oklahoma), ATM (Colorado State University), ATM (Millersville University), ATM (Indiana University), ATM (University of Alabama in Huntsville), ATM (Howard University), ATM (University Corporation for Atmospheric Research), and ATM (University of Illinois at Urbana-Champaign, with a sub-contract to the University of North Carolina).

7 REFERENCES

Droegemeier, K. K., and Co-Authors, 2005: Service-oriented environments in research and education for dynamically interacting with mesoscale weather. Computing in Science and Engineering, 7.

Droegemeier, K. K., and Co-Authors, 2005: Linked environments for atmospheric discovery (LEAD): Architecture, technology roadmap and deployment strategy. Preprints, 21st Conf. on Interactive Info. Processing Systems for Meteorology, Oceanography, and Hydrology, San Diego, CA, Amer. Meteor. Soc.

Murray, D., J. McWhirter, S. Wier, and S. Emmerson, 2003: The Integrated Data Viewer: a web-enabled application for scientific analysis and visualization. Preprints, 19th Conf. on Integrated Information and Processing, 8-13 February, Long Beach, CA, Amer. Meteor. Soc.

THREDDS Catalog Specification: tech/catalog/invcatalogspec.html

OPeNDAP Project

The Integrated Data Viewer (IDV): dv/
The Common Data Model (CDM): DM/index.html

The THREDDS Data Server (TDS): S/tech/#TDS
Distributed Computing. Mark Govett Global Systems Division
Distributed Computing Mark Govett Global Systems Division Modeling Activities Prediction & Research Weather forecasts, climate prediction, earth system science Observing Systems Denial experiments Observing
Flattening Enterprise Knowledge
Flattening Enterprise Knowledge Do you Control Your Content or Does Your Content Control You? 1 Executive Summary: Enterprise Content Management (ECM) is a common buzz term and every IT manager knows it
Windchill Service Information Manager 10.1. Curriculum Guide
Windchill Service Information Manager 10.1 Curriculum Guide Live Classroom Curriculum Guide Building Information Structures with Windchill Service Information Manager 10.1 Building Publication Structures
David P. Ruth* Meteorological Development Laboratory Office of Science and Technology National Weather Service, NOAA Silver Spring, Maryland
9.9 TRANSLATING ADVANCES IN NUMERICAL WEATHER PREDICTION INTO OFFICIAL NWS FORECASTS David P. Ruth* Meteorological Development Laboratory Office of Science and Technology National Weather Service, NOAA
Terms and Definitions for CMS Administrators, Architects, and Developers
Sitecore CMS 6 Glossary Rev. 081028 Sitecore CMS 6 Glossary Terms and Definitions for CMS Administrators, Architects, and Developers Table of Contents Chapter 1 Introduction... 3 1.1 Glossary... 4 Page
Jozef Matula. Visualisation Team Leader IBL Software Engineering. 13 th ECMWF MetOps Workshop, 31 th Oct - 4 th Nov 2011, Reading, United Kingdom
Visual Weather web services Jozef Matula Visualisation Team Leader IBL Software Engineering Outline Visual Weather in a nutshell. Path from Visual Weather (as meteorological workstation) to Web Server
Addressing the SAP Data Migration Challenges with SAP Netweaver XI
Addressing the SAP Data Migration Challenges with SAP Netweaver XI Executive Summary: Whether it is during the final phases of a new SAP implementation, during SAP upgrades and updates, during corporate
