Argo Data Management Handbook
Last updated: December 2002
TABLE OF CONTENTS

1. INTRODUCTION
2. GLOBAL DATA FLOW
3. RESPONSIBILITIES
   3.1. National Centres
   3.2. Global Argo Centres
   3.3. Principal Investigators
   3.4. Regional Centres
   3.5. The Argo Information Centre (AIC)
   3.6. Long Term Archive
4. DATA MANAGEMENT TASKS
   4.1. Float Identifiers
   4.2. Data Formats
   4.3. Data Transmission
   4.4. Real-time Quality Control
   4.5. Delayed-mode Quality Control
   4.6. Data Validation
5. GLOBAL DATA MANAGEMENT ACTIVITIES
6. REFERENCED DOCUMENTS
HISTORY

Version   Date                          Comment
-         December 2001 to March 2002   Initial version circulated among the Argo Data Management Committee
1.0       July 2002                     First version, taking into account the AST remarks at the 4th AST meeting
1.1       September 2002                AIC and R. Keeley remarks
1.2       December 2002                 Remarks from the 3rd Argo Data Management meeting
1. INTRODUCTION

Argo is an internationally coordinated activity directed at characterizing the temperature and salinity structure of the mid- and upper ocean and the advective field at mid-depth through the deployment of autonomous profiling floats. It is envisioned that the resulting temperature and salinity profiles will be used to: (1) initialize climate forecast models, (2) detect and attribute climate change effects on the ocean, (3) calibrate/validate satellite altimetric data, and (4) increase understanding of the ocean and its role in global climate. These objectives demonstrate that Argo data are to be used by both operational and research communities.

The international and global nature of Argo dictates that there will be diverse float manufacturers, float deployers and data management Centres. The Argo Science Team has stated that, as far as practical, data processing procedures should be standardized to simplify the tasks of the data users. To maximize the effectiveness of the Argo dataset, a data management methodology is required to ensure that high-quality information is readily accessible to a wide variety of users in a timely manner. This Handbook is one step in achieving this objective. Specifically, the objectives of the Data Management Handbook are: (1) to standardize, as far as practical, data management procedures among Argo Data Centres through the distribution of accepted procedures and protocols, (2) to provide information for new Data Centres, and (3) to familiarize Argo data users with data management procedures and standards.

Any agency, country or consortium can take part in the Argo program. They may set up their own data processing facilities or make use of existing ones. As a contributor, they must agree that any data they collect be made available immediately and without restriction. They must also agree to abide by the procedures and practices described in this handbook.
2. GLOBAL DATA FLOW

The assembly of data in the Argo program is a distributed responsibility. In many cases, individual countries have established data Centres to handle the data collected by the floats that their countries have contributed. In other cases, agencies within countries, or groups of countries, have contributed floats to the Argo program but make use of an existing data processing Centre. Argo data are processed and distributed through a network involving different actors:

PIs: the scientists who deploy the floats, then carry out delayed-mode QC and return the data to National Centres within 5 months of the observations. They can delegate this task to a National Centre, but they remain the point of contact for questions about the quality of the data.

National Centres: the data Centres that collect, qualify, process and distribute the float data for which they are responsible. Data are distributed to the PIs and to the GTS within 24 hours of the float surfacing. They also send the data to the Global Data Centres. National Centres are called DACs in the rest of this document.

Global Data Centres: the two Internet distribution points for Argo data, located at Coriolis (Ifremer, France) and US GODAE (FNMOC, USA). Coordination between these Centres occurs daily. Global Data Centres are called GDACs in the rest of this document.

Regional Data Centres: data Centres responsible for quality control of float data collected in specified regions.

Argo Information Centre (AIC): located in Toulouse, France, and responsible for providing information on the Argo program.

Argo long-term archive: a data Centre located at NODC (USA), in charge of ensuring the long-term archiving of all Argo data.

Figure 1: A visual summary of the data flow from float to global archives
3. RESPONSIBILITIES

3.1. National Centres

Data processing activities at National Centres are schematized in the figure below and described in the following paragraphs.

[Figure: National Centre processing flow — data communications systems, decode & format, automatic quality control, PI verification & matching against the metadata and data repositories, manual quality control where needed, then distribution to the GTS, the GDACs, PIs and Regional Data Centres. Black lines = within 24 hours; blue lines = varying times from 24 hours to months; green lines = as needed; red lines = within 5 months.]

Data are transmitted from floats while they are on the surface. The satellite communications system (e.g. Service ARGOS, or others) relays the data to ground stations. These either send the data to the National Centres for processing, or the National Centres connect directly to the communications system provider to download the data. This data transfer happens several times a day. Many of the countries supplying floats also have a National Centre designated to process the data. Although the exact nature of the work carried out at these Centres varies, each is generally responsible for converting the data stream from each float into profile and drift information. They also have the responsibility to maintain the master versions of the metadata describing the float specifications, as well as the data for their floats.
Data tracking is needed to ensure that all components in the data system receive the appropriate data on the appropriate schedules. The first step in data tracking is the registration of new float information by PIs. The PI registers newly deployed floats by forwarding the necessary information to the appropriate National Centre, which relays it to the Global Data Centres. This information allows the National Centre to anticipate the arrival of profiles and to initiate inquiries when data are not received. Each National Centre holds the master list of instruments, profiles and current float status for the region, or the PIs, for which it is responsible. As part of this process, National Centres may request unique float numbers from the WMO before floats are deployed (see section 4.1).

National Centres prepare the data for dissemination on the GTS using a basic set of QC tests ([QC]). They also relay data to PIs within their country or to PIs who use the services of the processing Centre. At the same time, within 24 hours, the information from the float is also sent to the Global Data Centres as an alternate dissemination channel to the GTS for the research and modeling communities. In some cases, the National Centre may also act as a distribution Centre within its own country for all Argo data collected in a region of interest to national PIs.

National Centres relay the data to the appropriate PI for delayed-mode qualification. Upon receiving data back from the PIs, the National Centre has the responsibility to ensure that the content of the returned file is valid and that each PI has made the appropriate entry (or entries) in the History section of the netCDF file. If this information is missing, the Centre is responsible for working with the PI to ensure the proper information is included before the data are sent on to the Global Data Centres.
This transfer of delayed-mode profiles should occur within 5 months of the observation date of the profiles. Additionally, if analyses at Regional Centres raise questions about the data, and if corrections are made by a PI, these pass through the National Centres back to the Global Data Centres.

3.2. Global Argo Centres

Even though the National Centres keep the master copies of both data and metadata for the floats for which they are responsible, the Global Data Centres are the source from which all users should obtain their data. By centralizing this function, users can be assured that they are receiving the most up-to-date versions and that the data they receive are the same as all other users would receive. In addition, a central website will provide an extensive set of tools to query, retrieve, plot and compare the profiling float data dynamically. The choice of which data server to access can be determined by its proximity to the user, which also helps balance the load placed on the two servers.

Data processing activities at the Global Data Centres are schematized in the figure below and described in the following paragraphs. Each of the two global servers receives data directly from every National Centre, with the latest version of its float profiles, trajectory data and float metadata. Both servers should be updated simultaneously in order to ensure consistency between the two datasets. Each file is the responsibility of a single DAC (i.e. the data provider), which guarantees the quality and integrity of the data. A security system is set up on each server to ensure that only the appropriate DAC is given the privileges necessary to create or modify a netCDF file for a given profiling float.
[Figure: GDAC processing flow — National Centre data & metadata, verification & matching, data and metadata repositories synchronized with the other GDAC, then FTP server, WWW server and products serving PIs, National Centres, Regional Centres and others.]

The GDACs are responsible for verifying the format of the files and for matching received files to files already on the servers when updates are required. The two GDACs provide FTP and WWW access to the Argo data. The FTP server, suitable for data retrieval by a script or program, provides the data organized:
- geographically (by ocean basin) and then temporally (by year/month/day) within each basin;
- by data provider and platform (DAC and Argo float number);
- by processing date: the latest processed data organized by processing day (the last 12 months are accessible).

On the FTP site, multi-profile files will also be available. These files will be generated by the GDACs from the individual profile files provided by the DACs. The WWW server will provide a wider set of subsetting facilities and dynamic manipulation tools; US GODAE and IFREMER have agreed on a common set of WWW functionalities but may also provide additional functionality depending on their individual choices. The Global Centres will also distribute to users all relevant products that may be designed or processed by PIs or Regional Centres. This list will evolve with time, and a preliminary list should be presented at the 5th Argo Science Team meeting in 2003.
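As an illustration of the three FTP organizations listed above, the sketch below builds the corresponding access paths for a single profile file. The directory names and file-naming pattern are hypothetical, chosen only to show the structure; the actual GDAC layout is defined in [GDAC].

```python
def gdac_ftp_paths(dac, wmo_id, basin, year, month, day, processing_day):
    """Sketch of the three access paths a GDAC FTP server offers for one
    profile file. All directory and file names below are illustrative
    assumptions, not the real server layout (see [GDAC])."""
    profile = f"{wmo_id}_{year}{month:02d}{day:02d}.nc"  # hypothetical file name
    return {
        # geographic: ocean basin, then year/month/day within the basin
        "geographic": f"geo/{basin}/{year}/{month:02d}/{day:02d}/{profile}",
        # by data provider and platform (DAC and Argo float number)
        "provider": f"dac/{dac}/{wmo_id}/{profile}",
        # by processing date (latest 12 months kept online)
        "processing": f"latest_data/{processing_day}/{profile}",
    }

paths = gdac_ftp_paths("coriolis", "6900123", "atlantic_ocean",
                       2002, 12, 1, "20021202")
```

A script can thus retrieve the same profile through whichever organization suits it, e.g. by basin and date for regional studies, or by DAC and float number to follow a single instrument.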
3.3. Principal Investigators

The PIs are the scientists who deploy the floats, who are responsible for verifying the quality of the data, and who have a scientific interest in using these data. The data sent to the PIs are the same as those sent to the Global Data Centres. The PIs take the data and carry out further testing and calibration as required to ensure high-quality data. Within 5 months, they send the data back to their National Centre, which then uploads the information to the Global Data Centres. A PI can delegate the delayed-mode QC for their floats to another Centre, but keeps the responsibility of guaranteeing the quality of the float data.

The PIs may also prepare products based on whatever instrumentation was used to collect profile data in a particular area and for a particular time frame. These may be sent to the Global Data Centres for wider distribution.

3.4. Regional Centres

Regional Centres are groups that take on the responsibility to validate all float data in a specific region (and perhaps other data as well) through more rigorous scrutiny, and to derive products. They typically retrieve data for their use from the Global Data Centres and from whatever other data servers can provide the additional observations they need or want. At the 4th Argo Science Team meeting the following activities were envisioned for Regional Data Centres:
- Determining the internal consistency of the Argo dataset by comparing Argo data from different sources in the region and through comparison with ongoing hydrographic cruises. A mechanism for feedback to PIs will be essential.
- Comparing Argo data with model output and with assimilated fields, and understanding why specific data are rejected by assimilations (model inconsistencies, systematic data errors).
- Preparing and distributing Argo data products and services.
- Providing scientific QC as a service to national programs without such capabilities.
- Coordinating Argo float deployment plans for the region.
- Providing advice/guidance on regional deployment needs.
- Developing new real-time quality control tests if appropriate for the particular region.
- Assembling the best available recent CTD/hydrographic data for real-time and delayed-mode calibration purposes.

Regional Data Centres may be contributed by a single national data centre or may result from collaboration among two or more groups. Collaborative efforts might target different sub-regions or contribute different areas of expertise. Argo national programs (and institutions) interested in forming or participating in Regional Data Centres are listed below. The first one listed for each ocean is designated as the lead institution, to work with the others in developing a Regional Data Centre.

Atlantic Ocean: France (IFREMER/Coriolis), U.S.A. (AOML)
Pacific Ocean: Japan (JAMSTEC), U.S.A. (PMEL), U.S.A. (IPRC)
Southern Ocean: U.K. (BODC), Australia (CSIRO/BOM)
Indian Ocean: India (INCOIS), Australia (CSIRO/BOM), U.S.A. (PMEL), U.S.A. (IPRC)
In the examination of the data from a region and in the generation of these products, previously undetected problems in the data may be found. If problems are detected, it is the responsibility of the Regional Centre to make appropriate notations in the files in which errors were found. Information such as the tests performed, when, and by whom is important. Flags should be set on the data and, if changes are made, the values as received should be recorded in the History section of the data format. The data will then be returned to the PIs, who will validate the proposed corrections before transmitting the data back to the Global Data Centres through their National Centre.

3.5. The Argo Information Centre (AIC)

The Argo Information Centre (AIC), established in Toulouse, France, is responsible for the international technical coordination of Argo, under the general supervision of the Chairman of the Argo Science Team, acting in close collaboration with the Secretary of the IOC and the Secretary-General of the WMO. The AIC acts to resolve any issues arising between float operators, manufacturers, data telecommunication providers, data assimilation Centres, quality control and archiving agencies, the WMO and the IOC; encourages the participation of new countries; and assists as appropriate in the implementation of a global network. The AIC website gives general information on the Argo project (participating countries, contacts, real-time status of the network, status of DAC developments, maps, news, etc.) and offers a float monitoring system which, in particular, makes it possible to inform states about floats entering their Exclusive Economic Zones. The application is also used for deployment strategy. The AIC, which neither distributes nor archives data, guides users to the Argo GDACs and regional products.

3.6. Long Term Archive

Argo is an internationally coordinated project that requires international participation to ensure global coverage.
Individual countries have established data Centres to handle the data collected by the profiling floats that their countries have contributed. The Argo program needs a single Centre that will archive float data over the long term. The U.S. National Oceanographic Data Centre has volunteered to provide for the long-term archival of Argo data and information. In collaboration with the Global Data Centres, it will recover all the available data and provide long-term access to Argo data. It will also act as the Global Data Centres' backup, and will be responsible for providing Argo data on physical media (such as CDs or DVDs) for users without good Internet access.
4. DATA MANAGEMENT TASKS

An effective data management methodology includes many components to ensure that the highest quality data move from sensor to user to final archive in a timely manner. It should be noted that the definition of "timely" is user dependent, ranging from a day or less for some operational users of Argo data to months for some research users. There is also the need to provide feedback between components to allow for the reporting of problems and corrections to the system. Finally, there must be a way to measure the performance of the system to ensure that it meets stated requirements. Each of the following sections discusses an individual component of the data system.

4.1. Float Identifiers

Each float has a platform telecommunications terminal (PTT) by which it identifies itself to the communications system. These PTT numbers are used and reused as floats are deployed and fail. Through an agreement with the World Meteorological Organization (WMO), a unique WMO number is assigned to each float deployment. This number is of the form A9xxxxx, where A represents the WMO region in which the float is deployed, 9 is the designator of a profiling float, and xxxxx is a 5-digit number. Each time a float is deployed, a new, never before used WMO identifier is assigned. It is this WMO identifier that is used in the data transmitted on the GTS and that serves as the unique identifier for a float in all data exchanges. The link between the PTT and the WMO identifier is maintained in metadata files kept at each Centre and held on the Argo servers. When the data are published on the GTS in TESAC messages, the WMO number is prefixed with the letter Q. The WMO number without the Q is the identifier used on the GDAC FTP and WWW sites.

4.2. Data Formats

As shown in the data flow diagram, there are two ways that data can reach users.
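The identifier convention of section 4.1 is easy to mechanize. The sketch below validates the A9xxxxx form and produces the Q-prefixed GTS identifier; the function name and error messages are our own illustration, not part of any Argo software.

```python
def gts_identifier(wmo_id: str) -> str:
    """Return the GTS (TESAC) identifier for an Argo float: the WMO
    number prefixed with 'Q'. The WMO number has the form A9xxxxx,
    where A is the WMO region digit, 9 marks a profiling float and
    xxxxx is a 5-digit number."""
    if len(wmo_id) != 7 or not wmo_id.isdigit():
        raise ValueError("Argo WMO identifier must be 7 digits (A9xxxxx)")
    if wmo_id[1] != "9":
        raise ValueError("second digit must be 9 (profiling float designator)")
    return "Q" + wmo_id
```

For example, a float with the (made-up) WMO number 5900123 would appear on the GTS as Q5900123 and on the GDAC sites simply as 5900123.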
Real-time data are distributed both on the GTS and through the Global Data Centres. The data sent to the GTS are presently encoded in TESAC form. Data that have been flagged as bad by the QC process are removed from the TESAC message. In addition, because the TESAC code form reports depth while floats report pressure, a conversion from pressure to depth must be made. UNESCO software routines are recommended for this conversion (see [QC] for a specific reference). Data encoded in TESAC should be at the same vertical resolution as reported through the communications system from the float. Information about the TESAC code form can be obtained by contacting the WMO or visiting its website.
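The pressure-to-depth conversion mentioned above is commonly done with the UNESCO (Fofonoff and Millard, 1983) algorithm. A minimal sketch, assuming a standard ocean and omitting the full routine's geopotential-anomaly correction, is:

```python
import math

def pressure_to_depth(p_dbar, lat_deg):
    """UNESCO (1983) depth (m) from sea pressure (dbar) and latitude.
    Standard-ocean assumption (0 degC, 35 PSU); the small geopotential
    anomaly correction of the full routine is omitted in this sketch."""
    x = math.sin(math.radians(lat_deg)) ** 2
    # local gravity at the given latitude, including the pressure-dependent term
    g = 9.780318 * (1.0 + (5.2788e-3 + 2.36e-5 * x) * x) + 1.092e-6 * p_dbar
    num = (((-1.82e-15 * p_dbar + 2.279e-10) * p_dbar
            - 2.2512e-5) * p_dbar + 9.72659) * p_dbar
    return num / g

# UNESCO check value: 10000 dbar at latitude 30 gives about 9712.65 m
```

Note that the conversion is latitude dependent, which is one reason the netCDF files keep the original pressures rather than derived depths.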
Full-resolution profile data, together with quality flags, metadata and technical data, are encoded into netCDF, a self-describing format, and sent to the Global Data Centres. Information about netCDF can be obtained from the Unidata home page for netCDF. The format of the data sent to the Global Data Centres is described in [Format]. All reported data, regardless of the quality flags assigned by the QC process, are included in the netCDF file, along with those quality flags. In addition, the netCDF file format has a History section used to record which agency carried out which actions on what date, and to record which QC tests were performed and which were failed at each Centre. Finally, the netCDF format stores data measured on pressure levels. Care must be taken that the depth values reported in the TESAC code form are not included directly in the netCDF format as pressures.

4.3. Data Transmission

Within 24 hours of data collection, the profile data are inserted onto the GTS. In the future, as the GTS format is developed, it will be possible to include more metadata, including data quality flags, in the real-time data transmissions. The surface drift data and additional metadata cannot initially be sent on the GTS. If for some reason the 24-hour target for GTS insertion cannot be met, the data should still be sent to the GTS: although some clients require the data within 24 hours, many others can make use of the data after 24 hours. At the same time as the profile data are sent to the GTS, all of the data and metadata will also go to the Global Data Centres. Because the format for storing data on the Argo servers is more capable than the initial format being used for the GTS, the Argo servers will have the most complete data and information for the floats.
Where possible, it is advisable that the data also be sent immediately to the PIs on the same time schedule. This allows them the maximum amount of time to carry out delayed-mode QC.

4.4. Real-time Quality Control

Because of the speed at which the data are required, the quality control procedures applied to the real-time data must run in an automated mode, even though experience shows that some problems will not be caught. These problems should be caught in the delayed-mode QC described in [Delayed-QC]. Each Centre carries out the same base set of tests. The data formats include a series of flags that indicate the overall assessment of the quality of a data point relative to the specific tests applied. Based on the experience acquired in the development of the Global Temperature and Salinity Profile Project (GTSPP), a set of automatic tests has been defined (see [QC]). These tests are applied to the profile data before they are converted to a message format suitable for the GTS. In the case of TESAC, since it is not possible to include data quality flags, and since it is undesirable to send data that have failed the QC tests, any data value flagged as bad by the automatic test procedures is removed from the TESAC report. If float positions, dates, times or identifiers are in question, none of the data from the float will go to the GTS. Certain profiles may have sufficient problems detected by the automated testing that the profile does not go to the GTS. In this case, it may be possible to carry out further testing and salvage all or part of the data. If this is possible, the data should be put onto the GTS even though they fail to meet the 24-hour target described earlier. Likewise, all of the data, with all appropriate flags, should be sent on to the Global Data Centres for further distribution.
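The distribution policy described above — bad values removed from the TESAC report, everything kept with its flags for the GDACs — can be sketched as follows. The tuple layout and the flag values (1 = good, 4 = bad, following the Argo real-time QC flag scale) are assumptions of this illustration.

```python
BAD = 4  # assumed flag value for "bad" in the real-time QC flag scale

def split_for_distribution(levels):
    """levels: list of (pressure, temperature, salinity, qc_flag) tuples.
    Returns (gts_levels, gdac_levels): values flagged bad are dropped
    from the GTS (TESAC) message, while the GDAC copy keeps every level
    together with its flag."""
    gts_levels = [(p, t, s) for (p, t, s, flag) in levels if flag != BAD]
    gdac_levels = list(levels)  # nothing is removed; flags are retained
    return gts_levels, gdac_levels
```

This is why the GDAC holdings are always at least as complete as what appears on the GTS: flagged values are hidden from TESAC users but remain available, with their flags, for delayed-mode re-examination.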
4.5. Delayed-mode Quality Control

Following the real-time QC, a higher level of QC is carried out by the responsible PIs and Regional Centres. The delayed-mode QC is completed by the PIs within 5 months of data collection. In delayed mode, the responsible parties are in possession of a more complete data set for mapping and comparison purposes. Neighboring floats from other countries, different data types and subsequent profiles from a given float can all be used in the identification of suspect data. The individual PI's role is to make a careful examination of profiles and to render his or her opinion on data integrity, salinity calibration changes, etc. The Regional Centres take a larger perspective (e.g., basin-wide) for the same purpose. As with the real-time QC procedures, the PIs and Regional Centres attach unique QC flags and also attach recalibration data as needed to the metadata files. After QC, the data are resubmitted to the appropriate PI for verification and then to the National Centre, which uploads the data to the Global Data Centres. Once agreed, the delayed-mode QC procedures will be described in [Delayed-QC].

4.6. Data Validation

Changes in the mix of data types in the data archives, and even technology advances within a particular type, can result in spurious climate signals when the data are examined. For example, it is known that XBT data contain systematic errors not present in floats and other higher quality data. It will be necessary to intercompare the results from the various profiling instruments (CTD, XBT, XCTD, floats, etc.) to identify any systematic differences and to ensure that spurious signals are not introduced into the long-term record. Validation reports will be available to the Argo community through the AIC. Producing such reports is one of the deliverables expected from the Regional Centres.
5. GLOBAL DATA MANAGEMENT ACTIVITIES

To have an objective overview of the progress of the Argo program in general, and of data management activities in particular, monitoring indicators need to be set up at the global level. These are not yet defined, but the important elements can be highlighted now.

Continuous monitoring of the locations of operating floats is needed to ensure that design requirements are being met. Specifically, data-void areas will be identified and appropriate action will be taken to fill them. While this monitoring can be carried out by individual processing Centres with access to the complete data collection, maintenance of the measurement network requires close cooperation between Argo partners.

Other types of monitoring that should be undertaken include:
- testing the ability of the data system to deliver real-time data within 24 hours of the float coming to the surface;
- checking that delayed-mode data are available to users within 5 months;
- checking float lifetimes;
- notifying countries of floats that may drift into their territorial waters;
- monitoring the general quality of data from the processing Centres.

In addition, the robustness of the QC procedures must be verified through inter-comparison. One approach is for the National Centres to exchange profiles, qualify the raw data and compare flags. Furthermore, the use made of the Argo data by climate models must be evaluated. Discarded data must be examined to determine whether the problems lie with the sensors or with the assimilation techniques. The monitoring can be carried out by volunteer National Centres, the Global Data Centres or the AIC, and the results may be made available through the AIC and/or the Global Data Centres.
6. REFERENCED DOCUMENTS

[Format] Argo Data User Manual, Version 1.0, July 2002
[QC] Real-Time QC Procedures, Version 1.0, October 2001
[GDAC] US GODAE/IFREMER Data Servers as part of the Argo Data Distribution Network, Version 2.2, February 2002
[Delayed-QC] Delayed-mode QC procedures (to be written when agreed)
Argo National Data Management Report 2009 Coriolis data center Annual report August 2008 September 2009 Version 1.1 September 24th, 2009 13 725 new Argo profiles from 436 floats managed by Coriolis DAC
J7.4 AUTOMATED MARINE WEATHER OBSERVATIONS ON RESEARCH VESSELS AS PART OF AN OCEAN OBSERVING SYSTEM Shawn R. Smith Center for Ocean-Atmospheric Prediction Studies, The Florida State University Tallahassee,
NOAA Big Data Project David Michaud Acting Director, Office of Central Processing Office Monday, August 3, 2015 Central Processing Portfolio Benefits and Scope Central Processing Portfolio Benefits Ensures
NODC s Data Stewardship for Jason-2 and Jason-3 Meeting October, 2013 Dr. Deirdre A. Byrne Dr. Yongsheng Zhang NOAA/NODC 1 What is NODC? NODC was founded in 1961 as part of the Naval Oceanographic Office
SeaDataNet 2 Kick-Off Meeting WP10 Development and regular update of data products Overview (INGV ULG S. Simoncelli) Athens, October 19-20th, 2011 OUTLINE Introduction: SDN JRA 4-9 outcome Objectives Description
NERC Thematic Programme Cloud Water Vapour and Climate (CWVC) Data Management Plan BADC December 2001 Updated February 2002 Scope The purpose of the CWVC data management plan is to set up a coherent approach
HYCOM Data Service An overview of the current status and new developments in Data management, software and hardware Ashwanth Srinivasan & Jon Callahan COAPS FSU & PMEL HYCOM Meeting Nov 7-9, 7 2006 Tallahassee,
Executive Summary Belize s HydrometDB is a Climatic Database Management System (CDMS) that allows easy integration of multiple sources of automatic and manual stations, data quality control procedures,
DOE/SC-ARM-14-008 Data Management Facility Operations Plan NN Keck January 2014 DISCLAIMER This report was prepared as an account of work sponsored by the U.S. Government. Neither the United States nor
Governance Framework For The AADC System For Indicator Monitoring & Reporting (SIMR) 1. Purpose & Function of SIMR Tool The requirement for an Antarctic component to the 2001 Australian State of the Environment
Minutes of the 3 rd Argo Regional Centre Meeting Tuesday 29 th September 2009 Toulouse, France Chairs: Claudia Schmid and Ann Gronell Thresher The stated objectives of the meeting are to: Review the status
SHared Access Research Ecosystem (SHARE) June 7, 2013 DRAFT Association of American Universities (AAU) Association of Public and Land-grant Universities (APLU) Association of Research Libraries (ARL) This
World Meteorological Organization WORLD METEOROLOGICAL ORGANIZATION INTERGOVERNMENTAL OCEANOGRAPHIC COMMISSION (OF UNESCO) JCOMM DATA MANAGEMENT PLAN Prepared by the Members of the Data Management Coordination
NASA s Big Data Challenges in Climate Science Tsengdar Lee, Ph.D. High-end Computing Program Manager NASA Headquarters Presented at IEEE Big Data 2014 Workshop October 29, 2014 1 2 7-km GEOS-5 Nature Run
Establishing and operating an Ocean Data Interoperability Platform ODIP EU US Australia cooperation By Helen Glaves NERC- BGS ODIP Coordinator & Dick M.A. Schaap MARIS ODIP Technical Coordinator Supported
International Journal of Grid and Distributed Computing 23 Remote Sensing Images Data Integration Based on the Agent Service Binge Cui, Chuanmin Wang, Qiang Wang College of Information Science and Engineering,
Data Quality Assurance and Control Methods for Weather Observing Networks Cindy Luttrell University of Oklahoma Oklahoma Mesonet Quality data are Trustworthy Reliable Accurate Precise Accessible Data life
Second Coastal Environmental Assessment Workshop, 11 September 2008, Toyama, Japan Current Status of National and International Oceanographic Database Toru Suzuki Marine Information Research Center IODE
8B.7 The THREDDS Data Repository: for Long Term Data Storage and Access Anne Wilson, Thomas Baltzer, John Caron Unidata Program Center, UCAR, Boulder, CO 1 INTRODUCTION In order to better manage ever increasing
Southeast Alaska Network Abstract Basic freshwater quality observations are maintained by SEAN in a consistent database. This paper summarizes the scope of these data. Methods for accessing data are explained.
Norwegian Satellite Earth Observation Database for Marine and Polar Research http://normap.nersc.no USE CASES The NORMAP Project team has prepared this document to present functionality of the NORMAP portal.
NOAA/National Climatic Data Center In situ snow observations Matt Menne NOAA/National Climatic Data Center Asheville, North Carolina, USA ECMWF Snow Archive Workshop November 17-19, 2014 1 Outline In situ
NOAA Direct Broadcast Real-Time Network: Current Status and Plans for Delivering Sounder Data to DRARS Liam Gumley (NOAA DB Demonstration Technical Manager), Bruce Flynn, Heath Skarlupka, David Santek,
Project Name Equipment Process Line/Location Project Number Serial Number Model Number Protocol number WRITTEN BY: REVIEWED BY: Position APPROVAL TO EXECUTE: Position: PROTOCOL COMPLETION APPROVAL: Position:
On the use of Satellite Altimeter data in Argo quality control Stéphanie Guinehut CLS, Space Oceanography Division Argo-France 14-15/05/2009 Brest -1- - Objective Background idea: to check the data before
ANALYSIS OF DATA EXCHANGE PROBLEMS IN GLOBAL ATMOSPHERIC AND HYDROLOGICAL NETWORKS SUMMARY REPORT 1 June 2004 Global Climate Observing System (GCOS) Secretariat 1 This summary report is based on a study
STATEMENT OF WORK NETL Cooperative Agreement DE-FC26-02NT41476 Database and Analytical Tool for the Management of Data Derived from U. S. DOE (NETL) Funded Fine Particulate (PM 2.5 ) Research PROJECT SCOPE
13.2 THE INTEGRATED DATA VIEWER A WEB-ENABLED APPLICATION FOR SCIENTIFIC ANALYSIS AND VISUALIZATION Don Murray*, Jeff McWhirter, Stuart Wier, Steve Emmerson Unidata Program Center, Boulder, Colorado 1.
Near real-time transmission of reduced data from a moored multi-frequency sonar by low bandwidth telemetry Rene A.J. Chave, David D. Lemon, Jan Buermans ASL Environmental Sciences Inc. Victoria BC Canada
Virginia Commonwealth University Rice Rivers Center Data Management Plan Table of Contents Objectives... 2 VCU Rice Rivers Center Research Protocol... 2 VCU Rice Rivers Center Data Management Plan... 3
HYCOM Data Management -- opportunities & planning -- Steve Hankin (NOAA/PMEL) Ashwanth Srinivasan, (RSMAS) Peter Cornillon, OPeNDAP PI (URI) Mike Clancy, US GODAE Server (FNMOC) Jon Callahan, Kevin O Brien
Introduction to IODE Data Management Greg Reed Past Co-Chair IODE Outline Background - Introduction to IOC and IODE - IODE activities Oceanographic data management - End to end data management - Data stewardship
Federal GIS Conference February 9 10, 2015 Washington, DC Scientific Data Management and Dissemination John Fry Solution Engineer, Esri firstname.lastname@example.org Agenda Background of Scientific Data Management through
Outcomes of the CDS Technical Infrastructure Workshop Baudouin Raoult Baudouin.email@example.com Funded by the European Union Implemented by Evaluation & QC function C3S architecture from European commission
Truck Activity Visualizations In The Cloud Presented By Dr. Catherine Lawson NATMEC Improving Traffic Data Collection, Analysis, and Use June 4 7, 2012 Dallas, Texas What Question Are You Trying to Answer?
NATIONAL CLIMATE CHANGE & WILDLIFE SCIENCE CENTER & CLIMATE SCIENCE CENTERS DATA MANAGEMENT PLAN GUIDANCE Prepared by: NCCWSC/CSC Data Management Working Group US Geological Survey February 26, 2013 Version
SAMOS, GOSUD and the Observing System Monitoring Center Second Joint GOSUD/SAMOS Workshop Seattle, Washington, USA 10-12 Steve Hankin NOAA/PMEL (presenter) Derrick Snowden NOAA/OCO Kevin O Brien JISAO/UW
Software Design Proposal Scientific Data Management System Alex Fremier Associate Professor University of Idaho College of Natural Resources Colby Blair Computer Science Undergraduate University of Idaho
New challenges of water resources management: Title the future role of CHy by Bruce Stewart* Karl Hofius in his article in this issue of the Bulletin entitled Evolving role of WMO in hydrology and water
ATTACHMENT 8: Quality Assurance Hydrogeologic Characterization of the Eastern Turlock Subbasin Quality assurance and quality control (QA/QC) policies and procedures will ensure that the technical services
Document Title Version 1.1 Document Review Date March 2016 Document Owner Revision Timetable / Process RESEARCH DATA MANAGEMENT POLICY RESEARCH DATA MANAGEMENT POLICY Director of the Research Office Regular
CROATIAN BUREAU OF STATISTICS REPUBLIC OF CROATIA MAIN (STATISTICAL) BUSINESS PROCESSES INSTRUCTIONS FOR FILLING OUT THE TEMPLATE CONTENTS INTRODUCTION... 3 1. SPECIFY NEEDS... 4 1.1 Determine needs for
EUROPEAN COMMISSION Directorate-General for Research & Innovation Guidelines on Data Management in Horizon 2020 Version 2.0 30 October 2015 1 Introduction In Horizon 2020 a limited and flexible pilot action