J3.3 AN END-TO-END QUALITY ASSURANCE SYSTEM FOR THE MODERNIZED COOP NETWORK




Christopher A. Fiebrich*, Renee A. McPherson, Clayton C. Fain, Jenifer R. Henslee, and Phillip D. Hurlbut
Oklahoma Climatological Survey, Norman, OK

* Corresponding author address: Christopher A. Fiebrich, Oklahoma Climatological Survey, 100 E. Boyd, Suite 1210, Norman, OK 73019; e-mail: chris@mesonet.org

1. INTRODUCTION

In late 2004, the Oklahoma Climatological Survey (OCS) began to prototype a modernized data ingest, quality assurance (QA), and monitoring system for the modernized cooperative observer (COOP) program of the National Oceanic and Atmospheric Administration (NOAA). Recently renamed NOAA's Environmental Real-Time Observation Network (NERON), the modernized COOP program has established about 100 automated sites in Maine, New Hampshire, Vermont, New York, Connecticut, Rhode Island, and Massachusetts. OCS personnel actively archive real-time observations from these NERON sites in New England and perform rigorous quality assurance of the network data. The quality assurance system includes archiving site survey and site installation metadata in an online database, maintaining instrument and equipment information, performing both automated and manual quality control of the data, and providing detailed sensor problem reports to NERON maintenance personnel. The goal is to provide a research-quality data set that can be trusted by real-time decision makers and research scientists alike.

2. SITE SURVEY AND INSTALLATION METADATA

To aid site selection in New England, the OCS team created a web site with access to the site survey metadata (Fig. 1). The website displays photos acquired during the surveys, site geographic information, and the subjective site rating provided by the surveyor. In addition, users with NERON administrative access can record whether a site will be installed at the surveyed location and which measurements will be acquired.

After stations were installed, OCS quality assurance (QA) staff entered the metadata into the official NERON database (Fig. 2). The NERON database contains metadata including sensor inventories, site information, manual quality assurance flags, sensor coefficient information, and trouble tickets. The database provides a seamless connection between the numerous metadata components (e.g., the resolution of a trouble ticket can result in an update of sensor locations, sensor residence times, and associated quality assurance flags).

3. AUTOMATED QUALITY ASSURANCE

The automated QA system that OCS provides for NERON is adapted from that of the Oklahoma Mesonet (Shafer et al. 2000). Variables are tested first by a filter and second by a set of independent algorithms. The filter flags data that a) fail range tests, b) coincide with a technician visit, or c) have been flagged manually by a QA meteorologist. Data that pass all of the filter tests are then tested by a number of algorithms, each of which provides an assessment of the data. The independent algorithms currently consist of spatial tests, step tests, persistence tests, and step-to-normal tests. A decider algorithm combines the results of the various tests and provides a final flag for each observation. The flags are stratified as follows: a) 0 for a good observation, b) 1 for suspect, c) 2 for warning, and d) 3 for failure. Suspect flags are placed most commonly by the independent algorithms, whereas warning flags most commonly result from manual QA flags. Failure flags most often result from range-test failures or from a combination of suspect and warning flags from the various algorithms. The data product generators prevent any data flagged as warning or failure from being displayed on the public website.

4. MANUAL QUALITY ASSURANCE

Each day, QA meteorologists review the results from the automated QA system. They analyze each observation that is flagged as erroneous by the automated QA. As a result of this assessment, trouble tickets are issued to NERON technicians so that sensors can be replaced or repaired. In addition, the QA meteorologists manually flag additional observations after they determine the true trace date/time of the problem.

Fig. 1. Details of a site survey, as compiled by the site surveyor. Site details include a list of the pictures taken at the site during the survey (in this case, at an existing COOP site), the ratings provided by the site surveyor (with 5 stars representing an excellent location), general site information, geographic location (in latitude/longitude), and contact information of the site owner. These data are password protected.

4.1. Identifying Erroneous Observations

The automated QA identifies potential data problems across the NERON network each day. For example, on 13 March 2005, the automated QA began to flag observations from the NERON site near Jonesboro, ME, because of a suspected cool bias. Jonesboro lies in a somewhat isolated area on the eastern coast of Maine. Because of Jonesboro's location, the QA meteorologist was unsure whether the cool anomaly was a real mesoscale feature or was caused by a sensor problem. Jonesboro continued to report 2-5°C cooler every day than its nearest neighbor (about 50 km up the coast at Eastport, ME). The QA meteorologist contacted the site host and verified that the cool readings were erroneous. The sensor bias was traced to 1 February 2005, and the data were flagged manually back to that date (Fig. 3). In addition, a trouble ticket was issued so that the sensor could be replaced.

4.2. Over-riding Automated QA Flags

In some situations, the automated QA may inadvertently flag good observations as erroneous (Fiebrich and Crawford 2001). When this occurs, the QA meteorologist enters a manual QA flag in the database indicating that the observation is good. One such example, depicted in Fig. 4, occurred during a sea breeze across Long Island, NY. Note that the NERON sites on the eastern side of Long Island were much cooler (e.g., 14.4°C at Fire Island CS Coast Guard Station and 18.8°C at Jones Beach Coast Guard) than those on the western side (e.g., 27.7°C at Planting Fields Arboretum and 28.7°C at Vanderbilt Museum). The cold anomaly at the two eastern stations caused the automated QA algorithm to flag the observations as erroneous. When the QA meteorologist inspected the data (along with ASOS wind observations), it was determined that the cold anomaly was caused by onshore flow. The observations at Fire Island CS Coast Guard and Jones Beach Coast Guard were then manually flagged as good in the QA database.

Fig. 2. Portion of the NERON metadata database that lists station information for the Andover, ME COOP site.
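The two-stage flag logic described in Section 3 can be sketched as follows. This is a minimal illustration, not the operational OCS code: the function names, the escalation rule in the decider, and the thresholds are all assumptions.

```python
# Sketch of a NERON-style two-stage QA pipeline: a filter stage (range
# test, technician-visit check, manual flags) followed by a decider that
# combines the independent algorithms' results into one final flag.

GOOD, SUSPECT, WARNING, FAILURE = 0, 1, 2, 3

def filter_stage(value, valid_range, during_tech_visit, manual_flag):
    """First stage: range test, technician-visit check, manual QA flag."""
    lo, hi = valid_range
    if not (lo <= value <= hi):
        return FAILURE            # range-test failures flag immediately
    if during_tech_visit:
        return WARNING            # data taken during a maintenance visit
    if manual_flag is not None:
        return manual_flag        # flag entered by a QA meteorologist
    return None                   # passed; hand off to the algorithms

def decider(algorithm_flags):
    """Second stage: combine independent test results (spatial, step,
    persistence, step-to-normal) into a single final flag."""
    if FAILURE in algorithm_flags:
        return FAILURE
    # several concurrent suspect/warning results escalate to failure
    # (the count of 3 here is an illustrative choice)
    if sum(1 for f in algorithm_flags if f >= SUSPECT) >= 3:
        return FAILURE
    return max(algorithm_flags, default=GOOD)

def qa_flag(value, valid_range, during_tech_visit=False,
            manual_flag=None, algorithm_flags=()):
    """Final flag for one observation: 0 good, 1 suspect, 2 warning, 3 failure."""
    first = filter_stage(value, valid_range, during_tech_visit, manual_flag)
    return first if first is not None else decider(list(algorithm_flags))
```

A product generator would then withhold any observation whose flag is `WARNING` or higher from the public website, matching the behavior described in Section 3; note how a range failure short-circuits the pipeline before any independent algorithm is consulted.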

Fig. 3. Portion of the NERON metadata database that allows the QA meteorologists to manually flag observations. In this case, the QA meteorologist flagged the Jonesboro, ME (JONM1) air temperature observations with a warning flag back to 1 February 2005 because of a sensor cool bias.

Fig. 4. Station plot of the surface temperature (°C) and wind field (m/s) across Long Island at 3:00 PM on 20 April 2005. An onshore flow created a temperature gradient of more than 10°C across Long Island.
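The Jonesboro case in Section 4.1 suggests a simple neighbor-comparison screen that separates a persistent sensor bias from a transient mesoscale anomaly. The sketch below assumes co-registered daily mean temperatures from a station and its nearest neighbor; the 2°C threshold and five-day run length are illustrative, not the operational values.

```python
# Sketch of a persistent-bias screen: flag a station only when it reads
# colder than its nearest neighbor by more than `threshold` degrees C on
# `days` consecutive days. A one-day anomaly (e.g., a sea breeze) resets
# the run and is left to the routine spatial test.

def persistent_bias(station_temps, neighbor_temps, threshold=2.0, days=5):
    """Return True when the station shows a sustained cool offset that
    warrants a trouble ticket rather than an automatic flag alone."""
    run = 0
    for s, n in zip(station_temps, neighbor_temps):
        run = run + 1 if (s - n) <= -threshold else 0
        if run >= days:
            return True
    return False
```

With Jonesboro-like data (2-5°C cooler than Eastport every day) the run accumulates and the screen trips, whereas the one-afternoon Long Island sea-breeze anomaly of Section 4.2 resets the run and passes, which is the distinction the QA meteorologist had to draw by hand.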

5. NETWORK MONITORING

A number of OCS network administrators and student operators monitor the NERON network seven days per week. Personnel monitor the communication status of each station in real time via a number of websites. In addition, scripts automatically generate daily reports of battery voltages, technician visit status, and other diagnostics. These reports are emailed to the NERON managers, technicians, and QA personnel.

6. FUTURE WORK

Between now and the end of FY06, OCS will continue to prototype a modernized data ingest, quality assurance, and monitoring system for NERON. The metadata database will continue to evolve and grow as routine maintenance passes begin at the NERON sites across New England. In addition, the QA techniques and tasks will expand as NERON sites begin to receive wind and soil sensors. Research is currently underway to implement a real-time storm-total precipitation tool that provides automated assessments of the New England rain gauges. As some sites begin to migrate to a VHF two-way communications infrastructure (using the backbone of the National Law Enforcement Telecommunications System), new tools for monitoring and data collection will be developed. Likewise, as NERON sites are surveyed and installed in other parts of the nation, the scalability of the end-to-end QA system will continue to mature. As FY06 ends, the prototype system will be transferred into an operational system whose details are still to be determined.

7. REFERENCES

Fiebrich, C. A., and K. C. Crawford, 2001: The impact of unique meteorological phenomena detected by the Oklahoma Mesonet and ARS Micronet on automated quality control. Bull. Amer. Meteor. Soc., 82, 2173-2187.

Shafer, M. A., C. A. Fiebrich, D. S. Arndt, S. E. Fredrickson, and T. W. Hughes, 2000: Quality assurance procedures in the Oklahoma Mesonetwork. J. Atmos. Oceanic Technol., 17, 474-494.