Cloud Model Verification at the Air Force Weather Agency


2d Weather Group
Cloud Model Verification at the Air Force Weather Agency
Matthew Sittel, UCAR Visiting Scientist
Air Force Weather Agency, Offutt AFB, NE

Overview
- Cloud Models
- Ground Truth
- Verification Technique
- Sample Statistics
- MET Output

Cloud Models
Three are currently run at AFWA:
- Advect Cloud (ADVCLD): quasi-Lagrangian advection using global model winds
- Diagnostic Cloud Forecast (DCF): statistical relation based on recent performance of a mesoscale model
- Stochastic Cloud Forecast Model (SCFM): statistical relation based on long-term performance of the GFS model

Cloud Model Comparison

Model    Domain       Run Frequency   Forecast Time Step   Max Forecast Hour   Grid Spacing    Vertical Layers
ADVCLD   Hemispheric  3-hourly        1 hour               12 hours            16th mesh       5
DCF      Theater      6-hourly        3 hours              72 hours            16th mesh       5
SCFM     Hemispheric  6-hourly        3 hours              84 hours            45 km / 15 km   9

Cloud Model Outputs
- Total Cloud Amount
- Cloud Base Height
- Cloud Top Height
- Cloud Type (DCF only)

Total Cloud Amount
ADVCLD and SCFM forecast cloud amount to the nearest 1%; DCF does not.

DCF Total Cloud

Cloud Amount   Coded Value
0%             0%
1-20%          13%
21-40%         33%
41-60%         53%
61-80%         73%
81-100%        93%

SCFM and ADVCLD total cloud forecasts are converted to this categorical scheme when compared to DCF.
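A minimal sketch of this binning in Python (the function name and use of numpy are mine, not from the presentation):

```python
import numpy as np

def to_dcf_category(cloud_pct):
    """Map total cloud amounts (0-100%) to DCF coded values."""
    pct = np.asarray(cloud_pct)
    edges = [0, 20, 40, 60, 80]              # upper edges of the first five bins
    coded = np.array([0, 13, 33, 53, 73, 93])  # DCF coded value per bin
    idx = np.digitize(pct, edges, right=True)
    return coded[idx]

print(to_dcf_category([0, 5, 20, 21, 55, 80, 81, 100]))
# -> [ 0 13 13 33 53 73 93 93]
```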

Ground Truth: WWMCA
- WWMCA = World Wide Merged Cloud Analysis
- Run hourly, Northern and Southern Hemispheres
- Total cloud (resolution to the nearest 1%), cloud base and top heights
- 16th mesh grid (~788,000 usable points)

Ground Truth: WWMCA
[Flow diagram] Inputs: geostationary data; polar-orbiting data; NOGAPS upper-atmosphere temperatures; surface observations; snow analysis (resolution 25 nm; obs: surface, SSM/I; frequency: daily, 12Z); surface temperature analysis (resolution 25 nm; obs: IR imagery, SSM/I temperature; frequency: 3-hourly). Output: the World-Wide Merged Cloud Analysis (WWMCA), an hourly, global, real-time cloud analysis at 12.5 nm.
Total cloud and layer cloud data support the national intelligence community, cloud forecast models, and global soil temperature and moisture analyses.

WWMCA Components
- Geostationary satellites
- Polar-orbiting satellites
- Surface temperature analysis
- Snow depth analysis
- Upper-air temperature data
- Surface observations
- Manual QC

A Perfect WWMCA
- All satellites functioning properly
- No problems with satellite data transmission
- All satellite data received at AFWA correctly and on time
- Problem-free satellite data conversion
- Availability of specialized analyses
- Correct decision process (e.g., snow vs. cloud)
- Error-free observational data
- Correct manual QC

WWMCA Timeliness
- Hemispheric analyses are not snapshots!
- Age limits are applied: no data older than 120 minutes are used in verification.
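A minimal sketch of such an age filter (names and signature are illustrative, not AFWA's code):

```python
from datetime import datetime, timedelta

MAX_AGE = timedelta(minutes=120)  # WWMCA age limit for verification

def usable(obs_time: datetime, analysis_time: datetime) -> bool:
    """Keep only observations no older than 120 minutes at analysis time."""
    return timedelta(0) <= analysis_time - obs_time <= MAX_AGE

analysis = datetime(2009, 6, 30, 0, 0)
print(usable(datetime(2009, 6, 29, 22, 30), analysis))  # True  (90 min old)
print(usable(datetime(2009, 6, 29, 21, 30), analysis))  # False (150 min old)
```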

WWMCA Data Counts
[Chart: percent data availability (0-100%) by run date (YYYYMMDDCC)]
On average, 82% of WWMCA global data points are usable (~1.29 million data points per run).

Verification Technique
- Determine model-observation pairs: ADVCLD and SCFM are already co-located with WWMCA ground-truth data points; DCF points depend on the domain's map projection.
- When ADVCLD or SCFM is compared to DCF, nearest-neighbor interpolation maps ADVCLD, SCFM, and WWMCA to the DCF domain (a sketch follows below).
- WWMCA is reduced to the six DCF categories when compared to DCF.
- Data counts for total cloud contingency-table categories (6 for DCF, 101 for ADVCLD and SCFM) are archived for long-term statistics calculations.
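A minimal nearest-neighbor remapping sketch, assuming flattened regular-grid coordinates and scipy's KD-tree (names are illustrative; a real remap would account for map projection and great-circle distance rather than planar lat/lon distance):

```python
import numpy as np
from scipy.spatial import cKDTree

def nearest_neighbor_remap(src_lat, src_lon, src_vals, dst_lat, dst_lon):
    """Assign each destination grid point the value of the nearest source point.

    All inputs are 1-D arrays of flattened grid coordinates/values.
    """
    # Build a KD-tree on the source points, then query it with the
    # destination points; planar lat/lon distance is a simplification.
    tree = cKDTree(np.column_stack([src_lat, src_lon]))
    _, idx = tree.query(np.column_stack([dst_lat, dst_lon]))
    return src_vals[idx]
```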

Cloud Verification Statistics
- Root Mean Square Error
- Mean Absolute Deviation
- Forecast Bias
- 20-20 Index

20-20 Index
- Percent of model-observation data points with an error of 20% or less.
- For each of the n forecast pairs, with forecast $f_i$ and observation $o_i$ expressed as percentages from 0 to 100:

  $$\text{20-20 Index} = \frac{1}{n}\sum_{i=1}^{n}\mathbf{1}\!\left[\,\lvert f_i - o_i\rvert \le 20\,\right]$$

- 1 is best, 0 is worst.
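A minimal sketch of the computation (names are mine):

```python
import numpy as np

def twenty_twenty_index(forecast, observed):
    """Fraction of pairs whose absolute error is 20 percentage points or less.

    Inputs are cloud amounts in percent (0-100); 1 is best, 0 is worst.
    """
    f = np.asarray(forecast, dtype=float)
    o = np.asarray(observed, dtype=float)
    return np.mean(np.abs(f - o) <= 20.0)

print(twenty_twenty_index([0, 50, 93, 100], [13, 80, 100, 95]))  # 0.75
```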

Domain-Wide Statistics
[Chart: June 2009 DCF RMSE - average RMSE (axis 40-50) by forecast hour (6-48)]

Domain-Wide Statistics
[Chart: June 2009 DCF MAE - average MAE (axis 40-50) by forecast hour (6-48)]

Domain-Wide Statistics
[Chart: June 2009 DCF bias - average bias (axis 0 to -4) by forecast hour (6-48)]

Domain-Wide Statistics
[Chart: June 2009 DCF 20-20 Index - average 20-20 index (axis 0.85-0.90) by forecast hour (6-48)]

Sample WWMCA Distribution
[Histogram: count (0-25,000) by total cloud percentage (0-100%), June 30, 00Z, both hemispheres combined]
Almost 70% of the data points are 0% or 100%; this is a typical amount.

Sample Contingency Tables
- 24-hour total cloud forecasts
- CONUS domain, 18Z model run
- 30-day totals: June 1-30, 2009
- 6 DCF cloud categories = 6x6 table

June 2009 Total Cloud: DCF (forecast) rows vs. WWMCA (observation) columns

DCF \ WWMCA        0%      13%      33%      53%      73%      93%      Total
0%            336,315   84,209   35,369   26,641   22,889   92,600    598,023
13%            46,839   31,483   17,711   14,392   13,511   57,593    181,529
33%            36,207   15,335    8,842    7,484    7,471   43,730    119,069
53%            35,750   14,314    7,806    6,500    6,047   30,874    101,291
73%            19,769   10,312    7,249    6,236    6,692   42,949     93,207
93%            75,845   49,645   36,367   37,284   43,644  476,603    719,388
Total         550,725  205,298  113,344   98,537  100,254  744,349  1,812,507

Hit Rate = 0.478, HSS = 0.270
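A minimal sketch of both scores for an arbitrary square contingency table (function names are mine); run on the table above, it reproduces Hit Rate ≈ 0.478 and HSS ≈ 0.270:

```python
import numpy as np

def hit_rate(table):
    """Proportion correct: diagonal counts over total counts."""
    t = np.asarray(table, dtype=float)
    return np.trace(t) / t.sum()

def heidke_skill_score(table):
    """HSS: accuracy relative to the accuracy expected by chance."""
    t = np.asarray(table, dtype=float)
    n = t.sum()
    observed = np.trace(t) / n
    # Chance agreement from the marginal (row x column) totals.
    expected = (t.sum(axis=1) @ t.sum(axis=0)) / n**2
    return (observed - expected) / (1.0 - expected)

# Rows = DCF forecast categories, columns = WWMCA observed categories.
table = np.array([
    [336315,  84209, 35369, 26641, 22889,  92600],
    [ 46839,  31483, 17711, 14392, 13511,  57593],
    [ 36207,  15335,  8842,  7484,  7471,  43730],
    [ 35750,  14314,  7806,  6500,  6047,  30874],
    [ 19769,  10312,  7249,  6236,  6692,  42949],
    [ 75845,  49645, 36367, 37284, 43644, 476603],
])
print(round(hit_rate(table), 3), round(heidke_skill_score(table), 3))  # 0.478 0.27
```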

June 2009 Total Cloud
Let's simplify the 6x6 table above to a 2x2 contingency table: cloud vs. no cloud.

2x2: Cloud vs. No Cloud

DCF \ WWMCA        0%    Non-Zero
0%            336,315     261,708
Non-Zero      214,410   1,000,074

Hit Rate = 0.737 (was 0.478 for the 6x6 table)
HSS = 0.394 (was 0.270 for the 6x6 table)
POD = 0.611, FAR = 0.438, CSI = 0.414
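The quoted POD, FAR, and CSI are consistent with treating the 0% (clear) category as the event. A minimal sketch (names are mine):

```python
def pod_far_csi(hits, false_alarms, misses):
    """Standard 2x2 scores for a chosen event category.

    Here the event is "clear" (0% cloud): hits = forecast 0% and observed 0%,
    false alarms = forecast 0% but observed non-zero, misses = the reverse.
    """
    pod = hits / (hits + misses)                  # probability of detection
    far = false_alarms / (hits + false_alarms)    # false alarm ratio
    csi = hits / (hits + false_alarms + misses)   # critical success index
    return pod, far, csi

# Counts from the 2x2 table above, with 0% (clear) as the event:
print([round(x, 3) for x in pod_far_csi(336315, 261708, 214410)])
# -> [0.611, 0.438, 0.414]
```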

Using MET MODE
- MET = Model Evaluation Tools
- MODE = Method for Object-Based Diagnostic Evaluation
- How does MODE perform with cloud forecasts?

MET MODE Example: Total Cloud Cover
- Sample event: July 15, 2009, 06Z model run, 6-hour forecast
- 15 km CONUS DCF vs. 16th mesh WWMCA (~24 km)
- WWMCA is re-mapped to exactly match the DCF domain for use in MODE

Resolving Objects: Threshold
- DCF is already limited to 6 categories.
- Non-zero cloud amounts are dominated by 100% cases.
- All 100% cases are coded as 93% in DCF.
- The threshold is therefore the 93% DCF category (81-100% cloud).
- Used ge81.0 for both the raw forecast and observation values in the configuration file.

[Image: DCF total cloud forecast]

[Image: WWMCA ground truth]

[Image: IR satellite image]

[Image: WWMCA ground truth, annotated with a satellite pass boundary and a possible terrain artifact]

[Image: WWMCA objects, area > 0 gs (default)]

MODE Defaults
- Area threshold for objects: 0 grid squares (gs)
- Convolution radius: 4 grid units (gu)
- Is there any benefit to changing these?
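MODE resolves objects by smoothing the raw field with a circular convolution filter, thresholding the smoothed field, and optionally discarding small objects. A minimal sketch of that pipeline, assuming scipy is available (parameter names are illustrative, not MET's configuration keys):

```python
import numpy as np
from scipy import ndimage

def resolve_objects(field, conv_radius=4, thresh=81.0, min_area=0):
    """MODE-style object identification: convolve, threshold, filter by area.

    field: 2-D array of total cloud amounts (0-100%).
    conv_radius: radius of the circular smoothing filter, in grid units.
    min_area: discard objects smaller than this many grid squares.
    """
    # Circular convolution kernel of the given radius, normalized to sum to 1.
    y, x = np.ogrid[-conv_radius:conv_radius + 1, -conv_radius:conv_radius + 1]
    kernel = (x**2 + y**2 <= conv_radius**2).astype(float)
    kernel /= kernel.sum()

    smoothed = ndimage.convolve(field.astype(float), kernel, mode="nearest")
    mask = smoothed >= thresh            # e.g., the ge81.0 threshold
    labels, n = ndimage.label(mask)      # connected components become objects

    # Drop objects below the area threshold.
    for i in range(1, n + 1):
        if np.sum(labels == i) < min_area:
            labels[labels == i] = 0
    return labels
```

Shrinking the convolution radius or raising the area threshold trades sensitivity to small, noisy cloud features against a cleaner set of large objects, which is exactly the tuning question the slide poses.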

[Image: WWMCA objects, area > 50 gs]

[Image: WWMCA objects, area > 100 gs]

[Image: convolution radius = 4 gu (default), area threshold set to 50 gs]

[Image: convolution radius = 2 gu]

[Image: convolution radius = 1 gu]

[Images: MODE summary plots (using defaults)]

Diagnosing DCF Performance
- How is MODE best used for cloud model verification?
- Domain-wide summaries? These tend to be dominated by large objects, and noisy WWMCA adds to the challenge.
- Geographic subregions? Persistent objects (e.g., coastal stratus) may be a better target.