Cloud verification: a review of methodologies and recent developments. Anna Ghelli, ECMWF. Thanks to: Maike Ahlgrimm, Martin Kohler, Richard Forbes. Slide 1
Outline: Cloud properties; Data availability; NWP model fields and observations: the matching game; Standard scores and new ideas; Example plots; Active satellite profiling: new challenges; Conclusions. Slide 2
Stratocumulus lenticularis, Los Lagos, Chile (photo: Bernhard Mühr, www.wolkenatlas.de). Slide 3
Macrophysical properties: cloud base height, cloud fraction, total cloud cover, cloud top height. Photo: stratocumulus stratiformis translucidus, Germany (Bernhard Mühr, www.wolkenatlas.de). Slide 4
Microphysical properties: cloud ice water content, cloud liquid water content, cloud droplet size, liquid water path, radar reflectivity, optical depth. Photo: stratocumulus stratiformis opacus cumulogenitus, Yellowstone, USA (Bernhard Mühr, www.wolkenatlas.de). Slide 5
What does the model produce? Total cloud cover; high, medium and low clouds; temperature, humidity and cloudiness --> can be transformed into brightness temperature; cloud fraction, liquid water content and ice water content (on model levels). Slide 6
Observations -- what is available? Conventional observing systems: SYNOPs, RADAR, LIDAR. Satellite data: geostationary, polar orbiting, active satellite profiling. Slide 7
Observations. Conventional observing systems: sparse and inhomogeneous coverage; decreasing in number; differences between manual and automated stations. But: the data volumes are manageable; data are available at synoptic times and in real time; they measure the weather. Satellite data: large data volumes; need location and time matching; thinning algorithms are needed; may not be available in real time. But: wide spatial coverage; high spatial and temporal resolution; weather phenomena like fog can be assessed, which is not possible with conventional data. Slide 8
The matching game. We need to address the mismatch in spatial scales between model and observations (~1 km), and the mismatch in time scales. Approaches: (1) Obs to model --> average the observations to the model's representative spatial scale (e.g. CloudSat cloudy/cloud-free profiles averaged onto the model gridscale and compared with the model gridbox cloud fraction). (2) Model to obs --> statistically represent model sub-gridscale variability using a Monte-Carlo multi-independent-column approach: generate sub-columns from the model gridbox cloud fraction and compare with the CloudSat observations. Richard Forbes, ECMWF. Slide 9
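The model-to-obs approach can be illustrated with a simplified sub-column generator (not the operational ECMWF code; the function name and the maximum-random overlap choice are illustrative assumptions): given a profile of gridbox cloud fractions, binary cloudy/clear sub-columns are drawn so that vertically contiguous cloud layers overlap maximally, while layers separated by clear air overlap randomly.

```python
import numpy as np

def generate_subcolumns(cloud_fraction, n_sub=50, seed=0):
    """Generate binary cloudy/clear sub-columns from a profile of
    gridbox cloud fractions (ordered top to bottom).

    Simplified maximum-random overlap: each sub-column carries a
    uniform rank; a level is cloudy where rank < cloud_fraction, so
    contiguous cloudy layers overlap maximally.  The rank is re-drawn
    below any completely clear layer, giving random overlap across
    cloud-free gaps.  Illustrative sketch only.
    """
    rng = np.random.default_rng(seed)
    n_lev = len(cloud_fraction)
    sub = np.zeros((n_lev, n_sub), dtype=bool)
    rank = rng.random(n_sub)
    for k in range(n_lev):
        if k > 0 and cloud_fraction[k - 1] == 0.0:
            rank = rng.random(n_sub)      # random overlap across gaps
        sub[k] = rank < cloud_fraction[k]  # maximum overlap otherwise
    return sub
```

Each sub-column can then be treated like a single CloudSat profile, so model and observations are compared in the same (sub-gridscale) space.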
The matching game: VOCALS field experiment off Chile. GOES-12 10.8 µm vs. ECMWF 10.8 µm. Slide 10
The matching game: SYNOPs. Slide 11
Desirable properties of a score: Equitable --> a random forecast scores zero. Difficult to hedge --> does not reward under- or over-prediction. Independence of the frequency of occurrence --> can be used for rare events. Dependence on forecast bias --> bias may influence the score. ...and more? Slide 12
Scores and their properties. Slide 13
Continuous scores:
- MAE: perfect score = 0. MAESS (Mean Absolute Error Skill Score): range -∞ to 1, perfect score = 1, no-skill level = 0
- Bias (forecast - observation): perfect score = 0
- Fractions Skill Score: range 0 to 1, perfect score = 1
Contingency-table-based scores:
- Heidke Skill Score: range -∞ to 1, perfect score = 1, no-skill level = 0
- Equitable Threat Score: range -1/3 to 1, perfect score = 1, no-skill level = 0
- Odds Ratio: range 0 to ∞, perfect score = ∞, no-skill level = 1
- Log Odds Ratio: range -∞ to ∞, perfect score = ∞, no-skill level = 0
- Extreme Dependency Score and Symmetric Extreme Dependency Score (Hogan et al., QJ 2009): range -1 to 1, perfect score = 1
Symmetric EDS. EDS is easy to hedge by predicting the event all the time, and EDS is not equitable. SEDS = { ln[(a+b)/n] + ln[(a+c)/n] } / ln(a/n) - 1. Hogan et al., QJ 2009. Slide 14
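The contingency-table scores above are straightforward to compute from the counts of hits (a), false alarms (b), misses (c) and correct negatives (d). A minimal sketch (the function name is illustrative, not from the slides):

```python
import math

def contingency_scores(a, b, c, d):
    """Equitable Threat Score, Heidke Skill Score and SEDS from a
    2x2 contingency table: a = hits, b = false alarms, c = misses,
    d = correct negatives."""
    n = a + b + c + d
    a_random = (a + b) * (a + c) / n  # hits expected by chance
    ets = (a - a_random) / (a + b + c - a_random)
    hss = 2 * (a * d - b * c) / ((a + c) * (c + d) + (a + b) * (b + d))
    # SEDS (Hogan et al., QJ 2009), as on the slide above
    seds = (math.log((a + b) / n) + math.log((a + c) / n)) / math.log(a / n) - 1
    return ets, hss, seds
```

A perfect forecast (b = c = 0) gives 1 for all three scores, and a random forecast gives 0, matching the ranges listed on the previous slide.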
Timeseries of MAESS for total cloud cover (reference: persistence), Europe. Slide 15
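The MAESS used here compares the forecast's mean absolute error against that of a reference forecast (persistence in this plot). A minimal sketch, with an illustrative function name:

```python
import numpy as np

def maess(forecast, observation, reference):
    """Mean Absolute Error Skill Score: 1 - MAE_fc / MAE_ref.
    Perfect score = 1; 0 means no skill relative to the reference
    (e.g. persistence); negative means worse than the reference."""
    mae_fc = np.mean(np.abs(forecast - observation))
    mae_ref = np.mean(np.abs(reference - observation))
    return 1.0 - mae_fc / mae_ref
```

With persistence as reference, `reference` is simply the previous observed total cloud cover valid at the same time of day.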
Timeseries of bias and standard deviation. Slide 16
Time series of ETS. Total cloud cover: model vs. SYNOP. Slide 17
Model against SYNOPs: percentage correct and Heidke Skill Score as a function of forecast range (6 to 48 hours) for Canada and Finland (PPM, umosbin, umos). Slide 18
FC bias -- winter. Total cloud cover: 36-h forecast versus SYNOP observations (high-pressure days over central Europe). Slide 19
Observation to model: trade cumulus clouds. Low clouds over the ocean have a large radiative impact; low cloud fraction, but ubiquitous. Maike Ahlgrimm, ECMWF. Slide 20
Observation to model. Can the model accurately predict this type of cloud? Probably not! Cloud characteristics: ubiquitous, relatively small scale. Verification strategy: relax the time and space constraints, i.e. do not ask the model to forecast the cloud at the exact location and time. The new verification question is: given an area and a period of time, what is the frequency of occurrence of the event in the forecast and in the observations? Slide 21
TCu frequency of occurrence (CALIPSO). The model has TCu more frequently than observed (66% vs. 47%). Maike Ahlgrimm, ECMWF. Slide 22
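The relaxed verification question reduces to comparing two occurrence frequencies over the chosen area and period, rather than matching events point by point. A minimal sketch (function names are illustrative; the event flags would come from e.g. a trade-cumulus detection applied to model profiles and CALIPSO profiles):

```python
import numpy as np

def occurrence_frequency(event_flags):
    """Fraction of samples (grid points / profiles within the chosen
    area and time period) in which the event occurs."""
    return np.asarray(event_flags, dtype=bool).mean()

def frequency_ratio(fc_flags, obs_flags):
    """Forecast-to-observed occurrence frequency ratio:
    > 1 means the model produces the event too often."""
    return occurrence_frequency(fc_flags) / occurrence_frequency(obs_flags)
```

For the TCu case above (66% in the model vs. 47% observed), the ratio would be about 1.4, i.e. the model over-produces trade cumulus.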
Observation to model: cloud top height, observations vs. ERA-Interim. Model clouds have higher cloud tops than observed. Maike Ahlgrimm, ECMWF. Slide 23
Observation to model: ice water content. Example cross-section (variational method: Delanoë and Hogan, 2009), 26/02/2007 15Z, from Greenland to Antarctica crossing the Equator. Model ice water content (excluding precipitating snow) compared with ice water content derived from a 1D-Var retrieval of CloudSat/CALIPSO/Aqua data. Units: log10 kg m-3. Slide 24
Observation to model: ice water content (CloudSat/CALIPSO). Richard Forbes (ECMWF) in collaboration with Delanoë and Hogan (Univ. of Reading, UK). Slide 25
Model to observation. Radar reflectivity: cross-section through a mid-latitude front. Richard Forbes, ECMWF. Slide 26
Model to observation. Statistics: frequency of occurrence (radar reflectivity vs. height), tropical ocean 30S to 30N, February 2007. Significantly higher occurrence of cloud in the model. Richard Forbes, ECMWF. Slide 27
Conclusions... or our challenges. Observing systems and data management issues. Model and observations: the matching game. 2D verification of clouds is well established; 3D evaluation of cloud properties is now possible with active satellite profiling. Users --> involve them in any verification process. Slide 28