An update on the WGNE/WGCM Climate Model Metrics Panel


Panel members, selected for relevant and diverse experience and potential to liaise with key WCRP activities:
- Beth Ebert (BMRC): JWGV/WWRP, WMO forecast metrics
- Veronika Eyring (DLR, Germany): WGCM/SPARC, CCMI, ESMs
- Pierre Friedlingstein (U. Exeter): IGBP, carbon cycle, ESMs
- Peter Gleckler (PCMDI), chair: WGNE, atmosphere, ocean
- Simon Marsland (CSIRO): WGOMD, ocean
- Robert Pincus (NOAA): GEWEX/GCSS, clouds/radiation
- Karl Taylor (PCMDI): WGCM, CMIP5, atmosphere
- Helene Hewitt (U.K. Met Office): polar ocean and sea-ice

Metrics panel terms of reference (working version)
- Identify a limited but diverse set of climate model performance metrics based on comparison with observations:
  - well established in the literature, and preferably in widespread use
  - easy to calculate, reproduce, and interpret, and fairly robust
  - covering a diverse suite of climate characteristics: large- to global-scale mean climate and some variability; atmosphere, ocean, land surface, and sea-ice
- Coordinate with other WCRP/CLIVAR working groups to identify metrics for more focused evaluation (e.g., variability modes, process level), striving towards a community-based activity by coalescing expertise
- Justify and promote these basic metrics in an attempt to establish routine performance benchmarks and facilitate further research into increasingly targeted metrics

What has happened since the last WGCM meeting?
- Panel efforts were hampered by four members being preoccupied with preparation of the AR5
- A good deal of early CMIP5 analysis has been accomplished with performance metrics (a few examples follow)
- The panel's wiki was made public in April 2012
- The research community is clearly stimulated by the subject, as evidenced by early CMIP5 research

First steps towards identifying routine metrics

Basic mean state and annual cycle:
- Large- to global-scale evaluation (global, tropics, NH/SH extra-tropics)
- 20-year climatologies: annual and seasonal means
- Routine metrics: bias, centered RMSE, MAE, correlation, standard deviation (see the sketch below)
- Field examples: OLR, T850, precipitation, SST, SSH, sea-ice extent
- Observations: multiple datasets for most cases

Towards an extended set of metrics, coordinating with other working groups (in progress):
- ENSO (CLIVAR Pacific Panel)
- Monsoons (CLIVAR AAMP)
- MJO (YOTC Task Force)
- CFMIP committee
- WGOMD
- Carbon cycle in emission-driven ESMs (ILAMB)
- Chemistry-climate (CCMVal, CCMI)
- ...
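The following Python sketch illustrates the routine metrics listed above; it is a minimal example, not the panel's code, and assumes hypothetical model and observed climatologies on a common latitude-longitude grid with cosine-latitude area weighting.

import numpy as np

def routine_metrics(model, obs, lat):
    """Area-weighted scalar metrics for a model field against observations."""
    w = np.cos(np.deg2rad(lat))[:, None] * np.ones_like(model)  # latitude weights
    w = w / w.sum()

    def wmean(x):
        return np.sum(w * x)

    bias = wmean(model - obs)
    mae = wmean(np.abs(model - obs))
    # Centered RMSE: remove each field's weighted mean before differencing
    m_anom = model - wmean(model)
    o_anom = obs - wmean(obs)
    crmse = np.sqrt(wmean((m_anom - o_anom) ** 2))
    std_model = np.sqrt(wmean(m_anom ** 2))
    std_obs = np.sqrt(wmean(o_anom ** 2))
    corr = wmean(m_anom * o_anom) / (std_model * std_obs)
    return {"bias": bias, "mae": mae, "centered_rmse": crmse,
            "corr": corr, "std_model": std_model, "std_obs": std_obs}

# Example with synthetic fields on a hypothetical 2.5-degree grid
lat = np.arange(-88.75, 90, 2.5)
lon = np.arange(0, 360, 2.5)
obs = 240 + 60 * np.cos(np.deg2rad(lat))[:, None] * np.ones((lat.size, lon.size))
model = obs + np.random.default_rng(0).normal(0, 5, obs.shape)
print(routine_metrics(model, obs, lat))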

Revisiting the Gleckler et al. (2008) portrait plot with CMIP5
- Relative space-time global RMSE in the climatological annual cycle (see the sketch below)
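As a rough illustration of the quantity shown in such a portrait plot, the sketch below takes a hypothetical precomputed (model x field) RMSE matrix and expresses each model's error as the fractional departure from the median error across models for that field, in the spirit of Gleckler et al. (2008); it is not their code.

import numpy as np

def relative_errors(rmse):
    """Normalize a (n_models, n_fields) RMSE matrix by the per-field median."""
    median = np.median(rmse, axis=0)    # typical error for each field
    return (rmse - median) / median     # e.g. +0.2 means 20% worse than typical

# Toy example: 5 models, 3 fields (say OLR, T850, precipitation)
rng = np.random.default_rng(1)
rmse = rng.uniform(0.5, 2.0, size=(5, 3))
print(np.round(relative_errors(rmse), 2))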

Examining redundancies in mean state metrics (Yokoi et al., 2011, J. Appl. Meteor. Climatol.)
- Metrics similar to those of previous studies (e.g., Murphy et al. 2004; Gleckler et al. 2008; Pincus et al. 2008)
- Compares results from two cluster analysis methods (an illustrative sketch follows)
- The methods yield similar results: ~7 clusters, with a mix of mean-bias and centered-RMSE metrics
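The sketch below is an illustrative stand-in for this kind of redundancy analysis, not the procedure of Yokoi et al. (2011): it hierarchically clusters a hypothetical standardized (metric x model) score matrix using a correlation-based distance, so metrics that rank models similarly fall into the same cluster.

import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(2)
# Hypothetical (n_metrics x n_models) matrix of standardized metric values
metrics = rng.normal(size=(12, 20))

# Distance between metrics = 1 - correlation of their scores across models
dist = pdist(metrics, metric="correlation")
tree = linkage(dist, method="average")
clusters = fcluster(tree, t=7, criterion="maxclust")  # cut into ~7 clusters
print(clusters)  # cluster label for each of the 12 metrics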

Summarizing mean climate performance (Nishii et al., 2012, JAMS)
- An index based on total atmospheric energy yields results similar to other, more comprehensive measures (e.g., the CPI)
- At this stage the panel is not advocating overall skill scores, but there is now evidence that, at some level, results are robust to how such indices are constructed (a sketch of one possible construction follows)
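As a hedged illustration of what such an aggregate index might look like (one possible construction, not one endorsed by the panel), the sketch below averages per-field RMSEs after normalizing each by a reference error, here a hypothetical multi-model median, so fields with different units can be combined.

import numpy as np

def aggregate_index(rmse_by_field, reference_by_field):
    """Mean of RMSEs normalized by a per-field reference error."""
    ratios = [rmse_by_field[f] / reference_by_field[f] for f in rmse_by_field]
    return float(np.mean(ratios))

rmse = {"OLR": 6.1, "T850": 1.4, "precip": 1.1}        # hypothetical single-model errors
reference = {"OLR": 5.8, "T850": 1.5, "precip": 1.0}   # e.g. multi-model median errors
print(aggregate_index(rmse, reference))  # values below 1 beat the reference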

Cloud-related metrics? Some examples:
- Pincus et al. (2008): CMIP3 cloud evaluation, low-order error measures, no observational proxy
- Williams and Webb (2009): CFMIP2, evaluation using the ISCCP proxy against observed canonical subsets
- Jiang et al. (2012) and Li et al. (2012): CMIP5 LWP and IWP using A-Train observations, no observational proxy
- Klein et al. (submitted): CFMIP1 + CFMIP2, evaluation of cloud-radiative impact using the ISCCP proxy
Bottom line: this active area of research makes it difficult at this stage to identify metrics that meet the panel's criteria

Some scratch slides.

Priorities for the panel during the coming year
- Strengthen the wiki so that it becomes recognized as a useful resource
- Provide all modeling groups with a database/code of standard metrics results from all CMIP3/5 models; this will enable interested groups to incorporate into their development process an ability to examine how their model compares to others
- Prepare a manuscript synthesizing metrics panel results for CMIP3 and CMIP5
- Advance the concept of a repository for metrics/analysis codes
- Consider a workshop dedicated to performance metrics, 6-18 months after the March 2013 WGNE systematic errors workshop?