The Beijing 2008 FDP/RDP project


1st meeting of the WWRP WG MWF, 10 October 2007, Dubrovnik. The Beijing 2008 FDP/RDP project. Kazuo Saito, Meteorological Research Institute / Japan Meteorological Agency (ksaito@mri-jma.go.jp). Presentation thanks to: Yihong Duan (NMC/CMA), Jiandong Gong (NMC/CMA), Yinglin Li (NMC/CMA), Li Li (NMC/CMA), Hiromu Seko (MRI/JMA), Bill Kuo (NCAR), and the ISSC of B08FDP/RDP.

1. What is the Beijing 2008 FDP/RDP (B08FDP/RDP)? The WWRP Beijing 2008 FDP/RDP is a WWRP short-range weather forecasting research project conducted in conjunction with the Beijing Olympic Games of August 2008. It was approved as a WWRP research project succeeding the Sydney 2000 FDP. The project is divided into two components: the FDP component, a forecast demonstration for FT = 0-6 hours based on nowcasting, and the RDP component, research and development for FT = 6-36 hours based on mesoscale ensemble prediction.

Beijing Olympic Games 2008 Aug 8-24

Climate at Beijing (Aug 8-24, 1980-2001):
Mean temperature (°C): 25.2
Mean maximum temperature (°C): 29.9
Extreme maximum temperature (°C): 35.7
Mean minimum temperature (°C): 21.1
Extreme minimum temperature (°C): 14.3
Mean RH (%): 75
Mean wind (m/s): 1.8
Precipitation (%): 49.2
Thunderstorms (%): 25.9

Background: preparation of Olympic weather services. The most important forecasts include: hourly forecasts for 0-24 hours (T, RH, P, rainfall, wind and wind gust); severe weather warnings (heavy rain, lightning, strong wind, hail, heat waves); and specific services for sports/events (WBGT, sand temperature).

Observation network (2004) Observation network (2008)

Goals of the RDP Project:
To improve understanding of high-resolution, very-short-range probabilistic prediction processes through numerical experimentation
To share experience in the development of real-time MEP systems
To study and develop adequate methods for assessing the capability and forecast skill of MEP systems
To demonstrate how an MEP system can improve forecast quality compared with a deterministic run and/or a global EPS
To train forecasters in applying ensemble forecasting products and to support a better meteorological service for the 2008 Olympic Games
To set up a shareable database for future research in the community

Expected Outcomes:
Improved understanding of the dynamics, physics and predictability of high-impact weather (HIW) over the Beijing area, benefiting Beijing HIW forecasts in 2008
New knowledge on high-resolution ensemble prediction, and experience in developing an operational MEP system
Intercomparison of the MEP systems among participants, providing good opportunities for the participating nations to learn from each other and to find clues for further improvement of their MEP systems
Experience in training forecasters to use EP products efficiently, and in public outreach so that the public understands probability forecasting

1st B08 Workshop, March 2005, Beijing.
FDP: China (CMA, CAMS), US (NSSL), Canada (AES), UK (MO), Hong Kong (HKO), Australia (BMRC).
RDP: China (NMC/CMA, CAMS/CMA), US (NCEP, NCAR), Canada (MSC), UK (MO), Japan (MRI/JMA).
UK (MO) withdrew after the 1st workshop; Austria (ZAMG) joined from the 2nd workshop (collaborating with Météo-France).

Management Organization Structure:
WMO WWRP Science Steering Committee (WWRP SSC), with a member / B08 project administrator from CMA
International Science Steering Committee (B08RDP ISSC) and Technical Support Team (B08RDP ITeST)
International Science Steering Committee (B08FDP ISSC) and Data Management, Display & Network Team
Participants #1-#4, each with its own LSSC & LTeST

International Science Steering Committee (ISSC): Geoff DiMego (NCEP), Bill Kuo (NCAR), Martin Charron (MSC), Kazuo Saito (JMA), Yihong Duan (CMA), Lawrence Wilson (MSC), Yong Wang (ZAMG / Météo-France; position changed from François Bouttier to Yong Wang), Jiandong Gong (secretary of the ISSC).

International Technical Support Team (ITeST): Jun Du (NCEP), Wei Wang (NCAR), Hiromu Seko (JMA), Dehui Chen and Jiming Li (CMA), Amin Erfani (MSC), Yong Wang (Météo-France and ZAMG), Jiandong Gong (team leader of the B08RDP ITeST).

Time Schedule of B08RDP:
2004: WWRP team visit to Beijing; WWRP approves the B08FDP/RDP proposal; implementation begins.
2005: 1st B08RDP workshop.
2006: Initial setup of the mesoscale ensemble systems; technique and system development; relevant research; preliminary running and data-transfer tests; 2nd B08RDP workshop and basic training; pre-test of data transfer; product archiving and verification; continued work on the setup of the meso EPS.
2007: Modification of the systems; preparation for the test; quasi-real-time running and data-transfer test; 3rd B08RDP workshop and forecaster training; product archiving and verification; report to WWRP.
2008: Further modification; preparation for the test; quasi-operational running; full verification.
2009: 4th B08RDP international workshop.

General Requirements on Configuration of the B08RDP MEPS: fine domain of 3500 km × 1320 km / 3000 km × 1100 km. Tier 1: 15 km mesoscale ensemble up to 36 hours. Tier 2: 2-3 km CRM experiments (case studies).

Tier 1 MEP systems (2007):
NCEP*: WRF-NMM and WRF-ARW; IC: NCEP Global 3DVAR; IC perturbation: breeding; LBC: global EPS.
MRI/JMA: JMA-NHM; IC: JMA Regional 4DVAR; IC perturbation: targeted global SV; LBC: JMA regional forecast.
MSC: GEM; IC: MSC Global 4DVAR; IC perturbation: targeted global SV; LBC: MSC global EPS.
ZAMG & Météo-France: ALADIN; IC: ECMWF Global 4DVAR; IC perturbation: ECMWF global SV; LBC: ECMWF global EPS.
NMC/CMA: WRF-ARW; IC: WRF 3DVAR; IC perturbation: breeding; LBC: global EPS.
CAMS/CMA: GRAPES; IC: GRAPES 3DVAR; IC perturbation: breeding; LBC: global EPS.
*The NCEP EP system is as of the 2006 experiment; NCEP submitted results from the global EPS in the 2007 experiment.

Data flow of B08RDP between CMA and the participating centres [flow chart]: observations from the CMA database are formatted, compressed and sent to each participant's ftp server (NCEP, MRI, MSC, ZAMG) and to the NMC and CAMS MEP systems at CMA; each centre decompresses the observations, runs its MEP system, and formats and compresses the MEP data, which are returned to the CMA ftp server; the CMA database then feeds MEP products to the BMB database and display, where the MEP data are decompressed and used for the multi-center super-EP, MEP verification and forecaster training. The data flow between CMA and the participating countries has been set up completely.

Website www.b08rdp.org

Website display: individual layout and combinational layout.

Progress of participating systems: NCEP. Two models (NCEP NMM and NCAR ARW), 10 members. Model bias analysis and offline bias correction.

Progress of participating systems: JMA. Upgraded the forecast model to the latest version of the JMA non-hydrostatic model (JMA-NHM) and applied new physics. Tested four different initial perturbation methods (global and mesoscale SV methods, downscaling of the JMA operational one-week EPS, and a mesoscale BGM method). Perturbed the initial soil temperatures.

Progress of participating systems: MSC. GEM-LAM model, 16 members. Tested initial-condition perturbations from targeted singular vectors. Obtained perturbed lateral boundary conditions from a 16-member global GEM ensemble.

Progress of participating systems: ZAMG and Météo-France. Uses the hydrostatic version of the limited-area model ALADIN; dynamical downscaling of ECMWF EPS members (16 out of 50) gives an 18-member ensemble. Analysed the 2 m temperature and relative humidity biases, tested an alternative initialization of surface and soil parameters, and reduced the bias significantly.

Progress of participating systems: CMA (GRAPES_MESO based). Uses the Global/Regional Assimilation and PrEdiction System (GRAPES), 9 members. Initial perturbations from the breeding method; lateral boundary conditions from the global EPS; physics uncertainty represented through microphysics, cumulus and PBL schemes.

Progress of participating systems: CMA (WRF based). Updated the WRF-ARW model from V1.0 to V2.2. Compared initial perturbations from the global EPS with the regional breeding method. Lateral boundary conditions from the global EPS. Physics uncertainty represented through cumulus convective parameterization and microphysics. Bias analysis and correction for most model output variables.

Precipitation probability BS (BSS) at 12 UTC, averaged for 20070724-20070831 [figure: BS (>0.1 mm), BSS (>0.1 mm) and BSS (>6.0 mm) versus forecast hour (6-36 h) for MRI/JMA, MSC, NMC/CMA, ZAMG, CAMS/CMA and NCEP]. After Y. Li (2007).
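For reference, the Brier score (BS) and Brier skill score (BSS) plotted above follow the standard definitions, BS = mean[(p - o)^2] and BSS = 1 - BS/BS_clim. The sketch below is a minimal illustration with assumed array names and a sample-climatology reference, not the B08RDP verification code.

```python
import numpy as np

def brier_score(prob_fcst, obs_event):
    """BS: mean squared difference between the forecast probability and the
    observed binary outcome (1 if the event occurred, 0 otherwise)."""
    prob_fcst = np.asarray(prob_fcst, dtype=float)
    obs_event = np.asarray(obs_event, dtype=float)
    return float(np.mean((prob_fcst - obs_event) ** 2))

def brier_skill_score(prob_fcst, obs_event):
    """BSS = 1 - BS / BS_clim, using the sample climatology as the reference."""
    obs_event = np.asarray(obs_event, dtype=float)
    clim = obs_event.mean()  # climatological event frequency
    bs_clim = brier_score(np.full_like(obs_event, clim), obs_event)
    return 1.0 - brier_score(prob_fcst, obs_event) / bs_clim

# Hypothetical example: ensemble probabilities of 3 h rain > 0.1 mm at five points
p = np.array([0.9, 0.2, 0.6, 0.1, 0.7])  # fraction of members above the threshold
o = np.array([1, 0, 1, 0, 0])            # observed occurrence
print(brier_score(p, o), brier_skill_score(p, o))
```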

Relative Operating Characteristic (ROC) curves for the 18 h precipitation probability forecast (>0.1 mm), averaged for 20070724-20070831 [figure: hit rate versus false alarm rate for MRI/JMA, NMC/CMA, CAMS/CMA, MSC, ZAMG and NCEP]. After Y. Li (2007).
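The ROC curves above are traced by sweeping the probability threshold used to turn the ensemble probability into a yes/no forecast and computing the hit rate and false alarm rate at each threshold. A minimal sketch under assumed array names (not the project's verification code):

```python
import numpy as np

def roc_points(prob_fcst, obs_event, thresholds=np.linspace(0.05, 0.95, 19)):
    """Return (false alarm rate, hit rate) pairs, one per probability threshold."""
    prob_fcst = np.asarray(prob_fcst, dtype=float)
    obs_event = np.asarray(obs_event, dtype=bool)
    points = []
    for t in thresholds:
        warn = prob_fcst >= t                       # "yes" forecast at this threshold
        hits = np.sum(warn & obs_event)
        misses = np.sum(~warn & obs_event)
        false_alarms = np.sum(warn & ~obs_event)
        correct_neg = np.sum(~warn & ~obs_event)
        hit_rate = hits / max(hits + misses, 1)
        false_alarm_rate = false_alarms / max(false_alarms + correct_neg, 1)
        points.append((false_alarm_rate, hit_rate))
    return points

# Hypothetical example with three thresholds
p = np.array([0.9, 0.2, 0.6, 0.1, 0.7])
o = np.array([True, False, True, False, False])
print(roc_points(p, o, thresholds=[0.25, 0.50, 0.75]))
```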

Verification of upper-air data: temperature RMSE and spread at 12 UTC, averaged for 20070724-20070831 [figure: RMSE of the ensemble mean (solid) and spread (broken) at 850 mb (red), 500 mb (green) and 250 mb (blue) versus forecast hour, for NMC/CMA and MRI/JMA]. For NMC/CMA, the spread is smaller than the RMSE at every level and grows gradually with forecast time; the RMSE also grows gradually, except for the 850 mb temperature, whose RMSE is the largest of the three levels. For MRI/JMA, both spread and RMSE grow gradually with forecast time at all three levels, and the RMSE is smaller than that of NMC/CMA. After Li (2007).
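The spread-versus-RMSE comparison above is the standard consistency check for an ensemble: in a statistically consistent system the spread should match the RMSE of the ensemble mean. A minimal sketch with assumed array shapes (not the project's verification code):

```python
import numpy as np

def rmse_and_spread(members, truth):
    """members: (n_members, n_points) forecasts; truth: (n_points,) verifying analyses.
    Returns the RMSE of the ensemble mean and the mean ensemble spread."""
    members = np.asarray(members, dtype=float)
    truth = np.asarray(truth, dtype=float)
    ens_mean = members.mean(axis=0)
    rmse = np.sqrt(np.mean((ens_mean - truth) ** 2))
    spread = np.sqrt(np.mean(members.var(axis=0, ddof=1)))  # member std dev about the mean
    return rmse, spread

# Hypothetical 850 mb temperature example: 10 members at 100 grid points
rng = np.random.default_rng(0)
truth = 15.0 + rng.normal(size=100)
members = truth + rng.normal(scale=1.2, size=(10, 100))
print(rmse_and_spread(members, truth))
```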

Bias Correction Method & Application. Goal: improve reliability while maintaining resolution, i.e. reduce systematic errors (improving reliability) without increasing random errors (maintaining resolution), and retain all useful information. Methodology: use bias-free estimators of the systematic error; methods with fast convergence on small samples are needed. Bias-correction techniques form an array of methods: estimating and correcting the bias moment by moment is a simple approach, although it may be less applicable to extreme cases; other methods include a Bayesian approach and climate-mean bias correction.

Bias Correction Method & Application. The moment-based method in B08RDP applies an adaptive (Kalman-filter-type) decaying-average algorithm: mean error = (1 - w) × (prior mean error) + w × (f - a), where f is the forecast and a the verifying analysis. Different decaying weights w were tested (2%, 5% and 10%); a 10% weight (roughly a 10-day decaying accumulation) was chosen for the bias estimation. After L. Li (2007).
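A minimal sketch of this decaying-average bias estimate and its application, assuming daily forecast/analysis pairs for a single station; the variable names and the synthetic data are illustrative, not the project's code.

```python
import numpy as np

def update_bias(prior_bias, forecast, analysis, w=0.10):
    """Decaying-average estimate: bias = (1 - w) * prior_bias + w * (forecast - analysis).
    A weight of 10% corresponds to roughly a 10-day decaying accumulation."""
    return (1.0 - w) * prior_bias + w * (forecast - analysis)

# Hypothetical daily 2 m temperature forecasts with a +1.5 K systematic error
rng = np.random.default_rng(1)
truth = 25.0 + rng.normal(scale=2.0, size=30)
raw = truth + 1.5 + rng.normal(scale=0.5, size=30)

bias = 0.0
for f, a in zip(raw, truth):
    corrected = f - bias                 # apply yesterday's bias estimate first
    bias = update_bias(bias, f, a)       # then update the estimate with today's (f - a)
print(round(bias, 2))                    # converges towards the +1.5 K systematic error
```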

Bias Correction Method & Application. Variables: 500 mb height, 850 mb temperature, 2 m temperature and 2 m relative humidity. Verification measures: relative errors, RMS errors, correlation coefficient, and the Talagrand diagram (rank histogram). Targets: CAMS/CMA, NCEP, MSC, MRI/JMA, ZAMG, NMC/CMA. After L. Li (2007).
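The Talagrand diagram (rank histogram) counts, for each verification case, where the observation ranks among the sorted ensemble members; a flat histogram indicates a statistically consistent ensemble, while a U shape indicates under-dispersion. A minimal sketch with assumed array shapes:

```python
import numpy as np

def rank_histogram(members, obs):
    """members: (n_members, n_cases) forecasts; obs: (n_cases,) observations.
    Returns counts of the observation's rank among the sorted members
    (n_members + 1 bins; ties are ignored in this simple version)."""
    members = np.asarray(members, dtype=float)
    obs = np.asarray(obs, dtype=float)
    ranks = np.sum(members < obs[None, :], axis=0)   # rank 0 .. n_members
    return np.bincount(ranks, minlength=members.shape[0] + 1)

# Hypothetical consistent ensemble: the histogram should be roughly flat
rng = np.random.default_rng(2)
print(rank_histogram(rng.normal(size=(10, 500)), rng.normal(size=500)))
```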

For NMC/CMA: relative error of the FT = 18 h, 2 m temperature forecast at 12 UTC, 2007.08.15-2007.08.24 [figure: left, raw bias; right, bias after correction; one curve per member, mem1-mem15]. Before calibration the raw biases of the 15 members scatter into two groups, ranging from about -2 to 3 degrees; after calibration the biases of the 15 members collapse into a single group ranging from about -2 to 1 degree. The bias has clearly been reduced. After L. Li (2007).

Combination of Multi-Source Ensemble Systems. Goal: limit the number of members taken from each ensemble system according to its relative historical or theoretical merit, and assign different weights to the ensemble systems. Three methods for obtaining the weights: equal weights (treats the best and the worst forecasts alike, but computationally efficient); greater weights to more skillful forecasts, based on the correlation coefficient; and weights based on the inverse of the expected errors (see the sketch below). Targets: CAMS/CMA, NCEP, MSC, MRI/JMA, ZAMG, NMC/CMA. After L. Li (2007).
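A minimal sketch of such a weighted multi-model combination; the inverse-error weighting shown is one plausible reading of "inverse of expected errors", and all names and numbers are assumptions, not the project's implementation.

```python
import numpy as np

def combine_ensembles(system_means, expected_errors=None):
    """system_means: dict {system: ensemble-mean field}. With expected_errors=None
    the systems get equal weights; otherwise each system is weighted by
    1 / expected error, normalized to sum to one."""
    names = list(system_means)
    fields = np.stack([np.asarray(system_means[n], dtype=float) for n in names])
    if expected_errors is None:
        weights = np.full(len(names), 1.0 / len(names))          # equal weights
    else:
        inv = np.array([1.0 / expected_errors[n] for n in names])
        weights = inv / inv.sum()                                # inverse-error weights
    return np.tensordot(weights, fields, axes=1)

# Hypothetical 2 m temperature ensemble means from three MEP systems at four points
means = {"A": np.array([24.0, 25.1, 26.0, 23.5]),
         "B": np.array([24.5, 25.0, 26.4, 23.0]),
         "C": np.array([23.8, 25.3, 25.8, 23.9])}
errors = {"A": 1.2, "B": 1.8, "C": 1.0}
print(combine_ensembles(means, errors))
```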

2 m temperature correlation coefficient at 12 UTC (2007.8.14 12UTC to 2007.8.24 12UTC) [figure: left, raw; right, after calibration; CC versus forecast hour for NMC/CMA, CAMS/CMA, NCEP, MRI/JMA, MSC, ZAMG and the combined ensemble]. The correlation coefficients of all EPSs, including the combined ensemble, increase after calibration to varying extents, and the diurnal variations are reduced. The correlation coefficient of the combined ensemble is the largest both before and after calibration. After L. Li (2007).

2 m temperature RMS error (2007.8.14-2007.8.24) [figure: left, raw; right, after calibration; RMS error and spread versus forecast hour for NMC/CMA, CAMS/CMA, NCEP, MRI/JMA, MSC, ZAMG and the combined ensemble]. The RMS errors of all EPSs decrease markedly after calibration and the diurnal variations are reduced. After calibration the RMS error of the combined ensemble is the smallest of all; before calibration it is the smallest in most cases. After calibration the spread is closer to the RMS error than before. After Li (2007).

2 m humidity correlation coefficient (2007.8.14-2007.8.24) [figure: left, raw; right, after calibration; CC versus forecast hour for NMC/CMA, CAMS/CMA, NCEP, MRI/JMA, MSC, ZAMG and the combined ensemble]. The correlation coefficients of all EPSs, including the combined ensemble, increase after calibration, and the diurnal variations are weakened. The correlation coefficient of the combined ensemble is the largest both before and after calibration.

2 m humidity RMS error (2007.8.14-2007.8.24) [figure: left, raw; right, after calibration; RMS error and spread versus forecast hour for NMC/CMA, CAMS/CMA, NCEP, MRI/JMA, MSC, ZAMG and the combined ensemble]. The RMS errors of all EPSs decrease markedly after calibration and the diurnal variations are greatly reduced. The RMS error of the combined ensemble is the smallest both before and after calibration. After calibration the spread is much closer to the RMS error than before.

BMB Tier 2 NWP system [figure: model domains D1 (27 km), D2 (9 km) and D3 (3 km)]. After Kuo (2007).

6 h accumulated precipitation observations for 1 August 2006 (unit: mm) [figure panels: 00-06 UTC, 06-12 UTC, 12-18 UTC, 18-00 UTC].

Cycling run for the 1 August 2006 heavy rain event.
Cycle setup: cold start at 00Z, 1 August 2006; warm starts at 03/06/09Z (cycling mode), using the previous cycle's 3 h model forecast as the first guess; WRFVar v2.1 with 3 h time windows, assimilating GTS/AWS/GPSPW data; 12 h forecasts with 1 h output interval; NCEP GFS analysis as IC for the cold start and GFS forecasts as BCs for the cycles.
Model setup: WRF-ARW v2.2; 27/9/3 km one-way nesting (no feedback), 38 vertical levels; WSM6 or Thompson microphysics; old Kain-Fritsch cumulus scheme (27 and 9 km), no cumulus scheme (3 km); YSU PBL scheme; RRTM longwave and Dudhia shortwave radiation schemes; 5-layer SLAB soil model. After Kuo (2007). A schematic of this cycle is sketched below.
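The cold-start/warm-start cycling described above can be summarized schematically as follows; run_wrfvar and run_wrf are placeholders standing in for the actual WRFVar and WRF-ARW runs, so this is a sketch of the cycling logic only, not a workable WRF workflow.

```python
from datetime import datetime, timedelta

def run_wrfvar(first_guess, analysis_time):
    """Placeholder for a WRFVar v2.1 analysis (3 h window, GTS/AWS/GPSPW data)."""
    return f"analysis({first_guess}, {analysis_time:%Y-%m-%d_%HZ})"

def run_wrf(analysis, hours):
    """Placeholder for a WRF-ARW v2.2 forecast (27/9/3 km one-way nesting),
    returning hourly outputs keyed by lead time."""
    return {h: f"fcst_{h:02d}h_from_{analysis}" for h in range(1, hours + 1)}

cycle_time = datetime(2006, 8, 1, 0)
first_guess = "GFS_analysis_00Z"               # cold start from the NCEP GFS analysis
for _ in range(4):                             # cycles at 00, 03, 06 and 09Z
    analysis = run_wrfvar(first_guess, cycle_time)
    forecast = run_wrf(analysis, hours=12)     # 12 h forecast, 1 h output interval
    first_guess = forecast[3]                  # next cycle's first guess: the 3 h forecast
    cycle_time += timedelta(hours=3)
print(cycle_time)                              # 2006-08-01 12:00 after the 09Z cycle
```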

Focus on: exp1 vs. exp2 (3 h cycle vs. 1 h cycle); exp3 to exp7 (3 km BES tuning, var_scaling = 1 to 0); exp5 vs. exp12 (3 km BES vs. interpolated 3 km BES); exp5 vs. exp11 (with vs. without GPSPW assimilated). After Kuo (2007).

Maximum reflectivity forecasts: (a) exp5 and (b) exp11 initialized at 03Z; (c) exp5 and (d) exp11 initialized at 06Z [figure: with GPS PW (a, c) vs. without GPS PW (b, d); 03 UTC IC with 03-09 UTC rainfall, 06 UTC IC with 06-12 UTC rainfall].

Tier 2: thunderstorm on 31 July 2007 [GMS IR image, 18 UTC 30 July 2007]. At 18 UTC, rainfall exceeding 70 mm per 3 hours was observed in the Beijing area. After H. Seko (2007).

Results of the Tier 1-1 experiment: probability of 3 h rainfall exceeding 5 mm (RR3h > 5 mm) at 18 UTC 30 July [figure panels: MSC, JMA/GSV, weather chart for 18 UTC 30 July, CMA, ZAMG]. Most of the participants' models reproduced this thunderstorm well.

Tier 1-1 experiment (valid 18 UTC 30 July 2007; initialized 12 UTC 29 July) [figure panels: CNTL, ensemble mean, and members M01m-M05m and M01p-M05p at 18 UTC]. The initial perturbations were produced by the WEP method. The 3 h rainfall regions of M05m, M03m and M04p were similar to the observed one, while the 3 h rainfall region of M02p was much smaller than observed.

Tier 2-2 experiment (14-18 UTC 30 July; initialized at 00 UTC), member M05m. The Tier 2 parameters are the same as those of the operational 2 km NHM forecast of JMA [figure panels at 14, 15, 16, 17 and 18 UTC]. A rainfall band extending from south to north was reproduced and moved eastward; these features are also seen in the observations. After H. Seko (2007).

Comparison of Tier 1 (15 km grid) and Tier 2 (3 km grid) rainfall for members M00, M05, M03, P04 and P02 [figure panels: sum of 3 h rainfall in the Tier 2 domain; maximum 3 h rainfall among the grids; area of rainfall > 1.0 mm (unit: grid points); rank distribution of 3 h rainfall]. The total rainfall of Tier 1 is larger than that of Tier 2, except for M02p. The maximum rainfall of Tier 2 is larger than that of Tier 1. The area of rainfall exceeding 1.0 mm is larger in Tier 1. The rainfall rank distribution indicates that the rainfall is more localized in Tier 2.

Review of Workshop Presentations and Discussions. For the Tier 1 (15 km) EPS, six participants reported progress since the last workshop, and most groups reported improvements of their respective systems. CMA reported a preliminary study on multi-source EPS data combination; more attention is needed on bias correction and on methods for combining different EPS results. For the assessment and verification of the participants' systems, objective verification results and subjective assessments by NMC forecasters were reported. An updated plan for verification, bias correction and combination of different EPS systems was suggested by Dr. Lawrence Wilson.

Review of Workshop Presentations and Discussions (continued). Tier 1 data issues, such as data transmission and archiving, products, training and the website, were reported and discussed; these will continue to be addressed, upgraded and improved by CMA. Four participants, from MRI, NCAR/BMB, NCEP and NMC, reported Tier 2 (3 km) cloud-resolving model simulations of the 1 August and 31 July 2006 cases. The possibility of combining the mesoscale EPS (15 km) with a high-resolution deterministic forecast (3 km) was discussed. NCAR/BMB reported progress on WRF-RUC system development. A publication plan to summarize the B08RDP results was proposed, and the outlines and topics of the papers were discussed. Some research topics and the next steps of B08RDP were discussed and proposed, and will continue to be discussed offline after the workshop.

Major Tasks for the 2008 RDP Exercise. For the Tier 1 EPS, the third exercise will use the same configuration and procedure as the second trial in 2007, including the domain and the exercise period (July 24 to August 24, 2008). Because forecasters are particularly interested in 700 hPa wind shear for precipitation forecasting, 700 hPa winds will be added to the product list from all participants. It was proposed that at least 40 days of the full EPS, a subset EPS or a control run be made before July 24, 2008, for bias correction. The updated verification plan will be used for the third exercise, and the 2006-2007 trial EPS data will be verified again to assess the improvement of the EPS systems over the B08RDP project period.

Support for Forecasting for the 2008 Olympics. B08RDP products will be available in real time based on the CMA MEPS; products from the other MEPS systems will also be available close to real time. Forecaster assessment of B08RDP products for meteorological forecasting in support of the Olympic Games: CMA/NMC and CMA/BMB forecasters will be assigned to use the RDP products and provide a subjective assessment of their usefulness; a report will be presented at the fourth workshop.

Scientific Topics and Opportunities of Interest to B08RDP:
1. What is the added value of a mesoscale EPS, in comparison with a global EPS and with a high-resolution deterministic forecast?
2. How should MEPS results be verified and their usefulness demonstrated (objective verification scores, subjective forecaster assessment)?
3. What is an efficient way to perturb the initial conditions?
4. What is the impact of physics variations?
5. How can the 2 m temperature perturbation spread be increased?
6. How can a mesoscale EPS (at 15 km) be combined with a high-resolution deterministic forecast (at 3 km) to further improve forecasting?
7. How should multi-center EPS data be combined, and what is the added value of multiple mesoscale EPS systems compared with a single mesoscale EPS system?

Possible extension into the future: maintain the MEPS group participation and focus more on scientific issues through forums and workshops; enhance collaboration among the participating institutes on mesoscale EPS beyond 2008; encourage active involvement with TIGGE-LAM. This will be further discussed at the fourth workshop.