Advances in Network Performance Monitoring and Modelling
Presented by: Jerry Carter, IDC/OD
Contributors: Mark Prior (IDC/SA), Pierrick Mialle (IDC/SA), Mika Nikkinen (IDC/SA), Monika Krysta (IDC/OD)
Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization, Provisional Technical Secretariat
Monitoring vs. Modelling
Monitoring answers "What is happening?"; modelling answers "What if?"
Modelling vs. Monitoring
Monitoring uses the actual network configuration and noise fields at a specific time (or averaged over a relatively short period).
Modelling can use a hypothetical network and hypothetical noise characteristics for each station, or noise characteristics estimated from existing stations.
Because assumptions about detection and attenuation enter both modelling and monitoring, validation provides confidence in the results.
[Map legend: seismic station; hypothetical event location]
Signal Size
Noise Level
Magnitude Threshold Monitoring
[Map: per-station magnitude thresholds of 2.7, 2.8, 2.9, 3.0 and 3.1]
Magnitude Threshold Monitoring
Kværna & Ringdal (BSSA, 1999)
Set Event Magnitude, Estimate Detection Probabilities
[Map: event magnitude 3.6; station detection probabilities of 77%, 81%, 83%, 88% and 89%]
Set Station Detection Probability, Estimate Detectable Magnitudes
[Map: per-station magnitudes of 3.7, 3.7, 3.8, 3.9 and 3.9]
Magnitudes for 90% probability of detection at each station
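A per-station "magnitude for 90% probability of detection" like those above can be sketched with a Gaussian detection-probability curve in the style of Kværna & Ringdal (1999). The station parameters below (the 50%-detection magnitude `m50` and spread `sigma`) are illustrative assumptions, not actual IMS station values.

```python
from statistics import NormalDist

def magnitude_at_probability(m50: float, sigma: float, p: float) -> float:
    """Magnitude at which a station detects with probability p, assuming
    P(detect | m) = Phi((m - m50) / sigma): a Gaussian detection curve
    with 50%-detection magnitude m50 and spread sigma."""
    return m50 + sigma * NormalDist().inv_cdf(p)

# Hypothetical station parameters (illustrative, not real IMS values)
stations = {"STA1": (3.40, 0.25), "STA2": (3.60, 0.25)}
for name, (m50, sigma) in stations.items():
    print(name, round(magnitude_at_probability(m50, sigma, 0.90), 2))
# STA1 -> 3.72, STA2 -> 3.92
```

Raising the required probability from 50% to 90% shifts each station's detectable magnitude upward by about 1.28 sigma, which is why the 90% magnitudes on the map sit well above the raw detection thresholds.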
3-Station Detection Monitoring
Kværna & Ringdal (BSSA, 1999)
3-Station Detection Modelling
P_net = P(at least 3 stations detect the event)
Magnitude at which there is a 90% probability of detecting at least 3 P phases with the full IMS primary seismic network (using empirical station noise values where possible)
Network and Station Probability
Consider five stations, each with an independent detection probability of 89%.
Monitoring software: no valid detection, because no single station exceeds 90%.
Model: valid detection, because the network probability of 3 or more stations detecting is 94%.
Even if each station's individual probability is only 50%, valid detection is possible with 9 stations.
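The network calculation above can be sketched as a sum over station outcomes, assuming independent stations (a Poisson-binomial model, which also handles unequal per-station probabilities). The exact figures quoted on the slide depend on the per-station probabilities and detection model actually used, so the plain-binomial values from this sketch need not match them exactly.

```python
from itertools import combinations
from math import prod

def network_detection_prob(p_stations: list[float], k_min: int) -> float:
    """P(at least k_min stations detect), for independent stations with
    possibly unequal detection probabilities (Poisson binomial)."""
    n = len(p_stations)
    total = 0.0
    for k in range(k_min, n + 1):
        for hits in combinations(range(n), k):
            total += prod(p_stations[i] if i in hits else 1 - p_stations[i]
                          for i in range(n))
    return total

# Five stations at 89% each: P(>= 3 detections) comfortably exceeds 90%
print(round(network_detection_prob([0.89] * 5, 3), 3))
# Nine stations at only 50% each still give P(>= 3 detections) above 90%
print(round(network_detection_prob([0.5] * 9, 3), 3))
```

This is the key asymmetry between the two views: monitoring software that requires a single station above 90% rejects configurations that a network-probability model accepts.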
Network Detection Validation
[Maps: REB completeness; 90% probability of detection; 99% probability of detection]
Prior & Brown, S&T11
Network Detection Validation
[Plot: global average IMS network detection threshold — monitoring 3-P detection capability (99%+); REB 90% measurement; modelling 90% prediction]
Prior & Brown, S&T11
Network Detection Validation
Network Detection Validation by Region: Model-Measurement Mismatch
Mean difference ~0.01 magnitude units; standard deviation ~0.25 magnitude units
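The regional validation statistic above is simply the mean and spread of the model-minus-measurement threshold differences. A minimal sketch, with made-up regional values rather than the actual IMS validation data:

```python
from statistics import mean, stdev

# Hypothetical model-minus-measurement threshold differences per region,
# in magnitude units (illustrative values only, not the real validation data)
mismatch = [0.12, -0.31, 0.05, 0.22, -0.18, 0.01, 0.28, -0.15]

print(f"mean = {mean(mismatch):+.3f} m.u., std = {stdev(mismatch):.3f} m.u.")
```

A mean near zero indicates the model is unbiased on average; the standard deviation (here ~0.25 magnitude units in the real study) sets the practical accuracy of any predicted threshold.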
What is the predicted capability of the IMS seismic and hydroacoustic networks for the oceans?
Wind Effect
Infrasound Network Detection Capability Monitoring
Le Pichon et al., S&T11
Infrasound network performance map (minimum yield in tons) with real-time station noise and atmospheric specification (ECMWF), based on LANL (Whitaker, 2003) yield relations (with support from and in collaboration with the French and German NDCs)
Radionuclide Network Detection Capability Modelling
Becker & Wotawa
[Map: colour scale 10^9 to 10^17 Bq]
140Ba source strengths at which there is a 90% probability of single-station detection by the complete particulate network after 14 days of transport (median of 14 days of estimates)
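The mapped quantity can be sketched as: for each day, the smallest release that still produces a station concentration above the minimum detectable concentration (MDC), with the median then taken over the transport period. The MDC and dilution factors below are illustrative assumptions; the actual study (Becker & Wotawa) derived dilution from full atmospheric transport modelling.

```python
from statistics import median

def min_detectable_source(mdc_bq_m3: float, dilution_m3: list[float]) -> float:
    """Smallest release (Bq) giving a station concentration >= the MDC,
    given per-day dilution factors (station concentration per Bq released,
    in 1/m^3); the map's 'median of 14 days of estimates' is mirrored by
    taking the median of the daily thresholds."""
    daily_thresholds = [mdc_bq_m3 / d for d in dilution_m3]
    return median(daily_thresholds)

# Illustrative numbers: MDC of 30 uBq/m^3, dilution factors around 1e-14 /m^3
mdc = 30e-6
dilutions = [2.1e-14, 0.8e-14, 1.5e-14, 3.0e-14, 1.1e-14]
print(f"{min_detectable_source(mdc, dilutions):.2e} Bq")  # -> 2.00e+09 Bq
```

Because the dilution factors change with the weather from day to day, the detectable source strength is inherently time-varying, which is why the slide reports a median over 14 days rather than a single value.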
Summary
Network monitoring and modelling serve two distinct purposes: one monitors performance, the other predicts performance.
Both methods require careful validation.
Seismic models are validated to within +/- 0.25 magnitude units.
Hydroacoustic propagation is efficient, and thresholds for in-water explosions are below roughly 100 tons TNT.
Infrasound and radionuclide monitoring and modelling are affected by a dynamic atmosphere.
Thank You