SAVE POD AND FAR: Normalization for Event Frequency in Performance Metrics (IFR Example)




Matthew Lorentson, August 2015

High Points
- POD and FAR must not be used individually to summarize performance
- Performance metrics must be normalized to account for the gross influence of event frequency
- Use a moving average to evaluate progress and trends

FY15 AOP Milestone for Aviation Program: Methodology for improved measurement of forecast accuracy for GPRA Metrics
[Charts: IFR POD GPRA and IFR FAR GPRA]

TPIX = POD x SR
- Using a combined index avoids overemphasizing either POD or FAR
- TPIX is easy to calculate (an advantage over CSI)
- TPIX exhibits strong linear correlation with IFR Frequency
[Chart note: the IFR Frequency range (right axis, in red) is magnified compared to the IFR Total Performance Index range (right axis, in blue) to emphasize comparable distribution shapes]

Performance Indices
"It is essential to recall that there is no universal approach to verification, but rather that the procedure selected needs to match the specific objectives of the study." - Roebber, 2009

Avg = (POD + SR) / 2
CSI = 1 / [1/POD + 1/SR - 1]
TPIX = POD x SR
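A minimal sketch of these formulas, assuming POD and FAR are supplied as percentages (the helper name and the printout are illustrative, not part of the presentation):

```python
def performance_indices(pod, far):
    """Compute Avg, CSI, and TPIX from POD and FAR given in percent.

    SR (Success Ratio) = 100 - FAR when scores are percentages.
    CSI comes out on the 0-1 scale used in the example table.
    """
    sr = 100.0 - far
    avg = (pod + sr) / 2.0                        # simple average of POD and SR
    csi = 1.0 / (100.0 / pod + 100.0 / sr - 1.0)  # Critical Success Index
    tpix = pod * sr                               # Total Performance Index
    return avg, csi, tpix

# Jan-12 values from the next slide: POD = 66.80, FAR = 34.00 (so SR = 66.00)
avg, csi, tpix = performance_indices(66.80, 34.00)
print(avg, round(csi, 4), tpix)   # 66.4 0.497 4408.8 (approximately)
```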

Month    POD     FAR     SR      TPIX      CSI      Avg.    IFR Freq.
Jan-12   66.80   34.00   66.00   4408.80   0.4970   66.40   9.07%

Example: POD = 66.80, SR = 66.00, IFR Frequency = 9.07%
POD x SR = TPIX: 66.80 x 66.00 = 4408.80

TAF IFR Performance Correlates with Observed IFR

Month    POD     FAR     SR      TPIX      CSI      Avg.    IFR Freq.
Jan-12   66.80   34.00   66.00   4408.80   0.4970   66.40   9.07%

Moving Average
- TPIX residuals represent performance relative to all months
- Month-to-month variability is high (standard deviation = ~200 TPIX points)
- As with stocks and commodities, single-day performance, or even monthly volatility, should not drive long-term decisions; performance is a long-term prospect, so we use moving averages (with a 12-month lag)
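The sketch below illustrates the normalization-plus-smoothing idea with synthetic monthly data; the column names, the linear fit of TPIX on IFR frequency, and the trailing 12-month window are assumptions standing in for the presentation's actual processing:

```python
import numpy as np
import pandas as pd

# Synthetic monthly series standing in for the eight-year sample (illustrative only)
months = pd.period_range("2008-01", "2015-12", freq="M")
rng = np.random.default_rng(0)
ifr_freq = 0.05 + 0.05 * rng.random(len(months))                   # IFR event frequency
tpix = 3000 + 20000 * ifr_freq + rng.normal(0, 200, len(months))   # TPIX tracks frequency
df = pd.DataFrame({"tpix": tpix, "ifr_freq": ifr_freq}, index=months)

# Normalize for event frequency: residuals from a linear fit of TPIX on IFR frequency
slope, intercept = np.polyfit(df["ifr_freq"], df["tpix"], 1)
df["tpix_residual"] = df["tpix"] - (slope * df["ifr_freq"] + intercept)

# Smooth the month-to-month noise with a trailing 12-month moving average
df["residual_12mo_avg"] = df["tpix_residual"].rolling(window=12).mean()
print(df.tail())
```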

Single Location Example: ORD
- ORD exhibits a wide range of IFR Frequency, with higher IFR Frequency typically in winter months
- Recent performance, according to the eight-year monthly sample, is very good in relative terms

Summary
- POD and FAR must not be used individually to summarize performance
- Performance metrics must be normalized to account for the gross influence of event frequency
- Use a moving average to evaluate progress and trends

Details of Methodology
- Peer-reviewed article published Dec 2013
- Coordinated with Performance Branch
- Better represents actual forecast performance
http://www.nwas.org/jom/articles/2013/2013-jom22/2013-jom22.pdf

Questions
Thanks to Kevin Stone at the Aviation & Space Weather Branch for input on this presentation.

Backup Slides

Total Performance Index (TPIX)
- POD = Probability of Detection
- FAR = False Alarm Ratio
- 1 - FAR = Success Ratio (SR)
- TPIX = POD x (1 - FAR)
- Comparable to the Critical Success Index (CSI) but easier to understand and calculate from POD and FAR
- With POD and FAR expressed as percentages: CSI = 1 / [(100/POD) + (100/(100 - FAR)) - 1]

Geometric Relationship*: POD, SR, CSI, TPIX, and Bias
TPIX can be visualized on this graph as a quadrilateral area calculated by multiplying Probability of Detection by Success Ratio.
- Dashed lines = Bias (POD/SR)
- Solid contours = CSI
- Blue square = TPIX example
Using whole numbers, the blue square area with POD and SR scores of 65 (Bias = 1.0) produces a TPIX of 4225 and a CSI of ~0.4815. TPIX and CSI are maximized in the form of a square when Bias = 1.0.
Cross and shape figures represent various forecast averages discussed by Roebber; half circles represent TAF and MOS TAF averages (MOS in gray).

*Roebber, P., 2009: Visualizing Multiple Measures of Forecast Quality. Wea. Forecasting, 24, 601-608. [Available online at: http://journals.ametsoc.org/doi/full/10.1175/2008waf2222159.1]

New GPRA Metric: Improvement Over Predicted Score
[Chart: Fiscal Year Improvement Over Predicted, 12 Month Moving Avg. (2006-2014 data; 2009-2014 trend)]

IFR GPRA Goals (based on 2009-2014 trend)

FY (Oct-Sep)   Month #   Predicted   Actual   Performance
2010           12        41.29       99.36    58.07
2011           24        50.03       33.96    -16.07
2012           36        58.78       71.55    12.77
2013           48        67.53       19.15    -48.38
2014           60        76.28       92.85    16.57

FY (Oct-Sep)   Month #   Goal        Actual   Performance
2015           72        85.03
2016           84        93.78
2017           96        102.52
2018           108       111.27
2019           120       120.02
2020           132       128.77

Trend: 0.7290379 x (month #) + 32.537062

New GPRA Metric: Improvement Over Predicted Score (continued)
Same tables as the previous slide; the 2009-2014 trend, 0.7290379 x (month #) + 32.537062 = Goal, is the performance trend over time.
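A worked check of the goal line and the Improvement Over Predicted score, using the FY2010 row of the table (the function names are illustrative, not from the presentation):

```python
def gpra_goal(month_number):
    """Goal value from the 2009-2014 linear trend on the slide:
    Goal = 0.7290379 x (month #) + 32.537062.
    Per the table, month # 12 corresponds to FY2010, 24 to FY2011, and so on.
    """
    return 0.7290379 * month_number + 32.537062

def improvement_over_predicted(actual, month_number):
    """Score matching the Performance column: actual value minus the trend prediction."""
    return actual - gpra_goal(month_number)

# FY2010 row: predicted 41.29, actual 99.36 -> performance 58.07
print(round(gpra_goal(12), 2))                          # 41.29
print(round(improvement_over_predicted(99.36, 12), 2))  # 58.07
```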

Western Region Climate
- Artifacts of performance regimes are evident in large samples
- The Western Region has significant diversity in its climate profile (coastal vs. mountain)
- During stratus season, IFR conditions appear easier to forecast; thus seasons, and different climate regimes, should be measured separately (see the sketch below)
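One way to carry out that separation, sketched with assumed column names and made-up numbers, is to aggregate the monthly scores by season before comparing them:

```python
import pandas as pd

# Assumed monthly verification records; values are illustrative, not from the presentation
records = pd.DataFrame({
    "month": pd.period_range("2014-01", "2015-12", freq="M"),
    "tpix": [4100, 4300, 3900, 3500, 3200, 3000,
             2900, 3100, 3400, 3800, 4200, 4400] * 2,
})

# Map calendar months to seasons so each climate regime is scored separately
season = {12: "DJF", 1: "DJF", 2: "DJF", 3: "MAM", 4: "MAM", 5: "MAM",
          6: "JJA", 7: "JJA", 8: "JJA", 9: "SON", 10: "SON", 11: "SON"}
records["season"] = records["month"].dt.month.map(season)

print(records.groupby("season")["tpix"].mean())  # seasonal-mean TPIX, one score per regime
```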

Wind Gust > 27 kt

Single Location Example: LAS
- LAS exhibits low IFR Frequency, and thus an unusable sample
- Recent performance, according to the eight-year monthly sample, is very good in relative terms