Wind Power Forecasting Pilot Project Part B: The Quantitative Analysis Final Report

Report

Wind Power Forecasting Pilot Project Part B: The Quantitative Analysis Final Report

A Report to: The Alberta Electric System Operator, 2500, 330-5th Avenue SW, Calgary, Alberta T2P 0L4
Attention: Mr. Darren McCrank, P.Eng., Wind Power Forecasting Pilot Project Project Manager, Fax: (403)
Submitted by: Donald (Don) C. McKay, Ph.D., MBA, General Manager, Tel: (905) , Ext. 499, Fax: (905)
Report No.:
65 pages, 5 Appendices
Date: August 18, 2008
Speakman Drive, Mississauga, ON, Canada L5K 1B3

DISCLAIMER

The conclusions reached in this report were arrived at using data and other information provided to ORTECH by others. The activities of ORTECH in preparing this report were limited to analyzing the data and other information supplied to ORTECH. Except in the case of obvious anomalies or discrepancies, no attempt was made to verify the data or the accuracy thereof, as such activities were beyond the scope of ORTECH's engagement. While ORTECH prepared this report in accordance with professional standards and has no reason to question the data or information provided to it, ORTECH assumes no responsibility or liability for the accuracy, completeness or veracity of the data or other information supplied to it in the course of preparing this report.

TABLE OF CONTENTS

Page No.

EXECUTIVE SUMMARY
1. INTRODUCTION
2. DATA SETS
2.1 Data Completeness
3. RESULTS AND ANALYSES
3.1 What is the General Accuracy of the Forecasts?
    MAE and RMSE Wind Speed
    Decomposition of RMSE
    RMSE and MAE Power
    Normalized RMSE
    Normalized MAE
3.2 What is the accuracy of the Forecasts at the different forecast horizons studied (T=1 hour to T=48 hours)?
3.3 What is the accuracy of the Forecasts at different hours of the day and seasons of the year?
    Hours of the Day
    Seasons of the Year
3.4 What is the accuracy of the Forecasted Meteorological Data before Running through the Power Conversion models?
3.5 What is the accuracy of the Power Conversion?
3.6 What is the Potential co-variance from given data samples?
3.7 What is the accuracy of the Forecast at different wind speeds or different points of a Wind Power Facility's power curve?
3.8 What is the relative comparison between Forecasts?
    Contemporary Analyses
    Wind Power Performance Index
    Root Mean Square Error of Wind and Power Forecasts
    Bias of Wind and Power Forecasts
    General Assessment...53

TABLE OF CONTENTS

Page No.

3.9 Which is the region with the least amount of error? Which forecaster forecasts best in that region and why?
3.10 What is the effect of spatial smoothing on forecast error?
3.11 How well do the forecasts predict fast ramp up and ramp down times, Event analysis (CSI)?
3.12 What is the Impact on data availability?
3.13 Are there times (day/month/weather pattern) when there is more uncertainty in the forecasts than other times?
3.14 What is the relationship between the spread of the min/max and the forecast error?
3.15 What is the correlation factor between all three forecasts? Is this related to the forecast error?
Summary of Findings
4. GENERAL OBSERVATIONS AND RECOMMENDATIONS
    Data
    Trial Period
    Freezing the Models
    Existing vs. Future Sites
    Focusing on the Priorities...65

TABLE OF CONTENTS

TABLES

Page No.

Table 2-1 Summary of the 10-Minute Measured Wind Speed Data Recovery Rates for each Site during Q4 (February 1, 2008 0:00 UTC - April 30, 2008 23:00 UTC)...12
Table 2-2 Summary of the Hourly-Averaged Measured Wind Speed Data Recovery Rates for each Site during Q4 (February 1, 2008 0:00 UTC - April 30, 2008 23:00 UTC)...12
Table 2-3 Summary of the 10-Minute Measured/Calculated Power Data Recovery Rates for each Site during Q4 (February 1, 2008 0:00 UTC - April 30, 2008 23:00 UTC)...13
Table 2-4 Summary of the Hourly-Averaged Measured/Calculated Power Data Recovery Rates for each of the Sites during Q4 (February 1, 2008 0:00 UTC - April 30, 2008 23:00 UTC)...13
Table 2-5 Summary of Acceptable Ranges of Variables used by ORTECH to Screen-Out Measured/Calculated and Forecasted Wind and Power Data...14
Table 2-6 Summary of the Annual Recovery Rates of Measured Hourly-Averaged Wind-Speed and Measured/Calculated Power Data at the Different Sites and Regions from May 1, 2007 to April 30, 2008...16
Table 2-7 Summary of the Annual Recovery Rates of Hourly-Averaged Wind-Speed Data-Sets used in the Statistical Analyses at Each of the Selected Horizons and at the Different Regions from May 1, 2007 to April 30, 2008...17
Table 2-8 Summary of the Annual Recovery Rates of Hourly-Averaged Power Data-Sets used in the Statistical Analyses at Each of the Selected Horizons and at the Different Regions from May 1, 2007 to April 30, 2008...18

TABLE OF CONTENTS

Page No.

Table 3-1 General Accuracy for Power Forecasts for Each Forecaster on an Annual Basis...35
Table 3-2 Annual Relative Standard Deviations of Measured Wind Speed and Power in Different Regions...40
Table 3-3 Annual Ratio of MAE and RMSE for Power Normalized by the Measured Average Power to MAE and RMSE for Wind Speed Normalized by the Measured Mean Wind Speed...41
Table 3-4 Sum of various metrics for nine horizons for wind and power for the South West and South Central regions for all three forecasters...53
Table 3-5 Annual Summary of Rapid Ramp Events for Specified Time Horizons Captured by Each Forecaster (% of total number of events) When Event is Forecasted Not More Than 12 Hours in Advance of the Actual Event...58
Table 3-6 Annual Summary of Rapid Ramp Events for Specified Time Horizons Captured by Each Forecaster (% of total number of events) When Event is Forecasted Either 6 Hours Before or 6 Hours After the Actual Event...58
Table 3-7 Confidence limits (%) of minimum and maximum predicted power distributions...60

TABLE OF CONTENTS

FIGURES

Page No.

Figure 3-1 Comparison of the time series of measured and predicted wind speeds at first forecast horizon (T=1hr) by an anonymous forecaster at a particular wind farm (name is not disclosed). Note examples are circled...23
Figure 3-2 Annual Mean Absolute Error (MAE) of wind speed (WS) predictions (Pred) in South West (SW), South Central (SC), South East (SE), Central (CE), Existing Facilities (EF), Future Facilities (FF) and All Facilities (AF) by three forecasters A, B and C as a function of forecast horizons...24
Figure 3-3 Annual Root Mean Square Error (RMSE) of wind speed (WS) predictions (Pred.) in South West (SW), South Central (SC), South East (SE), Central (CE), Existing Facilities (EF), Future Facilities (FF) and All facilities (AF) by three forecasters A, B and C as a function of forecast horizons...25
Figure 3-4 Annual errors of wind speed predictions in South West (SW), South Central (SC), South East (SE), Central (CE), Existing Facilities (EF), Future Facilities (FF) and All Facilities (AF) by Forecasters A, B, and C as a function of forecast horizon. Note that RMSE and its components: dispersion (disp), bias (bias) and standard deviation of bias (sdbias) are shown with different symbols and colors...28
Figure 3-5 Annual Normalized Root Mean Square Error (RMSE %) of Power predictions in South West (SW), South Central (SC), and existing facilities (EF) by three forecasters A, B and C as a function of forecast horizons. Note that the actual errors are normalized by the rated capacity (RC) of the region of power aggregation...30
Figure 3-6 Annual Errors of power predictions in South West (SW), South Central (SC), Existing Facilities (EF), by Forecasters A, B, and C as a function of forecast horizon. Note that RMSE and its components: dispersion (disp), bias (bias) and standard deviation of bias (sdbias) are shown with different symbols and colors...32

TABLE OF CONTENTS

Page No.

Figure 3-7 Normalized Root Mean Square Error (RMSE) of Power predictions in South West (SW), South Central (SC), and existing facilities (EF) by three forecasters A, B and C as a function of forecast horizons. Note that the actual errors are normalized by the rated capacity of the region of power aggregation
Figure 3-8 Accuracy of the Power Forecast for Specific 6 Hour Time Periods for Each Forecaster Using RMSE (% Rated Capacity) for the South West (SW), South Central (SC), and Existing Facilities (EF)...37
Figure 3-9 Seasonal Accuracy of the Power Forecast for Each Forecaster Using RMSE (% Rated Capacity) for the South West (SW), South Central (SC), and Existing Facilities (EF)...38
Figure 3-10 Typical Power Curve (GE 1.5 MW sle)...42
Figure 3-11 Annual RMSE of wind speed predictions at the chosen forecast horizons (T=1, T=2 and T=48hr) in Existing Facilities (EF) by Forecasters A, B and C as a function of predicted wind speed...44
Figure 3-12 Annual Normalized RMSE of Power for Existing Facilities at Different Predicted Wind Speed Bins...45
Figure 3-13 RMSE Annual Comparison between Forecasters for Wind by Region...47
Figure 3-14 Annual Comparison between Forecasters for Power by Region (Normalized RMSE % of Rated Capacity (RC)) and a Comparison against Persistence for SW, SC Regions and EF
Figure 3-15 Annual Comparison between Forecasters Against Persistence for SW, SC Regions and EF for Power using SS (RMSE)...48
Figure 3-16 Annual Normalized Performance Index for Wind using RMSE for Four Regions for the Three Forecasters...50
Figure 3-17 Annual Normalized Performance Index for Power Using RMSE in Two Regions for the Three Forecasters...50
Figure 3-18 Annual Normalized Performance Index for Wind using Bias for Four Regions for the Three Forecasters...52
Figure 3-19 Annual Normalized Performance Index for Power Using Bias in Two Regions for the Three Forecasters...52

TABLE OF CONTENTS

Page No.

Figure 3-20 Annual Ratio of Regional Standard Deviation of Power Prediction Errors to the Average Standard Deviation of the Individual Sites in the Region for Various Region Sizes and Forecast Horizons...56

APPENDICES

Appendix A Alberta Forecast Regions
Appendix B Bibliography
Appendix C Data Completeness for Each Quarter
Appendix D Selected Wind Statistics by Quarter
Appendix E Selected Power Statistics by Quarter

EXECUTIVE SUMMARY

As part of the Alberta Electric System Operator's (AESO) wind power forecasting pilot project, ORTECH Power (ORTECH) was contracted to provide quantitative analysis services. Wind and power forecast data was provided by three independent wind forecasting firms, referred to herein as Forecaster A, Forecaster B and Forecaster C. The analysis consisted of comparing the predicted data against measured meteorological and power data for seven existing Alberta wind power facilities (Existing Facilities), and measured meteorological data and derived power data for five future Alberta wind power facilities (Future Facilities). The measured, predicted, and simulated data was collected and distributed to the forecasters and ORTECH by GENIVAR (Phoenix Engineering).

Analysis was carried out by examining the available data from each of the forecasters in seven categories, consisting of the entire dataset All Facilities (AF), Existing Facilities (EF), Future Facilities (FF), and four geographic regions. To define the four regions, the available data for the Existing Facilities and the Future Facilities were distributed geographically into the South West (SW) Region, South Central (SC) Region, South East (SE) Region, and a Central Region (CE) (see map in Appendix A). The analysis for the individual facilities was not included in this report due to confidentiality requirements.

The forecast data was analyzed using statistical methods with an emphasis on deriving meaningful and precise values for averaged errors between predicted and observed data that would provide some insight into the error characteristics, based on the diagram below.

[Diagram: analysis framework]
- Observations: hourly time-scale; Forecasts: 1 to 48 hour time scales; Meteorological situations: 3 to >8 hour time scales
- Temporal statistics: 1 to 48 hour time horizons and seasonal
- Stratified statistics: season and meteorological situation
- Summary statistics: confidence intervals etc., resolved 1 to 48 hour horizons
- Distinction between forecasters: resolved 1 to 48 hour horizons

The final report documents the analysis for the one-year study period, extending from May 1, 2007 to April 30, 2008, inclusive.

The final report attempts to draw out specific findings and/or insights based on a set of questions posed by the working group after reviewing the three progress reports. The questions posed by the working group were:

1. What is the general accuracy of the Forecasts?
2. What is the accuracy of the Forecasts at the different forecast horizons studied (T=1 hour to T=48 hours)?
3. What is the accuracy of the Forecasts at different hours of the day and seasons of the year?
4. What is the accuracy of the Forecasted Meteorological Data before running through the Power Conversion models?
5. What is the accuracy of the Power Conversion?
6. What is the Potential co-variance from given data samples?
7. What is the accuracy of the Forecast at different wind speeds or different points of a Wind Power Facility's power curve?
8. What is the relative comparison between Forecasts?
9. Which is the region with the least amount of error? Which forecaster forecasts best in that region and why?
10. What is the effect of spatial smoothing on forecast error?
11. How well do the forecasts predict fast ramp up and ramp down times, event analysis (CSI)?
12. What is the Impact on data availability?
13. Are there times (day/month/weather pattern) when there is more uncertainty in the forecasts than other times?
14. What is the relationship between the spread of the min/max and the forecast error?
15. What is the correlation factor between all three forecasts? Is this related to the forecast error?

The report is presented as follows:
Section 2 describes the data sets provided by GENIVAR (Phoenix Engineering)
Section 3 presents the results and analyses (responses to the questions posed by the working group)
Section 4 presents general observations and recommendations
Appendices C, D and E provide quarterly statistics.

From the analyses undertaken in Section 3, addressing the questions raised by the working group, a summary of the findings was produced, as outlined below. After each finding, the relevant section where the analysis is discussed is indicated. The summary of findings is:

1. Forecasting wind energy in the regions examined in Alberta over a 48 hour time horizon is possible. (all sections)
2. All three forecasters can provide general forecasts in the regions examined over the 48 hour time horizon. (sections 3.1 and 3.8)
3. The largest error in the forecasts is due to phase errors. The different regions show a consistent behaviour for phase error, suggesting a similar property affects all regions in a similar manner. (sections 3.1 and 3.2)
4. The accuracy of the forecasts in general decreases as the forecast horizon increases. While there is a variation between forecasters in accuracy at each forecast horizon, the trend of decreasing accuracy as the forecast horizon increases is preserved by each forecaster. (sections 3.1 and 3.2)
5. The wind power forecasts are the least accurate during the afternoon periods between hours 13 and 18 after forecast horizon T=6. (section 3.3)
6. The least accurate wind power forecasts are during the winter season (November, December, January, February) for all forecasters. The best accuracy is during the summer season (June, July, and August). (section 3.3)
7. Wind speed prediction errors are amplified by the power conversion. (section 3.5)
8. The power prediction errors peak at a predicted wind speed of 10 m/s and show a different pattern from the wind speed error. (section 3.7)
9. The region with the least amount of error for wind is the Central region (CE). The region with the largest amount of error is the South West Region (SW). (section 3.9)
10. For power forecasts, the accuracy for both the SW and SC regions is similar. (section 3.9)
11. Spatial smoothing reduces the error as the number of wind farms and the size of the area covered increase. (section 3.10)
12. None of the forecasters predicts ramp events effectively. The reason may be that the forecasters were not given this specific objective. (section 3.11)
13. In examining the min-max spread for predicted power, it was found that the measured values fell between the min-max predicted power 81%-95% of the time. (section 3.14)

A number of general observations and recommendations are presented. These include:

a) Data

(i) ORTECH relied on GENIVAR (Phoenix Engineering) to provide QA/QC'd measured data-sets for the different sites. Although ORTECH applied screen-out criteria to the data, more confidence could have been achieved if one or both of the recommendations listed below had been employed:

Recommendation 1: GENIVAR (Phoenix Engineering) to provide ORTECH with a coded data-set along with a key to the codes used to substitute invalid data based on their QA/QC procedures. A detailed QA/QC report would also be helpful.

Recommendation 2: ORTECH to receive a raw database and to apply its own QA/QC measures, which include the following:
» Reviewing instrument orientation and calibration reports and correcting the data accordingly when necessary.
» Flagging data with abnormal wind speeds or power and/or standard deviations and filtering them out if they fall outside of a certain or agreed-upon range.
» Screening the data for icing events or any other anomalies that may not have been caught in the screen-out criteria and filtering them out.
» Comparing wind speed data from different anemometer levels and from adjacent sites, looking for discrepancies that are then filtered when necessary.
» Other site-specific QA/QC procedures.

(ii) Some of the measured data were modified and/or added after the end of each quarter. It was expected that the data would be QA/QC'd and ready to be analyzed before being posted on GENIVAR's (Phoenix Engineering) wind-server for ORTECH to download, which was not the case.

Recommendation: ORTECH to receive final and unchanged values in the measured data-set.

(iii) ORTECH had access neither to the measured data provided to each of the forecasters nor to its availability. It was assumed that the measured data made available to ORTECH and to each of the forecasters were comparable.

Recommendation: ORTECH to receive a data-availability report from each of the forecasters summarizing the hours of measured data that were used to produce their forecast. A more reliable comparison between the different forecasters could then have been produced.

(iv) ORTECH assumed that none of the forecasters prescreened their results. The fact that in some cases forecasters did not provide forecasts up to the 48th time horizon (T=48) puts this assumption into question.

Recommendation: ORTECH to receive from each of the forecasters a report summarizing their prescreening criteria and listing their omitted forecasts if they do prescreen, or a statement saying that they do not.

b) Trial Period

In undertaking such a large project a trial period would have been advantageous. This trial period would allow all the components of the project (forecasters, data providers, data analyst) to test their procedures to ensure that the logistics were running as smoothly as possible. It is recommended that at least a two to three month trial period be undertaken before the main project is initiated. The trial period could be undertaken using a subset of the sites.

c) Freezing the Models

Provided one of the questions to be addressed is a comparison between forecasters, it is recommended that after an initial training period the forecast model codes be frozen. If they are not frozen, the consultant doing the quantitative analysis cannot determine whether the output at the start of the project was generated by the same model as the output at the end of the project. Another alternative would be to have each forecaster describe in detail the changes made to the model as the project progressed.

d) Existing vs. Future Sites

Given that the main emphasis of the study was on wind power forecasting and not wind speed forecasting, it is recommended that only existing sites which have measured wind power data be used. Using only measured wind power data provides a more representative indication of the accuracy of the forecasters' power conversion methodologies.

e) Focusing on the Priorities

Given the amount of data, the presentation and analysis of the information generated was considerable. It is recommended that future Working Groups, in cooperation with the consultants, determine the specific questions to be addressed, thus defining the metrics/analyses that will be the focus of the progress and final reports.

1. INTRODUCTION

As part of the Alberta Electric System Operator's (AESO) wind power forecasting pilot project, ORTECH Power (ORTECH) was contracted to provide quantitative analysis services. Wind and power forecast data was provided by three independent wind forecasting firms, referred to herein as Forecaster A, Forecaster B and Forecaster C. The analysis consisted of comparing the predicted data against actual meteorological data and real power data for seven existing Alberta wind power facilities, and actual meteorological data and derived power data for five future Alberta wind power facilities.

Analysis was carried out by examining the available data from each of the forecasters in seven categories, consisting of the entire dataset All Facilities (AF), Existing Facilities (EF), Future Facilities (FF) and four geographic regions. To define the four regions, the available data for the Existing Facilities and the Future Facilities were distributed geographically into the South West (SW) Region, South Central (SC) Region, South East (SE) Region, and a Central Region (CE) (see map in Appendix A). The analysis for the individual facilities was not included in this report due to confidentiality concerns. AESO and the forecasters had access, prior to the issuing of this final report, to all the graphs produced using their datasets which were provided to ORTECH for the purposes of this final report.

This report documents the analysis for the one-year study period, extending from May 1, 2007 to April 30, 2008, inclusive. The final report deviates from the three quarterly progress reports provided previously in that it attempts to draw more insight into what the information is saying, as opposed to just providing the metrics. As well, the report attempts to address the questions posed by the working group, derived after reviewing the three quarterly reports. The questions posed by the working group are as follows:

1. What is the general accuracy of the Forecasts?
2. What is the accuracy of the Forecasts at the different forecast horizons studied (T=1 hour to T=48 hours)?
3. What is the accuracy of the Forecasts at different hours of the day and seasons of the year?
4. What is the accuracy of the Forecasted Meteorological Data before running through the Power Conversion models?
5. What is the accuracy of the Power Conversion?

6. What is the Potential co-variance from given data samples?
7. What is the accuracy of the Forecast at different wind speeds or different points of a Wind Power Facility's power curve?
8. What is the relative comparison between Forecasts?
9. Which is the region with the least amount of error? Which forecaster forecasts best in that region and why?
10. What is the effect of spatial smoothing on forecast error?
11. How well do the forecasts predict fast ramp up and ramp down times, event analysis (CSI)?
12. What is the Impact on data availability?
13. Are there times (day/month/weather pattern) when there is more uncertainty in the forecasts than other times?
14. What is the relationship between the spread of the min/max and the forecast error?
15. What is the correlation factor between all three forecasts? Is this related to the forecast error?

Figures and tables are presented in the report proper as an illustration of the data used in support of addressing the questions. Due to the large amount of information, the complete set of figures and tables used to address each question is presented on a CD accompanying this report. Only power data from the Existing Facilities and the two regions (SW, SC) which contain only existing facilities are presented, since for the Future Facilities one cannot compare predicted power against measured power. For completeness, selected tables and figures for wind speed and power by quarter are presented in Appendices C to E.

The report is presented as follows:
Section 2 describes the data sets provided by GENIVAR (Phoenix Engineering)
Section 3 presents the results and analyses (responses to the questions posed by the working group)
Section 4 presents general observations and recommendations

2. DATA SETS

GENIVAR (Phoenix Engineering) made the following data available and accessible to ORTECH:

a) Actual meteorological data and real power data for seven (7) existing Alberta wind power facilities, and actual meteorological data and calculated wind power data from five (5) future Alberta wind power facilities, from May 1, 2007 to April 30, 2008, inclusive.

b) Forecast meteorological and power datasets for seven (7) existing Alberta wind power facilities, five (5) future facilities and the different Regions (i.e. South West (SW), South Central (SC), South East (SE), Central (CE), Existing-Facilities (EF), Future-Facilities (FF) and All-Facilities (AF)) from three (3) forecasters for May 1, 2007 to April 30, 2008, inclusive, every hour, on the hour.

Both actual and forecasted data gathered for the first, second and third quarter progress reports (i.e. Q1, Q2 and Q3) were used in this final report to cover the May 1, 2007 to July 31, 2007, August 1, 2007 to October 31, 2007 and November 1, 2007 to January 31, 2008 periods, respectively. To complete a full year, ORTECH collected a fourth quarter dataset (February 1, 2008 to April 30, 2008) from GENIVAR (Phoenix Engineering) in the same fashion as the other quarters. Changes to the data made available by GENIVAR (Phoenix Engineering) after the end of each quarter were not incorporated into the datasets used by ORTECH. Also, power data forecasts for Existing-Facilities (EF) (the Region) were evaluated based on eleven rather than twelve months of data, from June 2007 to April 2008, since the forecasted power datasets for the month of May 2007 for Existing-Facilities (EF) for all forecasters were not complete. ORTECH relied solely on the aforementioned data-source and monitored the source to ensure all the data and/or information considered necessary for performing these analyses was available.

The measured meteorological data, i.e. wind speeds at different heights, wind direction, barometric pressure and temperature at each individual site (a total of 12 sites), were retrieved from GENIVAR's (Phoenix Engineering) historical-readings wind server. The naming convention was provided by AESO and the output attained is the ten (10) minute average data, which was then averaged on an hourly basis by ORTECH. Only the hours with an acceptable count (≥ 80%) of valid ten (10) minute data were taken into consideration. ORTECH also converted the time stamps of the retrieved data from Mountain Standard Time (MST) to Universal Time (UTC) in order to perform the analyses against the forecasted data, hence ascertaining an apples-to-apples comparison (a code sketch of this preprocessing follows at the end of this passage).

The measured power data at the existing facilities were also acquired from GENIVAR's (Phoenix Engineering) historical-readings wind server in the same manner as the meteorological data, except for a different naming convention: AESO.CSDR2, which stands for AESO Current Supply and Demand. Power data were obtained at the seven (7) existing facilities for the months of February, March and April (for the 4th quarter). The predicted/calculated power data for the five (5) future sites were provided to ORTECH by GENIVAR (Phoenix Engineering) through their ftp site. Transmission constraint periods for each month at specific sites were provided to ORTECH by AESO. ORTECH applies a -997 code to the data during these periods, which are then rejected when applying the screen-out criteria based on the acceptable ranges listed in Table 2-5 in section 2.1.

Similar to previous quarters, each of the three (3) forecasters provided forecast datasets for each of the twelve (12) sites as well as the four (4) regions (i.e. South-West (SW), South-Central (SC), South-East (SE) and Central (CE)) and for Existing-Facilities (EF), Future-Facilities (FF) and All-Facilities (AF). These datasets were made available to ORTECH through GENIVAR's (Phoenix Engineering) website for the 4th quarter as well: February 1, 2008 to April 30, 2008, inclusive, every hour, on the hour. Each dataset consists of a forecast for forty-eight (48) forecast time horizons, T=1 (one hour ahead) to T=48 (48 hours ahead), divided into two categories: meteorological and power. The meteorological datasets include forecasted wind speed, wind direction, temperature, air pressure and their uncertainty (min/max) and averages at each of the twelve (12) sites. The power datasets include forecasted power and ramp rate and their uncertainty (min/max) at each of the twelve (12) sites and the aforementioned four regions and EF, FF and AF.
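To make the preprocessing concrete, the sketch below shows one way the hourly averaging with the ≥ 80% ten-minute completeness rule and the MST-to-UTC conversion could be implemented. This is a minimal illustration only: the data layout and function names are assumptions for illustration, not ORTECH's actual tooling.

```python
import pandas as pd

def hourly_average(ten_min: pd.Series, min_valid_fraction: float = 0.8) -> pd.Series:
    """Average 10-minute wind speeds to hourly values, keeping only hours in
    which at least 80% of the six possible 10-minute records are valid.
    `ten_min` is assumed to be indexed by MST timestamps, with invalid
    records already set to NaN."""
    grouped = ten_min.resample("1h")
    means = grouped.mean()
    valid_counts = grouped.count()  # NaN records are not counted
    means[valid_counts < min_valid_fraction * 6] = float("nan")
    return means

def mst_to_utc(series: pd.Series) -> pd.Series:
    """Mountain Standard Time is a fixed UTC-7 offset (no daylight saving),
    so the conversion is a constant shift of the time stamps."""
    return series.tz_localize("Etc/GMT+7").tz_convert("UTC")
```

With a fixed-offset zone such as MST, the conversion never changes the spacing of the hourly records, which is what makes the hour-by-hour comparison against the UTC-stamped forecasts straightforward.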

Tables 2-1 to 2-4 provide the monthly and total 10-minute and hourly averaged measured wind speed and measured/calculated power data recovery rates (prior to employing the screen-out criteria detailed in section 2.1) for the 4th quarter (Q4), from February 1, 2008 to April 30, 2008. Similar tables for the 1st quarter (Q1), 2nd quarter (Q2) and 3rd quarter (Q3) are shown in Appendix C. The data recovery rate is defined as the number of valid data records collected versus that possible over the reporting period. The method of calculation is as follows:

Data Recovery Rate (%) = (Data Records Collected / Data Records Possible) x 100

where:

Data Records Collected = Data Records Possible - Number of Invalid Records

As shown in Tables 2-1 and 2-2, the 10-minute and hourly recovery rates for the measured wind speed during the 4th quarter were generally ≥ 97%, except for site 8 and site 10, which returned 10-minute recovery rates of 81.2% and 87.3%, respectively. The low recovery rate at site 8 resulted from a SCADA PC/hardware problem detected during the month of February, hence returning a recovery rate as low as 52.3% for that month. The recovery rates for the power data were also reasonable (≥ 98%) throughout the analysis period of the 4th quarter, as shown in Tables 2-3 and 2-4. A 90% overall recovery rate is normally considered by the industry as the minimum requirement for a dataset to be temporally representative.
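Restated in code, with a small worked example (a sketch only; the February record counts below are hypothetical, chosen to illustrate the arithmetic, not the report's actual counts):

```python
def recovery_rate(records_possible: int, invalid_records: int) -> float:
    """Data recovery rate (%) as defined above:
    collected = possible - invalid, rate = collected / possible * 100."""
    records_collected = records_possible - invalid_records
    return 100.0 * records_collected / records_possible

# Hypothetical example: February 2008 has 29 * 144 = 4176 possible 10-minute
# records; 2000 invalid records would yield a rate near the ~52% cited above.
print(round(recovery_rate(4176, 2000), 1))  # 52.1
```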

Table 2-1: Summary of the 10-Minute Measured Wind Speed Data Recovery Rates for each Site during Q4 (February 1, 2008 0:00 UTC - April 30, 2008 23:00 UTC)

[Table body: 10-minute data recovery rate (%) by site, for February, March, April and All Months; values not reproduced in this transcription.]

Table 2-2: Summary of the Hourly-Averaged Measured Wind Speed Data Recovery Rates for each Site during Q4 (February 1, 2008 0:00 UTC - April 30, 2008 23:00 UTC)

[Table body: hourly data recovery rate (%) by site, for February, March, April and All Months; values not reproduced in this transcription.]

Table 2-3: Summary of the 10-Minute Measured/Calculated Power Data Recovery Rates for each Site during Q4 (February 1, 2008 0:00 UTC - April 30, 2008 23:00 UTC)

[Table body: 10-minute data recovery rate (%) by site, for February, March, April and All Months; values not reproduced in this transcription.]

Table 2-4: Summary of the Hourly-Averaged Measured/Calculated Power Data Recovery Rates for each of the Sites during Q4 (February 1, 2008 0:00 UTC - April 30, 2008 23:00 UTC)

[Table body: hourly data recovery rate (%) by site, for February, March, April and All Months; values not reproduced in this transcription.]

2.1 Data Completeness

ORTECH did not apply standard quality control and quality assurance procedures to the measured or forecasted data retrieved. Instead, screen-out criteria were employed on both datasets following a list of acceptable ranges, approved by AESO, for the different variables, as detailed in Table 2-5. Values lying outside of these ranges were rejected. In order to perform legitimate statistical analyses, a complete set of data is required. Therefore, in the analyses of the four quarters, ORTECH considered only data-sets that are valid (based on the screen-out criteria) and available from all sources (i.e. measured and forecasted from forecasters A, B and C). If data are invalid or missing at a specific hour/period from one of the aforementioned sources, ORTECH rejects that hour/period and does not consider it in the statistical analyses (see the sketch after this subsection's table).

Table 2-5: Summary of Acceptable Ranges of Variables used by ORTECH to Screen-Out Measured/Calculated and Forecasted Wind and Power Data

Variable | Acceptable Range (inclusive)
Wind Speed (m/s) | 0 to 60
Wind Direction (°) | 0 to 360
Power (MW) | -0.1 x site-specific rated capacity to 1.1 x site-specific rated capacity
Surface Temperature (°C) | -60 to 60
Surface Pressure (mbar) | 500 to 1500

The annual recovery rates for the hourly-averaged measured wind-speed and measured/calculated power after applying the screen-out criteria for the reporting period (May 1, 2007 to April 30, 2008) are summarized in Table 2-6. Each of the four (4) regions (SW, SC, SE and CE) includes three existing and/or future wind farms; thus, ORTECH has stacked together the wind speed data from the sites relevant to each region, since it is difficult to generalize and/or normalize the wind speed for each Region. The wind speed recovery rates calculated for those Regions and presented in Table 2-6 are, consequently, calculated based on the stacked data; hence, one should be careful before drawing any conclusions based on these recovery rates.
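A minimal sketch of how the Table 2-5 screen-out criteria and the complete-data-set rule could be applied is shown below. The column names, data layout and the handling of the -997 transmission-constraint code are illustrative assumptions; the numeric limits mirror Table 2-5 as reconstructed above.

```python
import pandas as pd

# Acceptable ranges from Table 2-5 (inclusive); power is handled separately
# because its limits scale with the site-specific rated capacity.
RANGES = {
    "wind_speed": (0.0, 60.0),       # m/s
    "wind_direction": (0.0, 360.0),  # degrees
    "temperature": (-60.0, 60.0),    # deg C
    "pressure": (500.0, 1500.0),     # mbar
}

def screen_out(df: pd.DataFrame, rated_capacity_mw: float) -> pd.DataFrame:
    """Replace out-of-range values (including the -997 transmission-constraint
    code, which falls outside every range) with NaN so they drop out of the
    statistics."""
    out = df.copy()
    for col, (lo, hi) in RANGES.items():
        if col in out:
            out[col] = out[col].where(out[col].between(lo, hi))
    if "power" in out:
        lo, hi = -0.1 * rated_capacity_mw, 1.1 * rated_capacity_mw
        out["power"] = out["power"].where(out["power"].between(lo, hi))
    return out

def complete_case_mask(measured: pd.Series, *forecasts: pd.Series) -> pd.Series:
    """An hour enters the statistics only if the measured data and all of the
    forecasters' data are valid (non-NaN) for that hour."""
    mask = measured.notna()
    for f in forecasts:
        mask &= f.notna()
    return mask
```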

The annual recovery rate (R) for the wind-speed and power complete data-sets used in the statistical analyses at each of the Selected Horizons and at the different Regions is summarized in Tables 2-7 and 2-8, respectively, and is calculated as follows:

R = (Number of hours with available and valid data during the year of analysis) / (Number of hours possible during the year of analysis, i.e. 8,784 hours)

The annual average recovery rates for the complete data-set of hourly-averaged wind-speed were greater than 90% in all Regions except for CE and FF, where the annual average recovery rates returned were 83.6% and 87.2%, respectively, as shown in Table 2-7. These rather low recovery rates are the result of the low recovery rates (illustrated in Table 2-6) returned for the measured wind-speed at some of the future sites, particularly sites 10, 11 and 12, which fall within those regions. The annual recovery rates for the complete data-set of power were greater than 92% for the SW, SC and CE regions, while they were around 90% for the SE, CE and EF, with the lowest for the latter region, which returned an annual recovery rate of 88.1%, as illustrated in Table 2-8. The fairly low annual recovery rate for this region (i.e. EF) resulted from incomplete forecasted power data (from all forecasters) during the month of May 2007. To avoid misleading results, ORTECH considered only eleven (11) months, from June 2007 to April 2008, inclusive, for the Existing Facilities in the analyses of power (summarized in the following sections).

Table 2-6: Summary of the Annual Recovery Rates of Measured Hourly-Averaged Wind-Speed and Measured/Calculated Power Data at the Different Sites and Regions from May 1, 2007 to April 30, 2008

[Table body: recovery rate of hourly wind speed data (%) and recovery rate of hourly power data (%) by site and by region (SW, SC, SE, CE, EF, FF, AF); values not reproduced in this transcription.]

Table 2-7: Summary of the Annual Recovery Rates of Hourly-Averaged Wind-Speed Data-Sets used in the Statistical Analyses at Each of the Selected Horizons and at the Different Regions from May 1, 2007 to April 30, 2008

[Table body: data recovery rate (%) by region (SW, SC, SE, CE, EF, FF, AF) for each Selected Horizon from T=1 to T=48, plus Avg, Min and Max over T=1 to T=48; values not reproduced in this transcription.]

Table 2-8: Summary of the Annual Recovery Rates of Hourly-Averaged Power Data-Sets used in the Statistical Analyses at Each of the Selected Horizons and at the Different Regions from May 1, 2007 to April 30, 2008

[Table body: data recovery rate (%) by region (SW, SC, SE, CE, EF, FF, AF) for each Selected Horizon from T=1 to T=48, plus Avg, Min and Max over T=1 to T=48; values not reproduced in this transcription.]

3. RESULTS AND ANALYSES

The forecast data was analyzed using statistical methods described in the first, second, and third quarterly reports. Emphasis was placed on deriving meaningful and precise values for a metric that characterizes the typical magnitude of error over all hours in the evaluation between the predicted and measured data. Most error measures produce numbers that assess the deviations between predicted and measured values, but it is more meaningful to arrive at a statement such as "at site X, y% of the forecasts are within z of the mean", which requires the correct choice of error measure and some indication of the type of distribution (a small sketch of this computation follows at the end of this passage). To this end, only those statistical parameters that provide some insight into the error characteristics are presented, based on the diagram below.

[Diagram: analysis framework]
- Observations: hourly time-scale; Forecasts: 1 to 48 hour time scales; Meteorological situations: 3 to >8 hour time scales
- Temporal statistics: 1 to 48 hour time horizons and seasonal
- Stratified statistics: season and meteorological situation
- Summary statistics: confidence intervals etc., resolved 1 to 48 hour horizons
- Distinction between forecasters: resolved 1 to 48 hour horizons

The final outcome should attempt to distinguish one forecast methodology from another, resolved to 1 to 48 hour forecast horizons for various regions. The information provided in this report works toward that goal. After reviewing the three quarterly reports, the working group provided a set of specific questions that they wished to have addressed in the final report. The questions listed below are addressed in this section of the final report. In addressing these questions, an extensive amount of statistical analysis was undertaken to try to provide some insight and support for any findings. At the direction of the working group, the emphasis was put on power as opposed to wind. In some of the sections below, metrics for both wind and power are presented; however, following the working group's direction, most sections only show metrics for power. Similarly, in most sections only one metric is shown, which is the Root Mean Square Error (RMSE). All the metrics, figures and tables used in the analysis are attached at the back of the report on a CD.
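As an illustration of the kind of statement quoted above, the fraction of forecasts falling within z of the mean error can be computed directly from an error series; a minimal sketch (names and data layout assumed):

```python
import numpy as np

def fraction_within_z_of_mean(errors: np.ndarray, z: float) -> float:
    """Empirical counterpart of 'y% of the forecasts are within z of the
    mean': the share of errors no further than z from the mean error."""
    return float(np.mean(np.abs(errors - errors.mean()) <= z))
```

If the errors are approximately normally distributed, choosing z equal to one standard deviation of the error should return a value near 0.68, which is the 68%-confidence-interval property used later in section 3.1.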

The questions that are addressed, as provided by the working group, are as follows:

1. What is the general accuracy of the Forecasts?
2. What is the accuracy of the Forecasts at the different forecast horizons studied (T=1 hour to T=48 hours)?
3. What is the accuracy of the Forecasts at different hours of the day and seasons of the year?
4. What is the accuracy of the Forecasted Meteorological Data before running through the Power Conversion models?
5. What is the accuracy of the Power Conversion?
6. What is the Potential co-variance from given data samples?
7. What is the accuracy of the Forecast at different wind speeds or different points of a Wind Power Facility's power curve?
8. What is the relative comparison between Forecasts?
9. Which is the region with the least amount of error? Which forecaster forecasts best in that region and why?
10. What is the effect of spatial smoothing on forecast error?
11. How well do the forecasts predict fast ramp up and ramp down times, event analysis (CSI)?
12. What is the Impact on data availability?
13. Are there times (day/month/weather pattern) when there is more uncertainty in the forecasts than other times?
14. What is the relationship between the spread of the min/max and the forecast error?
15. What is the correlation factor between all three forecasts? Is this related to the forecast error?

Results are presented for the four (4) geographical regions, South-West (SW), South-Central (SC), South East (SE) and Central (CE), as well as for Existing Facilities (EF), Future Facilities (FF) and All Facilities (AF), when wind speed is presented. When power is presented, only the SW, SC and EF are presented, in order to compare forecasted power to measured power.

Wind speed and power forecasts were provided each hour for forty-eight (48) hourly forecast horizons, T=1 (one hour ahead) to T=48 (48 hours ahead). In the following text, Selected Forecast Horizons refers to T=1, T=2, T=3, T=4, T=6, T=12, T=17, T=24, T=36 and T=48, which were defined by the working group. As indicated in section 2, the forecasted power data-set for the month of May 2007 for Existing Facilities for all forecasters was not complete. Therefore, when evaluating the metrics for power for Existing Facilities, only eleven months of data were used (June 2007 - April 2008). The results include those illustrations that were deemed relevant, as mentioned above. A large number of additional illustrations can be generated from the statistical analysis, but were not included for practical reasons. They are available on the CD accompanying the report.

3.1 What is the General Accuracy of the Forecasts?

MAE and RMSE Wind Speed

Each of the four regions (SW, SC, SE, and CE) includes three existing and/or future facilities; thus, it is difficult to generalize and/or normalize the wind speed for each region. Therefore, the assessment of the forecast errors for the wind speed was processed for each region without normalization, i.e., the predicted and measured pairs at the individual sites were simply stacked together for the relevant regions. What is meant by stacking is grouping the wind data collected from the sites in a particular region, thereby considering them as one long data set. The objective of such a grouping is to derive regional error statistics for wind data without doing any averaging. The procedure can be explained through an example as follows. The South West region contains three meteorological masts; therefore, three sets of wind data time series are available. These three data sets are combined together irrespective of dates/times. The error statistics are then calculated using the same method as for an individual site. With this approach, averaged wind speed data from any individual time series are not derived.
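In code, the stacking described above amounts to concatenating the per-site predicted/measured pairs before computing the statistics; a minimal sketch under an assumed data layout (not ORTECH's actual implementation):

```python
import numpy as np

def stacked_region_errors(site_pairs):
    """site_pairs: list of (predicted, measured) NumPy array pairs, one per
    site in the region, already restricted to complete-case hours. The pairs
    are concatenated into one long series, so the regional MAE and RMSE are
    computed without averaging across sites."""
    pred = np.concatenate([p for p, _ in site_pairs])
    meas = np.concatenate([m for _, m in site_pairs])
    err = pred - meas
    mae = float(np.mean(np.abs(err)))
    rmse = float(np.sqrt(np.mean(err ** 2)))
    return mae, rmse
```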

In looking at the general accuracy of the wind speed as predicted by the forecasters in all regions and at all time horizons, the Mean Absolute Error (MAE) and the Root Mean Square Error (RMSE) were the specific metrics used. In the case of the RMSE, it was decomposed into its component parts, the bias and the variance of the error.

The RMSE connects the important statistical quantities of the two time series. It is composed of three different components, which contribute to the RMSE originating from different effects. The bias accounts for the difference between the mean values of forecast and measurement. The standard deviation of the error (sde) measures the fluctuations of the error around its mean. The sde is very useful as it directly provides the 68%-confidence interval if the errors are normally distributed. In the context of comparing forecast and measurement, the sde has two contributions. The first is the sdbias, i.e. the difference between the standard deviations of predicted and measured values, which evaluates errors due to wrongly forecasted variability. This, together with the bias, is an indicator of amplitude errors. The second contribution is the dispersion (disp), which involves the cross-correlation coefficient (r) weighted with the standard deviations of both time series. Thus, disp accounts for the contribution of phase errors to the RMSE.

The amplitude and phase errors are illustrated in Figure 3-1, which shows a four-day comparison of the time series of measured and predicted wind speeds at one wind farm. The amplitude error is clearly seen at the 17th hour on January 14, 2008. The phase error is more pronounced at the 20th hour on January 12, 2008 and the 9th hour on January 13, 2008.

Figure 3-1: Comparison of the time series of measured and predicted wind speeds at first forecast horizon (T=1hr) by an anonymous forecaster at a particular wind farm (name is not disclosed). Note examples are circled.

[Plot: wind speed (m/s) vs. date, January 12-16, 2008; measured and predicted curves, with the amplitude error and phase error examples circled.]

Figure 3-2 and Figure 3-3 show the annual MAE and RMSE, respectively, by region and by forecast horizon for each forecaster. For completeness, the quarterly MAE and RMSE are presented in Appendix D.

Figure 3-2: Annual Mean Absolute Error (MAE) of wind speed (WS) predictions (Pred) in South West (SW), South Central (SC), South East (SE), Central (CE), Existing Facilities (EF), Future Facilities (FF) and All Facilities (AF) by three forecasters A, B and C as a function of forecast horizons.

[Three panels (Forecaster A, Forecaster B, Forecaster C): MAE of WS Pred. (m/s) vs. forecast horizon, with curves for SW, SC, SE, CE, EF, FF and AF.]

Figure 3-3: Annual Root Mean Square Error (RMSE) of wind speed (WS) predictions (Pred.) in South West (SW), South Central (SC), South East (SE), Central (CE), Existing Facilities (EF), Future Facilities (FF) and All facilities (AF) by three forecasters A, B and C as a function of forecast horizons.

[Three panels (Forecaster A, Forecaster B, Forecaster C): RMSE of WS Pred. (m/s) vs. forecast horizon, with curves for SW, SC, SE, CE, EF, FF and AF.]

Referring to Figures 3-2 and 3-3, the range of the annual values over the three forecasters is from 1.4 to 3.5 m/s for MAE and from 1.9 to 4.6 m/s for RMSE. It should be noted that Forecasters A and C show a rapid increase in MAE and RMSE from time horizon T=1 to time horizon T=6 before flattening out, while Forecaster B remains relatively flat over all horizons. This difference between Forecasters A and C and Forecaster B is probably due to the forecast methodologies used. For all forecasters, the ordering of the regions for both MAE and RMSE is the same, i.e. the largest MAE and RMSE values are in the SW region and the lowest values are in the Central (CE) region, with the other regions falling in between. There are possibly three reasons why the SW region has the largest values and the CE region has the smallest values. First, as shown on the map of Alberta in Appendix A, the SW region's topography is more complex than the CE region's, which would indicate that it is more difficult to forecast in the SW region. Secondly, the SW region is much smaller relative to the CE region, and thus the individual sites are closer together than in the CE region, which again may have an effect on forecasting through topography and spatial smoothing. Finally, the winds are on average lighter in the CE region than in the SW region, which can influence the statistic.

Decomposition of RMSE

Figure 3-4 shows the decomposition of the RMSE for the three forecasters. For all forecasters, the dispersion (disp) component dominates the RMSE in all regions over all forecast horizons. The disp accounts for the contribution of phase errors to the RMSE; thus, the largest error in the forecasts is due to phase errors. It should be noted that the different regions show a consistent behaviour for disp, suggesting a similar property affects all regions in a similar manner. After time horizon T=6 there appears to be a linear increase of dispersion with prediction horizon, indicating a systematic growth in the average phase errors between prediction and measurement with increasing lead time. The second contributor to the RMSE appears to be sdbias, the absolute value of which increases steeply for Forecasters A and C until forecast horizon T=6 and then levels off with increasing lead time. Another common feature is the negative sdbias, which indicates the variability of the predicted wind speed is smaller than the measured wind speed variability. Model fidelity, as would be expected, limits the forecasters' ability to predict the finer detail of the wind speed.
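Written out, the decomposition used in this section takes the standard form found in the wind-forecast verification literature; with p and m denoting the predicted and measured series, σ_p and σ_m their standard deviations, and r their cross-correlation coefficient, a reconstruction consistent with the definitions given above (not a formula reproduced verbatim from the report) is:

```latex
\mathrm{RMSE}^{2} = \mathrm{bias}^{2} + \mathrm{sde}^{2},
\qquad \mathrm{bias} = \bar{p} - \bar{m}

\mathrm{sde}^{2} = \mathrm{sdbias}^{2} + \mathrm{disp}^{2},
\qquad \mathrm{sdbias} = \sigma_{p} - \sigma_{m},
\qquad \mathrm{disp} = \sqrt{2\,\sigma_{p}\,\sigma_{m}\,(1 - r)}
```

Only disp contains the correlation coefficient r, which is why it isolates the contribution of phase (timing) errors, while bias and sdbias together capture amplitude errors; a negative sdbias means the forecast is smoother than the measurement, consistent with the behaviour noted above.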


NCSS Statistical Software Principal Components Regression. In ordinary least squares, the regression coefficients are estimated using the formula ( ) Chapter 340 Principal Components Regression Introduction is a technique for analyzing multiple regression data that suffer from multicollinearity. When multicollinearity occurs, least squares estimates

More information

Solar Input Data for PV Energy Modeling

Solar Input Data for PV Energy Modeling June 2012 Solar Input Data for PV Energy Modeling Marie Schnitzer, Christopher Thuman, Peter Johnson Albany New York, USA Barcelona Spain Bangalore India Company Snapshot Established in 1983; nearly 30

More information

( ) ( ) Central Tendency. Central Tendency

( ) ( ) Central Tendency. Central Tendency 1 Central Tendency CENTRAL TENDENCY: A statistical measure that identifies a single score that is most typical or representative of the entire group Usually, a value that reflects the middle of the distribution

More information

STATS8: Introduction to Biostatistics. Data Exploration. Babak Shahbaba Department of Statistics, UCI

STATS8: Introduction to Biostatistics. Data Exploration. Babak Shahbaba Department of Statistics, UCI STATS8: Introduction to Biostatistics Data Exploration Babak Shahbaba Department of Statistics, UCI Introduction After clearly defining the scientific problem, selecting a set of representative members

More information

A Quantitative Approach to Commercial Damages. Applying Statistics to the Measurement of Lost Profits + Website

A Quantitative Approach to Commercial Damages. Applying Statistics to the Measurement of Lost Profits + Website Brochure More information from http://www.researchandmarkets.com/reports/2212877/ A Quantitative Approach to Commercial Damages. Applying Statistics to the Measurement of Lost Profits + Website Description:

More information

Data Analysis: Describing Data - Descriptive Statistics

Data Analysis: Describing Data - Descriptive Statistics WHAT IT IS Return to Table of ontents Descriptive statistics include the numbers, tables, charts, and graphs used to describe, organize, summarize, and present raw data. Descriptive statistics are most

More information

Statistical Rules of Thumb

Statistical Rules of Thumb Statistical Rules of Thumb Second Edition Gerald van Belle University of Washington Department of Biostatistics and Department of Environmental and Occupational Health Sciences Seattle, WA WILEY AJOHN

More information

Fitch CDS Pricing Consensus Curve Methodology

Fitch CDS Pricing Consensus Curve Methodology Fitch CDS Pricing Consensus Curve Methodology Overview The Fitch CDS Pricing Service aims to provide greater transparency within the global Credit Derivative market. Fitch CDS Pricing is an industry-leading

More information

16 : Demand Forecasting

16 : Demand Forecasting 16 : Demand Forecasting 1 Session Outline Demand Forecasting Subjective methods can be used only when past data is not available. When past data is available, it is advisable that firms should use statistical

More information

Solar and PV forecasting in Canada

Solar and PV forecasting in Canada Solar and PV forecasting in Canada Sophie Pelland, CanmetENERGY IESO Wind Power Standing Committee meeting Toronto, September 23, 2010 Presentation Plan Introduction How are PV forecasts generated? Solar

More information

Public Service Co. of New Mexico (PNM) - Smoothing and Peak Shifting. DOE Peer Review Steve Willard, P.E. September 26, 2012

Public Service Co. of New Mexico (PNM) - Smoothing and Peak Shifting. DOE Peer Review Steve Willard, P.E. September 26, 2012 Public Service Co. of New Mexico (PNM) - PV Plus Storage for Simultaneous Voltage Smoothing and Peak Shifting DOE Peer Review Steve Willard, P.E. September 26, 2012 Project Goals Develop an even more Beneficial

More information

Fairfield Public Schools

Fairfield Public Schools Mathematics Fairfield Public Schools AP Statistics AP Statistics BOE Approved 04/08/2014 1 AP STATISTICS Critical Areas of Focus AP Statistics is a rigorous course that offers advanced students an opportunity

More information

Solar radiation data validation

Solar radiation data validation Loughborough University Institutional Repository Solar radiation data validation This item was submitted to Loughborough University's Institutional Repository by the/an author. Citation: MCKENNA, E., 2009.

More information

REDUCING UNCERTAINTY IN SOLAR ENERGY ESTIMATES

REDUCING UNCERTAINTY IN SOLAR ENERGY ESTIMATES REDUCING UNCERTAINTY IN SOLAR ENERGY ESTIMATES Mitigating Energy Risk through On-Site Monitoring Marie Schnitzer, Vice President of Consulting Services Christopher Thuman, Senior Meteorologist Peter Johnson,

More information

Cross Validation. Dr. Thomas Jensen Expedia.com

Cross Validation. Dr. Thomas Jensen Expedia.com Cross Validation Dr. Thomas Jensen Expedia.com About Me PhD from ETH Used to be a statistician at Link, now Senior Business Analyst at Expedia Manage a database with 720,000 Hotels that are not on contract

More information

Calibration of the MASS time constant by simulation

Calibration of the MASS time constant by simulation Calibration of the MASS time constant by simulation A. Tokovinin Version 1.1. July 29, 2009 file: prj/atm/mass/theory/doc/timeconstnew.tex 1 Introduction The adaptive optics atmospheric time constant τ

More information

Time series Forecasting using Holt-Winters Exponential Smoothing

Time series Forecasting using Holt-Winters Exponential Smoothing Time series Forecasting using Holt-Winters Exponential Smoothing Prajakta S. Kalekar(04329008) Kanwal Rekhi School of Information Technology Under the guidance of Prof. Bernard December 6, 2004 Abstract

More information

Evaluation of Quantitative Data (errors/statistical analysis/propagation of error)

Evaluation of Quantitative Data (errors/statistical analysis/propagation of error) Evaluation of Quantitative Data (errors/statistical analysis/propagation of error) 1. INTRODUCTION Laboratory work in chemistry can be divided into the general categories of qualitative studies and quantitative

More information

Global Seasonal Phase Lag between Solar Heating and Surface Temperature

Global Seasonal Phase Lag between Solar Heating and Surface Temperature Global Seasonal Phase Lag between Solar Heating and Surface Temperature Summer REU Program Professor Tom Witten By Abstract There is a seasonal phase lag between solar heating from the sun and the surface

More information

Demand Estimation Technical Forum

Demand Estimation Technical Forum Demand Estimation Technical Forum 2 nd nd June 2008 Agenda Overview of Demand Estimation & Timetable Presentation of Current Completed Analysis Modelling Basis Small NDM sample details, aggregations, initial

More information

Statistics Review PSY379

Statistics Review PSY379 Statistics Review PSY379 Basic concepts Measurement scales Populations vs. samples Continuous vs. discrete variable Independent vs. dependent variable Descriptive vs. inferential stats Common analyses

More information

APPENDIX E THE ASSESSMENT PHASE OF THE DATA LIFE CYCLE

APPENDIX E THE ASSESSMENT PHASE OF THE DATA LIFE CYCLE APPENDIX E THE ASSESSMENT PHASE OF THE DATA LIFE CYCLE The assessment phase of the Data Life Cycle includes verification and validation of the survey data and assessment of quality of the data. Data verification

More information

Discussion Paper On the validation and review of Credit Rating Agencies methodologies

Discussion Paper On the validation and review of Credit Rating Agencies methodologies Discussion Paper On the validation and review of Credit Rating Agencies methodologies 17 November 2015 ESMA/2015/1735 Responding to this paper The European Securities and Markets Authority (ESMA) invites

More information

RELEVANT TO ACCA QUALIFICATION PAPER P3. Studying Paper P3? Performance objectives 7, 8 and 9 are relevant to this exam

RELEVANT TO ACCA QUALIFICATION PAPER P3. Studying Paper P3? Performance objectives 7, 8 and 9 are relevant to this exam RELEVANT TO ACCA QUALIFICATION PAPER P3 Studying Paper P3? Performance objectives 7, 8 and 9 are relevant to this exam Business forecasting and strategic planning Quantitative data has always been supplied

More information

Simple Linear Regression Inference

Simple Linear Regression Inference Simple Linear Regression Inference 1 Inference requirements The Normality assumption of the stochastic term e is needed for inference even if it is not a OLS requirement. Therefore we have: Interpretation

More information

Wind resources map of Spain at mesoscale. Methodology and validation

Wind resources map of Spain at mesoscale. Methodology and validation Wind resources map of Spain at mesoscale. Methodology and validation Martín Gastón Edurne Pascal Laura Frías Ignacio Martí Uxue Irigoyen Elena Cantero Sergio Lozano Yolanda Loureiro e-mail:mgaston@cener.com

More information

ASSURING THE QUALITY OF TEST RESULTS

ASSURING THE QUALITY OF TEST RESULTS Page 1 of 12 Sections Included in this Document and Change History 1. Purpose 2. Scope 3. Responsibilities 4. Background 5. References 6. Procedure/(6. B changed Division of Field Science and DFS to Office

More information

Measurement Information Model

Measurement Information Model mcgarry02.qxd 9/7/01 1:27 PM Page 13 2 Information Model This chapter describes one of the fundamental measurement concepts of Practical Software, the Information Model. The Information Model provides

More information

Algebra 1 2008. Academic Content Standards Grade Eight and Grade Nine Ohio. Grade Eight. Number, Number Sense and Operations Standard

Algebra 1 2008. Academic Content Standards Grade Eight and Grade Nine Ohio. Grade Eight. Number, Number Sense and Operations Standard Academic Content Standards Grade Eight and Grade Nine Ohio Algebra 1 2008 Grade Eight STANDARDS Number, Number Sense and Operations Standard Number and Number Systems 1. Use scientific notation to express

More information

Validation and Calibration. Definitions and Terminology

Validation and Calibration. Definitions and Terminology Validation and Calibration Definitions and Terminology ACCEPTANCE CRITERIA: The specifications and acceptance/rejection criteria, such as acceptable quality level and unacceptable quality level, with an

More information

Algebra 1 Course Information

Algebra 1 Course Information Course Information Course Description: Students will study patterns, relations, and functions, and focus on the use of mathematical models to understand and analyze quantitative relationships. Through

More information

Influence of Solar Radiation Models in the Calibration of Building Simulation Models

Influence of Solar Radiation Models in the Calibration of Building Simulation Models Influence of Solar Radiation Models in the Calibration of Building Simulation Models J.K. Copper, A.B. Sproul 1 1 School of Photovoltaics and Renewable Energy Engineering, University of New South Wales,

More information

Collaborative Forecasting

Collaborative Forecasting Collaborative Forecasting By Harpal Singh What is Collaborative Forecasting? Collaborative forecasting is the process for collecting and reconciling the information from diverse sources inside and outside

More information

Analysis of Bayesian Dynamic Linear Models

Analysis of Bayesian Dynamic Linear Models Analysis of Bayesian Dynamic Linear Models Emily M. Casleton December 17, 2010 1 Introduction The main purpose of this project is to explore the Bayesian analysis of Dynamic Linear Models (DLMs). The main

More information

Section A. Index. Section A. Planning, Budgeting and Forecasting Section A.2 Forecasting techniques... 1. Page 1 of 11. EduPristine CMA - Part I

Section A. Index. Section A. Planning, Budgeting and Forecasting Section A.2 Forecasting techniques... 1. Page 1 of 11. EduPristine CMA - Part I Index Section A. Planning, Budgeting and Forecasting Section A.2 Forecasting techniques... 1 EduPristine CMA - Part I Page 1 of 11 Section A. Planning, Budgeting and Forecasting Section A.2 Forecasting

More information

Short-Term Forecasting in Retail Energy Markets

Short-Term Forecasting in Retail Energy Markets Itron White Paper Energy Forecasting Short-Term Forecasting in Retail Energy Markets Frank A. Monforte, Ph.D Director, Itron Forecasting 2006, Itron Inc. All rights reserved. 1 Introduction 4 Forecasting

More information

INTERNATIONAL STANDARD ON AUDITING 200 OBJECTIVE AND GENERAL PRINCIPLES GOVERNING AN AUDIT OF FINANCIAL STATEMENTS CONTENTS

INTERNATIONAL STANDARD ON AUDITING 200 OBJECTIVE AND GENERAL PRINCIPLES GOVERNING AN AUDIT OF FINANCIAL STATEMENTS CONTENTS INTERNATIONAL STANDARD ON AUDITING 200 OBJECTIVE AND GENERAL PRINCIPLES GOVERNING (Effective for audits of financial statements for periods beginning on or after December 15, 2005. The Appendix contains

More information

2. Simple Linear Regression

2. Simple Linear Regression Research methods - II 3 2. Simple Linear Regression Simple linear regression is a technique in parametric statistics that is commonly used for analyzing mean response of a variable Y which changes according

More information

Medical Information Management & Mining. You Chen Jan,15, 2013 You.chen@vanderbilt.edu

Medical Information Management & Mining. You Chen Jan,15, 2013 You.chen@vanderbilt.edu Medical Information Management & Mining You Chen Jan,15, 2013 You.chen@vanderbilt.edu 1 Trees Building Materials Trees cannot be used to build a house directly. How can we transform trees to building materials?

More information

Review of Transpower s. electricity demand. forecasting methods. Professor Rob J Hyndman. B.Sc. (Hons), Ph.D., A.Stat. Contact details: Report for

Review of Transpower s. electricity demand. forecasting methods. Professor Rob J Hyndman. B.Sc. (Hons), Ph.D., A.Stat. Contact details: Report for Review of Transpower s electricity demand forecasting methods Professor Rob J Hyndman B.Sc. (Hons), Ph.D., A.Stat. Contact details: Telephone: 0458 903 204 Email: robjhyndman@gmail.com Web: robjhyndman.com

More information

Climate and Weather. This document explains where we obtain weather and climate data and how we incorporate it into metrics:

Climate and Weather. This document explains where we obtain weather and climate data and how we incorporate it into metrics: OVERVIEW Climate and Weather The climate of the area where your property is located and the annual fluctuations you experience in weather conditions can affect how much energy you need to operate your

More information

Exponential Smoothing with Trend. As we move toward medium-range forecasts, trend becomes more important.

Exponential Smoothing with Trend. As we move toward medium-range forecasts, trend becomes more important. Exponential Smoothing with Trend As we move toward medium-range forecasts, trend becomes more important. Incorporating a trend component into exponentially smoothed forecasts is called double exponential

More information

I. Introduction. II. Background. KEY WORDS: Time series forecasting, Structural Models, CPS

I. Introduction. II. Background. KEY WORDS: Time series forecasting, Structural Models, CPS Predicting the National Unemployment Rate that the "Old" CPS Would Have Produced Richard Tiller and Michael Welch, Bureau of Labor Statistics Richard Tiller, Bureau of Labor Statistics, Room 4985, 2 Mass.

More information

Standard Deviation Calculator

Standard Deviation Calculator CSS.com Chapter 35 Standard Deviation Calculator Introduction The is a tool to calculate the standard deviation from the data, the standard error, the range, percentiles, the COV, confidence limits, or

More information

Quantitative Research Methods II. Vera E. Troeger Office: Office Hours: by appointment

Quantitative Research Methods II. Vera E. Troeger Office: Office Hours: by appointment Quantitative Research Methods II Vera E. Troeger Office: 0.67 E-mail: v.e.troeger@warwick.ac.uk Office Hours: by appointment Quantitative Data Analysis Descriptive statistics: description of central variables

More information

Current Standard: Mathematical Concepts and Applications Shape, Space, and Measurement- Primary

Current Standard: Mathematical Concepts and Applications Shape, Space, and Measurement- Primary Shape, Space, and Measurement- Primary A student shall apply concepts of shape, space, and measurement to solve problems involving two- and three-dimensional shapes by demonstrating an understanding of:

More information

Software Metrics & Software Metrology. Alain Abran. Chapter 4 Quantification and Measurement are Not the Same!

Software Metrics & Software Metrology. Alain Abran. Chapter 4 Quantification and Measurement are Not the Same! Software Metrics & Software Metrology Alain Abran Chapter 4 Quantification and Measurement are Not the Same! 1 Agenda This chapter covers: The difference between a number & an analysis model. The Measurement

More information

Schools Value-added Information System Technical Manual

Schools Value-added Information System Technical Manual Schools Value-added Information System Technical Manual Quality Assurance & School-based Support Division Education Bureau 2015 Contents Unit 1 Overview... 1 Unit 2 The Concept of VA... 2 Unit 3 Control

More information

CLOUD COVER IMPACT ON PHOTOVOLTAIC POWER PRODUCTION IN SOUTH AFRICA

CLOUD COVER IMPACT ON PHOTOVOLTAIC POWER PRODUCTION IN SOUTH AFRICA CLOUD COVER IMPACT ON PHOTOVOLTAIC POWER PRODUCTION IN SOUTH AFRICA Marcel Suri 1, Tomas Cebecauer 1, Artur Skoczek 1, Ronald Marais 2, Crescent Mushwana 2, Josh Reinecke 3 and Riaan Meyer 4 1 GeoModel

More information

AMVs at the Met Office: activities to improve their impact in NWP

AMVs at the Met Office: activities to improve their impact in NWP AMVs at the Met Office: activities to improve their impact in NWP James Cotton, Mary Forsythe Met Office, FitzRoy Road, Exeter EX1 3PB, UK Abstract Atmospheric motion vectors (AMVs) are an important source

More information

Forecasting Business Investment Using the Capital Expenditure Survey

Forecasting Business Investment Using the Capital Expenditure Survey Forecasting Business Investment Using the Capital Expenditure Survey Natasha Cassidy, Emma Doherty and Troy Gill* Business investment is a key driver of economic growth and is currently around record highs

More information

SOFTWARE FOR THE OPTIMAL ALLOCATION OF EV CHARGERS INTO THE POWER DISTRIBUTION GRID

SOFTWARE FOR THE OPTIMAL ALLOCATION OF EV CHARGERS INTO THE POWER DISTRIBUTION GRID SOFTWARE FOR THE OPTIMAL ALLOCATION OF EV CHARGERS INTO THE POWER DISTRIBUTION GRID Amparo MOCHOLÍ MUNERA, Carlos BLASCO LLOPIS, Irene AGUADO CORTEZÓN, Vicente FUSTER ROIG Instituto Tecnológico de la Energía

More information

M & V Guidelines for HUD Energy Performance Contracts Guidance for ESCo-Developed Projects 1/21/2011

M & V Guidelines for HUD Energy Performance Contracts Guidance for ESCo-Developed Projects 1/21/2011 M & V Guidelines for HUD Energy Performance Contracts Guidance for ESCo-Developed Projects 1/21/2011 1) Purpose of the HUD M&V Guide This document contains the procedures and guidelines for quantifying

More information

American Association for Laboratory Accreditation

American Association for Laboratory Accreditation Page 1 of 12 The examples provided are intended to demonstrate ways to implement the A2LA policies for the estimation of measurement uncertainty for methods that use counting for determining the number

More information

Integrated Resource Plan

Integrated Resource Plan Integrated Resource Plan March 19, 2004 PREPARED FOR KAUA I ISLAND UTILITY COOPERATIVE LCG Consulting 4962 El Camino Real, Suite 112 Los Altos, CA 94022 650-962-9670 1 IRP 1 ELECTRIC LOAD FORECASTING 1.1

More information

2016 ERCOT System Planning Long-Term Hourly Peak Demand and Energy Forecast December 31, 2015

2016 ERCOT System Planning Long-Term Hourly Peak Demand and Energy Forecast December 31, 2015 2016 ERCOT System Planning Long-Term Hourly Peak Demand and Energy Forecast December 31, 2015 2015 Electric Reliability Council of Texas, Inc. All rights reserved. Long-Term Hourly Peak Demand and Energy

More information

USING SIMULATED WIND DATA FROM A MESOSCALE MODEL IN MCP. M. Taylor J. Freedman K. Waight M. Brower

USING SIMULATED WIND DATA FROM A MESOSCALE MODEL IN MCP. M. Taylor J. Freedman K. Waight M. Brower USING SIMULATED WIND DATA FROM A MESOSCALE MODEL IN MCP M. Taylor J. Freedman K. Waight M. Brower Page 2 ABSTRACT Since field measurement campaigns for proposed wind projects typically last no more than

More information

Forecasting Methods. What is forecasting? Why is forecasting important? How can we evaluate a future demand? How do we make mistakes?

Forecasting Methods. What is forecasting? Why is forecasting important? How can we evaluate a future demand? How do we make mistakes? Forecasting Methods What is forecasting? Why is forecasting important? How can we evaluate a future demand? How do we make mistakes? Prod - Forecasting Methods Contents. FRAMEWORK OF PLANNING DECISIONS....

More information

Forecasting the first step in planning. Estimating the future demand for products and services and the necessary resources to produce these outputs

Forecasting the first step in planning. Estimating the future demand for products and services and the necessary resources to produce these outputs PRODUCTION PLANNING AND CONTROL CHAPTER 2: FORECASTING Forecasting the first step in planning. Estimating the future demand for products and services and the necessary resources to produce these outputs

More information

EXPERIMENTAL ERROR AND DATA ANALYSIS

EXPERIMENTAL ERROR AND DATA ANALYSIS EXPERIMENTAL ERROR AND DATA ANALYSIS 1. INTRODUCTION: Laboratory experiments involve taking measurements of physical quantities. No measurement of any physical quantity is ever perfectly accurate, except

More information

Actuarial Standard of Practice No. 23. Data Quality. Revised Edition

Actuarial Standard of Practice No. 23. Data Quality. Revised Edition Actuarial Standard of Practice No. 23 Data Quality Revised Edition Developed by the General Committee of the Actuarial Standards Board and Applies to All Practice Areas Adopted by the Actuarial Standards

More information

Algorithmic Trading Session 1 Introduction. Oliver Steinki, CFA, FRM

Algorithmic Trading Session 1 Introduction. Oliver Steinki, CFA, FRM Algorithmic Trading Session 1 Introduction Oliver Steinki, CFA, FRM Outline An Introduction to Algorithmic Trading Definition, Research Areas, Relevance and Applications General Trading Overview Goals

More information

Measuring the response of students to assessment: the Assessment Experience Questionnaire

Measuring the response of students to assessment: the Assessment Experience Questionnaire 11 th Improving Student Learning Symposium, 2003 Measuring the response of students to assessment: the Assessment Experience Questionnaire Graham Gibbs and Claire Simpson, Open University Abstract A review

More information

CALL VOLUME FORECASTING FOR SERVICE DESKS

CALL VOLUME FORECASTING FOR SERVICE DESKS CALL VOLUME FORECASTING FOR SERVICE DESKS Krishna Murthy Dasari Satyam Computer Services Ltd. This paper discusses the practical role of forecasting for Service Desk call volumes. Although there are many

More information

Successful Implementation of an Alternative Co-located Transfer Standard Audit Approach: Continuous Deployment of CTS Wind Sensors on a Tall Tower

Successful Implementation of an Alternative Co-located Transfer Standard Audit Approach: Continuous Deployment of CTS Wind Sensors on a Tall Tower Successful Implementation of an Alternative Co-located Transfer Standard Audit Approach: Continuous Deployment of CTS Wind Sensors on a Tall Tower Kirk Stopenhagen Vorticity Consulting LLC Redmond, WA

More information

Characterizing Digital Cameras with the Photon Transfer Curve

Characterizing Digital Cameras with the Photon Transfer Curve Characterizing Digital Cameras with the Photon Transfer Curve By: David Gardner Summit Imaging (All rights reserved) Introduction Purchasing a camera for high performance imaging applications is frequently

More information

Visualizing of Berkeley Earth, NASA GISS, and Hadley CRU averaging techniques

Visualizing of Berkeley Earth, NASA GISS, and Hadley CRU averaging techniques Visualizing of Berkeley Earth, NASA GISS, and Hadley CRU averaging techniques Robert Rohde Lead Scientist, Berkeley Earth Surface Temperature 1/15/2013 Abstract This document will provide a simple illustration

More information

Standard Deviation Estimator

Standard Deviation Estimator CSS.com Chapter 905 Standard Deviation Estimator Introduction Even though it is not of primary interest, an estimate of the standard deviation (SD) is needed when calculating the power or sample size of

More information

F. Farrokhyar, MPhil, PhD, PDoc

F. Farrokhyar, MPhil, PhD, PDoc Learning objectives Descriptive Statistics F. Farrokhyar, MPhil, PhD, PDoc To recognize different types of variables To learn how to appropriately explore your data How to display data using graphs How

More information