Branko Kosović, Sue Ellen Haupt, Gerry Wiener, Luca Delle Monache, Yubao Liu, Marcia Politovich, Jenny Sun, John Williams*, Daniel Adriaansen, Stefano Alessandrini, Susan Dettling, and Seth Linden (NCAR, *WSI)
ICEM 2015, Boulder, Colorado, June 22-26, 2015
Xcel Energy Wind Prediction Project
3.4 million customers; annual revenue $11B
Xcel Energy Wind Forecasting Project: Forecast Requirements
15-30 minute forecasts (emergency ramp adjustments)
1-3 hour forecasts (anticipate upcoming ramp adjustments)
24 hour forecasts (energy trading and planning)
3-5 day forecasts (long-term trading & resource planning)
Xcel Energy needed power forecasts for 55 connection nodes representing 94 wind farms with 3283 turbines, totaling 4000 MW of generation capability. But the required forecasts need to be for POWER, not wind speed!
Variable Energy Forecasting System
[System diagram] Inputs: NCEP data (NAM, GFS, RUC), GEM (Canada), WRF RTFDDA system and ensemble system output, supplemental wind farm data (met towers, wind profiler, surface stations, Windcube lidar), and wind farm data (nacelle wind speed, generator power, node power, met tower, availability). Core components: Dynamic Integrated Forecast System (DICast), VDRAS (nowcasting), expert system (nowcasting), wind-to-energy conversion subsystem, probabilistic and analog forecasts, potential power forecasting, data mining for load estimation, solar energy forecast, and extreme weather events. Outputs: statistical verification, CSV data, operator GUI, and meteorologist GUI.
DICast System Blends Output from Several Numerical Weather Prediction Models
[Figure: Public Service of Northwest Texas area total power, capacity (%) vs. time, showing a ramp on 03/14]
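DICast's operational weighting of member models is tuned dynamically and is more sophisticated than shown here; purely as an illustration of blending several NWP solutions, a minimal inverse-error weighting (all names hypothetical) might look like:

```python
def blend_forecasts(forecasts, recent_mae, eps=1e-6):
    """Blend several NWP model forecasts of the same quantity, weighting
    each model inversely by its recent mean absolute error.  This is an
    illustrative stand-in for DICast's actual dynamic weighting scheme."""
    weights = {m: 1.0 / (recent_mae[m] + eps) for m in forecasts}
    total = sum(weights.values())
    return sum(forecasts[m] * weights[m] for m in forecasts) / total
```

A model whose recent errors are half as large receives twice the weight, so the blend tracks whichever members have been performing best lately.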
Wind Power Forecasts Resulted in Savings for Ratepayers
Forecast MAE:  2009: 16.83%   2014*: 10.10%   Improvement: 40%
Savings: $49,000,000
*Data through November 2014
Also: saved > 267,343 tons CO2 (2014)
(Drake Bartlett, Xcel)
Icing Forecasting System: ExWx Provides a Categorical Forecast of Icing
Predicting wind turbine icing is critical for power trading on the open market and for short-term load balancing. To develop a robust wind turbine icing forecasting system, a truth dataset must first be built. Limited documentation of icing events and limited monitoring equipment make identifying icing after the fact difficult. In addition, there is a Big Data problem.
Datasets for the Icing Forecast
Primary: power data, sensor data
Secondary: DICast data, NWS METAR data
ExWx Uses WRF-RTFDDA and DICast Blended NWP Output to Compute Icing Potential
WRF icing potential:
Evaluates all WRF model levels < 1 km
Combines model-level height, model-predicted supercooled liquid water, and temperature at each level using fuzzy-logic maps (configurable)
Final potential at each WRF grid point is the maximum of the icing potential over all levels < 1 km
DICast icing potential:
Conditional probability of icing (CPOI) deterministic forecast from DICast
Combines five NWP model solutions
Typically one site per farm, more in some cases
UCAR Confidential and Proprietary. 2015, University Corporation for Atmospheric Research. All rights reserved.
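The per-level fuzzy-logic combination and the maximum over levels below 1 km can be sketched as follows; the membership-map breakpoints below are hypothetical stand-ins for the configurable operational maps:

```python
def interest(x, pts):
    """Piecewise-linear fuzzy membership map: pts is a sorted list of
    (value, interest) breakpoints; values outside the range clamp to
    the end interests."""
    if x <= pts[0][0]:
        return pts[0][1]
    if x >= pts[-1][0]:
        return pts[-1][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

# Hypothetical maps for illustration (operational maps are configurable):
TEMP_MAP = [(-20.0, 0.0), (-10.0, 1.0), (0.0, 1.0), (2.0, 0.0)]  # deg C
SLW_MAP = [(0.0, 0.0), (0.2, 1.0)]                               # g/kg

def icing_potential(levels):
    """levels: list of (height_m, temp_c, slw) for one model column.
    Combine the per-level interests and take the maximum over all
    levels below 1 km, as described above."""
    best = 0.0
    for height, temp, slw in levels:
        if height >= 1000.0:
            continue
        best = max(best, interest(temp, TEMP_MAP) * interest(slw, SLW_MAP))
    return best
```

Multiplying the interests is one common way to combine fuzzy maps; the actual ExWx combination rule is not specified here and this product is an assumption.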
Icing Forecasting System Provides Categorical Icing Forecast
Note there is no missing data: wherever DICast is missing, the WRF potential is used exclusively (and vice versa)
The 0.5 threshold is configurable based on the experience of operators
[Figure: ExWx icing potential forecasts for all runs affecting the 12/25/14-12/26/14 event window (8 hours centered on 00Z); points marked by icing potential above/below 0.5, inside/outside the window. The event was well forecast by ExWx.]
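The fallback between the two sources and the configurable threshold can be sketched in a few lines; combining the available potentials with `max()` is an assumption for illustration, not the documented ExWx rule:

```python
def categorical_icing(wrf_potential, dicast_potential, threshold=0.5):
    """Categorical (icing / no icing) forecast from the two potentials.
    If either source is missing (None), the other is used exclusively,
    so no point ends up with a missing forecast."""
    available = [p for p in (wrf_potential, dicast_potential) if p is not None]
    if not available:
        return None
    # Assumed combination rule: take the larger available potential.
    return max(available) >= threshold
```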
Load Forecasting and Distributed Solar Energy Forecasting
[Figure: scaled load, actual vs. forecast power (MW) by date and hour of day]
Increasing distributed generation behind the meter increases the need for load forecasting to mitigate rapid changes in weather-sensitive power production
Develop and Evaluate Solar Energy Prediction Techniques: Distributed Solar Forecasts
Determined load cutout due to distributed solar generation (higher load on cloudy days)
Used historical production data (gridded) and determined representativeness

Season   Higher-load days   p-value   Sample size (clear/cloudy)
Spring   Cloudy             0.032     41/17
Summer   Clear              0.0003    28/8
Fall     Cloudy             0.638     30/24
Winter   Cloudy             0.016     36/21

[Figure: capacity of DPV installations; correlation of hourly gridded PV percent capacity to those from a grid box near the NREL-NWTC site. Also shown are BVSD PV (blue dots) and SunEdison PV installations (green dots), and METAR sites (black dots).]
Develop and Evaluate Solar Energy Prediction Techniques: Evaluation
Difficult to evaluate given the lack of actual production data (behind the meter)
Used data from BVSD
Scatter plot shows the majority of forecasts lie along the y = x line
Develop and Evaluate Solar Energy Prediction Techniques
[Figure: normalized RMSE and bias for 6 BVSD sites]
Errors are for the 11Z initialization forecast over a 6-month period
Most nRMSE values are under 3%
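For reference, the two error metrics in the figure can be computed as below; normalizing RMSE by installed capacity is one common convention for solar forecast errors and is assumed here:

```python
from math import sqrt

def bias(forecast, observed):
    """Mean forecast-minus-observed error."""
    return sum(f - o for f, o in zip(forecast, observed)) / len(observed)

def nrmse(forecast, observed, capacity):
    """RMSE normalized by installed capacity, returned as a fraction
    (multiply by 100 for the percentages quoted above)."""
    mse = sum((f - o) ** 2 for f, o in zip(forecast, observed)) / len(observed)
    return sqrt(mse) / capacity
```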
Data Mining for Load Estimation: Overview of System Development
Steps: select machine learning method; identify and compute predictors; optimize spatial averaging weights; optimize temporal averaging weights; select optimal set of predictor variables; select machine learning parameters; evaluate model via historical playback
Machine learning methods evaluated: ridge regression, decision trees, random forest, k-nearest neighbor, gradient boosted regression, Cubist
Cubist and gradient boosted regression were the best-performing learning methods. Cubist was chosen due to existing software infrastructure and extensive experience with the method.
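One of the simpler candidate methods listed above, k-nearest-neighbor regression, can be sketched in a few lines (illustrative only; the project evaluated library implementations of each method):

```python
def knn_regress(train, query, k=3):
    """Minimal k-nearest-neighbor regression over (features, target)
    pairs: predict the mean target of the k training points closest
    to the query in (squared) Euclidean distance."""
    sq_dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = sorted(train, key=lambda pair: sq_dist(pair[0], query))[:k]
    return sum(target for _, target in nearest) / len(nearest)
```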
Data Mining for Load Estimation: Overview of System Development
Basic predictors identified:
DICast forecast variables: temperature, wind speed, dew point; probability of rain, snow, and ice; quantitative precipitation 1-hour forecast; probability of precipitation in 1 hour; cloud coverage; downward solar radiation flux at the surface; precipitation
Solar angles
Date and time variables: hour of day, day of week, week of year, season indicators
Data Mining for Load Estimation: Overview of System Development
Predictors enhanced or combined to maximize skill; using physical insight to meaningfully combine variables improves performance:
Spatial weighted averages
Temporal decay averages
Feature extraction of functional relationships, e.g., cos(2*pi*local_hour/24)
[Figure: load vs. temperature]
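Two of the enhancements above can be sketched directly; the decay rate below is a hypothetical placeholder, since the operational temporal weights were optimized:

```python
from math import cos, pi

def cyclic_hour(local_hour):
    """Encode hour of day cyclically (the cos(2*pi*local_hour/24)
    feature above), so that hours 23 and 0 are close in feature space
    rather than 23 units apart."""
    return cos(2.0 * pi * local_hour / 24.0)

def decay_average(values, alpha=0.5):
    """Temporal decay average: values[0] is the most recent value and
    older values receive geometrically smaller weight alpha**lag."""
    num = den = 0.0
    for lag, v in enumerate(values):
        w = alpha ** lag
        num += w * v
        den += w
    return num / den
```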
Data Mining for Load Estimation: Overview of System Development
Machine learning parameters tuned: iterative search performed over parameter combinations to minimize complexity while maintaining performance
[Figure: error over the parameter search space, parameter 1 (committee members) vs. parameter 2 (rules), with complexity increasing along each axis]
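The search over Cubist-style parameter combinations can be sketched as below; the 1% error tolerance and complexity = committees * rules are assumptions standing in for the project's actual criteria:

```python
def tune_parameters(train_fn, error_fn, committees_grid, rules_grid,
                    tolerance=0.01):
    """Search all (committee members, rules) combinations; among
    settings whose error is within `tolerance` of the best, return
    the least complex one, trading a little performance for a much
    simpler model."""
    results = []
    for c in committees_grid:
        for r in rules_grid:
            err = error_fn(train_fn(c, r))
            results.append((err, c * r, c, r))
    best_err = min(err for err, _, _, _ in results)
    close = [t for t in results if t[0] <= best_err * (1.0 + tolerance)]
    _, _, c, r = min(close, key=lambda t: t[1])
    return c, r
```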
Data Mining for Load Estimation
Real-time load observations removed as input to forecasts; predictors enhanced and skill maintained
Enhanced predictors tend to be favored by Cubist
[Figure: new model vs. old model]
Data Mining for Load Estimation: Results
Load observations dataset begins 1 Sep 2012; results are for 1 Oct 2013 - 30 Sep 2014
Model evaluated via historical playback
Average load MAPE: day-ahead MAPE = 2.40%; week-ahead MAPE = 2.88%
Peak MAPE occurs during the ramp to peak load
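The MAPE figures above follow the standard definition, which for reference is:

```python
def mape(forecast, observed):
    """Mean absolute percentage error, in percent: the average of
    |forecast - observed| / |observed| across all forecast hours."""
    return 100.0 * sum(abs(f - o) / abs(o)
                       for f, o in zip(forecast, observed)) / len(observed)
```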
Summary
We have developed a comprehensive variable power forecasting system that integrates recent advances in forecasting across a range of time scales, including probabilistic forecasting and forecasting of extreme events.
The day-ahead forecasting system resulted in significant savings for ratepayers.
The effectiveness of a forecasting system for efficient integration of variable generation depends on the quality and quantity of data. More data (in amount and frequency) is better; however, data from existing sources should first be:
Standardized
Quality controlled
Delivered in a timely manner, and
Archived for future use (e.g., training for machine learning algorithms).
CO-Labs Governor's Award 2014 for Sustainability
Branko Kosović
Research Applications Laboratory
National Center for Atmospheric Research
branko@ucar.edu