Time Series Analysis




Time Series Analysis
hm@imm.dtu.dk
Informatics and Mathematical Modelling, Technical University of Denmark, DK-2800 Kgs. Lyngby

Outline of the lecture

Identification of univariate time series models, cont.:
- Estimation of model parameters, Sec. 6.4 (cont.)
- Model order selection, Sec. 6.5
- Model validation, Sec. 6.6

Estimation methods (from previous lecture)

We have an appropriate model structure AR(p), MA(q), ARMA(p,q), or ARIMA(p,d,q) with p, d, and q known. Task: based on the observations, find appropriate values of the parameters. The book describes many methods:
- Moment estimates
- LS-estimates
- Prediction error estimates (conditioned or unconditioned)
- ML-estimates (conditioned or unconditioned (exact))

Maximum likelihood estimates

ARMA(p,q)-process:
$$Y_t + \phi_1 Y_{t-1} + \dots + \phi_p Y_{t-p} = \varepsilon_t + \theta_1 \varepsilon_{t-1} + \dots + \theta_q \varepsilon_{t-q}$$
Notation: $\theta^T = (\phi_1, \dots, \phi_p, \theta_1, \dots, \theta_q)$ and $Y_t^T = (Y_t, Y_{t-1}, \dots, Y_1)$.
The likelihood function is the joint probability density function of all observations for given values of $\theta$ and $\sigma_\varepsilon^2$:
$$L(Y_N; \theta, \sigma_\varepsilon^2) = f(Y_N \mid \theta, \sigma_\varepsilon^2)$$
Given the observations $Y_N$, we estimate $\theta$ and $\sigma_\varepsilon^2$ as the values for which the likelihood is maximized.

The likelihood function for ARMA(p,q)-models

The random variable $Y_N \mid Y_{N-1}$ only contains $\varepsilon_N$ as a random component, and $\varepsilon_N$ is white noise at time $N$ and therefore does not depend on anything earlier. The random variables $Y_N \mid Y_{N-1}$ and $Y_{N-1}$ are therefore independent, hence (see also page 3):
$$f(Y_N \mid \theta, \sigma_\varepsilon^2) = f(Y_N \mid Y_{N-1}, \theta, \sigma_\varepsilon^2)\, f(Y_{N-1} \mid \theta, \sigma_\varepsilon^2)$$
Repeating these arguments:
$$L(Y_N; \theta, \sigma_\varepsilon^2) = \left[ \prod_{t=p+1}^{N} f(Y_t \mid Y_{t-1}, \theta, \sigma_\varepsilon^2) \right] f(Y_p \mid \theta, \sigma_\varepsilon^2)$$

The conditional likelihood function

Evaluation of $f(Y_p \mid \theta, \sigma_\varepsilon^2)$ requires special attention. It turns out that the conditional likelihood function
$$L(Y_N; \theta, \sigma_\varepsilon^2) = \prod_{t=p+1}^{N} f(Y_t \mid Y_{t-1}, \theta, \sigma_\varepsilon^2)$$
results in the same estimates as the exact likelihood function when many observations are available; for small samples there can be some difference.
Software: the S-PLUS function arima.mle calculates conditional estimates, while the R function arima calculates exact estimates.

Evaluating the conditional likelihood function

Task: find the conditional densities for given values of the parameters $\theta$ and $\sigma_\varepsilon^2$. The mean of the random variable $Y_t \mid Y_{t-1}$ is the 1-step forecast $\hat{Y}_{t|t-1}$, and the prediction error $\varepsilon_t = Y_t - \hat{Y}_{t|t-1}$ has variance $\sigma_\varepsilon^2$. We assume that the process is Gaussian:
$$f(Y_t \mid Y_{t-1}, \theta, \sigma_\varepsilon^2) = \frac{1}{\sigma_\varepsilon \sqrt{2\pi}}\, e^{-(Y_t - \hat{Y}_{t|t-1}(\theta))^2 / 2\sigma_\varepsilon^2}$$
And therefore:
$$L(Y_N; \theta, \sigma_\varepsilon^2) = (2\pi\sigma_\varepsilon^2)^{-\frac{N-p}{2}} \exp\left( -\frac{1}{2\sigma_\varepsilon^2} \sum_{t=p+1}^{N} \varepsilon_t^2(\theta) \right)$$
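
Taking logarithms of this conditional likelihood (a routine step not written out on the slide) gives the quantity that is actually optimized:

```latex
\log L(Y_N;\theta,\sigma_\varepsilon^2)
  = -\frac{N-p}{2}\,\log\!\left(2\pi\sigma_\varepsilon^2\right)
    - \frac{1}{2\sigma_\varepsilon^2}\sum_{t=p+1}^{N}\varepsilon_t^2(\theta)
```

For fixed $\sigma_\varepsilon^2$, maximizing over $\theta$ is therefore equivalent to minimizing the sum of squared 1-step prediction errors, which is the prediction error interpretation used on the following slides.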

ML-estimates

The (conditional) ML-estimate $\hat{\theta}$ is a prediction error estimate, since it is obtained by minimizing
$$S(\theta) = \sum_{t=p+1}^{N} \varepsilon_t^2(\theta)$$
By differentiating w.r.t. $\sigma_\varepsilon^2$ it can be shown that the ML-estimate of $\sigma_\varepsilon^2$ is $\hat{\sigma}_\varepsilon^2 = S(\hat{\theta})/(N-p)$. The estimate $\hat{\theta}$ has good asymptotic properties, and its variance-covariance matrix is approximately $2\hat{\sigma}_\varepsilon^2 H^{-1}$, where $H$ contains the second-order partial derivatives of $S(\theta)$ at the minimum.

Finding the ML-estimates using the PE-method

1-step predictions:
$$\hat{Y}_{t|t-1} = -\phi_1 Y_{t-1} - \dots - \phi_p Y_{t-p} + 0 + \theta_1 \varepsilon_{t-1} + \dots + \theta_q \varepsilon_{t-q}$$
If we use $\varepsilon_p = \varepsilon_{p-1} = \dots = \varepsilon_{p+1-q} = 0$ we can find:
$$\hat{Y}_{p+1|p} = -\phi_1 Y_p - \dots - \phi_p Y_1 + 0 + \theta_1 \varepsilon_p + \dots + \theta_q \varepsilon_{p+1-q}$$
This gives us $\varepsilon_{p+1} = Y_{p+1} - \hat{Y}_{p+1|p}$, after which we can calculate $\hat{Y}_{p+2|p+1}$ and $\varepsilon_{p+2}$, and so on until we have all the 1-step prediction errors we need. We then use numerical optimization to find the parameters which minimize the sum of squared prediction errors.
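
The recursion above can be sketched in a few lines of Python; the ARMA(1,1) case, the toy data, and the parameter values are all made up for illustration, and the lecture's sign convention $Y_t + \phi_1 Y_{t-1} = \varepsilon_t + \theta_1 \varepsilon_{t-1}$ is assumed:

```python
# Sum of squared 1-step prediction errors S(theta) for an ARMA(1,1)
# with the convention Y_t + phi*Y_{t-1} = eps_t + theta*eps_{t-1}.
def prediction_error_sum(y, phi, theta):
    p = 1                        # AR order; the recursion starts with eps_p = 0
    eps = [0.0] * len(y)         # eps[t] = y[t] - 1-step forecast of y[t]
    for t in range(p, len(y)):
        y_hat = -phi * y[t - 1] + theta * eps[t - 1]   # 1-step forecast
        eps[t] = y[t] - y_hat
    return sum(e * e for e in eps[p:])

y = [0.1, -0.2, 0.3, 0.0, -0.1, 0.2]        # toy data, made up
S = prediction_error_sum(y, phi=0.7, theta=0.4)
sigma2_hat = S / (len(y) - 1)               # ML estimate: S(theta_hat)/(N - p)
```

In practice a numerical optimizer would call `prediction_error_sum` repeatedly, varying `phi` and `theta` until the sum is minimized.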

$S(\theta)$ for $(1 + 0.7B)Y_t = (1 + 0.4B)\varepsilon_t$ with $\sigma_\varepsilon^2 = 0.25^2$

Data: arima.sim(model=list(ar=-0.7, ma=0.4), n=500, sd=0.25)
[Contour plot of $S(\theta)$ over the AR parameter (vertical axis) and the MA parameter (horizontal axis).]

Moment estimates

Given the model structure:
1. Find formulas for the theoretical autocorrelation or autocovariance as a function of the parameters in the model.
2. Estimate these, e.g. calculate the SACF.
3. Solve the equations, using the lowest lags necessary.
Complicated! And the general properties of the estimator are unknown.

Moment estimates for AR(p)-processes

In this case moment estimates are simple to find due to the Yule-Walker equations (page 104). We simply plug the estimated autocorrelation function at lags 1 to p into
$$\begin{bmatrix} \rho(1) \\ \rho(2) \\ \vdots \\ \rho(p) \end{bmatrix} = \begin{bmatrix} 1 & \rho(1) & \cdots & \rho(p-1) \\ \rho(1) & 1 & \cdots & \rho(p-2) \\ \vdots & \vdots & & \vdots \\ \rho(p-1) & \rho(p-2) & \cdots & 1 \end{bmatrix} \begin{bmatrix} \phi_1 \\ \phi_2 \\ \vdots \\ \phi_p \end{bmatrix}$$
and solve w.r.t. the $\phi$'s. The function ar in S-PLUS or R uses this approach by default.
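
A minimal sketch of the Yule-Walker moment estimate for an AR(2), in Python rather than the S-PLUS/R function ar; the helper names and the toy series are made up, and the code uses the standard sign convention $Y_t = a_1 Y_{t-1} + a_2 Y_{t-2} + \varepsilon_t$ (so $\phi_i = -a_i$ in the lecture's notation):

```python
def sample_acf(y, max_lag):
    # Biased sample autocorrelation rho_hat(k) for k = 0..max_lag
    n = len(y)
    m = sum(y) / n
    c0 = sum((v - m) ** 2 for v in y)
    return [sum((y[t] - m) * (y[t + k] - m) for t in range(n - k)) / c0
            for k in range(max_lag + 1)]

def yule_walker_ar2(r1, r2):
    # Solve [[1, r1], [r1, 1]] @ [a1, a2]' = [r1, r2]' by Cramer's rule
    det = 1.0 - r1 * r1
    a1 = (r1 - r1 * r2) / det
    a2 = (r2 - r1 * r1) / det
    return a1, a2

y = [1.0, 0.6, 0.2, -0.1, -0.4, -0.2, 0.1, 0.3, 0.2, -0.1]   # toy data, made up
_, r1, r2 = sample_acf(y, 2)
a1, a2 = yule_walker_ar2(r1, r2)
```

For general p the same idea applies with a p-by-p linear solve instead of Cramer's rule.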

Model building

1. Identification (specifying the model order), based on data, theory, and physical insight.
2. Estimation (of the model parameters).
3. Model checking: is the model OK? If no, return to step 1; if yes, apply the model (prediction, simulation, etc.).

Validation of the model and extensions / reductions

- Residual analysis (Sec. 6.6.2): is it possible to detect problems with the residuals? (The 1-step prediction errors using the estimates, i.e. $\{\varepsilon_t(\hat{\theta})\}$, should be white noise.)
- If the SACF or the SPACF of $\{\varepsilon_t(\hat{\theta})\}$ points towards a particular ARMA-structure, we can derive how the original model should be extended (Sec. 6.5.1).
- If the model passes the residual analysis, it makes sense to test null hypotheses about the parameters (Sec. 6.5.2).

Residual analysis

- Plot $\{\varepsilon_t(\hat{\theta})\}$; do the residuals look stationary?
- Tests in the autocorrelation: if $\{\varepsilon_t(\hat{\theta})\}$ is white noise, then $\hat{\rho}_\varepsilon(k)$ is approximately Gaussian distributed with mean 0 and variance $1/N$. If the model fails, calculate the SPACF as well and see if an ARMA-structure for the residuals can be derived (Sec. 6.5.1).
- Since $\hat{\rho}_\varepsilon(k_1)$ and $\hat{\rho}_\varepsilon(k_2)$ are independent (Eq. 6.4), the test statistic $Q = N \sum_{k=1}^{m} \hat{\rho}_\varepsilon^2(k)$ is approximately distributed as $\chi^2(m - n)$, where $n$ is the number of estimated parameters.
- S-PLUS: arima.diag(<output from arima.mle>)
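
The portmanteau statistic $Q$ can be sketched as follows (the function name and the toy residuals are made up; looking up the $\chi^2$ quantile is left out):

```python
def portmanteau(resid, m, n_par):
    # Q = N * sum_{k=1}^m rho_hat(k)^2, approx. chi^2(m - n_par) under H0
    N = len(resid)
    mean = sum(resid) / N
    c0 = sum((e - mean) ** 2 for e in resid)
    rho = [sum((resid[t] - mean) * (resid[t + k] - mean)
               for t in range(N - k)) / c0
           for k in range(1, m + 1)]
    Q = N * sum(r * r for r in rho)
    df = m - n_par                        # degrees of freedom
    return Q, df

resid = [0.3, -0.1, 0.2, -0.4, 0.1, 0.0, -0.2, 0.3, -0.1, 0.2]  # toy residuals
Q, df = portmanteau(resid, m=4, n_par=2)
```

A large Q relative to the $\chi^2(df)$ distribution indicates leftover autocorrelation in the residuals.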

Residual analysis (continued)

- Test for the number of changes in sign: in a series of length N there are N-1 possibilities for a change in sign. If the series is white noise (with mean zero), the probability of a change is 1/2 and the changes are independent. Therefore the number of changes in sign is distributed as Bin(N-1, 1/2). S-PLUS: binom.test(<no. of changes>, N-1)
- Test in the scaled cumulated periodogram of the residuals: plot it and add lines at $\pm K_\alpha / \sqrt{q}$, where $q = (N-2)/2$ for N even and $q = (N-1)/2$ for N odd. For $1-\alpha$ confidence limits, $K_\alpha$ can be found in Table 6.2. S-PLUS (95% confidence interval): library(MASS); cpgram(<residuals>)
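
The sign-change test can be sketched with an exact two-sided binomial p-value (function name and toy residuals are made up):

```python
from math import comb

def sign_change_test(resid):
    # Under H0 (zero-mean white noise) the number of sign changes in a
    # series of length N is Bin(N - 1, 1/2).
    n = len(resid)
    changes = sum(1 for a, b in zip(resid, resid[1:]) if a * b < 0)
    trials = n - 1
    # Exact two-sided p-value: total probability of all outcomes at least
    # as unlikely as the observed count.
    p_obs = comb(trials, changes)
    p_value = sum(comb(trials, k) for k in range(trials + 1)
                  if comb(trials, k) <= p_obs) / 2 ** trials
    return changes, p_value

changes, p = sign_change_test([1.2, -0.5, 0.8, -0.3, 0.9])   # toy residuals
```

A very small p-value means the residuals change sign unusually often or unusually rarely for white noise.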

Sum of squared residuals depends on the model size

[Plot of $S(\hat{\theta}_i)$ against $i$ = number of parameters, $i = 1, \dots, 7$: the sum of squared residuals decreases as parameters are added. It is assumed that the models are nested.]

Testing one model against a larger one

The test essentially checks whether the reduction in SSE, $S_1 - S_2$, is large enough to justify the extra parameters in model 2 ($n_2$ parameters) compared to model 1 ($n_1$ parameters). The number of observations used is called N. If the vector $\theta_{extra}$ denotes the extra parameters in model 2 compared to model 1, the test is formally:
$$H_0: \theta_{extra} = 0 \quad \text{vs.} \quad H_1: \theta_{extra} \neq 0$$
If $H_0$ is true, it holds approximately that
$$\frac{(S_1 - S_2)/(n_2 - n_1)}{S_2/(N - n_2)} \sim F(n_2 - n_1, N - n_2)$$
(The likelihood ratio test is also a possibility.)
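
The F-statistic can be computed directly from the two residual sums of squares; all numbers below are hypothetical:

```python
def nested_f_statistic(s1, s2, n1, n2, N):
    # F = ((S1 - S2) / (n2 - n1)) / (S2 / (N - n2)),
    # compared with the F(n2 - n1, N - n2) distribution under H0
    return ((s1 - s2) / (n2 - n1)) / (s2 / (N - n2))

# Hypothetical example: model 1 with n1 = 2 parameters and S1 = 12.0,
# model 2 with n2 = 3 parameters and S2 = 10.0, based on N = 103 observations.
F = nested_f_statistic(s1=12.0, s2=10.0, n1=2, n2=3, N=103)
```

If F exceeds the chosen quantile of $F(n_2 - n_1, N - n_2)$, the extra parameters are kept.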

Testing one parameter for significance

$H_0: \theta_i = 0$ against $H_1: \theta_i \neq 0$ can be tested as described on the previous slide. Alternatively we can use a t-test based on the estimate and its standard error: $\hat{\theta}_i / \sqrt{\hat{V}(\hat{\theta}_i)}$. Under $H_0$, for an ARMA(p,q)-model this follows a $t(N - p - q)$ distribution (or $t(N - 1 - p - q)$ if we also estimated an overall mean of the series). Often N is so large compared to the number of parameters that we can simply use the standard normal distribution.

Information criteria

Select the model which minimizes some information criterion.
Akaike's Information Criterion: $AIC = -2 \log L(Y_N; \hat{\theta}, \hat{\sigma}_\varepsilon^2) + 2\, n_{par}$
Bayesian Information Criterion: $BIC = -2 \log L(Y_N; \hat{\theta}, \hat{\sigma}_\varepsilon^2) + \log(N)\, n_{par}$
Except for an additive constant, these can also be expressed as
$AIC = N \log \hat{\sigma}_\varepsilon^2 + 2\, n_{par}$ and $BIC = N \log \hat{\sigma}_\varepsilon^2 + \log(N)\, n_{par}$
BIC yields a consistent estimate of the model order.
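
The "up to an additive constant" versions can be computed directly from $\hat{\sigma}_\varepsilon^2$; the function name and the numbers below are hypothetical:

```python
from math import log

def aic_bic(sigma2_hat, N, n_par):
    # Up to an additive constant (Gaussian likelihood):
    #   AIC = N log(sigma2_hat) + 2 n_par
    #   BIC = N log(sigma2_hat) + log(N) n_par
    aic = N * log(sigma2_hat) + 2 * n_par
    bic = N * log(sigma2_hat) + log(N) * n_par
    return aic, bic

aic, bic = aic_bic(sigma2_hat=1.0, N=100, n_par=3)   # hypothetical values
```

Because $\log N > 2$ for $N > 7$, BIC penalizes extra parameters harder than AIC and tends to select smaller models.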

Example: a model for CO$_2$ ...