Lecture 2: ARMA(p,q) models (part 3)




Lecture 2: ARMA(p,q) models (part 3)
Florian Pelgrin, University of Lausanne, École des HEC
Department of mathematics (IMEA-Nice)
Sept. 2011 - Jan. 2012
Florian Pelgrin (HEC) Univariate time series Sept. 2011 - Jan. 2012 1 / 32

Motivation (Introduction)
- Characterize the main properties of ARMA(p,q) models.
- Estimation of ARMA(p,q) models.

Road map

1 ARMA(1,1) model: definition and conditions; moments; estimation
2 ARMA(p,q) model: definition and conditions; moments; estimation
3 Application
4 Appendix

ARMA(1,1) model: Definition and conditions

1. ARMA(1,1)
1.1. Definition and conditions

Definition. A stochastic process (X_t)_{t ∈ Z} is said to be a mixed autoregressive moving average process of order (1,1), ARMA(1,1), if it satisfies the following equation:

X_t = µ + φX_{t−1} + ɛ_t + θɛ_{t−1}  for all t,  i.e.  Φ(L)X_t = µ + Θ(L)ɛ_t,

where φ ≠ 0, θ ≠ 0, µ is a constant term, (ɛ_t)_{t ∈ Z} is a weak white noise process with expectation zero and variance σ²_ɛ (ɛ_t ~ WN(0, σ²_ɛ)), Φ(L) = 1 − φL, and Θ(L) = 1 + θL.
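A path of an ARMA(1,1) process can be simulated directly from this defining recursion. The sketch below is illustrative, not the lecture's own code: the function name and parameter values are mine, chosen to match one of the simulated panels that follow.

```python
import numpy as np

def simulate_arma11(mu, phi, theta, sigma, n, burn=200, seed=0):
    """Simulate X_t = mu + phi*X_{t-1} + eps_t + theta*eps_{t-1}."""
    rng = np.random.default_rng(seed)
    eps = rng.normal(0.0, sigma, n + burn)
    x = np.zeros(n + burn)
    for t in range(1, n + burn):
        x[t] = mu + phi * x[t - 1] + eps[t] + theta * eps[t - 1]
    return x[burn:]  # drop the burn-in so start-up effects die out

# One of the four parameter pairs used in the simulation figure
x = simulate_arma11(mu=0.0, phi=0.9, theta=0.5, sigma=1.0, n=300)
```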

ARMA(1,1) model: Definition and conditions

[Figure: simulated ARMA(1,1) paths (300 observations each) for (φ, θ) = (−0.5, −0.5), (−0.5, 0.5), (0.9, −0.5), and (0.9, 0.5).]

ARMA(1,1) model: Definition and conditions

The properties of an ARMA(1,1) process are a mixture of those of AR(1) and MA(1) processes:
- The (stability) stationarity condition is the one of an AR(1) process (or ARMA(1,0) process): |φ| < 1.
- The invertibility condition is the one of an MA(1) process (or ARMA(0,1) process): |θ| < 1.
- The representation of an ARMA(1,1) process is fundamental or causal if: |φ| < 1 and |θ| < 1.
- The representation of an ARMA(1,1) process is said to be minimal and causal if: |φ| < 1, |θ| < 1, and φ ≠ −θ (so that Φ(L) = 1 − φL and Θ(L) = 1 + θL have no common root).
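These four conditions are purely algebraic, so they are easy to encode. The helper below is an illustrative sketch (the function and key names are mine, not the lecture's):

```python
def arma11_representation(phi, theta):
    """Classify the representation of X_t = mu + phi*X_{t-1} + eps_t + theta*eps_{t-1}."""
    stable = abs(phi) < 1           # stationarity: AR(1) condition
    invertible = abs(theta) < 1     # invertibility: MA(1) condition
    causal = stable and invertible  # fundamental (causal) representation
    # Minimality: Phi(L) = 1 - phi*L and Theta(L) = 1 + theta*L share a
    # root exactly when theta = -phi, in which case they cancel.
    minimal = causal and phi != -theta
    return {"stable": stable, "invertible": invertible,
            "causal": causal, "minimal": minimal}
```

For instance, φ = 0.5, θ = −0.5 is causal but not minimal: the AR and MA polynomials cancel, and the process reduces to white noise.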

ARMA(1,1) model: Definition and conditions

If (X_t) is stable and thus weakly stationary, then (X_t) has an infinite moving average representation (MA(∞)):

X_t = µ/(1 − φ) + (1 − φL)^{−1}(1 + θL)ɛ_t = µ/(1 − φ) + Σ_{k=0}^{∞} a_k ɛ_{t−k},

where a_0 = 1 and a_k = φ^k + θφ^{k−1} for k ≥ 1.
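The a_k can also be generated recursively by expanding (1 − φL)^{−1}(1 + θL). The sketch below (function name mine) checks the recursion against the closed form a_k = φ^k + θφ^{k−1}:

```python
import numpy as np

def ma_inf_weights(phi, theta, K):
    """First K+1 coefficients of (1 + theta*L) / (1 - phi*L)."""
    a = np.empty(K + 1)
    a[0] = 1.0
    for k in range(1, K + 1):
        # multiplying through by (1 - phi*L): a_k = phi*a_{k-1} (+ theta at k = 1)
        a[k] = phi * a[k - 1] + (theta if k == 1 else 0.0)
    return a

phi, theta = 0.9, 0.5
a = ma_inf_weights(phi, theta, 10)
closed_form = np.array([1.0] + [phi**k + theta * phi**(k - 1) for k in range(1, 11)])
```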

ARMA(1,1) model: Definition and conditions

If (X_t) is invertible, then (X_t) has an infinite autoregressive representation (AR(∞)):

(1 + θL)^{−1}(1 − φL)X_t = µ/(1 + θ) + ɛ_t,  i.e.  X_t = µ/(1 + θ) + Σ_{k=1}^{∞} b_k X_{t−k} + ɛ_t,

where b_k = (φ + θ)(−θ)^{k−1} for k ≥ 1.

ARMA(1,1) model: Moments

1.2. Moments

Definition. Let (X_t) denote a stationary stochastic process that has a fundamental ARMA(1,1) representation, X_t = µ + φX_{t−1} + ɛ_t + θɛ_{t−1}. Then:

E[X_t] = µ/(1 − φ) ≡ m
γ_X(0) ≡ V(X_t) = [(1 + 2φθ + θ²)/(1 − φ²)] σ²_ɛ
γ_X(1) ≡ Cov[X_t, X_{t−1}] = [(φ + θ)(1 + φθ)/(1 − φ²)] σ²_ɛ
γ_X(h) = φγ_X(h−1) for h > 1.

Proof: See Appendix 1.
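These closed-form moments can be checked by simulation: for a long path, the sample variance and first-order autocovariance should be close to γ_X(0) and γ_X(1). A minimal sketch, assuming Gaussian white noise and parameter values of my choosing:

```python
import numpy as np

phi, theta, sigma = 0.9, 0.5, 1.0
n = 200_000
rng = np.random.default_rng(0)
eps = rng.normal(0.0, sigma, n)
x = np.zeros(n)
for t in range(1, n):  # mu = 0, so E[X_t] = 0
    x[t] = phi * x[t - 1] + eps[t] + theta * eps[t - 1]

# Closed-form moments of the ARMA(1,1) process
gamma0 = (1 + 2 * phi * theta + theta**2) / (1 - phi**2) * sigma**2
gamma1 = (phi + theta) * (1 + phi * theta) / (1 - phi**2) * sigma**2

# Sample counterparts
xc = x - x.mean()
sample_gamma0 = np.dot(xc, xc) / n
sample_gamma1 = np.dot(xc[1:], xc[:-1]) / n
```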

ARMA(1,1) model: Moments

Definition. The autocorrelation function of an ARMA(1,1) process satisfies:

ρ_X(h) = 1                                   if h = 0
ρ_X(h) = (φ + θ)(1 + φθ)/(1 + 2φθ + θ²)      if h = 1
ρ_X(h) = φρ_X(h−1)                           if h > 1.
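The piecewise formula translates directly into code; a small illustrative helper (naming mine):

```python
def arma11_acf(phi, theta, H):
    """Theoretical autocorrelations rho_X(0), ..., rho_X(H) of an ARMA(1,1)."""
    rho = [1.0]  # h = 0
    if H >= 1:   # h = 1: the MA(1) term still matters
        rho.append((phi + theta) * (1 + phi * theta)
                   / (1 + 2 * phi * theta + theta**2))
    for _ in range(2, H + 1):  # h > 1: pure AR(1)-style geometric decay
        rho.append(phi * rho[-1])
    return rho
```

For φ = 0.9 and θ = 0.5 the sequence starts near 0.94 and then decays geometrically at rate 0.9, illustrating the "dies out gradually, never cuts off" behaviour described next.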

ARMA(1,1) model: Moments

The autocorrelation function of an ARMA(1,1) process exhibits exponential decay towards zero: it does not cut off, but gradually dies out as h increases. For h > 1, it displays the same shape as that of an AR(1) process.

ARMA(1,1) model: Moments

Partial autocorrelation: The partial autocorrelation function of an ARMA(1,1) process will gradually die out (the same property as a moving average model).

ARMA(1,1) model: Estimation

Same techniques as before, especially those of MA models:
- Yule-Walker estimator: the extended Yule-Walker equations can in principle be used to estimate the AR coefficients, but the MA coefficients need to be estimated by other means.
- In the presence of moving average components, the least squares estimator becomes nonlinear, and the corresponding estimator is the conditional nonlinear least squares estimator (see estimation of MA(q) models). It has to be solved with numerical methods.
- Taking an explicit distributional assumption for the error term, the conditional or exact maximum likelihood estimator can be computed (also using numerical optimization methods).
- Other methods are also available: the Kalman filter, the generalized method of moments, etc.
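The extended Yule-Walker idea in the first bullet can be illustrated numerically: since γ_X(h) = φγ_X(h−1) for h > 1, the ratio of the sample autocovariances at lags 2 and 1 estimates φ without touching the MA part. A minimal sketch, assuming a simulated Gaussian ARMA(1,1) (names and parameter values mine):

```python
import numpy as np

def sample_autocov(x, h):
    """Sample autocovariance of order h."""
    xc = np.asarray(x) - np.mean(x)
    return np.dot(xc[h:], xc[:len(xc) - h]) / len(xc)

# Simulate an ARMA(1,1) with phi = 0.9, theta = 0.5 (Gaussian noise)
rng = np.random.default_rng(1)
phi, theta, n = 0.9, 0.5, 50_000
eps = rng.normal(size=n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + eps[t] + theta * eps[t - 1]

# Extended Yule-Walker: gamma(h) = phi * gamma(h-1) for h > 1,
# so the lag-2 / lag-1 autocovariance ratio estimates phi.
phi_hat = sample_autocov(x, 2) / sample_autocov(x, 1)
```

θ would then have to be backed out from ρ_X(1), a nonlinear step, which is exactly why the MA side calls for other methods.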

ARMA(1,1) model: Estimation

Estimation of ARMA(p,q) models (true DGP: µ = 0, φ = 0.9, and θ = 0.5)

Model      Term    Coefficient  Std. Error  t-statistic  p-value   AIC      SC
ARMA(1,1)  C          0.4532      0.7558      0.5996     0.5490    2.8841   2.9134
           AR(1)      0.9112      0.0184     49.4884     0.0000
           MA(1)      0.4639      0.0409     11.3368     0.0000
ARMA(2,2)  C          0.3137      0.6942      0.4518     0.6516    2.8865   2.9288
           AR(1)      1.4836      0.2655      5.5879     0.0000
           AR(2)     -0.5230      0.2422     -2.1598     0.0313
           MA(1)     -0.1233      0.2666     -0.4625     0.6439
           MA(2)     -0.2837      0.1245     -2.2784     0.0231
ARMA(2,1)  C          0.3941      0.7431      0.5303     0.5961    2.8857   2.9195
           AR(1)      0.8581      0.0964      8.8992     0.0000
           AR(2)      0.0491      0.0937      0.5244     0.6002
           MA(1)      0.5047      0.0843      5.9848     0.0000
AR(2)      C          0.4276      0.6628      0.6451     0.5192    2.9206   2.9459
           AR(1)      1.2819      0.0419     30.5971     0.0000
           AR(2)     -0.3523      0.0416     -8.4649     0.0000
AR(1)      C          0.2857      0.9394      0.3042     0.7611    3.0559   3.0728
           AR(1)      0.9467      0.0135     70.1399     0.0000
MA(2)      C          0.6545      0.2043      3.2042     0.0014    3.6527   3.6779
           MA(1)      1.3803      0.0329     41.9665     0.0000
           MA(2)      0.6740      0.0324     20.8027     0.0000
MA(4)      C          0.6259      0.2633      2.3768     0.0178    3.2039   3.2461
           MA(1)      1.4334      0.0408     35.1707     0.0000
           MA(2)      1.2250      0.0657     18.6595     0.0000
           MA(3)      0.8552      0.0658     12.9989     0.0000
           MA(4)      0.4131      0.0407     10.1483     0.0000

Note: C, AR(j), and MA(j) are respectively the estimates of the constant term, the jth autoregressive term, and the jth moving average term. AIC and SC denote the Akaike and Schwarz information criteria, reported once per model.

ARMA(p,q) model: Definition and conditions

2. ARMA(p,q)
2.1. Definition and conditions

Definition. A stochastic process (X_t)_{t ∈ Z} is said to be a mixed autoregressive moving average process of order (p,q), ARMA(p,q), if it satisfies the following equation:

X_t = µ + φ_1 X_{t−1} + ... + φ_p X_{t−p} + ɛ_t + θ_1 ɛ_{t−1} + ... + θ_q ɛ_{t−q}  for all t,  i.e.  Φ(L)X_t = µ + Θ(L)ɛ_t,

where φ_p ≠ 0, θ_q ≠ 0, µ is a constant term, (ɛ_t)_{t ∈ Z} is a weak white noise process with expectation zero and variance σ²_ɛ (ɛ_t ~ WN(0, σ²_ɛ)), Φ(L) = 1 − φ_1 L − ... − φ_p L^p, and Θ(L) = 1 + θ_1 L + ... + θ_q L^q.

ARMA(p,q) model: Definition and conditions

Main idea of ARMA(p,q) models:
- Approximate the Wold representation of a stationary time series by a parsimonious parametric model.
- Pure AR and MA models can be cumbersome because one may need a high-order model, with many parameters, to adequately describe the data dynamics (see the effective Fed fund rate application).
- By mixing AR and MA terms into a more compact form, the number of parameters is kept small...

ARMA(p,q) model: Definition and conditions

The properties of an ARMA(p,q) process are a mixture of those of AR(p) and MA(q) processes:
- The (stability) stationarity conditions are those of an AR(p) process (or ARMA(p,0) process): all roots of z^p Φ(z^{−1}) = z^p − φ_1 z^{p−1} − ... − φ_p = 0 satisfy |z_i| < 1 for i = 1, ..., p.
- The invertibility conditions are those of an MA(q) process (or ARMA(0,q) process): all roots of z^q Θ(z^{−1}) = z^q + θ_1 z^{q−1} + ... + θ_q = 0 satisfy |z_i| < 1 for i = 1, ..., q.
- The representation of an ARMA(p,q) process is fundamental or causal if it is stable and invertible.
- The representation of an ARMA(p,q) process is said to be minimal and causal if it is stable, invertible, and the characteristic polynomials z^p Φ(z^{−1}) and z^q Θ(z^{−1}) have no common roots.

ARMA(p,q) model: Definition and conditions

Definition. The representation of a mixed autoregressive moving average process of order (p,q) defined by:

X_t = µ + φ_1 X_{t−1} + ... + φ_p X_{t−p} + ɛ_t + θ_1 ɛ_{t−1} + ... + θ_q ɛ_{t−q}

is said to be a minimal causal (fundamental) representation, with (ɛ_t) the innovation process, if:
(i) all the roots of the characteristic equation associated with Φ, z^p − φ_1 z^{p−1} − ... − φ_p = 0, are of modulus less than one: |z_i| < 1 for i = 1, ..., p;
(ii) all the roots of the characteristic equation associated with Θ, z^q + θ_1 z^{q−1} + ... + θ_q = 0, are of modulus less than one: |z_i| < 1 for i = 1, ..., q;
(iii) the characteristic polynomials z^p Φ(z^{−1}) and z^q Θ(z^{−1}) have no common roots.
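Conditions (i) and (ii) can be checked numerically via the roots of the two characteristic polynomials; a sketch using numpy.roots (the function name is mine):

```python
import numpy as np

def check_arma(phis, thetas):
    """Check stability/invertibility of an ARMA(p,q) via characteristic roots.

    phis = [phi_1, ..., phi_p], thetas = [theta_1, ..., theta_q].
    z^p * Phi(1/z) has coefficients [1, -phi_1, ..., -phi_p] (highest degree first);
    z^q * Theta(1/z) has coefficients [1, theta_1, ..., theta_q].
    """
    ar_roots = np.roots(np.concatenate(([1.0], -np.asarray(phis, float))))
    ma_roots = np.roots(np.concatenate(([1.0], np.asarray(thetas, float))))
    stable = bool(np.all(np.abs(ar_roots) < 1))
    invertible = bool(np.all(np.abs(ma_roots) < 1))
    return stable, invertible
```

For example, check_arma([1.2, -0.35], [0.5]) reports a stable, invertible process: the AR characteristic roots are 0.7 and 0.5, both inside the unit circle.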

ARMA(p,q) model: Definition and conditions

If (X_t) is stable and thus weakly stationary, then (X_t) has an infinite moving average representation (MA(∞)):

X_t = µ/(1 − Σ_{k=1}^{p} φ_k) + Φ(L)^{−1}Θ(L)ɛ_t = µ/(1 − Σ_{k=1}^{p} φ_k) + Σ_{k=0}^{∞} a_k ɛ_{t−k},

where a_0 = 1 and Σ_{k=0}^{∞} |a_k| < ∞.

ARMA(p,q) model: Definition and conditions

If (X_t) is invertible, then (X_t) has an infinite autoregressive representation (AR(∞)):

Θ(L)^{−1}Φ(L)X_t = µ/(1 + Σ_{k=1}^{q} θ_k) + ɛ_t,  i.e.  X_t = µ/(1 + Σ_{k=1}^{q} θ_k) + Σ_{k=1}^{∞} b_k X_{t−k} + ɛ_t,

where the coefficients b_k are obtained from the expansion Θ(L)^{−1}Φ(L) = 1 − Σ_{k=1}^{∞} b_k L^k.

ARMA(p,q) model: Moments

2.2. Moments of an ARMA(p,q)

The properties of the moments of an ARMA(p,q) are also a mixture of those of AR(p) and MA(q) processes. The mean is the same as the one of an AR(p) model (with a constant term):

E(X_t) = µ/(1 − Σ_{k=1}^{p} φ_k) ≡ m.

ARMA(p,q) model: Moments

Autocorrelation: The autocorrelation function of an ARMA(p,q) process exhibits exponential decay towards zero (possibly with damped oscillations): it does not cut off, but gradually dies out as h increases. For h > max(p, q + 1), it displays the same shape as that of an AR(p) process.

Partial autocorrelation: The partial autocorrelation function of an ARMA(p,q) process will gradually die out (the same property as an MA(q) model).

ARMA(p,q) model: Estimation

2.3. Estimation

Same techniques as in previous models...
1 Conditional least squares method
2 Maximum likelihood estimator (conditional or exact)
3 Generalized method of moments
4 Etc.

3. Application

Effective Fed fund rate: 1970:01 - 2010:01 (monthly observations). An ARMA(1,2) captures the dynamics of the effective Fed fund rate best.

ML estimation of the effective Fed fund rate: ARMA(1,2)

Coefficient   Estimate   Std. Error   p-value
µ             6.225      1.263        0.000
φ             0.967      0.012        0.000
θ_1           0.488      0.048        0.000
θ_2           0.082      0.048        0.088

Application

[Figure: effective Fed fund rate, ARMA(1,2) specification: actual, fitted, and residual series over the sample period.]

Application

[Figure: diagnostics of the ARMA(1,2) specification: actual vs. theoretical autocorrelation and partial autocorrelation functions, lags 1-24.]

Application

[Figure: impulse response function and accumulated response (± 2 S.E.) of the estimated ARMA(1,2) specification, horizons 1-24.]

4. Appendix

Appendix 1. Moments of an ARMA(1,1).

Appendix 1. Moments of an ARMA(1,1)

The properties of the moments of an ARMA(1,1) are a mixture of those of AR(1) and MA(1) processes. The mean is the same as the one of an AR(1) model (with a constant term):

E(X_t) = E(µ + φX_{t−1} + ɛ_t + θɛ_{t−1}) = µ + φE(X_{t−1}) + E(ɛ_t) + θE(ɛ_{t−1}) = µ + φE(X_t),

since E(X_t) = E(X_{t−j}) for all j (stationarity property) and E(ɛ_{t−j}) = 0 for all j (white noise). Therefore:

E(X_t) = µ/(1 − φ) ≡ m.

Appendix: Autocovariances

Trick: proceed in the same way and note that the Yule-Walker equations are the same as those of an AR(1) for h > 1.

For h = 0:

γ_X(0) = E[(X_t − m)(X_t − m)]
       = E[(φ(X_{t−1} − m) + ɛ_t + θɛ_{t−1})(X_t − m)]
       = φE[(X_t − m)(X_{t−1} − m)] + E[ɛ_t(X_t − m)] + θE[ɛ_{t−1}(X_t − m)]
       = φγ_X(1) + σ²_ɛ + θE[(φ(X_{t−1} − m) + ɛ_t + θɛ_{t−1})ɛ_{t−1}]
       = φγ_X(1) + σ²_ɛ + θφE[(X_{t−1} − m)ɛ_{t−1}] + θE[ɛ_t ɛ_{t−1}] + θ²E[ɛ²_{t−1}]
       = φγ_X(1) + σ²_ɛ (1 + θ(φ + θ)),

where φγ_X(1) is the AR(1) part, E[ɛ_t(X_t − m)] = σ²_ɛ, E[(X_{t−1} − m)ɛ_{t−1}] = σ²_ɛ, and E[ɛ_t ɛ_{t−1}] = 0.

Appendix

For h = 1:

γ_X(1) = E[(X_t − m)(X_{t−1} − m)]
       = E[(φ(X_{t−1} − m) + ɛ_t + θɛ_{t−1})(X_{t−1} − m)]
       = φE[(X_{t−1} − m)(X_{t−1} − m)] + E[ɛ_t(X_{t−1} − m)] + θE[ɛ_{t−1}(X_{t−1} − m)]
       = φγ_X(0) + θσ²_ɛ,

where φγ_X(0) is the AR(1) part. Solving for γ_X(0) and γ_X(1):

γ_X(0) ≡ V(X_t) = [(1 + 2φθ + θ²)/(1 − φ²)] σ²_ɛ
γ_X(1) ≡ Cov(X_t, X_{t−1}) = [(φ + θ)(1 + φθ)/(1 − φ²)] σ²_ɛ.

Appendix

For h > 1:

γ_X(h) = E[(X_t − m)(X_{t−h} − m)]
       = E[(φ(X_{t−1} − m) + ɛ_t + θɛ_{t−1})(X_{t−h} − m)]
       = φE[(X_{t−1} − m)(X_{t−h} − m)] + E[ɛ_t(X_{t−h} − m)] + θE[ɛ_{t−1}(X_{t−h} − m)]
       = φγ_X(h − 1),

where φγ_X(h − 1) is the AR(1) part, since E[ɛ_{t−j}(X_{t−h} − m)] = 0 for all h > j ((ɛ_t) is the innovation process). The autocovariance of order h therefore obeys the same difference (recurrence) equation as in an AR(1) model; only the initial value γ_X(1) changes!