Introduction to Time Series Analysis. Lecture 6.




Introduction to Time Series Analysis. Lecture 6.
Peter Bartlett
www.stat.berkeley.edu/~bartlett/courses/153-fall2010

Last lecture:
1. Causality
2. Invertibility
3. AR(p) models
4. ARMA(p,q) models

Introduction to Time Series Analysis. Lecture 6.
Peter Bartlett
www.stat.berkeley.edu/~bartlett/courses/153-fall2010

1. ARMA(p,q) models
2. Stationarity, causality and invertibility
3. The linear process representation of ARMA processes: ψ.
4. Autocovariance of an ARMA process.
5. Homogeneous linear difference equations.

Review: Causality

A linear process $\{X_t\}$ is causal (strictly, a causal function of $\{W_t\}$) if there is a

    $\psi(B) = \psi_0 + \psi_1 B + \psi_2 B^2 + \cdots$

with $\sum_{j=0}^{\infty} |\psi_j| < \infty$ and $X_t = \psi(B) W_t$.

Review: Invertibility

A linear process $\{X_t\}$ is invertible (strictly, an invertible function of $\{W_t\}$) if there is a

    $\pi(B) = \pi_0 + \pi_1 B + \pi_2 B^2 + \cdots$

with $\sum_{j=0}^{\infty} |\pi_j| < \infty$ and $W_t = \pi(B) X_t$.

Review: AR(p), Autoregressive models of order p

An AR(p) process $\{X_t\}$ is a stationary process that satisfies

    $X_t - \phi_1 X_{t-1} - \cdots - \phi_p X_{t-p} = W_t$,

where $\{W_t\} \sim WN(0, \sigma^2)$.

Equivalently, $\phi(B) X_t = W_t$, where $\phi(B) = 1 - \phi_1 B - \cdots - \phi_p B^p$.

Review: AR(p), Autoregressive models of order p

Theorem: A (unique) stationary solution to $\phi(B) X_t = W_t$ exists iff the roots of $\phi(z)$ avoid the unit circle:

    $|z| = 1 \;\Rightarrow\; \phi(z) = 1 - \phi_1 z - \cdots - \phi_p z^p \neq 0$.

This AR(p) process is causal iff the roots of $\phi(z)$ are outside the unit circle:

    $|z| \leq 1 \;\Rightarrow\; \phi(z) = 1 - \phi_1 z - \cdots - \phi_p z^p \neq 0$.

Reminder: Polynomials of a complex variable

Every degree $p$ polynomial $a(z)$ can be factorized as

    $a(z) = a_0 + a_1 z + \cdots + a_p z^p = a_p (z - z_1)(z - z_2) \cdots (z - z_p)$,

where $z_1, \ldots, z_p \in \mathbb{C}$ are the roots of $a(z)$. If the coefficients $a_0, a_1, \ldots, a_p$ are all real, then the roots are all either real or come in complex conjugate pairs, $z_i = \bar{z}_j$.

Example: $z + z^3 = z(1 + z^2) = (z - 0)(z - i)(z + i)$, that is, $z_1 = 0$, $z_2 = i$, $z_3 = -i$. So $z_1 \in \mathbb{R}$; $z_2, z_3 \notin \mathbb{R}$; $z_2 = \bar{z}_3$.

Recall notation: A complex number $z = a + ib$ has $\mathrm{Re}(z) = a$, $\mathrm{Im}(z) = b$, $\bar{z} = a - ib$, $|z| = \sqrt{a^2 + b^2}$, $\arg(z) = \tan^{-1}(b/a) \in (-\pi, \pi]$.
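
The same factorization can be recovered numerically with NumPy's root finder; a quick sketch (note that `np.roots` expects coefficients ordered from the highest degree down):

```python
import numpy as np

# z + z^3 has coefficients (z^3, z^2, z^1, z^0) = (1, 0, 1, 0).
roots = np.roots([1, 0, 1, 0])
print(np.round(roots, 10))  # 0, i, -i, up to ordering: one real root, one conjugate pair
```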

Review: Calculating ψ for an AR(p): general case

$\phi(B) X_t = W_t, \qquad X_t = \psi(B) W_t$,

so $1 = \psi(B) \phi(B)$:

    $1 = (\psi_0 + \psi_1 B + \cdots)(1 - \phi_1 B - \cdots - \phi_p B^p)$

$\Rightarrow\; 1 = \psi_0, \quad 0 = \psi_j \;(j < 0), \quad 0 = \phi(B)\psi_j \;(j > 0)$.

We can solve these linear difference equations in several ways: numerically, or by guessing the form of a solution and using an inductive proof, or by using the theory of linear difference equations.
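
For the numerical route, the relations above say $\psi_0 = 1$ and $\psi_j = \phi_1 \psi_{j-1} + \cdots + \phi_p \psi_{j-p}$ for $j > 0$ (terms with negative index vanish). A minimal sketch in Python (the helper name `ar_psi` is ours):

```python
import numpy as np

def ar_psi(phi, n):
    """First n psi-weights of a causal AR(p) with phi = [phi_1, ..., phi_p].

    psi_0 = 1 and psi_j = sum_k phi_k * psi_{j-k} for j > 0
    (terms with negative index are zero).
    """
    p = len(phi)
    psi = np.zeros(n)
    psi[0] = 1.0
    for j in range(1, n):
        for k in range(1, p + 1):
            if j - k >= 0:
                psi[j] += phi[k - 1] * psi[j - k]
    return psi

# Example: AR(1) with phi_1 = 0.5 gives psi_j = 0.5^j.
print(ar_psi([0.5], 6))  # [1, 0.5, 0.25, 0.125, 0.0625, 0.03125]
```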

Introduction to Time Series Analysis. Lecture 6.

1. Review: Causality, invertibility, AR(p) models
2. ARMA(p,q) models
3. Stationarity, causality and invertibility
4. The linear process representation of ARMA processes: ψ.
5. Autocovariance of an ARMA process.
6. Homogeneous linear difference equations.

ARMA(p,q): Autoregressive moving average models

An ARMA(p,q) process $\{X_t\}$ is a stationary process that satisfies

    $X_t - \phi_1 X_{t-1} - \cdots - \phi_p X_{t-p} = W_t + \theta_1 W_{t-1} + \cdots + \theta_q W_{t-q}$,

where $\{W_t\} \sim WN(0, \sigma^2)$.

AR(p) = ARMA(p,0): $\theta(B) = 1$.
MA(q) = ARMA(0,q): $\phi(B) = 1$.

ARMA(p,q): Autoregressive moving average models

An ARMA(p,q) process $\{X_t\}$ is a stationary process that satisfies

    $X_t - \phi_1 X_{t-1} - \cdots - \phi_p X_{t-p} = W_t + \theta_1 W_{t-1} + \cdots + \theta_q W_{t-q}$,

where $\{W_t\} \sim WN(0, \sigma^2)$.

Usually, we insist that $\phi_p, \theta_q \neq 0$ and that the polynomials

    $\phi(z) = 1 - \phi_1 z - \cdots - \phi_p z^p, \qquad \theta(z) = 1 + \theta_1 z + \cdots + \theta_q z^q$

have no common factors. This implies it is not a lower-order ARMA model.
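
To get a feel for the defining equation, one can simulate directly from the recursion; a small sketch, assuming Gaussian white noise (the function name and burn-in choice are ours):

```python
import numpy as np

def simulate_arma(phi, theta, n, sigma=1.0, burn=200, seed=0):
    """Simulate n observations from the ARMA recursion
    X_t = phi_1 X_{t-1} + ... + phi_p X_{t-p}
          + W_t + theta_1 W_{t-1} + ... + theta_q W_{t-q}."""
    rng = np.random.default_rng(seed)
    p, q = len(phi), len(theta)
    w = rng.normal(0.0, sigma, n + burn)
    x = np.zeros(n + burn)
    for t in range(n + burn):
        ar = sum(phi[k] * x[t - 1 - k] for k in range(p) if t - 1 - k >= 0)
        ma = sum(theta[k] * w[t - 1 - k] for k in range(q) if t - 1 - k >= 0)
        x[t] = ar + w[t] + ma
    return x[burn:]  # drop the burn-in so the start-up transient has died out

x = simulate_arma(phi=[0.5], theta=[0.4], n=500)  # an ARMA(1,1) sample path
```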

ARMA(p,q): An example of parameter redundancy

Consider a white noise process $W_t$. We can write

    $X_t = W_t$
    $\Rightarrow\; X_t - X_{t-1} + 0.25 X_{t-2} = W_t - W_{t-1} + 0.25 W_{t-2}$
    $\Rightarrow\; (1 - B + 0.25 B^2) X_t = (1 - B + 0.25 B^2) W_t$.

This is in the form of an ARMA(2,2) process, with

    $\phi(B) = 1 - B + 0.25 B^2, \qquad \theta(B) = 1 - B + 0.25 B^2$.

But it is white noise.

ARMA(p,q): An example of parameter redundancy

ARMA model: $\phi(B) X_t = \theta(B) W_t$, with

    $\phi(B) = 1 - B + 0.25 B^2, \qquad \theta(B) = 1 - B + 0.25 B^2$

$\Rightarrow\; X_t = \psi(B) W_t$ with $\psi(B) = \dfrac{\theta(B)}{\phi(B)} = 1$, i.e., $X_t = W_t$.
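
Numerically, the redundancy shows up as common roots of $\phi(z)$ and $\theta(z)$; a quick check with NumPy (coefficients ordered from the highest degree down):

```python
import numpy as np

phi = [0.25, -1.0, 1.0]    # 0.25 z^2 - z + 1, highest degree first
theta = [0.25, -1.0, 1.0]

print(np.roots(phi))    # [2. 2.]: a double root at z = 2
print(np.roots(theta))  # identical roots, so the factors cancel and psi(B) = 1
```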

Introduction to Time Series Analysis. Lecture 6.

1. Review: Causality, invertibility, AR(p) models
2. ARMA(p,q) models
3. Stationarity, causality and invertibility
4. The linear process representation of ARMA processes: ψ.
5. Autocovariance of an ARMA process.
6. Homogeneous linear difference equations.

Recall: Causality and Invertibility

A linear process $\{X_t\}$ is causal if there is a

    $\psi(B) = \psi_0 + \psi_1 B + \psi_2 B^2 + \cdots$

with $\sum_{j=0}^{\infty} |\psi_j| < \infty$ and $X_t = \psi(B) W_t$.

It is invertible if there is a

    $\pi(B) = \pi_0 + \pi_1 B + \pi_2 B^2 + \cdots$

with $\sum_{j=0}^{\infty} |\pi_j| < \infty$ and $W_t = \pi(B) X_t$.

ARMA(p,q): Stationarity, causality, and invertibility

Theorem: If $\phi$ and $\theta$ have no common factors, a (unique) stationary solution to $\phi(B) X_t = \theta(B) W_t$ exists iff the roots of $\phi(z)$ avoid the unit circle:

    $|z| = 1 \;\Rightarrow\; \phi(z) = 1 - \phi_1 z - \cdots - \phi_p z^p \neq 0$.

This ARMA(p,q) process is causal iff the roots of $\phi(z)$ are outside the unit circle:

    $|z| \leq 1 \;\Rightarrow\; \phi(z) = 1 - \phi_1 z - \cdots - \phi_p z^p \neq 0$.

It is invertible iff the roots of $\theta(z)$ are outside the unit circle:

    $|z| \leq 1 \;\Rightarrow\; \theta(z) = 1 + \theta_1 z + \cdots + \theta_q z^q \neq 0$.
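
All three conditions reduce to locating polynomial roots, so they are easy to check numerically. A sketch (the helper name `check_arma` is ours); the examples on the next two slides can be run through it:

```python
import numpy as np

def check_arma(phi, theta, tol=1e-8):
    """Check the root conditions for phi = [phi_1, ...] and theta = [theta_1, ...],
    where phi(z) = 1 - phi_1 z - ... - phi_p z^p and
          theta(z) = 1 + theta_1 z + ... + theta_q z^q."""
    # np.roots wants coefficients from the highest degree down.
    phi_roots = np.roots([-c for c in phi[::-1]] + [1.0]) if phi else np.array([])
    theta_roots = np.roots(list(theta[::-1]) + [1.0]) if theta else np.array([])
    return {
        "stationary": bool(np.all(np.abs(np.abs(phi_roots) - 1) > tol)),
        "causal": bool(np.all(np.abs(phi_roots) > 1 + tol)),
        "invertible": bool(np.all(np.abs(theta_roots) > 1 + tol)),
    }

# First example below: (1 - 1.5B) X_t = (1 + 0.2B) W_t.
print(check_arma(phi=[1.5], theta=[0.2]))
# {'stationary': True, 'causal': False, 'invertible': True}
```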

ARMA(p,q): Stationarity, causality, and invertibility

Example: $(1 - 1.5B) X_t = (1 + 0.2B) W_t$.

    $\phi(z) = 1 - 1.5z = -\tfrac{3}{2}\left(z - \tfrac{2}{3}\right), \qquad \theta(z) = 1 + 0.2z = \tfrac{1}{5}(z + 5)$.

1. $\phi$ and $\theta$ have no common factors, and $\phi$'s root is at $2/3$, which is not on the unit circle, so $\{X_t\}$ is an ARMA(1,1) process.
2. $\phi$'s root (at $2/3$) is inside the unit circle, so $\{X_t\}$ is not causal.
3. $\theta$'s root is at $-5$, which is outside the unit circle, so $\{X_t\}$ is invertible.

ARMA(p,q): Stationarity, causality, and invertibility

Example: $(1 + 0.25B^2) X_t = (1 + 2B) W_t$.

    $\phi(z) = 1 + 0.25z^2 = \tfrac{1}{4}\left(z^2 + 4\right) = \tfrac{1}{4}(z + 2i)(z - 2i), \qquad \theta(z) = 1 + 2z = 2\left(z + \tfrac{1}{2}\right)$.

1. $\phi$ and $\theta$ have no common factors, and $\phi$'s roots are at $\pm 2i$, which are not on the unit circle, so $\{X_t\}$ is an ARMA(2,1) process.
2. $\phi$'s roots (at $\pm 2i$) are outside the unit circle, so $\{X_t\}$ is causal.
3. $\theta$'s root (at $-1/2$) is inside the unit circle, so $\{X_t\}$ is not invertible.

Causality and Invertibility

Theorem: Let $\{X_t\}$ be an ARMA process defined by $\phi(B) X_t = \theta(B) W_t$. If all $|z| = 1$ have $\theta(z) \neq 0$, then there are polynomials $\tilde{\phi}$ and $\tilde{\theta}$ and a white noise sequence $\tilde{W}_t$ such that $\{X_t\}$ satisfies $\tilde{\phi}(B) X_t = \tilde{\theta}(B) \tilde{W}_t$, and this is a causal, invertible ARMA process.

So we'll stick to causal, invertible ARMA processes.

Introduction to Time Series Analysis. Lecture 6.

1. Review: Causality, invertibility, AR(p) models
2. ARMA(p,q) models
3. Stationarity, causality and invertibility
4. The linear process representation of ARMA processes: ψ.
5. Autocovariance of an ARMA process.
6. Homogeneous linear difference equations.

Calculating ψ for an ARMA(p,q): matching coefficients

Example: $(1 + 0.25B^2) X_t = (1 + 0.2B) W_t$, and $X_t = \psi(B) W_t$, so

    $1 + 0.2B = (1 + 0.25B^2)\,\psi(B)$
    $1 + 0.2B = (1 + 0.25B^2)(\psi_0 + \psi_1 B + \psi_2 B^2 + \cdots)$

$\Rightarrow\; 1 = \psi_0, \quad 0.2 = \psi_1, \quad 0 = \psi_2 + 0.25\psi_0, \quad 0 = \psi_3 + 0.25\psi_1, \quad \ldots$

Calculating ψ for an ARMA(p,q): example

$1 = \psi_0, \quad 0.2 = \psi_1, \quad 0 = \psi_j + 0.25\psi_{j-2} \;(j \geq 2)$.

We can think of this as $\theta_j = \phi(B)\psi_j$, with $\theta_0 = 1$ and $\theta_j = 0$ for $j < 0$, $j > q$.

This is a second-order linear difference equation in the $\psi_j$'s. We can use the $\theta_j$'s to give the initial conditions and solve it using the theory of homogeneous difference equations:

    $\psi_j = \left(1, \tfrac{1}{5}, -\tfrac{1}{4}, -\tfrac{1}{20}, \tfrac{1}{16}, \tfrac{1}{80}, -\tfrac{1}{64}, -\tfrac{1}{320}, \ldots\right)$.

Calculating ψ for an ARMA(p,q): general case so φ(b)x t = θ(b)w t, X t = ψ(b)w t θ(b) = ψ(b)φ(b) 1 + θ 1 B + + θ q B q = (ψ 0 + ψ 1 B + )(1 φ 1 B φ p B p ) 1 = ψ 0, θ 1 = ψ 1 φ 1 ψ 0, θ 2 = ψ 2 φ 1 ψ 1 φ 2 ψ 0,. This is equivalent to θ j = φ(b)ψ j, with θ 0 = 1, θ j = 0 for j < 0, j > q. 23

Introduction to Time Series Analysis. Lecture 6.

1. Review: Causality, invertibility, AR(p) models
2. ARMA(p,q) models
3. Stationarity, causality and invertibility
4. The linear process representation of ARMA processes: ψ.
5. Autocovariance of an ARMA process.
6. Homogeneous linear difference equations.

Autocovariance functions of linear processes

Consider a (mean 0) linear process $\{X_t\}$ defined by $X_t = \psi(B) W_t$:

    $\gamma(h) = E(X_t X_{t+h})$
    $\quad = E\big[(\psi_0 W_t + \psi_1 W_{t-1} + \psi_2 W_{t-2} + \cdots)(\psi_0 W_{t+h} + \psi_1 W_{t+h-1} + \psi_2 W_{t+h-2} + \cdots)\big]$
    $\quad = \sigma_w^2 (\psi_0 \psi_h + \psi_1 \psi_{h+1} + \psi_2 \psi_{h+2} + \cdots)$.
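
For a causal ARMA process the $\psi_j$ decay geometrically, so truncating this sum gives a simple numerical approximation to $\gamma(h)$; a sketch (the helper name is ours):

```python
import numpy as np

def acvf_from_psi(psi, h, sigma2=1.0):
    """gamma(h) ~ sigma^2 * sum_j psi_j psi_{j+h}, truncated at len(psi)."""
    psi = np.asarray(psi, dtype=float)
    return sigma2 * np.dot(psi[: len(psi) - h], psi[h:])

# MA(1) with theta_1 = 0.5: psi = (1, 0.5, 0, 0, ...), and the sum is exact.
psi = np.array([1.0, 0.5, 0.0, 0.0])
print(acvf_from_psi(psi, 0))  # 1.25 = sigma^2 (1 + theta_1^2)
print(acvf_from_psi(psi, 1))  # 0.5  = sigma^2 theta_1
print(acvf_from_psi(psi, 2))  # 0.0
```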

Autocovariance functions of MA processes

Consider an MA(q) process $\{X_t\}$ defined by $X_t = \theta(B) W_t$:

    $\gamma(h) = \begin{cases} \sigma_w^2 \sum_{j=0}^{q-h} \theta_j \theta_{j+h} & \text{if } |h| \leq q, \\ 0 & \text{if } |h| > q. \end{cases}$

Autocovariance functions of ARMA processes

ARMA process: $\phi(B) X_t = \theta(B) W_t$.

To compute $\gamma$, we can compute $\psi$, and then use

    $\gamma(h) = \sigma_w^2 (\psi_0 \psi_h + \psi_1 \psi_{h+1} + \psi_2 \psi_{h+2} + \cdots)$.

Autocovariance functions of ARMA processes

An alternative approach:

    $X_t - \phi_1 X_{t-1} - \cdots - \phi_p X_{t-p} = W_t + \theta_1 W_{t-1} + \cdots + \theta_q W_{t-q}$,

so

    $E\big((X_t - \phi_1 X_{t-1} - \cdots - \phi_p X_{t-p}) X_{t-h}\big) = E\big((W_t + \theta_1 W_{t-1} + \cdots + \theta_q W_{t-q}) X_{t-h}\big)$,

that is,

    $\gamma(h) - \phi_1 \gamma(h-1) - \cdots - \phi_p \gamma(h-p) = E(\theta_h W_{t-h} X_{t-h} + \cdots + \theta_q W_{t-q} X_{t-h}) = \sigma_w^2 \sum_{j=0}^{q-h} \theta_{h+j} \psi_j$.

(Write $\theta_0 = 1$.) This is a linear difference equation.

Autocovariance functions of ARMA processes: Example

$(1 + 0.25B^2) X_t = (1 + 0.2B) W_t, \qquad X_t = \psi(B) W_t$,

    $\psi_j = \left(1, \tfrac{1}{5}, -\tfrac{1}{4}, -\tfrac{1}{20}, \tfrac{1}{16}, \tfrac{1}{80}, -\tfrac{1}{64}, -\tfrac{1}{320}, \ldots\right)$.

    $\gamma(h) - \phi_1 \gamma(h-1) - \phi_2 \gamma(h-2) = \sigma_w^2 \sum_{j=0}^{q-h} \theta_{h+j} \psi_j$

$\Rightarrow\; \gamma(h) + 0.25\gamma(h-2) = \begin{cases} \sigma_w^2 (\psi_0 + 0.2\psi_1) & \text{if } h = 0, \\ 0.2\,\sigma_w^2 \psi_0 & \text{if } h = 1, \\ 0 & \text{otherwise.} \end{cases}$

Autocovariance functions of ARMA processes: Example

We have the homogeneous linear difference equation

    $\gamma(h) + 0.25\gamma(h-2) = 0$

for $h \geq 2$, with initial conditions

    $\gamma(0) + 0.25\gamma(-2) = \sigma_w^2 (1 + 1/25)$
    $\gamma(1) + 0.25\gamma(-1) = \sigma_w^2 / 5$.

We can solve these linear equations to determine $\gamma$. Or we can use the theory of linear difference equations...
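
Carrying the first route through numerically (a sketch with $\sigma_w^2 = 1$; we use the symmetry $\gamma(-h) = \gamma(h)$ and substitute $\gamma(2) = -0.25\,\gamma(0)$ from the recursion):

```python
import numpy as np

sigma2 = 1.0

# Unknowns: gamma(0), gamma(1).
# h = 0:  gamma(0) + 0.25*gamma(2) = (1 - 1/16) gamma(0) = sigma2 * (1 + 1/25)
# h = 1:  gamma(1) + 0.25*gamma(1) = 1.25 gamma(1)       = sigma2 / 5
A = np.array([[1.0 - 0.25 * 0.25, 0.0],
              [0.0, 1.25]])
b = np.array([sigma2 * (1 + 1 / 25), sigma2 / 5])
gamma0, gamma1 = np.linalg.solve(A, b)
print(gamma0, gamma1)  # approximately 1.1093 (= 16/15 * 26/25) and 0.16

# Higher lags follow from gamma(h) = -0.25 * gamma(h-2):
gamma = [gamma0, gamma1]
for h in range(2, 8):
    gamma.append(-0.25 * gamma[h - 2])
```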

Introduction to Time Series Analysis. Lecture 6.
Peter Bartlett
www.stat.berkeley.edu/~bartlett/courses/153-fall2010

1. ARMA(p,q) models
2. Stationarity, causality and invertibility
3. The linear process representation of ARMA processes: ψ.
4. Autocovariance of an ARMA process.
5. Homogeneous linear difference equations.

Difference equations

Examples:

    $x_t - 3x_{t-1} = 0$  (first order, linear)
    $x_t - x_{t-1} x_{t-2} = 0$  (2nd order, nonlinear)
    $x_t + 2x_{t-1} - x_{t-3}^2 = 0$  (3rd order, nonlinear)

Homogeneous linear diff eqns with constant coefficients

    $a_0 x_t + a_1 x_{t-1} + \cdots + a_k x_{t-k} = 0$
    $\Leftrightarrow\; \left(a_0 + a_1 B + \cdots + a_k B^k\right) x_t = 0$
    $\Leftrightarrow\; a(B) x_t = 0$

auxiliary equation:

    $a_0 + a_1 z + \cdots + a_k z^k = 0$
    $\Leftrightarrow\; (z - z_1)(z - z_2) \cdots (z - z_k) = 0$,

where $z_1, z_2, \ldots, z_k \in \mathbb{C}$ are the roots of this characteristic polynomial. Thus,

    $a(B) x_t = 0 \;\Leftrightarrow\; (B - z_1)(B - z_2) \cdots (B - z_k) x_t = 0$.

Homogeneous linear diff eqns with constant coefficients a(b)x t = 0 (B z 1 )(B z 2 ) (B z k )x t = 0. So any {x t } satisfying (B z i )x t = 0 for some i also satisfies a(b)x t = 0. Three cases: 1. The z i are real and distinct. 2. The z i are complex and distinct. 3. Some z i are repeated. 34

Homogeneous linear diff eqns with constant coefficients 1. The z i are real and distinct. a(b)x t = 0 (B z 1 )(B z 2 ) (B z k )x t = 0 for some constants c 1,...,c k. x t is a linear combination of solutions to (B z 1 )x t = 0, (B z 2 )x t = 0,..., (B z k )x t = 0 x t = c 1 z t 1 + c 2 z t 2 + + c k z t k, 35

Homogeneous linear diff eqns with constant coefficients

1. The $z_i$ are real and distinct. e.g., $z_1 = 1.2$, $z_2 = 1.3$.

[Figure: plots of $c_1 z_1^{-t} + c_2 z_2^{-t}$ for $t = 0, \ldots, 20$ and $(c_1, c_2) = (1, 0)$, $(0, 1)$, $(-0.8, 0.2)$: geometrically decaying solutions, since $|z_1|, |z_2| > 1$.]

Reminder: Complex exponentials

    $a + ib = r e^{i\theta} = r(\cos\theta + i\sin\theta)$,

where

    $r = |a + ib| = \sqrt{a^2 + b^2}, \qquad \theta = \tan^{-1}\left(\frac{b}{a}\right) \in (-\pi, \pi]$.

Thus,

    $r_1 e^{i\theta_1}\, r_2 e^{i\theta_2} = (r_1 r_2) e^{i(\theta_1 + \theta_2)}, \qquad z \bar{z} = |z|^2$.

Homogeneous linear diff eqns with constant coefficients

2. The $z_i$ are complex and distinct. As before,

    $a(B) x_t = 0 \;\Leftrightarrow\; x_t = c_1 z_1^{-t} + c_2 z_2^{-t} + \cdots + c_k z_k^{-t}$.

If $z_1 \notin \mathbb{R}$, since $a_1, \ldots, a_k$ are real, we must have the complex conjugate root, $z_j = \bar{z}_1$. And for $x_t$ to be real, we must have $c_j = \bar{c}_1$. For example:

    $x_t = c\, z_1^{-t} + \bar{c}\, \bar{z}_1^{-t}$
    $\quad = r e^{i\theta} |z_1|^{-t} e^{-i\omega t} + r e^{-i\theta} |z_1|^{-t} e^{i\omega t}$
    $\quad = r |z_1|^{-t} \left( e^{i(\theta - \omega t)} + e^{-i(\theta - \omega t)} \right)$
    $\quad = 2r |z_1|^{-t} \cos(\omega t - \theta)$,

where $z_1 = |z_1| e^{i\omega}$ and $c = r e^{i\theta}$.
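
A numerical sanity check of this identity, with an arbitrarily chosen root and coefficient (the values are ours, not from the lecture):

```python
import numpy as np

z1 = 1.2 + 1.0j          # a complex root outside the unit circle
c = 0.3 * np.exp(0.7j)   # arbitrary coefficient c = r e^{i theta}
r, theta = abs(c), np.angle(c)
omega = np.angle(z1)

t = np.arange(20.0)
direct = c * z1 ** (-t) + np.conj(c) * np.conj(z1) ** (-t)
closed = 2 * r * abs(z1) ** (-t) * np.cos(omega * t - theta)

print(np.allclose(direct.real, closed), np.max(np.abs(direct.imag)))
# True, ~1e-16: the conjugate pair sums to a real, damped sinusoid
```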

Homogeneous linear diff eqns with constant coefficients

2. The $z_i$ are complex and distinct. e.g., $z_1 = 1.2 + i$, $z_2 = 1.2 - i$.

[Figure: plots of $c_1 z_1^{-t} + c_2 z_2^{-t}$ for $t = 0, \ldots, 20$ and $c = 1.0 + 0.0i$, $c = 0.0 + 1.0i$, $c = -0.8 - 0.2i$: damped oscillations, since $|z_1| > 1$.]

Homogeneous linear diff eqns with constant coefficients

2. The $z_i$ are complex and distinct. e.g., $z_1 = 1 + 0.1i$, $z_2 = 1 - 0.1i$.

[Figure: plots of $c_1 z_1^{-t} + c_2 z_2^{-t}$ for $t = 0, \ldots, 70$ with the same choices of $c$: oscillations that decay much more slowly, since $|z_1|$ is close to 1.]