4.3 Moving Average Process MA(q)

Definition 4.5. $\{X_t\}$ is a moving-average process of order $q$ if
$$X_t = Z_t + \theta_1 Z_{t-1} + \ldots + \theta_q Z_{t-q}, \qquad (4.9)$$
where $Z_t \sim WN(0, \sigma^2)$ and $\theta_1, \ldots, \theta_q$ are constants.

Remark 4.6. $X_t$ is a linear combination of $q+1$ white noise variables and we say that it is $q$-correlated, that is, $X_t$ and $X_{t+\tau}$ are uncorrelated for all lags $\tau > q$.

Remark 4.7. If $Z_t$ is an i.i.d. process, then $X_t$ is a strictly stationary TS since
$$(Z_t, \ldots, Z_{t-q})^T \stackrel{d}{=} (Z_{t+\tau}, \ldots, Z_{t-q+\tau})^T$$
for all $\tau$. It is then called $q$-dependent, that is, $X_t$ and $X_{t+\tau}$ are independent for all lags $\tau > q$.

Remark 4.8. Obviously, IID noise is a 0-dependent TS, and white noise is a 0-correlated TS. MA(1) is a 1-correlated TS if it is a combination of WN r.v.s, and 1-dependent if it is a combination of IID r.v.s.

Remark 4.9. The MA(q) process can also be written in the following equivalent form,
$$X_t = \theta(B) Z_t, \qquad (4.10)$$
where the moving-average operator
$$\theta(B) = 1 + \theta_1 B + \theta_2 B^2 + \ldots + \theta_q B^q \qquad (4.11)$$
defines a linear combination of values in the shift operator $B^k Z_t = Z_{t-k}$.
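As a quick illustration (a minimal sketch, not part of the original notes), the defining equation (4.9) can be simulated directly from a white-noise sequence; the function name `simulate_ma` and the parameter values are illustrative assumptions.

```python
import numpy as np

def simulate_ma(theta, n, sigma=1.0, seed=0):
    """Simulate n values of an MA(q) process, equation (4.9), from Gaussian white noise."""
    rng = np.random.default_rng(seed)
    q = len(theta)
    z = rng.normal(0.0, sigma, size=n + q)   # q extra noise values to start the series
    coeffs = np.r_[1.0, theta]               # (theta_0, theta_1, ..., theta_q) with theta_0 = 1
    # 'valid' convolution gives X_t = Z_t + theta_1 Z_{t-1} + ... + theta_q Z_{t-q}
    return np.convolve(z, coeffs, mode="valid")

x = simulate_ma(theta=[0.5, 0.5], n=100)     # an MA(2) path, as in Example 4.4 below
```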

Example 4.4. MA(2) process. This process is written as
$$X_t = Z_t + \theta_1 Z_{t-1} + \theta_2 Z_{t-2} = (1 + \theta_1 B + \theta_2 B^2) Z_t. \qquad (4.12)$$
What are the properties of MA(2)? As it is a combination of a zero-mean white noise, it also has zero mean, i.e.,
$$\mathbb{E} X_t = \mathbb{E}(Z_t + \theta_1 Z_{t-1} + \theta_2 Z_{t-2}) = 0.$$
It is easy to calculate the covariance of $X_t$ and $X_{t+\tau}$. We get
$$\gamma(\tau) = \operatorname{cov}(X_t, X_{t+\tau}) = \begin{cases} (1 + \theta_1^2 + \theta_2^2)\sigma^2 & \text{for } \tau = 0, \\ (\theta_1 + \theta_1\theta_2)\sigma^2 & \text{for } \tau = \pm 1, \\ \theta_2 \sigma^2 & \text{for } \tau = \pm 2, \\ 0 & \text{for } |\tau| > 2, \end{cases}$$
which shows that the autocovariances depend on the lag, but not on time. Dividing $\gamma(\tau)$ by $\gamma(0)$ we obtain the autocorrelation function,
$$\rho(\tau) = \begin{cases} 1 & \text{for } \tau = 0, \\ \dfrac{\theta_1 + \theta_1\theta_2}{1 + \theta_1^2 + \theta_2^2} & \text{for } \tau = \pm 1, \\ \dfrac{\theta_2}{1 + \theta_1^2 + \theta_2^2} & \text{for } \tau = \pm 2, \\ 0 & \text{for } |\tau| > 2. \end{cases}$$
The MA(2) process is a weakly stationary, 2-correlated TS.

Figure 4.5 shows MA(2) processes obtained from the simulated Gaussian white noise shown in Figure 4.1 for various values of the parameters $(\theta_1, \theta_2)$. The blue series is $x_t = z_t + 0.5 z_{t-1} + 0.5 z_{t-2}$, while the purple series is $x_t = z_t + 5 z_{t-1} + 5 z_{t-2}$, where $z_t$ are realizations of an i.i.d. Gaussian noise. As you can see, very different processes can be obtained for different sets of the parameters. This flexibility makes MA(q) a very large family of models, a property reinforced by the following proposition.

Proposition 4.2. If $\{X_t\}$ is a stationary $q$-correlated time series with mean zero, then it can be represented as an MA(q) process.
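To make the dependence on $(\theta_1, \theta_2)$ concrete, here is a short numerical check (illustrative only, not from the notes): it compares the theoretical MA(2) autocorrelations above with the sample ACF of a simulated path; the helper names are assumptions.

```python
import numpy as np

def ma2_acf(theta1, theta2):
    """Theoretical MA(2) autocorrelations at lags 0, 1, 2 (zero beyond lag 2)."""
    denom = 1.0 + theta1**2 + theta2**2
    return np.array([1.0, (theta1 + theta1 * theta2) / denom, theta2 / denom])

def sample_acf(x, max_lag):
    """Sample autocorrelations rho_hat(0), ..., rho_hat(max_lag)."""
    x = x - x.mean()
    c0 = np.dot(x, x) / len(x)
    return np.array([np.dot(x[: len(x) - k], x[k:]) / len(x) / c0 for k in range(max_lag + 1)])

rng = np.random.default_rng(1)
z = rng.normal(size=5002)                             # Gaussian white noise
x = np.convolve(z, [1.0, 0.5, 0.5], mode="valid")     # MA(2) with (theta_1, theta_2) = (0.5, 0.5)

print(ma2_acf(0.5, 0.5))   # [1.0, 0.5, 0.333...]
print(sample_acf(x, 4))    # close to the values above, and near 0 beyond lag 2
```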

Figure 4.5: Two simulated MA(2) processes, both obtained from the white noise shown in Figure 4.1, but for different sets of parameters: $(\theta_1, \theta_2) = (0.5, 0.5)$ and $(\theta_1, \theta_2) = (5, 5)$.

Figure 4.6: (a) Sample ACF for $x_t = z_t + 0.5 z_{t-1} + 0.5 z_{t-2}$ and (b) for $x_t = z_t + 5 z_{t-1} + 5 z_{t-2}$.

Also, the following theorem gives the form of the ACF for a general MA(q).

Theorem 4.2. An MA(q) process (as in Definition 4.5) is a weakly stationary TS with the ACVF
$$\gamma(\tau) = \begin{cases} \sigma^2 \sum_{j=0}^{q - |\tau|} \theta_j \theta_{j + |\tau|}, & \text{if } |\tau| \le q, \\ 0, & \text{if } |\tau| > q, \end{cases} \qquad (4.13)$$
where $\theta_0$ is defined to be 1.

The ACF of an MA(q) has a distinct cut-off at lag $\tau = q$. Furthermore, if $q$ is small, the maximum value of $\rho(1)$ is well below unity. It can be shown that
$$\rho(1) \le \cos\left(\frac{\pi}{q + 2}\right). \qquad (4.14)$$

4.3.1 Non-uniqueness of MA Models

Consider an example of MA(1),
$$X_t = Z_t + \theta Z_{t-1},$$
whose ACF is
$$\rho(\tau) = \begin{cases} 1, & \text{if } \tau = 0, \\ \dfrac{\theta}{1 + \theta^2}, & \text{if } \tau = \pm 1, \\ 0, & \text{if } |\tau| > 1. \end{cases}$$
For $q = 1$, formula (4.14) means that the maximum value of $\rho(1)$ is 0.5. This can be verified directly from the formula for the ACF above. Treating $\rho(1)$ as a function of $\theta$, we can calculate its extrema. Denote
$$f(\theta) = \frac{\theta}{1 + \theta^2}.$$
Then
$$f'(\theta) = \frac{1 - \theta^2}{(1 + \theta^2)^2}.$$
The derivative is equal to 0 at $\theta = \pm 1$, and the function $f$ attains its maximum at $\theta = 1$ and its minimum at $\theta = -1$. We have
$$f(1) = \frac{1}{2}, \qquad f(-1) = -\frac{1}{2}.$$
This fact can be helpful in recognizing MA(1) processes. In fact, an MA(1) with $\theta = \pm 1$ may be uniquely identified from the autocorrelation function.
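A small numerical check (an illustrative sketch, not part of the notes; the helper name `ma_acvf` is an assumption) of formula (4.13) and the bound (4.14): it evaluates the theoretical ACVF of an MA(q) for given coefficients, and scans $\rho(1) = \theta/(1+\theta^2)$ over a grid of $\theta$ to confirm the maximum 0.5 for $q = 1$.

```python
import numpy as np

def ma_acvf(theta, sigma2=1.0):
    """ACVF gamma(0..q) of an MA(q) via (4.13): gamma(tau) = sigma^2 * sum_j theta_j theta_{j+|tau|}."""
    th = np.r_[1.0, theta]                           # theta_0 = 1
    q = len(theta)
    return np.array([sigma2 * np.dot(th[: q + 1 - k], th[k:]) for k in range(q + 1)])

gamma = ma_acvf([0.5, 0.5])
print(gamma)                                         # [1.5, 0.75, 0.5] for theta = (0.5, 0.5), sigma^2 = 1
print(gamma[1:] / gamma[0])                          # rho(1) = 0.5, rho(2) = 1/3, matching Example 4.4

# For MA(1), rho(1) = theta / (1 + theta^2) never exceeds cos(pi / 3) = 0.5:
thetas = np.linspace(-10, 10, 100001)
print(np.max(thetas / (1 + thetas**2)), np.cos(np.pi / 3))   # both approximately 0.5
```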

However, it is easy to see that the form of the ACF stays the same for $\theta$ and for $1/\theta$. Take for example $\theta = 5$ and $\theta = 1/5$. In both cases
$$\rho(\tau) = \begin{cases} 1, & \text{if } \tau = 0, \\ \dfrac{5}{26}, & \text{if } \tau = \pm 1, \\ 0, & \text{if } |\tau| > 1. \end{cases}$$
Also, the pair $\sigma^2 = 1$, $\theta = 5$ gives the same ACVF as the pair $\sigma^2 = 25$, $\theta = 1/5$, namely
$$\gamma(\tau) = \begin{cases} (1 + \theta^2)\sigma^2 = 26, & \text{if } \tau = 0, \\ \theta\sigma^2 = 5, & \text{if } \tau = \pm 1, \\ 0, & \text{if } |\tau| > 1. \end{cases}$$
Hence, the MA(1) processes
$$X_t = Z_t + \tfrac{1}{5} Z_{t-1}, \qquad Z_t \sim \text{iid } N(0, 25),$$
and
$$X_t = Y_t + 5 Y_{t-1}, \qquad Y_t \sim \text{iid } N(0, 1),$$
are the same. We can observe only the variable $X_t$, not the noise variable, so we cannot distinguish between these two models. Except for the case $\theta = \pm 1$, a particular autocorrelation function will be compatible with two models. To which of the two models should we restrict our attention?

4.3.2 Invertibility of MA Processes

The MA(1) process can be expressed in terms of lagged values of $X_t$ by substituting repeatedly for lagged values of $Z_t$. We have
$$Z_t = X_t - \theta Z_{t-1}.$$
The substitution yields
$$\begin{aligned}
Z_t &= X_t - \theta Z_{t-1} \\
    &= X_t - \theta(X_{t-1} - \theta Z_{t-2}) \\
    &= X_t - \theta X_{t-1} + \theta^2 Z_{t-2} \\
    &= X_t - \theta X_{t-1} + \theta^2 (X_{t-2} - \theta Z_{t-3}) \\
    &= X_t - \theta X_{t-1} + \theta^2 X_{t-2} - \theta^3 Z_{t-3} \\
    &= \ldots \\
    &= X_t - \theta X_{t-1} + \theta^2 X_{t-2} - \theta^3 X_{t-3} + \ldots + (-\theta)^{n-1} X_{t-n+1} + (-\theta)^n Z_{t-n}.
\end{aligned}$$
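A quick numerical confirmation (illustrative only) that the two parameterizations above give identical autocovariances, and hence cannot be told apart from the observed series:

```python
import numpy as np

def ma1_acvf(theta, sigma2):
    """ACVF of MA(1) at lags 0 and 1: gamma(0) = (1 + theta^2) sigma^2, gamma(1) = theta sigma^2."""
    return (1.0 + theta**2) * sigma2, theta * sigma2

print(ma1_acvf(5.0, 1.0))      # (26.0, 5.0)
print(ma1_acvf(0.2, 25.0))     # (26.0, 5.0) -- the same ACVF, so the two models are indistinguishable
```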

This can be rewritten as
$$(-\theta)^n Z_{t-n} = Z_t - \sum_{j=0}^{n-1} (-\theta)^j X_{t-j}.$$
However, if $|\theta| < 1$, then
$$\mathbb{E}\left( Z_t - \sum_{j=0}^{n-1} (-\theta)^j X_{t-j} \right)^2 = \mathbb{E}\left( \theta^{2n} Z_{t-n}^2 \right) \xrightarrow[n \to \infty]{} 0,$$
and we say that the sum is convergent in the mean square sense. Hence, we obtain another representation of the model,
$$Z_t = \sum_{j=0}^{\infty} (-\theta)^j X_{t-j}.$$
This is a representation belonging to another class of models, called infinite autoregressive (AR($\infty$)) models. So we have inverted the MA(1) to an infinite AR. This was possible due to the assumption $|\theta| < 1$. Such a process is called an invertible process. Invertibility is a desired property of a TS model, so in the example above we would choose the model with $\sigma^2 = 25$, $\theta = \tfrac{1}{5}$.
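The mean-square convergence above can also be seen numerically (a sketch under the stated assumption $|\theta| < 1$; names and values are illustrative): simulate an MA(1) path, reconstruct the noise from the truncated AR($\infty$) sum $\sum_{j} (-\theta)^j X_{t-j}$, and check that the reconstruction error shrinks as the truncation order grows.

```python
import numpy as np

rng = np.random.default_rng(2)
theta, sigma = 0.5, 1.0                      # invertible case: |theta| < 1
n = 2000
z = rng.normal(0.0, sigma, size=n + 1)
x = z[1:] + theta * z[:-1]                   # MA(1): X_t = Z_t + theta Z_{t-1}

def reconstruct_noise(x, theta, order):
    """Truncated AR(inf) inversion: Z_t ~ sum_{j=0}^{order} (-theta)^j X_{t-j}."""
    t = len(x) - 1                           # reconstruct the last noise value
    return sum((-theta) ** j * x[t - j] for j in range(order + 1))

true_z = z[-1]
for order in (1, 5, 20):
    approx = reconstruct_noise(x, theta, order)
    print(order, abs(approx - true_z))       # error decays roughly like |theta|^(order + 1)
```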

4.4 Linear Processes

Definition 4.6. The TS $\{X_t\}$ is called a linear process if it has the representation
$$X_t = \sum_{j=-\infty}^{\infty} \psi_j Z_{t-j}, \qquad (4.15)$$
for all $t$, where $Z_t \sim WN(0, \sigma^2)$ and $\{\psi_j\}$ is a sequence of constants such that $\sum_{j=-\infty}^{\infty} |\psi_j| < \infty$.

Remark 4.10. The condition $\sum_{j=-\infty}^{\infty} |\psi_j| < \infty$ ensures that the process converges in the mean square sense, that is,
$$\mathbb{E}\left( X_t - \sum_{j=-n}^{n} \psi_j Z_{t-j} \right)^2 \to 0 \quad \text{as } n \to \infty.$$

Remark 4.11. MA($\infty$) is a linear process with $\psi_j = 0$ for $j < 0$ and $\psi_j = \theta_j$ for $j \ge 0$, that is, MA($\infty$) has the representation
$$X_t = \sum_{j=0}^{\infty} \theta_j Z_{t-j},$$
where $\theta_0 = 1$.

Note that formula (4.15) can be written using the backward shift operator $B$. We have $Z_{t-j} = B^j Z_t$. Hence
$$X_t = \sum_{j=-\infty}^{\infty} \psi_j Z_{t-j} = \left( \sum_{j=-\infty}^{\infty} \psi_j B^j \right) Z_t.$$
Denoting
$$\psi(B) = \sum_{j=-\infty}^{\infty} \psi_j B^j, \qquad (4.16)$$
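As an illustration (a sketch with hypothetical names; the geometric weights $\psi_j = 0.8^j$ are chosen only because they are absolutely summable), a linear process of MA($\infty$) type can be approximated by applying a truncated filter $\psi(B)$ to white noise:

```python
import numpy as np

rng = np.random.default_rng(3)
n, m = 1000, 50                              # series length and truncation order of the filter
psi = 0.8 ** np.arange(m + 1)                # psi_j = 0.8^j for j >= 0, psi_j = 0 for j < 0
z = rng.normal(0.0, 1.0, size=n + m)         # white noise, with m extra start-up values

# X_t = sum_{j=0}^{m} psi_j Z_{t-j}, i.e. a truncated version of (4.15)
x = np.convolve(z, psi, mode="valid")
print(x.shape)                               # (1000,) values of the (approximate) linear process
```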

we can write the linear process in a neat way,
$$X_t = \psi(B) Z_t.$$
The operator $\psi(B)$ is a linear filter which, when applied to a stationary process, produces a stationary process. This fact is proved in the following proposition.

Proposition 4.3. Let $\{Y_t\}$ be a stationary TS with mean zero and autocovariance function $\gamma_Y$. If $\sum_{j=-\infty}^{\infty} |\psi_j| < \infty$, then the process
$$X_t = \sum_{j=-\infty}^{\infty} \psi_j Y_{t-j} = \psi(B) Y_t \qquad (4.17)$$
is stationary with mean zero and autocovariance function
$$\gamma_X(\tau) = \sum_{j=-\infty}^{\infty} \sum_{k=-\infty}^{\infty} \psi_j \psi_k \gamma_Y(\tau - k + j). \qquad (4.18)$$

Proof. The assumption $\sum_{j=-\infty}^{\infty} |\psi_j| < \infty$ assures convergence of the series. Now, since $\mathbb{E} Y_t = 0$, we have
$$\mathbb{E} X_t = \mathbb{E}\left( \sum_{j=-\infty}^{\infty} \psi_j Y_{t-j} \right) = \sum_{j=-\infty}^{\infty} \psi_j \mathbb{E}(Y_{t-j}) = 0$$
and
$$\begin{aligned}
\mathbb{E}(X_t X_{t+\tau}) &= \mathbb{E}\left[ \left( \sum_{j=-\infty}^{\infty} \psi_j Y_{t-j} \right) \left( \sum_{k=-\infty}^{\infty} \psi_k Y_{t+\tau-k} \right) \right] \\
&= \sum_{j=-\infty}^{\infty} \sum_{k=-\infty}^{\infty} \psi_j \psi_k \mathbb{E}(Y_{t-j} Y_{t+\tau-k}) \\
&= \sum_{j=-\infty}^{\infty} \sum_{k=-\infty}^{\infty} \psi_j \psi_k \gamma_Y(\tau - k + j).
\end{aligned}$$
It means that $\{X_t\}$ is a stationary TS with the autocovariance function given by formula (4.18).

Corollary 4.1. If $\{Y_t\}$ is a white noise process, then $\{X_t\}$ given by (4.17) is a stationary linear process with zero mean and the ACVF
$$\gamma_X(\tau) = \sum_{j=-\infty}^{\infty} \psi_j \psi_{j+\tau} \sigma^2. \qquad (4.19)$$
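Finally, a numerical spot-check (illustrative only, reusing the geometric weights from the earlier sketch) of Corollary 4.1: the sample autocovariances of filtered white noise should approach $\gamma_X(\tau) = \sigma^2 \sum_j \psi_j \psi_{j+\tau}$.

```python
import numpy as np

rng = np.random.default_rng(4)
n, m, sigma2 = 200000, 50, 1.0
psi = 0.8 ** np.arange(m + 1)                        # absolutely summable filter weights psi_j = 0.8^j
z = rng.normal(0.0, np.sqrt(sigma2), size=n + m)
x = np.convolve(z, psi, mode="valid")                # X_t = psi(B) Z_t (truncated filter)

def theoretical_gamma(psi, tau, sigma2):
    """gamma_X(tau) = sigma^2 * sum_j psi_j psi_{j+tau}, as in (4.19)."""
    return sigma2 * np.dot(psi[: len(psi) - tau], psi[tau:])

def sample_gamma(x, tau):
    """Sample autocovariance of x at lag tau."""
    xc = x - x.mean()
    return np.dot(xc[: len(xc) - tau], xc[tau:]) / len(xc)

for tau in (0, 1, 2, 5):
    print(tau, theoretical_gamma(psi, tau, sigma2), sample_gamma(x, tau))   # the two columns nearly agree
```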