Lecture 6. Cross covariance and cross spectra. Introduction to ARMA models. Jesper Rydén Department of Mathematics, Uppsala University jesper@math.uu.se Stationary Stochastic Processes Fall 2011
Cross spectra

Theorem. If {X(t)} and {Y(t)} are stationary, correlated processes in continuous time, with continuous covariance-matrix function r_{X,Y}(\tau), there exists a spectral-density matrix function

R_{X,Y}(f) = \begin{pmatrix} R_X(f) & R_{X,Y}(f) \\ R_{Y,X}(f) & R_Y(f) \end{pmatrix}

such that

r_{X,Y}(\tau) = \int e^{i 2\pi f \tau} R_{X,Y}(f) \, df.
Cross spectra

The cross-spectral density R_{X,Y}(f) is a complex-valued function such that R_{X,Y}(-f) = \overline{R_{X,Y}(f)}.

The matrix R_{X,Y} is of positive type, i.e. for every pair of complex numbers z_1 and z_2,

|z_1|^2 R_X(f) + z_1 \bar{z}_2 R_{X,Y}(f) + \bar{z}_1 z_2 R_{Y,X}(f) + |z_2|^2 R_Y(f) \ge 0,

which implies

0 \le \frac{|R_{X,Y}(f)|^2}{R_X(f) R_Y(f)} \le 1.

If \int |r_{X,Y}(\tau)| \, d\tau < \infty, the inversion formula

R_{X,Y}(f) = \int e^{-i 2\pi f \tau} r_{X,Y}(\tau) \, d\tau

holds.
Example: Derivative

The cross-spectrum between a stationary process {X(t)} and its derivative {X'(t)} can be found by use of Theorem 5.5 and the results above:

R_{X,X'}(f) = (i 2\pi f) R_X(f).
An important example: Input-output relations

Let {X(t)} and {Z(t)} be stationary processes and assume that the disturbance is uncorrelated with {X(t)}, i.e. r_{X,Z}(s,t) = 0 for all s and t. Then

Y(t) = \int h(t-u) X(u) \, du + Z(t) = \int h(u) X(t-u) \, du + Z(t).

Statistical properties discussed on the blackboard.
Interpretation of cross-spectral density

Write the cross-spectral density in polar form:

R_{X,Y}(f) = A_{X,Y}(f) e^{i \Phi_{X,Y}(f)}.

Here, the modulus A_{X,Y}(f) is called the cross-amplitude spectrum, and the argument \Phi_{X,Y}(f) is called the phase spectrum. The function

\kappa^2_{X,Y}(f) = \frac{|R_{X,Y}(f)|^2}{R_X(f) R_Y(f)} = \frac{A_{X,Y}(f)^2}{R_X(f) R_Y(f)}

is called the squared coherence spectrum.

Properties:
A_{X,Y}(-f) = A_{X,Y}(f)
\Phi_{X,Y}(-f) = -\Phi_{X,Y}(f)
0 \le \kappa^2_{X,Y}(f) \le 1
Example: Modelling of temperature

A not unrealistic assumption is that the indoor temperature Y_t (t is time in hours) is related to the outdoor temperature X_s for s \le t:

Y_t - 18 = 0.4 X_t + 0.9 X_{t-1} + e_t,

where e_t is uncorrelated with X_s for all s and t. Assume that r_X(\tau) = 2 \cdot 0.9^{|\tau|}, and find the cross-covariance function and cross-spectral density.
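Since e_t is uncorrelated with the X sequence, taking covariances in the model gives r_{X,Y}(\tau) = C[X_t, Y_{t+\tau}] = 0.4 r_X(\tau) + 0.9 r_X(\tau - 1). A minimal numerical sketch of this computation (the function names are illustrative, not from the lecture):

```python
def r_X(tau):
    # Given outdoor-temperature covariance: r_X(tau) = 2 * 0.9**|tau|
    return 2.0 * 0.9 ** abs(tau)

def r_XY(tau):
    # Cross-covariance C[X_t, Y_{t+tau}] from Y_t - 18 = 0.4 X_t + 0.9 X_{t-1} + e_t,
    # with e_t uncorrelated with the X sequence:
    #   r_{X,Y}(tau) = 0.4 r_X(tau) + 0.9 r_X(tau - 1)
    return 0.4 * r_X(tau) + 0.9 * r_X(tau - 1)

print(r_XY(0))   # 0.4*2 + 0.9*1.8 = 2.42
print(r_XY(1))   # 0.4*1.8 + 0.9*2 = 2.52
```

The cross-spectral density then follows from the inversion formula applied to r_{X,Y}.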
Linear filters in discrete time: ARMA models

White noise in discrete time {e_t, t = 0, \pm 1, \ldots}:

E[e_t] = 0,
C[e_s, e_t] = \sigma^2 if s = t, and 0 otherwise.

The sequence {e_t} is called the innovation process, and its spectral density is constant,

R_e(f) = \sigma^2 for -1/2 < f \le 1/2.
An autoregressive process

[Block diagram of an AR(p) process. The operator T^{-1} delays the signal one time unit.]
The AR(p) process

Definition. Let A(z) be a stable polynomial of degree p. A stationary sequence {X_t} is called an AR(p) process with generating polynomial A(z) if the sequence {e_t} given by

X_t + a_1 X_{t-1} + \cdots + a_p X_{t-p} = e_t

is a white-noise sequence with E[e_t] = 0, V[e_t] = \sigma^2 (constant), and e_t uncorrelated with X_{t-1}, X_{t-2}, \ldots

The variables e_t are the innovations to the AR process. In a Gaussian stationary AR process, the innovations are also Gaussian.
The innovations

In an AR process, the innovation e_t influences all X_s, s \ge t.
The AR(p) process

Theorem. If {X_t} is an AR(p) process with generating polynomial A(z) and innovation variance \sigma^2, then m_X = E[X_t] = 0 and

R_X(f) = \frac{\sigma^2}{\left| \sum_{k=0}^{p} a_k e^{-i 2\pi f k} \right|^2} = \frac{\sigma^2}{|A(e^{i 2\pi f})|^2}.

The covariance function r_X is the solution of the Yule-Walker equations,

r_X(k) + a_1 r_X(k-1) + \cdots + a_p r_X(k-p) = 0, k = 1, 2, \ldots,

with initial condition

r_X(0) + a_1 r_X(1) + \cdots + a_p r_X(p) = \sigma^2.

The general solution is of the form r_X(\tau) = \sum_{k=1}^{p} C_k r_k^{|\tau|}, where r_k, k = 1, \ldots, p, with |r_k| < 1, are the roots of the characteristic equation (or modifications thereof, if there are multiple roots).
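For the AR(1) special case the Yule-Walker equations can be solved in closed form, which makes the recursion easy to check numerically. A small sketch (the function name is illustrative):

```python
def ar1_covariance(a1, sigma2, max_lag=5):
    # For X_t + a1*X_{t-1} = e_t (stable: |a1| < 1), the Yule-Walker
    # equations give r(k) = -a1 * r(k-1) for k >= 1, with initial
    # condition r(0) + a1*r(1) = sigma2, hence
    #   r(0) = sigma2 / (1 - a1**2)  and  r(k) = (-a1)**k * r(0).
    r0 = sigma2 / (1.0 - a1 ** 2)
    return [((-a1) ** k) * r0 for k in range(max_lag + 1)]

r = ar1_covariance(a1=-0.5, sigma2=1.0)   # i.e. X_t = 0.5 X_{t-1} + e_t

# Verify the Yule-Walker recursion r(k) + a1*r(k-1) = 0 for k >= 1
assert all(abs(r[k] + (-0.5) * r[k - 1]) < 1e-12 for k in range(1, 6))
print(r[0])   # sigma2 / (1 - 0.25) = 1.333...
```

The single root r_1 = -a_1 of the characteristic equation shows up directly as the geometric decay rate of r_X.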
AR(1) process

Let X_t = \theta X_{t-1} + e_t, where {e_t} is a white-noise process with variance \sigma^2. Spectral density:

R(f) = \frac{\sigma^2}{1 + \theta^2 - 2\theta \cos 2\pi f}.
AR(1) process

[Figure: spectral densities of AR(1) processes with \theta = 0.9 and \theta = 0.6, respectively, plotted for 0 \le f \le 0.5.]
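The shapes in the figure can be checked by evaluating the spectral-density formula directly. A sketch with \sigma^2 = 1 (an assumption; the figure's innovation variance is not stated):

```python
import math

def ar1_spectral_density(f, theta, sigma2=1.0):
    # R(f) = sigma^2 / (1 + theta^2 - 2*theta*cos(2*pi*f)), -1/2 < f <= 1/2
    return sigma2 / (1.0 + theta ** 2 - 2.0 * theta * math.cos(2.0 * math.pi * f))

# theta = 0.9: power concentrated near f = 0 (a slowly varying process)
print(ar1_spectral_density(0.0, 0.9))   # sigma2 / (1 - 0.9)^2, about 100
# theta = 0.6: a much flatter spectrum
print(ar1_spectral_density(0.0, 0.6))   # sigma2 / (1 - 0.6)^2 = 6.25
```

The peak at f = 0 grows like \sigma^2/(1-\theta)^2 as \theta approaches 1, which is why the two panels need such different vertical scales.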
Motivation

Reasons to use AR processes in time-series modelling:
- Many series are actually generated in a feedback system.
- The AR process is flexible: by a clever choice of coefficients it can approximate most variance and spectrum structures.
- They are convenient to use in forecasting; suppose one wants to predict, at time t, the future value X_{t+1}, knowing \ldots, X_{t-p+1}, \ldots, X_t. The linear predictor

\hat{X}_{t+1} = -a_1 X_t - a_2 X_{t-1} - \cdots - a_p X_{t-p+1}

is the best prediction of X_{t+1} in the least-squares sense.
Example

Consider the AR(1) process X_t - 0.5 X_{t-1} = e_t, where {e_t} is a white-noise process with \sigma = 3. Compute P(X_t - X_{t-1} > 1).
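If we additionally assume the process is Gaussian (an assumption needed to turn a variance into a probability), X_t - X_{t-1} is N(0, 2(r_X(0) - r_X(1))), and the computation can be sketched as:

```python
import math

theta, sigma2 = 0.5, 9.0          # X_t = 0.5 X_{t-1} + e_t, sigma = 3
r0 = sigma2 / (1.0 - theta ** 2)  # r_X(0) = 9 / 0.75 = 12
r1 = theta * r0                   # r_X(1) = 6
var_diff = 2.0 * (r0 - r1)        # V[X_t - X_{t-1}] = 12

def Phi(x):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Assuming Gaussianity: X_t - X_{t-1} ~ N(0, var_diff)
p = 1.0 - Phi(1.0 / math.sqrt(var_diff))
print(round(p, 3))   # about 0.386
```

The key step is that V[X_t - X_{t-1}] = 2 r_X(0) - 2 r_X(1), with r_X taken from the Yule-Walker solution for AR(1).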
AR(2) process: Stability region

The parabola a_2 = a_1^2/4 is the boundary between the region where the characteristic equation has complex roots (an oscillating covariance function) and the region where it has real roots.
The MA(q) process

Definition. The process {X_t} given by

X_t = e_t + c_1 e_{t-1} + \cdots + c_q e_{t-q}

is called a moving-average process of order q, an MA(q) process, with innovation sequence {e_t} and generating polynomial C(z).
The MA(q) process

Theorem. An MA(q) process {X_t} is stationary, with m_X = E[X_t] = 0,

r_X(\tau) = \sigma^2 \sum_{k=0}^{q-|\tau|} c_k c_{k+|\tau|} if |\tau| \le q, and 0 otherwise,

and

R_X(f) = \sigma^2 \left| \sum_{k=0}^{q} c_k e^{-i 2\pi f k} \right|^2 = \sigma^2 |C(e^{i 2\pi f})|^2 = r_X(0) + 2 \sum_{\tau=1}^{q} r_X(\tau) \cos 2\pi f \tau.

The main feature of an MA process: its covariance function is 0 for |\tau| > q.
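The covariance formula is easy to verify numerically. A sketch (the function name is illustrative), with the convention that the coefficient list includes c_0 = 1:

```python
def ma_covariance(c, sigma2, tau):
    # MA(q): X_t = sum_{k=0}^{q} c_k e_{t-k}, with c = [c_0, c_1, ..., c_q]
    # and c_0 = 1. Then r_X(tau) = sigma2 * sum_k c_k c_{k+|tau|} for
    # |tau| <= q, and 0 otherwise.
    q = len(c) - 1
    t = abs(tau)
    if t > q:
        return 0.0
    return sigma2 * sum(c[k] * c[k + t] for k in range(q - t + 1))

# MA(1) with c_1 = 0.9 and sigma^2 = 1:
coeffs = [1.0, 0.9]
print(ma_covariance(coeffs, 1.0, 0))   # 1 + 0.81 = 1.81
print(ma_covariance(coeffs, 1.0, 1))   # 0.9
print(ma_covariance(coeffs, 1.0, 2))   # 0.0 -- cuts off beyond lag q
```

The abrupt cutoff at lag q is exactly the "main feature" stated above, and it distinguishes MA processes from AR processes, whose covariance functions decay geometrically but never vanish.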
MA(1) process

Let X_t = e_t + c_1 e_{t-1}, where {e_t} ~ WN(0, \sigma^2). Spectral density:

R(f) = \sigma^2 (1 + c_1^2 + 2 c_1 \cos 2\pi f).
MA(1) process

[Figure: spectral densities of MA(1) processes with c_1 = 0.9 and c_1 = -0.9, respectively, plotted for 0 \le f \le 0.5.]
The ARMA(p, q) process

An ARMA(p,q) process is given by

X_t + a_1 X_{t-1} + \cdots + a_p X_{t-p} = e_t + c_1 e_{t-1} + \cdots + c_q e_{t-q},

where {e_t} is a white-noise process such that e_t and X_{t-k} are uncorrelated for k = 1, 2, \ldots. The spectral density is given by

R_X(f) = \sigma^2 \frac{\left| \sum_{k=0}^{q} c_k e^{-i 2\pi f k} \right|^2}{\left| \sum_{k=0}^{p} a_k e^{-i 2\pi f k} \right|^2} = \sigma^2 \frac{|C(e^{i 2\pi f})|^2}{|A(e^{i 2\pi f})|^2}.
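The spectral density can be evaluated directly from the two generating polynomials. A sketch, assuming the convention a_0 = c_0 = 1 and \sigma^2 = 1 (the function name is illustrative):

```python
import cmath

def arma_spectral_density(f, a, c, sigma2=1.0):
    # a = [1, a_1, ..., a_p], c = [1, c_1, ..., c_q].
    # R_X(f) = sigma2 * |sum_k c_k e^{-i2pi f k}|^2 / |sum_k a_k e^{-i2pi f k}|^2
    z = cmath.exp(-2j * cmath.pi * f)
    A = sum(a_k * z ** k for k, a_k in enumerate(a))
    C = sum(c_k * z ** k for k, c_k in enumerate(c))
    return sigma2 * abs(C) ** 2 / abs(A) ** 2

# Sanity checks: with c = [1] this reduces to the AR spectrum,
# and with a = [1] to the MA spectrum.
print(arma_spectral_density(0.0, [1.0, -0.9], [1.0]))   # AR(1), theta = 0.9: about 100
print(arma_spectral_density(0.0, [1.0], [1.0, 0.9]))    # MA(1), c_1 = 0.9: about 3.61
```

Note the sign convention: the AR(1) process X_t = \theta X_{t-1} + e_t corresponds to a_1 = -\theta in the generating polynomial.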