Lecture 6. Cross covariance and cross spectra. Introduction to ARMA models.


Jesper Rydén, Department of Mathematics, Uppsala University (jesper@math.uu.se). Stationary Stochastic Processes, Fall 2011.

Cross spectra

Theorem. If $\{X(t)\}$ and $\{Y(t)\}$ are stationary correlated processes in continuous time, with continuous covariance-matrix function $\mathbf{r}_{X,Y}(\tau)$, there exists a spectral-density matrix function
$$\mathbf{R}_{X,Y}(f) = \begin{pmatrix} R_X(f) & R_{X,Y}(f) \\ R_{Y,X}(f) & R_Y(f) \end{pmatrix}$$
such that
$$\mathbf{r}_{X,Y}(\tau) = \int_{-\infty}^{\infty} e^{i2\pi f\tau}\,\mathbf{R}_{X,Y}(f)\,df.$$

Cross spectra

The cross-spectral density $R_{X,Y}(f)$ is a complex-valued function such that $R_{X,Y}(-f) = \overline{R_{X,Y}(f)}$. The matrix $\mathbf{R}_{X,Y}$ is of positive type, i.e. for every two complex numbers $z_1$ and $z_2$,
$$|z_1|^2 R_X(f) + z_1\bar{z}_2 R_{X,Y}(f) + \bar{z}_1 z_2 R_{Y,X}(f) + |z_2|^2 R_Y(f) \ge 0,$$
which implies
$$0 \le \frac{|R_{X,Y}(f)|^2}{R_X(f)\,R_Y(f)} \le 1.$$
If $\int |r_{X,Y}(\tau)|\,d\tau < \infty$, the inversion formula
$$R_{X,Y}(f) = \int_{-\infty}^{\infty} e^{-i2\pi f\tau}\, r_{X,Y}(\tau)\,d\tau$$
holds.
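As a numerical illustration (not from the slides), the bound $|R_{X,Y}(f)|^2 \le R_X(f)\,R_Y(f)$ can be checked on simulated data; SciPy's Welch-type estimators `signal.csd` and `signal.welch` play the roles of $R_{X,Y}$, $R_X$ and $R_Y$. The series and all parameter choices below are illustrative assumptions.

```python
# Minimal sketch: estimate cross- and auto-spectra of two correlated
# simulated sequences and verify the Cauchy-Schwarz-type bound.
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
n = 10_000
e = rng.standard_normal(n)
x = e
y = 0.5 * e + 0.5 * np.roll(e, 1)       # y lags x, so the cross-spectrum is complex

f, Rxy = signal.csd(x, y, fs=1.0, nperseg=256)    # cross-spectral density R_{X,Y}(f)
f, Rx = signal.welch(x, fs=1.0, nperseg=256)      # auto-spectrum R_X(f)
f, Ry = signal.welch(y, fs=1.0, nperseg=256)      # auto-spectrum R_Y(f)

# The bound |R_{X,Y}(f)|^2 <= R_X(f) R_Y(f) from the slide:
assert np.all(np.abs(Rxy) ** 2 <= Rx * Ry + 1e-12)
```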

Example: Derivative

The cross-spectrum between a stationary process $\{X(t)\}$ and its derivative $\{X'(t)\}$ can be found by use of Theorem 5.5 and the results above:
$$R_{X,X'}(f) = (i2\pi f)\,R_X(f).$$

An important example: input-output relations

Let $\{X(t)\}$ and $\{Z(t)\}$ be stationary processes and assume that the disturbance $\{Z(t)\}$ is uncorrelated with $\{X(t)\}$, i.e. $r_{X,Z}(s,t) = 0$ for all $s$ and $t$. The output is
$$Y(t) = \int h(t-u)\,X(u)\,du + Z(t) = \int h(u)\,X(t-u)\,du + Z(t).$$
Statistical properties discussed on the blackboard.

Interpretation of cross-spectral density

Write the cross-spectral density in polar form:
$$R_{X,Y}(f) = A_{X,Y}(f)\, e^{i\Phi_{X,Y}(f)}.$$
Here, the modulus $A_{X,Y}(f)$ is called the cross-amplitude spectrum, and the argument $\Phi_{X,Y}(f)$ is called the phase spectrum. The function
$$\kappa^2_{X,Y}(f) = \frac{|R_{X,Y}(f)|^2}{R_X(f)\,R_Y(f)} = \frac{A_{X,Y}(f)^2}{R_X(f)\,R_Y(f)}$$
is called the squared coherence spectrum. Properties:
$$A_{X,Y}(-f) = A_{X,Y}(f), \qquad \Phi_{X,Y}(-f) = -\Phi_{X,Y}(f), \qquad 0 \le \kappa^2_{X,Y}(f) \le 1.$$
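A short sketch of these quantities on simulated data, instantiating the input-output relation from the earlier slide in discrete time; the filter coefficients, noise level, and window length are assumptions. `scipy.signal.coherence` estimates $\kappa^2_{X,Y}(f)$ directly, and the amplitude and phase spectra come from the estimated cross-spectral density.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(1)
x = rng.standard_normal(20_000)
# Filtered input plus an uncorrelated disturbance, as in the input-output slide:
y = signal.lfilter([0.4, 0.9], [1.0], x) + 0.5 * rng.standard_normal(x.size)

f, kappa2 = signal.coherence(x, y, fs=1.0, nperseg=512)   # squared coherence
f, Rxy = signal.csd(x, y, fs=1.0, nperseg=512)            # cross-spectral density
amp, phase = np.abs(Rxy), np.angle(Rxy)                   # A_{X,Y}(f), Phi_{X,Y}(f)
assert np.all((kappa2 >= 0.0) & (kappa2 <= 1.0 + 1e-12))  # 0 <= kappa^2 <= 1
```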

Example: modelling of temperature

A not unrealistic assumption is that the indoor temperature $Y_t$ ($t$ is time in hours) is related to the outdoor temperature $X_s$ for $s \le t$:
$$Y_t - 18 = 0.4\,X_t + 0.9\,X_{t-1} + e_t,$$
where $e_t$ is uncorrelated with $X_s$ for all $s$ and $t$. Assume that $r_X(\tau) = 2\cdot 0.9^{|\tau|}$ and find the cross-covariance function and the cross-spectral density.
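A simulation sketch of this exercise (the simulation setup is an assumption; only the model is from the slide). Computing the cross-covariance by hand gives $r_{X,Y}(\tau) = C[X_t, Y_{t+\tau}] = 0.4\,r_X(\tau) + 0.9\,r_X(\tau-1)$, which the empirical covariances should match:

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(2)
n, burn = 200_000, 1_000
theta, var_x = 0.9, 2.0                        # gives r_X(tau) = 2 * 0.9^|tau|
innov = rng.normal(scale=np.sqrt(var_x * (1 - theta**2)), size=n + burn)
x = signal.lfilter([1.0], [1.0, -theta], innov)[burn:]    # AR(1) path for X_t
e = rng.standard_normal(n)
y = 18 + 0.4 * x + 0.9 * np.concatenate(([0.0], x[:-1])) + e   # the model

r_x = lambda tau: 2.0 * 0.9 ** abs(tau)
for tau in range(4):
    emp = np.cov(x[: n - tau], y[tau:])[0, 1]              # C[X_t, Y_{t+tau}]
    theo = 0.4 * r_x(tau) + 0.9 * r_x(tau - 1)
    print(tau, round(emp, 3), round(theo, 3))              # e.g. tau=0: ~2.42
```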

Linear filters in discrete time: ARMA models

White noise in discrete time $\{e_t,\ t = 0, \pm 1, \pm 2, \ldots\}$:
$$E[e_t] = 0, \qquad C[e_s, e_t] = \begin{cases} \sigma^2, & \text{if } s = t, \\ 0, & \text{otherwise.} \end{cases}$$
The sequence $\{e_t\}$ is called the innovation process, and its spectral density is constant, $R_e(f) = \sigma^2$ for $-1/2 < f \le 1/2$.

An autoregressive process

[Figure: block diagram of an AR(p) process as a feedback system; the operator $T^{-1}$ delays the signal one time unit.]

The AR(p) process

Definition. Let $A(z)$ be a stable polynomial of degree $p$. A stationary sequence $\{X_t\}$ is called an AR(p) process with generating polynomial $A(z)$ if the sequence $\{e_t\}$ given by
$$X_t + a_1 X_{t-1} + \cdots + a_p X_{t-p} = e_t$$
is a white-noise sequence with $E[e_t] = 0$, $V[e_t] = \sigma^2$ (constant), and $e_t$ uncorrelated with $X_{t-1}, X_{t-2}, \ldots$. The variables $e_t$ are the innovations of the AR process. In a Gaussian stationary AR process, the innovations are also Gaussian.
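A minimal simulation sketch (the AR(2) coefficients are an assumed example): `scipy.signal.lfilter` with denominator coefficients $[1, a_1, \ldots, a_p]$ implements exactly the difference equation above.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(3)
a = [1.0, -1.0, 0.5]             # X_t - X_{t-1} + 0.5 X_{t-2} = e_t
print(np.abs(np.roots(a)))       # both root moduli ~0.707 < 1, so A(z) is stable

e = rng.standard_normal(50_000)  # innovations, sigma^2 = 1
x = signal.lfilter([1.0], a, e)  # a sample path of the AR(2) process
```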

The innovations

In an AR process, the innovation $e_t$ influences all $X_s$, $s \ge t$.

The AR(p) process

Theorem. If $\{X_t\}$ is an AR(p) process with generating polynomial $A(z)$ and innovation variance $\sigma^2$, then $m_X = E[X_t] = 0$ and
$$R_X(f) = \frac{\sigma^2}{\left|\sum_{k=0}^{p} a_k e^{-i2\pi fk}\right|^2} = \frac{\sigma^2}{|A(e^{i2\pi f})|^2}$$
(with $a_0 = 1$). The covariance function $r_X$ is the solution of the Yule-Walker equations,
$$r_X(k) + a_1 r_X(k-1) + \cdots + a_p r_X(k-p) = 0, \qquad k = 1, 2, \ldots,$$
with initial condition
$$r_X(0) + a_1 r_X(1) + \cdots + a_p r_X(p) = \sigma^2.$$
The general solution is of the form $r_X(\tau) = \sum_{k=1}^{p} C_k r_k^{|\tau|}$, where $r_k$, $k = 1, 2, \ldots, p$, with $|r_k| < 1$, are the roots of the characteristic equation (or modifications thereof, if there are multiple roots).
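The Yule-Walker equations also run in the other direction: given sample covariances, they can be solved for the coefficients and the innovation variance. A self-contained sketch, reusing the assumed AR(2) example from above:

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(4)
p = 2
x = signal.lfilter([1.0], [1.0, -1.0, 0.5], rng.standard_normal(200_000))

def sample_cov(x, k):
    """Biased sample covariance at lag k."""
    xc = x - x.mean()
    return xc[: len(xc) - k] @ xc[k:] / len(xc)

r = np.array([sample_cov(x, k) for k in range(p + 1)])
# r(k) + a_1 r(k-1) + ... + a_p r(k-p) = 0 for k = 1..p, as a linear system:
R = np.array([[r[abs(k - j)] for j in range(1, p + 1)] for k in range(1, p + 1)])
a_hat = np.linalg.solve(R, -r[1:])
sigma2_hat = r[0] + a_hat @ r[1:]     # the initial-condition equation
print(a_hat, sigma2_hat)              # close to (-1.0, 0.5) and sigma^2 = 1
```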

AR(1) process

Let $X_t = \theta X_{t-1} + e_t$, where $\{e_t\}$ is a white-noise process with variance $\sigma^2$. Spectral density:
$$R(f) = \frac{\sigma^2}{1 + \theta^2 - 2\theta\cos 2\pi f}.$$

AR(1) process

[Figure: spectral densities of AR(1) processes with $\theta = 0.9$ and $\theta = 0.6$, respectively, plotted against frequency on $(0, 0.5)$.]
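A sketch comparing the closed-form AR(1) spectral density with a nonparametric estimate (parameters assumed). Note the factor 2: `scipy.signal.welch` returns a one-sided density, while the slides' $R(f)$ is two-sided.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(5)
theta, sigma2 = 0.9, 1.0
x = signal.lfilter([1.0], [1.0, -theta], rng.standard_normal(100_000))

f, R_hat = signal.welch(x, fs=1.0, nperseg=1024)                # one-sided estimate
R_theo = sigma2 / (1 + theta**2 - 2 * theta * np.cos(2 * np.pi * f))
mid = (f > 0.05) & (f < 0.45)
print(np.allclose(R_hat[mid], 2 * R_theo[mid], rtol=0.25))      # True, roughly
```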

Motivation

Reasons to use AR processes in time-series modelling:

- Many series are actually generated in a feedback system.
- The AR process is flexible: by a clever choice of coefficients it can approximate most covariance and spectrum structures.
- It is convenient to use in forecasting. Suppose one wants to predict, at time $t$, the future value $X_{t+1}$ knowing $\ldots, X_{t-p+1}, \ldots, X_t$. The linear predictor
$$\hat{X}_{t+1} = -a_1 X_t - a_2 X_{t-1} - \cdots - a_p X_{t-p+1}$$
is the best prediction of $X_{t+1}$ in the least-squares sense (a sketch follows below).
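A sketch of the last point, with the assumed AR(2) coefficients from before: predicting each $X_{t+1}$ from the two preceding values leaves residuals whose variance is close to the innovation variance, i.e. the prediction errors are the innovations.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(6)
a1, a2 = -1.0, 0.5
x = signal.lfilter([1.0], [1.0, a1, a2], rng.standard_normal(100_000))

x_hat = -a1 * x[1:-1] - a2 * x[:-2]      # predict x[2:], i.e. X_{t+1}
resid = x[2:] - x_hat                    # one-step prediction errors
print(np.var(resid))                     # close to sigma^2 = 1
```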

Example

Consider the AR(1) process $X_t - 0.5 X_{t-1} = e_t$, where $\{e_t\}$ is a white-noise process with $\sigma = 3$. Compute $P(|X_t - X_{t-1}| > 1)$.
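A worked sketch of the computation, assuming the process is Gaussian and reading the (garbled) question as $P(|X_t - X_{t-1}| > 1)$: from the Yule-Walker equations, $r_X(0) = \sigma^2/(1 - 0.5^2) = 12$ and $r_X(1) = 0.5\,r_X(0) = 6$, so $X_t - X_{t-1} \sim N(0,\ 2r_X(0) - 2r_X(1)) = N(0, 12)$.

```python
from math import sqrt
from scipy.stats import norm

theta, sigma2 = 0.5, 3.0 ** 2
r0 = sigma2 / (1 - theta ** 2)          # r_X(0) = 12
r1 = theta * r0                         # r_X(1) = 6
var_d = 2 * (r0 - r1)                   # Var(X_t - X_{t-1}) = 12
print(2 * norm.sf(1 / sqrt(var_d)))     # P(|D| > 1) ~ 0.773
```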

AR(2) process: stability region

[Figure: the stability region in the $(a_1, a_2)$-plane.] The parabola $a_2 = a_1^2/4$ is the boundary between a covariance function with complex roots and one with real roots.

The MA(q) process

Definition. The process $\{X_t\}$ given by
$$X_t = e_t + c_1 e_{t-1} + \cdots + c_q e_{t-q}$$
is called a moving-average process of order $q$ (an MA(q) process), with innovation sequence $\{e_t\}$ and generating polynomial $C(z)$.

The MA(q) process

Theorem. An MA(q) process $\{X_t\}$ is stationary, with $m_X = E[X_t] = 0$,
$$r_X(\tau) = \begin{cases} \sigma^2 \sum_{j-k=|\tau|} c_j c_k, & \text{if } |\tau| \le q, \\ 0, & \text{otherwise,} \end{cases}$$
and
$$R_X(f) = \sigma^2 \left|\sum_{k=0}^{q} c_k e^{-i2\pi fk}\right|^2 = \sigma^2 |C(e^{i2\pi f})|^2 = r_X(0) + 2\sum_{\tau=1}^{q} r_X(\tau)\cos 2\pi f\tau$$
(with $c_0 = 1$). The main feature of an MA process: its covariance function is 0 for $|\tau| > q$.
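A small sketch of the covariance formula and its cutoff, for an assumed MA(2) example with the convention $c_0 = 1$:

```python
import numpy as np

sigma2 = 1.0
c = np.array([1.0, 0.5, -0.3])           # c_0 = 1, so q = 2

def r_ma(tau):
    """r_X(tau) = sigma^2 * sum_{j-k=|tau|} c_j c_k, zero beyond lag q."""
    tau = abs(tau)
    if tau >= len(c):
        return 0.0
    return sigma2 * float(c[tau:] @ c[: len(c) - tau])

print([round(r_ma(t), 2) for t in range(5)])   # [1.34, 0.35, -0.3, 0.0, 0.0]
```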

MA(1) process

Let $X_t = e_t + c_1 e_{t-1}$, where $\{e_t\} \sim \mathrm{WN}(0, \sigma^2)$. Spectral density:
$$R(f) = \sigma^2 \left(1 + c_1^2 + 2c_1 \cos 2\pi f\right).$$

MA(1) process

[Figure: spectral densities of MA(1) processes with $c_1 = 0.9$ and $c_1 = -0.9$, respectively, plotted against frequency on $(0, 0.5)$.]

The ARMA(p, q) process

An ARMA(p, q) process is given by
$$X_t + a_1 X_{t-1} + \cdots + a_p X_{t-p} = e_t + c_1 e_{t-1} + \cdots + c_q e_{t-q},$$
where $\{e_t\}$ is a white-noise process such that $e_t$ and $X_{t-k}$ are uncorrelated for $k = 1, 2, \ldots$. The spectral density is given by
$$R_X(f) = \sigma^2 \frac{\left|\sum_{k=0}^{q} c_k e^{-i2\pi fk}\right|^2}{\left|\sum_{k=0}^{p} a_k e^{-i2\pi fk}\right|^2} = \sigma^2 \frac{|C(e^{i2\pi f})|^2}{|A(e^{i2\pi f})|^2}.$$
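A closing sketch (the ARMA(1,1) coefficients are assumed): the rational spectral density is evaluated via the generating polynomials, and the same polynomials serve as numerator and denominator of a linear filter to simulate the process.

```python
import numpy as np
from scipy import signal

a = np.array([1.0, -0.7])                # A: X_t - 0.7 X_{t-1}
c = np.array([1.0, 0.4])                 # C: e_t + 0.4 e_{t-1}
sigma2 = 1.0

f = np.linspace(0.0, 0.5, 256)
z = np.exp(-1j * 2 * np.pi * f)          # so polyval gives sum_k coef_k e^{-i 2 pi f k}
R = sigma2 * np.abs(np.polyval(c[::-1], z)) ** 2 \
           / np.abs(np.polyval(a[::-1], z)) ** 2

rng = np.random.default_rng(7)
x = signal.lfilter(c, a, rng.standard_normal(200_000))   # ARMA(1,1) sample path
f_w, R_hat = signal.welch(x, fs=1.0, nperseg=1024)       # one-sided, ~ 2 * R
```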