# 4.3 Moving Average Process MA(q)


*Chapter 4. Stationary TS Models*

**Definition 4.5.** $\{X_t\}$ is a *moving-average process of order $q$* if

$$X_t = Z_t + \theta_1 Z_{t-1} + \dots + \theta_q Z_{t-q}, \tag{4.9}$$

where $Z_t \sim WN(0, \sigma^2)$ and $\theta_1, \dots, \theta_q$ are constants.

**Remark 4.6.** $X_t$ is a linear combination of $q + 1$ white noise variables and we say that it is *$q$-correlated*, that is, $X_t$ and $X_{t+\tau}$ are uncorrelated for all lags $\tau > q$.

**Remark 4.7.** If $Z_t$ is an i.i.d. process, then $X_t$ is a strictly stationary TS, since

$$(Z_t, \dots, Z_{t-q})^T \stackrel{d}{=} (Z_{t+\tau}, \dots, Z_{t-q+\tau})^T$$

for all $\tau$. It is then called *$q$-dependent*, that is, $X_t$ and $X_{t+\tau}$ are independent for all lags $\tau > q$.

**Remark 4.8.** Obviously, IID noise is a 0-dependent TS, and white noise is a 0-correlated TS. MA(1) is a 1-correlated TS if it is a combination of WN r.v.s, and 1-dependent if it is a combination of IID r.v.s.

**Remark 4.9.** The MA(q) process can also be written in the following equivalent form

$$X_t = \theta(B) Z_t, \tag{4.10}$$

where the moving average operator

$$\theta(B) = 1 + \theta_1 B + \theta_2 B^2 + \dots + \theta_q B^q \tag{4.11}$$

defines a linear combination of values in the shift operator $B^k Z_t = Z_{t-k}$.
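The defining equation (4.9) is easy to simulate directly. The sketch below (not from the text; the function name, seed, and parameter values are illustrative) builds an MA(q) path from Gaussian white noise by convolving the noise with the coefficient vector $(\theta_0, \theta_1, \dots, \theta_q)$, $\theta_0 = 1$:

```python
import numpy as np

def simulate_ma(theta, n, sigma=1.0, seed=0):
    """Simulate n observations of X_t = Z_t + theta_1 Z_{t-1} + ... + theta_q Z_{t-q}.

    theta: sequence (theta_1, ..., theta_q); Z_t is Gaussian WN(0, sigma^2).
    """
    rng = np.random.default_rng(seed)
    q = len(theta)
    z = rng.normal(0.0, sigma, size=n + q)          # q extra values to warm up
    coeffs = np.r_[1.0, np.asarray(theta, float)]   # (theta_0, ..., theta_q)
    # full convolution entry t+q equals sum_j coeffs[j] * z[t+q-j], an MA(q) value
    return np.convolve(z, coeffs)[q : q + n]

x = simulate_ma([0.5, 0.5], 200_000, sigma=1.0, seed=1)
# for this MA(2): E X_t = 0 and Var X_t = (1 + 0.5^2 + 0.5^2) sigma^2 = 1.5
```

Sample mean and variance of a long simulated path should be close to the theoretical values $0$ and $(1 + \theta_1^2 + \dots + \theta_q^2)\sigma^2$.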

**Example 4.4 (MA(2) process).** This process is written as

$$X_t = Z_t + \theta_1 Z_{t-1} + \theta_2 Z_{t-2} = (1 + \theta_1 B + \theta_2 B^2) Z_t. \tag{4.12}$$

What are the properties of MA(2)? As it is a combination of a zero-mean white noise, it also has zero mean, i.e.,

$$\mathbb{E} X_t = \mathbb{E}(Z_t + \theta_1 Z_{t-1} + \theta_2 Z_{t-2}) = 0.$$

It is easy to calculate the covariance of $X_t$ and $X_{t+\tau}$. We get

$$\gamma(\tau) = \operatorname{cov}(X_t, X_{t+\tau}) = \begin{cases} (1 + \theta_1^2 + \theta_2^2)\sigma^2 & \text{for } \tau = 0, \\ (\theta_1 + \theta_1\theta_2)\sigma^2 & \text{for } \tau = \pm 1, \\ \theta_2\sigma^2 & \text{for } \tau = \pm 2, \\ 0 & \text{for } |\tau| > 2, \end{cases}$$

which shows that the autocovariances depend on the lag, but not on time. Dividing $\gamma(\tau)$ by $\gamma(0)$ we obtain the autocorrelation function,

$$\rho(\tau) = \begin{cases} 1 & \text{for } \tau = 0, \\ \dfrac{\theta_1 + \theta_1\theta_2}{1 + \theta_1^2 + \theta_2^2} & \text{for } \tau = \pm 1, \\ \dfrac{\theta_2}{1 + \theta_1^2 + \theta_2^2} & \text{for } \tau = \pm 2, \\ 0 & \text{for } |\tau| > 2. \end{cases}$$

The MA(2) process is a weakly stationary, 2-correlated TS. Figure 4.5 shows MA(2) processes obtained from the simulated Gaussian white noise shown in Figure 4.1 for various values of the parameters $(\theta_1, \theta_2)$. The blue series is $x_t = z_t + 0.5 z_{t-1} + 0.5 z_{t-2}$, while the purple series is $x_t = z_t + 5 z_{t-1} + 5 z_{t-2}$, where $z_t$ are realizations of an i.i.d. Gaussian noise. As you can see, very different processes can be obtained for different sets of parameters. This is an important property of MA(q) processes, which form a very large family of models. This property is reinforced by the following proposition.

**Proposition 4.2.** If $\{X_t\}$ is a stationary $q$-correlated time series with mean zero, then it can be represented as an MA(q) process.
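The MA(2) autocorrelation formulas can be checked against a sample ACF. A minimal numpy sketch (not from the text; function names and the $(\theta_1, \theta_2) = (0.5, 0.5)$ values are the illustrative pair from Figure 4.5):

```python
import numpy as np

def ma2_acf(theta1, theta2):
    """Theoretical ACF of MA(2) at lags 0, 1, 2 (rho(tau) = 0 for |tau| > 2)."""
    denom = 1.0 + theta1**2 + theta2**2
    return np.array([1.0, (theta1 + theta1 * theta2) / denom, theta2 / denom])

def sample_acf(x, max_lag):
    """Sample autocorrelations rho_hat(0), ..., rho_hat(max_lag)."""
    x = np.asarray(x, float) - np.mean(x)
    c0 = np.dot(x, x)
    return np.array([np.dot(x[: len(x) - k], x[k:]) / c0 for k in range(max_lag + 1)])

rng = np.random.default_rng(0)
z = rng.normal(size=100_000)                 # Gaussian white noise
x = np.convolve(z, [1.0, 0.5, 0.5])[2:-2]    # x_t = z_t + 0.5 z_{t-1} + 0.5 z_{t-2}
# sample_acf(x, 4) should be close to ma2_acf(0.5, 0.5) at lags 0..2, and near 0 beyond
```

For $(\theta_1, \theta_2) = (0.5, 0.5)$ the theoretical values are $\rho(1) = 0.75/1.5 = 0.5$ and $\rho(2) = 0.5/1.5 = 1/3$.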

*Figure 4.5: Two simulated MA(2) processes, both from the white noise shown in Figure 4.1, but for different sets of parameters: $(\theta_1, \theta_2) = (0.5, 0.5)$ and $(\theta_1, \theta_2) = (5, 5)$.*

*Figure 4.6: (a) Sample ACF for $x_t = z_t + 0.5 z_{t-1} + 0.5 z_{t-2}$ and (b) for $x_t = z_t + 5 z_{t-1} + 5 z_{t-2}$.*

Also, the following theorem gives the form of the ACF for a general MA(q).

**Theorem 4.2.** An MA(q) process (as in Definition 4.5) is a weakly stationary TS with the ACVF

$$\gamma(\tau) = \begin{cases} \sigma^2 \sum_{j=0}^{q - |\tau|} \theta_j \theta_{j+|\tau|}, & \text{if } |\tau| \le q, \\ 0, & \text{if } |\tau| > q, \end{cases} \tag{4.13}$$

where $\theta_0$ is defined to be 1.

The ACF of an MA(q) has a distinct cut-off at lag $\tau = q$. Furthermore, if $q$ is small, the maximum value of $|\rho(1)|$ is well below unity. It can be shown that

$$|\rho(1)| \le \cos\left(\frac{\pi}{q+2}\right). \tag{4.14}$$

### Non-uniqueness of MA Models

Consider an example of MA(1),

$$X_t = Z_t + \theta Z_{t-1},$$

whose ACF is

$$\rho(\tau) = \begin{cases} 1, & \text{if } \tau = 0, \\ \dfrac{\theta}{1 + \theta^2}, & \text{if } \tau = \pm 1, \\ 0, & \text{if } |\tau| > 1. \end{cases}$$

For $q = 1$, formula (4.14) means that the maximum value of $\rho(1)$ is $0.5$. This can be verified directly from the formula for the ACF above. Treating $\rho(1)$ as a function of $\theta$, we can calculate its extrema. Denote

$$f(\theta) = \frac{\theta}{1 + \theta^2}.$$

Then

$$f'(\theta) = \frac{1 - \theta^2}{(1 + \theta^2)^2}.$$

The derivative is equal to $0$ at $\theta = \pm 1$, and the function $f$ attains its maximum at $\theta = 1$ and its minimum at $\theta = -1$. We have

$$f(1) = \frac{1}{2}, \qquad f(-1) = -\frac{1}{2}.$$

This fact can be helpful in recognizing MA(1) processes. In fact, an MA(1) with $|\theta| = 1$ may be uniquely identified from the autocorrelation function.
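Formula (4.13) translates directly into code. The sketch below (not from the text; the function name and test parameters are illustrative) computes the ACVF of a general MA(q), and also shows the $q = 1$ edge of bound (4.14): for $\theta = 1$, $\rho(1) = 1/2 = \cos(\pi/3)$.

```python
import numpy as np

def ma_acvf(theta, sigma2=1.0):
    """ACVF of MA(q) per (4.13): gamma(tau) = sigma^2 * sum_{j=0}^{q-|tau|} theta_j theta_{j+|tau|},
    with theta_0 = 1; returns gamma(0), ..., gamma(q) (gamma(tau) = 0 for |tau| > q)."""
    th = np.r_[1.0, np.asarray(theta, float)]       # (theta_0, theta_1, ..., theta_q)
    q = len(th) - 1
    return np.array([sigma2 * np.dot(th[: q + 1 - k], th[k:]) for k in range(q + 1)])

gamma = ma_acvf([0.5, 0.5])   # the MA(2) of Example 4.4: (1.5, 0.75, 0.5)
g1 = ma_acvf([1.0])           # MA(1) with theta = 1: rho(1) = g1[1]/g1[0] = 0.5
```

Dividing by `gamma[0]` gives the ACF, so the lag-$q$ cut-off and the $\rho(1)$ bound can be checked numerically for any coefficient vector.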

However, it is easy to see that the form of the ACF stays the same for $\theta$ and for $1/\theta$. Take for example $\theta = 5$ and $\theta = 1/5$. In both cases

$$\rho(\tau) = \begin{cases} 1, & \text{if } \tau = 0, \\ \dfrac{5}{26}, & \text{if } \tau = \pm 1, \\ 0, & \text{if } |\tau| > 1. \end{cases}$$

Also, the pair $\sigma^2 = 1$, $\theta = 5$ gives the same ACVF as the pair $\sigma^2 = 25$, $\theta = 1/5$, namely

$$\gamma(\tau) = \begin{cases} (1 + \theta^2)\sigma^2 = 26, & \text{if } \tau = 0, \\ \theta\sigma^2 = 5, & \text{if } \tau = \pm 1, \\ 0, & \text{if } |\tau| > 1. \end{cases}$$

Hence, the MA(1) processes

$$X_t = Z_t + \tfrac{1}{5} Z_{t-1}, \quad Z_t \sim \text{iid } N(0, 25),$$

and

$$X_t = Y_t + 5 Y_{t-1}, \quad Y_t \sim \text{iid } N(0, 1),$$

are the same. We observe the variable $X_t$, not the noise variable, so we cannot distinguish between these two models. Except for the case $|\theta| = 1$, a particular autocorrelation function will be compatible with two models. To which of the two models should we restrict our attention?

### Invertibility of MA Processes

The MA(1) process can be expressed in terms of lagged values of $X_t$ by substituting repeatedly for lagged values of $Z_t$. We have

$$Z_t = X_t - \theta Z_{t-1}.$$

The substitution yields

$$\begin{aligned} Z_t &= X_t - \theta Z_{t-1} \\ &= X_t - \theta(X_{t-1} - \theta Z_{t-2}) = X_t - \theta X_{t-1} + \theta^2 Z_{t-2} \\ &= X_t - \theta X_{t-1} + \theta^2(X_{t-2} - \theta Z_{t-3}) = X_t - \theta X_{t-1} + \theta^2 X_{t-2} - \theta^3 Z_{t-3} \\ &\;\;\vdots \\ &= X_t - \theta X_{t-1} + \theta^2 X_{t-2} - \theta^3 X_{t-3} + \theta^4 X_{t-4} - \dots + (-\theta)^n Z_{t-n}. \end{aligned}$$
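The non-uniqueness claim is a one-line computation. A sketch (not from the text; the helper name is illustrative) confirming that the two parameter pairs above produce identical ACVFs, so the two models cannot be told apart from observations of $\{X_t\}$:

```python
import numpy as np

def ma1_acvf(theta, sigma2):
    """gamma(0) and gamma(+/-1) of an MA(1); gamma(tau) = 0 for |tau| > 1."""
    return np.array([(1.0 + theta**2) * sigma2, theta * sigma2])

g_a = ma1_acvf(5.0, 1.0)     # theta = 5,   sigma^2 = 1
g_b = ma1_acvf(0.2, 25.0)    # theta = 1/5, sigma^2 = 25
# both pairs give gamma(0) = 26 and gamma(1) = 5
```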

This can be rewritten as

$$(-\theta)^n Z_{t-n} = Z_t - \sum_{j=0}^{n-1} (-\theta)^j X_{t-j}.$$

However, if $|\theta| < 1$, then

$$\mathbb{E}\left( Z_t - \sum_{j=0}^{n-1} (-\theta)^j X_{t-j} \right)^2 = \mathbb{E}\left( \theta^{2n} Z_{t-n}^2 \right) \xrightarrow[n \to \infty]{} 0,$$

and we say that the sum is convergent in the mean square sense. Hence, we obtain another representation of the model,

$$Z_t = \sum_{j=0}^{\infty} (-\theta)^j X_{t-j}.$$

This is a representation of another class of models, called infinite autoregressive (AR) models. So we have inverted the MA(1) to an infinite AR. This was possible due to the assumption $|\theta| < 1$. Such a process is called an *invertible* process. Invertibility is a desired property of a TS, so in the example we would choose the model with $\sigma^2 = 25$, $\theta = 1/5$.
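The mean-square convergence above means a truncated version of the infinite AR sum recovers $Z_t$ to high accuracy. A numerical sketch (not from the text; the truncation order and series length are illustrative) using the invertible choice $\theta = 1/5$, $\sigma^2 = 25$:

```python
import numpy as np

rng = np.random.default_rng(0)
theta, sigma = 0.2, 5.0                    # |theta| < 1, so the MA(1) is invertible
n = 1_000
z = rng.normal(0.0, sigma, size=n + 1)     # Z_{-1}, Z_0, ..., Z_{n-1}
x = z[1:] + theta * z[:-1]                 # X_t = Z_t + theta Z_{t-1}

# Truncated AR(infinity) inversion: Z_t ~ sum_{j=0}^{m} (-theta)^j X_{t-j}.
# The exact remainder is (-theta)^(m+1) Z_{t-m-1}, which is tiny for |theta| < 1.
m = 25
t = n - 1                                  # recover the most recent noise value
z_hat = sum((-theta) ** j * x[t - j] for j in range(m + 1))
err = abs(z_hat - z[t + 1])                # z[t + 1] is the Z_t paired with x[t]
```

With $\theta = 0.2$ and $m = 25$, the remainder is of order $0.2^{26}$ times a noise value, i.e. far below floating-point noise. For $|\theta| > 1$ the same truncated sum diverges, which is why the invertible parameterization is the one we keep.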

# 4.4 Linear Processes

**Definition 4.6.** The TS $\{X_t\}$ is called a *linear process* if it has the representation

$$X_t = \sum_{j=-\infty}^{\infty} \psi_j Z_{t-j}, \tag{4.15}$$

for all $t$, where $Z_t \sim WN(0, \sigma^2)$ and $\{\psi_j\}$ is a sequence of constants such that $\sum_{j=-\infty}^{\infty} |\psi_j| < \infty$.

**Remark.** The condition $\sum_{j=-\infty}^{\infty} |\psi_j| < \infty$ ensures that the process converges in the mean square sense, that is,

$$\mathbb{E}\left( X_t - \sum_{j=-n}^{n} \psi_j Z_{t-j} \right)^2 \to 0 \quad \text{as } n \to \infty.$$

**Remark.** MA($\infty$) is a linear process with $\psi_j = 0$ for $j < 0$ and $\psi_j = \theta_j$ for $j \ge 0$; that is, MA($\infty$) has the representation

$$X_t = \sum_{j=0}^{\infty} \theta_j Z_{t-j},$$

where $\theta_0 = 1$.

Note that formula (4.15) can be written using the backward shift operator $B$. We have $Z_{t-j} = B^j Z_t$. Hence

$$X_t = \sum_{j=-\infty}^{\infty} \psi_j Z_{t-j} = \sum_{j=-\infty}^{\infty} \psi_j B^j Z_t.$$

Denoting

$$\psi(B) = \sum_{j=-\infty}^{\infty} \psi_j B^j, \tag{4.16}$$

we can write the linear process in a neat way:

$$X_t = \psi(B) Z_t.$$

The operator $\psi(B)$ is a linear filter, which when applied to a stationary process produces a stationary process. This fact is proved in the following proposition.

**Proposition 4.3.** Let $\{Y_t\}$ be a stationary TS with mean zero and autocovariance function $\gamma_Y$. If $\sum_{j=-\infty}^{\infty} |\psi_j| < \infty$, then the process

$$X_t = \sum_{j=-\infty}^{\infty} \psi_j Y_{t-j} = \psi(B) Y_t \tag{4.17}$$

is stationary with mean zero and autocovariance function

$$\gamma_X(\tau) = \sum_{j=-\infty}^{\infty} \sum_{k=-\infty}^{\infty} \psi_j \psi_k \gamma_Y(\tau - k + j). \tag{4.18}$$

*Proof.* The assumption $\sum |\psi_j| < \infty$ assures convergence of the series. Now, since $\mathbb{E} Y_t = 0$, we have

$$\mathbb{E} X_t = \mathbb{E}\left( \sum_{j=-\infty}^{\infty} \psi_j Y_{t-j} \right) = \sum_{j=-\infty}^{\infty} \psi_j \, \mathbb{E}(Y_{t-j}) = 0$$

and

$$\begin{aligned} \mathbb{E}(X_t X_{t+\tau}) &= \mathbb{E}\left[ \left( \sum_{j=-\infty}^{\infty} \psi_j Y_{t-j} \right) \left( \sum_{k=-\infty}^{\infty} \psi_k Y_{t+\tau-k} \right) \right] \\ &= \sum_{j=-\infty}^{\infty} \sum_{k=-\infty}^{\infty} \psi_j \psi_k \, \mathbb{E}(Y_{t-j} Y_{t+\tau-k}) \\ &= \sum_{j=-\infty}^{\infty} \sum_{k=-\infty}^{\infty} \psi_j \psi_k \, \gamma_Y(\tau - k + j). \end{aligned}$$

This means that $\{X_t\}$ is a stationary TS with the autocovariance function given by formula (4.18). $\square$

**Corollary 4.1.** If $\{Y_t\}$ is a white noise process, then $\{X_t\}$ given by (4.17) is a stationary linear process with zero mean and the ACVF

$$\gamma_X(\tau) = \sigma^2 \sum_{j=-\infty}^{\infty} \psi_j \psi_{j+|\tau|}. \tag{4.19}$$
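Corollary 4.1 can be illustrated with a concrete causal filter. A sketch (not from the text; the choice $\psi_j = \phi^j$ for $j \ge 0$ with $\phi = 0.6$, and the truncation length, are illustrative assumptions) computing a truncated version of (4.19); for this geometric filter the infinite sum has the closed form $\gamma_X(\tau) = \sigma^2 \phi^{|\tau|} / (1 - \phi^2)$:

```python
import numpy as np

def linear_acvf(psi, tau, sigma2=1.0):
    """Truncated gamma_X(tau) = sigma^2 * sum_j psi_j psi_{j+|tau|}, per (4.19)."""
    tau = abs(tau)
    return sigma2 * np.dot(psi[: len(psi) - tau], psi[tau:])

phi = 0.6
psi = phi ** np.arange(200)        # truncation of psi_j = phi^j, j >= 0
g0 = linear_acvf(psi, 0)           # ~ 1 / (1 - phi^2)       = 1.5625
g2 = linear_acvf(psi, 2)           # ~ phi^2 / (1 - phi^2)   = 0.5625
```

Since $\sum_j |\phi|^j < \infty$ for $|\phi| < 1$, the truncation error is geometric and the numerical values match the closed form essentially to machine precision.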
