4.3 Moving Average Process MA(q)
Definition 4.5. $\{X_t\}$ is a moving-average process of order $q$ if
\[
X_t = Z_t + \theta_1 Z_{t-1} + \dots + \theta_q Z_{t-q}, \tag{4.9}
\]
where $Z_t \sim WN(0, \sigma^2)$ and $\theta_1, \dots, \theta_q$ are constants.

Remark 4.6. $X_t$ is a linear combination of $q+1$ white noise variables and we say that it is $q$-correlated, that is, $X_t$ and $X_{t+\tau}$ are uncorrelated for all lags $\tau > q$.

Remark 4.7. If $Z_t$ is an i.i.d. process then $X_t$ is a strictly stationary TS since
\[
(Z_t, \dots, Z_{t-q})^T \overset{d}{=} (Z_{t+\tau}, \dots, Z_{t-q+\tau})^T
\]
for all $\tau$. Then it is called $q$-dependent, that is, $X_t$ and $X_{t+\tau}$ are independent for all lags $\tau > q$.

Remark 4.8. Obviously, IID noise is a 0-dependent TS. White noise is a 0-correlated TS. MA(1) is a 1-correlated TS if it is a combination of WN r.v.s, and 1-dependent if it is a combination of IID r.v.s.

Remark 4.9. The MA(q) process can also be written in the following equivalent form
\[
X_t = \theta(B) Z_t, \tag{4.10}
\]
where the moving average operator
\[
\theta(B) = 1 + \theta_1 B + \theta_2 B^2 + \dots + \theta_q B^q \tag{4.11}
\]
defines a linear combination of values in the shift operator $B^k Z_t = Z_{t-k}$.
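As a minimal sketch (not part of the original notes; Python/NumPy, with the function name simulate_ma, the parameter values, and the seed chosen purely for illustration), a sample path of an MA(q) process can be generated by applying the operator $\theta(B)$ of (4.10)–(4.11) to simulated white noise:

```python
import numpy as np

def simulate_ma(theta, n, sigma=1.0, seed=0):
    """Simulate n values of X_t = Z_t + theta_1 Z_{t-1} + ... + theta_q Z_{t-q}."""
    rng = np.random.default_rng(seed)
    q = len(theta)
    z = rng.normal(0.0, sigma, size=n + q)   # Gaussian white noise Z_t ~ WN(0, sigma^2)
    psi = np.concatenate(([1.0], theta))     # coefficients of theta(B) = 1 + theta_1 B + ... + theta_q B^q
    # X_t = sum_{j=0}^{q} psi_j Z_{t-j}; drop the first q values that lack a full history
    return np.convolve(z, psi, mode="full")[q:n + q]

x = simulate_ma(theta=[0.5, 0.5], n=200)     # an MA(2) path with (theta_1, theta_2) = (0.5, 0.5)
```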
Example 4.4. MA(2) process. This process is written as
\[
X_t = Z_t + \theta_1 Z_{t-1} + \theta_2 Z_{t-2} = (1 + \theta_1 B + \theta_2 B^2) Z_t. \tag{4.12}
\]
What are the properties of MA(2)? As it is a combination of a zero-mean white noise, it also has zero mean, i.e.,
\[
E X_t = E(Z_t + \theta_1 Z_{t-1} + \theta_2 Z_{t-2}) = 0.
\]
It is easy to calculate the covariance of $X_t$ and $X_{t+\tau}$. We get
\[
\gamma(\tau) = \operatorname{cov}(X_t, X_{t+\tau}) =
\begin{cases}
(1 + \theta_1^2 + \theta_2^2)\sigma^2 & \text{for } \tau = 0,\\
(\theta_1 + \theta_1\theta_2)\sigma^2 & \text{for } \tau = \pm 1,\\
\theta_2 \sigma^2 & \text{for } \tau = \pm 2,\\
0 & \text{for } |\tau| > 2,
\end{cases}
\]
which shows that the autocovariances depend on the lag, but not on time. Dividing $\gamma(\tau)$ by $\gamma(0)$ we obtain the autocorrelation function,
\[
\rho(\tau) =
\begin{cases}
1 & \text{for } \tau = 0,\\
\dfrac{\theta_1 + \theta_1\theta_2}{1 + \theta_1^2 + \theta_2^2} & \text{for } \tau = \pm 1,\\
\dfrac{\theta_2}{1 + \theta_1^2 + \theta_2^2} & \text{for } \tau = \pm 2,\\
0 & \text{for } |\tau| > 2.
\end{cases}
\]
The MA(2) process is a weakly stationary, 2-correlated TS. Figure 4.5 shows MA(2) processes obtained from the simulated Gaussian white noise shown in Figure 4.1 for various values of the parameters $(\theta_1, \theta_2)$. The blue series is $x_t = z_t + 0.5 z_{t-1} + 0.5 z_{t-2}$, while the purple series is $x_t = z_t + 5 z_{t-1} + 5 z_{t-2}$, where $z_t$ are realizations of an i.i.d. Gaussian noise. As you can see, very different processes can be obtained for different sets of the parameters. This is an important feature of MA(q) processes, which form a very large family of models. This property is reinforced by the following proposition.

Proposition 4.2. If $\{X_t\}$ is a stationary $q$-correlated time series with mean zero, then it can be represented as an MA(q) process.
[Figure 4.5: Two simulated MA(2) processes, both from the white noise shown in Figure 4.1, but for different sets of parameters: $(\theta_1, \theta_2) = (0.5, 0.5)$ and $(\theta_1, \theta_2) = (5, 5)$.]

[Figure 4.6: (a) Sample ACF for $x_t = z_t + 0.5 z_{t-1} + 0.5 z_{t-2}$ and (b) for $x_t = z_t + 5 z_{t-1} + 5 z_{t-2}$.]
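In the same spirit as Figures 4.5 and 4.6, the sketch below (Python/NumPy here rather than the statistical package used for the original figures; the sample size, seed, and helper names sample_acf and ma2_theoretical_acf are assumptions) simulates both MA(2) series from a single white-noise realization and compares their sample ACFs with the theoretical $\rho(\tau)$ from Example 4.4.

```python
import numpy as np

def sample_acf(x, max_lag):
    """Sample autocorrelation rho_hat(tau) for tau = 0, ..., max_lag."""
    x = np.asarray(x) - np.mean(x)
    denom = np.dot(x, x)
    return np.array([np.dot(x[:len(x) - k], x[k:]) / denom for k in range(max_lag + 1)])

def ma2_theoretical_acf(theta1, theta2, max_lag):
    """Theoretical ACF of an MA(2) process, as derived in Example 4.4."""
    denom = 1 + theta1**2 + theta2**2
    rho = np.zeros(max_lag + 1)
    rho[0] = 1.0
    rho[1] = (theta1 + theta1 * theta2) / denom
    rho[2] = theta2 / denom
    return rho                                   # rho(tau) = 0 for tau > 2

rng = np.random.default_rng(1)
z = rng.normal(size=1000)                        # one white-noise realization
x_small = z[2:] + 0.5 * z[1:-1] + 0.5 * z[:-2]   # (theta1, theta2) = (0.5, 0.5)
x_large = z[2:] + 5.0 * z[1:-1] + 5.0 * z[:-2]   # (theta1, theta2) = (5, 5)

print(sample_acf(x_small, 4).round(2), ma2_theoretical_acf(0.5, 0.5, 4).round(2))
print(sample_acf(x_large, 4).round(2), ma2_theoretical_acf(5.0, 5.0, 4).round(2))
```

Both sample ACFs cut off after lag 2, as the theory predicts, even though the two paths look very different.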
Also, the following theorem gives the form of the ACF for a general MA(q).

Theorem 4.2. An MA(q) process (as in Definition 4.5) is a weakly stationary TS with the ACVF
\[
\gamma(\tau) =
\begin{cases}
\sigma^2 \sum_{j=0}^{q-|\tau|} \theta_j \theta_{j+|\tau|}, & \text{if } |\tau| \le q,\\
0, & \text{if } |\tau| > q,
\end{cases} \tag{4.13}
\]
where $\theta_0$ is defined to be 1.

The ACF of an MA(q) has a distinct cut-off at lag $\tau = q$. Furthermore, if $q$ is small the maximum value of $\rho(1)$ is well below unity. It can be shown that
\[
\rho(1) \le \cos\left(\frac{\pi}{q+2}\right). \tag{4.14}
\]

Non-uniqueness of MA Models

Consider an example of MA(1),
\[
X_t = Z_t + \theta Z_{t-1},
\]
whose ACF is
\[
\rho(\tau) =
\begin{cases}
1, & \text{if } \tau = 0,\\
\dfrac{\theta}{1+\theta^2}, & \text{if } \tau = \pm 1,\\
0, & \text{if } |\tau| > 1.
\end{cases}
\]
For $q = 1$, formula (4.14) means that the maximum value of $\rho(1)$ is 0.5. It can be verified directly from the formula for the ACF above. Treating $\rho(1)$ as a function of $\theta$ we can calculate its extrema. Denote
\[
f(\theta) = \frac{\theta}{1+\theta^2}.
\]
Then
\[
f'(\theta) = \frac{1-\theta^2}{(1+\theta^2)^2}.
\]
The derivative is equal to 0 at $\theta = \pm 1$ and the function $f$ attains its maximum at $\theta = 1$ and its minimum at $\theta = -1$. We have
\[
f(1) = \frac{1}{2}, \qquad f(-1) = -\frac{1}{2}.
\]
This fact can be helpful in recognizing MA(1) processes. In fact, an MA(1) with $|\theta| = 1$ may be uniquely identified from the autocorrelation function.
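As an illustrative check of formula (4.13) and of the behaviour of $\rho(1)$ for MA(1), here is a small sketch (Python; the helper name ma_acvf and the grid of $\theta$ values are assumptions, not part of the notes):

```python
import numpy as np

def ma_acvf(theta, sigma2=1.0):
    """ACVF of an MA(q) via (4.13): gamma(tau) = sigma^2 * sum_{j=0}^{q-|tau|} theta_j theta_{j+|tau|}."""
    psi = np.concatenate(([1.0], np.asarray(theta, dtype=float)))   # theta_0 = 1
    q = len(psi) - 1
    return np.array([sigma2 * np.dot(psi[:q + 1 - tau], psi[tau:]) for tau in range(q + 1)])

# MA(1): rho(1) = theta / (1 + theta^2) attains its maximum 1/2 at theta = 1
thetas = np.linspace(-3, 3, 601)
rho1 = thetas / (1 + thetas**2)
print(rho1.max(), thetas[rho1.argmax()])   # approximately 0.5, attained at theta = 1
print(ma_acvf([1.0]))                      # gamma(0) = 2, gamma(1) = 1, so rho(1) = 0.5
```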
However, it is easy to see that the form of the ACF stays the same for $\theta$ and for $\frac{1}{\theta}$. Take for example $\theta = 5$ and $\theta = \frac{1}{5}$. In both cases
\[
\rho(\tau) =
\begin{cases}
1, & \text{if } \tau = 0,\\
\dfrac{5}{26}, & \text{if } \tau = \pm 1,\\
0, & \text{if } |\tau| > 1.
\end{cases}
\]
Also, the pair $\sigma^2 = 1$, $\theta = 5$ gives the same ACVF as the pair $\sigma^2 = 25$, $\theta = \frac{1}{5}$, namely
\[
\gamma(\tau) =
\begin{cases}
(1+\theta^2)\sigma^2 = 26, & \text{if } \tau = 0,\\
\theta\sigma^2 = 5, & \text{if } \tau = \pm 1,\\
0, & \text{if } |\tau| > 1.
\end{cases}
\]
Hence, the MA(1) processes
\[
X_t = Z_t + \tfrac{1}{5} Z_{t-1}, \qquad Z_t \sim \text{iid } N(0, 25),
\]
and
\[
X_t = Y_t + 5 Y_{t-1}, \qquad Y_t \sim \text{iid } N(0, 1),
\]
are the same. We can only observe the variable $X_t$, not the noise variable, so we cannot distinguish between these two models. Except for the case of $|\theta| = 1$, a particular autocorrelation function will be compatible with two models. To which of the two models should we restrict our attention?

Invertibility of MA Processes

The MA(1) process can be expressed in terms of lagged values of $X_t$ by substituting repeatedly for lagged values of $Z_t$. We have
\[
Z_t = X_t - \theta Z_{t-1}.
\]
The substitution yields
\[
\begin{aligned}
Z_t &= X_t - \theta Z_{t-1}\\
&= X_t - \theta (X_{t-1} - \theta Z_{t-2})\\
&= X_t - \theta X_{t-1} + \theta^2 Z_{t-2}\\
&= X_t - \theta X_{t-1} + \theta^2 (X_{t-2} - \theta Z_{t-3})\\
&= X_t - \theta X_{t-1} + \theta^2 X_{t-2} - \theta^3 Z_{t-3}\\
&\;\;\vdots\\
&= X_t - \theta X_{t-1} + \theta^2 X_{t-2} - \theta^3 X_{t-3} + \theta^4 X_{t-4} - \dots + (-\theta)^n Z_{t-n}.
\end{aligned}
\]
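A quick numerical illustration of this non-uniqueness (a sketch; the helper name ma1_acvf is hypothetical and simply evaluates the ACVF formula above):

```python
import numpy as np

def ma1_acvf(theta, sigma2):
    """ACVF of MA(1): gamma(0) = (1 + theta^2) sigma^2, gamma(1) = theta sigma^2, 0 otherwise."""
    return np.array([(1 + theta**2) * sigma2, theta * sigma2, 0.0])

print(ma1_acvf(5.0, 1.0))    # [26., 5., 0.]
print(ma1_acvf(0.2, 25.0))   # [26., 5., 0.]  -- identical ACVF, so indistinguishable from X_t alone
```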
This can be rewritten as
\[
(-\theta)^n Z_{t-n} = Z_t - \sum_{j=0}^{n-1} (-\theta)^j X_{t-j}.
\]
However, if $|\theta| < 1$, then
\[
E\left( Z_t - \sum_{j=0}^{n-1} (-\theta)^j X_{t-j} \right)^2 = E\left( \theta^{2n} Z_{t-n}^2 \right) \xrightarrow[n \to \infty]{} 0
\]
and we say that the sum is convergent in the mean square sense. Hence, we obtain another representation of the model,
\[
Z_t = \sum_{j=0}^{\infty} (-\theta)^j X_{t-j}.
\]
This is a representation of another class of models, called infinite autoregressive (AR) models. So we inverted MA(1) to an infinite AR. It was possible due to the assumption that $|\theta| < 1$. Such a process is called an invertible process. This is a desired property of TS, so in the example we would choose the model with $\sigma^2 = 25$, $\theta = \frac{1}{5}$.
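A minimal sketch of the inversion (Python; the value of $\theta$, the series length, and the truncation orders are assumptions): for $|\theta| < 1$ the truncated sum $\sum_{j=0}^{n-1} (-\theta)^j X_{t-j}$ recovers $Z_t$ with an error of order $|\theta|^n$.

```python
import numpy as np

rng = np.random.default_rng(2)
theta, n_obs = 0.2, 5000
z = rng.normal(size=n_obs)
x = z.copy()
x[1:] += theta * z[:-1]                          # MA(1): X_t = Z_t + theta Z_{t-1}

def invert_ma1(x, theta, n_terms):
    """Truncated AR(infinity) inversion: Z_t approx sum_{j=0}^{n_terms-1} (-theta)^j X_{t-j}."""
    t = len(x) - 1                               # recover the last noise value Z_t
    return sum((-theta)**j * x[t - j] for j in range(n_terms))

for n_terms in (1, 2, 5, 10):
    print(n_terms, abs(invert_ma1(x, theta, n_terms) - z[-1]))   # error shrinks roughly like theta^n_terms
```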
4.4 Linear Processes

Definition 4.6. The TS $\{X_t\}$ is called a linear process if it has the representation
\[
X_t = \sum_{j=-\infty}^{\infty} \psi_j Z_{t-j}, \tag{4.15}
\]
for all $t$, where $Z_t \sim WN(0, \sigma^2)$ and $\{\psi_j\}$ is a sequence of constants such that $\sum_{j=-\infty}^{\infty} |\psi_j| < \infty$.

Remark. The condition $\sum_{j=-\infty}^{\infty} |\psi_j| < \infty$ ensures that the process converges in the mean square sense, that is,
\[
E\left( X_t - \sum_{j=-n}^{n} \psi_j Z_{t-j} \right)^2 \to 0 \quad \text{as } n \to \infty.
\]

Remark. MA($\infty$) is a linear process with $\psi_j = 0$ for $j < 0$ and $\psi_j = \theta_j$ for $j \ge 0$, that is, MA($\infty$) has the representation
\[
X_t = \sum_{j=0}^{\infty} \theta_j Z_{t-j},
\]
where $\theta_0 = 1$.

Note that the formula (4.15) can be written using the backward shift operator $B$. We have $Z_{t-j} = B^j Z_t$. Hence
\[
X_t = \sum_{j=-\infty}^{\infty} \psi_j Z_{t-j} = \left( \sum_{j=-\infty}^{\infty} \psi_j B^j \right) Z_t.
\]
Denoting
\[
\psi(B) = \sum_{j=-\infty}^{\infty} \psi_j B^j, \tag{4.16}
\]
we can write the linear process in a neat way,
\[
X_t = \psi(B) Z_t.
\]
The operator $\psi(B)$ is a linear filter, which when applied to a stationary process produces a stationary process. This fact is proved in the following proposition.

Proposition 4.3. Let $\{Y_t\}$ be a stationary TS with mean zero and autocovariance function $\gamma_Y$. If $\sum_{j=-\infty}^{\infty} |\psi_j| < \infty$, then the process
\[
X_t = \sum_{j=-\infty}^{\infty} \psi_j Y_{t-j} = \psi(B) Y_t \tag{4.17}
\]
is stationary with mean zero and autocovariance function
\[
\gamma_X(\tau) = \sum_{j=-\infty}^{\infty} \sum_{k=-\infty}^{\infty} \psi_j \psi_k \gamma_Y(\tau - k + j). \tag{4.18}
\]

Proof. The assumption $\sum_{j=-\infty}^{\infty} |\psi_j| < \infty$ assures convergence of the series. Now, since $E Y_t = 0$, we have
\[
E X_t = E\left( \sum_{j=-\infty}^{\infty} \psi_j Y_{t-j} \right) = \sum_{j=-\infty}^{\infty} \psi_j E(Y_{t-j}) = 0
\]
and
\[
\begin{aligned}
E(X_t X_{t+\tau}) &= E\left[ \left( \sum_{j=-\infty}^{\infty} \psi_j Y_{t-j} \right) \left( \sum_{k=-\infty}^{\infty} \psi_k Y_{t+\tau-k} \right) \right]\\
&= \sum_{j=-\infty}^{\infty} \sum_{k=-\infty}^{\infty} \psi_j \psi_k E(Y_{t-j} Y_{t+\tau-k})\\
&= \sum_{j=-\infty}^{\infty} \sum_{k=-\infty}^{\infty} \psi_j \psi_k \gamma_Y(\tau - k + j).
\end{aligned}
\]
It means that $\{X_t\}$ is a stationary TS with the autocovariance function given by formula (4.18).

Corollary 4.1. If $\{Y_t\}$ is a white noise process, then $\{X_t\}$ given by (4.17) is a stationary linear process with zero mean and the ACVF
\[
\gamma_X(\tau) = \sigma^2 \sum_{j=-\infty}^{\infty} \psi_j \psi_{j+\tau}. \tag{4.19}
\]
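To see Corollary 4.1 in action, the sketch below (Python; the filter coefficients $\psi_j$, the sample size, and the helper names are illustrative assumptions) compares the ACVF (4.19) with a sample estimate computed from a long simulated path of $X_t = \psi(B) Z_t$ with finitely many nonzero $\psi_j$.

```python
import numpy as np

def linear_filter_acvf(psi, sigma2, max_lag):
    """gamma_X(tau) = sigma^2 * sum_j psi_j psi_{j+tau}, formula (4.19), for a finite filter psi_0, ..., psi_m."""
    psi = np.asarray(psi, dtype=float)
    return np.array([sigma2 * np.dot(psi[:len(psi) - tau], psi[tau:]) if tau < len(psi) else 0.0
                     for tau in range(max_lag + 1)])

def sample_acvf(x, max_lag):
    """Sample autocovariance gamma_hat(tau), dividing by the series length."""
    x = np.asarray(x) - np.mean(x)
    n = len(x)
    return np.array([np.dot(x[:n - k], x[k:]) / n for k in range(max_lag + 1)])

psi = [1.0, 0.6, 0.3, 0.1]                       # a short filter psi(B) applied to white noise
rng = np.random.default_rng(3)
z = rng.normal(size=100_000)                     # white noise with sigma^2 = 1
x = np.convolve(z, psi, mode="valid")            # X_t = psi(B) Z_t

print(linear_filter_acvf(psi, 1.0, 5).round(3))
print(sample_acvf(x, 5).round(3))                # close to the theoretical values for a long sample
```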