Lecture 10: Serial Correlation

In this lecture, you will learn the following:

1. What is the nature of autocorrelation?
2. What are the theoretical and practical consequences of autocorrelation?
3. Since the assumption of nonautocorrelation relates to the unobservable disturbance ε_t, how does one know whether there is autocorrelation in any given situation?
4. What are the remedies for the problem of autocorrelation?

10.1 Nature of Autocorrelation

Below are the basic concepts of independence and serial correlation (or autocorrelation) for a time series.

Serial independence: the error terms ε_t and ε_s, for different observations t and s, are independently distributed. When one deals with time series data, this assumption is frequently violated: error terms for time periods not too far apart may be correlated.

Serial correlation (or autocorrelation): the error terms ε_t and ε_s, for t ≠ s, are correlated. This property is frequently observed in time series data.
Besides pure dependence over time, three factors can also lead to serially correlated errors: (1) omitted variables, (2) ignoring nonlinearities, and (3) measurement errors.

For example, suppose a dependent variable y_t is related to the independent variables x_{t1} and x_{t2}, but the investigator does not include x_{t2} in the model. The effect of this variable will then be captured by the error term ε_t. Because many time series exhibit trends over time, x_{t2} is likely to depend on x_{t−1,2}, x_{t−2,2}, .... This translates into apparent correlation between ε_t and ε_{t−1}, ε_{t−2}, ..., thereby violating the serial independence assumption. Thus, growth in omitted variables can cause autocorrelation in the errors.

Serial correlation can also be caused by misspecification of the functional form. Suppose, for example, that the relationship between y and x is quadratic but we fit a straight line. Then the error term ε_t will depend on x_t². If x has been growing over time, ε_t will exhibit the same growth, indicating autocorrelation.

Systematic errors in measurement can also cause autocorrelation. For example, suppose a firm updates its inventory in a given period. If there is a systematic error in the way inventory is measured, the cumulative inventory stock will reflect the accumulated measurement errors, and this will show up as serial correlation.

Example: Consider the consumption of electricity during different hours of the day. Because temperature patterns are similar between successive time periods, we can expect consumption patterns to be correlated between neighboring periods. If the model is not properly specified, this effect may show up as high correlation among errors from nearby periods.

Example: Consider stock market data. The price of a particular security or a stock market index at the close of successive days, or during successive hours, is likely to be serially correlated.
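As a quick illustration of the omitted-variable mechanism, the sketch below simulates a model whose true specification contains a trending regressor and then fits a regression that leaves it out. All variable names, coefficient values, and the sample size are assumptions made for the example, not taken from the lecture; the point is only that the omitted trend ends up in the residuals and makes their first-order autocorrelation large and positive.

```python
# Illustrative simulation: an omitted trending regressor induces serially
# correlated residuals in the misspecified regression.
import numpy as np

rng = np.random.default_rng(0)
T = 200
t = np.arange(T)

x1 = rng.normal(size=T)                          # included regressor
x2 = 0.05 * t + rng.normal(scale=0.2, size=T)    # omitted, trending regressor
y = 1.0 + 0.5 * x1 + 0.8 * x2 + rng.normal(scale=0.5, size=T)  # true errors are serially independent

# Misspecified regression: y on a constant and x1 only
X = np.column_stack([np.ones(T), x1])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
e = y - X @ beta_hat                             # residuals absorb the omitted trend

# First-order autocorrelation of the residuals
rho_hat = np.sum(e[1:] * e[:-1]) / np.sum(e**2)
print(f"estimated first-order residual autocorrelation: {rho_hat:.3f}")  # typically large and positive
```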
10.2 Serial Correlation of the First Order

If serial correlation is present, then Cov(ε_t, ε_s) ≠ 0 for t ≠ s; that is, the error for period t is correlated with the error for period s. There are many forms of process that can capture serial correlation. Two basic processes are the autoregressive (AR) process and the moving average (MA) process. Given that u_t is white noise, u_t ~ (0, σ_u²), the pth-order AR process and the qth-order MA process are defined as follows:

AR(p):  ε_t = ρ_1 ε_{t−1} + ρ_2 ε_{t−2} + ... + ρ_p ε_{t−p} + u_t
MA(q):  ε_t = u_t + θ_1 u_{t−1} + ... + θ_q u_{t−q}

In this chapter, we only discuss serial correlation in the autoregressive form. Specify the regression model with serially correlated errors as

y_t = β_0 + β_1 x_t + ε_t
ε_t = ρ ε_{t−1} + u_t,   −1 < ρ < 1

ρ is called the first-order autocorrelation coefficient, and the error term ε_t described above follows a first-order autoregressive process, AR(1). The white noise u_t is assumed to satisfy the following conditions.

DEFINITION OF WHITE NOISE WITH ZERO MEAN: {u_t, t = 1, 2, ..., T} are independently and identically distributed with zero mean and constant variance, so that E(u_t) = 0, E(u_t²) = σ_u² < ∞, and E(u_t u_{t−s}) = 0 for s ≠ 0.

REMARKS: Because u_t is white noise with zero mean, ε_t is correlated with all past errors. (Reason: ε_t depends on ε_{t−1}, so they are correlated. Although ε_t does not depend directly on ε_{t−2}, it does so indirectly through ε_{t−1}, because ε_{t−1} depends on ε_{t−2}.)

Positive autocorrelation: the covariance between the errors is positive. Negative autocorrelation: the covariance is negative. Under the stationary AR(1) process with |ρ| < 1,

Var(ε_t) ≡ σ_ε² = σ_u² / (1 − ρ²)  and  Cov(ε_t, ε_{t−s}) = σ_ε² ρ^s,  for s ≥ 0.
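The sketch below simulates an AR(1) error process and checks the stationary variance and autocorrelation formulas above numerically. The values of ρ, σ_u, and the series length are illustrative assumptions.

```python
# Simulate eps_t = rho * eps_{t-1} + u_t and compare sample moments with
# Var(eps) = sigma_u^2 / (1 - rho^2) and Corr(eps_t, eps_{t-s}) = rho^s.
import numpy as np

rng = np.random.default_rng(1)
rho, sigma_u, T = 0.7, 1.0, 200_000      # long series so sample moments are accurate

u = rng.normal(scale=sigma_u, size=T)    # white noise
eps = np.zeros(T)
for t in range(1, T):
    eps[t] = rho * eps[t - 1] + u[t]
eps = eps[1000:]                         # drop a burn-in so the process is near stationarity

print("sample Var(eps):", eps.var(), " theory:", sigma_u**2 / (1 - rho**2))
for s in (1, 2, 3):
    corr = np.corrcoef(eps[s:], eps[:-s])[0, 1]
    print(f"lag {s}: sample autocorrelation {corr:.3f}  theory {rho**s:.3f}")
```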
10.3 Consequences of Ignoring Serial Correlation

If we ignore serial correlation in the errors, the impacts on the OLS estimates are as follows.

OLS estimates (and forecasts based on them) remain unbiased and consistent even if the error terms are serially correlated. The problem is with the efficiency of the estimates. In the proof of the Gauss-Markov theorem that established efficiency, one of the steps involved minimizing the variance of the linear combination Σ_t a_t ε_t:

Var(Σ_t a_t ε_t) = Σ_t a_t² σ_ε² + Σ_{t≠s} a_t a_s Cov(ε_t, ε_s)

where the second summation is over all t and s that are different. If Cov(ε_t, ε_s) ≠ 0, the second term on the right-hand side does not vanish. Therefore, the best linear unbiased estimator (BLUE) that minimizes Var(Σ_t a_t ε_t) will not be the same as the OLS estimator; that is, the OLS estimates are not BLUE and are hence inefficient. Thus the consequences of ignoring autocorrelation are the same as those of ignoring heteroskedasticity: the OLS estimates and forecasts are unbiased and consistent, but inefficient.

If the serial correlation in ε_t is positive and the independent variable x_t grows over time, then the estimated residual variance σ̂² will be an underestimate and the value of R² will be an overestimate. In other words, the goodness of fit will be exaggerated and the estimated standard errors will be smaller than the true standard errors. In the general case, the estimated variances of the OLS regression coefficients will be biased.

Effects on Tests of Hypotheses

When the serial correlation in ε_t is positive and the independent variable x_t grows over time, the estimated standard errors are smaller than the true standard errors, so the true standard errors are underestimated. The t-statistics will therefore be overestimated, and a regression coefficient that appears to be significant may not really be so.

Effects: the estimated variances of the parameters are biased and inconsistent, so the t- and F-tests are no longer valid.
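The understatement of the standard errors can be illustrated with a small Monte Carlo sketch. The regression design, ρ = 0.8, and the number of replications below are illustrative assumptions; the comparison is between the true sampling variability of the slope estimator and the conventional OLS standard error it reports.

```python
# Monte Carlo sketch: trending regressor + positively autocorrelated errors.
# The conventional OLS standard error understates the true sampling spread.
import numpy as np

rng = np.random.default_rng(2)
T, rho, n_rep = 100, 0.8, 2000
x = 0.1 * np.arange(T) + rng.normal(size=T)   # trending regressor, fixed across replications
X = np.column_stack([np.ones(T), x])
XtX_inv = np.linalg.inv(X.T @ X)

slopes, reported_se = [], []
for _ in range(n_rep):
    u = rng.normal(size=T)
    eps = np.zeros(T)
    for t in range(1, T):                     # AR(1) errors
        eps[t] = rho * eps[t - 1] + u[t]
    y = 1.0 + 0.5 * x + eps
    b = XtX_inv @ X.T @ y
    e = y - X @ b
    s2 = e @ e / (T - 2)                      # conventional residual variance estimate
    slopes.append(b[1])
    reported_se.append(np.sqrt(s2 * XtX_inv[1, 1]))

print("true sampling std of slope :", np.std(slopes))
print("average reported OLS s.e.  :", np.mean(reported_se))   # typically much smaller
```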
Effect on Forecasting

Forecasts based on OLS estimates will be unbiased, but they are inefficient, with larger variances than necessary. By explicitly taking the serial correlation among the errors into account, it is possible to generate better forecasts than those produced by the OLS procedure.

Suppose we ignore the AR(1) serial correlation and obtain the OLS estimates β̂_0 and β̂_1. The OLS prediction would be ŷ_t = β̂_0 + β̂_1 x_t. In the case of first-order serial correlation, however, ε_t = ρ ε_{t−1} + u_t is partly predictable, provided ρ can be estimated (call the estimate ρ̂): the predictable part is ρ̂ û_{t−1}, and the residual for the previous period, û_{t−1}, is known at time t. Therefore, the AR(1) prediction is

ỹ_t = β̂_0 + β̂_1 x_t + ρ̂ û_{t−1} = β̂_0 + β̂_1 x_t + ρ̂ (y_{t−1} − β̂_0 − β̂_1 x_{t−1})

making use of the fact that û_{t−1} = y_{t−1} − β̂_0 − β̂_1 x_{t−1}. Thus ỹ_t will be more efficient than the forecast obtained by the OLS procedure alone. The procedure for estimating ρ is described below.
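A minimal sketch of this AR(1) prediction follows; the simulated data and parameter values are illustrative assumptions. After OLS, ρ̂ is computed from the residuals and the last residual is used to adjust the one-period-ahead forecast.

```python
# AR(1)-adjusted one-step forecast: y_tilde = b0 + b1*x_T + rho_hat * u_hat_{T-1}.
import numpy as np

rng = np.random.default_rng(3)
T, rho = 201, 0.7
x = rng.normal(size=T).cumsum() * 0.1 + rng.normal(size=T)
u = rng.normal(scale=0.5, size=T)
eps = np.zeros(T)
for t in range(1, T):
    eps[t] = rho * eps[t - 1] + u[t]
y = 2.0 + 1.5 * x + eps

# Fit by OLS on the first T-1 observations, then forecast the last one
Xfit = np.column_stack([np.ones(T - 1), x[:-1]])
b0, b1 = np.linalg.lstsq(Xfit, y[:-1], rcond=None)[0]
e = y[:-1] - (b0 + b1 * x[:-1])

rho_hat = np.sum(e[1:] * e[:-1]) / np.sum(e**2)   # estimate of the AR(1) coefficient

y_ols = b0 + b1 * x[-1]                           # OLS forecast, ignores serial correlation
y_ar1 = y_ols + rho_hat * e[-1]                   # adds rho_hat * (previous residual)

print(f"actual {y[-1]:.3f}  OLS forecast {y_ols:.3f}  AR(1) forecast {y_ar1:.3f}")
```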
PROPERTY: If serial correlation among the stochastic disturbance terms in a regression model is ignored and the OLS procedure is used to estimate the parameters, the following properties hold:

1. The estimates, and forecasts based on them, are still unbiased and consistent. The consistency property does not hold, however, if lagged dependent variables are included as explanatory variables.

2. The OLS estimates are no longer BLUE and are inefficient. Forecasts are also inefficient.

3. The estimated variances of the regression coefficients are biased, and hence tests of hypotheses are invalid. If the serial correlation is positive and the independent variable x_t grows over time, the standard errors underestimate the true values. This means that the computed R² will be an overestimate, indicating a better fit than actually exists. Also, the t-statistics in such a case will tend to appear more significant than they actually are.

10.4 Testing for First-Order Serial Correlation

The Residual Plot

Residual plot: a graph of the estimated residuals e_t against time (t). If successive residuals tend to cluster on one side of the zero line or the other, this is a graphical indication of the presence of serial correlation. As a first step toward identifying serial correlation, it is good practice to plot e_t against t and look for this clustering effect.
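A minimal plotting sketch of the residual plot; the synthetic, positively autocorrelated residual series below is an illustrative assumption used only so the example runs on its own.

```python
# Plot residuals against time and look for runs on one side of the zero line.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(4)
e = np.zeros(100)
for t in range(1, 100):                     # synthetic autocorrelated residuals
    e[t] = 0.8 * e[t - 1] + rng.normal(scale=0.5)

plt.plot(np.arange(1, 101), e, marker="o")
plt.axhline(0.0, color="black", linewidth=1)
plt.xlabel("t")
plt.ylabel("residual e_t")
plt.title("Residual plot")
plt.show()
```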
The Durbin-Watson Test

Durbin and Watson (1950, 1951) proposed a test for the multiple regression model with AR(1) errors:

y_t = β_0 + β_1 x_{t1} + β_2 x_{t2} + ... + β_k x_{tk} + ε_t
ε_t = ρ ε_{t−1} + u_t,   −1 < ρ < 1

The Durbin-Watson statistic is calculated by the following steps.

STEP 1: Estimate the model by OLS and compute the residuals e_t = y_t − β̂_0 − β̂_1 x_{t1} − ... − β̂_k x_{tk}.

STEP 2: Compute the Durbin-Watson statistic

d = Σ_{t=2}^T (e_t − e_{t−1})² / Σ_{t=1}^T e_t²

It is shown below that 0 ≤ d ≤ 4. The exact distribution of d depends on ρ, which is unknown, as well as on the observations on the x's. Durbin and Watson (1950) showed that the distribution of d is bounded by two limiting distributions. See Savin and White (1977) for the critical values of the limiting distributions of d, namely d_U and d_L, for different sample sizes T and numbers of coefficients k, not counting the constant term. These bounds are used to construct critical regions for the Durbin-Watson test.

STEP 3a: To test H_0: ρ = 0 against H_1: ρ > 0 (a one-tailed test), first look up the critical values d_L and d_U for the Durbin-Watson statistic. Reject H_0 if d ≤ d_L. If d ≥ d_U, do not reject H_0. If d_L < d < d_U, the test is inconclusive.

STEP 3b: To test for negative serial correlation (that is, H_1: ρ < 0), use 4 − d. This is done when d is greater than 2. If 4 − d ≤ d_L, conclude that there is significant negative autocorrelation. If 4 − d ≥ d_U, conclude that there is no negative autocorrelation. The test is inconclusive if d_L < 4 − d < d_U.

REMARKS: The inconclusiveness of the DW test arises from the fact that there is no exact small-sample distribution for the DW statistic d. When the test is inconclusive, one can try the Lagrange multiplier test described next.

EXPLANATION: From the estimated residuals we can obtain an estimate of the first-order serial correlation coefficient as

ρ̂ = Σ_{t=2}^T e_t e_{t−1} / Σ_{t=1}^T e_t²

This estimate is approximately equal to the one obtained by regressing e_t against e_{t−1} without a constant term. It can be shown that the DW statistic d is approximately equal to 2(1 − ρ̂):

d ≈ 2(1 − ρ̂)

Because ρ can range from −1 to +1, the range of d is 0 to 4. When ρ̂ is 0, d is 2; thus a DW statistic of nearly 2 means there is no first-order serial correlation. Strong positive autocorrelation means ρ is close to +1, which implies low values of d (near 0). Similarly, values of d close to 4 indicate strong negative correlation, that is, ρ close to −1.

The DW test is invalid if the right-hand side of the regression equation includes lagged dependent variables y_{t−1}, y_{t−2}, ....
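The sketch below carries out Steps 1 and 2 on simulated data (the data-generating values are illustrative assumptions) and checks the approximation d ≈ 2(1 − ρ̂). Step 3 still requires the tabulated bounds d_L and d_U.

```python
# Durbin-Watson statistic from OLS residuals, plus the 2*(1 - rho_hat) approximation.
import numpy as np

rng = np.random.default_rng(5)
T = 120
x = rng.normal(size=T).cumsum()
u = rng.normal(size=T)
eps = np.zeros(T)
for t in range(1, T):
    eps[t] = 0.6 * eps[t - 1] + u[t]
y = 1.0 + 0.8 * x + eps

# STEP 1: OLS residuals
X = np.column_stack([np.ones(T), x])
e = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]

# STEP 2: Durbin-Watson statistic
d = np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

rho_hat = np.sum(e[1:] * e[:-1]) / np.sum(e ** 2)
print(f"d = {d:.3f},  2*(1 - rho_hat) = {2 * (1 - rho_hat):.3f}")
# Compare d with the tabulated bounds d_L and d_U (Savin and White, 1977)
# for this T and number of regressors k to carry out Step 3a or 3b.
```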
The Lagrange Multiplier Test

The LM statistic is useful in identifying serial correlation not only of the first order but of higher orders as well. Here we confine ourselves to the first-order case; the general AR(p) case is discussed later. Substituting the AR(1) error into the regression gives

y_t = β_0 + β_1 x_{t1} + β_2 x_{t2} + ... + β_k x_{tk} + ρ ε_{t−1} + u_t

The test of ρ = 0 can therefore be treated as an LM test for the addition of the variable ε_{t−1} (which is unobservable, so one uses e_{t−1} instead).

Steps for carrying out the LM test:

STEP 1: Estimate the regression model by OLS and compute its estimated residuals e_t.

STEP 2: Regress e_t against a constant, x_{t1}, ..., x_{tk}, and e_{t−1}, using the T − 1 observations 2 through T. The LM statistic is (T − 1)R_e², where R_e² is the R-squared of this auxiliary regression. T − 1 is used because the effective number of observations is T − 1.

STEP 3: Reject the null hypothesis of zero autocorrelation in favor of the alternative that ρ ≠ 0 if (T − 1)R_e² > χ²_{1,1−α}, the (1 − α) quantile of the chi-square distribution with 1 d.f. (the point with area α to the right of it), where α is the significance level.

REMARKS: If there is serial correlation in the residuals, we expect e_t to be related to e_{t−1}. This is the motivation behind the auxiliary regression, in which e_{t−1} is included along with all the independent variables in the model. The LM test does not suffer from the inconclusiveness of the DW test. However, it is a large-sample test and needs at least about 30 d.f. to be meaningful.
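A sketch of the three steps of the first-order LM test on simulated data; the data-generating values and the 5% significance level are illustrative assumptions.

```python
# First-order LM test: auxiliary regression of e_t on a constant, x_t and e_{t-1},
# then compare (T-1)*R^2 with the chi-square(1) critical value.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(6)
T = 150
x = rng.normal(size=T)
u = rng.normal(size=T)
eps = np.zeros(T)
for t in range(1, T):
    eps[t] = 0.5 * eps[t - 1] + u[t]
y = 0.5 + 1.0 * x + eps

# STEP 1: OLS residuals
X = np.column_stack([np.ones(T), x])
e = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]

# STEP 2: auxiliary regression over observations 2..T
Z = np.column_stack([np.ones(T - 1), x[1:], e[:-1]])
g = np.linalg.lstsq(Z, e[1:], rcond=None)[0]
resid = e[1:] - Z @ g
R2 = 1.0 - resid @ resid / np.sum((e[1:] - e[1:].mean()) ** 2)

# STEP 3: compare (T-1)*R^2 with the chi-square(1) critical value
LM = (T - 1) * R2
alpha = 0.05
print(f"LM = {LM:.2f}, critical value = {chi2.ppf(1 - alpha, df=1):.2f}")
```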
10.5 Treatment of Serial Correlation

Model Formulation in First Differences

Granger and Newbold (1974, 1976) cautioned against the spurious regressions that can arise when a regression is estimated in the levels of trending variables, especially when the DW statistic indicates significant autocorrelation. A common way to get around this problem is to formulate the model in first differences, where the first difference of a series is the difference between its value at time t and at time t − 1. That is, we estimate

Δy_t = β_1 Δx_t + u_t

where Δy_t = y_t − y_{t−1} and Δx_t = x_t − x_{t−1}. The first-difference solution is not always appropriate, however. The first-difference model can be rewritten as

y_t = y_{t−1} + β_1 x_t − β_1 x_{t−1} + u_t

that is, it imposes a coefficient of one on y_{t−1} and equal and opposite coefficients on x_t and x_{t−1}, restrictions the data may not support.

Estimation Procedures

When modified functional forms do not eliminate the autocorrelation, several estimation procedures are available that produce more efficient estimates than the OLS procedure. These methods need to be applied only to time series data.

REMARKS: The DW test is meaningless for cross-section data, because one can rearrange the observations in any manner and obtain an acceptable DW statistic. Because time series data cannot be rearranged, one does need to be concerned about possible serial correlation there.

Some standard procedures for estimating models with AR(1) serial correlation are listed below.

Cochrane-Orcutt (CORC) Iterative Procedure

Cochrane and Orcutt (1949): this procedure transforms the regression model into a form to which the OLS procedure is applicable.
Quasi-differencing (generalized differencing) transformation: generate transformed variables y* and x*. Writing the model for period t − 1,

y_{t−1} = β_0 + β_1 x_{t−1,1} + β_2 x_{t−1,2} + ... + β_k x_{t−1,k} + ε_{t−1}

Multiplying by ρ and subtracting from the original equation, we obtain

y_t − ρ y_{t−1} = β_0 (1 − ρ) + β_1 [x_{t1} − ρ x_{t−1,1}] + β_2 [x_{t2} − ρ x_{t−1,2}] + ... + β_k [x_{tk} − ρ x_{t−1,k}] + u_t

where we have used the fact that ε_t = ρ ε_{t−1} + u_t. Rewriting,

y*_t = β*_0 + β_1 x*_{t1} + β_2 x*_{t2} + ... + β_k x*_{tk} + u_t      (10.1)

where y*_t = y_t − ρ y_{t−1}, β*_0 = β_0 (1 − ρ), and x*_{ti} = x_{ti} − ρ x_{t−1,i}, for t = 2, 3, ..., T and i = 1, ..., k. Note that the error term of the transformed model satisfies all the properties needed for applying the OLS procedure. If ρ were known, we could apply OLS to the transformed y* and x* and obtain estimates that are BLUE. However, ρ is unknown and has to be estimated from the sample.

Steps for carrying out the CORC procedure:

STEP 1: Estimate the original equation by OLS and compute its residuals e_t.

STEP 2: Estimate the first-order serial correlation coefficient ρ̂ by regressing e_t against e_{t−1}.

STEP 3: Transform the variables as y*_t = y_t − ρ̂ y_{t−1}, x*_{t1} = x_{t1} − ρ̂ x_{t−1,1}, and so on.

STEP 4: Regress y*_t against a constant, x*_{t1}, x*_{t2}, ..., x*_{tk} and obtain the OLS estimates β̂*_j, j = 0, 1, ..., k.

STEP 5: Derive the estimate of β_0 as β̂_0 = β̂*_0 / (1 − ρ̂). Plug β̂_0 and the estimated β̂_j, j = 1, ..., k, into the original regression to obtain a new set of residuals. Then go back and repeat Step 2 with these new residuals, until the following stopping rule applies.

STEP 6: The iteration stops when the estimates of ρ from two successive iterations differ by no more than some preselected tolerance. The final ρ̂ is then used to obtain the CORC estimates from the transformed regression (10.1).
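A compact sketch of the CORC iteration (Steps 1 through 6) for a model with a single regressor follows. The simulated data, the stopping tolerance of 1e-6, and the cap of 50 iterations are illustrative assumptions, since the lecture leaves the tolerance unspecified.

```python
# Cochrane-Orcutt iterative procedure for y_t = b0 + b1*x_t + eps_t, eps_t AR(1).
import numpy as np

rng = np.random.default_rng(7)
T = 200
x = rng.normal(size=T).cumsum() * 0.1
u = rng.normal(scale=0.5, size=T)
eps = np.zeros(T)
for t in range(1, T):
    eps[t] = 0.7 * eps[t - 1] + u[t]
y = 1.0 + 2.0 * x + eps

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

# STEP 1: OLS on the original equation
beta = ols(np.column_stack([np.ones(T), x]), y)
rho_hat, rho_old = 0.0, np.inf

for _ in range(50):                                          # iterate Steps 2-5
    e = y - beta[0] - beta[1] * x
    rho_hat = np.sum(e[1:] * e[:-1]) / np.sum(e[:-1] ** 2)   # STEP 2: regress e_t on e_{t-1}
    y_star = y[1:] - rho_hat * y[:-1]                        # STEP 3: quasi-differenced variables
    x_star = x[1:] - rho_hat * x[:-1]
    b_star = ols(np.column_stack([np.ones(T - 1), x_star]), y_star)   # STEP 4
    beta = np.array([b_star[0] / (1 - rho_hat), b_star[1]])           # STEP 5: recover beta_0
    if abs(rho_hat - rho_old) < 1e-6:                        # STEP 6: stopping rule
        break
    rho_old = rho_hat

print("rho_hat:", round(rho_hat, 3), " beta_hat:", np.round(beta, 3))
```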
Hildreth-Lu (HILU) Search Procedure

Steps of the Hildreth and Lu (1960) procedure:

STEP 1: Choose a value of ρ (say ρ_1). Using this value, transform the variables and estimate the transformed regression (10.1) by OLS.

STEP 2: From these estimates, derive the residuals û_t of Equation (10.1) and the associated error sum of squares. Call it SSR(ρ_1). Next choose a different ρ (say ρ_2) and repeat Steps 1 and 2.

STEP 3: By varying ρ from −1 to +1 in some systematic way (say, in steps of length 0.05 or 0.01), we obtain a series of values SSR(ρ). Choose the ρ for which SSR(ρ) is a minimum. This is the final ρ, the value that globally minimizes the error sum of squares of the transformed model. Equation (10.1) is then estimated with this final ρ as the optimum solution.

A Comparison of the Two Procedures

The HILU procedure searches over values of ρ between −1 and +1 for the one that minimizes the sum of squared residuals of the transformed equation. If the step intervals are small, the procedure involves running a large number of regressions; hence, compared with the CORC procedure, the HILU method is computationally intensive. The CORC procedure iterates to a local minimum of SSR(ρ) and might miss the global minimum if there is more than one local minimum.
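A sketch of the HILU grid search over ρ, reusing the quasi-differencing transformation of Equation (10.1); the step length of 0.01 and the simulated data are illustrative assumptions.

```python
# Hildreth-Lu search: pick the rho on a grid that minimizes the SSR of the
# quasi-differenced regression.
import numpy as np

rng = np.random.default_rng(8)
T = 200
x = rng.normal(size=T).cumsum() * 0.1
u = rng.normal(scale=0.5, size=T)
eps = np.zeros(T)
for t in range(1, T):
    eps[t] = 0.7 * eps[t - 1] + u[t]
y = 1.0 + 2.0 * x + eps

def ssr_for_rho(rho):
    """SSR of the transformed regression (10.1) for a candidate rho."""
    y_star = y[1:] - rho * y[:-1]
    X_star = np.column_stack([np.ones(T - 1), x[1:] - rho * x[:-1]])
    b = np.linalg.lstsq(X_star, y_star, rcond=None)[0]
    resid = y_star - X_star @ b
    return resid @ resid

grid = np.arange(-0.99, 1.0, 0.01)
ssr = np.array([ssr_for_rho(r) for r in grid])
rho_best = grid[ssr.argmin()]            # rho that minimizes the transformed SSR
print("HILU estimate of rho:", round(rho_best, 2))
```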
10.6 High-Order Serial Correlation

AR(p): with a pth-order autoregressive process for the errors, the model is

y_t = β_0 + β_1 x_{t1} + β_2 x_{t2} + ... + β_k x_{tk} + ε_t
ε_t = ρ_1 ε_{t−1} + ρ_2 ε_{t−2} + ρ_3 ε_{t−3} + ... + ρ_p ε_{t−p} + u_t

Lagrange Multiplier Test of Higher-Order Autocorrelation

Combining the two equations above, we obtain

y_t = β_0 + β_1 x_{t1} + ... + β_k x_{tk} + ρ_1 ε_{t−1} + ρ_2 ε_{t−2} + ... + ρ_p ε_{t−p} + u_t

The null hypothesis is that each of the ρ's is zero (that is, ρ_1 = ρ_2 = ... = ρ_p = 0), against the alternative that at least one of them is not zero. This test is very similar to the one for testing the addition of new variables; in this case, the new variables are ε_{t−1}, ε_{t−2}, ..., ε_{t−p}, which are estimated by e_{t−1}, e_{t−2}, ..., e_{t−p}.

Steps of the LM test for higher-order AR(p) serial correlation:

STEP 1: Estimate the original regression by OLS and obtain the residuals e_t.

STEP 2: Regress e_t against all the independent variables x_{t1}, ..., x_{tk} plus e_{t−1}, e_{t−2}, ..., e_{t−p}. The effective number of observations used in the auxiliary regression is T − p, because e_{t−p} is defined only for periods p + 1 through T.

STEP 3: Compute (T − p)R_e² from the auxiliary regression run in Step 2. If it exceeds χ²_{p,1−α}, the (1 − α) quantile of the chi-square distribution with p d.f. (the point with area α to the right of it), then reject H_0: ρ_1 = ρ_2 = ... = ρ_p = 0 in favor of H_1: at least one of the ρ's is different from zero.
Estimating a Model with General-Order Autoregressive Errors

If the LM test rejects the null hypothesis of no serial correlation, we must efficiently estimate the parameters of the quasi-differenced equation with pth-order autoregressive errors:

y_t − ρ_1 y_{t−1} − ρ_2 y_{t−2} − ... − ρ_p y_{t−p} = β_0 (1 − ρ_1 − ρ_2 − ... − ρ_p) + β_1 [x_{t1} − ρ_1 x_{t−1,1} − ... − ρ_p x_{t−p,1}] + ... + β_k [x_{tk} − ρ_1 x_{t−1,k} − ... − ρ_p x_{t−p,k}] + u_t

STEP 1: Estimate the original regression model by OLS and obtain the residuals e_t.

STEP 2: Regress e_t against e_{t−1}, e_{t−2}, ..., e_{t−p} (with no constant term) to obtain the estimates ρ̂_1, ρ̂_2, ..., ρ̂_p of the coefficients on e_{t−j}, j = 1, 2, ..., p. Only T − p observations are used here.

STEP 3: Using these estimates ρ̂_j, j = 1, ..., p, transform y and the x's into the new variables y*_t = y_t − ρ̂_1 y_{t−1} − ... − ρ̂_p y_{t−p} and x*_{ti} = x_{ti} − ρ̂_1 x_{t−1,i} − ... − ρ̂_p x_{t−p,i}, i = 1, 2, ..., k.

STEP 4: Estimate the transformed model by regressing y*_t against a constant and x*_{ti}, i = 1, 2, ..., k. Using the estimate β̂*_0, calculate β̂_0 = β̂*_0 / (1 − ρ̂_1 − ρ̂_2 − ... − ρ̂_p).

STEP 5: Plug this β̂_0, along with the estimates β̂_j, j = 1, 2, ..., k, into the original regression to compute a revised set of residuals. Then go back to Step 2 and iterate until some convergence criterion is satisfied.

REMARKS: The ρ̂'s obtained in the final iteration of Step 2 are then used to make one last transformation of the data and estimate the β's.

Forecasts and Goodness of Fit in AR Models

The predicted y_t is

ŷ_t = β̂_0 + β̂_1 x_{t1} + ... + β̂_k x_{tk} + ρ̂_1 ε̂_{t−1} + ρ̂_2 ε̂_{t−2} + ... + ρ̂_p ε̂_{t−p}

The forecast of y_t obtained this way will be more efficient than the OLS prediction, which ignores the residual terms.
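A sketch of this forecast correction for p = 2 on simulated data follows; the choice of p, the data-generating values, and the variable names are illustrative assumptions. The OLS prediction is augmented with ρ̂_1 ê_{t−1} + ρ̂_2 ê_{t−2}, where the ρ̂'s come from regressing the residuals on their own lags as in Step 2 above.

```python
# AR(p) forecast correction with p = 2: y_hat = b0 + b1*x_T + rho1*e_{T-1} + rho2*e_{T-2}.
import numpy as np

rng = np.random.default_rng(9)
T, p = 301, 2
x = rng.normal(size=T)
u = rng.normal(scale=0.5, size=T)
eps = np.zeros(T)
for t in range(p, T):
    eps[t] = 0.5 * eps[t - 1] + 0.3 * eps[t - 2] + u[t]
y = 1.0 + 1.2 * x + eps

# OLS on the first T-1 observations; forecast observation T
X = np.column_stack([np.ones(T - 1), x[:-1]])
b0, b1 = np.linalg.lstsq(X, y[:-1], rcond=None)[0]
e = y[:-1] - (b0 + b1 * x[:-1])

# Regress e_t on its first p lags (no constant) to get rho_1, ..., rho_p
E = np.column_stack([e[p - j - 1:-(j + 1)] for j in range(p)])  # columns: e_{t-1}, e_{t-2}
rho = np.linalg.lstsq(E, e[p:], rcond=None)[0]

y_ols = b0 + b1 * x[-1]
y_arp = y_ols + rho[0] * e[-1] + rho[1] * e[-2]   # adds the predictable part of eps_T
print(f"actual {y[-1]:.3f}  OLS {y_ols:.3f}  AR(p) {y_arp:.3f}")
```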
10.7 Engle's ARCH Test

In many time series, the variance of the prediction errors is not constant but differs from period to period. For instance, free-floating exchange rates fluctuate a great deal, making their variances larger in some periods than in others, and increased volatility of security prices is often an indication that variances are not constant over time.

Engle (1982) introduced a new approach to modelling heteroskedasticity in the time series context. Consider the regression model

y_t = β_0 + β_1 x_{t1} + β_2 x_{t2} + ... + β_k x_{tk} + ε_t

where Var(ε_t | F_{t−1}) = σ_t², with F_{t−1} the information available at time t − 1.

ARCH (autoregressive conditional heteroskedasticity) model: specify the conditional variance process as

σ_t² = α_0 + α_1 σ_{t−1}² + ... + α_p σ_{t−p}² + v_t

This conditional variance process is denoted the pth-order ARCH, or ARCH(p), process. The conditional variance at time t depends on the variances in previous periods, hence the term conditional heteroskedasticity.

ARCH test: this is a test of the null hypothesis H_0: α_1 = α_2 = ... = α_p = 0.

STEP 1: Estimate the β's in the original equation by OLS.

STEP 2: Compute the residuals e_t = y_t − β̂_0 − β̂_1 x_{t1} − β̂_2 x_{t2} − ... − β̂_k x_{tk}, square them to obtain e_t², and generate e_{t−1}², e_{t−2}², ..., e_{t−p}².

STEP 3: Regress e_t² against a constant, e_{t−1}², e_{t−2}², ..., and e_{t−p}². This is the auxiliary regression, which uses T − p observations.

STEP 4: Using R²_{e²}, the R-squared of the auxiliary regression, compute LM = (T − p)R²_{e²}. Under the null hypothesis H_0: α_1 = α_2 = ... = α_p = 0, (T − p)R²_{e²} has the chi-square distribution with p d.f. Reject H_0 if (T − p)R²_{e²} > χ²_{p,1−α}, the (1 − α) quantile of χ²_p (the point with area α to the right of it).
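A sketch of the four ARCH-test steps with one lag (p = 1) follows. The simulated data, whose error variance depends on the previous squared error (an illustrative ARCH-type process), and the 5% significance level are assumptions made for the example.

```python
# ARCH test with p = 1: regress squared OLS residuals on a constant and their
# first lag, then compare (T-1)*R^2 with the chi-square(1) critical value.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(10)
T, alpha = 500, 0.05

# Simulate a regression whose error variance depends on the previous squared error
x = rng.normal(size=T)
eps = np.zeros(T)
for t in range(1, T):
    sig2_t = 0.5 + 0.5 * eps[t - 1] ** 2
    eps[t] = rng.normal(scale=np.sqrt(sig2_t))
y = 1.0 + 0.8 * x + eps

# STEPS 1-2: OLS residuals and their squares
X = np.column_stack([np.ones(T), x])
e = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
e2 = e ** 2

# STEP 3: auxiliary regression of e_t^2 on a constant and e_{t-1}^2
Z = np.column_stack([np.ones(T - 1), e2[:-1]])
g = np.linalg.lstsq(Z, e2[1:], rcond=None)[0]
resid = e2[1:] - Z @ g
R2 = 1.0 - resid @ resid / np.sum((e2[1:] - e2[1:].mean()) ** 2)

# STEP 4: LM statistic against the chi-square(1) critical value
LM = (T - 1) * R2
print(f"LM = {LM:.2f}, critical value = {chi2.ppf(1 - alpha, df=1):.2f}")
```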