DEPARTMENT OF ECONOMICS. Unit ECON Introduction to Econometrics. Notes 4: R² and F tests


These notes provide a summary of the lectures. They are not a complete account of the unit material. You should also consult the reading as given in the unit outline and the lectures.

1. R²

Once a regression has been estimated, it is important to evaluate the results. The best known way of doing this is by using a statistic called R². In the simple regression model

    E(Yi | Xi) = α + βXi,   i = 1, ..., n

or

    Yi = α + βXi + ui,   where E(ui | Xi) = 0,

we estimate α and β by OLS. We can then write

    Yi = α̂ + β̂Xi + ei                                  (1)

where α̂ and β̂ are the OLS estimators of α and β respectively and the ei are the OLS residuals. Since α̂ = Ȳ − β̂X̄, we can write (1) as

    (Yi − Ȳ) = β̂(Xi − X̄) + ei                          (2)

Both sides of (2) can be squared and summed to give

    Σ(Yi − Ȳ)² = β̂²Σ(Xi − X̄)² + Σei²                   (3)

since the cross product term 2β̂Σ(Xi − X̄)ei = 0.
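The decomposition just derived can be checked numerically. Below is a minimal sketch in Python; the data are made up purely for illustration and are not part of the unit material.

```python
# Verify TSS = ESS + RSS for a simple OLS regression (illustrative data).
xs = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
ys = [2.1, 3.9, 4.2, 5.8, 7.1, 7.9, 9.2, 10.1, 11.8, 12.2]

n = len(xs)
x_bar = sum(xs) / n
y_bar = sum(ys) / n

# OLS estimates: beta_hat = Sxy / Sxx, alpha_hat = y_bar - beta_hat * x_bar
s_xy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
s_xx = sum((x - x_bar) ** 2 for x in xs)
beta_hat = s_xy / s_xx
alpha_hat = y_bar - beta_hat * x_bar

residuals = [y - (alpha_hat + beta_hat * x) for x, y in zip(xs, ys)]

tss = sum((y - y_bar) ** 2 for y in ys)       # total sum of squares
ess = beta_hat ** 2 * s_xx                    # explained sum of squares
rss = sum(e ** 2 for e in residuals)          # residual sum of squares

r_squared = ess / tss
assert abs(tss - (ess + rss)) < 1e-9          # the decomposition (3)
assert 0.0 <= r_squared <= 1.0
```

The final two assertions are exactly relationships (3) and the bound on R² stated in the text.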
Equation (3) is an important relationship. Each term is referred to as a kind of sum of squares:

    Σ(Yi − Ȳ)²          Total sum of squares (TSS)
    β̂²Σ(Xi − X̄)²        Explained sum of squares (ESS)
    Σei²                Residual sum of squares (RSS)

Thus (3) says TSS = ESS + RSS. R² is defined in the following way:

    R² = ESS / TSS

Because of (3), 0 ≤ R² ≤ 1. R² is often regarded as the proportion of the variance of the dependent variable which is explained by the regression line. As such, a higher value of R² is regarded as better than a lower value. Unfortunately, adding spurious explanatory variables to a regression estimated by OLS will always raise R², so a high value of R² is not always a good sign. In practice it is often easy to find comparatively high values of R² in regressions using time series samples, and in this context R² is not very informative. Lower values of R² generally occur in regressions using large cross-section samples, and in this context R² is more useful (although see the remarks on the F test below).

2. F tests

Whenever we wish to test a null hypothesis which contains restrictions on more than one coefficient, or consists of more than one linear restriction, a convenient test statistic has the F distribution if the null hypothesis is true. Suppose the model is

    Yi = α0 + α1X1i + α2X2i + α3X3i + ui,   i = 1, 2, ..., n          (4)

Examples of such null hypotheses would be (i) α1 = 0, α2 = 0 or (ii) α1 + α2 + α3 = 1.
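The claim above that adding a spurious regressor can never lower R² is easy to check by simulation. A sketch using NumPy; the data are randomly generated and the variable names are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 + rng.normal(size=n)

def rss_of(design, y):
    """Residual sum of squares from an OLS fit of y on the given design matrix."""
    coef, *_ = np.linalg.lstsq(design, y, rcond=None)
    resid = y - design @ coef
    return float(resid @ resid)

tss = float(((y - y.mean()) ** 2).sum())

X_base = np.column_stack([np.ones(n), x1])
X_spur = np.column_stack([X_base, rng.normal(size=n)])  # add an irrelevant regressor

r2_base = 1 - rss_of(X_base, y) / tss
r2_spur = 1 - rss_of(X_spur, y) / tss

# OLS can only lower (or leave unchanged) the RSS when a regressor is added,
# so R-squared never falls:
assert r2_spur >= r2_base - 1e-12
```

This is why a high R² from a regression with many regressors is, on its own, weak evidence of a good model.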
The F test procedure compares the value of RSS under the null (that is, when the model is restricted by the null) with the RSS when the model is unrestricted. Thus the restricted model under the null of (i) above would be

    Yi = α0 + α3X3i + ui,   i = 1, 2, ..., n

The formula for the F test is then

    F = [(RSSR − RSSU)/d] / [RSSU/(n − k)]                             (5)

which is distributed as F with d and (n − k) degrees of freedom, where

    RSSR is the RSS from the restricted equation
    RSSU is the RSS from the unrestricted equation
    d is the number of restrictions
    n is the number of observations
    k is the number of coefficients in the unrestricted equation.

Thus if the null was as in (i) above, d = 2 and k = 4. F tests can be used to test any linear restriction on the coefficients of a regression model. It is important to remember that (5) only has the appropriate F distribution under the null when certain assumptions about the regression model are true. These are:

    (i) The model as described under H0 is true.
    (ii) E(ui uj) = σ² for i = j (homoscedasticity) and E(ui uj) = 0 for i ≠ j (no serial correlation).
    (iii) Either u ~ N(0, σ²) or the sample is sufficiently large for the central limit theorem to apply.

3. Production Function Example

In Notes 2, a Cobb-Douglas production function was estimated on a sample of annual data for UK manufacturing. The OLS results were these:

    yt = α̂0 + α̂1 kt + α̂2 lt + et,   t = 1, 2, ..., n                  (6)
        (5.030)   (0.499)   (0.51)

standard errors in brackets, n = 24, et is the regression residual, R² = 0.163, s = 0.078, RSS = 0.1283, F = 2.04. A number of diagnostic statistics have now been included.
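Formula (5) is simple enough to wrap as a small helper function. The sums of squares below are hypothetical and chosen purely for illustration.

```python
def f_statistic(rss_r, rss_u, d, n, k):
    """F statistic of formula (5): d restrictions, n observations,
    k coefficients in the unrestricted equation."""
    return ((rss_r - rss_u) / d) / (rss_u / (n - k))

# Hypothetical numbers, for illustration only:
f = f_statistic(rss_r=12.0, rss_u=10.0, d=2, n=30, k=4)
# numerator (12 - 10)/2 = 1.0; denominator 10/26, so f = 2.6
assert abs(f - 2.6) < 1e-9
```

The computed value would then be compared with the critical value of F(d, n − k) at the chosen significance level.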
R² indicates that, by the standard of time series regressions, the regression line explains a comparatively small proportion of the variation in log output. This is not surprising given that neither slope coefficient is estimated very precisely (both estimated slopes have comparatively large standard errors).

s is an estimate of the standard error of the residuals. It is sometimes called the standard error of the regression. It is calculated in the following way:

    s = sqrt( Σet² / (n − k) )

In models where all the variables are measured in logs (as here), s has the interpretation that it is a measure of the size of the average residual as a percentage of the dependent variable. If this regression were used to predict the value of log output over the sample period, then the prediction would on average be wrong by 7.81 per cent.

RSS is the residual sum of squares. It has a straightforward relationship with s above; you should check that it does.

F is the F statistic; see section 4 below.

Now the reason why we were interested in estimating this Cobb-Douglas production function was to test the hypothesis of constant returns to scale. The model is

    y = A + α1 k + α2 l + u

where y = log(Y), A = log(α0), k = log(K), l = log(L), and u is the random disturbance with E(u | k, l) = 0. The hypothesis of constant returns to scale is α1 + α2 = 1. This is the sort of hypothesis which the F test is designed for. To calculate the appropriate F statistic we need RSSU and RSSR. We already have RSSU from equation (6) above; we need RSSR. In practice this can often be computed by whatever computer programme we are using (see Exercise 7). In this example RSSR turns out to be larger than RSSU, as it must be: if it were not, there would be something wrong with the calculations. We can now compute the F statistic using equation (5) above:

    F = [(RSSR − RSSU)/1] / [RSSU/21]

This F statistic has 1 and 21 degrees of freedom. The critical value of F(1, 21) at 95% is 4.32.
Thus we cannot reject the null hypothesis that α1 + α2 = 1. As we have seen in Notes 2, the 95% confidence interval for α1 included one, so it is perhaps not a surprise that we cannot reject the null of constant returns to scale.

4. Tests of Significance

Often when a regression model is estimated, the investigator examines each of the estimated coefficients to see if they are significant. This means testing the null hypothesis that the coefficient is zero. The test statistic is

    (β̂ − 0) / s.e.(β̂) = β̂ / s.e.(β̂)

which has a t distribution with (n − k) degrees of freedom. This is often called the t ratio, and it is sometimes given in regression results in brackets under the estimated coefficients instead of the standard error.

It is important to realize that it can be misleading to focus exclusively on the t ratio. A t ratio may be less than its critical value (and thus the null is not rejected) because the standard error is large, even though the point estimate β̂ is also comparatively large. On another occasion the point estimate may be comparatively small (0.002, say) but, because the standard error is even smaller, the estimated coefficient may be significant (i.e. the null that the coefficient is zero is rejected). If, in the context of the model, 0.002 is a very small effect, the fact that this particular coefficient is significant may not be very interesting. It is important to remember that a confidence interval may give more information about the range of possible values of the coefficient than a test of significance.

Just as the t ratio tests the significance of one coefficient in a regression, there is an F test which tests the significance of all the slope coefficients in the regression. Returning to the example given above, suppose the model is

    Yi = α0 + α1X1i + α2X2i + α3X3i + ui,   i = 1, 2, ..., n           (7)

We can test the joint null

    H0: α1 = 0, α2 = 0, α3 = 0   against   H1: αi ≠ 0 for at least one i = 1, 2, 3.

The test statistic uses the formula (5) above. In this case RSSU is the RSS from the OLS estimate of equation (7).
The restricted equation takes the form

    Yi = α0 + ui,   i = 1, 2, ..., n
and the RSS from this equation is the TSS from (7). This gives the F statistic a particular form which is related to the R² from the unrestricted equation:

    F = [(TSS − RSSU)/(k − 1)] / [RSSU/(n − k)] = [R²/(k − 1)] / [(1 − R²)/(n − k)]

Often the F statistic is given as a diagnostic statistic with the regression results. For an example of this see the estimates of the production function, equation (6) above. There the F statistic is given as 2.04. This has a distribution with 2 and 21 degrees of freedom. The critical value at 95% is 3.47 (approximately). Thus we do not reject the hypothesis that both α1 and α2 are zero. Again this is not very surprising, since the 95% confidence intervals for both these coefficients included zero (see Notes 2).

The link between R² and the F statistic provides a further interpretation of R². If R² is comparatively high, it is more likely that the null that all the slope coefficients in the regression are zero will be rejected. If it is comparatively low, then it is more likely that this null will not be rejected. Notice that the F statistic (like all F statistics) depends on the number of observations (n) and the number of coefficients in the model (k). R² does not depend on n or k and thus can be artificially boosted as described in section 1 above. The reservations concerning the use of the t ratio given above also apply to the F statistic.

5. Chow Tests

A special and useful application of the F test procedure is to test for a structural break in time series models. A structural break is when the coefficients of the model change. Thus suppose we have the following model:

    Yi = α0 + α1X1i + α2X2i + ui,   i = 1, 2, ..., T                   (8)

It is believed that the coefficients may have changed at some point in the sample, say after period s. If this were true we would have

    Yi = β0 + β1X1i + β2X2i + ui,   i = 1, 2, ..., s                   (9)

and

    Yi = γ0 + γ1X1i + γ2X2i + ui,   i = s + 1, ..., T                  (10)

Note that the hypotheses are

    H0: no structural break after observation s
    H1: structural break after observation s
Thus the restricted model is model (8), and the OLS estimates of that model provide RSSR. The unrestricted model is equations (9) and (10); RSSU is the sum of the RSS for equation (9) and the RSS for equation (10). We then apply the formula for the F test as given in (5). In this case it becomes

    F = [(RSSR − RSSU)/k] / [RSSU/(T − 2k)]

which is distributed as F with k and (T − 2k) degrees of freedom, or in the example above, with 3 and (T − 6) degrees of freedom. Note that this test requires that the point s is so placed in the sample that there are enough observations both before and after s for the model to be estimated in each part. If this is not true, another form of the test is available (see the textbooks). The test assumes that the variance of the disturbances is the same in both parts of the sample. It is worthwhile checking that the estimates of the variance of the disturbances from each part of the sample are not different by an order of magnitude. If the estimated variances are different by that kind of margin, the Chow test will probably not be valid.

6. Examples of F tests

The following estimates were made with a sample of quarterly observations on UK data. The dependent variable is the log of consumers' expenditure on consumption goods at 1985 prices; yd is the log of disposable income at 1985 prices.

                 (i)        (ii)       (iii)      (iv)
    Constant    (0.176)    (0.181)    (0.514)    (0.337)
    yd t        (0.143)    (0.017)    (0.1)      (0.186)
    yd t-1                 (0.151)    (0.13)     (0.18)
    yd t-2                 (0.144)    (0.10)     (0.18)
    R²
    s
    RSS
    n

Standard errors given in brackets.
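The Chow-test arithmetic of section 5 reduces to a few lines once the three residual sums of squares are in hand. A sketch with hypothetical values, not the UK consumption figures above:

```python
def chow_f(rss_pooled, rss_1, rss_2, t_obs, k):
    """Chow F statistic: pooled (restricted) regression vs. the two
    subsample (unrestricted) regressions; k coefficients per equation."""
    rss_u = rss_1 + rss_2
    return ((rss_pooled - rss_u) / k) / (rss_u / (t_obs - 2 * k))

# Hypothetical sums of squares, for illustration only:
f = chow_f(rss_pooled=20.0, rss_1=8.0, rss_2=7.0, t_obs=100, k=4)
# ((20 - 15)/4) / (15/92) = 115/15, about 7.67
assert abs(f - 115.0 / 15.0) < 1e-9
```

With real data, rss_1 and rss_2 come from OLS on each subsample and rss_pooled from OLS on the full sample; the statistic is then compared with the critical value of F(k, T − 2k).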
The model being estimated here can be written

    ct = a0 + a1 ydt + a2 ydt-1 + a3 ydt-2 + ut,   t = 1, 2, ..., T

where ct is the log of consumers' expenditure at constant prices and ydt is the log of disposable income at constant prices. We will use the F test to test two different hypotheses.

(i) H0: a2 = 0, a3 = 0 against H1: either a2 ≠ 0 or a3 ≠ 0.

Using the estimates given above,

    F = [(RSSR − RSSU)/2] / [RSSU/107] = 4.3

This has an F distribution with 2 and 107 degrees of freedom. The 95% critical value is 3.09 (approximately). Thus we reject the null hypothesis.

(ii) Using the estimates given above we can also test for structural change after the 50th observation.

    H0: no structural break after observation 50
    H1: structural break after observation 50

For this test RSSU is the sum of the RSSs from the two subsample regressions. Thus

    F = [(RSSR − RSSU)/4] / [RSSU/103] = 3.91

This has an F distribution with 4 and 103 degrees of freedom. The 95% critical value is 2.46 (approximately). Thus we reject the null hypothesis that this form of the consumption function did not have a structural break after the 50th observation.

It is also important to check that the variance of the disturbances did not change at the break point. The estimate of the variance for the first part of the sample is the first subsample RSS divided by 46; in the second half of the sample it is 0.051/57 ≈ 0.0009. Although these estimates are not identical, they do not indicate that the variance has substantially changed.

David Winter
March 2000
More informationNote 2 to Computer class: Standard misspecification tests
Note 2 to Computer class: Standard misspecification tests Ragnar Nymoen September 2, 2013 1 Why misspecification testing of econometric models? As econometricians we must relate to the fact that the
More informationRegression Analysis: A Complete Example
Regression Analysis: A Complete Example This section works out an example that includes all the topics we have discussed so far in this chapter. A complete example of regression analysis. PhotoDisc, Inc./Getty
More informationHence, multiplying by 12, the 95% interval for the hourly rate is (965, 1435)
Confidence Intervals for Poisson data For an observation from a Poisson distribution, we have σ 2 = λ. If we observe r events, then our estimate ˆλ = r : N(λ, λ) If r is bigger than 20, we can use this
More information3.6: General Hypothesis Tests
3.6: General Hypothesis Tests The χ 2 goodness of fit tests which we introduced in the previous section were an example of a hypothesis test. In this section we now consider hypothesis tests more generally.
More informationMultiple Linear Regression
Multiple Linear Regression A regression with two or more explanatory variables is called a multiple regression. Rather than modeling the mean response as a straight line, as in simple regression, it is
More informationRegression Analysis: Basic Concepts
The simple linear model Regression Analysis: Basic Concepts Allin Cottrell Represents the dependent variable, y i, as a linear function of one independent variable, x i, subject to a random disturbance
More informationLecture 18 Linear Regression
Lecture 18 Statistics Unit Andrew Nunekpeku / Charles Jackson Fall 2011 Outline 1 1 Situation  used to model quantitative dependent variable using linear function of quantitative predictor(s). Situation
More informationChapter 11: Linear Regression  Inference in Regression Analysis  Part 2
Chapter 11: Linear Regression  Inference in Regression Analysis  Part 2 Note: Whether we calculate confidence intervals or perform hypothesis tests we need the distribution of the statistic we will use.
More informationfifty Fathoms Statistics Demonstrations for Deeper Understanding Tim Erickson
fifty Fathoms Statistics Demonstrations for Deeper Understanding Tim Erickson Contents What Are These Demos About? How to Use These Demos If This Is Your First Time Using Fathom Tutorial: An Extended Example
More informationRockefeller College University at Albany
Rockefeller College University at Albany PAD 705 Handout: Hypothesis Testing on Multiple Parameters In many cases we may wish to know whether two or more variables are jointly significant in a regression.
More informationThe scatterplot indicates a positive linear relationship between waist size and body fat percentage:
STAT E150 Statistical Methods Multiple Regression Three percent of a man's body is essential fat, which is necessary for a healthy body. However, too much body fat can be dangerous. For men between the
More informationA Logic of Prediction and Evaluation
5  Hypothesis Testing in the Linear Model Page 1 A Logic of Prediction and Evaluation 5:12 PM One goal of science: determine whether current ways of thinking about the world are adequate for predicting
More informationECON 142 SKETCH OF SOLUTIONS FOR APPLIED EXERCISE #2
University of California, Berkeley Prof. Ken Chay Department of Economics Fall Semester, 005 ECON 14 SKETCH OF SOLUTIONS FOR APPLIED EXERCISE # Question 1: a. Below are the scatter plots of hourly wages
More informationConditional guidance as a response to supply uncertainty
1 Conditional guidance as a response to supply uncertainty Appendix to the speech given by Ben Broadbent, External Member of the Monetary Policy Committee, Bank of England At the London Business School,
More informationPart 2: Analysis of Relationship Between Two Variables
Part 2: Analysis of Relationship Between Two Variables Linear Regression Linear correlation Significance Tests Multiple regression Linear Regression Y = a X + b Dependent Variable Independent Variable
More informationInference in Regression Analysis. Dr. Frank Wood
Inference in Regression Analysis Dr. Frank Wood Inference in the Normal Error Regression Model Y i = β 0 + β 1 X i + ɛ i Y i value of the response variable in the i th trial β 0 and β 1 are parameters
More informationRegression analysis in the Assistant fits a model with one continuous predictor and one continuous response and can fit two types of models:
This paper explains the research conducted by Minitab statisticians to develop the methods and data checks used in the Assistant in Minitab 17 Statistical Software. The simple regression procedure in the
More informationMultiple Regression in SPSS STAT 314
Multiple Regression in SPSS STAT 314 I. The accompanying data is on y = profit margin of savings and loan companies in a given year, x 1 = net revenues in that year, and x 2 = number of savings and loan
More information17.0 Linear Regression
17.0 Linear Regression 1 Answer Questions Lines Correlation Regression 17.1 Lines The algebraic equation for a line is Y = β 0 + β 1 X 2 The use of coordinate axes to show functional relationships was
More information2013 MBA Jump Start Program. Statistics Module Part 3
2013 MBA Jump Start Program Module 1: Statistics Thomas Gilbert Part 3 Statistics Module Part 3 Hypothesis Testing (Inference) Regressions 2 1 Making an Investment Decision A researcher in your firm just
More informationInstitute of Actuaries of India Subject CT3 Probability and Mathematical Statistics
Institute of Actuaries of India Subject CT3 Probability and Mathematical Statistics For 2015 Examinations Aim The aim of the Probability and Mathematical Statistics subject is to provide a grounding in
More informationBasic Econometrics Tools Correlation and Regression Analysis
Basic Econometrics Tools Correlation and Regression Analysis Christopher Grigoriou Executive MBA HEC Lausanne 2007/2008 1 A collector of antique grandfather clocks wants to know if the price received for
More informationRegression stepbystep using Microsoft Excel
Step 1: Regression stepbystep using Microsoft Excel Notes prepared by Pamela Peterson Drake, James Madison University Type the data into the spreadsheet The example used throughout this How to is a regression
More informationThe Multiple Regression Model: Hypothesis Tests and the Use of Nonsample Information
Chapter 8 The Multiple Regression Model: Hypothesis Tests and the Use of Nonsample Information An important new development that we encounter in this chapter is using the F distribution to simultaneously
More informationHypothesis Testing in the Classical Regression Model
LECTURE 5 Hypothesis Testing in the Classical Regression Model The Normal Distribution and the Sampling Distributions It is often appropriate to assume that the elements of the disturbance vector ε within
More information