Analisi Statistica per le Imprese


1 Analisi Statistica per le Imprese
Dip. Economia Politica e Statistica
4. The perspective of internal efficiency
2 Objective
The search for internal efficiency: the task is to determine the correct use of resources, given an expected production volume.
3 According to the BS
The interventions to be carried out will have to link internal efficiency, growth of production/revenues, investments, and the pursuit of customer satisfaction. Hence, in the view represented by the system just described, internal efficiency is the first step of a simultaneous estimation of the whole system.
4 Formally
Consider the first of the relations of the system, where the variable X can represent quantities such as: personnel expenses, raw-material expenses, investment (capital) expenses, that is, whatever concerns the production factors needed to obtain a predetermined level of production. Let us now see which statistical methods can be applied to reach our objectives.
5 Deterministic relationship
A deterministic relationship f between a dependent variable Y and an independent variable X is expressed as:

Y = f(X)   (1)

In the simplest case, when the relationship is linear and k = 1 (a single regressor), it is commonly expressed as follows:

Y = β_0 + β_1 X   (2)

This relationship can be represented graphically in a Cartesian coordinate system as a line with intercept β_0 and slope β_1.
6 Stochastic relationship
Deterministic relationships are however very rare, due to randomness. The true relationship between a dependent variable Y and an independent variable X is approximated by the stochastic regression model as follows:

Y = f(X) + ε   (3)

The term ε is supposed to incorporate the approximation error. The introduction of such an element identifies the stochastic relationship. If f is a linear function, the regression model is linear and is expressed as:

Y = β_0 + β_1 X + ε   (4)

The terms (β_0, β_1) are the regression parameters or model coefficients.
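The stochastic model in equation (4) can be illustrated by simulation. The sketch below uses arbitrary illustrative values β_0 = 1, β_1 = 2 and a normal error term; none of these numbers come from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
beta0, beta1 = 1.0, 2.0            # hypothetical parameter values
X = rng.uniform(0, 10, n)          # independent variable
eps = rng.normal(0, 1.5, n)        # random component epsilon
Y = beta0 + beta1 * X + eps        # stochastic model Y = beta0 + beta1*X + eps

# Without eps the points would lie exactly on a line (the deterministic case)
deterministic_Y = beta0 + beta1 * X
print(np.max(np.abs(Y - deterministic_Y)))   # size of the largest perturbation
```

Plotting (X, Y) would show a cloud of points scattered around the line, which is exactly the situation the scatter plots below depict.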
7 Regression analysis
Regression analysis is a method to study functional relationships between variables. The relationship is expressed as an equation or model that links the dependent (fitted or predicted) variable to one or more independent (predictor) variables.
Example: suppose the customers of a bank have been classified according to two characters: average disposable income and monthly frequency of use of the Bancomat (ATM) service. The relationship we want to verify is whether the monthly frequency of ATM use Y (dependent variable) varies as income X varies. Observing a sample of size n we will have n observations for each variable.
8 Simple linear regression
The relationship between the dependent variable and the independent variable is expressed by the following linear model:

Y = β_0 + β_1 X + ε   (5)

The parameters or regression coefficients are (β_0, β_1) and ε is the random component of the model. For each observation the model can be written as:

Y_i = β_0 + β_1 X_i + ε_i,   i = 1, ..., n   (6)
9 Scatter plot
[Figure: scatter plot of the observations (x_1, y_1), ..., (x_6, y_6) in the (x, y) plane.]
10 Scatter plot
[Figure: the same scatter plot with candidate fitted lines Ŷ = β̂_0 + β̂_1 X, Ŷ = a + bX, Ŷ = c + dX.]
11 Optimal parameters
One must then find the equation that best approximates the observations with coordinates (X, Y). This equation is specified by the following linear model:

Ŷ = β̂_0 + β̂_1 X   (7)

Hence one must find the optimal parameters (β̂_0, β̂_1).
12 Ordinary Least Squares (OLS)
The method of Ordinary Least Squares (OLS) minimizes the following auxiliary function:

Σ_{i=1}^n (Y_i − Ŷ_i)² = Σ_{i=1}^n (Y_i − β̂_0 − β̂_1 X_i)²   (8)

The function is minimized for the following parameter values:

β̂_1 = Σ_{i=1}^n (X_i − X̄)(Y_i − Ȳ) / Σ_{i=1}^n (X_i − X̄)² = Σ_{i=1}^n x_i y_i / Σ_{i=1}^n x_i²   (9)

β̂_0 = Ȳ − β̂_1 X̄   (10)
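Equations (9) and (10) can be computed directly. Below is a minimal sketch on a small hypothetical data set (the numbers are invented for illustration), cross-checked against NumPy's least-squares fit:

```python
import numpy as np

# Hypothetical sample, for illustration only
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Deviations from the means: x_i = X_i - Xbar, y_i = Y_i - Ybar
x = X - X.mean()
y = Y - Y.mean()

# OLS estimates from equations (9)-(10)
beta1_hat = np.sum(x * y) / np.sum(x ** 2)
beta0_hat = Y.mean() - beta1_hat * X.mean()

# Cross-check against NumPy's degree-1 polynomial fit
slope, intercept = np.polyfit(X, Y, 1)
assert np.isclose(beta1_hat, slope) and np.isclose(beta0_hat, intercept)
print(beta1_hat, beta0_hat)        # 1.96 and 0.14 for this sample
```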
13 Ordinary Least Squares (OLS)
where

x_i = X_i − X̄   (11)
y_i = Y_i − Ȳ   (12)
X̄ = (1/n) Σ_{i=1}^n X_i   (13)
Ȳ = (1/n) Σ_{i=1}^n Y_i   (14)
14 Ordinary Least Squares (OLS)
A stochastic model is much more realistic than a deterministic model when the observations are not perfectly aligned. The introduction of ε introduces difficulties but enhances the results, which become more useful and deeper in meaning.
First consideration: justification. How can we justify the introduction of the stochastic component?
1 There may be errors in the model
2 The number of covariates is limited
3 Randomness from empirical sampling
4 Measurement errors
15 Ordinary Least Squares (OLS)
Second consideration: random variable. The introduction of the error term allows the identification of Y as a random variable (r.v.). This implies that every value expressed as a function of Y is also a r.v.
16 Ordinary Least Squares (OLS)
Third consideration: assumptions
1 The functional relationship must be linear
2 The independent variable has a deterministic nature
3 The error term has a null expected value: E[ε_i] = 0
4 The error term is homoskedastic: Var[ε_i] = σ²
5 The error terms are not autocorrelated: Cov[ε_i, ε_j] = 0 for i ≠ j
17 Credible assumptions
1 seems restrictive, but many nonlinear relationships can be expressed as relationships that are linear in the parameters through appropriate transformations.
2 is possibly the most unrealistic in socioeconomic analysis, but it guarantees important properties, since E[X_i ε_i] = X_i E[ε_i] = 0.
3 guarantees that the most likely error value is zero, as in this case the mean value is also the modal value.
4 can be unlikely in cross-section observations, and its failure can cause serious difficulties. We will focus on this later.
5 can be unlikely in time-series observations, and its failure can cause serious difficulties.
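As an example of the transformations mentioned under assumption 1, consider a power model Y = aX^b (the parameter values below are invented for illustration). It is nonlinear in X, but taking logarithms gives log Y = log a + b log X, which is linear in the parameters, so OLS applies to the transformed data:

```python
import numpy as np

rng = np.random.default_rng(1)
a, b = 2.0, 0.5                      # hypothetical true parameters
X = rng.uniform(1.0, 10.0, 200)
# A multiplicative error keeps the model exactly linear after the log transform
Y = a * X ** b * np.exp(rng.normal(0.0, 0.05, 200))

# OLS on (log X, log Y): the slope estimates b, the intercept estimates log(a)
b_hat, log_a_hat = np.polyfit(np.log(X), np.log(Y), 1)
print(b_hat, np.exp(log_a_hat))      # close to the true values 0.5 and 2.0
```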
18 We will now examine the properties of the OLS estimators of the unknown parameters of the regression function. The estimates are obtained for both (Y, X) from n observations extracted with probabilistic sampling from a target population. If we extracted a new sample from the same population we would most likely obtain different results, hence the parameter estimates would be different. For this reason we can say that the estimates are tied to a r.v. When one writes β̂ one means:
1 the slope of the regression line, estimated from a set of observations
2 the estimator, which has a specific probability distribution
19 Properties
Given assumptions 1–5, the OLS estimators are the Best Linear Unbiased Estimators (BLUE). This means that they are the most efficient (minimum-variance) and not systematically distorted estimators within the class of linear estimators. Here we just prove that the estimator is linear and unbiased.
20 Linearity
β̂_1 = Σ_{i=1}^n (X_i − X̄)(Y_i − Ȳ) / Σ_{i=1}^n (X_i − X̄)² = Σ x_i y_i / Σ x_i²   (15)

β̂_1 = Σ w_i y_i,   with weights   w_i = x_i / Σ x_i²   (16)

The weights satisfy

Σ w_i = 0,   Σ w_i x_i = 1   (17)

Similarly it can be proven for β̂_0.
21 Unbiasedness
β̂_1 = Σ w_i y_i = Σ w_i (Y_i − Ȳ) = Σ w_i Y_i − Ȳ Σ w_i,   where Σ w_i = 0   (18)

β̂_1 = Σ w_i [β_0 + β_1 X_i + ε_i] = β_0 Σ w_i + β_1 Σ w_i X_i + Σ w_i ε_i,   with Σ w_i = 0 and Σ w_i X_i = 1   (19)

β̂_1 = 0 + β_1 + Σ w_i ε_i   (20)

E[β̂_1] = E[β_1 + Σ w_i ε_i] = β_1 + Σ w_i E[ε_i] = β_1,   since E[ε_i] = 0   (21)

Similarly it can be proven for β̂_0.
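The unbiasedness shown in (18)–(21) can be checked by simulation: over many repeated samples drawn with the same fixed X, the average of β̂_1 should approach the true β_1. The parameter values below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)
beta0, beta1 = 1.0, 2.0              # hypothetical true parameters
X = np.linspace(0.0, 10.0, 50)       # fixed (deterministic) regressor, as in assumption 2
x = X - X.mean()

estimates = []
for _ in range(2000):
    eps = rng.normal(0.0, 1.0, 50)   # E[eps]=0, homoskedastic, uncorrelated
    Y = beta0 + beta1 * X + eps
    estimates.append(np.sum(x * (Y - Y.mean())) / np.sum(x ** 2))

print(np.mean(estimates))            # averages out very close to beta1 = 2
```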
22 Let us add to hypotheses 1–5 the following: the error term has a Normal distribution, so ε_i ∼ N(0, σ²) and therefore Y_i ∼ N(β_0 + β_1 X_i, σ²).
23 OLS distribution
As β̂ is a weighted average of the Y_i, and the Y_i are normally distributed, β̂ is also normally distributed:

β̂_1 ∼ N( β_1, σ_ε² / Σ x_i² )   (22)

β̂_0 ∼ N( β_0, σ_ε² Σ X_i² / (n Σ x_i²) )   (23)

The OLS estimators coincide with the Maximum Likelihood (ML) estimators. The Central Limit Theorem ensures that, even if the Y_i were not normally distributed, under very nonrestrictive conditions the asymptotic distribution would still hold.
24 Residual variance estimator
This estimator, called the residual variance, can be obtained through Maximum Likelihood computations (here omitted):

s² = σ̂_ε² = Σ ε̂_i² / (n − 2) = Σ (Y_i − β̂_0 − β̂_1 X_i)² / (n − 2)   (24)

ε̂_i = Y_i − Ŷ_i   (residual)   (25)

The residual variance is an unbiased and consistent estimator of the variance of the error term. The denominator is n − 2 because that is the number of degrees of freedom, calculated as the number of observations minus the number of parameters required in the computation: here both (β̂_0, β̂_1) are needed.
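A small numerical sketch of equation (24), on invented data, showing the n − 2 denominator:

```python
import numpy as np

# Hypothetical data, for illustration only
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
Y = np.array([1.2, 2.3, 2.9, 4.2, 5.1, 5.8])
n = len(X)

x = X - X.mean()
b1 = np.sum(x * (Y - Y.mean())) / np.sum(x ** 2)   # equation (9)
b0 = Y.mean() - b1 * X.mean()                      # equation (10)

resid = Y - (b0 + b1 * X)            # residuals eps_hat_i = Y_i - Yhat_i
s2 = np.sum(resid ** 2) / (n - 2)    # two degrees of freedom lost for (b0, b1)
print(s2)
```

Note that the OLS residuals sum to zero by construction, so dividing by n − 2 rather than n is what makes s² unbiased.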
25 Further observations
Var(β̂_1) is a direct function of Var(ε_i): strongly variable errors lower precision and reliability.
Var(β̂_1) is an inverse function of the variability of X_i: covariates concentrated in a narrow interval worsen the quality of β̂_1.
26 OLS estimators' standard errors
Once we have estimated the variance of the stochastic term in the regression model, we may substitute it into the OLS estimators' variances to obtain the standard errors (s.e.) of the estimators:

s²_β̂_1 = s² / Σ x_i²   (26)

s²_β̂_0 = s² ( Σ X_i² / (n Σ x_i²) )   (27)
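Equations (26)–(27) can be computed in the same way, again on invented data (the square roots of the quantities below are the standard errors):

```python
import numpy as np

# Hypothetical data, for illustration only
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
Y = np.array([1.2, 2.3, 2.9, 4.2, 5.1, 5.8])
n = len(X)

x = X - X.mean()
b1 = np.sum(x * (Y - Y.mean())) / np.sum(x ** 2)
b0 = Y.mean() - b1 * X.mean()
s2 = np.sum((Y - b0 - b1 * X) ** 2) / (n - 2)      # residual variance, eq. (24)

se_b1 = np.sqrt(s2 / np.sum(x ** 2))                          # s.e. from eq. (26)
se_b0 = np.sqrt(s2 * np.sum(X ** 2) / (n * np.sum(x ** 2)))   # s.e. from eq. (27)
print(se_b1, se_b0)
```

These standard errors are what one would use to build t-statistics and confidence intervals for the coefficients.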
27 References
Pindyck R. and Rubinfeld D., 1998. Econometric Models and Economic Forecasts (4th ed.).
Bracalente, Cossignani, Mulas, 2009. Statistica Aziendale.
More information7. Autocorrelation (Violation of Assumption #B3)
7. Autocorrelation (Violation of Assumption #B3) Assumption #B3: The error term u i is not autocorrelated, i.e. Cov(u i, u j ) = 0 for all i = 1,..., N and j = 1,..., N where i j Where do we typically
More informationMS&E 226: Small Data. Lecture 17: Additional topics in inference (v1) Ramesh Johari
MS&E 226: Small Data Lecture 17: Additional topics in inference (v1) Ramesh Johari ramesh.johari@stanford.edu 1 / 34 Warnings 2 / 34 Modeling assumptions: Regression Remember that most of the inference
More informationAnalysis of Environmental Data Problem Set Conceptual Foundations: Hy p o th e sis te stin g c o n c e p ts
Analysis of Environmental Data Problem Set Conceptual Foundations: Hy p o th e sis te stin g c o n c e p ts Answer any one of questions 14, AND answer question 5: 1. Consider a hypothetical study of wood
More information, then the form of the model is given by: which comprises a deterministic component involving the three regression coefficients (
Multiple regression Introduction Multiple regression is a logical extension of the principles of simple linear regression to situations in which there are several predictor variables. For instance if we
More informationThe method of least squares
The method of least squares 1 Contents 1. Introduction. Statistical interpretation of the curve fitting problem 3. The parametric model 4. The stochastic model 5. The least squares estimators of parameters
More informationPaired Differences and Regression
Paired Differences and Regression Students sometimes have difficulty distinguishing between paired data and independent samples when comparing two means. One can return to this topic after covering simple
More informationSection I: Multiple Choice Select the best answer for each question.
Chapter 15 (Regression Inference) AP Statistics Practice Test (TPS 4 p796) Section I: Multiple Choice Select the best answer for each question. 1. Which of the following is not one of the conditions that
More informationSimultaneous Equations
Simultaneous Equations 1 Motivation and Examples Now we relax the assumption that EX u 0 his wil require new techniques: 1 Instrumental variables 2 2 and 3stage least squares 3 Limited LIML and full
More informationCov(x, y) V ar(x)v ar(y)
Simple linear regression Systematic components: β 0 + β 1 x i Stochastic component : error term ε Y i = β 0 + β 1 x i + ε i ; i = 1,..., n E(Y X) = β 0 + β 1 x the central parameter is the slope parameter
More information