Regression III: Advanced Methods


1 Lecture 16: Generalized Additive Models. Regression III: Advanced Methods. Bill Jacoby, Michigan State University
2 Goals of the Lecture: introduce additive models; explain how they extend simple nonparametric regression (i.e., local polynomial regression); discuss estimation using backfitting; explain how to interpret their results; and conclude with some examples of additive models applied to real social science data.
3 Limitations of the Multiple Nonparametric Model. Recall that the general nonparametric model (whether fit by a lowess smooth or a smoothing spline) takes the form Y_i = f(X_i1, X_i2, ..., X_ik) + ε_i. As we see here, the multiple nonparametric model allows all possible interactions among the independent variables in their effects on Y; we specify a jointly conditional functional form. This model is ideal under the following circumstances: 1. There are no more than two predictors. 2. The pattern of nonlinearity is complicated and thus cannot easily be modelled with a simple transformation or polynomial regression. 3. The sample size is sufficiently large.
4 Limitations of the Multiple Nonparametric Model (2). The general nonparametric model becomes unstable and impossible to interpret as we add more explanatory variables, however: 1. In the lowess case, for example, as the number of variables increases the window span must become wider to ensure that each local regression has enough cases; this can create substantial bias (the curve becomes too smooth). 2. General nonparametric regression is impossible to interpret when there are more than two predictors: there are no coefficients, and we cannot graph effects in more than three dimensions. These limitations lead us to additive models.
5 Additive Regression Models. Additive regression models essentially apply local regression to low-dimensional projections of the data. The nonparametric additive regression model is Y_i = α + f_1(X_i1) + f_2(X_i2) + ... + f_k(X_ik) + ε_i. The f_j are arbitrary functions estimated from the data; the errors ε_i are assumed to have constant variance and a mean of 0. Additive models build an estimate of the regression surface by combining a collection of one-dimensional functions. The estimated functions f_j are the analogues of the coefficients in linear regression.
6 Additive Regression Models (2). The assumption that the contribution of each covariate is additive is analogous to the assumption in linear regression that each component enters the model separately. Recall that the linear regression model is Y_i = α + B_1 X_i1 + ... + B_k X_ik + ε_i, where the B_j represent linear effects. For the additive model we instead model Y as an additive combination of arbitrary functions of the Xs, Y_i = α + f_1(X_i1) + ... + f_k(X_ik) + ε_i, where the f_j are arbitrary functions that can be estimated by lowess or smoothing splines.
7 Additive Regression Models (3). Now comes the question: how do we find these arbitrary functions? If the Xs were completely independent (which will not be the case), we could simply estimate each functional form with a separate nonparametric regression of Y on each X. Similarly, in linear regression, when the Xs are completely uncorrelated the partial regression slopes are identical to the marginal regression slopes. Since the Xs are related, however, we need to proceed in another way, in effect removing the effects of the other predictors, which are unknown before we begin. We use a procedure called backfitting to find each curve while controlling for the effects of the others.
8 Estimation and Backfitting. Suppose that we had a two-predictor additive model, Y_i = α + f_1(x_i1) + f_2(x_i2) + ε_i. If we (unrealistically) knew the partial-regression function f_2 but not f_1, we could rearrange the equation to solve for f_1: Y_i - f_2(x_i2) = α + f_1(x_i1) + ε_i. In other words, smoothing Y_i - f_2(x_i2) against x_i1 produces an estimate of α + f_1(x_i1). Simply put, knowing one function allows us to find the other. In the real world, however, we don't know either, so we must proceed initially with estimates.
9 Estimation and Backfitting (2). 1. We start by expressing the variables in mean-deviation form so that the partial-regression functions sum to zero, eliminating the individual intercepts. 2. We then take preliminary estimates of each function from a least-squares regression of Y on the Xs. 3. These estimates are used as step (0) of an iterative estimation process. 4. We then find the partial residuals for X_1, which remove from Y its linear relationship to X_2 but retain the relationship between Y and X_1.
10 Estimation and Backfitting (3). The partial residuals for X_1 are then E_i^(1) = Y_i - f̂_2(x_i2), where f̂_2 is the current estimate of f_2. 5. The same procedure as in step 4 is carried out for X_2. 6. Next we smooth these partial residuals against their respective Xs, providing a new estimate of each function, f̂_j = S_j E^(j), where S_j is the (n × n) smoother transformation matrix for X_j, which depends only on the configuration of the X_ij for the jth predictor.
11 Estimation and Backfitting (4). This process of finding new estimates of the functions by smoothing the partial residuals is repeated until the partial functions converge; that is, we stop when the estimates of the smooth functions stabilize from one iteration to the next. When this process is done, we have estimates of s_j(X_ij) for every value of X_j. More importantly, we will have reduced a multiple regression to a series of two-dimensional partial-regression problems, which makes interpretation easy: since each partial regression is only two-dimensional, the functional forms can be shown in two-dimensional plots of the partial effect of each X_j on Y. In other words, perspective plots are no longer necessary unless we include an interaction between two smooth terms.
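To make the algorithm concrete, here is a minimal backfitting sketch in R for two predictors, using lowess() as the smoother. The simulated data, object names, tolerance, and iteration cap are illustrative assumptions, not code from the lecture.

  # Minimal backfitting sketch for a two-predictor additive model (illustrative only)
  set.seed(42)
  n  <- 200
  x1 <- runif(n)
  x2 <- runif(n) + 0.5 * x1                      # correlated predictors
  y  <- sin(2 * pi * x1) + (x2 - 1)^2 + rnorm(n, sd = 0.3)

  y.c <- y - mean(y)                             # mean-deviation form
  f1 <- rep(0, n); f2 <- rep(0, n)               # step (0): initial estimates

  for (iter in 1:25) {
    f1.old <- f1; f2.old <- f2
    e1 <- y.c - f2                               # partial residuals for x1
    f1 <- approx(lowess(x1, e1), xout = x1)$y    # smooth them against x1
    f1 <- f1 - mean(f1)                          # re-center so the partial function sums to zero
    e2 <- y.c - f1                               # partial residuals for x2
    f2 <- approx(lowess(x2, e2), xout = x2)$y    # smooth them against x2
    f2 <- f2 - mean(f2)
    if (max(abs(f1 - f1.old), abs(f2 - f2.old)) < 1e-4) break   # convergence check
  }

Plotting f1 against x1 and f2 against x2 then gives exactly the kind of partial-regression curves discussed on the following slides.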
12 Interpreting the Effects. A plot of X_j versus s_j(X_j) shows the relationship between X_j and Y holding constant the other variables in the model. Since Y is expressed in mean-deviation form, the smooth term s_j(X_j) is also centered, and thus each plot represents how Y changes relative to its mean with changes in X. Interpreting the scale of the graphs then becomes easy: the value of 0 on the Y-axis is the mean of Y. As the line moves away from 0 we add that distance to the mean (or subtract it, when the line moves in a negative direction) to obtain the fitted value. For example, if the mean is 45 and at a particular X-value (say x = 15) the curve is at s_j(X_j) = 4, the fitted value of Y, controlling for all other explanatory variables, is 45 + 4 = 49. If there are several nonparametric relationships, we can add together the effects from the separate graphs for any particular observation to find its fitted value of Y.
13 Additive Regression Models in R: Example: Canadian prestige data. Here we use the Canadian Prestige data to fit an additive model with prestige regressed on income and education. In R we use the gam function (for generalized additive models) found in the mgcv package. The gam function in mgcv fits only smoothing splines (local polynomial regression can be done in S-PLUS). The model formula takes the same form as for the glm function, except that we can now mix parametric terms and smoothed estimates: a smooth will be fit to any variable specified with the s(variable) argument. The simple R script is as follows:
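The script itself is not reproduced in the transcription; the following is a minimal sketch of such a fit, assuming the Prestige data frame supplied with the car package (the object name prestige.gam is illustrative).

  library(car)    # supplies the Canadian Prestige data frame
  library(mgcv)   # supplies gam() with s() smooth terms

  prestige.gam <- gam(prestige ~ s(income) + s(education), data = Prestige)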
14 Additive Regression Models in R: Example: Canadian prestige data (2). The summary function returns tests for each smooth, the degrees of freedom for each smooth, and an adjusted R-squared for the model. The deviance can be obtained with the deviance(model) command.
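Continuing the sketch above, the corresponding calls would be:

  summary(prestige.gam)    # approximate tests and effective df for each smooth, adjusted R-squared
  deviance(prestige.gam)   # residual deviance (for a Gaussian model, the residual sum of squares)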
15 Additive Regression Models in R: Example: Canadian prestige data (3). Again, as with other nonparametric models, we have no slope parameters to investigate (we do have an intercept, however); a plot of the regression surface is necessary.
16 Additive Regression Models in R: Example: Canadian prestige data (4). Additive model: we can see the nonlinear relationship of both education and income with prestige, but there is no interaction between them; i.e., the slope for income is the same at every value of education. We can compare this model to the general nonparametric regression model. [Perspective plot of the fitted additive surface: prestige as a function of income and education.]
17 Additive Regression Models in R: Example: Canadian prestige data (5). General nonparametric model: this fit is quite similar to the additive model, but there are some nuances, particularly in the midrange of income, that are not picked up by the additive model because the Xs do not interact. [Perspective plot of the general nonparametric fit: prestige as a function of income and education.]
18 Additive Regression Models in R: Example: Canadian prestige data (6). Perspective plots can also be made automatically using the persp.gam function. These graphs include a 95% confidence region. [Perspective plot over income and education; the red and green surfaces are +/- 2 standard errors.]
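In current versions of mgcv the corresponding function is vis.gam() (persp.gam is the name used on the slide); a sketch continuing the example above, where se = 2 requests the +/- 2 standard-error surfaces and theta just sets the viewing angle:

  vis.gam(prestige.gam, view = c("income", "education"), se = 2, theta = 35)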
19 Additive Regression Models in R: Example: Canadian prestige data (7). Since the slices of the additive regression surface in the direction of one predictor (holding the other constant) are parallel, we can graph each partial-regression function separately. This is the benefit of the additive model: we can draw as many plots as there are variables, allowing us to visualize the relationships easily. In other words, a multidimensional regression has been reduced to a series of two-dimensional partial-regression plots. To get these in R:
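The original plotting script is not shown; a minimal version of the call, with pages = 1 putting all partial-effect plots on a single page, would be:

  plot(prestige.gam, pages = 1)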
20 Additive Regression Models in R: Example: Canadian prestige data (8). [Partial-effect plots: s(income, 3.12) against income and s(education, 3.18) against education.]
21 Additive Regression Models in R: Example: Canadian prestige data (9). [Partial-effect plots of s(income, 3.12) and s(education, 3.18), as on the previous slide.]
22 R script for the previous slide.
23 Residual Sum of Squares. As was the case for smoothing splines and lowess smooths, statistical inference and hypothesis testing are based on the residual sum of squares (or the deviance, in the case of generalized additive models) and the degrees of freedom. The RSS for an additive model is defined in the usual manner, RSS = Σ_i (Y_i - Ŷ_i)², where Ŷ_i is the fitted value from the additive model. The approximate degrees of freedom, however, need to be adjusted from the regular nonparametric case because we are no longer specifying a jointly conditional functional form.
24 Degrees of Freedom. Recall that for nonparametric regression the approximate degrees of freedom equal the trace of the smoother matrix (the matrix that projects Y onto Y-hat). We extend this to the additive model: 1 is subtracted from each term's df, reflecting the constraint that each partial-regression function sums to zero (the individual intercepts have been removed). Parametric terms entered in the model each occupy a single degree of freedom, as in the linear regression case. The individual degrees of freedom are then combined into a single measure, and 1 is added to the final degrees of freedom to account for the overall constant in the model.
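In symbols, the bookkeeping described above is (a reconstruction consistent with the slide's wording; the original formula is not included in the transcription, and S_j denotes the smoother matrix for the jth smooth term):

  \mathrm{df}_j = \operatorname{tr}(\mathbf{S}_j) - 1, \qquad
  \mathrm{df}_{\text{model}} = 1 + \sum_{j=1}^{k} \mathrm{df}_j, \qquad
  \mathrm{df}_{\text{res}} = n - \mathrm{df}_{\text{model}}

Any parametric terms in the model would simply add one further degree of freedom each to df_model.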
25 Testing for Linearity. I can compare the linear model of prestige regressed on income and education with the additive model by carrying out an analysis of deviance. I begin by fitting the linear model using the gam function; next I obtain the residual degrees of freedom from the additive model.
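A sketch of the two fits being compared, continuing the earlier code (object names are illustrative; df.residual is the documented residual-degrees-of-freedom component of an mgcv gam object):

  prestige.lin <- gam(prestige ~ income + education, data = Prestige)   # linear fit via gam()
  prestige.lin$df.residual    # residual df of the linear model
  prestige.gam$df.residual    # residual df of the additive model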
26 Testing for Linearity (2). Now I simply calculate the difference in deviance between the two models relative to the difference in their degrees of freedom (a difference of roughly 5 df). This gives a chi-square test for linearity. The difference between the models is highly statistically significant: the additive model describes the relationship of prestige with education and income much better.
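One way to carry out this comparison by hand (a sketch, not the lecture's own script): for a Gaussian model the deviance is the RSS, so the deviance difference is scaled by the additive model's variance estimate (the sig2 component) before being referred to a chi-square distribution.

  dev.diff <- deviance(prestige.lin) - deviance(prestige.gam)
  df.diff  <- prestige.lin$df.residual - prestige.gam$df.residual
  pchisq(dev.diff / prestige.gam$sig2, df = df.diff, lower.tail = FALSE)   # p-value for the test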
27 Testing for Linearity (3). An anova function written by John Fox (see the R script for this class) makes the analysis of deviance simpler to implement. As we see here, the results are identical to those found on the previous slide.
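Current versions of mgcv also provide an anova() method that performs an equivalent model comparison directly (the slide refers instead to a helper function distributed with the class notes):

  anova(prestige.lin, prestige.gam, test = "Chisq")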