Comparing Conditional Effects in Moderated Multiple Regression: Implementation using PROCESS for SPSS and SAS
Andrew F. Hayes
The Ohio State University, Department of Psychology

This document describes a method for testing the difference between any two conditional effects of X on Y in a moderated multiple regression model. It is based in part on the approach outlined by Dawson and Richter (2006) in the Journal of Applied Psychology. It works for continuous or dichotomous moderators in any combination. Mean centering or standardization is not required, although substituting mean-centered or standardized values in the formulas below does not harm or alter the derivation and discussion. I conclude with instructions for implementation in PROCESS for SPSS and SAS as of version 2.12, which is scheduled for release toward the end of May 2014.

Moderation of X's Effect by a Single Moderator M

Consider the model

   Y = i_Y + b_1 X + b_2 M + b_3 XM + e_Y

In this model, the conditional effect of X on Y is

   θ_{X→Y} = b_1 + b_3 M

We wish to compare the conditional effect of X on Y when M = m_2 to the conditional effect of X on Y when M = m_1. The difference between these conditional effects is

   Δθ_{X→Y} = θ_{X→Y}|(m_2) − θ_{X→Y}|(m_1) = b_3(m_2 − m_1)

The sampling variance of Δθ_{X→Y} is

   V(Δθ_{X→Y}) = (m_2 − m_1)^2 V(b_3)

The ratio of Δθ_{X→Y} to the square root of V(Δθ_{X→Y}) is distributed as t(df_residual) under the null hypothesis that the two conditional effects are equal, where df_residual is the residual degrees of freedom for the model. But this ratio contains (m_2 − m_1) in both the numerator and the denominator. These cancel each other, and so the ratio simplifies to b_3 / √V(b_3), which is the t-ratio for b_3 from the regression analysis.
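The cancellation of (m_2 − m_1) can be checked numerically. The sketch below uses hypothetical values for b_3 and V(b_3) (invented for illustration, not taken from any example in this document) to show that the t-ratio for the difference between two conditional effects equals the ordinary t-ratio for b_3:

```python
import math

# Hypothetical values (not from this document's example):
b3 = 0.25        # regression coefficient for the XM product term
var_b3 = 0.01    # squared standard error of b3
m1, m2 = 1.0, 3.0

# Difference between the two conditional effects and its sampling variance
diff = b3 * (m2 - m1)                  # b3(m2 - m1)
var_diff = (m2 - m1) ** 2 * var_b3     # (m2 - m1)^2 V(b3)

# t-ratio for the difference ...
t_diff = diff / math.sqrt(var_diff)
# ... equals the ordinary t-ratio for b3, because (m2 - m1) cancels
t_b3 = b3 / math.sqrt(var_b3)
print(t_diff, t_b3)  # identical: 2.5 and 2.5
```

Any other choice of m_1 and m_2 (with m_1 ≠ m_2) gives the same two t-ratios, which is why no separate test is needed in the single-moderator model.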
Thus, an inference that M moderates the effect of X on Y from a test of significance for the regression coefficient for XM means that any two conditional effects of X defined by different values of M are significantly different from each other, with the same p-value as the p-value for b_3. Conversely, a failure of M to moderate X's effect from the test of the regression
coefficient for XM implies that no two conditional effects of X defined by different values of M differ from each other. As no test is needed to compare conditional effects in this model, this model is not discussed further in this document.

Moderation of X's Effect Additively by Two Moderators M and W

Next consider the model

   Y = i_Y + b_1 X + b_2 M + b_3 W + b_4 XM + b_5 XW + e_Y

In this model, the conditional effect of X on Y is

   θ_{X→Y} = b_1 + b_4 M + b_5 W

We wish to compare the conditional effect of X on Y when (M, W) = (m_2, w_2) to the conditional effect of X on Y when (M, W) = (m_1, w_1). The difference between these conditional effects is

   Δθ_{X→Y} = θ_{X→Y}|(m_2, w_2) − θ_{X→Y}|(m_1, w_1) = b_4(m_2 − m_1) + b_5(w_2 − w_1)

The sampling variance of Δθ_{X→Y} is

   V(Δθ_{X→Y}) = (m_2 − m_1)^2 V(b_4) + (w_2 − w_1)^2 V(b_5) + 2(m_2 − m_1)(w_2 − w_1)COV(b_4, b_5)

The ratio of Δθ_{X→Y} to the square root of V(Δθ_{X→Y}) is distributed as t(df_residual) under the null hypothesis that the two conditional effects are equal, where df_residual is the residual degrees of freedom for the model.

Two special cases are worth highlighting. If W is held fixed at any value, such that w_1 = w_2 = w, then

   Δθ_{X→Y} = θ_{X→Y}|(m_2, w) − θ_{X→Y}|(m_1, w) = b_4(m_2 − m_1)

and the sampling variance of Δθ_{X→Y} is

   V(Δθ_{X→Y}) = (m_2 − m_1)^2 V(b_4)

By the argument in the prior section, the ratio of the difference in conditional effects of X when W is held fixed to the standard error of this difference is equivalent to the t statistic for b_4. Thus, the inference about b_4 results in an equivalent inference for the difference between any two conditional effects of X defined by different values of M, regardless of the common value of W. By the same argument, when M is held fixed at any value such that m_1 = m_2 = m,
   Δθ_{X→Y} = θ_{X→Y}|(m, w_2) − θ_{X→Y}|(m, w_1) = b_5(w_2 − w_1)

and the sampling variance of Δθ_{X→Y} is just

   V(Δθ_{X→Y}) = (w_2 − w_1)^2 V(b_5)

The ratio of the difference in conditional effects of X when M is held fixed to the standard error of this difference is equivalent to the t statistic for b_5. Thus, the inference about b_5 results in an equivalent inference for the difference between any two conditional effects of X defined by different values of W, regardless of the common value of M.

Moderation of X's Effect Multiplicatively by Two Moderators M and W ("Moderated Moderation")

Finally, consider the model

   Y = i_Y + b_1 X + b_2 M + b_3 W + b_4 XM + b_5 XW + b_6 MW + b_7 XMW + e_Y

In this model, the conditional effect of X on Y is

   θ_{X→Y} = b_1 + b_4 M + b_5 W + b_7 MW

We wish to compare the conditional effect of X on Y when (M, W) = (m_2, w_2) to the conditional effect of X on Y when (M, W) = (m_1, w_1). The difference between these conditional effects is

   Δθ_{X→Y} = θ_{X→Y}|(m_2, w_2) − θ_{X→Y}|(m_1, w_1) = b_4(m_2 − m_1) + b_5(w_2 − w_1) + b_7(m_2 w_2 − m_1 w_1)

The sampling variance of Δθ_{X→Y} is (see the derivation at the end)

   V(Δθ_{X→Y}) = (m_2 − m_1)^2 V(b_4) + (w_2 − w_1)^2 V(b_5) + (m_2 w_2 − m_1 w_1)^2 V(b_7)
               + 2(m_2 − m_1)(w_2 − w_1)COV(b_4, b_5)
               + 2(m_2 − m_1)(m_2 w_2 − m_1 w_1)COV(b_4, b_7)
               + 2(w_2 − w_1)(m_2 w_2 − m_1 w_1)COV(b_5, b_7)

The ratio of Δθ_{X→Y} to the square root of V(Δθ_{X→Y}) is distributed as t(df_residual) under the null hypothesis that the two conditional effects are equal, where df_residual is the residual degrees of freedom for the model.

Implementation in PROCESS

PROCESS v2.12 or higher (scheduled for release in May of 2014) can be used to conduct a test of the difference between any two conditional effects of X on Y in additive multiple moderation
(PROCESS model 2) or moderated moderation (PROCESS model 3). To do so, first center M and W around m_1 and w_1, respectively. Then execute PROCESS model 2 or 3, specifying contrast=1 and using the mmodval and wmodval options with mmodval = m and wmodval = w, where m = m_2 − m_1 and w = w_2 − w_1. If you want to fix m_2 or w_2 to m_1 or w_1, then use mmodval = 0 or wmodval = 0, respectively. Because this procedure relies on the mmodval and wmodval options, it is not available in the custom dialog version of PROCESS for SPSS. The generic form of the code that accomplishes the analysis in SPSS is

compute mvarc=mvar-m1.
compute wvarc=wvar-w1.
process vars=yvar mvarc wvarc xvar cvarlist/y=yvar/x=xvar/m=mvarc/w=wvarc/model=3
   /mmodval=mdiff/wmodval=wdiff/contrast=1.

where m1 and w1 are m_1 and w_1, mdiff is a numerical argument set to m_2 − m_1 and wdiff is a numerical argument set to w_2 − w_1, mvar, wvar, xvar, and yvar are the variable names in the data corresponding to M, W, X, and Y, and cvarlist is an optional list of covariates. model=3 can be replaced with model=2 if desired. The equivalent code in SAS in generic form is

data datafile;set datafile;mvarc=mvar-m1;wvarc=wvar-w1;run;
%process (data=datafile,vars=yvar mvarc wvarc xvar cvarlist,y=yvar,x=xvar,m=mvarc,w=wvarc,
   model=3,mmodval=mdiff,wmodval=wdiff,contrast=1);

Following this procedure overrides the defaults in PROCESS such that it will no longer print the conditional effects of X on Y for various combinations of M and W. Instead, PROCESS will generate the difference between conditional effects (Δθ_{X→Y} = θ_{X→Y}|(m_2, w_2) − θ_{X→Y}|(m_1, w_1)), the standard error of the difference (√V(Δθ_{X→Y})), their ratio as a t statistic, a p-value for testing the null hypothesis of no difference, and a confidence interval for the difference. I illustrate using the example from section 9.4 of Introduction to Mediation, Moderation, and Conditional Process Analysis (Hayes, 2013).
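Before turning to that example, the computation PROCESS performs with contrast=1 can be sketched in a few lines. The numbers below are invented for illustration (they are not from any example in this document); the point is that the difference and its variance are a linear contrast of (b_4, b_5, b_7) and a quadratic form in their sampling covariance matrix:

```python
import numpy as np

# Hypothetical estimates of (b4, b5, b7) and their sampling covariance matrix,
# as would be obtained from an OLS fit of PROCESS model 3
b = np.array([0.20, 0.05, -0.01])
cov_b = np.array([[4.0e-3, 5.0e-4, 1.0e-4],
                  [5.0e-4, 1.0e-3, 2.0e-5],
                  [1.0e-4, 2.0e-5, 4.0e-5]])

# The two (M, W) combinations being compared
m1, w1 = 0.0, 38.0
m2, w2 = 1.0, 52.0

# Contrast weights: (m2 - m1), (w2 - w1), (m2*w2 - m1*w1)
a = np.array([m2 - m1, w2 - w1, m2 * w2 - m1 * w1])

contrast = a @ b              # b4(m2-m1) + b5(w2-w1) + b7(m2*w2 - m1*w1)
var_contrast = a @ cov_b @ a  # identical to the six-term variance expression
se_contrast = np.sqrt(var_contrast)

# For model 2 (additive moderation), drop the third entry of b, a, and cov_b.
```

Dividing contrast by se_contrast gives the t statistic that PROCESS reports, referred to the t(df_residual) distribution.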
In this example, Y is support for government actions to mitigate the effects of global climate change (govact), X is negative emotions about climate change (negemot), M is sex (0 = female, 1 = male), and W is age. The model also includes two covariates: political ideology (ideology) and positive emotions about climate change (posemot). Of interest is the relationship between support for government action and negative emotions about climate change, with age and sex as moderators. The model estimated includes a three-way interaction (PROCESS model 3) between X, M, and W that is statistically significant. See Figure 9.4 in Hayes (2013) for the PROCESS output. Because age is continuous and sex is dichotomous, PROCESS automatically produces conditional effects of negative emotions for each sex among people relatively younger (1 SD below the mean age), moderate in age (the mean age), and relatively older (1 SD above the mean age). The relevant output can be found below.
Conditional effect of X on Y at values of the moderator(s):

   age      sex     Effect       se        t         p      LLCI     ULCI
   [numeric values not reproduced in this transcription]

In this example, we test whether the relationship between negative emotions and support for government action differs between relatively younger women (m_1 = 0, with w_1 equal to the age one standard deviation below the sample mean; the first row above) and men who are moderate in age (m_2 = 1, with w_2 equal to the sample mean age; the fourth row above). As can be seen from the table of conditional effects above, the difference between these two conditional effects is Δθ_{X→Y} = θ_{X→Y}|(m_2, w_2) − θ_{X→Y}|(m_1, w_1) = 0.215. The SPSS code below conducts an inferential test that the difference between these conditional effects of X is equal to zero against the alternative that it is different from zero.

compute sexc=sex-0.
compute agec=age-w1.
process vars=govact negemot posemot agec sexc ideology/y=govact/x=negemot/m=sexc/w=agec/model=3
   /mmodval=1/wmodval=wdiff/contrast=1.

The equivalent command in PROCESS for SAS is

data glbwarm;set glbwarm;sexc=sex-0;agec=age-w1;run;
%process (data=glbwarm,vars=govact negemot posemot agec sexc ideology,y=govact,x=negemot,
   m=sexc,w=agec,model=3,mmodval=1,wmodval=wdiff,contrast=1);

Here, w1 stands for the numeric value of w_1 and wdiff for w_2 − w_1 (one standard deviation of age). This code first centers sex and age around m_1 = 0 and w_1, respectively. Obviously, centering sex around zero doesn't do anything to sex in this case; I include this line of code to illustrate how the centering is conducted in general. Using these centered moderators in the code and specifying mmodval = m_2 − m_1 = 1 − 0 = 1 and wmodval = w_2 − w_1, along with the contrast=1 option, produces an inferential test of the difference between θ_{X→Y}|(m_2 = 1, w_2) and θ_{X→Y}|(m_1 = 0, w_1). Most of the output PROCESS generates is not pertinent to the test of interest.
What is pertinent is the section that reads

Contrast of conditional effects of X on Y

   Contrast       se        t         p      LLCI     ULCI
   [numeric values not reproduced in this transcription]

As can be seen, PROCESS says this difference of 0.215, with a standard error of 0.060, is statistically significant, t(805) = 3.616, p < .001, with a 95% confidence interval that excludes zero. It is possible using PROCESS to generate the conditional effect of X on Y for any two combinations of moderators M and W you choose and then conduct a test of the difference between these conditional effects. You don't have to use the values that PROCESS picks for you by default. For instance, the SPSS code below estimates the conditional effect of negative
emotions on support for government action among 30-year-old males (m_1 = 1, w_1 = 30) as well as among 50-year-old females (m_2 = 0, w_2 = 50).

process vars=govact negemot posemot age sex ideology/y=govact/x=negemot/m=sex/w=age/model=3
   /mmodval=1/wmodval=30.
process vars=govact negemot posemot age sex ideology/y=govact/x=negemot/m=sex/w=age/model=3
   /mmodval=0/wmodval=50.

In SAS, use

%process (data=glbwarm,vars=govact negemot posemot age sex ideology,y=govact,x=negemot,
   m=sex,w=age,model=3,mmodval=1,wmodval=30);
%process (data=glbwarm,vars=govact negemot posemot age sex ideology,y=govact,x=negemot,
   m=sex,w=age,model=3,mmodval=0,wmodval=50);

The relevant sections of output are below.

Conditional effect of X on Y at values of the moderator(s):
   age      sex     Effect       se        t         p      LLCI     ULCI
   [numeric values not reproduced in this transcription]

Conditional effect of X on Y at values of the moderator(s):
   age      sex     Effect       se        t         p      LLCI     ULCI
   [numeric values not reproduced in this transcription]

The output gives the effect of negative emotions on support for government action among 30-year-old males (m_1 = 1, w_1 = 30) and among 50-year-old females (m_2 = 0, w_2 = 50). Both of these conditional effects are statistically different from zero. The difference between them is Δθ_{X→Y} = θ_{X→Y}|(m_2, w_2) − θ_{X→Y}|(m_1, w_1). A statistical test of the difference between these two conditional effects is conducted in PROCESS with the code below. In SPSS, use

compute sexc=sex-1.
compute agec=age-30.
process vars=govact negemot posemot agec sexc ideology/y=govact/x=negemot/m=sexc/w=agec/model=3
   /mmodval=-1/wmodval=20/contrast=1.

The equivalent code in SAS is

data glbwarm;set glbwarm;sexc=sex-1;agec=age-30;run;
%process (data=glbwarm,vars=govact negemot posemot agec sexc ideology,y=govact,x=negemot,
   m=sexc,w=agec,model=3,mmodval=-1,wmodval=20,contrast=1);

This code first centers sex and age around m_1 = 1 and w_1 = 30, respectively. Using these centered moderators in the code and specifying mmodval = m_2 − m_1 = 0 − 1 = -1 and wmodval = w_2 − w_1 = 50 − 30 = 20, along with the contrast=1 option, produces a test of the difference between θ_{X→Y}|(m_2, w_2) and θ_{X→Y}|(m_1, w_1).
The resulting output is

Contrast of conditional effects of X on Y

   Contrast       se        t         p      LLCI     ULCI
   [numeric values not reproduced in this transcription]

This difference is not statistically different from zero, t(805) yields p = 0.471, and the 95% confidence interval for the difference includes zero.
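The arithmetic behind a contrast line like those above is simply the ratio of the contrast to its standard error, referred to a t distribution. The sketch below uses the values reported for the first contrast in this document (0.215 with standard error 0.060) and, for self-containment, a normal approximation to t(805), which is accurate at this large a df; the computed t matches the reported 3.616 only up to rounding of the printed standard error:

```python
import math

# Reported contrast and standard error from the first example above
contrast = 0.215
se = 0.060
df = 805  # residual df; t(805) is very close to the standard normal

t_stat = contrast / se  # about 3.58; the report's 3.616 used unrounded se

# Two-sided p-value via the normal approximation to t(df)
p_value = math.erfc(abs(t_stat) / math.sqrt(2.0))
print(t_stat, p_value)  # p is well below .001, as reported
```

For exact t-based p-values one would use a statistics library rather than the normal approximation, but the conclusion is unchanged here.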
References

Dawson, J. F., & Richter, A. W. (2006). Probing three-way interactions in moderated multiple regression: Development and application of a slope difference test. Journal of Applied Psychology, 91, 917-926.

Hayes, A. F. (2013). Introduction to mediation, moderation, and conditional process analysis: A regression-based approach. New York: The Guilford Press.
Derivation of the Variance of the Difference between Conditional Effects of X for the Moderated Moderation Model

We seek the sampling variance of b_4(m_2 − m_1) + b_5(w_2 − w_1) + b_7(m_2 w_2 − m_1 w_1). We know from covariance algebra that

   V(aX + bY + cZ) = [a b c] | V(X)     COV_XY   COV_XZ |  | a |
                             | COV_XY   V(Y)     COV_YZ |  | b |
                             | COV_XZ   COV_YZ   V(Z)   |  | c |

                   = a^2 V(X) + b^2 V(Y) + c^2 V(Z) + 2ab COV_XY + 2ac COV_XZ + 2bc COV_YZ

Substituting the values below into this expression yields the desired variance:

   X = b_4,  Y = b_5,  Z = b_7
   a = (m_2 − m_1),  b = (w_2 − w_1),  c = (m_2 w_2 − m_1 w_1)
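This covariance-algebra identity can be sanity-checked numerically. The sketch below uses an arbitrary made-up covariance matrix and contrast weights; it confirms that the expanded six-term formula equals the quadratic form, and that both approximately match the empirical variance of simulated draws of aX + bY + cZ:

```python
import numpy as np

rng = np.random.default_rng(1)

# An arbitrary (valid) covariance matrix for (X, Y, Z) and contrast weights
Sigma = np.array([[2.0, 0.5, 0.3],
                  [0.5, 1.0, 0.2],
                  [0.3, 0.2, 1.5]])
a, b, c = 1.5, -2.0, 0.5

# Expanded formula from the derivation above
var_formula = (a**2 * Sigma[0, 0] + b**2 * Sigma[1, 1] + c**2 * Sigma[2, 2]
               + 2*a*b * Sigma[0, 1] + 2*a*c * Sigma[0, 2] + 2*b*c * Sigma[1, 2])

# The same quantity as the quadratic form [a b c] Sigma [a b c]'
wvec = np.array([a, b, c])
var_quadform = wvec @ Sigma @ wvec

# Monte Carlo check: empirical variance of aX + bY + cZ
draws = rng.multivariate_normal(np.zeros(3), Sigma, size=200_000)
var_mc = (draws @ wvec).var()
print(var_formula, var_quadform, var_mc)  # first two identical, third close
```

In the application above, Sigma plays the role of the sampling covariance matrix of (b_4, b_5, b_7) and (a, b, c) are the moderator-difference weights.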
Note: If one were to center M and W around m_1 and w_1 prior to model estimation and express the conditioning in the centered metric, such that one of the conditional effects is θ_{X→Y}|(0, 0), the expressions simplify. For the additive moderation model:

   Δθ_{X→Y} = b_4 m_2 + b_5 w_2

   V(Δθ_{X→Y}) = m_2^2 V(b_4) + w_2^2 V(b_5) + 2 m_2 w_2 COV(b_4, b_5)

For the moderated moderation model:

   Δθ_{X→Y} = b_4 m_2 + b_5 w_2 + b_7 m_2 w_2

   V(Δθ_{X→Y}) = m_2^2 V(b_4) + w_2^2 V(b_5) + (m_2 w_2)^2 V(b_7)
               + 2 m_2 w_2 COV(b_4, b_5) + 2 m_2^2 w_2 COV(b_4, b_7) + 2 m_2 w_2^2 COV(b_5, b_7)

Here m_2 and w_2 denote the values of M and W in the centered metric, i.e., m_2 − m_1 and w_2 − w_1 in the original metric.

PROCESS v2.12 or higher can be used to conduct the test. First center M and W around m_1 and w_1, respectively. Then execute PROCESS model 3, specifying contrast=1 and using the mmodval and wmodval options with mmodval = m and wmodval = w, where m = m_2 − m_1 and w = w_2 − w_1. What PROCESS displays as the conditional effect of X on Y, along with its test of significance, will actually be the difference between the two conditional effects desired and a test that their difference equals zero, along with a confidence interval for the difference.
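The centering logic in this note can be checked with ordinary least squares on synthetic data. This is a sketch, not PROCESS itself, and all names and data-generating values are invented: it fits model 3 twice, once in the raw metric and once with M and W centered around (m_1, w_1), and confirms that b_4 m_2 + b_5 w_2 + b_7 m_2 w_2 from the centered fit equals θ|(m_2, w_2) − θ|(m_1, w_1) from the raw fit:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
m = rng.integers(0, 2, size=n).astype(float)   # dichotomous moderator, like sex
w = rng.normal(50.0, 10.0, size=n)             # continuous moderator, like age
y = (1.0 + 0.4 * x + 0.2 * m + 0.01 * w + 0.3 * x * m
     + 0.02 * x * w + 0.001 * m * w + 0.005 * x * m * w
     + rng.normal(size=n))

def fit_model3(mv, wv):
    """OLS for Y = i + b1 X + b2 M + b3 W + b4 XM + b5 XW + b6 MW + b7 XMW + e."""
    D = np.column_stack([np.ones(n), x, mv, wv, x * mv, x * wv, mv * wv, x * mv * wv])
    return np.linalg.lstsq(D, y, rcond=None)[0]

def theta(b, mv, wv):
    """Conditional effect of X on Y: b1 + b4 M + b5 W + b7 MW."""
    return b[1] + b[4] * mv + b[5] * wv + b[7] * mv * wv

m1, w1, m2, w2 = 0.0, 40.0, 1.0, 55.0

# Difference computed directly in the raw metric
b_raw = fit_model3(m, w)
diff_raw = theta(b_raw, m2, w2) - theta(b_raw, m1, w1)

# Same difference from the centered fit: b4*mdiff + b5*wdiff + b7*mdiff*wdiff
b_c = fit_model3(m - m1, w - w1)
mdiff, wdiff = m2 - m1, w2 - w1
diff_centered = b_c[4] * mdiff + b_c[5] * wdiff + b_c[7] * mdiff * wdiff

print(abs(diff_raw - diff_centered) < 1e-8)  # True: the two agree
```

Because centering is a linear reparameterization, the two quantities agree exactly (up to floating-point error) for any choice of (m_1, w_1, m_2, w_2), which is what licenses the mmodval/wmodval trick described above.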
More informationLecture Notes Module 1
Lecture Notes Module 1 Study Populations A study population is a clearly defined collection of people, animals, plants, or objects. In psychological research, a study population usually consists of a specific
More information3.4 Statistical inference for 2 populations based on two samples
3.4 Statistical inference for 2 populations based on two samples Tests for a difference between two population means The first sample will be denoted as X 1, X 2,..., X m. The second sample will be denoted
More informationModule 5: Multiple Regression Analysis
Using Statistical Data Using to Make Statistical Decisions: Data Multiple to Make Regression Decisions Analysis Page 1 Module 5: Multiple Regression Analysis Tom Ilvento, University of Delaware, College
More informationLAB 4 INSTRUCTIONS CONFIDENCE INTERVALS AND HYPOTHESIS TESTING
LAB 4 INSTRUCTIONS CONFIDENCE INTERVALS AND HYPOTHESIS TESTING In this lab you will explore the concept of a confidence interval and hypothesis testing through a simulation problem in engineering setting.
More informationCHAPTER 13 SIMPLE LINEAR REGRESSION. Opening Example. Simple Regression. Linear Regression
Opening Example CHAPTER 13 SIMPLE LINEAR REGREION SIMPLE LINEAR REGREION! Simple Regression! Linear Regression Simple Regression Definition A regression model is a mathematical equation that descries the
More informationBasic Statistics and Data Analysis for Health Researchers from Foreign Countries
Basic Statistics and Data Analysis for Health Researchers from Foreign Countries Volkert Siersma siersma@sund.ku.dk The Research Unit for General Practice in Copenhagen Dias 1 Content Quantifying association
More informationOdds ratio, Odds ratio test for independence, chi-squared statistic.
Odds ratio, Odds ratio test for independence, chi-squared statistic. Announcements: Assignment 5 is live on webpage. Due Wed Aug 1 at 4:30pm. (9 days, 1 hour, 58.5 minutes ) Final exam is Aug 9. Review
More informationA Review of Methods. for Dealing with Missing Data. Angela L. Cool. Texas A&M University 77843-4225
Missing Data 1 Running head: DEALING WITH MISSING DATA A Review of Methods for Dealing with Missing Data Angela L. Cool Texas A&M University 77843-4225 Paper presented at the annual meeting of the Southwest
More informationcontaining Kendall correlations; and the OUTH = option will create a data set containing Hoeffding statistics.
Getting Correlations Using PROC CORR Correlation analysis provides a method to measure the strength of a linear relationship between two numeric variables. PROC CORR can be used to compute Pearson product-moment
More informationPartial Fractions. Combining fractions over a common denominator is a familiar operation from algebra:
Partial Fractions Combining fractions over a common denominator is a familiar operation from algebra: From the standpoint of integration, the left side of Equation 1 would be much easier to work with than
More informationUnit 12 Logistic Regression Supplementary Chapter 14 in IPS On CD (Chap 16, 5th ed.)
Unit 12 Logistic Regression Supplementary Chapter 14 in IPS On CD (Chap 16, 5th ed.) Logistic regression generalizes methods for 2-way tables Adds capability studying several predictors, but Limited to
More informationWhen to Use a Particular Statistical Test
When to Use a Particular Statistical Test Central Tendency Univariate Descriptive Mode the most commonly occurring value 6 people with ages 21, 22, 21, 23, 19, 21 - mode = 21 Median the center value the
More informationOrdinal Regression. Chapter
Ordinal Regression Chapter 4 Many variables of interest are ordinal. That is, you can rank the values, but the real distance between categories is unknown. Diseases are graded on scales from least severe
More informationCategorical Data Analysis
Richard L. Scheaffer University of Florida The reference material and many examples for this section are based on Chapter 8, Analyzing Association Between Categorical Variables, from Statistical Methods
More informationChapter 13 Introduction to Linear Regression and Correlation Analysis
Chapter 3 Student Lecture Notes 3- Chapter 3 Introduction to Linear Regression and Correlation Analsis Fall 2006 Fundamentals of Business Statistics Chapter Goals To understand the methods for displaing
More informationLinear Models in STATA and ANOVA
Session 4 Linear Models in STATA and ANOVA Page Strengths of Linear Relationships 4-2 A Note on Non-Linear Relationships 4-4 Multiple Linear Regression 4-5 Removal of Variables 4-8 Independent Samples
More informationDDBA 8438: The t Test for Independent Samples Video Podcast Transcript
DDBA 8438: The t Test for Independent Samples Video Podcast Transcript JENNIFER ANN MORROW: Welcome to The t Test for Independent Samples. My name is Dr. Jennifer Ann Morrow. In today's demonstration,
More informationOutline. Definitions Descriptive vs. Inferential Statistics The t-test - One-sample t-test
The t-test Outline Definitions Descriptive vs. Inferential Statistics The t-test - One-sample t-test - Dependent (related) groups t-test - Independent (unrelated) groups t-test Comparing means Correlation
More informationAdditional sources Compilation of sources: http://lrs.ed.uiuc.edu/tseportal/datacollectionmethodologies/jin-tselink/tselink.htm
Mgt 540 Research Methods Data Analysis 1 Additional sources Compilation of sources: http://lrs.ed.uiuc.edu/tseportal/datacollectionmethodologies/jin-tselink/tselink.htm http://web.utk.edu/~dap/random/order/start.htm
More informationBinary Logistic Regression
Binary Logistic Regression Main Effects Model Logistic regression will accept quantitative, binary or categorical predictors and will code the latter two in various ways. Here s a simple model including
More informationBelow is a very brief tutorial on the basic capabilities of Excel. Refer to the Excel help files for more information.
Excel Tutorial Below is a very brief tutorial on the basic capabilities of Excel. Refer to the Excel help files for more information. Working with Data Entering and Formatting Data Before entering data
More informationSimple Regression Theory II 2010 Samuel L. Baker
SIMPLE REGRESSION THEORY II 1 Simple Regression Theory II 2010 Samuel L. Baker Assessing how good the regression equation is likely to be Assignment 1A gets into drawing inferences about how close the
More information12: Analysis of Variance. Introduction
1: Analysis of Variance Introduction EDA Hypothesis Test Introduction In Chapter 8 and again in Chapter 11 we compared means from two independent groups. In this chapter we extend the procedure to consider
More informationOne-Way Analysis of Variance (ANOVA) Example Problem
One-Way Analysis of Variance (ANOVA) Example Problem Introduction Analysis of Variance (ANOVA) is a hypothesis-testing technique used to test the equality of two or more population (or treatment) means
More informationHypothesis testing - Steps
Hypothesis testing - Steps Steps to do a two-tailed test of the hypothesis that β 1 0: 1. Set up the hypotheses: H 0 : β 1 = 0 H a : β 1 0. 2. Compute the test statistic: t = b 1 0 Std. error of b 1 =
More informationPsychology 205: Research Methods in Psychology
Psychology 205: Research Methods in Psychology Using R to analyze the data for study 2 Department of Psychology Northwestern University Evanston, Illinois USA November, 2012 1 / 38 Outline 1 Getting ready
More informationGood luck! BUSINESS STATISTICS FINAL EXAM INSTRUCTIONS. Name:
Glo bal Leadership M BA BUSINESS STATISTICS FINAL EXAM Name: INSTRUCTIONS 1. Do not open this exam until instructed to do so. 2. Be sure to fill in your name before starting the exam. 3. You have two hours
More informationX X X a) perfect linear correlation b) no correlation c) positive correlation (r = 1) (r = 0) (0 < r < 1)
CORRELATION AND REGRESSION / 47 CHAPTER EIGHT CORRELATION AND REGRESSION Correlation and regression are statistical methods that are commonly used in the medical literature to compare two or more variables.
More informationKSTAT MINI-MANUAL. Decision Sciences 434 Kellogg Graduate School of Management
KSTAT MINI-MANUAL Decision Sciences 434 Kellogg Graduate School of Management Kstat is a set of macros added to Excel and it will enable you to do the statistics required for this course very easily. To
More informationCorrelations. MSc Module 6: Introduction to Quantitative Research Methods Kenneth Benoit. March 18, 2010
Correlations MSc Module 6: Introduction to Quantitative Research Methods Kenneth Benoit March 18, 2010 Relationships between variables In previous weeks, we have been concerned with describing variables
More informationThe Dummy s Guide to Data Analysis Using SPSS
The Dummy s Guide to Data Analysis Using SPSS Mathematics 57 Scripps College Amy Gamble April, 2001 Amy Gamble 4/30/01 All Rights Rerserved TABLE OF CONTENTS PAGE Helpful Hints for All Tests...1 Tests
More informationWe extended the additive model in two variables to the interaction model by adding a third term to the equation.
Quadratic Models We extended the additive model in two variables to the interaction model by adding a third term to the equation. Similarly, we can extend the linear model in one variable to the quadratic
More informationPart 3. Comparing Groups. Chapter 7 Comparing Paired Groups 189. Chapter 8 Comparing Two Independent Groups 217
Part 3 Comparing Groups Chapter 7 Comparing Paired Groups 189 Chapter 8 Comparing Two Independent Groups 217 Chapter 9 Comparing More Than Two Groups 257 188 Elementary Statistics Using SAS Chapter 7 Comparing
More information