Mediation with Dichotomous Outcomes

April 19, 2013

This is a draft document to assist researchers. Please do not cite or quote without the author's permission. I thank both Nathaniel R. Herr and Craig Nathanson, who suggested corrections to a prior document. Readers might also benefit from consulting the page by Nathaniel Herr: http://www.nrhpsych.com/mediation/logmed.html

David A. Kenny

This note concerns the testing of mediation using logistic regression and is largely based on a paper by MacKinnon and Dwyer (1993). The interested reader should consult their paper for more details. Consider the equations:

Y = cX + E1
M = aX + E2
Y = bM + c'X + E3

where X is the causal variable, Y the outcome, and M the mediator, and each is a dichotomy. The coefficients in the equations are estimated by logistic regression. The usual decomposition of effects is c' + ab = c. The problem is that when a variable is used as a predictor in logistic regression, it has a different scale from when it is an outcome variable. To make this scaling difference explicit, the mediation equations need to be rewritten as follows:

Y' = cX + E1
M' = aX + E2

Y'' = bM + c'X + E3

M', Y', and Y'' refer to rescaled versions of M and Y whose scales differ from those of the original M and Y. They differ because the error variances are fixed to one. To make the coefficients comparable across equations, we multiply each coefficient by the standard deviation of the predictor variable and divide by the standard deviation of the outcome variable. The standard deviations of M and X can be computed in the ordinary way. However, the variances of the outcome variables in the three equations above are as follows:

Y':  c²V(X) + π²/3
M':  a²V(X) + π²/3
Y'': c'²V(X) + b²V(M) + 2bc'C(X,M) + π²/3

where π = 3.1416.... These variances are square rooted and used in the standardization. Note also that if either M or Y is measured at the interval level of measurement, ordinary multiple regression coefficients can be used when that variable is the outcome in the analysis, and there would be no need to rescale the coefficients from that equation.

An alternative, equivalent, and perhaps easier estimation strategy is to use as the variances:

Y':  V(Y'*) + π²/3
M':  V(M'*) + π²/3
Y'': V(Y''*) + π²/3

where V(Y'*), V(M'*), and V(Y''*) are the variances of the predicted scores. After each coefficient has been standardized (multiplied by the standard deviation of the predictor variable and divided by the standard deviation of the outcome variable), the decomposition can be computed. Unlike multiple regression, ab + c' only approximately equals c. If a probit regression analysis were used, the same procedure would be followed, but we would substitute 1 for π²/3.

The Sobel (1982) test should not be used to test the indirect effect, because that test presumes that a and b are independent (i.e., uncorrelated), which they are not when the results are from logistic regression. The reader might consult Imai, Keele, and Tingley (2010) for details.

Illustration

We use the Morse et al. (1994) data that were reanalyzed by Kenny, Kashy, and Bolger (1998). The variables are:

X: condition (0 = control; 1 = treated)
Y: days of stable housing in a month (0 = fewer than 15 days; 1 = 15 or more days)
M: housing services received (0 = no contacts; 1 = one or more contacts)

The SPSS output from Equation 1 is:

            B        S.E.    Wald     df    Sig.    Exp(B)
treatment    .852    .400     4.531    1    .033     2.344
Constant    -.223    .254      .775    1    .379      .800
a. Variable(s) entered on step 1: treatment.

The SPSS output from Equation 2 is:

            B        S.E.    Wald     df    Sig.    Exp(B)
treatment   1.165    .413     7.936    1    .005     3.205
Constant   -1.078    .289    13.860    1    .000      .340
a. Variable(s) entered on step 1: treatment.
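As a numerical check, the standardization described above can be reproduced with a few lines of Python. This is a sketch using the coefficients and variances reported in this illustration; because the covariance C(X,M) is not reported, the variance of Y'' (3.892) is taken directly from the summary table rather than computed from the formula:

```python
import math

# Logistic regression coefficients from the illustration
a, b, c_prime, c = 1.165, 1.346, 0.561, 0.852

# Sample variances of the observed predictors
v_x = 0.246   # V(X), treatment
v_m = 0.234   # V(M), dichotomous mediator

pi2_3 = math.pi ** 2 / 3   # fixed logistic error variance, pi^2/3

# Latent-scale outcome variances
v_m_latent = a ** 2 * v_x + pi2_3   # variance of M'  (about 3.624)
v_y_latent = c ** 2 * v_x + pi2_3   # variance of Y'  (about 3.469)
v_y2_latent = 3.892                 # variance of Y'' (from the summary table)

def standardize(coef, v_pred, v_out):
    """Multiply by the SD of the predictor, divide by the SD of the outcome."""
    return coef * math.sqrt(v_pred) / math.sqrt(v_out)

a_s = standardize(a, v_x, v_m_latent)            # about 0.304
b_s = standardize(b, v_m, v_y2_latent)           # about 0.330
cp_s = standardize(c_prime, v_x, v_y2_latent)    # about 0.141
c_s = standardize(c, v_x, v_y_latent)            # about 0.227

# ab + c' only approximately equals c
print(round(a_s * b_s + cp_s, 3), round(c_s, 3))  # 0.241 vs. 0.227
```

The sketch reproduces the standardized coefficients and the approximate decomposition shown in the summary table below.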

The SPSS output from Equation 3 is:

            B        S.E.    Wald     df    Sig.    Exp(B)
treatment    .561    .427     1.727    1    .189     1.752
hc_dic      1.346    .451     8.907    1    .003     3.844
Constant    -.566    .286     3.914    1    .048      .568
a. Variable(s) entered on step 1: treatment, hc_dic.

The standardization is summarized in the following table:

Term   Pred.   Outcome   Estimate   SE      Wald    s² Pred.   s² Out.   Stan. Coef.
a      X       M         1.165      0.413   7.936   0.246      3.624     0.304
b      M       Y         1.346      0.451   8.907   0.234      3.892     0.330
c'     X       Y         0.561      0.427   1.727   0.246      3.892     0.141
c      X       Y         0.852      0.400   4.530   0.246      3.469     0.227

Note that ab + c' equals 0.241, which is approximately equal to c, or 0.227.

References

Imai, K., Keele, L., & Tingley, D. (2010). A general approach to causal mediation analysis. Psychological Methods, 15, 309-334.

Kenny, D. A., Kashy, D. A., & Bolger, N. (1998). Data analysis in social psychology. In D. Gilbert, S. Fiske, & G. Lindzey (Eds.), The handbook of social psychology (Vol. 1, 4th ed., pp. 233-265). Boston, MA: McGraw-Hill.

MacKinnon, D. P., & Dwyer, J. H. (1993). Estimating mediated effects in prevention studies. Evaluation Review, 17, 144-158.

Morse, G. A., Calsyn, R. J., Allen, G., & Kenny, D. A. (1994). Helping homeless mentally ill people: What variables mediate and moderate program effects? American Journal of Community Psychology, 22, 661-683.

Sobel, M. E. (1982). Asymptotic confidence intervals for indirect effects in structural equation models. In S. Leinhardt (Ed.), Sociological Methodology 1982 (pp. 290-312). Washington, DC: American Sociological Association.