Fisher's Least Significant Difference (LSD) Test. Lynne J. Williams, Hervé Abdi




In Neil Salkind (Ed.), Encyclopedia of Research Design. Thousand Oaks, CA: Sage. 2010.

Fisher's Least Significant Difference (LSD) Test

Lynne J. Williams, Hervé Abdi

1 Overview

When an analysis of variance (anova) gives a significant result, this indicates that at least one group differs from the other groups. Yet the omnibus test does not indicate which group differs. In order to analyze the pattern of difference between means, the anova is often followed by specific comparisons, and the most commonly used of these compare two means at a time (the so-called pairwise comparisons). The first pairwise comparison technique was developed by Fisher in 1935 and is called the least significant difference (lsd) test. This technique can be used only if the anova omnibus F is significant. The main idea of the lsd is to compute the smallest significant difference (i.e., the lsd) between two means as if these means had been the only means to be compared (i.e., with a t test), and to declare significant any difference larger than the lsd.

Lynne J. Williams, The University of Toronto Scarborough. Hervé Abdi, The University of Texas at Dallas. Address correspondence to: Hervé Abdi, Program in Cognition and Neurosciences, MS: Gr.4.1, The University of Texas at Dallas, Richardson, TX 75083-0688, USA. E-mail: herve@utdallas.edu, http://www.utd.edu/~herve

2 Notations

The data to be analyzed comprise A groups; a given group is denoted a. The number of observations in the a-th group is denoted S_a; if all groups have the same size, it is denoted S. The total number of observations is denoted N. The mean of Group a is denoted M_{a+}. From the anova, the mean square of error (i.e., within group) is denoted MS_{S(A)} and the mean square of effect (i.e., between group) is denoted MS_A.

3 Least significant difference

The rationale behind the lsd technique comes from the observation that, when the null hypothesis is true, the t statistic evaluating the difference between Groups a and a' is equal to

    t = \frac{M_{a+} - M_{a'+}}{\sqrt{MS_{S(A)} \left( \frac{1}{S_a} + \frac{1}{S_{a'}} \right)}}    (1)

and follows a Student's t distribution with N - A degrees of freedom. The ratio t would therefore be declared significant at a given alpha level if the value of t is larger than the critical value for that alpha level obtained from the t distribution and denoted t_{\nu, \alpha} (where \nu = N - A is the number of degrees of freedom of the error; this value can be obtained from a standard t table). Rewriting this ratio shows that a difference between the means of Groups a and a' will be significant if

    |M_{a+} - M_{a'+}| > lsd = t_{\nu, \alpha} \sqrt{MS_{S(A)} \left( \frac{1}{S_a} + \frac{1}{S_{a'}} \right)}    (2)

When there is an equal number of observations per group, Equation 2 can be simplified as

    lsd = t_{\nu, \alpha} \sqrt{\frac{2 MS_{S(A)}}{S}}    (3)

In order to evaluate the difference between the means of Groups a and a', we take the absolute value of the difference between the means and compare it to the value of lsd. If

    |M_{a+} - M_{a'+}| \geq lsd    (4)

then the comparison is declared significant at the chosen alpha level (usually .05 or .01). This procedure is then repeated for all A(A - 1)/2 pairwise comparisons.
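As a numerical sketch (not part of the original article), Equation 2 can be computed with SciPy's t-distribution quantile function. The function name `lsd` and the example values are illustrative; the values chosen here match the worked example later in the article.

```python
from math import sqrt

from scipy.stats import t


def lsd(ms_error: float, df_error: int, s_a: int, s_b: int,
        alpha: float = 0.05) -> float:
    """Least significant difference for comparing two group means
    of sizes s_a and s_b, using a two-tailed critical value t_{nu, alpha}."""
    t_crit = t.ppf(1 - alpha / 2, df_error)
    return t_crit * sqrt(ms_error * (1 / s_a + 1 / s_b))


# With MS_S(A) = 80, nu = 45, and two groups of 10 observations each,
# t_{45, .025} is about 2.014, so the lsd is about 8.06.
# (The article rounds the critical value to 2.01, giving 8.04.)
print(round(lsd(80.0, 45, 10, 10), 2))
```

Note that the exact quantile (about 2.014) differs slightly from the tabled value 2.01 used in the article, which is why hand computations and software can disagree in the second decimal.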

Note that the lsd has more power compared to other post hoc comparison methods (e.g., the honestly significant difference test, or Tukey test) because the alpha level for each comparison is not corrected for multiple comparisons. And, because the lsd does not correct for multiple comparisons, it severely inflates Type I error (i.e., finding a difference when it does not actually exist). As a consequence, a revised version of the lsd test has been proposed by Hayter (and is known as the Fisher-Hayter procedure) in which the modified lsd (mlsd) is used instead of the lsd. The mlsd is computed using the Studentized range distribution q as

    mlsd = q_{\alpha, A-1} \sqrt{\frac{MS_{S(A)}}{S}}    (5)

where q_{\alpha, A-1} is the alpha-level critical value of the Studentized range distribution for a range of A - 1 and \nu = N - A degrees of freedom. The mlsd procedure is more conservative than the lsd, but more powerful than the Tukey approach, because the critical value for the Tukey approach is obtained from a Studentized range distribution for a range equal to A. This difference in range makes Tukey's critical value always larger than the one used for the mlsd and therefore makes Tukey's approach more conservative.

4 Example

In a series of experiments on eyewitness testimony, Elizabeth Loftus wanted to show that the wording of a question influenced witnesses' reports. She showed participants a film of a car accident, then asked them a series of questions. Among the questions was one of five versions of a critical question asking about the speed the vehicles were traveling:

1. How fast were the cars going when they hit each other?
2. How fast were the cars going when they smashed into each other?
3. How fast were the cars going when they collided with each other?
4. How fast were the cars going when they bumped each other?
5. How fast were the cars going when they contacted each other?

The data from a fictitious replication of Loftus' experiment are shown in Table 1. We have A = 5 groups and S = 10 participants per group.
The anova found an effect of the verb used on participants' responses. The anova table is shown in Table 2.

Table 1. Results for a fictitious replication of Loftus & Palmer (1974), in miles per hour.

           Contact  Hit  Bump  Collide  Smash
           21       23   35    44       39
           20       30   35    40       44
           26       34   52    33       51
           46       51   29    45       47
           35       20   54    45       50
           13       38   32    30       45
           41       34   30    46       39
           30       44   42    34       51
           42       41   50    49       39
           26       35   21    44       55
M_{.+}     30       35   38    41       46

Table 2. anova results for the replication of Loftus and Palmer (1974).

Source        df    SS         MS       F      Pr(F)
Between: A     4    1,460.00   365.00   4.56   .0036
Error: S(A)   45    3,600.00    80.00
Total         49    5,060.00

4.1 LSD

For an alpha level of .05, the lsd for these data is computed as

    lsd = t_{\nu, .05} \sqrt{\frac{2 MS_{S(A)}}{S}} = 2.01 \times \sqrt{\frac{2 \times 80.00}{10}} = 2.01 \times \sqrt{16} = 2.01 \times 4 = 8.04    (6)

A similar computation shows that, for these data, the lsd for an alpha level of .01 is equal to lsd = 2.69 x 4 = 10.76. For example, the difference between M_{contact+} and M_{hit+} is declared non-significant because

    |M_{contact+} - M_{hit+}| = |30 - 35| = 5 < 8.04    (7)

The differences and significance of all pairwise comparisons are shown in Table 3.
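The lsd computation can be reproduced directly from the raw data of Table 1. The following sketch (plain Python; the group lists are transcribed from the table) recomputes the error mean square and the lsd:

```python
from math import sqrt

# Raw data from Table 1 (speed estimates in miles per hour).
data = {
    "contact": [21, 20, 26, 46, 35, 13, 41, 30, 42, 26],
    "hit":     [23, 30, 34, 51, 20, 38, 34, 44, 41, 35],
    "bump":    [35, 35, 52, 29, 54, 32, 30, 42, 50, 21],
    "collide": [44, 40, 33, 45, 45, 30, 46, 34, 49, 44],
    "smash":   [39, 44, 51, 47, 50, 45, 39, 51, 39, 55],
}

means = {g: sum(x) / len(x) for g, x in data.items()}
N = sum(len(x) for x in data.values())   # 50 observations
A = len(data)                            # 5 groups

# Error (within-group) sum of squares and mean square, as in Table 2.
ss_error = sum((xi - means[g]) ** 2 for g, x in data.items() for xi in x)
ms_error = ss_error / (N - A)            # 3600 / 45 = 80

t_crit = 2.01                            # t_{45, .05}, read from a t table as in the article
S = 10                                   # observations per group
lsd = t_crit * sqrt(2 * ms_error / S)    # Equation 3

print(round(ms_error, 2), round(lsd, 2))           # 80.0 8.04
print(abs(means["contact"] - means["hit"]) < lsd)  # 5 < 8.04 -> True: not significant
```

Scanning all ten pairs of group means against the 8.04 and 10.76 thresholds reproduces the significance pattern of Table 3.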

Table 3. lsd. Differences between means and significance of pairwise comparisons for the (fictitious) replication of Loftus and Palmer (1974). Differences larger than 8.04 are significant at the alpha = .05 level and are indicated with *; differences larger than 10.76 are significant at the alpha = .01 level and are indicated with **.

                        Contact  Hit      Bump     Collide  Smash
                        30       35       38       41       46
M_{1.+} = 30  Contact   0.00     5.00 ns  8.00 ns  11.00*   16.00**
M_{2.+} = 35  Hit                0.00     3.00 ns  6.00 ns  11.00*
M_{3.+} = 38  Bump                        0.00     3.00 ns  8.00 ns
M_{4.+} = 41  Collide                              0.00     5.00 ns
M_{5.+} = 46  Smash                                         0.00

Table 4. mlsd. Differences between means and significance of pairwise comparisons for the (fictitious) replication of Loftus and Palmer (1974). Differences larger than 10.66 are significant at the alpha = .05 level and are indicated with *; differences larger than 13.21 are significant at the alpha = .01 level and are indicated with **.

                        Contact  Hit      Bump     Collide  Smash
                        30       35       38       41       46
M_{1.+} = 30  Contact   0.00     5.00 ns  8.00 ns  11.00*   16.00**
M_{2.+} = 35  Hit                0.00     3.00 ns  6.00 ns  11.00*
M_{3.+} = 38  Bump                        0.00     3.00 ns  8.00 ns
M_{4.+} = 41  Collide                              0.00     5.00 ns
M_{5.+} = 46  Smash                                         0.00

4.2 MLSD

For an alpha level of .05, the value of q_{.05, A-1} is equal to 3.77 and the mlsd for these data is computed as

    mlsd = q_{\alpha, A-1} \sqrt{\frac{MS_{S(A)}}{S}} = 3.77 \times \sqrt{8} = 10.66    (8)

The value of q_{.01, A-1} is 4.67, and a similar computation shows that, for these data, the mlsd for an alpha level of .01 is equal to mlsd = 4.67 x sqrt(8) = 13.21. For example, the difference between M_{contact+} and M_{hit+} is declared non-significant because

    |M_{contact+} - M_{hit+}| = |30 - 35| = 5 < 10.66    (9)

The differences and significance of all pairwise comparisons are shown in Table 4.
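The mlsd values can be checked with the same arithmetic. This sketch hardcodes the tabled Studentized range critical values quoted in the text (q_{.05, A-1} = 3.77 and q_{.01, A-1} = 4.67 for a range of 4 and nu = 45):

```python
from math import sqrt

ms_error, S = 80.0, 10        # error mean square (Table 2) and group size

# Studentized range critical values for a range of A - 1 = 4 and nu = 45,
# taken from the article (read from a table of the q distribution).
q_05, q_01 = 3.77, 4.67

mlsd_05 = q_05 * sqrt(ms_error / S)   # 3.77 * sqrt(8) = 10.66 (Equation 5)
mlsd_01 = q_01 * sqrt(ms_error / S)   # 4.67 * sqrt(8) = 13.21

means = {"contact": 30, "hit": 35, "bump": 38, "collide": 41, "smash": 46}
diff = abs(means["contact"] - means["smash"])   # 16

print(round(mlsd_05, 2), round(mlsd_01, 2))     # 10.66 13.21
print(diff > mlsd_01)                           # 16 > 13.21 -> True: significant at .01
```

Because 10.66 > 8.04, the mlsd declares fewer differences significant than the uncorrected lsd would, which is exactly the conservativeness the Fisher-Hayter procedure buys in exchange for controlling the familywise error rate.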

Related entries

Analysis of variance, Bonferroni procedure, Honestly significant difference (HSD) test, Multiple comparison test, Newman-Keuls test, Pairwise comparisons, Post hoc comparisons, Scheffé's test, Tukey test.

Further readings

Abdi, H., Edelman, B., Valentin, D., & Dowling, W.J. (2009). Experimental Design and Analysis for Psychology. Oxford: Oxford University Press.

Hayter, A.J. (1986). The maximum familywise error rate of Fisher's least significant difference test. Journal of the American Statistical Association, 81, 1001-1004.

Seaman, M.A., Levin, J.R., & Serlin, R.C. (1991). New developments in pairwise multiple comparisons: some powerful and practicable procedures. Psychological Bulletin, 110, 577-586.