# Multivariate analysis of variance


In previous chapters, we explored the use of analysis of variance to compare groups on a single dependent variable. In many research situations, however, we are interested in comparing groups on a range of different characteristics. This is quite common in clinical research, where the focus is on the evaluation of the impact of an intervention on a variety of outcome measures (e.g. anxiety, depression).

Multivariate analysis of variance (MANOVA) is an extension of analysis of variance for use when you have more than one dependent variable. These dependent variables should be related in some way, or there should be some conceptual reason for considering them together. MANOVA compares the groups and tells you whether the mean differences between the groups on the combination of dependent variables are likely to have occurred by chance. To do this, MANOVA creates a new summary dependent variable, which is a linear combination of each of your original dependent variables. It then performs an analysis of variance using this new combined dependent variable. MANOVA will tell you if there is a significant difference between your groups on this composite dependent variable; it also provides the univariate results for each of your dependent variables separately.

Some of you might be thinking: why not just conduct a series of ANOVAs separately for each dependent variable? This is in fact what many researchers do. Unfortunately, by conducting a whole series of analyses you run the risk of an inflated Type 1 error. (See the introduction to Part Five for a discussion of Type 1 and Type 2 errors.) Put simply, this means that the more analyses you run, the more likely you are to find a significant result, even if in reality there are no differences between your groups. The advantage of using MANOVA is that it controls or adjusts for this increased risk of a Type 1 error; however, this comes at a cost.
MANOVA is a much more complex set of procedures, and it has a number of additional assumptions that must be met. If you decide that MANOVA is a bit out of your depth just yet, all is not lost. If you have a number of dependent variables, you can still perform a series of ANOVAs.
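The inflated Type 1 error risk from running separate ANOVAs can be illustrated with a quick calculation outside SPSS. This sketch (in Python, not part of any SPSS procedure) computes the familywise error rate 1 − (1 − α)^k for k independent tests at α = .05:

```python
# Familywise Type 1 error rate when running k independent tests at alpha = .05.
# With three dependent variables analysed separately, the chance of at least
# one false positive already rises well above the nominal 5 per cent.
alpha = 0.05

for k in (1, 3, 10):
    familywise = 1 - (1 - alpha) ** k
    print(f"{k} tests: familywise Type 1 error rate = {familywise:.3f}")
```

With three separate ANOVAs the familywise rate is already about .143; with ten it is about .401. This is the inflation that MANOVA (or a Bonferroni adjustment) is designed to control.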

Assumptions: MANOVA has a number of assumptions. These are discussed in more detail in the next section. You should also review the material on assumptions in the introduction to Part Five of this book.

1. Sample size.
2. Normality.
3. Outliers.
4. Linearity.
5. Homogeneity of regression.
6. Multicollinearity and singularity.
7. Homogeneity of variance-covariance matrices.

Non-parametric alternative: None.

ASSUMPTION TESTING

Before proceeding with the main MANOVA analysis, we will test whether our data conform to the assumptions listed in the summary. Some of these tests are not strictly necessary given our large sample size, but I will demonstrate them so that you can see the steps involved.

1. Sample size

You need to have more cases in each cell than you have dependent variables. Ideally, you will have more than this, but this is the absolute minimum. Having a larger sample can also help you get away with violations of some of the other assumptions (e.g. normality). The minimum required number of cases in each cell in this example is three (the number of dependent variables). We have a total of six cells (two levels of our independent variable: male/female, and three dependent variables for each). The number of cases in each cell is provided as part of the MANOVA output. In our case, we have many more than the required number of cases per cell (see the Descriptive statistics box in the Output).

2. Normality

Although the significance tests of MANOVA are based on the multivariate normal distribution, in practice it is reasonably robust to modest violations of normality (except where the violations are due to outliers). According to Tabachnick and Fidell (2007, p. 251), a sample size of at least 20 in each cell should ensure robustness. You need to check both univariate normality (see Chapter 6) and multivariate normality (using something called Mahalanobis distances). The procedures used to check for normality will also help you identify any outliers (see Assumption 3).

3. Outliers

MANOVA is quite sensitive to outliers (i.e. data points or scores that are different from the remainder of the scores). You need to check for univariate outliers (for each

of the dependent variables separately) and multivariate outliers. Multivariate outliers are participants with a strange combination of scores on the various dependent variables (e.g. very high on one variable, but very low on another). Check for univariate outliers by using Explore (see Chapter 6). The procedure to check for multivariate outliers and multivariate normality is demonstrated below.

Checking multivariate normality

To test for multivariate normality, we will be asking SPSS to calculate Mahalanobis distances using the Regression menu. Mahalanobis distance is the distance of a particular case from the centroid of the remaining cases, where the centroid is the point created by the means of all the variables (Tabachnick & Fidell 2007). This analysis will pick up on any cases that have a strange pattern of scores across the three dependent variables. The procedure detailed below will create a new variable in your data file (labelled mah_1). Each person or subject receives a value on this variable that indicates the degree to which their pattern of scores differs from the remainder of the sample. To decide whether a case is an outlier, you need to compare the Mahalanobis distance value against a critical value (this is obtained using a chi-square critical value table). If an individual's mah_1 score exceeds this value, it is considered an outlier. MANOVA can tolerate a few outliers, particularly if their scores are not too extreme and you have a reasonably sized data file. With too many outliers, or very extreme scores, you may need to consider deleting the cases or, alternatively, transforming the variables involved (see Tabachnick & Fidell 2007, p. 72).

Procedure for obtaining Mahalanobis distances

1. From the menu at the top of the screen, click on Analyze, then select Regression, then Linear.
2. Click on the variable in your data file that uniquely identifies each of your cases (in this case, it is ID). Move this variable into the Dependent box.
3. In the Independent box, put the continuous dependent variables that you will be using in your MANOVA analysis (e.g. Total Negative Affect: tnegaff, Total Positive Affect: tposaff, Total perceived stress: tpstress).
4. Click on the Save button. In the section marked Distances, click on Mahalanobis.
5. Click on Continue and then OK (or on Paste to save to Syntax Editor).

The syntax from this procedure is:

```
REGRESSION
  /MISSING LISTWISE
  /STATISTICS COEFF OUTS R ANOVA
  /CRITERIA=PIN(.05) POUT(.10)
  /NOORIGIN
  /DEPENDENT id
  /METHOD=ENTER tposaff tnegaff tpstress
  /SAVE MAHAL.
```

Some of the output generated from this procedure is shown below. Towards the bottom of this table, you will see a row labelled Mahal. distance. Look across under the column marked Maximum. Take note of this value (in this example, it is 18.1). You will be comparing this number to a critical value.

This critical value is determined by using a chi-square table, with the number of dependent variables that you have as your degrees of freedom (df) value. The alpha value that you use is .001. To simplify this whole process, I have summarised the key values for you in Table 21.1, for studies up to ten dependent variables. Find the column with the number of dependent variables that you have in your study (in this example, it is 3). Read across to find the critical value (in this case, it is 16.27). Compare the maximum value you obtained from your output (e.g. 18.1) with this critical value. If your value is larger than the critical value, you have multivariate outliers in your data file. In my example, at least one of my cases exceeded the critical value of 16.27, suggesting the presence of multivariate outliers. I will need to identify which cases are involved and consider whether to retain them, delete them, or transform the variables involved (as discussed above).
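If you want to see what SPSS is computing here, the same check can be sketched in Python with numpy and scipy (illustrative random data; the critical value of 16.27 for three dependent variables matches the example above):

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(42)
X = rng.normal(size=(100, 3))  # illustrative: 100 cases, three dependent variables

# Squared Mahalanobis distance of each case from the centroid (the point
# created by the means of all the variables).
centroid = X.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
diff = X - centroid
d2 = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)

# Chi-square critical value at alpha = .001, df = number of dependent variables.
critical = chi2.ppf(1 - 0.001, df=3)
print(f"maximum Mahalanobis distance = {d2.max():.2f}")
print(f"critical value (df = 3, alpha = .001) = {critical:.2f}")  # about 16.27
print("multivariate outliers present:", bool(d2.max() > critical))
```

The comparison mirrors the one performed on the mah_1 values: any case whose distance exceeds the critical value is flagged as a multivariate outlier.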

Total Positive Affect: tposaff, Total perceived stress: tpstress) and move them into the Matrix Variables box.
4. Select the sex variable and move it into the Rows box.
5. Click on the Options button and select Exclude cases variable by variable.
6. Click on Continue and then OK (or on Paste to save to Syntax Editor).

The syntax from this procedure is:

```
GRAPH
  /SCATTERPLOT(MATRIX)=tposaff tnegaff tpstress
  /PANEL ROWVAR=sex ROWOP=CROSS
  /MISSING=VARIABLEWISE.
```

The output generated from this procedure is shown below. These plots do not show any obvious evidence of non-linearity; therefore, our assumption of linearity is satisfied.

5. Homogeneity of regression

This assumption is important only if you are intending to perform a stepdown analysis. This approach is used when you have some theoretical or conceptual reason for ordering your dependent variables. It is quite a complex procedure and is beyond the scope of this book. If you are interested in finding out more, see Tabachnick and Fidell (2007, p. 252).

6. Multicollinearity and singularity

MANOVA works best when the dependent variables are moderately correlated. With low correlations, you should consider running separate univariate analyses of variance for your various dependent variables. When the dependent variables are highly correlated, this is referred to as multicollinearity. This can occur when one of your variables is a combination of other variables (e.g. the total score of a scale that is made up of subscales that are also included as dependent variables). This is referred to as singularity, and can be avoided by knowing what your variables are and how the scores are obtained. While there are quite sophisticated ways of checking for multicollinearity, the simplest way is to run Correlation and to check the strength of the correlations among your dependent variables (see Chapter 11). Correlations up around .8 or .9 are reason for concern. If you find any of these, you may need to consider removing one of the strongly correlated pairs of dependent variables.

7. Homogeneity of variance-covariance matrices

Fortunately, the test of this assumption is generated as part of your MANOVA output. The test used to assess this is Box's M Test of Equality of Covariance Matrices. This is discussed in more detail in the interpretation of the output presented below.
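The simple correlation check described under multicollinearity (Assumption 6) is easy to sketch outside SPSS. The data below are simulated for illustration only; with real data you would load your three dependent variables instead:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
# Simulated scores: the two affect scores are made moderately related to
# perceived stress (illustrative only, not the book's data file).
tpstress = rng.normal(size=n)
tnegaff = 0.5 * tpstress + rng.normal(size=n)
tposaff = -0.4 * tpstress + rng.normal(size=n)

scores = np.column_stack([tposaff, tnegaff, tpstress])
r = np.corrcoef(scores, rowvar=False)  # 3 x 3 correlation matrix

# Flag any pair of dependent variables correlated up around .8 or beyond.
flagged = bool(np.any(np.abs(np.triu(r, k=1)) > 0.8))
print("multicollinearity concern:", flagged)
```

Here the correlations are moderate, which is the situation in which MANOVA works best; correlations above about .8 would suggest dropping one of the pair.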
PERFORMING MANOVA

The procedure for performing a one-way multivariate analysis of variance to explore sex differences in our set of dependent variables (Total Negative Affect, Total Positive Affect, Total perceived stress) is described below. This technique can be extended to perform a two-way or higher-order factorial MANOVA by adding additional independent variables. It can also be extended to include covariates (see analysis of covariance discussed in Chapter 22).

Procedure for MANOVA

1. From the menu at the top of the screen, click on Analyze, then select General Linear Model, then Multivariate.
2. In the Dependent Variables box, enter each of your dependent variables (e.g. Total Negative Affect, Total Positive Affect, Total perceived stress).
3. In the Fixed Factors box, enter your independent variable (e.g. sex).
4. Click on the Model button. Make sure that the Full factorial button is selected in the Specify Model box.
5. At the bottom, in the Sum of squares box, Type III should be displayed. This is the default method of calculating sums of squares. Click on Continue.
6. Click on the Options button. In the section labelled Factor and Factor interactions, click on your independent variable (e.g. sex). Move it into the box marked Display Means for:. In the Display section of this screen, put a tick in the boxes for Descriptive Statistics, Estimates of effect size and Homogeneity tests.
7. Click on Continue and then OK (or on Paste to save to Syntax Editor).

The syntax generated from this procedure is:

```
GLM tposaff tnegaff tpstress BY sex
  /METHOD = SSTYPE(3)
  /INTERCEPT = INCLUDE
  /EMMEANS = TABLES(sex)
  /PRINT = DESCRIPTIVE ETASQ HOMOGENEITY
  /CRITERIA = ALPHA(.05)
  /DESIGN = sex.
```

Selected output generated from this procedure is shown below. Only the table structure is reproduced here:

- Descriptive Statistics: mean, standard deviation and N for Total Positive Affect, Total Negative Affect and Total perceived stress, broken down by SEX (MALES, FEMALES, Total).
- Box's Test of Equality of Covariance Matrices (Box's M, F, df1, df2, Sig.): tests the null hypothesis that the observed covariance matrices of the dependent variables are equal across groups. Design: Intercept+SEX.
- Levene's Test of Equality of Error Variances (F, df1, df2, Sig. for each dependent variable): tests the null hypothesis that the error variance of the dependent variable is equal across groups. Design: Intercept+SEX.

- Estimated Marginal Means for SEX: mean, standard error and 95% confidence interval (lower and upper bounds) for each dependent variable (Total Positive Affect, Total Negative Affect, Total perceived stress), for MALES and FEMALES.

INTERPRETATION OF OUTPUT FROM MANOVA

The key aspects of the output generated by MANOVA are presented below.

Descriptive statistics

Check that the information is correct. In particular, check that the N values correspond to what you know about your sample. These N values are your cell sizes (see Assumption 1). Make sure that you have more cases in each cell than the number of dependent variables. If you have over 30, then any violations of normality or equality of variance that may exist are not going to matter too much.

Box's Test

The output box labelled Box's Test of Equality of Covariance Matrices will tell you whether your data violate the assumption of homogeneity of variance-covariance matrices. If the Sig. value is larger than .001, you have not violated the assumption. Tabachnick and Fidell (2007, p. 281) warn that Box's M can tend to be too strict when you have a large sample size. Fortunately, in our example the Box's M Sig. value is .33; therefore, we have not violated this assumption.

Levene's Test

The next box to look at is Levene's Test of Equality of Error Variances. In the Sig. column, look for any values that are less than .05. These would indicate that you have violated the assumption of equality of variance for that variable. In the current example, none of the variables recorded significant values; therefore, we can assume equal variances. If you do violate this assumption of equality of variances, you will need to set a more conservative alpha level for determining significance for that variable in the univariate F-test. Tabachnick and Fidell (2007) suggest an alpha of .025 or .01 rather than the conventional .05 level.

Multivariate tests

This set of multivariate tests of significance will indicate whether there are statistically significant differences among the groups on a linear combination of the dependent variables. There are a number of statistics to choose from (Wilks' Lambda, Hotelling's Trace, Pillai's Trace). One of the most commonly reported statistics is Wilks' Lambda. Tabachnick and Fidell (2007) recommend Wilks' Lambda for general use; however, if your data have problems (small sample size, unequal N values, violation of assumptions), then Pillai's Trace is more robust (see comparison of statistics in Tabachnick & Fidell 2007, p. 252). In situations where you have only two groups, the F-tests for Wilks' Lambda, Hotelling's Trace and Pillai's Trace are identical.
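As an aside, Levene's test for a single dependent variable can be reproduced with scipy (simulated scores here; `scipy.stats.levene` is a standard function, though its defaults may differ in detail from SPSS's implementation, so `center="mean"` is used to obtain the classic mean-based statistic):

```python
import numpy as np
from scipy.stats import levene

rng = np.random.default_rng(1)
# Simulated perceived-stress scores for two groups with similar spread
# (group sizes and rough means/SDs loosely follow the chapter's example).
males = rng.normal(loc=25.8, scale=5.4, size=185)
females = rng.normal(loc=27.4, scale=6.1, size=248)

# center='mean' gives the classic Levene statistic based on group means.
stat, p = levene(males, females, center="mean")
print(f"Levene F = {stat:.2f}, Sig. = {p:.3f}")
if p < 0.05:
    print("equality of error variances violated for this variable")
else:
    print("equal variances can be assumed for this variable")
```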
Wilks' Lambda

You will find the value you are looking for in the second section of the Multivariate Tests table, in the row labelled with the name of your independent or grouping variable (in this case, SEX). Don't make the mistake of using the first set of figures, which refers to the intercept. Find the value of Wilks' Lambda and its associated significance level (Sig.). If the significance level is less than .05, then you can conclude that there is a difference among your groups. Here we obtained a Wilks' Lambda value of .976, with a significance value of .014. This is less than .05; therefore, there is a statistically significant difference between males and females in terms of their overall wellbeing.
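For readers curious about where Wilks' Lambda comes from: it is the ratio of the determinant of the within-groups SSCP (sums of squares and cross-products) matrix to the determinant of the total SSCP matrix, so values near 1 indicate little group difference on the combined dependent variables. A minimal sketch with simulated two-group data (not the book's data file):

```python
import numpy as np

rng = np.random.default_rng(2)
# Simulated scores on three dependent variables for two groups whose
# means differ slightly (illustrative values only).
g1 = rng.normal(loc=[33.1, 19.9, 26.6], size=(185, 3))
g2 = rng.normal(loc=[33.6, 20.3, 27.1], size=(248, 3))

def sscp(x):
    """Sums-of-squares-and-cross-products matrix about the sample mean."""
    d = x - x.mean(axis=0)
    return d.T @ d

W = sscp(g1) + sscp(g2)            # within-groups SSCP
T = sscp(np.vstack([g1, g2]))      # total SSCP
wilks = np.linalg.det(W) / np.linalg.det(T)
print(f"Wilks' Lambda = {wilks:.3f}")  # values near 1 mean little group difference
```

Because the total SSCP is the within-groups SSCP plus the between-groups SSCP, Lambda always lies between 0 and 1; the smaller the value, the larger the group effect.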

Between-subjects effects

If you obtain a significant result on this multivariate test of significance, this gives you permission to investigate further in relation to each of your dependent variables. Do males and females differ on all of the dependent measures, or just some? This information is provided in the Tests of Between-Subjects Effects output box.

Because you are looking at a number of separate analyses here, it is suggested that you set a stricter alpha level to reduce the chance of a Type 1 error (i.e. finding a significant result when there isn't really one). The most common way of doing this is to apply what is known as a Bonferroni adjustment. In its simplest form, this involves dividing your original alpha level of .05 by the number of analyses that you intend to do (see Tabachnick & Fidell 2007, p. 270, for more sophisticated versions of this formula). In this case, we have three dependent variables to investigate; therefore, we would divide .05 by 3, giving a new alpha level of .017. We will consider our results significant only if the probability value (Sig.) is less than .017.

In the Tests of Between-Subjects Effects box, move down to the third set of values in a row labelled with your independent variable (in this case, SEX). You will see each of your dependent variables listed, with their associated univariate F, df and Sig. values. You interpret these in the same way as you would a normal one-way analysis of variance. In the Sig. column, look for any values that are less than .017 (our new adjusted alpha level). In our example, only one of the dependent variables (Total perceived stress) recorded a significance value less than our cut-off (with a Sig. value of .004). In this study, the only significant difference between males and females was on their perceived stress scores.
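The Bonferroni adjustment described above amounts to one line of arithmetic. The sketch below also shows how the adjusted level is applied to a set of Sig. values; the values for positive and negative affect are made up for illustration, and only the .004 for perceived stress comes from the example:

```python
# Bonferroni adjustment: divide the alpha level by the number of analyses.
alpha = 0.05
n_dependent_vars = 3
adjusted_alpha = alpha / n_dependent_vars
print(f"adjusted alpha = {adjusted_alpha:.3f}")  # .017, as in the text

# Only Sig. values below the adjusted level are treated as significant.
# (tposaff and tnegaff values are hypothetical; tpstress is from the example.)
sig_values = {"tposaff": 0.22, "tnegaff": 0.58, "tpstress": 0.004}
significant = [dv for dv, p in sig_values.items() if p < adjusted_alpha]
print("significant after adjustment:", significant)
```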
Effect size

The importance of the impact of sex on perceived stress can be evaluated using the effect size statistic provided in the final column. Partial Eta Squared represents the proportion of the variance in the dependent variable (perceived stress scores) that can be explained by the independent variable (sex). The value in this case is .019 which, according to generally accepted criteria (Cohen 1988), is considered quite a small effect (see the introduction to Part Five for a discussion of effect size). This represents only 1.9 per cent of the variance in perceived stress scores explained by sex.

Comparing group means

Although we know that males and females differed in terms of perceived stress, we do not know who had the higher scores. To find this out, we refer to the output table provided in the section labelled Estimated Marginal Means. For Total perceived stress, the mean score for males was 25.79 and for females 27.42. Although statistically significant, the actual difference in the two mean scores was very small: fewer than 2 scale points.
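Partial eta squared for a univariate effect can be recovered from the reported F-ratio and its degrees of freedom using the standard identity eta_p² = (F × df1) / (F × df1 + df2). Using the perceived-stress result reported in this chapter, F(1, 430) = 8.34:

```python
# Partial eta squared from a univariate F-test:
#   eta_p^2 = (F * df1) / (F * df1 + df2)
F, df1, df2 = 8.34, 1, 430  # perceived-stress result from this example

partial_eta_sq = (F * df1) / (F * df1 + df2)
print(f"partial eta squared = {partial_eta_sq:.3f}")  # .019, i.e. 1.9% of variance
```

This matches the .019 reported in the output, confirming that the effect size column and the F-test are two views of the same result.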

Follow-up analyses

In the example shown above, there were only two levels to the independent variable (males, females). When you have independent variables with three or more levels, it is necessary to conduct follow-up univariate analyses to identify where the significant differences lie (compare Group 1 with Group 2, compare Group 2 with Group 3, etc.). One way to do this would be to use one-way ANOVA on the dependent variables that were significant in the MANOVA (e.g. perceived stress). Within the one-way ANOVA procedure (see Chapter 18), you could request post-hoc tests for your variable with three or more levels. Remember to make the necessary adjustments to your alpha level using the Bonferroni adjustment if you run multiple analyses.

PRESENTING THE RESULTS FROM MANOVA

The results of this multivariate analysis of variance could be presented as follows:

A one-way between-groups multivariate analysis of variance was performed to investigate sex differences in psychological wellbeing. Three dependent variables were used: positive affect, negative affect and perceived stress. The independent variable was gender. Preliminary assumption testing was conducted to check for normality, linearity, univariate and multivariate outliers, homogeneity of variance-covariance matrices, and multicollinearity, with no serious violations noted. There was a statistically significant difference between males and females on the combined dependent variables, F(3, 428) = 3.57, p = .014; Wilks' Lambda = .98; partial eta squared = .02. When the results for the dependent variables were considered separately, the only difference to reach statistical significance, using a Bonferroni adjusted alpha level of .017, was perceived stress, F(1, 430) = 8.34, p = .004, partial eta squared = .02.
An inspection of the mean scores indicated that females reported slightly higher levels of perceived stress (M = 27.42, SD = 6.08) than males (M = 25.79, SD = 5.41).

ADDITIONAL EXERCISE

Health

Data file: sleep4ed.sav. See Appendix for details of the data file.

1. Conduct a one-way MANOVA to see if there are gender differences in each of the individual items that make up the Sleepiness and Associated Sensations Scale. The variables you will need as dependent variables are fatigue, lethargy, tired, sleepy, energy.


### DDBA 8438: The t Test for Independent Samples Video Podcast Transcript

DDBA 8438: The t Test for Independent Samples Video Podcast Transcript JENNIFER ANN MORROW: Welcome to The t Test for Independent Samples. My name is Dr. Jennifer Ann Morrow. In today's demonstration,

### HYPOTHESIS TESTING: CONFIDENCE INTERVALS, T-TESTS, ANOVAS, AND REGRESSION

HYPOTHESIS TESTING: CONFIDENCE INTERVALS, T-TESTS, ANOVAS, AND REGRESSION HOD 2990 10 November 2010 Lecture Background This is a lightning speed summary of introductory statistical methods for senior undergraduate

### 3: Graphing Data. Objectives

3: Graphing Data Objectives Create histograms, box plots, stem-and-leaf plots, pie charts, bar graphs, scatterplots, and line graphs Edit graphs using the Chart Editor Use chart templates SPSS has the

### ABSORBENCY OF PAPER TOWELS

ABSORBENCY OF PAPER TOWELS 15. Brief Version of the Case Study 15.1 Problem Formulation 15.2 Selection of Factors 15.3 Obtaining Random Samples of Paper Towels 15.4 How will the Absorbency be measured?

### Simple Linear Regression in SPSS STAT 314

Simple Linear Regression in SPSS STAT 314 1. Ten Corvettes between 1 and 6 years old were randomly selected from last year s sales records in Virginia Beach, Virginia. The following data were obtained,

### Mixed 2 x 3 ANOVA. Notes

Mixed 2 x 3 ANOVA This section explains how to perform an ANOVA when one of the variables takes the form of repeated measures and the other variable is between-subjects that is, independent groups of participants

### Using SPSS version 14 Joel Elliott, Jennifer Burnaford, Stacey Weiss

Using SPSS version 14 Joel Elliott, Jennifer Burnaford, Stacey Weiss SPSS is a program that is very easy to learn and is also very powerful. This manual is designed to introduce you to the program however,

### COMPARISONS OF CUSTOMER LOYALTY: PUBLIC & PRIVATE INSURANCE COMPANIES.

277 CHAPTER VI COMPARISONS OF CUSTOMER LOYALTY: PUBLIC & PRIVATE INSURANCE COMPANIES. This chapter contains a full discussion of customer loyalty comparisons between private and public insurance companies

### 8. Comparing Means Using One Way ANOVA

8. Comparing Means Using One Way ANOVA Objectives Calculate a one-way analysis of variance Run various multiple comparisons Calculate measures of effect size A One Way ANOVA is an analysis of variance

### Association Between Variables

Contents 11 Association Between Variables 767 11.1 Introduction............................ 767 11.1.1 Measure of Association................. 768 11.1.2 Chapter Summary.................... 769 11.2 Chi

### An analysis method for a quantitative outcome and two categorical explanatory variables.

Chapter 11 Two-Way ANOVA An analysis method for a quantitative outcome and two categorical explanatory variables. If an experiment has a quantitative outcome and two categorical explanatory variables that

### Using SPSS Prepared by Pam Schraedley January 2002

Using SPSS Prepared by Pam Schraedley January 2002 This document was prepared using SPSS versions 8 and 10 for the PC. Mac versions or other PC versions may look slightly different, but most instructions

### Technology Step-by-Step Using StatCrunch

Technology Step-by-Step Using StatCrunch Section 1.3 Simple Random Sampling 1. Select Data, highlight Simulate Data, then highlight Discrete Uniform. 2. Fill in the following window with the appropriate

### Course Objective This course is designed to give you a basic understanding of how to run regressions in SPSS.

SPSS Regressions Social Science Research Lab American University, Washington, D.C. Web. www.american.edu/provost/ctrl/pclabs.cfm Tel. x3862 Email. SSRL@American.edu Course Objective This course is designed

### Bill Burton Albert Einstein College of Medicine william.burton@einstein.yu.edu April 28, 2014 EERS: Managing the Tension Between Rigor and Resources 1

Bill Burton Albert Einstein College of Medicine william.burton@einstein.yu.edu April 28, 2014 EERS: Managing the Tension Between Rigor and Resources 1 Calculate counts, means, and standard deviations Produce

### Research Methods & Experimental Design

Research Methods & Experimental Design 16.422 Human Supervisory Control April 2004 Research Methods Qualitative vs. quantitative Understanding the relationship between objectives (research question) and

### PASS Sample Size Software

Chapter 250 Introduction The Chi-square test is often used to test whether sets of frequencies or proportions follow certain patterns. The two most common instances are tests of goodness of fit using multinomial

### Descriptive Statistics

Descriptive Statistics Primer Descriptive statistics Central tendency Variation Relative position Relationships Calculating descriptive statistics Descriptive Statistics Purpose to describe or summarize

### Projects Involving Statistics (& SPSS)

Projects Involving Statistics (& SPSS) Academic Skills Advice Starting a project which involves using statistics can feel confusing as there seems to be many different things you can do (charts, graphs,

### Chapter 23. Inferences for Regression

Chapter 23. Inferences for Regression Topics covered in this chapter: Simple Linear Regression Simple Linear Regression Example 23.1: Crying and IQ The Problem: Infants who cry easily may be more easily

### Reporting Statistics in Psychology

This document contains general guidelines for the reporting of statistics in psychology research. The details of statistical reporting vary slightly among different areas of science and also among different

### The Chi-Square Test. STAT E-50 Introduction to Statistics

STAT -50 Introduction to Statistics The Chi-Square Test The Chi-square test is a nonparametric test that is used to compare experimental results with theoretical models. That is, we will be comparing observed

### PRE-SERVICE SCIENCE AND PRIMARY SCHOOL TEACHERS PERCEPTIONS OF SCIENCE LABORATORY ENVIRONMENT

PRE-SERVICE SCIENCE AND PRIMARY SCHOOL TEACHERS PERCEPTIONS OF SCIENCE LABORATORY ENVIRONMENT Gamze Çetinkaya 1 and Jale Çakıroğlu 1 1 Middle East Technical University Abstract: The associations between

### NCSS Statistical Software Principal Components Regression. In ordinary least squares, the regression coefficients are estimated using the formula ( )

Chapter 340 Principal Components Regression Introduction is a technique for analyzing multiple regression data that suffer from multicollinearity. When multicollinearity occurs, least squares estimates

### Chapter Seven. Multiple regression An introduction to multiple regression Performing a multiple regression on SPSS

Chapter Seven Multiple regression An introduction to multiple regression Performing a multiple regression on SPSS Section : An introduction to multiple regression WHAT IS MULTIPLE REGRESSION? Multiple

### Chapter 7 Section 7.1: Inference for the Mean of a Population

Chapter 7 Section 7.1: Inference for the Mean of a Population Now let s look at a similar situation Take an SRS of size n Normal Population : N(, ). Both and are unknown parameters. Unlike what we used

### Sydney Roberts Predicting Age Group Swimmers 50 Freestyle Time 1. 1. Introduction p. 2. 2. Statistical Methods Used p. 5. 3. 10 and under Males p.

Sydney Roberts Predicting Age Group Swimmers 50 Freestyle Time 1 Table of Contents 1. Introduction p. 2 2. Statistical Methods Used p. 5 3. 10 and under Males p. 8 4. 11 and up Males p. 10 5. 10 and under

### There are six different windows that can be opened when using SPSS. The following will give a description of each of them.

SPSS Basics Tutorial 1: SPSS Windows There are six different windows that can be opened when using SPSS. The following will give a description of each of them. The Data Editor The Data Editor is a spreadsheet

### UNDERSTANDING THE ONE-WAY ANOVA

UNDERSTANDING The One-way Analysis of Variance (ANOVA) is a procedure for testing the hypothesis that K population means are equal, where K >. The One-way ANOVA compares the means of the samples or groups

### A full analysis example Multiple correlations Partial correlations

A full analysis example Multiple correlations Partial correlations New Dataset: Confidence This is a dataset taken of the confidence scales of 41 employees some years ago using 4 facets of confidence (Physical,

### Statistical Analysis Using SPSS for Windows Getting Started (Ver. 2014/11/6) The numbers of figures in the SPSS_screenshot.pptx are shown in red.

Statistical Analysis Using SPSS for Windows Getting Started (Ver. 2014/11/6) The numbers of figures in the SPSS_screenshot.pptx are shown in red. 1. How to display English messages from IBM SPSS Statistics

### Statistics and research

Statistics and research Usaneya Perngparn Chitlada Areesantichai Drug Dependence Research Center (WHOCC for Research and Training in Drug Dependence) College of Public Health Sciences Chulolongkorn University,

### Variables and Data A variable contains data about anything we measure. For example; age or gender of the participants or their score on a test.

The Analysis of Research Data The design of any project will determine what sort of statistical tests you should perform on your data and how successful the data analysis will be. For example if you decide

### Chapter 16 Appendix. Nonparametric Tests with Excel, JMP, Minitab, SPSS, CrunchIt!, R, and TI-83-/84 Calculators

The Wilcoxon Rank Sum Test Chapter 16 Appendix Nonparametric Tests with Excel, JMP, Minitab, SPSS, CrunchIt!, R, and TI-83-/84 Calculators These nonparametric tests make no assumption about Normality.

### Two Related Samples t Test

Two Related Samples t Test In this example 1 students saw five pictures of attractive people and five pictures of unattractive people. For each picture, the students rated the friendliness of the person

### Simple linear regression

Simple linear regression Introduction Simple linear regression is a statistical method for obtaining a formula to predict values of one variable from another where there is a causal relationship between

### SPSS Workbook 2 - Descriptive Statistics

TEESSIDE UNIVERSITY SCHOOL OF HEALTH & SOCIAL CARE SPSS Workbook 2 - Descriptive Statistics Includes: Recoding variables Cronbachs Alpha Module Leader:Sylvia Storey Phone:016420384969 s.storey@tees.ac.uk

### ANOVA ANOVA. Two-Way ANOVA. One-Way ANOVA. When to use ANOVA ANOVA. Analysis of Variance. Chapter 16. A procedure for comparing more than two groups

ANOVA ANOVA Analysis of Variance Chapter 6 A procedure for comparing more than two groups independent variable: smoking status non-smoking one pack a day > two packs a day dependent variable: number of

### We are often interested in the relationship between two variables. Do people with more years of full-time education earn higher salaries?

Statistics: Correlation Richard Buxton. 2008. 1 Introduction We are often interested in the relationship between two variables. Do people with more years of full-time education earn higher salaries? Do

### KSTAT MINI-MANUAL. Decision Sciences 434 Kellogg Graduate School of Management

KSTAT MINI-MANUAL Decision Sciences 434 Kellogg Graduate School of Management Kstat is a set of macros added to Excel and it will enable you to do the statistics required for this course very easily. To

### Multivariate Analysis of Variance (MANOVA): I. Theory

Gregory Carey, 1998 MANOVA: I - 1 Multivariate Analysis of Variance (MANOVA): I. Theory Introduction The purpose of a t test is to assess the likelihood that the means for two groups are sampled from the

### 5. Correlation. Open HeightWeight.sav. Take a moment to review the data file.

5. Correlation Objectives Calculate correlations Calculate correlations for subgroups using split file Create scatterplots with lines of best fit for subgroups and multiple correlations Correlation The

### CHAPTER 11 CHI-SQUARE: NON-PARAMETRIC COMPARISONS OF FREQUENCY

CHAPTER 11 CHI-SQUARE: NON-PARAMETRIC COMPARISONS OF FREQUENCY The hypothesis testing statistics detailed thus far in this text have all been designed to allow comparison of the means of two or more samples

### Using SPSS to perform Chi-Square tests:

Using SPSS to perform Chi-Square tests: Graham Hole, January 2006: page 1: Using SPSS to perform Chi-Square tests: This handout explains how to perform the two types of Chi-Square test that were discussed

### Using Minitab for Regression Analysis: An extended example

Using Minitab for Regression Analysis: An extended example The following example uses data from another text on fertilizer application and crop yield, and is intended to show how Minitab can be used to

### PSYCHOLOGY 320L Problem Set #3: One-Way ANOVA and Analytical Comparisons

PSYCHOLOGY 30L Problem Set #3: One-Way ANOVA and Analytical Comparisons Name: Score:. You and Dr. Exercise have decided to conduct a study on exercise and its effects on mood ratings. Many studies (Babyak

### Analysing Questionnaires using Minitab (for SPSS queries contact -) Graham.Currell@uwe.ac.uk

Analysing Questionnaires using Minitab (for SPSS queries contact -) Graham.Currell@uwe.ac.uk Structure As a starting point it is useful to consider a basic questionnaire as containing three main sections:

### 10. Analysis of Longitudinal Studies Repeat-measures analysis

Research Methods II 99 10. Analysis of Longitudinal Studies Repeat-measures analysis This chapter builds on the concepts and methods described in Chapters 7 and 8 of Mother and Child Health: Research methods.

### EPS 625 INTERMEDIATE STATISTICS FRIEDMAN TEST

EPS 625 INTERMEDIATE STATISTICS The Friedman test is an extension of the Wilcoxon test. The Wilcoxon test can be applied to repeated-measures data if participants are assessed on two occasions or conditions

### TABLE OF CONTENTS. About Chi Squares... 1. What is a CHI SQUARE?... 1. Chi Squares... 1. Hypothesis Testing with Chi Squares... 2

About Chi Squares TABLE OF CONTENTS About Chi Squares... 1 What is a CHI SQUARE?... 1 Chi Squares... 1 Goodness of fit test (One-way χ 2 )... 1 Test of Independence (Two-way χ 2 )... 2 Hypothesis Testing