Comparing a Multiple Regression Model Across Groups
We might want to know whether a particular set of predictors leads to a multiple regression model that works equally effectively for two (or more) different groups (populations, treatments, cultures, social-temporal changes, etc.). Here's an example: while developing a multiple regression model to be used to select graduate students based on GRE scores, one of the faculty pointed out that it might not be a good idea to use the same model to select Experimental and Clinical graduate students.

The way to answer this question is a bit cumbersome, but can be very important to consider. Here's what we'll do:
- Split the file into a Clinical and an Experimental subfile.
- Run the multiple regression predicting graduate GPA from the three GRE scores for each subfile.
- Compare how well the predictor set predicts the criterion for the two groups using Fisher's Z-test.
- Compare the structure (weights) of the model for the two groups using Hotelling's t-test and the Meng et al. Z-test.

First we split the sample:

Data → Split File
Be sure "Organize output by groups" is marked and move the variable representing the groups into the "Groups Based on:" window. Any analysis you request will be done separately for all the groups defined by this variable.

Next, get the multiple regression for each group:

Analyze → Regression → Linear
Move graduate gpa into the "Dependent" window and grev, greq and grea into the "Independent(s)" window. Remember -- with the "split file" we did earlier, we'll get a separate model for each group.

SPSS Syntax
SORT CASES BY program.
SPLIT FILE SEPARATE BY program.
REGRESSION
  /STATISTICS COEFF OUTS R ANOVA CHANGE
  /DEPENDENT ggpa
  /METHOD=ENTER grev greq grea.
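If you want to check the split-group models outside SPSS, here is a minimal Python sketch using pandas and statsmodels. The file name and data layout are assumptions; the column names simply mirror the SPSS variables in the syntax above.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data file; columns mirror the SPSS variable names above
df = pd.read_csv("grad_admissions.csv")

models = {}
for group, sub in df.groupby("program"):            # plays the role of SORT CASES / SPLIT FILE BY program
    fit = smf.ols("ggpa ~ grev + greq + grea", data=sub).fit()
    models[group] = fit
    print("program =", group, " n =", len(sub), " R-square =", round(fit.rsquared, 3))
    print(fit.params)                                # constant and unstandardized b weights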
Here's the abbreviated output.

PROGRAM = Clinical (n = 64)
Model Summary: R = .835, R² = .698 (predictors: constant, Verbal, Quantitative and Analytic subscores; dependent variable: 1st year graduate gpa -- criterion variable).

PROGRAM = Experimental (n = 76)
Model Summary: R = .735, R² = .540 (same predictors and criterion).

[The Coefficients tables for the two groups -- unstandardized B, standardized Beta, t and Sig. for the constant and the three subscores -- are not reproduced here; the b weights are used in the COMPUTE statements below and the β weights are discussed in the interpretation.]

Comparing the Fit -- R² values -- of the two models

To compare the "fit" of this predictor set in each group we will use the FZT program to perform Fisher's Z-test to compare the R² values of .698 and .540. Remember the Fisher's Z part of the FZT program uses R (r) values! Applying the FZT program with ry1 = .835 & N1 = 64 and ry2 = .735 & N2 = 76 gives Z = 1.527, and so p > .05.

So, based upon these sample data we would conclude that the predictor set does equally well for both groups. But remember that this is not a powerful test, and these groups have rather small sample sizes for this test. We might want to re-evaluate this question based on a larger sample size.
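The Fisher's Z comparison is also easy to verify by hand. Here is a minimal sketch of the standard r-to-z test for two independent correlations, using only the Python standard library and the R and N values reported above.

import math

def fisher_z_independent(r1, n1, r2, n2):
    """Fisher's Z-test for two independent correlations (here, the two multiple R values)."""
    z1, z2 = math.atanh(r1), math.atanh(r2)           # r-to-z transforms
    se_diff = math.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))
    return (z1 - z2) / se_diff

# Clinical: R = .835, n = 64; Experimental: R = .735, n = 76
z = fisher_z_independent(.835, 64, .735, 76)
print(round(z, 3))    # about 1.527 -- below 1.96, so p > .05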
3 Comparing the "structure" of the two models. We want to work with the larger of the two groups, so that the test will have best sensitivity. So, first we have to tell SPSS that we want to analyze data only from Experimental students (program = 2). Data Select Cases Be sure "If condition is satisfied" is marked and click the "If " button Specify the variable and value that identifies the cases that are to be included in the analysis Next we have to construct a predicted criterion value from each group's model. Transform Compute
Finally we get the correlation of each model with the criterion and with each other (remember that the correlation between two models is represented by the correlation between their y' values). Because of the selection we did above, these analyses will be based only on data from the Experimental students.

Analyze → Correlate → Bivariate

SPSS Syntax
SELECT IF (program = 2).                                  <- select the Experimental cases
COMPUTE clinpred = ( *grea) + ( *greq) + (.007*grev).     <- y' based on the Clinical model
COMPUTE exppred = ( *grea) + ( *greq) + ( *grev).         <- y' based on the Experimental model
CORR VARIABLES = ggpa clinpred exppred.                   <- correlations to compare the Clinical and Experimental group models

(The weights in each COMPUTE statement are the unstandardized b weights from that group's Coefficients table.)
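Continuing the hypothetical Python sketch from above, the same step is just applying each group's fitted weights to the Experimental cases and correlating the predictions with the criterion and with each other. The program codes and the models dictionary are assumptions carried over from the earlier sketch.

# Keep only the Experimental cases, as SELECT IF (program = 2) does above.
# Assumed coding: program 1 = Clinical, program 2 = Experimental (the handout only states 2 = Experimental).
exp = df[df["program"] == 2].copy()

exp["clinpred"] = models[1].predict(exp)    # y' from the Clinical model applied to Experimental cases (crossed)
exp["exppred"] = models[2].predict(exp)     # y' from the Experimental model (direct)

# Correlations of the criterion with each model's y' and of the two models with each other
print(exp[["ggpa", "clinpred", "exppred"]].corr().round(3))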
Here's the output. The Correlations table (1st year graduate gpa, EXPPRED, CLINPRED; Pearson correlations, 2-tailed Sig. and N) gives three values we need:

- Direct R = .735 -- the correlation of ggpa with exppred; the same as the R from the original multiple regression analysis of the Experimental data above.
- Crossed R = .532 -- the correlation of ggpa with clinpred; what you get when you apply the weights from the Clinical sample multiple regression model to the Experimental sample.
- Model correlation = .724 -- the correlation of clinpred with exppred; remember that the correlation between two models is represented by the correlation between their y' values.

Remember that the Hotelling's t / Steiger's Z part of the FZT program uses R (r) values! Applying the FZT program with ry1 = .735, ry2 = .532, r12 = .724 and N = 76 gives t = 3.44 & Z = 3.22.

Based on this we would conclude there are structural differences between the best multiple regression model for predicting 1st year GPA for Clinical and Experimental graduate students. Inspection of the standardized weights of the two regression models suggests that all three predictors are important for predicting Experimental students' grades, with something of an emphasis on the analytic subscale. For the Clinical students, the quant subscale seems the most important, with a lesser contribution by the analytic (and don't get too brave about ignoring the verbal -- remember the sample size is small).
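The FZT program's internals aren't shown here, but the two tests it reports can be reproduced with standard formulas for comparing dependent correlations that share one variable. The sketch below assumes Hotelling's t and the Meng, Rosenthal & Rubin (1992) Z; with the values above it returns t ≈ 3.44 and Z ≈ 3.22, matching the handout.

import math

def hotelling_t(ry1, ry2, r12, n):
    """Hotelling's t for two dependent correlations that share the criterion y (df = n - 3)."""
    det = 1 - ry1**2 - ry2**2 - r12**2 + 2 * ry1 * ry2 * r12    # determinant of the 3 x 3 correlation matrix
    return (ry1 - ry2) * math.sqrt((n - 3) * (1 + r12) / (2 * det))

def meng_z(ry1, ry2, r12, n):
    """Meng, Rosenthal & Rubin (1992) Z for the same comparison."""
    z1, z2 = math.atanh(ry1), math.atanh(ry2)
    r2bar = (ry1**2 + ry2**2) / 2
    f = min((1 - r12) / (2 * (1 - r2bar)), 1.0)
    h = (1 - f * r2bar) / (1 - r2bar)
    return (z1 - z2) * math.sqrt((n - 3) / (2 * (1 - r12) * h))

# direct R = .735, crossed R = .532, correlation between the two y' sets = .724, N = 76
print(round(hotelling_t(.735, .532, .724, 76), 2))   # about 3.44
print(round(meng_z(.735, .532, .724, 76), 2))        # about 3.22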
Examining Individual Predictors for Between-Group Differences in Contribution

Asking if a single predictor has a different regression weight for two different groups is equivalent to asking if there is an interaction between that predictor and group membership. (Please note that asking about a regression slope difference and asking about a correlation difference are two different things -- you already know how to use Fisher's Z-test to compare correlations across groups.) This approach uses a single model, applied to the full sample:

    Criterion = b1*predictor + b2*group + b3*(predictor*group) + a

If b3 is significant, then there is a difference between the predictor's regression weights for the two groups. However, this approach gets cumbersome when applied to models with multiple predictors. With 3 predictors we would look at the model below; each interaction term is designed to tell us if a particular predictor has a regression slope difference across the groups.

    y' = b1*group + b2*p1 + b3*(group*p1) + b4*p2 + b5*(group*p2) + b6*p3 + b7*(group*p3) + a

Because the collinearity among the interaction terms, and between a predictor's term and the other predictors' interaction terms, all influence the interaction b weights, there has been dissatisfaction with how well this approach works for multiple predictors. Also, because this approach does not involve constructing different models for each group, it does not allow the comparison of the fit of the two models or an examination of the substitutability of the two models.

Another approach is to apply a significance test to each predictor's b weights from the two models, to directly test for a significant difference. (Again, this is different from comparing the same correlation from the 2 groups.) However, there are competing formulas for SE b-difference. Here is the most common (e.g., Cohen, 1983):

    Z = (b_G1 - b_G2) / SE_b-difference

    SE²_b-difference = [ (df_bG1 * SE²_bG1) + (df_bG2 * SE²_bG2) ] / (df_bG1 + df_bG2)

Note: When SEbs aren't available they can be calculated as SEb = b / t.

However, work by two research groups has demonstrated that, for large sample studies (both N > 30), this standard error estimator is negatively biased (produces error estimates that are too small), so that the resulting Z-values are too large, promoting Type I & Type III errors (Brame, Paternoster, Mazerolle & Piquero, 1998; Clogg, Petkova & Haritou, 1995). This leads to the formulas:

    SE_b-difference = sqrt( SE²_bG1 + SE²_bG2 )    and    Z = (b_G1 - b_G2) / sqrt( SE²_bG1 + SE²_bG2 )

Remember: Just because the weight from one model is significant and the weight from another model is non-significant does not mean that the two weights are significantly different!!! You must apply this test to determine if they are significantly different!

Here are the results of applying this test to each predictor, using the b and SEb values from the two groups' Coefficients tables:

    Predictor        Z (Brame/Clogg)      p
    Analytic              4.0           < .001
    Quantitative          3.50          < .001
    Verbal                1.309           .19

The results show that both Analytic and Quantitative have significantly different regression weights in the Clinical and Experimental samples, while Verbal has equivalent regression weights in the two groups.
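Both SE b-difference formulas are easy to code directly. Here is a small sketch; the numbers in the example calls are placeholders, not values from the handout.

import math

def z_b_diff_pooled(b1, se1, df1, b2, se2, df2):
    """Cohen (1983) pooled-SE version; negatively biased, so Z tends to run too large."""
    se_diff = math.sqrt((df1 * se1**2 + df2 * se2**2) / (df1 + df2))
    return (b1 - b2) / se_diff

def z_b_diff(b1, se1, b2, se2):
    """Brame et al. (1998) / Clogg et al. (1995) version: SE_diff = sqrt(SE1^2 + SE2^2)."""
    return (b1 - b2) / math.sqrt(se1**2 + se2**2)

# Placeholder numbers only -- substitute a predictor's b and SEb from each group's Coefficients table
print(round(z_b_diff_pooled(0.50, 0.10, 60, 0.20, 0.08, 72), 2))
print(round(z_b_diff(0.50, 0.10, 0.20, 0.08), 2))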
Example write-up of these analyses (which used some univariate and correlation info not shown above):

A series of regression analyses were run to examine the relationships between graduate school grade point average (GGPA) and the Verbal (V), Quantitative (Q) and Analytic (A) subscales, and to compare the models derived from the Clinical and Experimental programs. Table 1 shows the univariate statistics, correlations of each variable with GGPA, and the multiple regression weights for the two programs. For the Clinical Program students this model had an R² = .698, F(3,60) = 4.35, p < .001, with Q and A having significant regression weights and Q seeming to have the major contribution (based on inspection of the β weights). For the Experimental Program students this model had an R² = .54, F(3,72) = 47.53, p < .001, with all three predictors having significant regression weights and A seeming to have the major contribution (based on inspection of the β weights).

Comparison of the fit of the models from the Clinical and Experimental programs revealed that there was no significant difference between the respective R² values, Z = 1.527, p > .05. A comparison of the structure of the models from the two groups was also conducted by applying the model derived from the Clinical Program to the data from the Experimental Program and comparing the resulting crossed R² with the direct R² originally obtained from this group. The direct R² = .54 and crossed R² = .283 were significantly different, Z = 3.22, p < .01, which indicates that the apparent differential structure of the regression weights from the two groups described above warrants further interpretation and investigation. Further analyses revealed that both Analytic and Quantitative have significantly different regression weights in the Clinical and Experimental samples (Z = 4.0, p < .001 & Z = 3.50, p < .001, respectively), while Verbal has equivalent regression weights in the two groups (Z = 1.309, p = .19).

Table 1
Summary statistics, correlations and multiple regression weights from the Clinical and Experimental program participants.
[Table body not reproduced: for each program the columns are mean, std, r with GGPA, b and β; the rows are GGPA, V, Q, A and the constant.]
* p < .05   ** p < .01
