Statistics in Psychosocial Research Lecture 8 Factor Analysis I. Lecturer: Elizabeth Garrett-Mayer
1 This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike License. Your use of this material constitutes acceptance of that license and the conditions of use of materials on this site. Copyright 2006, The Johns Hopkins University and Elizabeth Garrett-Mayer. All rights reserved. Use of these materials permitted only in accordance with license rights granted. Materials provided "AS IS"; no representations or warranties provided. User assumes all responsibility for use, and all liability related thereto, and must independently review all materials for accuracy and efficacy. May contain materials owned by others. User is responsible for obtaining permissions for use from third parties as needed.
2 Statistics in Psychosocial Research Lecture 8 Factor Analysis I Lecturer: Elizabeth Garrett-Mayer
3 Motivating Example: Frailty. We have a concept of what frailty is, but we can't measure it directly. We think it combines strength, weight, speed, agility, balance, and perhaps other factors. We would like to be able to describe the components of frailty with a summary of strength, weight, etc.
4 Factor Analysis: a data reduction tool. Removes redundancy or duplication from a set of correlated variables. Represents correlated variables with a smaller set of derived variables. Factors are formed that are relatively independent of one another. Two types of variables: latent variables (factors) and observed variables.
5 Frailty Variables Speed of fast walk (+) Upper extremity strength (+) Speed of usual walk (+) Pinch strength (+) Time to do chair stands (-) Grip strength (+) Arm circumference (+) Knee extension (+) Body mass index (+) Hip extension (+) Tricep skinfold thickness (+) Time to do Pegboard test (-) Shoulder rotation (+)
6 Other examples Diet Air pollution Personality Customer satisfaction Depression
7 Applications of Factor Analysis. 1. Identification of underlying factors: clusters variables into homogeneous sets; creates new variables (i.e., factors); allows us to gain insight into categories. 2. Screening of variables: identifies groupings that allow us to select one variable to represent many; useful in regression (recall collinearity).
8 Applications of Factor Analysis (continued). 3. Summary: allows us to describe many variables using a few factors. 4. Sampling of variables: helps select a small group of representative variables from a larger set. 5. Clustering of objects: helps us put objects (people) into categories depending on their factor scores.
9 "Perhaps the most widely used (and misused) multivariate [technique] is factor analysis. Few statisticians are neutral about this technique. Proponents feel that factor analysis is the greatest invention since the double bed, while its detractors feel it is a useless procedure that can be used to support nearly any desired interpretation of the data. The truth, as is usually the case, lies somewhere in between. Used properly, factor analysis can yield much useful information; when applied blindly, without regard for its limitations, it is about as useful and informative as Tarot cards. In particular, factor analysis can be used to explore the data for patterns, confirm our hypotheses, or reduce the many variables to a more manageable number." -- Norman and Streiner, PDQ Statistics
10 Orthogonal One-Factor Model (Classical Test Theory idea)
Ideal: X_1 = F + e_1, X_2 = F + e_2, ..., X_m = F + e_m, with var(e_j) = var(e_k) for j ≠ k.
Reality: X_1 = λ_1 F + e_1, X_2 = λ_2 F + e_2, ..., X_m = λ_m F + e_m, with var(e_j) ≠ var(e_k) for j ≠ k (unequal sensitivity to change in the factor).
(Related to Item Response Theory (IRT).)
11 Key Concepts. F is the latent (i.e., unobserved, underlying) variable. The X's are the observed (i.e., manifest) variables. e_j is the measurement error for X_j. λ_j is the loading for X_j.
12 Assumptions of the Factor Analysis Model. Measurement error has constant variance and is, on average, zero: Var(e_j) = σ_j², E(e_j) = 0. No association between the factor and measurement error: Cov(F, e_j) = 0. No association between errors: Cov(e_j, e_k) = 0. Local (i.e., conditional) independence: given the factor, the observed variables are independent of one another: Cov(X_j, X_k | F) = 0.
13 Brief Aside on Path Analysis. Local (i.e., conditional) independence: given the factor, the observed variables are independent of one another: Cov(X_j, X_k | F) = 0. [Path diagram: F has arrows with loadings λ_1, λ_2, λ_3 pointing to X_1, X_2, X_3, and each X_j has its own error arrow e_j.] The X's are only related to each other through their common relationship with F.
14 Optional Assumptions. We will make these to simplify our discussion. F is standardized (think "standard normal"): Var(F) = 1, E(F) = 0. The X's are standardized: in practice, this means that we will deal with correlations rather than covariances. This happens automatically when we use the correlation matrix in factor analysis, so it is not an extra step.
15 Some math associated with the ONE-FACTOR model. λ_j² is also called the communality of X_j in the one-factor case (notation: h_j²). For standardized X_j, Corr(F, X_j) = λ_j. The percent of variability in (standardized) X_j explained by F is λ_j² (like an R²). If X_j is N(0,1), then λ_j is equivalent to: the slope in a regression of X_j on F, and the correlation between F and X_j. Interpretation of λ_j: standardized regression coefficient (regression), path coefficient (path analysis), factor loading (factor analysis).
16 Some more math associated with the ONE-factor model. Corr(X_j, X_k) = λ_j λ_k. Note that the correlation between X_j and X_k is completely determined by the common factor (recall Cov(e_j, e_k) = 0). Factor loadings λ_j are equivalent to the correlations between the factor and the variables when only a SINGLE common factor is involved.
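A minimal numerical check of these identities, written as a sketch in Python/NumPy rather than the course's Stata (the loadings 0.8, 0.6, 0.4 are hypothetical, chosen only for illustration): simulate X_j = λ_j F + e_j with error variances set so each standardized X_j has variance 1, then compare the sample correlations to λ_j and λ_j λ_k.

```python
# Sketch: one-factor model with hypothetical loadings; checks Corr(F, X_j) ~ lambda_j
# and Corr(X_j, X_k) ~ lambda_j * lambda_k for standardized X's.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
lam = np.array([0.8, 0.6, 0.4])             # hypothetical loadings, not from the lecture

F = rng.standard_normal(n)                  # standardized common factor
err_sd = np.sqrt(1 - lam**2)                # chosen so each X_j has variance 1
X = F[:, None] * lam + rng.standard_normal((n, 3)) * err_sd

print(np.corrcoef(F, X[:, 0])[0, 1])        # ~ 0.80 = lambda_1
print(np.corrcoef(X[:, 0], X[:, 1])[0, 1])  # ~ 0.48 = lambda_1 * lambda_2
```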
17 Steps in Exploratory Factor Analysis (1) Collect and explore data: choose relevant variables. (2) Extract initial factors (via principal components) (3) Choose number of factors to retain (4) Choose estimation method, estimate model (5) Rotate and interpret (6) (a) Decide if changes need to be made (e.g. drop item(s), include item(s)) (b) repeat (4)-(5) (7) Construct scales and use in further analysis
18 Data Exploration. Histograms: normality, discreteness, outliers. Covariances and correlations between variables: any very high or very low correlations? Are the variables on the same scale, with high = good and low = bad?
19 Aside: Correlation vs. Covariance. More than 90% of factor analyses use the correlation matrix; fewer than 10% use the covariance matrix. We will focus on the correlation matrix because it is less confusing than switching between the two, and it is much more commonly used and more commonly applicable. Covariance does have its place (we'll address that next time).
20 Data Matrix. Factor analysis is totally dependent on the correlations between variables; it summarizes the correlation structure. [Diagram: the n × k data matrix (variables v_1, ..., v_k) is summarized by the k × k correlation matrix, which in turn is summarized by the k × j factor matrix (factors F_1, ..., F_j).] Implications for assumptions about the X's?
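A small sketch of this point in Python/NumPy (the simulated data simply mirror the 571 × 13 shape of the frailty example and are not the real data): once the correlation matrix is computed, it is the only input the factor analysis needs.

```python
# Sketch: the n x k data matrix is reduced to a k x k correlation matrix,
# which is all that factor analysis works from.
import numpy as np

rng = np.random.default_rng(1)
data = rng.standard_normal((571, 13))   # stand-in for the n x k data matrix
R = np.corrcoef(data, rowvar=False)     # k x k correlation matrix
print(R.shape)                          # (13, 13): this summary is what FA uses
```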
21 Frailty Example (N = 571). [Table: correlation matrix of the 13 frailty variables (arm_circ, skinfld, fastwalk, gripstr, pinchstr, upextstr, kneeext, hipext, shldrrot, pegbrd, bmi, uslwalk, chrstand); only one entry, 0.71 for skinfld with arm_circ, survived the transcription.]
22 One-Factor Model: X_1 = λ_1 F + e_1, X_2 = λ_2 F + e_2, ..., X_m = λ_m F + e_m.
23 One-Factor Frailty Solution
Variable loadings: arm_circ 0.28; skinfld 0.32; fastwalk 0.30; gripstr 0.32; pinchstr 0.31; upextstr 0.26; kneeext 0.33; hipext 0.26; shldrrot 0.21; pegbrd (value not preserved); bmi 0.24; uslwalk 0.28; chrstand (value not preserved).
These numbers represent the correlations between the common factor, F, and the input variables. Clearly, estimating F is part of the process.
24 More than One Factor: the m-factor orthogonal model (ORTHOGONAL = INDEPENDENT). Example: frailty has domains, including strength, flexibility, and speed. With m factors and n observed variables:
X_1 = λ_11 F_1 + λ_12 F_2 + ... + λ_1m F_m + e_1
X_2 = λ_21 F_1 + λ_22 F_2 + ... + λ_2m F_m + e_2
...
X_n = λ_n1 F_1 + λ_n2 F_2 + ... + λ_nm F_m + e_n
25 More than One Factor. Matrix notation: X_{n×1} = Λ_{n×m} F_{m×1} + e_{n×1}. Same general assumptions as the one-factor model: corr(F_s, X_j) = λ_js. Plus: corr(F_s, F_r) = 0 for all s ≠ r (i.e., orthogonal); this forced independence simplifies the covariance structure: corr(X_i, X_j) = λ_i1 λ_j1 + λ_i2 λ_j2 + λ_i3 λ_j3 + .... For details of dependent (correlated) factors, see Kim and Mueller.
26 Matrix notation: X_{n×1} = Λ_{n×m} F_{m×1} + e_{n×1}, written out as
\begin{pmatrix} X_1 \\ \vdots \\ X_n \end{pmatrix} = \begin{pmatrix} \lambda_{11} & \cdots & \lambda_{1m} \\ \vdots & \ddots & \vdots \\ \lambda_{n1} & \cdots & \lambda_{nm} \end{pmatrix} \begin{pmatrix} F_1 \\ \vdots \\ F_m \end{pmatrix} + \begin{pmatrix} e_1 \\ \vdots \\ e_n \end{pmatrix}
27 Factor Matrix
Λ_{n×m} = \begin{pmatrix} \lambda_{11} & \lambda_{12} & \cdots & \lambda_{1m} \\ \lambda_{21} & \lambda_{22} & \cdots & \lambda_{2m} \\ \vdots & \vdots & \ddots & \vdots \\ \lambda_{n1} & \lambda_{n2} & \cdots & \lambda_{nm} \end{pmatrix}
Columns represent derived factors; rows represent input variables. Loadings represent the degree to which each of the variables correlates with each of the factors, and range from -1 to 1. Inspection of the factor loadings reveals the extent to which each of the variables contributes to the meaning of each of the factors. High loadings provide the meaning and interpretation of the factors (~ regression coefficients).
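To make the implied correlation structure concrete, here is a short sketch (Python/NumPy, with made-up loadings rather than the lecture's estimates) showing that under the orthogonal m-factor model the model-implied correlation matrix is ΛΛ' plus a diagonal matrix of uniquenesses, so each off-diagonal entry is the sum of products of loadings, matching the formula above.

```python
# Sketch: implied correlation matrix R = Lambda Lambda' + Psi under the
# orthogonal m-factor model, with illustrative loadings.
import numpy as np

Lam = np.array([[0.8, 0.1],
                [0.7, 0.2],
                [0.1, 0.9],
                [0.2, 0.6]])                   # 4 variables, 2 factors (hypothetical)
uniqueness = 1 - (Lam**2).sum(axis=1)          # diagonal of Psi
R_implied = Lam @ Lam.T + np.diag(uniqueness)

print(np.diag(R_implied))                      # all 1's for standardized X's
print(R_implied[0, 1], Lam[0] @ Lam[1])        # off-diagonal = sum_s lambda_1s * lambda_2s
```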
28 Frailty Example. [Table: loadings of the 13 variables (arm_circ through chrstand) on four factors, labeled size (1), speed (2), hand strength (3), and leg strength (4), plus a uniqueness column; numerical values not preserved in the transcription.]
29 Communalities. The communality of X_j is the proportion of the variance of X_j explained by the m common factors: Comm(X_j) = Σ_{i=1}^{m} λ_{ji}². Recall the one-factor model: what was the interpretation of λ_j²? There, Comm(X_j) = λ_j². In other words, the communality can be thought of as the sum of squared multiple-correlation coefficients between X_j and the factors. Uniqueness(X_j) = 1 - Comm(X_j).
30 Communality of X_j. The common part of the variance: the covariance between X_j and the part of X_j due to the underlying factors. For standardized X_j: 1 = communality + uniqueness, so uniqueness = 1 - communality. Can think of uniqueness = var(e_j). If X_j is informative, its communality is high; if X_j is not informative, its uniqueness is high. Intuitively: variables with high communality share more in common with the rest of the variables.
31 Communalities. For unstandardized X's: Var(X) = Var(F) + Var(e), i.e., Var(X) = communality + uniqueness (the communality is the part due to F, the uniqueness is Var(e)). How can Var(X) = Var(F) = 1 when using standardized variables? That would seem to imply that Var(e) = 0. After Var(F) is derived, F is then standardized to have variance 1; it is a two-step procedure. Actual variances are irrelevant when using correlations and/or standardized X's.
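A hedged illustration of the communality as an R² (a Python/NumPy simulation with hypothetical loadings, not the lecture's estimates): regressing a standardized X_j on the orthogonal factors should give an R² close to the sum of squared loadings for that variable, with uniqueness = 1 - communality.

```python
# Sketch: communality of X_1 equals the R^2 from regressing X_1 on the factors.
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
Lam = np.array([[0.8, 0.3],
                [0.2, 0.7]])                    # hypothetical loadings (2 variables, 2 factors)
uniq = 1 - (Lam**2).sum(axis=1)                 # uniqueness = 1 - communality

F = rng.standard_normal((n, 2))                 # two orthogonal factors
X = F @ Lam.T + rng.standard_normal((n, 2)) * np.sqrt(uniq)

beta, *_ = np.linalg.lstsq(F, X[:, 0], rcond=None)
r2 = 1 - (X[:, 0] - F @ beta).var() / X[:, 0].var()
print(r2, (Lam[0]**2).sum())                    # both ~ 0.73, the communality of X_1
```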
32 How many factors? Intuitively: the number of uncorrelated constructs that are jointly measured by the X's. Only useful if the number of factors is less than the number of X's (recall "data reduction"). Identifiability: is there enough information in the data to estimate all of the parameters in the factor analysis? We may be constrained to a certain number of factors.
33 Choosing the Number of Factors. Use principal components to help decide. In this type of factor analysis, the number of factors is equal to the number of variables, and each factor is a weighted combination of the input variables: F_1 = a_11 X_1 + a_12 X_2 + .... Recall: for a factor analysis, generally, X_1 = a_11 F_1 + a_12 F_2 + ....
34 Estimating Principal Components. The first PC is the linear combination with maximum variance; that is, it finds the vector a_1 that maximizes Var(F_1) = Var(a_1ᵀX) = a_1ᵀ Cov(X) a_1 (correlation can be used instead; the equation just looks more complicated), constrained such that Σ a_1j² = 1. First PC: the linear combination a_1ᵀX that maximizes Var(a_1ᵀX) such that Σ a_1j² = 1. Second PC: the linear combination a_2ᵀX that maximizes Var(a_2ᵀX) such that Σ a_2j² = 1 AND Corr(a_1ᵀX, a_2ᵀX) = 0. And so on.
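In practice this maximization is solved by an eigendecomposition of Cov(X) (or Corr(X)): the weight vectors a_1, a_2, ... are the eigenvectors, and Var(a_1ᵀX) equals the largest eigenvalue. A brief sketch in Python/NumPy with simulated data (not the frailty data):

```python
# Sketch: PC weights are eigenvectors of the correlation matrix, and the
# variance of the first PC equals the largest eigenvalue.
import numpy as np

rng = np.random.default_rng(3)
X = rng.standard_normal((1000, 5)) @ rng.standard_normal((5, 5))  # correlated variables
R = np.corrcoef(X, rowvar=False)

eigvals, eigvecs = np.linalg.eigh(R)        # eigh returns ascending eigenvalues
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

a1 = eigvecs[:, 0]                          # first PC weights, sum(a1**2) = 1
Z = (X - X.mean(axis=0)) / X.std(axis=0)    # standardize, since we factored Corr(X)
F1 = Z @ a1
print(F1.var(), eigvals[0])                 # equal up to rounding
```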
35 Eigenvalues. To select how many factors to use, consider the eigenvalues from a principal components analysis. Two interpretations: an eigenvalue is roughly the equivalent number of variables which the factor represents; an eigenvalue is roughly the amount of variance in the data described by the factor. Rules to go by: number of eigenvalues > 1; scree plot; % variance explained; comprehensibility.
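A sketch of how the first two rules might be applied (Python/NumPy; the 13 eigenvalues below are made up to resemble a 13-variable problem and are not the frailty output):

```python
# Sketch: Kaiser's "eigenvalue > 1" rule and cumulative proportion of variance.
import numpy as np

eigvals = np.array([3.8, 2.1, 1.3, 1.05, 0.9, 0.8, 0.7, 0.6,
                    0.5, 0.45, 0.35, 0.25, 0.2])     # made-up eigenvalues (sum = 13)
n_kaiser = int((eigvals > 1).sum())                  # eigenvalues > 1 rule -> 4
cum_prop = np.cumsum(eigvals) / eigvals.sum()        # cumulative % variance explained
n_80 = int(np.searchsorted(cum_prop, 0.80)) + 1      # smallest m reaching 80% -> 7
print(n_kaiser, n_80, np.round(cum_prop, 2))
```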
36 Frailty Example (principal components; 13 components retained). [Table of component eigenvalues, differences, proportions, and cumulative proportions of variance explained; numerical values not preserved in the transcription.]
37 Scree Plot for Frailty Example. [Figure: scree plot of the eigenvalues against component number.]
38 First 6 factors in principal components. [Table of eigenvectors: the loadings of the 13 variables (arm_circ through chrstand) on the first 6 components; numerical values not preserved in the transcription.]
39 At this stage, don't worry about interpretation of the factors! The main concern is whether a smaller number of factors can account for the variability. The researcher (i.e., YOU) needs to: provide the number of common factors to be extracted, OR provide an objective criterion for choosing the number of factors (e.g., scree plot, % variability, etc.).
40 Rotation. In principal components, the first factor describes most of the variability. After choosing the number of factors to retain, we want to spread the variability more evenly among the factors. To do this we rotate the factors: redefine the factors such that loadings on the various factors tend to be very high (-1 or 1) or very low (0); intuitively, this makes sharper distinctions in the meanings of the factors. We use factor analysis for rotation, NOT principal components!
41 5 Factors, Unrotated. [Table of unrotated factor loadings and uniquenesses for the 13 variables on 5 factors; numerical values not preserved in the transcription.]
42 5 Factors, Rotated (varimax rotation). [Table of varimax-rotated factor loadings and uniquenesses for the 13 variables on 5 factors; numerical values not preserved.]
43 2 Factors, Unrotated. [Table of unrotated factor loadings and uniquenesses for the 13 variables on 2 factors; numerical values not preserved.]
44 2 Factors, Rotated (varimax rotation). [Table of varimax-rotated factor loadings and uniquenesses for the 13 variables on 2 factors; numerical values not preserved.]
45 Unique Solution? The factor analysis solution is NOT unique! More than one solution will yield the same result. We will understand this better by the end of the lecture.
46 Rotation (continued). Uses the ambiguity, or non-uniqueness, of the solution to make interpretation simpler. Where does the ambiguity come in? The unrotated solution is based on the idea that each factor tries to maximize the variance explained, conditional on the previous factors. What if we take that away? Then there is not one best solution; all solutions are relatively the same. The goal is simple structure. Most construct validation assumes simple (typically rotated) structure. Rotation does NOT improve fit!
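The "rotation does not improve fit" point can be checked directly: for any orthogonal rotation matrix T, the rotated loadings ΛT reproduce the same implied correlations and the same uniquenesses. A small sketch (Python/NumPy, illustrative loadings):

```python
# Sketch: rotational indeterminacy -- (Lambda T)(Lambda T)' = Lambda Lambda',
# so fit (implied correlations, communalities, uniquenesses) is unchanged.
import numpy as np

Lam = np.array([[0.8, 0.1],
                [0.7, 0.2],
                [0.1, 0.9],
                [0.2, 0.6]])                # illustrative unrotated loadings
theta = 0.4                                 # any rotation angle
T = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
Lam_rot = Lam @ T                           # rotated loadings

print(np.allclose(Lam @ Lam.T, Lam_rot @ Lam_rot.T))      # implied correlations unchanged
print(np.allclose((Lam**2).sum(1), (Lam_rot**2).sum(1)))  # communalities unchanged
```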
47 Rotating Factors (Intuitively). [Figure: four variables plotted in the space of Factor 1 (F1) and Factor 2 (F2), before and after rotating the axes, with the corresponding patterns of high and near-zero loadings marked for each factor.]
48 Orthogonal vs. Oblique Rotation. Orthogonal: factors are independent. varimax: maximize squared loading variance across variables (sum over factors). quartimax: maximize squared loading variance across factors (sum over variables). Intuition: in the previous picture, there is a right angle between the axes. Note: uniquenesses remain the same!
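For the curious, a sketch of one standard way these criteria are optimized (Python/NumPy; this is the common SVD-based algorithm, not the lecture's or Stata's code, and the loadings are illustrative): gamma = 1 corresponds to varimax and gamma = 0 to quartimax.

```python
# Sketch: SVD-based orthogonal rotation of a loadings matrix toward simple structure.
import numpy as np

def orthogonal_rotate(L, gamma=1.0, max_iter=100, tol=1e-6):
    """Rotate loadings L (variables x factors); gamma=1 varimax, gamma=0 quartimax."""
    p, k = L.shape
    R = np.eye(k)
    d_old = 0.0
    for _ in range(max_iter):
        LR = L @ R
        grad = L.T @ (LR**3 - (gamma / p) * LR @ np.diag((LR**2).sum(axis=0)))
        u, s, vt = np.linalg.svd(grad)
        R = u @ vt                           # nearest orthogonal update
        d_new = s.sum()
        if d_new < d_old * (1 + tol):
            break
        d_old = d_new
    return L @ R

L = np.array([[0.6, 0.6], [0.7, 0.5], [0.5, -0.5], [0.6, -0.6]])  # illustrative
print(np.round(orthogonal_rotate(L, gamma=1.0), 2))  # loadings pushed toward 0 and +/-1
```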
49 Orthogonal vs. Oblique Rotation Oblique: Factors not independent. Change in angle. oblimin: minimize squared loading covariance between factors. promax: simplify orthogonal rotation by making small loadings even closer to zero. Target matrix: choose simple structure a priori. (see Kim and Mueller) Intuition: from previous picture, angle between axes is not necessarily a right angle. Note: Uniquenesses remain the same!
50 Promax Rotation: 5 Factors. [Table of promax-rotated factor loadings and uniquenesses for the 13 variables on 5 factors; numerical values not preserved in the transcription.]
51 Promax Rotation: 2 Factors. [Table of promax-rotated factor loadings and uniquenesses for the 13 variables on 2 factors; numerical values not preserved.]
52 Which to use? The choice is generally not critical. Interpretation with orthogonal rotation is simple because the factors are independent: loadings are correlations. Structure may appear simpler with oblique rotation, but the correlation between factors can be difficult to reconcile (you must deal with interactions, etc.). Theory? Are the conceptual meanings of the factors associated? Oblique: a loading is no longer interpretable as the covariance or correlation between a variable and a factor; there are 2 matrices: the pattern matrix (loadings) and the structure matrix (correlations). Stata: varimax, promax.
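A short sketch of the relationship between those two oblique-rotation matrices (Python/NumPy, illustrative numbers): the structure matrix of variable-factor correlations equals the pattern matrix of loadings post-multiplied by the factor correlation matrix Φ; with orthogonal factors (Φ = I) the two coincide.

```python
# Sketch: structure matrix = pattern matrix @ factor correlation matrix (oblique case).
import numpy as np

pattern = np.array([[0.80, 0.05],
                    [0.70, 0.10],
                    [0.05, 0.90],
                    [0.10, 0.60]])          # illustrative pattern (loading) matrix
Phi = np.array([[1.0, 0.4],
                [0.4, 1.0]])                # correlation between the oblique factors
structure = pattern @ Phi                   # variable-factor correlations
print(np.round(structure, 2))
```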
53 Steps in Exploratory Factor Analysis (1) Collect data: choose relevant variables. (2) Extract initial factors (via principal components) (3) Choose number of factors to retain (4) Choose estimation method, estimate model (5) Rotate and interpret (6) (a) Decide if changes need to be made (e.g. drop item(s), include item(s)) (b) repeat (4)-(5) (7) Construct scales and use in further analysis
54 Drop variables with uniqueness > 0.50 in the 5-factor model.
pca arm_circ skinfld fastwalk gripstr pinchstr kneeext bmi uslwalk
(obs=782)
(principal components; 8 components retained)
[Table of component eigenvalues, differences, proportions, and cumulative proportions, plus a scree plot of eigenvalues against component number; numerical values not preserved in the transcription.]
55 3 Factors, Varimax Rotated. [Table of rotated factor loadings and uniquenesses for the 8 retained variables on 3 factors, labeled weight (1), leg agility (2), and hand strength (3); numerical values not preserved in the transcription.]
56 2 Factors, Varimax Rotated. [Table of rotated factor loadings and uniquenesses for the 8 retained variables on 2 factors, labeled weight (1) and speed (2); numerical values not preserved.]