Common factor analysis
- Sharleen Anthony
- 10 years ago
2 Common factor analysis
This is what people generally mean when they say "factor analysis". This family of techniques uses an estimate of common variance among the original variables to generate the factor solution. It is based on the fundamental assumption that some underlying factors, which are smaller in number than the observed variables, are responsible for the covariation among them.
3 Questions to answer
- How many different factors are needed to explain the pattern of relationships among these variables?
- What is the nature of those factors?
- How well do the hypothesized factors explain the observed data?
- How much purely random or unique variance does each observed variable include?
As with previous techniques we have discussed, the goal is dimension reduction, i.e. to describe a number of variables in a simpler form.
4 Issues
Factor solutions in the exploratory endeavor will differ depending on the data, the algorithm, and other researcher choices. The goal with exploratory factor analysis is to discover structure, not determine it. There are also differences in reporting, such that one may see quite different results from sample to sample and study to study.
5 Factor Analysis
There are four basic steps:
- data collection and generation of the correlation matrix
- extraction of an initial factor solution
- rotation and interpretation (also validation)
- construction of scales or factor scores to use in further analyses
A good factor:
- makes sense
- is easy to interpret
- possesses simple structure
- has items with low cross-loadings
6 Factor Analysis
Factor analysis can be seen as a family of techniques, of which both PCA and EFA are members. Factor analysis is a statistical approach that can be used to analyze interrelationships among a large number of variables and to explain these variables in terms of their common underlying dimensions (factors). It involves finding a way of condensing the information contained in a number of original variables into a smaller set of dimensions (factors) with a minimum loss of information.
7 Principal Components Analysis
Principal components analysis (PCA) is a statistical technique applied to a single set of variables to discover which variables in the set form coherent subsets that are independent of one another.
- It provides a unique solution, so that the original data (the covariance or correlation matrix) can be reconstructed from the results.
- It analyzes the total variance among the variables, so the solution generated will include as many factors/components as there are variables, although it is unlikely that they will all meet the criteria for retention.
- Variables that are correlated with one another, and largely independent of other subsets of variables, are combined into factors.
- The factors generated are thought to be representative of the underlying processes that have created the correlations among variables.
The underlying notion of PCA is that the observed variables can be transformed into linear combinations of an underlying set of hypothesized or unobserved components (factors). PCA is typically exploratory in nature.
8 Common factor model
PCA and common factor analysis may utilize a similar method and are conducted with similar goals in mind; the difference between PCA and common FA involves the underlying model. PCA assumes that all variance is common, which is akin to assuming the variables are perfectly reliable: all unique factors, i.e. sources of variability not attributable to a factor, are set equal to zero. The common factor model, on the other hand, holds that the observed variance in each measure is attributable to a relatively small set of common factors (latent characteristics common to two or more variables) and a single specific factor unrelated to any other underlying factor in the model.
9 Comparison of underlying models
PCA: extraction is the process of forming PCs as linear combinations of the measured variables, as we have done with our other techniques:
PC_1 = b_11 X_1 + b_21 X_2 + ... + b_k1 X_k
PC_2 = b_12 X_1 + b_22 X_2 + ... + b_k2 X_k
...
PC_f = b_1f X_1 + b_2f X_2 + ... + b_kf X_k
Common factor analysis: each measure X has two contributing sources of variation, the common factor ξ and the specific or unique factor δ:
X_1 = λ_1 ξ + δ_1
X_2 = λ_2 ξ + δ_2
...
X_f = λ_f ξ + δ_f
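The one-factor common factor model above is easy to illustrate with a small simulation; here is a minimal numpy sketch, with made-up loadings (not from any real dataset), showing that the implied correlation between two variables is the product of their loadings:

```python
import numpy as np

# One-factor common factor model: X_i = lam_i * xi + delta_i,
# with the common factor xi and unique factors delta_i independent.
rng = np.random.default_rng(0)
lam = np.array([0.9, 0.8, 0.7])                  # hypothetical loadings
n = 100_000
xi = rng.standard_normal(n)                      # common factor scores
delta = rng.standard_normal((n, 3)) * np.sqrt(1 - lam**2)  # unique factors
X = xi[:, None] * lam + delta                    # standardized observed variables

# The implied correlation between X_i and X_j is lam_i * lam_j
R = np.corrcoef(X, rowvar=False)
print(R[0, 1])   # close to 0.9 * 0.8 = 0.72
```

Scaling the unique factors by sqrt(1 - λ²) makes each X_i have unit variance, so the correlations can be read straight off the loadings.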
10 Example
Consider the following example from Holzinger and Swineford (1939). Ol' skool, yeah! Variables:
- Paragraph comprehension
- Sentence completion
- Word meaning
- Addition
- Counting dots
Each person's score is a reflection of the weighted combination of the common factor (latent variable) and measurement error (uniqueness, unreliability). In these equations, λ represents the extent to which each measure reflects the underlying common factor.
11 Factor Analysis
When standardized, the variance can be decomposed as follows:
var(X_i) = var(λ_i ξ + δ_i) = λ_i² + var(δ_i) = 1
λ_i, in unsquared form, is now interpretable as a correlation coefficient, and its square as the proportion of the variation in X accounted for by the common factor, i.e. the communality. The remainder is that which is accounted for by the specific factor or uniqueness (individual differences, measurement error, some other known factor, e.g. intelligence):
var(δ_i) = θ_ii,  1 - θ_ii = communality
12 Measurement error
The principal axis extraction method of EFA can be distinguished from PCA simply in terms of having communalities on the diagonal of the correlation matrix instead of 1s. What does this mean? Having 1s assumes each item/variable is perfectly reliable, i.e. there is no measurement error in its ability to distinguish among cases/individuals.
L = V'RV, where L = eigenvalue matrix, V = eigenvector matrix, R = correlation matrix
R = AA', where A = loading matrix
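The two matrix identities above can be checked numerically; a minimal numpy sketch, using a small made-up correlation matrix for illustration:

```python
import numpy as np

# Made-up correlation matrix for illustration
R = np.array([[1.0, 0.6, 0.3],
              [0.6, 1.0, 0.4],
              [0.3, 0.4, 1.0]])

evals, V = np.linalg.eigh(R)      # L = V'RV: eigenvalues/eigenvectors of R
A = V * np.sqrt(evals)            # loadings: eigenvectors scaled by sqrt(eigenvalues)
print(np.allclose(A @ A.T, R))    # True: with all components kept, R = AA'
```

With all components retained (the PCA case, 1s on the diagonal), the reconstruction is exact; the EFA case differs only in putting communality estimates on that diagonal.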
13 Measurement error
The fact that our estimates in psych are not perfectly reliable suggests that we use methods that take this into account. This is a reason to use EFA over PCA in cases involving measurement scales or items, as the communalities can be seen as a lower-bound (i.e. conservative) estimate of the variables' reliability. Note, however, that low communalities are not interpreted as evidence of poor fit so much as evidence that the variables analyzed have little in common with one another, and thus are not reliable measures of a proposed factor solution.
14 Two factor solution
From our example before, we could have selected a two factor solution: quantitative vs. verbal reasoning. Now we have three sources of variation observed in test scores: two common factors and one unique factor. As before, each λ reflects the extent to which each common factor contributes to the variance of each test score. The communality for each variable is now λ_1² + λ_2².
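As a quick check of the arithmetic, with hypothetical loadings on the two factors:

```python
# Communality of a variable loading on two common factors
# (the loadings are hypothetical, chosen for illustration)
lam1, lam2 = 0.6, 0.5
h2 = lam1**2 + lam2**2           # communality: 0.36 + 0.25
u2 = 1 - h2                      # uniqueness
print(round(h2, 2), round(u2, 2))   # 0.61 0.39
```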
15 Analysis: the Correlation Matrices
Observed correlation matrix: note that the manner in which missing values are dealt with will determine the observed correlation matrix. In typical FA settings one may have quite a few missing values, so casewise deletion would not be appropriate; best would be to produce a correlation matrix resulting from some missing values analysis (e.g. the EM algorithm).
Reproduced correlation matrix: that which is produced by the factor solution, R = AA'. Recall that in PCA it is identical to the observed matrix. It is the product of the matrix of loadings and the transpose of the pattern matrix (partial loadings).
Residual correlation matrix: the difference between the two, R_res = R - AA'.
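The reproduced and residual matrices can be sketched with a truncated solution; here a PCA-style extraction keeping two components from a made-up correlation matrix:

```python
import numpy as np

# Made-up observed correlation matrix
R = np.array([[1.0, 0.6, 0.3],
              [0.6, 1.0, 0.4],
              [0.3, 0.4, 1.0]])

evals, V = np.linalg.eigh(R)
keep = np.argsort(evals)[::-1][:2]          # retain the 2 largest components
A = V[:, keep] * np.sqrt(evals[keep])       # loading matrix
R_rep = A @ A.T                             # reproduced correlation matrix
R_res = R - R_rep                           # residual correlation matrix
print(np.abs(R_res).max())                  # small if 2 components suffice
```

The smaller the residuals, the better the truncated solution reproduces the observed correlations.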
16 Analysis: is the data worth reducing?
The Kaiser-Meyer-Olkin measure of sampling adequacy: a statistic that indicates the proportion of variance in your variables that might be caused by common underlying factors. It is an index for comparing the magnitudes of the observed correlation coefficients to the magnitudes of the partial correlation coefficients. If two variables share a common factor with other variables, their partial correlation will be small once the factor is taken into account. High values (close to 1.0) generally indicate that a factor analysis may be useful with your data; if the value is less than 0.50, the results of the factor analysis probably won't be very useful.
Bartlett's test of sphericity: tests the hypothesis that your correlation matrix is an identity matrix (1s on the diagonal, 0s off the diagonal), which would indicate that your variables are unrelated and therefore unsuitable for structure detection.
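Bartlett's test is easy to compute directly. A hedged numpy sketch using the standard statistic -(n - 1 - (2p + 5)/6) ln|R|, referred to a chi-square distribution with p(p - 1)/2 degrees of freedom; the correlation matrix and sample size below are made up:

```python
import numpy as np

def bartlett_sphericity(R, n):
    """Bartlett's test of sphericity: H0 is that R is an identity matrix.
    Returns the chi-square statistic and its degrees of freedom."""
    p = R.shape[0]
    chi2 = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    df = p * (p - 1) / 2
    return chi2, df

R = np.array([[1.0, 0.6, 0.3],
              [0.6, 1.0, 0.4],
              [0.3, 0.4, 1.0]])
chi2, df = bartlett_sphericity(R, n=200)
print(chi2, df)   # large statistic: these variables are clearly correlated
```

If R were exactly the identity, |R| = 1 and the statistic would be 0; the further |R| falls below 1, the larger the statistic.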
17 Analysis: Extraction Methods
- Principal (axis) factors: estimates of communalities (SMCs) are in the diagonal and are used as starting values for the (iterative) communality estimation. Removes unique and error variance; the solution depends on the quality of the initial communality estimates.
- Maximum likelihood: a computationally intensive method for estimating loadings that maximize the likelihood of sampling the observed correlation matrix from a population.
- Unweighted least squares: minimizes the off-diagonal residuals between the reproduced and original R matrix.
- Generalized (weighted) least squares: also minimizes the off-diagonal residuals; variables with larger communalities are given more weight in the analysis.
- Alpha factoring: maximizes the reliability of the factors.
- Image factoring: minimizes unique factors consisting of essentially one measured variable.
18 Analysis: Rotation Methods
After extraction, initial interpretation may be difficult; rotation is used to improve interpretability and utility. Refer back to the PCA mechanics handout for the geometric interpretation of rotation. By placing a variable in the n-dimensional space specified by the factors involved, factor loadings are the cosine of the angle formed by the factor axis and a vector from the origin to that variable's coordinates. Note how PCA would be distinguished from multiple regression: PCA minimizes the squared distances to the axis, with each point mapping onto the axis at a right angle (as opposed to dropping straight down in MR). MR is inclined to account for variance in the DV, whereas PCA will tilt more toward whichever variable exhibits the most variance.
19 Analysis: Rotation Methods
So factors are the axes. In orthogonal rotation the factors are kept at right angles; oblique rotation allows for other angles. Oblique rotations often achieve simpler structure, though at the cost that you must also consider the factor inter-correlations when interpreting results. Repositioning the axes changes the loadings on the factors but keeps the relative positioning of the points the same. The squared length of the line from the origin to a variable's coordinates is equal to the communality for that variable.
20 Example: Rotation
Note that the variances of the two factors for the original and rotated solutions sum to the same amount.
21 Analysis: Rotation Methods
Orthogonal rotation keeps factors uncorrelated while increasing the interpretability of the factors.
- Varimax: the most popular. Cleans up the factors, making large loadings larger and small loadings smaller.
- Quartimax: cleans up the variables, so that each variable loads mainly on one factor. Varimax works on the columns of the loading matrix; quartimax works on the rows. Not used as often, as simplifying variables is not usually a goal.
- Equamax: a hybrid of the two that tries to simultaneously simplify factors and variables. Not that popular either.
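Varimax can be sketched with the standard SVD-based algorithm. The unrotated loadings below are hypothetical; note that, as on the previous slides, the rotation leaves each variable's communality (row sum of squared loadings) unchanged:

```python
import numpy as np

def varimax(A, tol=1e-8, max_iter=500):
    """SVD-based varimax: find an orthogonal rotation T maximizing the
    variance of squared loadings within each column of A @ T."""
    p, k = A.shape
    T = np.eye(k)
    d = 0.0
    for _ in range(max_iter):
        L = A @ T
        u, s, vt = np.linalg.svd(
            A.T @ (L**3 - L @ np.diag((L**2).sum(axis=0)) / p))
        T = u @ vt
        d_new = s.sum()
        if d_new < d * (1 + tol):   # criterion stopped improving
            break
        d = d_new
    return A @ T

# Hypothetical unrotated loadings: two clusters of variables
A = np.array([[0.70,  0.50],
              [0.75,  0.45],
              [0.65,  0.55],
              [0.60, -0.55],
              [0.55, -0.60],
              [0.65, -0.50]])
A_rot = varimax(A)
print(np.round(A_rot, 2))   # large and small loadings pushed apart
# Rotation preserves each variable's communality
print(np.allclose((A**2).sum(axis=1), (A_rot**2).sum(axis=1)))   # True
```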
22 Analysis: Rotation Methods
Oblique rotation techniques:
- Direct oblimin: begins with an unrotated solution. Has a parameter that allows the user to define the amount of correlation acceptable: values near -4 yield nearly orthogonal factors, 0 leads to mild correlations (also called direct quartimin), and values close to 1 yield highly correlated factors.
- Promax: the solution is orthogonally rotated initially (varimax); this is followed by an oblique rotation in which the orthogonal loadings are raised to powers in order to drive down small to moderate loadings.
23 Analysis: Orthogonal vs. Oblique output
Orthogonal rotation:
- Factor matrix: correlations between observed variables and factors for the unrotated solution.
- Structure matrix: loadings, i.e. structure coefficients; the correlations between observed variables and factors.
- Pattern matrix: standardized, partialled coefficients (weights, loadings). The structure and pattern matrices are the same in orthogonal solutions and so will not be distinguished.
- Factor score coefficient matrix: coefficients used to calculate factor scores (like regression coefficients) from the (standardized) original variables.
24 Analysis: Orthogonal vs. Oblique output
Oblique rotation:
- Factor matrix: correlations between observed variables and factors for the unrotated solution.
- Structure matrix: the simple correlations between factors and variables; the factor loading matrix.
- Pattern matrix: the unique relationship between each factor and variable, taking into account the correlation between the factors; the standardized regression coefficients from the common factor model. The more factors, the lower the pattern coefficients as a rule, since there will be more common contributions to the variance explained. For oblique rotation, the researcher looks at both the structure and pattern coefficients when attributing a label to a factor.
- Factor score coefficient matrix: again used to derive factor scores from the original variables.
- Factor correlation matrix: the correlations between the factors.
25 Analysis: Factor scores
Factor scores can be derived in a variety of ways, some of which are presented here.
- Regression: regression factor scores have a mean of 0 and variance equal to the squared multiple correlation between the estimated factor scores and the true factor values. They can be correlated even when factors are assumed to be orthogonal. The sum of squared residuals between true and estimated factors over individuals is minimized.
- Least squares: minimizes the squared residuals between estimated and true factor scores. Same approach as above, but uses the reproduced R matrix instead of the original.
- Bartlett: minimizes the effect of unique factors (consisting of single variables).
- Anderson-Rubin: same as Bartlett's, but produces orthogonal factor scores.
Once obtained, one can use factor scores in other analyses (recall PC regression).
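The regression method can be sketched directly: the factor score coefficient matrix is B = R⁻¹A, and scores are F = ZB for standardized data Z. The correlation matrix, loadings, and data below are all illustrative stand-ins, not estimates from a real analysis:

```python
import numpy as np

# Illustrative correlation matrix and one-factor loadings
R = np.array([[1.0, 0.6, 0.3],
              [0.6, 1.0, 0.4],
              [0.3, 0.4, 1.0]])
A = np.array([[0.8], [0.7], [0.5]])    # hypothetical loadings

B = np.linalg.solve(R, A)              # factor score coefficient matrix, R B = A
rng = np.random.default_rng(0)
Z = rng.standard_normal((10, 3))       # stand-in for standardized observed data
F = Z @ B                              # estimated factor scores, one per person
print(F.shape)                         # (10, 1)
```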
26 Other stuff: An iterative process
To get an initial estimate of the communalities, we can simply start with the squared multiple correlation coefficient (R²) for each item regressed on the remaining items. There are other approaches, but with this one we can see that if a measure were completely unreliable, its R² value would be zero. We run the EFA and come to a solution; now we can estimate the communalities as the sum of the squared loadings for an item across the factors. We then use these new estimates as communalities and rerun, and this is done until successive iterations are essentially identical, i.e. convergence is achieved. If convergence is not obtained, another method of factor extraction (e.g. PCA) must be utilized, or, if possible, the sample size increased.
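The iterative process described above can be sketched as follows. As a check, the correlation matrix at the bottom is built from a known one-factor model, so the procedure should recover the loadings that generated it:

```python
import numpy as np

def principal_axis(R, k, max_iter=1000, tol=1e-8):
    """Iterated principal axis factoring: start from SMC communalities,
    factor the reduced matrix, update the communalities, repeat."""
    h2 = 1 - 1 / np.diag(np.linalg.inv(R))     # SMC starting values
    for _ in range(max_iter):
        R_reduced = R.copy()
        np.fill_diagonal(R_reduced, h2)        # communalities on the diagonal
        evals, V = np.linalg.eigh(R_reduced)
        keep = np.argsort(evals)[::-1][:k]     # k largest eigenvalues
        A = V[:, keep] * np.sqrt(np.clip(evals[keep], 0, None))
        h2_new = (A**2).sum(axis=1)            # new communality estimates
        done = np.max(np.abs(h2_new - h2)) < tol
        h2 = h2_new
        if done:                                # successive iterations identical
            break
    return A, h2

# Population correlation matrix from a known one-factor model
lam = np.array([0.9, 0.8, 0.7, 0.6])
R = np.outer(lam, lam)
np.fill_diagonal(R, 1.0)
A, h2 = principal_axis(R, k=1)
print(np.round(np.abs(A[:, 0]), 3))    # recovers the loadings (up to sign)
```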
27 Measurement error vs. sampling error
Measurement error is the variance not attributable to the factor an observed variable purportedly represents (1 - reliability). Sampling error is the variability in estimates seen as we move from one sample to the next: just as every person is different, so would be every sample taken. Note that with successive iterations, we increase the likelihood that we are capitalizing on the unique sampling error associated with a given dataset, thus making our results less generalizable. As one might expect, with larger samples (and fewer factors to consider) we have less to worry about regarding sampling error, and so might allow for more iterations. Unfortunately there are no hard and fast rules regarding the limitation of iterations; however, you should be aware of the trade-off.
28 Other stuff: More on sample size
How big? Assume you'll need lots. From some simulation studies: 4 or more variables per factor with large structure coefficients (e.g. greater than .6) may work with even small samples; with more variables per factor and loadings around .4, a larger N is needed; otherwise, sample sizes over 300 are recommended. The larger the communalities (i.e. the more reliable the variables), the better off you are.
29 Other stuff: How many factors?
Refer back to the PCA notes; there are many ways to determine this. But recall that we are doing exploratory factor analysis. As such, just as we suggested with cluster analysis, go with the solution that makes the most sense. Also, how you interpret the factors is, as it was with previous analyses, entirely subjective.
30 Other stuff: Exploratory vs. Confirmatory
Exploratory FA:
- summarizing data by grouping correlated variables
- investigating sets of measured variables related to theoretical constructs
- usually done near the onset of research
- the type of FA and PCA we are talking about
Confirmatory FA:
- a more advanced technique, used when the factor structure is known or at least theorized
- testing generalization of the factor structure to new data, etc.
- tested through SEM methods
31 Concrete example
Beer data we did with PCA (see the appendix for code if you want to follow along). What influences a consumer's choice behavior when shopping for beer? 200 consumers are asked to rate, on a scale, how important they consider each of seven qualities when deciding whether or not to buy the six-pack:
- COST of the six-pack
- SIZE of the bottle (volume)
- Percentage of ALCOHOL in the beer
- REPUTATion of the brand
- COLOR of the beer
- AROMA of the beer
- TASTE of the beer
First perform a PCA with varimax rotation. In descriptives, check correlation coefficients and the KMO and Bartlett's test of sphericity. Make sure to select that the number of factors to be extracted equals the number of variables (7). For easier reading you may want to suppress loadings less than .3 in the options dialog box.
32 First we'll run a PCA and compare the results
As always, first get to know your correlation matrix. You should be aware of the simple relationships quite well, and one can already guess the factor structure that may be found: the first and last three variables correlate well within their groups, and reputation correlates moderately negatively with all the others. Our tests here indicate we'll be okay for further analysis.
[Correlation matrix and KMO and Bartlett's test output; the values were not preserved in this transcription.]
33 PCA
Recall that with PCA we extract all the variance. At this point it looks like we'll stick with two components/factors, which account for almost 85% of the variance.
[Communalities table (initial and extraction); extraction method: principal component analysis. Values not preserved.]
34 PCA
We've got some strong loadings here, but it's not easily interpretable; perhaps a rotation is in order. We'll do varimax and see what we come up with.
[Component matrix, 7 components extracted; extraction method: principal component analysis. Values not preserved.]
35 PCA
Just going by eigenvalues > 1, it looks like there may now be a third factor worth considering. Here, loadings < .3 have been suppressed. Ah, much nicer, and perhaps we'll go with a 3-factor interpretation:
- one factor related to practical concerns (how cheaply can I get drunk?)
- another related to aesthetic concerns (is it a good beer?)
- one factor that is simply reputation (will I look cool drinking it?)
[Rotated component matrix; extraction method: principal component analysis; rotation method: varimax with Kaiser normalization; rotation converged in 7 iterations. Values not preserved.]
36 Exploratory Factor Analysis
Now we'll try the EFA: principal axis factoring with varimax rotation. We'll now be taking measurement error into account, so the communalities will be different. If we just take the eigenvalues-greater-than-1 approach, we have 2 factors accounting for 80% of the total variance.
[Communalities table and total variance explained (initial eigenvalues, extraction sums of squared loadings); extraction method: principal axis factoring. Values not preserved.]
37 EFA
Here are our initial structure coefficients before rotation. Similar to before, not so easily interpreted. How about the rotated solution?
[Factor matrix, 2 factors extracted, 7 iterations required; extraction method: principal axis factoring. Values not preserved.]
38 EFA
Much better once again. But note that now we have the reputation variable loading on both factors, and negatively. As this might be difficult to incorporate into our interpretation, we may just stick to those variables that load highly. However, this is a good example of how you may end up with different results depending on whether you do PCA or EFA.
[Rotated factor matrix; extraction method: principal axis factoring; rotation method: varimax with Kaiser normalization; rotation converged in 3 iterations. Values not preserved.]
39 EFA vs. PCA
Again, the reason to use other methods of EFA rather than PCA is to take measurement error into account; in psych, this is the route one would typically want to take. Because of the lack of measurement error, the physical sciences typically do PCA. However, in many cases the interpretation will not change for the most part, and more so as more variables are involved: the communalities make up less of the total values in the correlation matrix as we add variables. For example, 5 variables give 10 correlations and 5 communalities; 10 variables give 45 correlations and 10 communalities. Some of the other EFA methods will not be viable with some datasets. Gist: in many situations you'll be fine with either, but perhaps you should have an initial preference for methods other than PCA.
Data analysis process
Data analysis process Data collection and preparation Collect data Prepare codebook Set up structure of data Enter data Screen data for errors Exploration of data Descriptive Statistics Graphs Analysis
Factor Analysis. Sample StatFolio: factor analysis.sgp
STATGRAPHICS Rev. 1/10/005 Factor Analysis Summary The Factor Analysis procedure is designed to extract m common factors from a set of p quantitative variables X. In many situations, a small number of
Subspace Analysis and Optimization for AAM Based Face Alignment
Subspace Analysis and Optimization for AAM Based Face Alignment Ming Zhao Chun Chen College of Computer Science Zhejiang University Hangzhou, 310027, P.R.China [email protected] Stan Z. Li Microsoft
What Are Principal Components Analysis and Exploratory Factor Analysis?
Statistics Corner Questions and answers about language testing statistics: Principal components analysis and exploratory factor analysis Definitions, differences, and choices James Dean Brown University
How To Run Factor Analysis
Getting Started in Factor Analysis (using Stata 10) (ver. 1.5) Oscar Torres-Reyna Data Consultant [email protected] http://dss.princeton.edu/training/ Factor analysis is used mostly for data reduction
Principal Component Analysis
Principal Component Analysis Principle Component Analysis: A statistical technique used to examine the interrelations among a set of variables in order to identify the underlying structure of those variables.
CHAPTER 4 EXAMPLES: EXPLORATORY FACTOR ANALYSIS
Examples: Exploratory Factor Analysis CHAPTER 4 EXAMPLES: EXPLORATORY FACTOR ANALYSIS Exploratory factor analysis (EFA) is used to determine the number of continuous latent variables that are needed to
Factor Analysis. Factor Analysis
Factor Analysis Principal Components Analysis, e.g. of stock price movements, sometimes suggests that several variables may be responding to a small number of underlying forces. In the factor model, we
What is Rotating in Exploratory Factor Analysis?
A peer-reviewed electronic journal. Copyright is retained by the first or sole author, who grants right of first publication to the Practical Assessment, Research & Evaluation. Permission is granted to
EM Clustering Approach for Multi-Dimensional Analysis of Big Data Set
EM Clustering Approach for Multi-Dimensional Analysis of Big Data Set Amhmed A. Bhih School of Electrical and Electronic Engineering Princy Johnson School of Electrical and Electronic Engineering Martin
How To Cluster
Data Clustering Dec 2nd, 2013 Kyrylo Bessonov Talk outline Introduction to clustering Types of clustering Supervised Unsupervised Similarity measures Main clustering algorithms k-means Hierarchical Main
Mehtap Ergüven Abstract of Ph.D. Dissertation for the degree of PhD of Engineering in Informatics
INTERNATIONAL BLACK SEA UNIVERSITY COMPUTER TECHNOLOGIES AND ENGINEERING FACULTY ELABORATION OF AN ALGORITHM OF DETECTING TESTS DIMENSIONALITY Mehtap Ergüven Abstract of Ph.D. Dissertation for the degree
Nonlinear Iterative Partial Least Squares Method
Numerical Methods for Determining Principal Component Analysis Abstract Factors Béchu, S., Richard-Plouet, M., Fernandez, V., Walton, J., and Fairley, N. (2016) Developments in numerical treatments for
Factor Analysis: Statnotes, from North Carolina State University, Public Administration Program. Factor Analysis
Factor Analysis Overview Factor analysis is used to uncover the latent structure (dimensions) of a set of variables. It reduces attribute space from a larger number of variables to a smaller number of
Does organizational culture cheer organizational profitability? A case study on a Bangalore based Software Company
Does organizational culture cheer organizational profitability? A case study on a Bangalore based Software Company S Deepalakshmi Assistant Professor Department of Commerce School of Business, Alliance
CHAPTER VI ON PRIORITY SECTOR LENDING
CHAPTER VI IMPACT OF PRIORITY SECTOR LENDING 6.1 PRINCIPAL FACTORS THAT HAVE DIRECT IMPACT ON PRIORITY SECTOR LENDING 6.2 ASSOCIATION BETWEEN THE PROFILE VARIABLES AND IMPACT OF PRIORITY SECTOR CREDIT
DATA ANALYSIS AND INTERPRETATION OF EMPLOYEES PERSPECTIVES ON HIGH ATTRITION
DATA ANALYSIS AND INTERPRETATION OF EMPLOYEES PERSPECTIVES ON HIGH ATTRITION Analysis is the key element of any research as it is the reliable way to test the hypotheses framed by the investigator. This
SPSS ADVANCED ANALYSIS WENDIANN SETHI SPRING 2011
SPSS ADVANCED ANALYSIS WENDIANN SETHI SPRING 2011 Statistical techniques to be covered Explore relationships among variables Correlation Regression/Multiple regression Logistic regression Factor analysis
Using R and the psych package to find ω
Using R and the psych package to find ω William Revelle Department of Psychology Northwestern University February 23, 2013 Contents 1 Overview of this and related documents 1 2 Install R and relevant packages
EXPLORATORY FACTOR ANALYSIS IN MPLUS, R AND SPSS. [email protected]
EXPLORATORY FACTOR ANALYSIS IN MPLUS, R AND SPSS Sigbert Klinke 1,2 Andrija Mihoci 1,3 and Wolfgang Härdle 1,3 1 School of Business and Economics, Humboldt-Universität zu Berlin, Germany 2 Department of
Research Methodology: Tools
MSc Business Administration Research Methodology: Tools Applied Data Analysis (with SPSS) Lecture 02: Item Analysis / Scale Analysis / Factor Analysis February 2014 Prof. Dr. Jürg Schwarz Lic. phil. Heidi
Example: Credit card default, we may be more interested in predicting the probabilty of a default than classifying individuals as default or not.
Statistical Learning: Chapter 4 Classification 4.1 Introduction Supervised learning with a categorical (Qualitative) response Notation: - Feature vector X, - qualitative response Y, taking values in C
9.2 User s Guide SAS/STAT. The FACTOR Procedure. (Book Excerpt) SAS Documentation
SAS/STAT 9.2 User s Guide The FACTOR Procedure (Book Excerpt) SAS Documentation This document is an individual chapter from SAS/STAT 9.2 User s Guide. The correct bibliographic citation for the complete
Factor Analysis - 2 nd TUTORIAL
Factor Analysis - 2 nd TUTORIAL Subject marks File sub_marks.csv shows correlation coefficients between subject scores for a sample of 220 boys. sub_marks
Multivariate Analysis
Table Of Contents Multivariate Analysis... 1 Overview... 1 Principal Components... 2 Factor Analysis... 5 Cluster Observations... 12 Cluster Variables... 17 Cluster K-Means... 20 Discriminant Analysis...
Reliability Analysis
Measures of Reliability Reliability Analysis Reliability: the fact that a scale should consistently reflect the construct it is measuring. One way to think of reliability is that other things being equal,
Component Ordering in Independent Component Analysis Based on Data Power
Component Ordering in Independent Component Analysis Based on Data Power Anne Hendrikse Raymond Veldhuis University of Twente University of Twente Fac. EEMCS, Signals and Systems Group Fac. EEMCS, Signals
STATISTICA Formula Guide: Logistic Regression. Table of Contents
: Table of Contents... 1 Overview of Model... 1 Dispersion... 2 Parameterization... 3 Sigma-Restricted Model... 3 Overparameterized Model... 4 Reference Coding... 4 Model Summary (Summary Tab)... 5 Summary
A Comparison of Variable Selection Techniques for Credit Scoring
1 A Comparison of Variable Selection Techniques for Credit Scoring K. Leung and F. Cheong and C. Cheong School of Business Information Technology, RMIT University, Melbourne, Victoria, Australia E-mail:
A Solution Manual and Notes for: Exploratory Data Analysis with MATLAB by Wendy L. Martinez and Angel R. Martinez.
A Solution Manual and Notes for: Exploratory Data Analysis with MATLAB by Wendy L. Martinez and Angel R. Martinez. John L. Weatherwax May 7, 9 Introduction Here you ll find various notes and derivations
Best Practices in Exploratory Factor Analysis: Four Recommendations for Getting the Most From Your Analysis
A peer-reviewed electronic journal. Copyright is retained by the first or sole author, who grants right of first publication to the Practical Assessment, Research & Evaluation. Permission is granted to
Partial Least Squares (PLS) Regression.
Partial Least Squares (PLS) Regression. Hervé Abdi 1 The University of Texas at Dallas Introduction Pls regression is a recent technique that generalizes and combines features from principal component
Manifold Learning Examples PCA, LLE and ISOMAP
Manifold Learning Examples PCA, LLE and ISOMAP Dan Ventura October 14, 28 Abstract We try to give a helpful concrete example that demonstrates how to use PCA, LLE and Isomap, attempts to provide some intuition
Least Squares Estimation
Least Squares Estimation SARA A VAN DE GEER Volume 2, pp 1041 1045 in Encyclopedia of Statistics in Behavioral Science ISBN-13: 978-0-470-86080-9 ISBN-10: 0-470-86080-4 Editors Brian S Everitt & David
Additional sources Compilation of sources: http://lrs.ed.uiuc.edu/tseportal/datacollectionmethodologies/jin-tselink/tselink.htm
Mgt 540 Research Methods Data Analysis 1 Additional sources Compilation of sources: http://lrs.ed.uiuc.edu/tseportal/datacollectionmethodologies/jin-tselink/tselink.htm http://web.utk.edu/~dap/random/order/start.htm
Medical Information Management & Mining. You Chen Jan,15, 2013 [email protected]
Medical Information Management & Mining You Chen Jan,15, 2013 [email protected] 1 Trees Building Materials Trees cannot be used to build a house directly. How can we transform trees to building materials?
How to Get More Value from Your Survey Data
Technical report How to Get More Value from Your Survey Data Discover four advanced analysis techniques that make survey research more effective Table of contents Introduction..............................................................2
Linear Algebra Review. Vectors
Linear Algebra Review By Tim K. Marks UCSD Borrows heavily from: Jana Kosecka [email protected] http://cs.gmu.edu/~kosecka/cs682.html Virginia de Sa Cogsci 8F Linear Algebra review UCSD Vectors The length
How to do a factor analysis with the psych package
How to do a factor analysis with the psych package William Revelle Department of Psychology Northwestern University February 28, 2013 Contents 0.1 Jump starting the psych package to do factor analysis
Common factor analysis versus principal component analysis: a comparison of loadings by means of simulations
1 Common factor analysis versus principal component analysis: a comparison of loadings by means of simulations Joost C. F. de Winter, Dimitra Dodou Faculty of Mechanical, Maritime and Materials Engineering,
15.062 Data Mining: Algorithms and Applications Matrix Math Review
.6 Data Mining: Algorithms and Applications Matrix Math Review The purpose of this document is to give a brief review of selected linear algebra concepts that will be useful for the course and to develop
Introduction to Principal Component Analysis: Stock Market Values
Chapter 10 Introduction to Principal Component Analysis: Stock Market Values The combination of some data and an aching desire for an answer does not ensure that a reasonable answer can be extracted from
Similar matrices and Jordan form
Similar matrices and Jordan form We ve nearly covered the entire heart of linear algebra once we ve finished singular value decompositions we ll have seen all the most central topics. A T A is positive
Multivariate Normal Distribution
Multivariate Normal Distribution Lecture 4 July 21, 2011 Advanced Multivariate Statistical Methods ICPSR Summer Session #2 Lecture #4-7/21/2011 Slide 1 of 41 Last Time Matrices and vectors Eigenvalues
This chapter will demonstrate how to perform multiple linear regression with IBM SPSS
CHAPTER 7B Multiple Regression: Statistical Methods Using IBM SPSS This chapter will demonstrate how to perform multiple linear regression with IBM SPSS first using the standard method and then using the
Choosing the Right Type of Rotation in PCA and EFA James Dean Brown (University of Hawai i at Manoa)
Shiken: JALT Testing & Evaluation SIG Newsletter. 13 (3) November 2009 (p. 20-25) Statistics Corner Questions and answers about language testing statistics: Choosing the Right Type of Rotation in PCA and
Overview of Violations of the Basic Assumptions in the Classical Normal Linear Regression Model
Overview of Violations of the Basic Assumptions in the Classical Normal Linear Regression Model 1 September 004 A. Introduction and assumptions The classical normal linear regression model can be written
Multiple regression - Matrices
Multiple regression - Matrices This handout will present various matrices which are substantively interesting and/or provide useful means of summarizing the data for analytical purposes. As we will see,
Notes on Orthogonal and Symmetric Matrices MENU, Winter 2013
Notes on Orthogonal and Symmetric Matrices MENU, Winter 201 These notes summarize the main properties and uses of orthogonal and symmetric matrices. We covered quite a bit of material regarding these topics,
THE INTERNATIONAL JOURNAL OF BUSINESS & MANAGEMENT
THE INTERNATIONAL JOURNAL OF BUSINESS & MANAGEMENT Customer Preference and Satisfaction towards Housing Finance with Special Reference to Vijayawada, Andhra Pradesh P. Krishna Priya Assistant Professor,
Factor Analysis Using SPSS
Psychology 305 p. 1 Factor Analysis Using SPSS Overview For this computer assignment, you will conduct a series of principal factor analyses to examine the factor structure of a new instrument developed
