On Common Factor and Principal Component Representations of Data: Implications for Theory and for Confirmatory Replications

Common factor analysis and component analysis: are they interchangeable? A word of caution

2010

In this review, we revisit the debate over the interchangeability of the statistical techniques of common factor analysis and component analysis. The literature shows that the two techniques are conceptually distinct. Moreover, the position that component analysis can, in practice, serve as an effective substitute for common factor analysis has received mixed empirical support. This paper discusses the risks of conflating the two techniques and concludes with recommendations on the issue.

In Spite of Indeterminacy Many Common Factor Score Estimates Yield an Identical Reproduced Covariance Matrix

Psychometrika, 2007

It was investigated whether commonly used factor score estimates lead to the same reproduced covariance matrix of observed variables. This was achieved by means of regression component analysis, since it is possible to compute the reproduced covariance matrices of the regression components corresponding to different factor score estimates. It was shown that Thurstone's, Ledermann's, Bartlett's, Anderson-Rubin's, McDonald's, Krijnen, Wansbeek, and Ten Berge's, as well as Takeuchi, Yanai, and Mukherjee's score estimates reproduce the same covariance matrix. In contrast, Harman's ideal variables score estimates lead to a different reproduced covariance matrix.
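The flavour of this result can be checked numerically. The sketch below (Python/NumPy) uses a small illustrative orthogonal two-factor population model, which is an assumption of this sketch rather than the paper's example, and takes Sigma W (W' Sigma W)^-1 W' Sigma as one reading of the covariance matrix reproduced by the regression components corresponding to a score predictor with weight matrix W. Under that reading, the regression (Thurstone) and Bartlett predictors reproduce the same matrix, while Harman's ideal-variable predictor does not.

```python
import numpy as np

# Illustrative population model (assumed, not taken from the paper):
# 6 observed variables, 2 orthogonal factors (Phi = I)
Lam = np.array([[.8, .0], [.7, .0], [.6, .0],
                [.0, .8], [.0, .7], [.0, .6]])
Psi = np.diag(1 - np.sum(Lam**2, axis=1))        # unique variances
Sigma = Lam @ Lam.T + Psi                        # model-implied covariance matrix

def reproduced(W, Sigma):
    """Covariance reproduced by regressing the observed variables on the
    score predictor f_hat = W'x (one regression-component reading)."""
    M = W.T @ Sigma @ W
    return Sigma @ W @ np.linalg.inv(M) @ W.T @ Sigma

W_thurstone = np.linalg.inv(Sigma) @ Lam                        # regression predictor
W_bartlett = np.linalg.inv(Psi) @ Lam @ np.linalg.inv(Lam.T @ np.linalg.inv(Psi) @ Lam)
W_harman = Lam @ np.linalg.inv(Lam.T @ Lam)                     # "ideal variables"

R_t, R_b, R_h = (reproduced(W, Sigma) for W in (W_thurstone, W_bartlett, W_harman))
print(np.allclose(R_t, R_b))   # True: identical reproduced covariance despite indeterminacy
print(np.allclose(R_t, R_h))   # False: Harman's ideal variables reproduce a different matrix
```

The equality holds because the reproduced matrix depends only on the column space of the weight matrix, and the regression-type weight matrices span the same space.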

Common factor analysis versus principal component analysis: a comparison of loadings by means of simulations

Communications in Statistics - Simulation and Computation, 2016

Common factor analysis (CFA) and principal component analysis (PCA) are widely used multivariate techniques. Using simulations, we compared CFA with PCA loadings for distortions of a perfect cluster configuration. Results showed that nonzero PCA loadings were higher and more stable than nonzero CFA loadings. Compared to CFA loadings, PCA loadings correlated weakly with the true factor loadings for underextraction, overextraction, and heterogeneous loadings within factors. The pattern of differences between CFA and PCA was consistent across sample sizes, levels of loadings, principal axis factoring versus maximum likelihood factor analysis, and blind versus target rotation.
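A miniature version of such a simulation can be sketched as follows (Python/NumPy). To keep the sketch short it uses a single factor, which sidesteps rotation; the population loading, number of variables, and sample size are illustrative assumptions, not the authors' perfect-cluster design. The first principal component loadings tend to exceed the true loading, while principal axis factoring loadings tend to lie closer to it.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n, true_loading = 8, 300, 0.6            # illustrative choices, not the authors' design
lam = np.full(p, true_loading)
psi = 1 - lam**2

# simulate sample data from a one-factor common factor model
f = rng.standard_normal(n)
X = np.outer(f, lam) + rng.standard_normal((n, p)) * np.sqrt(psi)
R = np.corrcoef(X, rowvar=False)

def first_pc_loadings(R):
    """Loadings of the first principal component of the correlation matrix."""
    w, V = np.linalg.eigh(R)
    v = V[:, np.argmax(w)]
    return np.abs(v) * np.sqrt(w.max())

def paf_loadings(R, n_iter=200):
    """Principal axis factoring: eigendecompose R with communalities on the diagonal."""
    h2 = 1 - 1 / np.diag(np.linalg.inv(R))   # start from squared multiple correlations
    for _ in range(n_iter):
        Rr = R.copy()
        np.fill_diagonal(Rr, h2)
        w, V = np.linalg.eigh(Rr)
        v = V[:, np.argmax(w)]
        L = np.abs(v) * np.sqrt(max(w.max(), 0.0))
        h2 = L**2                            # update communalities
    return L

print("true loading:    ", true_loading)
print("mean PCA loading:", first_pc_loadings(R).mean())   # typically above 0.6
print("mean PAF loading:", paf_loadings(R).mean())        # typically close to 0.6
```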

Principal Components Analysis, Exploratory Factor Analysis, and Confirmatory Factor Analysis

2014

Principal components analysis and factor analysis are common methods used to analyze groups of variables for the purpose of reducing them into subsets represented by latent constructs (Bartholomew, 1984; Grimm & Yarnold, 1995). Even though PCA shares some important characteristics with factor analytic methods such as exploratory factor analysis (EFA) and confirmatory factor analysis (CFA), the similarities between the two types of methods are superficial. The most important distinction is that PCA is a descriptive method, whereas EFA and CFA are modeling techniques (Unkel & Trendafilov, 2010). Together, PCA, EFA, and CFA are used to analyze multiple variables for the purposes of data reduction, scale construction and improvement, and evaluation of validity and psychometric utility. This work describes the appropriate problems to which each method might be correctly applied and discusses the similarities and differences between the three methods. Principal components analysis (PCA; Goodall,...

Relating Multidimensional Sets of Variables: Canonical Correlation or Factor Analysis?

Psychological Reports, 1975

Several prominent multivariate psychological methodologists recommend the use of canonical correlation to relate multidimensional sets of variables. This method along with separate factor analyses of the sets is considered in relation to the questions they may be able to answer on a specific research problem. Alternate analyses of data from the social sciences illustrate the value of common factor analysis compared with canonical analysis as a method for relating the underlying constructs across sets of variables.
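For readers unfamiliar with the canonical-correlation side of this comparison, the sketch below (Python/NumPy, with made-up illustrative data rather than the paper's social-science example) computes canonical correlations as the singular values of the whitened cross-covariance matrix, which is the quantity the paper contrasts with factoring each variable set separately.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
# two illustrative variable sets sharing a single latent source z
z = rng.standard_normal(n)
X = np.column_stack([z + rng.standard_normal(n) for _ in range(3)])
Y = np.column_stack([z + rng.standard_normal(n) for _ in range(4)])

def canonical_correlations(X, Y):
    """Canonical correlations = singular values of Sxx^(-1/2) Sxy Syy^(-1/2)."""
    Xc, Yc = X - X.mean(0), Y - Y.mean(0)
    Sxx = Xc.T @ Xc / (len(X) - 1)
    Syy = Yc.T @ Yc / (len(Y) - 1)
    Sxy = Xc.T @ Yc / (len(X) - 1)
    def inv_sqrt(S):
        w, V = np.linalg.eigh(S)
        return V @ np.diag(1 / np.sqrt(w)) @ V.T
    K = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
    return np.linalg.svd(K, compute_uv=False)

# first value is large (the shared source); the remaining values are near zero
print(canonical_correlations(X, Y))
```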

Invariance of the “NEO-PI-R” factor structure across exploratory and confirmatory factor analyses

Personality and Individual Differences, 2005

Confirmatory factor analyses (CFA) carried out in recent decades to test the Five-factor simple structure have generally obtained negative results. McCrae et al. suggested that CFA limitations were the main cause of such negative results. They also claimed that the orthogonal Procrustes rotation method is an adequate procedure for testing the replicability of the Five-factor structure. The aim of the present paper is twofold: (A) to compare several exploratory factor procedures (including Procrustes rotation) to determine which is the most appropriate for analysing the replicability of the Five-factor structure, and (B) to replicate the CFA results of the McCrae et al. study with a larger number of subjects. The normative American (N = 1000) and Spanish standardization samples of the NEO-PI-R, together with an independent university sample (N = 948), were analysed. Results were replicated across the three samples and suggest that: (1) the structural properties of the Five-factor model, as measured by the NEO-PI-R, are invariant irrespective of the exploratory factor procedure used, and (2) when CFA limitations are overcome, the Five-factor structure is supported.
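The orthogonal Procrustes rotation referred to here has a closed-form solution via the singular value decomposition. A minimal sketch (Python/NumPy, using a made-up loading matrix and target rather than NEO-PI-R data) rotates an "exploratory" loading matrix toward a hypothesised target and reports column-wise Tucker congruence before and after rotation.

```python
import numpy as np

def procrustes_rotation(L, T):
    """Orthogonal Procrustes: rotation R minimising ||L @ R - T|| in the Frobenius norm."""
    U, _, Vt = np.linalg.svd(L.T @ T)
    return U @ Vt

def tucker_congruence(A, B):
    """Column-wise Tucker congruence coefficients between two loading matrices."""
    num = np.sum(A * B, axis=0)
    den = np.sqrt(np.sum(A**2, axis=0) * np.sum(B**2, axis=0))
    return num / den

# Illustrative target (hypothesised simple structure) and an "exploratory" solution
T = np.kron(np.eye(2), np.full((3, 1), 0.7))      # 6 variables, 2 factors
theta = np.deg2rad(45)                            # arbitrary rotation away from the target
Q = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
L = T @ Q                                         # exploratory loadings, rotated away

R = procrustes_rotation(L, T)
print(tucker_congruence(L, T))      # ~0.71 per factor before target rotation
print(tucker_congruence(L @ R, T))  # ~1.0 after Procrustes rotation to the target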

Score Predictor Factor Analysis: Reproducing Observed Covariances by Means of Factor Score Predictors

Frontiers in Psychology

The non-diagonal elements of the observed covariances are reproduced more exactly by the factor loadings than by the model implied by the corresponding factor score predictors. This is a limitation on the validity of factor score predictors. It is therefore investigated whether it is possible to estimate factor loadings for which the model implied by the factor score predictors optimally reproduces the non-diagonal elements of the observed covariance matrix. Accordingly, loading estimates are proposed for which the model implied by the factor score predictors provides a least-squares approximation of the non-diagonal elements of the observed covariance matrix. This estimation method is termed score predictor factor analysis and is compared algebraically with Minres factor analysis as well as principal component analysis. A population-based and a sample-based simulation study were performed in order to compare score predictor factor analysis, Minres factor analysis, and principal component analysis. It turns out that the non-diagonal elements of the observed covariance matrix can be reproduced more exactly from the factor score predictors computed from score predictor factor analysis than from the factor score predictors computed from Minres factor analysis or from principal components.
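The opening claim can be illustrated numerically. The sketch below (Python/NumPy, with an assumed orthogonal two-factor population model, and taking Sigma W (W' Sigma W)^-1 W' Sigma as the model implied by the regression score predictor, which is a reading of the paper's setup rather than its exact definition) compares how well the off-diagonal covariances are reproduced by the loadings and by the predictor-implied model.

```python
import numpy as np

# Illustrative population model (assumed): 6 variables, 2 orthogonal factors
Lam = np.array([[.8, .0], [.7, .0], [.6, .0],
                [.0, .8], [.0, .7], [.0, .6]])
Psi = np.diag(1 - np.sum(Lam**2, axis=1))
Sigma = Lam @ Lam.T + Psi

W = np.linalg.inv(Sigma) @ Lam                       # regression factor score predictor weights
Sigma_pred = Sigma @ W @ np.linalg.inv(W.T @ Sigma @ W) @ W.T @ Sigma

off = ~np.eye(6, dtype=bool)                         # mask for the non-diagonal elements
print(np.abs(Sigma - Lam @ Lam.T)[off].max())        # 0: loadings reproduce off-diagonals exactly
print(np.abs(Sigma - Sigma_pred)[off].max())         # > 0: predictor-implied model does not
```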

Confirmatory factor analysis: a brief introduction and critique

2013

One of the routes to construct validation of a test is predicting the test's factor structure based on the theory that guided its construction, followed by testing it. The method of choice for such testing is often confirmatory factor analysis (CFA). In CFA, the predicted factor structure of a number of observed variables is translated into the complete covariance matrix over these variables. Next, this matrix is adjusted to the actual covariance matrix, and subsequently compared with it. The discrepancy between the two, the "goodness of fit" (GOF), is expressed by a number of indices. An assessment of how well the predicted factor structure is corroborated by the sample data, and whether it could be generalized to the population, is often based on the values of these indices. This brief and selective review discusses the CFA procedure, the associated indices of goodness of fit and the reliability of the latter. Much doubt surrounds the GOF indices, and thus an alterna...
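In matrix terms, the "predicted factor structure translated into a covariance matrix" is the model-implied matrix Sigma(theta) = Lambda Phi Lambda' + Theta, and "adjusting" it to the sample matrix S means minimising a discrepancy such as the maximum-likelihood function F_ML = ln|Sigma| - ln|S| + tr(S Sigma^-1) - p. The sketch below (Python/NumPy) evaluates this discrepancy and one common index derived from it, the RMSEA, using fixed illustrative parameter values rather than an actual estimation step.

```python
import numpy as np

def f_ml(S, Sigma):
    """Maximum-likelihood discrepancy between sample (S) and model-implied (Sigma) matrices."""
    p = S.shape[0]
    _, logdet_sigma = np.linalg.slogdet(Sigma)
    _, logdet_s = np.linalg.slogdet(S)
    return logdet_sigma - logdet_s + np.trace(S @ np.linalg.inv(Sigma)) - p

def rmsea(F, df, n):
    """RMSEA computed from the discrepancy value, degrees of freedom, and sample size."""
    chi2 = (n - 1) * F
    return np.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

# Illustrative one-factor model with fixed (not estimated) parameters
lam = np.full(6, 0.7)
Sigma = np.outer(lam, lam) + np.diag(1 - lam**2)      # model-implied correlation matrix

rng = np.random.default_rng(0)
n = 400
X = rng.multivariate_normal(np.zeros(6), Sigma, size=n)
S = np.corrcoef(X, rowvar=False)                      # sample correlation matrix

p, q = 6, 12                                          # q = free parameters (6 loadings + 6 uniquenesses)
df = p * (p + 1) // 2 - q
F = f_ml(S, Sigma)
print("F_ML:", F, "df:", df, "RMSEA:", rmsea(F, df, n))
```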

Old and New Ideas for Data Screening and Assumption Testing for Exploratory and Confirmatory Factor Analysis

Frontiers in Psychology, 2012

We provide a basic review of the data screening and assumption testing issues relevant to exploratory and confirmatory factor analysis along with practical advice for conducting analyses that are sensitive to these concerns. Historically, factor analysis was developed for explaining the relationships among many continuous test scores, which led to the expression of the common factor model as a multivariate linear regression model with observed, continuous variables serving as dependent variables, and unobserved factors as the independent, explanatory variables. Thus, we begin our paper with a review of the assumptions for the common factor model and data screening issues as they pertain to the factor analysis of continuous observed variables. In particular, we describe how principles from regression diagnostics also apply to factor analysis. Next, because modern applications of factor analysis frequently involve the analysis of the individual items from a single test or questionnaire, an important focus of this paper is the factor analysis of items. Although the traditional linear factor model is well-suited to the analysis of continuously distributed variables, commonly used item types, including Likert-type items, almost always produce dichotomous or ordered categorical variables. We describe how relationships among such items are often not well described by product-moment correlations, which has clear ramifications for the traditional linear factor analysis. An alternative, non-linear factor analysis using polychoric correlations has become more readily available to applied researchers and thus more popular. Consequently, we also review the assumptions and data-screening issues involved in this method. Throughout the paper, we demonstrate these procedures using an historic data set of nine cognitive ability variables.
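The point about product-moment correlations of Likert-type items can be illustrated with a small simulation (Python/NumPy; the latent correlation, thresholds, and number of categories are arbitrary illustrative choices). Discretising two correlated normal variables into a few ordered categories attenuates the Pearson correlation relative to the latent correlation, which is what a polychoric correlation is designed to recover.

```python
import numpy as np

rng = np.random.default_rng(0)
rho, n = 0.6, 100_000                       # latent correlation and sample size (illustrative)
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal([0.0, 0.0], cov, size=n)

# Discretise each latent variable into 4 ordered categories at fixed thresholds,
# mimicking Likert-type items (skewed thresholds make the attenuation worse).
thresholds = [-0.5, 0.5, 1.5]
items = np.digitize(z, thresholds)

pearson_latent = np.corrcoef(z.T)[0, 1]
pearson_items = np.corrcoef(items.T)[0, 1]
print("latent correlation:  ", round(pearson_latent, 3))   # ~0.60
print("item (Pearson) corr.:", round(pearson_items, 3))    # noticeably smaller than 0.60
```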

The factor paradox: Common factors can be correlated with the variance not accounted for by the common factors!

The case that the factor model does not account for all the covariances of the observed variables is considered. This is a quite realistic condition because some model error as well as some sampling error should usually occur with empirical data. It is shown that principal components representing covariances not accounted for by the factors of the model can have a non-zero correlation with the common factors of the factor model. Non-zero correlations of components representing variance not accounted for by the factor model with common factors were also found in a simulation study. Based on these results it should be concluded that common factors can be correlated with variance components representing model error as well as sampling error.