Similar documents: 20 results
1.
By extension of the rotational process, meaningful orthogonally related positions were found for all thirteen centroid factors which Thurstone extracted from his original PMA intercorrelations. Most of the original primary ability factors were more sharply delineated and corresponded more closely to the Army Air Force factors that bear similar names (demonstrating greater invariance from analysis to analysis). While such different results obtained by two investigators applying the same methods to the same data may at first cause some concern, the results strengthen rather than weaken the idea that more psychological meaningfulness and greater invariance will result if centroid axes are rotated using the concepts of simple structure and positive manifold. Thurstone considered loadings between ±.20 as negligible and considered only loadings of at least .40 in naming factors.

2.
Exploratory factor analysis (EFA) is an extremely popular method for determining the underlying factor structure for a set of variables. Due to its exploratory nature, EFA is notorious for being conducted with small sample sizes, and recent reviews of psychological research have reported that between 40% and 60% of applied studies have 200 or fewer observations. Recent methodological studies have addressed small-sample requirements for EFA models; however, these studies have only considered complete data, which are the exception rather than the rule in psychology. Furthermore, the extant literature on missing data techniques with small samples is scant, and nearly all existing studies focus on topics that are not of primary interest to EFA models. Therefore, this article presents a simulation to assess the performance of various missing data techniques for EFA models with both small samples and missing data. Results show that deletion methods do not extract the proper number of factors and estimate the factor loadings with severe bias, even when data are missing completely at random. Predictive mean matching is the best method overall when considering extracting the correct number of factors and estimating factor loadings without bias, although 2-stage estimation was a close second.

3.
K. W. Heese 《Psychometrika》1942,7(3):213-223
Results of 10 trials on 6 tests for 50 subjects were analyzed, first, by applying the centroid method to actual improvement or practice scores and, second, by applying a formula developed by Woodrow for determining factor loadings for practice scores from the differences between factor loadings of initial and final scores. Contrary to expectation, the two methods yielded discrepant results, for the explanation of which a hypothesis is advanced. The operation of a general factor was not demonstrated. Tentative interpretations of the factors extracted by the centroid method are offered.

4.
A comparison of the Wherry-Gaylord iterative factor analysis procedure and the Thurstone multiple-group analysis of sub-tests shows that the two methods result in the same factors. The Wherry-Gaylord method has the advantage of giving factor loadings for items. The number of iterations needed can be reduced by doing a factor analysis of sub-tests, re-grouping sub-tests according to factors, and using each group as a starting point for iterations. This research was carried out under Contract No. WSW-2503 between the Department of the Army and Ohio State University. This paper is based on the final report PRS No. 827 under that contract. The opinions expressed herein regarding matters relating to the Department of the Army are those of the authors and are not necessarily official.

5.
A factor analysis of the ten sub-tests of the Seashore test of pitch discrimination revealed that more than one ability is involved. One factor, which accounted for the greater share of the variances, had loadings that decreased systematically with increasing difficulty. A second factor had strongest loadings among the more difficult items, particularly those with frequency differences of 2 to 5 cycles per second. A third had strongest loadings at differences of 5 to 12 cycles per second. No explanation for the three factors is apparent, but the hypothesis is accepted that they represent distinct abilities. In tests so homogeneous as to content and form, where a single common factor might well have been expected, the appearance of additional common factors emphasizes the importance of considering the difficulty level of test items, both in the attempt to interpret new factors and in the practice of testing. The same kind of item may measure different abilities according as it is easy or difficult for the individuals to whom it is applied.

6.
Relationships between the results of factor analysis and component analysis are derived when oblique factors have independent clusters with equal variances of unique factors. The factor loadings are analytically shown to be smaller than the corresponding component loadings while the factor correlations are shown to be greater than the corresponding component correlations. The condition for the inequality of the factor/component contributions is derived in the case with different variances for unique factors. Further, the asymptotic standard errors of parameter estimates are obtained for a simplified model with the assumption of multivariate normality, which shows that the component loading estimate is more stable than the corresponding factor loading estimate.

7.
In exploratory factor analysis, latent factors and factor loadings are seldom interpretable until analytic rotation is performed. Typically, the rotation problem is solved by numerically searching for an element in the manifold of orthogonal or oblique rotation matrices such that the rotated factor loadings minimize a pre-specified complexity function. The widely used gradient projection (GP) algorithm, although simple to program and able to deal with both orthogonal and oblique rotation, is found to suffer from slow convergence when the number of manifest variables and/or the number of latent factors is large. The present work examines the effectiveness of two Riemannian second-order algorithms, which respectively generalize the well-established truncated Newton and trust-region strategies for unconstrained optimization in Euclidean spaces, in solving the rotation problem. When approaching a local minimum, the second-order algorithms usually converge superlinearly or even quadratically, better than first-order algorithms that only converge linearly. It is further observed in Monte Carlo studies that, compared to the GP algorithm, the Riemannian truncated Newton and trust-region algorithms require not only much fewer iterations but also much less processing time to meet the same convergence criterion, especially in the case of oblique rotation.

8.
A multi-group factor model is suitable for data originating from different strata. However, it often requires a relatively large sample size to avoid numerical issues such as non-convergence and non-positive definite covariance matrices. An alternative is to pool data from different groups in which a single-group factor model is fitted to the pooled data using maximum likelihood. In this paper, properties of pseudo-maximum likelihood (PML) estimators for pooled data are studied. The pooled data are assumed to be normally distributed from a single group. The resulting asymptotic efficiency of the PML estimators of factor loadings is compared with that of the multi-group maximum likelihood estimators. The effect of pooling is investigated through a two-group factor model. The variances of factor loadings for the pooled data are underestimated under the normal theory when error variances in the smaller group are larger. Underestimation is due to dependence between the pooled factors and pooled error terms. Small-sample properties of the PML estimators are also investigated using a Monte Carlo study.

9.
New procedures are presented for measuring invariance and matching factors for fixed variables and for fixed or different subjects. Two of these, the coefficient of invariance for factor loadings and the coefficient of factor similarity, utilize factor scores computed from the different sets of factor loadings and one of the original standard score matrices. Another, the coefficient of subject invariance, is obtained by using one of the sets of factor loadings in conjunction with the different standard score matrices. These coefficients are correlations between factor scores of the appropriate matrices. When the best match of factors is desired, rather than degree of resemblance, the method of assignment is proposed.

10.
Exploratory methods using second-order components and second-order common factors were proposed. The second-order components were obtained from the resolution of the correlation matrix of obliquely rotated first-order principal components. The standard errors of the estimates of the second-order component loadings were derived from an augmented information matrix with restrictions for the loadings and associated parameters. The second-order factor analysis proposed was similar to the classical method in that the factor correlations among the first-order factors were further resolved by the exploratory method of factor analysis. However, in this paper the second-order factor loadings were estimated by generalized least squares using the asymptotic variance-covariance matrix for the first-order factor correlations. The asymptotic standard errors for the estimates of the second-order factor loadings were also derived. A numerical example was presented with simulated results.

11.
The “factor” analyses published by Schultz, Kaye, and Hoyer (1980) confused component and factor analysis and led in this case as in many others to unwarranted conclusions. They used component analysis to develop factor models that were subjected to restricted (confirmatory) maximum likelihood analysis, but the final models for which good fits with the observed correlations were obtained were not common factor models. They were, however, discussed as such and conclusions drawn accordingly. When their correlation matrices are analyzed by the principal factors method, two factors are sufficient to account for the intercorrelations. These two factors generally support the a priori expectation of a difference between intelligence tasks and spontaneous flexibility tasks. They are also quite similar in younger and older subjects, when similarity is judged in terms of factor pattern. Factor loadings for the younger subjects, however, are much smaller than expectations based on the respective ranges of talent in the two groups of subjects or on past experience with similar tests in undergraduate student populations.

12.
The article describes 6 issues influencing standard errors in exploratory factor analysis and reviews 7 methods of computing standard errors for rotated factor loadings and factor correlations. These 7 methods are the augmented information method, the nonparametric bootstrap method, the infinitesimal jackknife method, the method using the asymptotic distributions of unrotated factor loadings, the sandwich method, the parametric bootstrap method, and the jackknife method. Standard error estimates are illustrated using a personality study with 537 men and an intelligence study with 145 children.

13.
Jensen has posited a research method to investigate group differences in cognitive tests. This method consists of first extracting a general intelligence factor by means of exploratory factor analysis. Secondly, similarity of factor loadings across groups is evaluated in an attempt to ensure that the same constructs are measured. Finally, the correlation is computed between the loadings of the tests on the general intelligence factor and the mean differences between groups on the tests. This part is referred to as a test of "Spearman's Hypothesis", which essentially states that differences in g account for the main part of differences in observed scores. Based on the correlation, inferences are made with respect to group differences in general intelligence.

The validity of these inferences is investigated and compared to the validity of inferences based on multi-group confirmatory factor analysis. For this comparison, population covariance matrices are constructed which incorporate violations of the central assumption underlying Jensen's method concerning the existence of g and/or violations of Spearman's Hypothesis. It is demonstrated that Jensen's method is quite insensitive to the violations. This lack of specificity is observed consistently for all types of violations introduced in the present study. Multi-group confirmatory factor analysis emerges as clearly superior to Jensen's method.
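The correlational core of Jensen's method described above reduces to a single correlation between two vectors. The sketch below illustrates that step only; the loadings and standardized group differences are hypothetical illustrative numbers, not data from any study discussed here:

```python
import numpy as np

# Jensen's "method of correlated vectors": correlate each test's loading
# on the general factor with the standardized mean difference between
# groups on that test. A high correlation is read as support for
# Spearman's Hypothesis. All numbers below are hypothetical.
g_loadings = np.array([0.82, 0.75, 0.68, 0.55, 0.48, 0.40])
mean_diffs = np.array([0.95, 0.80, 0.70, 0.52, 0.45, 0.35])

r = np.corrcoef(g_loadings, mean_diffs)[0, 1]
print(f"correlated-vectors r = {r:.2f}")
```

As the abstract argues, a high r from this procedure need not imply that group differences are driven by g: the statistic is insensitive to violations of the assumptions underlying it.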

14.
In Multi-Trait Multi-Method (MTMM) studies of causal attributions for laboratory events, there is little evidence of convergent and discriminant validity for attribution measures. We report the first MTMM study to investigate the validity of two methods of eliciting causal beliefs for an illness, specifically, myocardial infarction. Adult respondents (N = 107) listed causes of MI, then completed questionnaire rating scales for causal beliefs for MI. Measures were compared using both Campbell and Fiske's approach to MTMM analyses and a confirmatory factor analysis approach. Neither single-item measures of causal beliefs nor scales of causal beliefs derived using exploratory factor analysis provided much evidence of convergent and discriminant validity. Confirmatory factor analysis showed that a model containing only causal belief factors provided a moderately good fit to the data. Adding a questionnaire method factor significantly improved the fit of the model, as well as substantially changing the pattern of factor loadings: loadings of questionnaire items on causal belief factors were markedly reduced. These results highlight major problems with the measurement of causal beliefs, and in particular question the validity of factor analysis of questionnaire measures of causal beliefs. They also suggest that at least some of the MI causal belief factors reported in the literature are artefacts of common questionnaire method variance.

15.
Procedures for assessing the invariance of factors found in data sets using different subjects and the same variables often use the least squares criterion, which appears to be too restrictive for comparing factors. Tucker's coefficient of congruence, on the other hand, is more closely related to the human interpretation of factorial invariance than the least squares criterion. A method is presented that simultaneously maximizes the sum of coefficients of congruence between two matrices of factor loadings, using orthogonal rotation of one matrix. As shown in examples, the sum of coefficients of congruence obtained using the presented rotation procedure is slightly higher than the sum of coefficients of congruence using Orthogonal Procrustes Rotation based on the least squares criterion. The author is obliged to Lewis R. Goldberg for critically reviewing the first draft of this paper.
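Tucker's coefficient of congruence differs from an ordinary Pearson correlation in that the loadings are not mean-centred before the cross-product is taken. A minimal sketch, with two hypothetical loading vectors standing in for "the same" factor recovered in two samples:

```python
import numpy as np

def tucker_congruence(x, y):
    """Tucker's coefficient of congruence between two vectors of factor
    loadings: sum(x*y) / sqrt(sum(x^2) * sum(y^2)). Unlike a Pearson
    correlation, the loadings are not mean-centred first."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    return np.sum(x * y) / np.sqrt(np.sum(x ** 2) * np.sum(y ** 2))

# Hypothetical loadings of the same factor recovered in two samples
a = [0.7, 0.6, 0.5, 0.1, 0.0]
b = [0.8, 0.5, 0.6, 0.0, 0.1]
print(f"congruence = {tucker_congruence(a, b):.3f}")
```

Values near 1 indicate factorial similarity; a vector is perfectly congruent with itself or with any positive rescaling of itself, which is why the coefficient suits loading patterns better than a least squares distance does.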

16.
Ye Baojuan, Wen Zhonglin 《心理科学》(Psychological Science),2013,36(3):728-733
In research fields such as psychology, education, and management, two-level (hierarchical) data structures are common: students are nested within classes, and employees are nested within firms. In two-level studies the participants are usually not independent, so directly applying a single-level reliability formula overestimates test reliability. Previous studies have discussed how to estimate the reliability of a unidimensional test in two-level research more accurately. The present study points out the shortcomings of the existing estimation formulas, derives a new reliability formula using two-level confirmatory factor analysis, demonstrates the calculation with an example, and provides a simple computing program.

17.
Floyd, Shands, Rafael, Bergeron and McGrew (2009) used generalizability theory to test the reliability of general-factor loadings and to compare three sources of error in them: test battery size, test battery composition, and factor-extraction technique, as well as their interactions. They found that their general-factor loadings were moderately to strongly dependable. We replicated the methods of Floyd et al. (2009) in a different sample of tests, from the Minnesota Study of Twins Reared Apart (MISTRA). Our first hypothesis was that, given the greater diversity of the tests in MISTRA, the general-factor loadings would be less dependable than in Floyd et al. (2009). Our second hypothesis, contrary to the positions of Floyd et al. (2009) and Jensen and Weng (1994), was that the general factors from the small, randomly formed test batteries would differ substantively from the general factor from a well-specified hierarchical model of all available tests. Subtests from MISTRA were randomly selected to form independent and overlapping batteries of 2, 4 and 8 tests in size, and the general-factor loadings of eight probe tests were obtained in each battery by principal components analysis, principal factor analysis and maximum likelihood estimation. Results initially indicated that the general-factor loadings were unexpectedly more dependable than in Floyd et al. (2009); however, further analysis revealed that this was due to the greater diversity of our probe tests. After adjustment for this difference in diversity, and consideration of the representativeness of our probe tests versus those of Floyd et al. (2009), our first hypothesis of lower dependability was confirmed in the overlapping batteries, but not the independent ones. To test the second hypothesis, we correlated g factor scores from the random test batteries with g factor scores from the VPR model; we also calculated coefficients of congruence for the same comparison. Consistent with our second hypothesis, the general factors from small non-hierarchical models were found to not be reliable enough for the purposes of theoretical research. We discuss appropriate standards for the construction and factor analysis of intelligence test batteries.

18.
The most commonly used method of factoring a matrix of intercorrelations is the centroid method developed by L. L. Thurstone. It is, however, necessary to transform the centroid matrix of factor loadings into a simple structure matrix in order to facilitate the interpretation of the factor loadings. Current methods for effecting this transformation are chiefly graphical and require considerable experience and personal judgment. This paper presents a new method for transforming an arbitrary factor matrix into a simple structure matrix by methods almost completely objective. The theory underlying the method is developed and approximation procedures are derived. The method is applied to a matrix of factor loadings previously analyzed by Thurstone.
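The search for objective transformations to simple structure described above eventually produced fully analytic rotation criteria such as Kaiser's varimax. The sketch below is a generic textbook implementation of varimax, not the specific method of the paper above, shown only to illustrate what an objective rotation looks like in code:

```python
import numpy as np

def varimax(L, gamma=1.0, max_iter=100, tol=1e-8):
    """Kaiser's varimax: a fully objective (analytic) orthogonal rotation
    toward simple structure. L is a p x k matrix of factor loadings.
    Standard SVD-based iteration; a generic sketch, not any paper's code."""
    p, k = L.shape
    R = np.eye(k)  # accumulated rotation matrix
    d = 0.0
    for _ in range(max_iter):
        Lr = L @ R
        # SVD of the gradient of the varimax criterion at the current rotation
        u, s, vt = np.linalg.svd(
            L.T @ (Lr ** 3 - (gamma / p) * Lr @ np.diag(np.sum(Lr ** 2, axis=0))))
        R = u @ vt
        d_old, d = d, np.sum(s)
        if d_old != 0 and d / d_old < 1 + tol:
            break
    return L @ R

# Hypothetical unrotated loadings mixing two factors
L = np.array([[0.7, 0.5], [0.6, 0.4], [0.5, -0.6], [0.4, -0.5]])
Lr = varimax(L)
```

Because the rotation matrix is orthogonal, the communalities (row sums of squared loadings) are unchanged; only the distribution of variance across factors is simplified.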

19.
The Pollyanna hypothesis is extended into the field of intertrait inference to predict that inferential thresholds for positively evaluated characteristics will be lower than those for negatively evaluated characteristics. This prediction is confirmed, and is shown to have important implications for models of inference rules. Data from several studies are analysed to reveal that the Pollyanna threshold effect is reliably greater for women than for men, and it is shown that this effect is unrelated to sex differences in extremity of responding.

20.
A distinction is drawn between the method of principal components developed by Hotelling and the common factor analysis discussed in psychological literature, both from the point of view of the stochastic models involved and problems of statistical inference. The appropriate statistical techniques are briefly reviewed in the first case and detailed in the second. A new method of analysis called canonical factor analysis, explaining the correlations between rather than the variances of the measurements, is developed. This analysis furnishes one out of a number of possible solutions to the maximum likelihood equations of Lawley. It admits an iterative procedure for estimating the factor loadings and also for constructing the likelihood criterion useful in testing a specified hypothesis on the number of factors and in determining a lower confidence limit to the number of factors.
