Similar Articles
20 similar articles found (search time: 31 ms)
1.
Component loss functions (CLFs) similar to those used in orthogonal rotation are introduced to define criteria for oblique rotation in factor analysis. It is shown how the shape of the CLF affects the performance of the criterion it defines. For example, it is shown that monotone concave CLFs give criteria that are minimized by loadings with perfect simple structure when such loadings exist. Moreover, if the CLFs are strictly concave, minimization must produce perfect simple structure whenever it exists. Examples show that methods defined by concave CLFs perform well much more generally. While it appears important to use a concave CLF, the specific CLF used is less important. For example, the very simple linear CLF gives a rotation method that can easily outperform the most popular oblique rotation methods promax and quartimin and is competitive with the more complex simplimax and geomin methods. The author would like to thank the editor and three reviewers for helpful suggestions and for identifying numerous errors.
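
The effect of the CLF's shape can be seen in a small numerical sketch (an illustration added here, not taken from the paper): for two rows with equal communality, a concave CLF such as the linear one assigns a smaller loss to the row with perfect simple structure, whereas a convex CLF cannot tell the two rows apart under orthogonal rotation.

```python
# Hypothetical illustration: a component-loss criterion Q(L) = sum of h(|loading|),
# evaluated for two single-row loading matrices with the same communality.
import numpy as np

def clf_criterion(loadings, h):
    """Component-loss rotation criterion: sum of h applied to absolute loadings."""
    return np.sum(h(np.abs(loadings)))

perfect = np.array([[1.0, 0.0]])                    # perfect simple structure
spread = np.array([[np.sqrt(0.5), np.sqrt(0.5)]])   # same communality, complex row

linear = lambda x: x       # linear CLF (concave on [0, inf))
convex = lambda x: x**2    # convex CLF: constant under orthogonal rotation

print(clf_criterion(perfect, linear), clf_criterion(spread, linear))  # 1.00 < 1.41
print(clf_criterion(perfect, convex), clf_criterion(spread, convex))  # 1.00 = 1.00
```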

2.
A new oblique factor rotation method is proposed, the aim of which is to identify a simple and well-clustered structure in a factor loading matrix. A criterion consisting of the complexity of a factor loading matrix and a between-cluster dissimilarity is optimized using the gradient projection algorithm and the k-means algorithm. It is shown that if there is an oblique rotation of an initial loading matrix that has a perfect simple structure, then the proposed method with Kaiser's normalization will produce the perfect simple structure. Although many rotation methods can also recover a perfect simple structure, they perform poorly when a perfect simple structure is not possible. In this case, the new method tends to perform better because it clusters the loadings without requiring the clusters to be perfect. Artificial and real data analyses demonstrate that the proposed method can give a simple structure, which the other methods cannot produce, and provides a more interpretable result than those of widely known rotation techniques.

3.
Bi-factor analysis is a form of confirmatory factor analysis originally introduced by Holzinger and Swineford (Psychometrika 2:41–54, 1937). The bi-factor model has a general factor, a number of group factors, and an explicit bi-factor structure. Jennrich and Bentler (Psychometrika 76:537–549, 2011) introduced an exploratory form of bi-factor analysis that does not require one to provide an explicit bi-factor structure a priori. They use exploratory factor analysis and a bi-factor rotation criterion designed to produce a rotated loading matrix that has an approximate bi-factor structure. Among other things, this can be used as an aid in finding an explicit bi-factor structure for use in a confirmatory bi-factor analysis. They considered only orthogonal rotation. The purpose of this paper is to consider oblique rotation and to compare it to orthogonal rotation. Because there are many more oblique rotations of an initial loading matrix than orthogonal rotations, one expects the oblique results to approximate a bi-factor structure better than orthogonal rotations, and this is indeed the case. A surprising result arises when oblique bi-factor rotation methods are applied to ideal data.

4.
Matrices of factor loadings are often rotated to simple structure. When more than one loading matrix is available for the same variables, the loading matrices can be compared after rotating them all (separately) to simple structure. An alternative procedure is to rotate them to optimal agreement, and then compare them. In the present paper techniques are described that combine these two procedures. Specifically, five techniques that combine the ideals of rotation to optimal agreement and rotation to simple structure are compared on the basis of contrived and empirical data. For the contrived data, it is assessed to what extent the rotations recover the underlying common structure. For both the contrived and the empirical data it is studied to what extent the techniques give well matching rotated matrices, to what extent these have a simple structure, and to what extent the most prominent parts of the different loading matrices agree. It was found that the simple procedure of combining a Generalized Procrustes Analysis (GPA) with Varimax on the mean of the matched loading matrices performs very well on all criteria, and, for most purposes, offers an attractive compromise of rotation to agreement and simple structure. In addition to this comparison, some technical improvements are proposed for Bloxom's rotation to simple structure and maximum similarity. This research has been made possible by a fellowship from the Royal Netherlands Academy of Arts and Sciences to the author. The author is obliged to René van der Heijden for assistance in programming the procedures in the simulation study reported in this paper, and to Jos ten Berge, three anonymous reviewers and an associate editor for helpful comments on an earlier version of this paper.
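
As a rough sketch of the agreement half of that winning combination (illustrative Python, not the author's implementation), a Generalized Procrustes step can be written around SciPy's orthogonal Procrustes solver; a varimax rotation of the resulting mean matrix, applied to every matched matrix, would then supply the simple-structure half.

```python
# Sketch of a Generalized Procrustes Analysis (GPA) step; function and
# argument names are illustrative assumptions, not the paper's code.
import numpy as np
from scipy.linalg import orthogonal_procrustes

def gpa(loading_matrices, n_iter=100, tol=1e-8):
    """Orthogonally rotate each loading matrix toward the running mean."""
    rotated = [A.copy() for A in loading_matrices]
    prev_fit = np.inf
    for _ in range(n_iter):
        target = np.mean(rotated, axis=0)
        # orthogonal_procrustes(A, target) returns R minimizing ||A R - target||_F
        rotated = [A @ orthogonal_procrustes(A, target)[0] for A in loading_matrices]
        mean = np.mean(rotated, axis=0)
        fit = sum(np.sum((B - mean) ** 2) for B in rotated)
        if prev_fit - fit < tol:
            break
        prev_fit = fit
    return rotated, mean

# The simple-structure half of the procedure would then varimax-rotate the mean
# matrix and apply that same rotation to every matched loading matrix.
```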

5.
Requirements for an objective definition of simple structure are investigated and a number of proposed objective criteria are evaluated. A distinction is drawn between exploratory factorial studies and confirmatory factorial studies, with the conclusion drawn that objective definition of simple structure depends on study design as well as on objective criteria. A proposed definition of simple structure is described in terms of linear constellations. This definition lacks only a statistical test to compare with possible chance results. A computational procedure is also described for searching for linear constellations. This procedure is very laborious and might best be accomplished on high-speed automatic computers. There is no guarantee that the procedure will find all linear constellations, but it probably would yield satisfactory results for well-designed studies.

6.
A method of matrix analysis of group structure
Luce RD, Perry AD. Psychometrika, 1949, 14(2): 95–116.
Matrix methods may be applied to the analysis of experimental data concerning group structure when these data indicate relationships which can be depicted by line diagrams such as sociograms. One may introduce two concepts, n-chain and clique, which have simple relationships to the powers of certain matrices. Using them it is possible to determine the group structure by methods which are both faster and more certain than less systematic methods. This paper describes such a matrix method and applies it to the analysis of practical examples. At several points some unsolved problems in this field are indicated.
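
A brief NumPy sketch (clearly not part of the 1949 paper) of the two matrix facts the abstract alludes to: entries of the n-th power of the choice matrix count n-chains between members, and the diagonal of the cube of the matrix of mutual choices flags members who belong to a clique of three or more mutually connected people. The sociomatrix below is hypothetical.

```python
import numpy as np

# Hypothetical sociomatrix: A[i, j] = 1 if person i chooses person j.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]])

n = 2
chains = np.linalg.matrix_power(A, n)   # (i, j) entry counts n-chains from i to j
                                        # (walks of length n; repeats are allowed)

S = A * A.T                             # keep only mutual (reciprocated) choices
in_clique = np.diag(np.linalg.matrix_power(S, 3)) > 0
print(chains)
print(in_clique)  # True for members of at least one clique (size >= 3)
```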

7.
A loading matrix has perfect simple structure if each row has at most one nonzero element. It is shown that if there is an orthogonal rotation of an initial loading matrix that has perfect simple structure, then orthomax rotation with 0 ≤ γ ≤ 1 of the initial loading matrix will produce the perfect simple structure. In particular, varimax and quartimax will produce rotations with perfect simple structure whenever they exist.
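
An illustrative check of the result (code added here, not from the paper): build a loading matrix with perfect simple structure, scramble it with a random orthogonal rotation, and confirm that varimax recovers a matrix whose rows again have a single nonzero loading, up to sign and column order. The varimax routine is the usual SVD-based implementation.

```python
import numpy as np

def varimax(L, n_iter=100, tol=1e-10):
    """Standard SVD-based varimax rotation of a p x k loading matrix L."""
    p, k = L.shape
    R = np.eye(k)
    crit = 0.0
    for _ in range(n_iter):
        LR = L @ R
        G = L.T @ (LR ** 3 - LR @ np.diag(np.sum(LR ** 2, axis=0)) / p)
        U, s, Vt = np.linalg.svd(G)
        R = U @ Vt
        if s.sum() < crit * (1 + tol):
            break
        crit = s.sum()
    return L @ R

rng = np.random.default_rng(0)
perfect = np.zeros((6, 2))
perfect[:3, 0] = [0.8, 0.7, 0.6]   # each row loads on exactly one factor
perfect[3:, 1] = [0.9, 0.5, 0.4]

Q, _ = np.linalg.qr(rng.normal(size=(2, 2)))   # random orthogonal "scrambling"
recovered = varimax(perfect @ Q)
print(np.round(recovered, 3))  # one (signed) nonzero per row, columns possibly permuted
```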

8.
Factor analysis and principal component analysis are usually followed by simple structure rotations of the loadings. These rotations optimize a certain criterion (e.g., varimax, oblimin), designed to measure the degree of simple structure of the pattern matrix. Simple structure can be considered optimal if a (usually large) number of pattern elements is exactly zero. In the present paper, a class of oblique rotation procedures is proposed to rotate a pattern matrix such that it optimally resembles a matrix which has an exact simple pattern. It is demonstrated that this method can recover relatively complex simple structures where other well-known simple structure rotation techniques fail. This research has been made possible by a fellowship from the Royal Netherlands Academy of Arts and Sciences. The author is obliged to Jos ten Berge for helpful comments on an earlier version.

9.
It is proved for the common factor model with r common factors that under certain conditions which maintain the distinctiveness of each common factor a given common factor will be determinate if there exists an unlimited number of variables in the model each having an absolute correlation with the factor greater than some arbitrarily small positive quantity. The author is indebted to R. P. McDonald for suggesting the proof of Guttman's determinantal equation for the squared multiple correlation in predicting a factor from the observed variables used in the parenthetical note.
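
For reference, the quantity at issue can be written down explicitly; the display below gives the standard expression for the squared multiple correlation when predicting common factor j from the observed variables, stated in conventional notation rather than quoted from the paper.

```latex
% \Sigma is the covariance (correlation) matrix of the observed variables and
% \lambda_j the vector of their correlations (loadings) with common factor j.
\rho_j^{2} \;=\; \lambda_j^{\top}\,\Sigma^{-1}\lambda_j,
\qquad \Sigma \;=\; \Lambda\Phi\Lambda^{\top} + \Psi .
```

Read this way, the theorem states that this squared multiple correlation can be driven arbitrarily close to 1 by adding variables whose absolute correlations with the factor stay above a fixed positive bound.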

10.
Mesoudi A, Whiten A, Laland KN. The Behavioral and Brain Sciences, 2006, 29(4): 329–347; discussion 347–383.
We suggest that human culture exhibits key Darwinian evolutionary properties, and argue that the structure of a science of cultural evolution should share fundamental features with the structure of the science of biological evolution. This latter claim is tested by outlining the methods and approaches employed by the principal subdisciplines of evolutionary biology and assessing whether there is an existing or potential corresponding approach to the study of cultural evolution. Existing approaches within anthropology and archaeology demonstrate a good match with the macroevolutionary methods of systematics, paleobiology, and biogeography, whereas mathematical models derived from population genetics have been successfully developed to study cultural microevolution. Much potential exists for experimental simulations and field studies of cultural microevolution, where there are opportunities to borrow further methods and hypotheses from biology. Potential also exists for the cultural equivalent of molecular genetics in "social cognitive neuroscience," although many fundamental issues have yet to be resolved. It is argued that studying culture within a unifying evolutionary framework has the potential to integrate a number of separate disciplines within the social sciences.

11.
The identification of the second of two targets presented in close succession is often impaired—a phenomenon referred to as the attentional blink. Extending earlier work (Di Lollo, Kawahara, Ghorashi, and Enns, in Psychological Research 69:191–200, 2005), the present study shows that increasing the number of targets in the stream can lead to remarkable improvements as long as there are no intervening distractors. In addition, items may even recover from an already induced blink whenever they are preceded by another target. It is shown that limited memory resources contribute to overall performance, but independent of the attentional blink. The findings argue against a limited-capacity account of the blink and suggest a strong role for attentional control processes that may be overzealously applied.

12.
The purpose of this study was to investigate and compare the performance of a stepwise variable selection algorithm to traditional exploratory factor analysis. The Monte Carlo study included six factors in the design: the number of common factors, the number of variables explained by the common factors, the magnitude of factor loadings, the number of variables not explained by the common factors, the type of anomaly evidenced by the poorly explained variables, and sample size. The performance of the methods was evaluated in terms of selection and pattern accuracy, and bias and root mean squared error of the structure coefficients. Results indicate that the stepwise algorithm was generally ineffective at excluding anomalous variables from the factor model. The poor selection accuracy of the stepwise approach suggests that it should be avoided.

13.
Several algorithms for covariance structure analysis are considered in addition to the Fletcher-Powell algorithm. These include the Gauss-Newton, Newton-Raphson, Fisher Scoring, and Fletcher-Reeves algorithms. Two methods of estimation are considered, maximum likelihood and weighted least squares. It is shown that the Gauss-Newton algorithm which in standard form produces weighted least squares estimates can, in iteratively reweighted form, produce maximum likelihood estimates as well. Previously unavailable standard error estimates to be used in conjunction with the Fletcher-Reeves algorithm are derived. Finally all the algorithms are applied to a number of maximum likelihood and weighted least squares factor analysis problems to compare the estimates and the standard errors produced. The algorithms appear to give satisfactory estimates but there are serious discrepancies in the standard errors. Because it is robust to poor starting values, converges rapidly and conveniently produces consistent standard errors for both maximum likelihood and weighted least squares problems, the Gauss-Newton algorithm represents an attractive alternative for at least some covariance structure analyses. Work by the first author has been supported in part by Grant No. Da01070 from the U. S. Public Health Service. Work by the second author has been supported in part by Grant No. MCS 77-02121 from the National Science Foundation.
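
One standard way of writing the Gauss-Newton step in question (generic covariance-structure notation, not a quotation from the paper) is shown below; the choice of weight matrix, and whether it is updated between iterations, is what switches the algorithm between weighted least squares and maximum likelihood.

```latex
% Minimize F(\theta) = (s - \sigma(\theta))^{\top} W (s - \sigma(\theta)), where
% s = vech(S), \sigma(\theta) = vech(\Sigma(\theta)), and
% \Delta_k = \partial\sigma/\partial\theta^{\top} evaluated at \theta_k.
\theta_{k+1} \;=\; \theta_k
  + \bigl(\Delta_k^{\top} W_k \Delta_k\bigr)^{-1}
    \Delta_k^{\top} W_k \bigl(s - \sigma(\theta_k)\bigr).
% Keeping W fixed gives weighted least squares estimates; recomputing the
% normal-theory weight
% W_k = \tfrac12 D^{\top}\bigl(\Sigma(\theta_k)^{-1} \otimes \Sigma(\theta_k)^{-1}\bigr) D
% at each step (the iteratively reweighted form) yields maximum likelihood
% estimates at convergence.
```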

14.
Many social practices treat citizens with cognitive disabilities differently from their nondisabled peers. Does John Rawls's theory of justice imply that we have different duties of justice to citizens whenever they are labeled with cognitive disabilities? Some theorists have claimed that the needs of the cognitively disabled do not raise issues of justice for Rawls. I claim that it is premature to reject Rawlsian contractualism. Rawlsians should regard all citizens as moral persons provided they have the potential for developing the two moral powers. I claim that every citizen requires specific Enabling Conditions to develop and exercise the two moral powers. Structuring basic social institutions to deny some citizens the Enabling Conditions is unjust because it blocks their developmental pathways toward becoming fully cooperating members of society. Hence, we have a duty of justice to provide citizens labeled with cognitive disabilities with the Enabling Conditions they require until they become fully cooperating members of society.

15.
黎光明, 蒋欢. 心理科学 (Psychological Science), 2019, (3): 731–738.
Tests that include a rater facet typically do not conform to any standard generalizability theory design, so from the perspective of generalizability theory the data from such tests should be regarded as missing data, and it is the test's scoring scheme that determines the missing-data structure. Data under three scoring schemes were simulated with R, and the estimation performance of the traditional method, the evaluation method, and the splitting method was compared under each scheme. The results show that: (1) the traditional method has poor estimation accuracy; (2) when rater consistency is high, the evaluation method is suitable for estimation; (3) the splitting method gives the most accurate estimates; only under the fixed-rater scoring scheme does the ratio of raters to examinees require attention, with estimates being fairly accurate when this ratio is no greater than 0.0047.

16.
The development of predictive tests for genetic diseases such as Huntington’s chorea not only raises new ethical and psychosocial issues for people at risk for genetic diseases, but also poses a challenge for the professionals treating them. The number of patients facing these issues will grow considerably with future advances of the human genome project. While there is general agreement among geneticists and neurologists that concurrent psychosocial and psychotherapeutic counselling for patients considering getting tested should be a prerequisite for predictive testing, much less agreement exists in the field of psychotherapy on the form and content of psychotherapeutic counselling for patients with a genetic risk factor. By their very nature, hereditary genetic conditions are a family affair. Whatever the test result will be in the end, it will have repercussions on other family members. In consequence, this article argues in favor of a counselling approach that is family- and resource-oriented. From their experience with a collaborative treatment project, the authors present the major topics which psychotherapists need to address when working with patients at risk.

17.
Wallace A. Murphree. Sophia, 1991, 30(2-3): 59–70.
Conclusion: In this paper I challenge both the contemporary secular view that religious faith is not a virtue, and also the contemporary theistic view that religious faith is a virtue that is unavailable to nonbelievers. Although these views appear reasonable from the respective sides when faith is interpreted as belief, if faith is understood to be the entrusting of one’s ultimate concerns to whatever powers are in control (as I suggest), then such faith, with its accompanying ‘freedom from bondage’ (Spinoza), not only appears to be a virtue in itself, but it also appears to be one that can be achieved by nonbelievers as well as by theists. This is not to claim, however, that theists should hold the nonbeliever’s faith to be as viable as their own (or vice versa); rather, it is to claim that there is no more reason for theists to hold that nonbelievers must be without faith than there is for them to hold that nonbelievers must be without hope or love. Still, of course, it may be that God does exist and that the belief that he exists is part of the formula for the realization of some ultimate religious concern, such as eternal life. (For example, it could have been that there was a person conducting a rescue mission for the mountain climbers, but who refused to bring those who did not believe so to final safety, even though they had boarded the platform.) So, if God does exist and if the formula for eternal life, for example, does include the requirement that creatures believe that he exists, then atheists and agnostics will certainly have erred by not embracing theism. But their error then (assuming their doubts not to be the products of such vices as pride or dishonesty) will have been an error in calculative judgment, rather than a failure in virtue: they will have erred by not having engaged a hypnotist, at least as a last resort, to equip them with a precautionary theism.

18.
Objectives: To outline the uses of logistic regression and other statistical methods for risk factor analysis in the context of research on stuttering. Design: The principles underlying the application of a logistic regression are illustrated, and the types of questions to which such a technique has been applied in the stuttering field are outlined. The assumptions and limitations of the technique are discussed with respect to existing stuttering research, and with respect to formulating appropriate research strategies to accommodate these considerations. Finally, some alternatives to the approach are briefly discussed. Results: The way the statistical procedures are employed is demonstrated with some hypothetical data. Conclusion: Research into several practical issues concerning stuttering could benefit if risk factor modelling were used. Important examples are early diagnosis, prognosis (whether a child will recover or persist) and assessment of treatment outcome. Educational objectives: After reading this article you will: (a) Summarize the situations in which logistic regression can be applied to a range of issues about stuttering; (b) Follow the steps in performing a logistic regression analysis; (c) Describe the assumptions of the logistic regression technique and the precautions that need to be checked when it is employed; (d) Be able to summarize its advantages over other techniques like estimation of group differences and simple regression.
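
A minimal worked illustration of the kind of model discussed, with entirely hypothetical data and variable names (persistence versus recovery as the outcome, and a few candidate risk factors); exponentiated coefficients give the odds ratios usually reported in risk factor studies.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data set: persist = 1 if stuttering persisted, 0 if it recovered.
rng = np.random.default_rng(1)
n = 200
df = pd.DataFrame({
    "persist": rng.integers(0, 2, n),
    "age_at_onset": rng.normal(3.5, 1.0, n),   # years; illustrative only
    "male": rng.integers(0, 2, n),
    "family_history": rng.integers(0, 2, n),
})

model = smf.logit("persist ~ age_at_onset + male + family_history", data=df).fit()
print(model.summary())
print(np.exp(model.params))  # odds ratios for the candidate risk factors
```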

19.
A simple method is presented for examining the hierarchical structure of a set of variables, based on factor scores from rotated solutions involving one to many factors. The correlations among orthogonal factor scores from adjoining levels can be viewed as path coefficients in a hierarchical structure. The method is easily implemented using any of a wide variety of standard computer programs, and it has proved to be extremely useful in a number of diverse applications, some of which are here described.
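
A sketch of the procedure (using scikit-learn's rotated factor analysis as a stand-in for whichever standard program is preferred, and placeholder data): fit solutions with successively more factors, compute factor scores at each level, and read the correlations between scores at adjoining levels as the path coefficients of the hierarchy.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

def level_scores(X, k):
    """Factor scores from a (varimax-rotated, when k > 1) k-factor solution."""
    rotation = "varimax" if k > 1 else None
    fa = FactorAnalysis(n_components=k, rotation=rotation, random_state=0)
    return fa.fit_transform(X)

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 12))   # placeholder for a standardized data matrix

scores = {k: level_scores(X, k) for k in range(1, 5)}
for k in range(1, 4):
    # Cross-level correlations ~ path coefficients linking the k-factor level
    # to the (k + 1)-factor level directly below it.
    paths = np.corrcoef(scores[k].T, scores[k + 1].T)[:k, k:]
    print(f"level {k} -> {k + 1}:\n", np.round(paths, 2))
```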

20.
Assessing item fit for unidimensional item response theory models for dichotomous items has always been an issue of enormous interest, but there exists no unanimously agreed item fit diagnostic for these models, and hence there is room for further investigation of the area. This paper employs the posterior predictive model-checking method, a popular Bayesian model-checking tool, to examine item fit for the above-mentioned models. An item fit plot, comparing the observed and predicted proportion-correct scores of examinees with different raw scores, is suggested. This paper also suggests how to obtain posterior predictive p-values (which are natural Bayesian p-values) for the item fit statistics of Orlando and Thissen that summarize numerically the information in the above-mentioned item fit plots. A number of simulation studies and a real data application demonstrate the effectiveness of the suggested item fit diagnostics. The suggested techniques seem to have adequate power and reasonable Type I error rate, and psychometricians will find them promising.
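
A compressed sketch of such a check for a 2PL model, assuming posterior draws of the person and item parameters are already available from an MCMC fit; the discrepancy used here (proportion correct within raw-score groups) is a simplified stand-in for the Orlando-Thissen statistics, and all array names are assumptions made for illustration.

```python
import numpy as np

def ppmc_item_fit(y, item, theta_draws, a_draws, b_draws, rng=None):
    """Posterior predictive check of fit for one item under a 2PL model.

    y                : (n_persons, n_items) observed 0/1 responses
    theta_draws      : (n_draws, n_persons) posterior draws of abilities
    a_draws, b_draws : (n_draws, n_items) posterior draws of item parameters
    """
    rng = np.random.default_rng() if rng is None else rng
    raw = y.sum(axis=1)
    groups = [raw == s for s in np.unique(raw)]   # groups fixed by observed raw scores

    def discrepancy(data):
        # proportion correct on `item` within each raw-score group
        return np.array([data[g, item].mean() for g in groups])

    observed = discrepancy(y)
    exceed = np.zeros_like(observed)
    for t, a, b in zip(theta_draws, a_draws, b_draws):
        p = 1.0 / (1.0 + np.exp(-a[None, :] * (t[:, None] - b[None, :])))
        y_rep = rng.binomial(1, p)                # replicated data for this draw
        exceed += discrepancy(y_rep) >= observed
    # observed curve for the item-fit plot, and a PPP-value per raw-score group
    return observed, exceed / len(theta_draws)
```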
