Similar documents
20 similar documents found (search time: 31 ms)
1.

Parceling—using composites of observed variables as indicators for a common factor—strengthens loadings, but reduces the number of indicators. Factor indeterminacy is reduced when there are many observed variables per factor, and when loadings and factor correlations are strong. It is proven that parceling cannot reduce factor indeterminacy. In special cases where the ratio of loading to residual variance is the same for all items included in each parcel, factor indeterminacy is unaffected by parceling. Otherwise, parceling worsens factor indeterminacy. While factor indeterminacy does not affect the parameter estimates, standard errors, or fit indices associated with a factor model, it does create uncertainty, which endangers valid inference.
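The result above can be illustrated numerically (a minimal sketch, not from the article; the loadings and residual variances are made-up values). For a one-factor model, the squared multiple correlation of the factor on the indicators is rho^2 = s / (1 + s), where s is the sum of lambda_i^2 / psi_i over indicators, and forming sum-score parcels can only lower s unless the loading-to-residual ratios within each parcel are equal.

```python
# Determinacy of a one-factor model: rho^2 = s / (1 + s), where
# s = sum(lambda_i**2 / psi_i) over the indicators (items or parcels).
# Illustrative made-up parameters; not taken from the article.

def rho_squared(loadings, residuals):
    """Squared multiple correlation of the factor on the indicators."""
    s = sum(l * l / p for l, p in zip(loadings, residuals))
    return s / (1.0 + s)

# Four items with UNEQUAL loading-to-residual ratios.
lam = [0.8, 0.5, 0.7, 0.4]
psi = [0.36, 0.75, 0.51, 0.84]

rho2_items = rho_squared(lam, psi)

# Parcel items (1,2) and (3,4): a sum composite has loading
# lambda_1 + lambda_2 and residual variance psi_1 + psi_2.
lam_parcel = [lam[0] + lam[1], lam[2] + lam[3]]
psi_parcel = [psi[0] + psi[1], psi[2] + psi[3]]

rho2_parcels = rho_squared(lam_parcel, psi_parcel)

print(rho2_items, rho2_parcels)  # parceling cannot raise rho^2
```

With equal ratios inside each parcel (e.g., lam = [0.6, 0.3], psi = [0.4, 0.2]) the two values coincide, matching the special case stated in the abstract.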


2.
Autism is a highly uncertain entity and little is said about it with any degree of certainty. Scientists must, and do, work through these uncertainties in the course of their work. Scientists explain uncertainty in autism research through discussion of epistemological uncertainties which suggest that diverse methods and techniques make results hard to reconcile, ontological uncertainties which suggest doubt over taxonomic coherence, but also through reference to autism’s indeterminacy which suggests that the condition is inherently heterogeneous. Indeed, indeterminacy takes two forms—an inter-personal form which suggests that there are fundamental differences between individuals with autism and an intra-personal form which suggests that no one factor is able to explain all features of autism within a given individual. What is apparent in the case of autism is that scientists put uncertainty and indeterminacy into discussion with one another and, rather than a well-policed epistemic-ontic boundary, there is a movement between, and an entwinement of, the two. Understanding scientists’ dialogue concerning uncertainty and indeterminacy is of importance for understanding autism and autistic heterogeneity but also for understanding uncertainty and ‘uncertainty work’ within science more generally.

3.
4.
Points of agreement and disagreement with Maraun's (1996) account of factor indeterminacy are briefly stated. It is pointed out that if the common factor model must be discarded because of indeterminacy, so also must almost all the technology of psychological measurement, including the whole of item response theory. This unpalatable conclusion can be avoided by recognizing the behavior domain concept as the essential foundation of all theory for latent variables. Applications of behavior domain theory require a higher standard of psychological conceptualization than is commonly met by researchers.

5.
The author argues that, though social scientists generally value tolerance for ambiguity, and some even assert a fundamental indeterminacy in human systems, there is still a discipline-wide discomfort with uncertainty and ambiguity. It is argued that this distaste for uncertainty derives from a distorted view of the classical physical sciences, a view that ignores the essentially critical and radical foundations of scientific practice. The drive for certainty, it is argued, is essentially unscientific, in that certain, or adequate, forms of knowledge can only recapitulate the already known and in their dogmatic and institutionalized forms prevent the development of genuinely new knowledge. In contrast, uncertainty is defended as a positive condition, generative of new knowledge because it is open to discovery and to the mystery of the other. The conclusion drawn from this analysis is that the social sciences can only progress if uncertainty, or mystery, is protected and cultivated through a scientific discourse constituted in local and concrete terms (rather than in general and universal ones) and through a self-reflective and self-critical research praxis.

6.
Carl S. Helrich. Zygon, 2000, 35(3): 489-503
The quantum-measurement problem and the Heisenberg indeterminacy principle are presented in the language of the Dirac formulation of the quantum theory. In particular, the relationship between the quantum state prior to measurement and the result of the measurement is discussed. The relation between the indeterminacy principle and the analogy between quantum and classical systems is presented, showing that this principle may be discussed independently of the wave-particle duality. The importance of statistics in the treatment of many-body systems is outlined, and the approach to investigating God's interaction with human beings is discussed in this context. The treatment is nonmathematical.

7.
The threat of ontological deflationism (the view that disagreement about what there is can be non-substantive) is averted by appealing to realism about fundamental structure—or so Ted Sider tells us. In this paper, the notion of structural indeterminacy is introduced as a particular case of metaphysical indeterminacy; then it is argued that structural indeterminacy is not only compatible with a metaphysics of fundamental structure, but it can even safeguard it from a crucial objection; finally, it is shown that, if there are instances of structural indeterminacy, a hitherto unacknowledged variety of ontological deflationism will arise. Unless structure is shown to be determinate, ontological deflationism remains a live option. Furthermore, I will consider whether structural indeterminacy could be challenged by adopting a naturalistic epistemology of structure; the question is answered in the negative on the basis of a formal result concerning theory choice. Finally, I submit a new way of articulating the epistemology of structure, which hinges on the very possibility of structural indeterminacy.

8.
Semi-sparse PCA     
Eldén, Lars; Trendafilov, Nickolay. Psychometrika, 2019, 84(1): 164-185

It is well known that the classical exploratory factor analysis (EFA) of data with more observations than variables has several types of indeterminacy. We study the factor indeterminacy and show some new aspects of this problem by considering EFA as a specific data matrix decomposition. We adopt a new approach to the EFA estimation and achieve a new characterization of the factor indeterminacy problem. A new alternative model is proposed, which gives determinate factors and can be seen as a semi-sparse principal component analysis (PCA). An alternating algorithm is developed, where in each step a Procrustes problem is solved. It is demonstrated that the new model/algorithm can act as a specific sparse PCA and as a low-rank-plus-sparse matrix decomposition. Numerical examples with several large data sets illustrate the versatility of the new model, and the performance and behaviour of its algorithmic implementation.
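The alternating algorithm is said to solve a Procrustes problem at each step. As a hedged illustration of that building block only (not the authors' code), the sketch below solves the 2 x 2 orthogonal Procrustes subproblem, maximizing trace(Q^T M) over rotations Q, in closed form, assuming det(M) > 0 so the optimum is a rotation rather than a reflection; the matrix M is a made-up example.

```python
import math

def procrustes_rotation_2x2(m):
    """Closed-form solution of max_Q trace(Q^T M) over 2x2 rotations.

    m is [[a, b], [c, d]] with det(m) > 0; the optimum is the
    orthogonal polar factor of M. Illustrative only: the paper's
    algorithm applies a Procrustes step to full-size matrices.
    """
    (a, b), (c, d) = m
    norm = math.hypot(a + d, b - c)
    return [[(a + d) / norm, (b - c) / norm],
            [(c - b) / norm, (a + d) / norm]]

M = [[2.0, 1.0], [0.0, 1.5]]  # made-up cross-product matrix B^T A
Q = procrustes_rotation_2x2(M)

# The achieved objective equals hypot(a + d, b - c), which is at
# least trace(M), the value attained by the identity rotation.
objective = sum(Q[i][j] * M[i][j] for i in range(2) for j in range(2))
```

In general dimensions, the optimal Q comes from the SVD of M (Q = U V^T), which is how Procrustes steps are usually implemented in practice.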


9.
Nearly 70 years ago, eminent mathematician Edwin Bidwell Wilson attended a dinner at Harvard where visitor Charles Spearman discussed the "two-factor theory" of intelligence and his just-released book The Abilities of Man. Wilson, having just discovered factor indeterminacy, attempted to explain to Spearman and the assembled guests that Spearman's two-factor theory might have a non-uniqueness problem. Neither Spearman nor the guests could follow Wilson's argument, but Wilson persisted, first through correspondence, later through a series of publications that spanned more than a decade, involving Spearman and several other influential statisticians in an extended debate. Many years have passed since the Spearman-Wilson debates, yet the fascinating statistical, logical, and philosophical issues surrounding factor indeterminacy are very much alive. Equally fascinating are the sociological issues and historical questions surrounding the way indeterminacy has periodically vanished from basic textbooks on factor analysis. In this article, I delineate some of these historical-sociological issues, and respond to a critique from some recent commentators on the history of factor indeterminacy.

10.
Sufficient conditions for mean square convergence of factor predictors in common factor analysis are given by Guttman, by Williams, and by Schneeweiss and Mathes. These conditions do not hold for confirmatory factor analysis or when an error variance equals zero (Heywood cases). Two sufficient conditions are given for the three basic factor predictors and a predictor from rotated principal components analysis to converge to the factors of the model for confirmatory factor analysis, including Heywood cases. For certain model specifications the conditions are necessary. The conditions are sufficient for the existence of a unique true factor. A geometric interpretation is given for factor indeterminacy and mean square convergence of best linear factor prediction.

11.
Statistically based banding is often considered a viable method for minimizing adverse impact in test‐based employment decisions. By utilizing the standard error of the difference (SED), scores are equated based on the assumption that there is substantial unreliability in any single observed score. However, based on the derivations of Dudek, the formula commonly used to calculate the standard error of measurement (SEM) – a component that is typically used to calculate the SED – is incorrect. Specifically, utilizing the SEM when calculating the SED produces a band of observed scores around a true score, not a band of true scores around an observed score as would be appropriate for banding. This study compares the differences between banding‐based selection decisions when the appropriate SED formula – which utilizes the standard error of estimate – is and is not applied. Overall, results suggest that utilizing the appropriate formula for calculating the SED produces substantial variations in employment decisions. The potential legal and ethical implications of these discrepancies are discussed.
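The contrast between the two standard errors can be sketched with the usual classical-test-theory formulas (the reliability and standard deviation below are illustrative values, not taken from the study): SEM = SD * sqrt(1 - r), while the standard error of estimation is SEE = SD * sqrt(r * (1 - r)) = SEM * sqrt(r), so an SED band built from the SEE is narrower by a factor of sqrt(r).

```python
import math

# Made-up illustrative values (not from the study).
sd = 10.0   # test score standard deviation
r = 0.85    # test reliability

sem = sd * math.sqrt(1 - r)        # standard error of measurement
see = sd * math.sqrt(r * (1 - r))  # standard error of estimation

# SED for comparing two scores: sqrt(2) times the standard error.
sed_from_sem = math.sqrt(2) * sem
sed_from_see = math.sqrt(2) * see

# Width of a 95% band below the top score (1.96 * SED).
band_sem = 1.96 * sed_from_sem
band_see = 1.96 * sed_from_see

print(band_sem, band_see)  # the SEE-based band is narrower by sqrt(r)
```

Because sqrt(r) < 1 whenever reliability is below 1, the two formulas band different sets of candidates together, which is the source of the decision discrepancies the study examines.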

12.
The subject of factor indeterminacy has a vast history in factor analysis (Guttman, 1955; Lederman, 1938; Wilson, 1928). It has led to strong differences in opinion (Steiger, 1979). The current paper gives necessary and sufficient conditions for observability of factors in terms of the parameter matrices and a finite number of variables. Five conditions are given which rigorously define indeterminacy. It is shown that (un)observable factors are (in)determinate. Specifically, the indeterminacy proof by Guttman is extended to Heywood cases. The results are illustrated by two examples and implications for indeterminacy are discussed.

13.
There has been recent interest in formulating theories of non-representational indeterminacy. The aim of this paper is to clarify the relevance of quantum mechanics to this project. Quantum-mechanical examples of vague objects have been offered by various authors, displaying indeterminate identity, in the face of the famous Evans argument that such an idea is incoherent. It has also been suggested that the quantum-mechanical treatment of state-dependent properties exhibits metaphysical indeterminacy. In both cases it is important to consider the details of the metaphysical account and the way in which the quantum phenomenon is captured within it. Indeed if we adopt a familiar way of thinking about indeterminacy and apply it in a natural way to quantum mechanics, we run into illuminating difficulties and see that the case is far less straightforward than might be hoped.

14.
Errors are inevitable in human decision making and behavior. However, errors often carry adverse consequences and can even endanger life (e.g., mistakes during high-risk operations). Effectively monitoring errors and adjusting behavior accordingly is crucial for individual survival and development. The occurrence of errors is influenced by internal psychological states; individuals frequently make judgments in uncertain situations, and whether a state of uncertainty enhances or weakens error monitoring is an important scientific question on which findings conflict. Building on prior work and theoretical analysis, this project explores the moderating role of tolerance of uncertainty from the perspective of personality differences. Differences in tolerance of uncertainty imply differences in how well individuals endure ambiguous situations and how sensitive they are to errors, and may therefore moderate error processing under uncertainty. Study 1 uses behavioral experiments to reveal the cognitive characteristics of error monitoring and post-error adjustment in various uncertain contexts (e.g., reward/punishment) and to examine the moderating role of tolerance of uncertainty; Study 2 further explains these phenomena by examining electrophysiological signatures, the time course of processing, and neural oscillation mechanisms. This project has important theoretical value for elucidating error processing under uncertainty and its moderation by personality, and practical significance for promoting individuals' adaptation to their environment and attainment of their goals.

15.
Tryon WW, Lewis C. Psychological Methods, 2008, 13(3): 272-277
Evidence of group matching frequently takes the form of a nonsignificant test of statistical difference. Theoretical hypotheses of no difference are also tested in this way. These practices are flawed in that null hypothesis statistical testing provides evidence against the null hypothesis, and failing to reject H0 is not evidence supportive of it. Tests of statistical equivalence are needed. This article corrects the inferential confidence interval (ICI) reduction factor introduced by W. W. Tryon (2001) and uses it to extend his discussion of statistical equivalence. This method is shown to be algebraically equivalent to D. J. Schuirmann's (1987) use of 2 one-sided t tests, a highly regarded and accepted method of testing for statistical equivalence. The ICI method provides an intuitive graphic method for inferring statistical difference as well as equivalence. Trivial difference occurs when a test of difference and a test of equivalence are both passed. Statistical indeterminacy results when both tests are failed. Hybrid confidence intervals are introduced that impose ICI limits on standard confidence intervals. These intervals are recommended as replacements for error bars because they facilitate inferences.
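The four outcomes described above reduce to interval logic, sketched below (an illustration of the decision rule only, not the authors' ICI computation): the difference test passes when the confidence interval for the mean difference excludes zero, and the equivalence test passes when the interval lies inside the equivalence bounds (-delta, delta).

```python
def classify(ci_low, ci_high, delta):
    """Classify a mean-difference CI against equivalence bounds (-delta, delta).

    Sketch of the four-outcome decision logic only; the article's
    method builds inferential confidence intervals with a corrected
    reduction factor before applying this kind of rule.
    """
    different = ci_low > 0 or ci_high < 0             # CI excludes zero
    equivalent = -delta < ci_low and ci_high < delta  # CI inside bounds
    if different and equivalent:
        return "trivial difference"
    if different:
        return "statistical difference"
    if equivalent:
        return "statistical equivalence"
    return "statistically indeterminate"

print(classify(0.1, 0.4, 0.5))   # trivial difference
print(classify(-0.2, 0.3, 0.5))  # statistical equivalence
print(classify(0.8, 1.6, 0.5))   # statistical difference
print(classify(-0.7, 0.9, 0.5))  # statistically indeterminate
```

The "statistically indeterminate" branch is the case the abstract highlights: a wide interval fails both tests, so neither difference nor equivalence can be inferred.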

16.
Results obtained by Guttman [1955] on the determinacy of common factors have been thought to have disturbing consequences for the common factor model. It is argued that factors must be thought of as unobservable, and uniquely defined but numerically indeterminate. It follows that Guttman's measure of indeterminacy is inconsistent with the foundations of the factor model in probability theory, and the traditional measures of factor indeterminacy used by earlier writers should be reinstated. These yield no disturbing conclusions about the model.

17.
Research indicates that meaning in life is an important correlate of health and well-being. However, relatively little is known about the way a sense of meaning may change over time. The purpose of this study is to explore two ways of assessing change in meaning within a second-order confirmatory factor analysis framework. First, tests are conducted to see if the first and second-order factor loadings and measurement error terms are invariant over time. Second, a largely overlooked technique is used to assess change and stability in meaning at the second-order level. Findings from a nationwide survey reveal that the first and second-order factor loadings are invariant over time. Moreover, the second-order measurement error terms, but not the first-order measurement error terms, are invariant as well. The results further reveal that standard ways of assessing stability mask significant change in meaning that is due largely to regression to the mean.

18.
Current measures of ability emotional intelligence (EI) – in particular the well-known Mayer-Salovey-Caruso Emotional Intelligence Test (MSCEIT) – suffer from several limitations, including low discriminant validity and questionable construct and incremental validity. We show that the MSCEIT is largely predicted by personality dimensions, general intelligence, and demographics, with multiple Rs for the MSCEIT branches as high as .66; for the general EI factor this relation was even stronger (multiple R = .76). As concerns the factor structure of the MSCEIT, we found support for four first-order factors, which had differential relations with personality, but no support for a higher-order global EI factor. We discuss implications for employing the MSCEIT, including (a) using the single branch scores rather than the total score, (b) controlling for personality and general intelligence to ensure unbiased parameter estimates in the EI factors, and (c) accounting for measurement error. Failure to correctly model these methodological aspects may severely compromise predictive validity testing. We also discuss potential avenues for the improvement of ability-based tests.

19.
Stephen Read has presented an argument for the inconsistency of the concept of validity. We extend Read’s results and show that this inconsistency is but one half of a larger problem. Like the concept of truth, validity is infected with what we call semantic pathology, a condition that actually gives rise to two symptoms: inconsistency and indeterminacy. After sketching the basic ideas behind semantic pathology and explaining how it manifests both symptoms in the concept of truth, we present cases that establish the indeterminacy of validity and that link this indeterminacy with the concept’s inconsistency. Our conclusion is that an adequate treatment of the semantic pathology thus revealed must deal with both of its symptoms. Further, it must extend to the occurrences of this condition elsewhere: in the concept of truth, in the other central semantic notions, and even in certain philosophical concepts outside semantics.

20.
In Word and Object, Quine argues from the observation that ‘there is no justification for collating linguistic meanings, unless in terms of men's dispositions to respond overtly to socially observable stimulations’ to the conclusion that ‘the enterprise of translation is found to be involved in a certain systematic indeterminacy’. In this paper, I propose to show (1) that Quine's thesis, when properly understood, reveals in the situation of translation no peculiar indeterminacy but merely the ordinary indeterminacy present in any case of empirical investigation; (2) that it is plausible that, because the subject of inquiry is language, we are in a better position with respect to such empirical indeterminacies than we are in other areas of investigation; (3) that, in any case, Quine's arguments are impotent, for they are either contradictory or incoherent; and (4) that Quine is led to his radical conclusions because he confuses a trivial and unexciting indeterminacy, which does obtain, with the striking indeterminacy for which he argues, which does not obtain.
