21.
Earlier research has shown that bootstrap confidence intervals for principal component loadings give good coverage of the population loadings. However, this only applies to complete data. When data are incomplete, the missing data have to be handled before the analysis; multiple imputation may be used for this purpose. The question is how bootstrap confidence intervals for principal component loadings should be corrected for multiply imputed data. In this paper, several solutions are proposed. Simulations show that the proposed corrections give good coverage of the population loadings in various situations.
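The complete-data baseline that this work builds on can be sketched as follows. This is a minimal illustration only, not the paper's corrected procedure for multiply imputed data; the function names and the sign-reflection step are our own assumptions.

```python
import numpy as np

def pca_loadings(X, k):
    # Column-center, then take loadings from the SVD of the centered data.
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt[:k].T * s[:k] / np.sqrt(len(X) - 1)

def bootstrap_loading_ci(X, k=1, B=500, alpha=0.05, seed=0):
    # Percentile bootstrap confidence intervals for the first k loadings.
    rng = np.random.default_rng(seed)
    ref = pca_loadings(X, k)
    boots = []
    for _ in range(B):
        Xb = X[rng.integers(0, len(X), len(X))]   # resample rows
        L = pca_loadings(Xb, k)
        # Resolve sign indeterminacy by reflecting toward the sample loadings.
        L *= np.sign(np.sum(L * ref, axis=0, keepdims=True))
        boots.append(L)
    boots = np.array(boots)
    lo = np.percentile(boots, 100 * alpha / 2, axis=0)
    hi = np.percentile(boots, 100 * (1 - alpha / 2), axis=0)
    return lo, hi
```

With incomplete data, the resampling would additionally have to be combined with the multiple-imputation corrections the abstract refers to.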
22.
A class of four simultaneous component models for the exploratory analysis of multivariate time series collected from more than one subject simultaneously is discussed. In each of the models, the multivariate time series of each subject is decomposed into a few series of component scores and a loading matrix. The component score series reveal the latent data structure over the course of time; the interpretation of the components is based on the loading matrix. The simultaneous component models capture not only intraindividual variability, but interindividual variability as well. The four models can be ordered hierarchically from weakly to severely constrained, thus allowing for large to small interindividual differences in the model. The use of the models is illustrated by an empirical example.

This research has been made possible by funding from the Netherlands Organization for Scientific Research (NWO) to the first author. The authors are obliged to Tom A.B. Snijders, Jos M.F. ten Berge and three anonymous reviewers for comments on an earlier version of this paper, and to Kim Shifren for providing us with her data set, which was collected at Syracuse University.
23.
Recently, a number of model selection heuristics (i.e., DIFFIT, CORCONDIA, the numerical convex hull based heuristic) have been proposed for choosing among Parafac and/or Tucker3 solutions of different complexity for a given three-way three-mode data set. Such heuristics are often validated by means of extensive simulation studies. However, these simulation studies are unrealistic in that it is assumed that the variance in real three-way data can be split into two parts: structural variance, due to a true underlying Parafac or Tucker3 model of low complexity, and random noise. In this paper, we start from the much more reasonable assumption that the variance in any real three-way data set is due to three different sources: (1) a strong Parafac or Tucker3 structure of low complexity, accounting for a considerable amount of variance, (2) a weak Tucker3 structure, capturing less prominent data aspects, and (3) random noise. As such, Parafac and Tucker3 simulation studies are run in which the data are generated by adding a weak Tucker3 structure to a strong Parafac or Tucker3 one and perturbing the resulting data with random noise. The design of these studies is based on the reanalysis of real data sets. In these studies, the performance of the numerical convex hull based model selection method is evaluated with respect to its capability of discriminating strong from weak underlying structures. The results show that in about two-thirds of the simulated cases, the hull heuristic yields a model of the same complexity as the strong underlying structure and thus succeeds in disentangling strong and weak underlying structures. In the vast majority of the remaining third, this heuristic selects a solution that combines the strong structure and (part of) the weak structure.
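The hull idea can be illustrated with a small sketch. This is our own simplification, assuming fit is expressed as variance accounted for and complexity as a single number; the published heuristic involves further details not shown here.

```python
import numpy as np

def hull_select(points):
    """Pick a (complexity, fit) pair via a convex-hull / scree-ratio rule.

    Simplified sketch: keep only models on the upper convex boundary of the
    complexity-vs-fit plot, then select the point where the ratio of the
    preceding to the following slope is largest (the 'elbow').
    """
    pts = sorted(points)
    keep, best = [], -np.inf
    for c, f in pts:                      # fit must increase with complexity
        if f > best:
            keep.append((c, f)); best = f
    changed = True                        # peel points under the hull boundary
    while changed and len(keep) > 2:
        changed = False
        for i in range(1, len(keep) - 1):
            (c0, f0), (c1, f1), (c2, f2) = keep[i - 1], keep[i], keep[i + 1]
            if f1 <= f0 + (f2 - f0) * (c1 - c0) / (c2 - c0):
                del keep[i]; changed = True
                break
    if len(keep) < 3:                     # degenerate case: no interior point
        return keep[-1]
    def st(i):                            # scree-test ratio at interior point i
        (c0, f0), (c1, f1), (c2, f2) = keep[i - 1], keep[i], keep[i + 1]
        after = (f2 - f1) / (c2 - c1)
        return np.inf if after == 0 else ((f1 - f0) / (c1 - c0)) / after
    return keep[max(range(1, len(keep) - 1), key=st)]
```

In the simulations discussed above, "strong" structures show up as points before the elbow and "weak" structures as the shallow tail after it.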
24.
Hierarchical relations among three-way methods
25.
26.
A generalization of Takane's algorithm for DEDICOM
An algorithm is described for fitting the DEDICOM model for the analysis of asymmetric data matrices. This algorithm generalizes an algorithm suggested by Takane in that it uses a damping parameter in the iterative process. Takane's algorithm does not always converge monotonically. Based on the generalized algorithm, a modification of Takane's algorithm is suggested such that the modified algorithm converges monotonically. It is suggested to choose as starting configurations for the algorithm those configurations that yield closed-form solutions in some special cases. Finally, a sufficient condition is described for monotonic convergence of Takane's original algorithm.

Financial support by the Netherlands Organization for Scientific Research (NWO) is gratefully acknowledged. The authors are obliged to Richard Harshman.
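As a rough illustration of what a damping parameter buys in such an iteration, consider the toy scheme below. It fits X ≈ A R A' by gradient descent, not by Takane's update or the paper's generalized algorithm; the function and its settings are our own assumptions.

```python
import numpy as np

def fit_dedicom(X, k, iters=300, seed=0):
    """Fit X ~ A R A' by damped gradient descent (toy illustration).

    The step size is halved until the loss decreases, so the iteration is
    monotonically non-increasing -- the property a damping parameter is
    meant to guarantee.
    """
    rng = np.random.default_rng(seed)
    A = rng.normal(size=(X.shape[0], k))
    R = rng.normal(size=(k, k))
    loss = lambda A, R: np.linalg.norm(X - A @ R @ A.T) ** 2
    cur, step = loss(A, R), 0.1
    for _ in range(iters):
        E = X - A @ R @ A.T
        gA = E @ A @ R.T + E.T @ A @ R   # descent direction for A
        gR = A.T @ E @ A                 # descent direction for R
        while step > 1e-12:              # damp the step until the loss drops
            A2, R2 = A + step * gA, R + step * gR
            new = loss(A2, R2)
            if new <= cur:
                A, R, cur = A2, R2, new
                step *= 1.5              # cautiously re-enlarge the step
                break
            step *= 0.5
    return A, R, cur
```

Without the inner damping loop, a fixed step of this kind can overshoot and make the loss oscillate, which is the non-monotone behaviour the abstract describes.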
27.
28.
Principal covariate regression (PCOVR) is a method for regressing a set of criterion variables on a set of predictor variables when the latter are many in number and/or collinear. This is done by extracting a limited number of components that simultaneously synthesize the predictor variables and predict the criterion ones. So far, no procedure has been offered for estimating the statistical uncertainty of the obtained PCOVR parameter estimates. The present paper shows how this goal can be achieved, conditionally on the model specification, by means of the bootstrap approach. Four strategies for estimating bootstrap confidence intervals are derived, and their statistical behaviour in terms of coverage is assessed by means of a simulation experiment. The strategies are distinguished by the use of the varimax or quartimin procedure and by the use of Procrustes rotations of the bootstrap solutions towards the sample solution. In general, the four strategies showed appropriate statistical behaviour, with coverage tending to the desired level for increasing sample sizes. The main exception involved the strategies based on the quartimin procedure in cases characterized by complex underlying component structures. The statistical behaviour was more appropriate when the proper number of components was extracted.
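The component-extraction step of PCOVR can be sketched as follows. This is our simplified rendering with hypothetical names and simplified scaling conventions; the rotation procedures and the bootstrap confidence-interval strategies studied in the paper are not shown.

```python
import numpy as np

def pcovr(X, Y, k, alpha=0.5):
    """Extract k PCovR components (simplified sketch).

    Component scores are taken as the dominant eigenvectors of
    alpha * X X' + (1 - alpha) * H Y Y' H, where H projects onto the
    column space of X; alpha weighs reconstruction of X against
    prediction of Y.
    """
    H = X @ np.linalg.pinv(X)                       # projection onto col(X)
    G = alpha * (X @ X.T) + (1 - alpha) * (H @ Y @ Y.T @ H)
    _, vecs = np.linalg.eigh(G)                     # ascending eigenvalues
    T = vecs[:, ::-1][:, :k]                        # top-k eigenvectors
    Px = np.linalg.lstsq(T, X, rcond=None)[0].T     # loadings for X
    Py = np.linalg.lstsq(T, Y, rcond=None)[0].T     # regression weights for Y
    return T, Px, Py
```

The bootstrap strategies of the paper would resample rows of (X, Y), re-run this extraction, rotate each solution, and collect the resulting parameter estimates into confidence intervals.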
29.
Prior to a three-way component analysis of a three-way data set, it is customary to preprocess the data by centering and/or rescaling them. Harshman and Lundy (1984) considered that three-way data actually consist of a three-way model part, which in fact pertains to ratio-scale measurements, as well as additive "offset" terms that turn the ratio-scale measurements into interval-scale measurements. They mentioned that such offset terms might be estimated by incorporating additional components in the model, but discarded this idea in favor of an approach that removes such terms from the model by means of centering; estimates for the three-way component model parameters are then obtained by analyzing the centered data. In the present paper, the possibility of actually estimating the offset terms is taken up again. First, it is mentioned in which cases such offset terms can be estimated uniquely. Next, procedures are offered for estimating model parameters and offset parameters simultaneously, as well as successively (i.e., providing offset term estimates after the three-way model parameters have been estimated in the traditional way on the basis of the centered data). These procedures are provided for both the CANDECOMP/PARAFAC model and the Tucker3 model extended with offset terms. The successive and simultaneous approaches have been compared on the basis of simulated data. Both procedures perform well when the fitted model captures at least all offset terms actually underlying the data, with the simultaneous procedures performing slightly better. If fewer offset terms are fitted than actually underlie the data, the results are considerably poorer, but in these cases the successive procedures performed better than the simultaneous ones.

All in all, it can be concluded that the traditional approach for estimating model parameters can hardly be improved upon, and that offset terms can be estimated sufficiently well by the proposed successive approach, which is a simple extension of the traditional approach.

The author is obliged to Jos M.F. ten Berge and Marieke Timmerman for helpful comments on an earlier version of this paper, and to Iven van Mechelen for making available the data set used in Section 6.
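The successive idea has a simple two-way analogue, sketched below for illustration only (the paper works with three-way CANDECOMP/PARAFAC and Tucker3 models): remove the offsets by centering, fit the component model to the centered data, and keep the removed means as the offset estimates.

```python
import numpy as np

def successive_fit(X, k):
    """Two-way analogue of the successive approach (illustration only).

    Offsets are estimated as column means; the component model is then
    fitted to the centered data via a truncated SVD.
    """
    offsets = X.mean(axis=0)                 # offset estimates
    U, s, Vt = np.linalg.svd(X - offsets, full_matrices=False)
    scores = U[:, :k] * s[:k]                # component scores
    loadings = Vt[:k].T                      # component loadings
    return scores, loadings, offsets
```

In the simultaneous approach, by contrast, the offsets would be free parameters updated jointly with the component matrices instead of being fixed to the means beforehand.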
30.