Article Search
Full text (subscription): 296 articles
Free: 22 articles
Free (domestic): 21 articles
Total: 339 articles
Results by publication year:
2023: 5    2022: 2    2021: 5    2020: 9    2019: 8    2018: 8    2017: 11    2016: 9
2015: 3    2014: 8    2013: 25   2012: 6    2011: 8    2010: 6    2009: 12   2008: 8
2007: 10   2006: 8    2005: 17   2004: 4    2003: 6    2002: 2    2001: 4    2000: 4
1999: 5    1998: 6    1997: 7    1996: 11   1995: 3    1994: 3    1993: 7    1992: 5
1991: 12   1990: 2    1989: 6    1988: 5    1987: 4    1986: 6    1985: 9    1984: 6
1983: 3    1982: 7    1981: 1    1980: 10   1979: 9    1978: 9    1977: 8    1976: 7
Sort order:   339 results found in total (search time: 46 ms)
1.
This paper suggests a method to supplant missing categorical data by reasonable replacements. These replacements will maximize the consistency of the completed data as measured by Guttman's squared correlation ratio. The text outlines a solution of the optimization problem, describes relationships with the relevant psychometric theory, and studies some properties of the method in detail. The main result is that the average correlation should be at least 0.50 before the method becomes practical. At that point, the technique gives reasonable results with up to 10–15% missing data. We thank Anneke Bloemhoff of NIPG-TNO for compiling the Dutch Life Style Survey data and making them available to us, and Chantal Houée and Thérèse Bardaine, IUT, Vannes, France, exchange students under the COMETT program of the EC, for computational assistance. We also thank Donald Rubin, the Editors and several anonymous reviewers for constructive suggestions.
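A minimal illustrative sketch of the idea (not the authors' optimization algorithm): fill missing cells with the categories that maximize Guttman's squared correlation ratio, averaged over items, using a crude homogeneity-analysis object score. The toy data, the `0` placeholder for missing values, and the brute-force search over candidate categories are all invented for the example.

```python
# Illustrative only: greedily complete missing categorical cells so that
# Guttman's squared correlation ratio (eta^2), averaged over items, is maximal.
import numpy as np
from itertools import product

def object_scores(data):
    """First principal axis of the column-centered one-hot coded data matrix."""
    onehot = []
    for j in range(data.shape[1]):
        cats = np.unique(data[:, j])
        onehot.append((data[:, j][:, None] == cats[None, :]).astype(float))
    G = np.hstack(onehot)
    G = G - G.mean(axis=0)
    u, s, vt = np.linalg.svd(G, full_matrices=False)
    return u[:, 0]

def mean_eta_squared(data):
    """Average of Guttman's squared correlation ratio over all items."""
    x = object_scores(data)
    total = x.var()
    etas = []
    for j in range(data.shape[1]):
        between = sum(
            (data[:, j] == c).mean() * (x[data[:, j] == c].mean() - x.mean()) ** 2
            for c in np.unique(data[:, j])
        )
        etas.append(between / total)
    return float(np.mean(etas))

def impute(data, missing, categories):
    """Brute-force search over replacements (feasible only for a few missing cells)."""
    best, best_fit = None, -np.inf
    for combo in product(categories, repeat=len(missing)):
        trial = data.copy()
        for (i, j), c in zip(missing, combo):
            trial[i, j] = c
        fit = mean_eta_squared(trial)
        if fit > best_fit:
            best, best_fit = trial, fit
    return best, best_fit

# Toy example: 6 respondents, 3 items with categories 1..3; cell (5, 2) is missing (coded 0).
data = np.array([[1, 1, 1], [1, 1, 2], [2, 2, 2], [2, 2, 2], [3, 3, 3], [3, 3, 0]])
completed, fit = impute(data, missing=[(5, 2)], categories=[1, 2, 3])
print(completed[5], round(fit, 3))
```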
2.
An Extended Two-Way Euclidean Multidimensional Scaling (MDS) model which assumes both common and specific dimensions is described and contrasted with the standard (two-way) MDS model. In this Extended Two-Way Euclidean model the n stimuli (or other objects) are assumed to be characterized by coordinates on R common dimensions. In addition, each stimulus is assumed to have a dimension (or dimensions) specific to it alone. The overall distance between object i and object j is then defined as the square root of the ordinary squared Euclidean distance plus terms denoting the specificity of each object. The specificity, s_i, can be thought of as the sum of squares of coordinates on those dimensions specific to object i, all of which have nonzero coordinates only for object i. (In practice, we may think of there being just one such specific dimension for each object, as this situation is mathematically indistinguishable from the case in which there is more than one.) We further assume that δ_ij = F(d_ij) + e_ij, where δ_ij is the proximity value (e.g., similarity or dissimilarity) of objects i and j, d_ij is the extended Euclidean distance defined above, and e_ij is an error term assumed i.i.d. N(0, σ²). F is assumed to be either a linear function (in the metric case) or a monotone spline of specified form (in the quasi-nonmetric case). A numerical procedure alternating a modified Newton-Raphson algorithm with an algorithm for fitting an optimal monotone spline (or linear function) is used to secure maximum likelihood estimates of the parameters; likelihood-based statistics can then be used to test hypotheses about the number of common dimensions, and/or the existence of specific (in addition to R common) dimensions. This approach is illustrated with applications to both artificial data and real data on judged similarity of nations.
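For concreteness, a small sketch of the extended distance just defined, d_ij = sqrt(squared Euclidean distance + s_i + s_j). The coordinates, specificities, and the linear F used to generate proximities are invented; the Newton-Raphson/monotone-spline estimation procedure itself is not shown.

```python
# Minimal sketch of the extended Euclidean distance: ordinary squared distance
# on R common dimensions plus a nonnegative specificity for each object.
import numpy as np

def extended_distances(X, s):
    """X: (n, R) coordinates on common dimensions; s: (n,) specificities."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=2)  # ordinary squared distances
    d2 = sq + s[:, None] + s[None, :]                        # add s_i + s_j
    np.fill_diagonal(d2, 0.0)                                # self-distance stays zero
    return np.sqrt(d2)

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 2))         # 5 objects, R = 2 common dimensions
s = rng.uniform(0.1, 0.5, size=5)   # one specific dimension per object
D = extended_distances(X, s)
# Metric version of the model: delta_ij = a + b * d_ij + noise
delta = 1.0 + 2.0 * D + rng.normal(scale=0.1, size=D.shape)
print(np.round(D, 2))
```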
3.
Three-way metric unfolding via alternating weighted least squares   (Total citations: 6; self-citations: 3; citations by others: 3)
Three-way unfolding was developed by DeSarbo (1978) and reported in DeSarbo and Carroll (1980, 1981) as a new model to accommodate the analysis of two-mode, three-way data (e.g., nonsymmetric proximities for stimulus objects collected over time) and three-mode, three-way data (e.g., subjects rendering preference judgments for various stimuli in different usage occasions or situations). This paper presents a revised objective function and a new algorithm which attempt to prevent the common type of degenerate solutions encountered in typical unfolding analyses. We begin with an introduction to the problem and a review of three-way unfolding. The three-way unfolding model, weighted objective function, and new algorithm are then presented. Monte Carlo work based on a fractional factorial experimental design is described, investigating the effect of several data and model factors on overall algorithm performance. Finally, three applications of the methodology are reported, illustrating the flexibility and robustness of the procedure. We wish to thank the editor and reviewers for their insightful comments.
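The sketch below writes out one plausible reading of a weighted three-way unfolding loss: ideal points for row objects, points for column objects, slice-specific dimension weights, and a weighted least-squares badness-of-fit. It is an illustration under those assumptions, not the paper's revised objective function or algorithm; all data are simulated.

```python
# Rough sketch of a weighted three-way unfolding loss (illustration only).
import numpy as np

def unfolding_distances(X, Y, W):
    """d[k, i, j] = sqrt(sum_r W[k, r] * (X[i, r] - Y[j, r])**2)."""
    diff2 = (X[:, None, :] - Y[None, :, :]) ** 2           # (I, J, R)
    return np.sqrt(np.einsum('kr,ijr->kij', W, diff2))

def weighted_loss(Delta, X, Y, W, weights):
    """Weighted least-squares badness-of-fit of model distances to the data Delta."""
    D = unfolding_distances(X, Y, W)
    return float((weights * (Delta - D) ** 2).sum())

rng = np.random.default_rng(1)
I, J, K, R = 6, 4, 3, 2                      # row objects, column objects, slices, dimensions
X, Y = rng.normal(size=(I, R)), rng.normal(size=(J, R))
W = rng.uniform(0.5, 1.5, size=(K, R))       # slice-specific dimension weights
Delta = unfolding_distances(X, Y, W) + rng.normal(scale=0.05, size=(K, I, J))
print(weighted_loss(Delta, X, Y, W, weights=np.ones_like(Delta)))
```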
4.
The properties of nonmetric multidimensional scaling (NMDS) are explored by specifying statistical models, proving statistical consistency, and developing hypothesis testing procedures. Statistical models with errors in the dependent and independent variables are described for quantitative and qualitative data. For these models, statistical consistency often depends crucially upon how error enters the model and how data are collected and summarized (e.g., by means, medians, or rank statistics). A maximum likelihood estimator for NMDS is developed, and its relationship to the standard Shepard-Kruskal estimation method is described. This maximum likelihood framework is used to develop a method for testing the overall fit of the model.
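As background for the Shepard-Kruskal method mentioned here, a short sketch of Kruskal's stress-1 computed with monotone (isotonic) regression of configuration distances on the observed dissimilarities. The configuration and noisy dissimilarities are simulated; the paper's maximum likelihood estimator and tests are not reproduced.

```python
# Kruskal's stress-1 for a configuration, using isotonic regression as the
# monotone fit of distances to dissimilarities.
import numpy as np
from scipy.spatial.distance import pdist
from sklearn.isotonic import IsotonicRegression

def stress1(dissimilarities, X):
    """dissimilarities: condensed vector (as from pdist); X: (n, p) configuration."""
    d = pdist(X)                                                    # configuration distances
    dhat = IsotonicRegression().fit_transform(dissimilarities, d)   # monotone regression
    return np.sqrt(((d - dhat) ** 2).sum() / (d ** 2).sum())

rng = np.random.default_rng(2)
X_true = rng.normal(size=(10, 2))
delta = pdist(X_true) + rng.normal(scale=0.05, size=45)   # noisy dissimilarities
print(stress1(delta, X_true))                     # near 0 for the generating configuration
print(stress1(delta, rng.normal(size=(10, 2))))   # larger for a random configuration
```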
5.
Three methods for estimating reliability are studied within the context of nonparametric item response theory. Two were proposed originally by Mokken (1971) and a third is developed in this paper. Using a Monte Carlo strategy, these three estimation methods are compared with four classical lower bounds to reliability. Finally, recommendations are given concerning the use of these estimation methods. The authors are grateful for constructive comments from the reviewers and from Charles Lewis.
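Two of the best-known classical lower bounds in this literature, Cronbach's alpha and Guttman's lambda-2, can be computed directly from an item-score matrix, as in the sketch below. The simulated dichotomous items are invented; the Mokken-based estimators studied in the paper are not shown.

```python
# Classical lower bounds to reliability from a persons-by-items score matrix.
import numpy as np

def alpha_and_lambda2(scores):
    """Return (Cronbach's alpha, Guttman's lambda-2) for scores of shape (persons, items)."""
    k = scores.shape[1]
    cov = np.cov(scores, rowvar=False)
    total_var = cov.sum()                        # variance of the sum score
    off = cov - np.diag(np.diag(cov))            # off-diagonal (inter-item) covariances
    alpha = k / (k - 1) * off.sum() / total_var
    lambda2 = (off.sum() + np.sqrt(k / (k - 1) * (off ** 2).sum())) / total_var
    return alpha, lambda2

rng = np.random.default_rng(3)
theta = rng.normal(size=(200, 1))                                        # latent trait
items = (theta + rng.normal(scale=1.0, size=(200, 5)) > 0).astype(int)   # 5 dichotomous items
print(alpha_and_lambda2(items))
```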
6.
Forced classification: A simple application of a quantification method   (Total citations: 1; self-citations: 0; citations by others: 1)
This study formulates a property of a quantification method, the principle of equivalent partitioning (PEP). When the PEP is used together with Guttman's principle of internal consistency (PIC) in a simple way, the combination offers an interesting way of analyzing categorical data in terms of the variate(s) chosen by the investigator, a type of canonical analysis. The study discusses applications of the technique to multiple-choice, rank-order, and paired comparison data.This study was supported by the Natural Sciences and Engineering Research Council of Canada (Grant No. A7942). Comments on the earlier drafts from anonymous reviewers and the editor were much appreciated.
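Forced classification is commonly summarized as quantification (dual scaling) with the criterion item weighted very heavily, so that the leading dimension is governed by that item. The sketch below follows that summary rather than the paper's own formulation; the data, the weighting constant, and the homogeneity-analysis-style scoring are all invented for illustration.

```python
# Crude forced-classification sketch: heavily weight the criterion item's
# indicator block before a homogeneity-analysis-style quantification.
import numpy as np

def indicator(col):
    cats = np.unique(col)
    return (col[:, None] == cats[None, :]).astype(float)

def forced_classification(items, criterion, weight=100.0):
    """items: list of 1-D category arrays; criterion: index of the forced item."""
    blocks = [indicator(c) for c in items]
    blocks[criterion] *= np.sqrt(weight)      # emphasize the criterion item
    G = np.hstack(blocks)
    G -= G.mean(axis=0)                       # center columns
    u, s, vt = np.linalg.svd(G, full_matrices=False)
    return u[:, 0]                            # object scores on the forced dimension

rng = np.random.default_rng(4)
latent = rng.integers(0, 2, size=50)
items = [np.where(rng.random(50) < 0.8, latent, 1 - latent) for _ in range(4)]
scores = forced_classification(items, criterion=0)
print(np.round(scores[:10], 3))
```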
7.
Redundancy analysis (also called principal components analysis of instrumental variables) is a technique for two sets of variables, one set being dependent on the other. Its aim is maximization of the explained variance of the dependent variables by a linear combination of the explanatory variables. The technique is generalized to qualitative variables; it then implicitly gives a simultaneous optimal scaling of the dependent, qualitative variables. Examples are taken from the Dutch Life Situation Survey 1977, using Satisfaction with Life and Happiness as dependent variables. The analysis leads to one well-being scale, defined by the explanatory variables Marital status, Schooling, Income and Activity. The views expressed in this paper are those of the author and do not necessarily reflect the policies of the Netherlands Central Bureau of Statistics.
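In its quantitative form, redundancy analysis amounts to principal components of the fitted values from a multivariate regression of the dependent set on the explanatory set. The sketch below shows that computation only; the optimal scaling of qualitative variables described in the abstract is omitted and the example data are synthetic.

```python
# Redundancy analysis for quantitative variables: PCA of the regression-fitted Y.
import numpy as np

def redundancy_analysis(X, Y, n_components=1):
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    B, *_ = np.linalg.lstsq(Xc, Yc, rcond=None)    # multivariate regression of Y on X
    Yhat = Xc @ B                                   # part of Y explained by X
    u, s, vt = np.linalg.svd(Yhat, full_matrices=False)
    scores = u[:, :n_components] * s[:n_components]          # redundancy variates
    explained = (s[:n_components] ** 2) / (Yc ** 2).sum()    # share of Y's total variance
    return scores, explained

rng = np.random.default_rng(5)
X = rng.normal(size=(100, 4))                                             # explanatory set
Y = X @ rng.normal(size=(4, 2)) + rng.normal(scale=0.5, size=(100, 2))    # dependent set
scores, explained = redundancy_analysis(X, Y)
print(explained)
```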
8.
Abstract: A probabilistic multidimensional scaling model is proposed. The model assumes that the coordinates of each stimulus are normally distributed with variance Σ_i = diag(σ²_1i, …, σ²_Ri). The advantage of this model is that the axes are determined uniquely. The distribution of the distance between two stimuli is obtained by a polar coordinate transformation. The method of maximum likelihood estimation of the means and variances using the EM algorithm is discussed. Further, simulated annealing is suggested as a means of obtaining initial values in order to avoid local maxima. A simulation study shows that the estimates are accurate, and a numerical example concerning the location of Japanese cities shows that natural axes can be obtained without introducing individual parameters.
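A quick simulation of the distributional assumption just stated: stimulus coordinates drawn from normals with stimulus-specific diagonal covariance, and the implied distance distribution examined by Monte Carlo. The means and standard deviations are invented; the EM estimation and simulated-annealing start described in the abstract are not reproduced.

```python
# Monte Carlo look at the distance distribution under the probabilistic MDS model.
import numpy as np

rng = np.random.default_rng(6)
R = 2
mu_i, mu_j = np.array([0.0, 0.0]), np.array([3.0, 1.0])
sd_i, sd_j = np.array([0.5, 0.2]), np.array([0.3, 0.4])   # sqrt of diag(Sigma_i), diag(Sigma_j)

xi = mu_i + sd_i * rng.normal(size=(100_000, R))          # sampled coordinates for stimulus i
xj = mu_j + sd_j * rng.normal(size=(100_000, R))          # sampled coordinates for stimulus j
d = np.linalg.norm(xi - xj, axis=1)                       # sampled inter-stimulus distances

print(d.mean(), d.std())                 # compare with the distance between the means:
print(np.linalg.norm(mu_i - mu_j))
```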
9.
A new computational method to fit the weighted euclidean distance model   (Total citations: 1; self-citations: 0; citations by others: 1)
This paper describes a computational method for weighted Euclidean distance scaling which combines aspects of an analytic solution with an approach using loss functions. We justify this new method by giving a simplified treatment of the algebraic properties of a transformed version of the weighted distance model. The new algorithm is much faster than INDSCAL yet less arbitrary than other analytic procedures. The procedure, which we call SUMSCAL (subjective metric scaling), gives essentially the same solutions as INDSCAL for the two moderate-size data sets tested. Comments by J. Douglas Carroll and J. B. Kruskal have been very helpful in preparing this paper.
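For reference, the weighted Euclidean (INDSCAL-type) distance model that SUMSCAL fits, written out directly: subject k stretches the common dimensions by weights w_k before distances are computed. The coordinates and weights below are invented, and SUMSCAL's own hybrid algorithm is not reproduced.

```python
# The weighted Euclidean distance model: d_ij(k) = sqrt(sum_r w_kr * (x_ir - x_jr)^2).
import numpy as np

def weighted_distances(X, w):
    """X: (n, R) group stimulus space; w: (R,) nonnegative subject weights."""
    diff2 = (X[:, None, :] - X[None, :, :]) ** 2
    return np.sqrt((diff2 * w).sum(axis=2))

rng = np.random.default_rng(7)
X = rng.normal(size=(6, 2))
print(weighted_distances(X, np.array([1.0, 1.0])))   # ordinary Euclidean distances
print(weighted_distances(X, np.array([2.0, 0.2])))   # a subject who stresses dimension 1
```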
10.
The vast majority of existing multidimensional scaling (MDS) procedures devised for the analysis of paired comparison preference/choice judgments are typically based on either scalar product (i.e., vector) or unfolding (i.e., ideal-point) models. Such methods tend to ignore many of the essential components of microeconomic theory, including convex indifference curves, constrained utility maximization, demand functions, et cetera. This paper presents a new stochastic MDS procedure called MICROSCALE that attempts to operationalize many of these traditional microeconomic concepts. First, we briefly review several existing MDS models that operate on paired comparisons data, noting the particular nature of the utility functions implied by each class of models. These utility assumptions are then directly contrasted with those of microeconomic theory. The new maximum likelihood based procedure, MICROSCALE, is presented, together with the technical details of the estimation procedure. Results are provided from a Monte Carlo analysis investigating the performance of the algorithm as a number of model, data, and error factors are experimentally manipulated. Finally, an illustration in consumer psychology is discussed, concerning a convenience sample of thirty consumers providing paired comparison judgments for some fourteen brands of over-the-counter analgesics.
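Not MICROSCALE itself, but the kind of building block such paired-comparison MDS procedures start from: a scalar-product (vector) utility model with a logistic choice rule and its log-likelihood. The subject vectors, brand coordinates, and choice data below are all invented for illustration.

```python
# Log-likelihood of paired comparison choices under a vector (scalar-product)
# utility model with a logistic choice rule.
import numpy as np

def pair_loglik(choices, pairs, B, V):
    """choices[n] = 1 if subject s prefers brand i over j, where pairs[n] = (s, i, j);
    B: (subjects, R) preference vectors; V: (brands, R) brand coordinates."""
    s, i, j = pairs.T
    udiff = np.sum(B[s] * (V[i] - V[j]), axis=1)     # utility difference u_si - u_sj
    p = 1.0 / (1.0 + np.exp(-udiff))                 # logistic choice probability
    return float(np.sum(choices * np.log(p) + (1 - choices) * np.log1p(-p)))

rng = np.random.default_rng(8)
B = rng.normal(size=(30, 2))          # 30 consumers
V = rng.normal(size=(14, 2))          # 14 brands
pairs = np.array([(s, i, j) for s in range(30) for i in range(14) for j in range(i + 1, 14)])
true_p = 1 / (1 + np.exp(-np.sum(B[pairs[:, 0]] * (V[pairs[:, 1]] - V[pairs[:, 2]]), axis=1)))
choices = (rng.random(true_p.shape) < true_p).astype(float)
print(pair_loglik(choices, pairs, B, V))
```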