91.
The present study had two major purposes. First, it sought to determine to what extent the findings of an earlier study of distance estimation in stairways (Hanyu & Itsukushima, 1995) would generalize to other types of stairway. Second, it sought to examine which hypothesis, information storage or effort, better explains the earlier results, in which people overestimated distance and traversal time. We obtained four distance and time measures: distance estimate, traversal time estimate, mental walking time, and actual traversal time. To measure information, we had participants rate each stairway for complexity (simple-complex) and effort (effortless-effortful) before and after the distance and time measurement tasks. The results revealed that the earlier findings (Hanyu & Itsukushima, 1995) did not fully generalize. The results also supported neither the information storage nor the effort hypothesis.
92.
Although the Bock–Aitkin likelihood-based estimation method for factor analysis of dichotomous item response data has important advantages over classical analysis of item tetrachoric correlations, a serious limitation of the method is its reliance on fixed-point Gauss-Hermite (G-H) quadrature in the solution of the likelihood equations and likelihood-ratio tests. When the number of latent dimensions is large, computational considerations require that the number of quadrature points per dimension be few. But with large numbers of items, the dispersion of the likelihood, given the response pattern, becomes so small that the likelihood cannot be accurately evaluated with the sparse fixed points in the latent space. In this paper, we demonstrate that substantial improvement in accuracy can be obtained by adapting the quadrature points to the location and dispersion of the likelihood surfaces corresponding to each distinct pattern in the data. In particular, we show that adaptive G-H quadrature, combined with mean and covariance adjustments at each iteration of an EM algorithm, produces an accurate fast-converging solution with as few as two points per dimension. Evaluations of this method with simulated data are shown to yield accurate recovery of the generating factor loadings for models of up to eight dimensions. Unlike an earlier application of adaptive Gibbs sampling to this problem by Meng and Schilling, the simulations also confirm the validity of the present method in calculating likelihood-ratio chi-square statistics for determining the number of factors required in the model. Finally, we apply the method to a sample of real data from a test of teacher qualifications.
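As a rough illustration of the adaptive-quadrature idea described in this abstract (not the authors' implementation), the sketch below re-centres and rescales one-dimensional Gauss-Hermite nodes at a pattern-specific posterior mean and standard deviation before evaluating a marginal likelihood; the two-item 2PL likelihood, its slopes and thresholds, and the posterior moments are invented for illustration.

```python
import numpy as np
from numpy.polynomial.hermite import hermgauss
from scipy.stats import norm

def adaptive_gh_marginal(log_lik, mu, sigma, n_points=2):
    """Approximate the integral of exp(log_lik(theta)) * N(theta; 0, 1) dtheta
    by re-centring Gauss-Hermite nodes at a pattern-specific posterior mean
    `mu` and scaling them by its standard deviation `sigma` (1-D sketch)."""
    x, w = hermgauss(n_points)                     # standard G-H nodes/weights
    theta = mu + np.sqrt(2.0) * sigma * x          # adapted nodes
    g = np.exp(log_lik(theta)) * norm.pdf(theta)   # integrand at the nodes
    return np.sqrt(2.0) * sigma * np.sum(w * np.exp(x ** 2) * g)

# Hypothetical two-item 2PL likelihood for the response pattern (1, 0);
# slopes `a`, thresholds `b`, and the posterior moments are made up.
a, b = np.array([1.2, 0.8]), np.array([-0.3, 0.5])

def log_lik(theta):
    p = 1.0 / (1.0 + np.exp(-a[:, None] * (theta - b[:, None])))
    return np.log(p[0]) + np.log(1.0 - p[1])

print(adaptive_gh_marginal(log_lik, mu=0.4, sigma=0.9, n_points=2))
```

With only two points per dimension, the accuracy of such a scheme hinges entirely on how well the re-centring tracks each pattern's posterior, which is the point the paper makes.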
93.
The authors introduce subset conjunction as a classification rule by which an acceptable alternative must satisfy some minimum number of criteria. The rule subsumes conjunctive and disjunctive decision strategies as special cases. Subset conjunction can be represented in a binary-response model, for example, in a logistic regression, using only main effects or only interaction effects. This results in a confounding of the main and interaction effects when there is little or no response error. With greater response error, a logistic regression, even if it gives a good fit to data, can produce parameter estimates that do not reflect the underlying decision process. The authors propose a model in which the binary classification of alternatives into acceptable/unacceptable categories is based on a probabilistic implementation of a subset-conjunctive process. The satisfaction of decision criteria biases the odds toward one outcome or the other. The authors then describe a two-stage choice model in which a (possibly large) set of alternatives is first reduced using a subset-conjunctive rule, after which an alternative is selected from this reduced set of items. They describe methods for estimating the unobserved consideration probabilities from classification and choice data, and illustrate the use of the models for cancer diagnosis and consumer choice. They report the results of simulations investigating estimation accuracy, incidence of local optima, and model fit. The authors thank the Editor, the Associate Editor, and three anonymous reviewers for their constructive suggestions, and also thank Asim Ansari and Raghuram Iyengar for their helpful comments. They also thank Sawtooth Software, McKinsey and Company, and Intelliquest for providing the PC choice data, and the University of Wisconsin for making the breast-cancer data available at the machine learning archives.
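A minimal sketch of the subset-conjunctive screening rule itself, assuming binary criteria and a threshold k; the simple flip-error probability used here is a hedged stand-in for, not a reproduction of, the authors' odds-biasing formulation.

```python
import numpy as np

def subset_conjunctive(criteria_met, k):
    """Subset-conjunctive rule: acceptable iff at least k criteria are met.
    k = number of criteria  -> pure conjunctive ('and') screening
    k = 1                   -> pure disjunctive ('or') screening"""
    return int(np.sum(criteria_met)) >= k

def prob_acceptable(criteria_met, k, eps=0.1):
    """Noisy version: the deterministic verdict is reported with
    response-error probability eps (a simple stand-in error model)."""
    return (1.0 - eps) if subset_conjunctive(criteria_met, k) else eps

# An alternative meeting 2 of 3 screening criteria under a k = 2 rule.
print(subset_conjunctive(np.array([1, 0, 1]), k=2))   # True
print(prob_acceptable(np.array([1, 0, 1]), k=2))      # 0.9
```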
94.
95.
Liechty, Pieters & Wedel (2003) developed a hidden Markov model (HMM) to identify the states of an attentional process in an advertisement viewing task. This work is significant because it demonstrates the benefits of stochastic modeling and Bayesian estimation in making inferences about cognitive processes based on eye movement data. One limitation of the proposed approach is that attention is conceptualized as an autonomous random process that is affected neither by the overall layout of the stimulus nor by the visual information perceived during the current fixation. An alternative model based on the input-output hidden Markov model (IOHMM; Bengio, 1999) is suggested as an extension of the HMM. The need for further studies that validate the HMM classification results is also discussed.
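For readers unfamiliar with the HMM machinery under discussion, the following is a minimal forward-filtering sketch (not the Bayesian estimation used by Liechty et al.) showing how per-fixation emission likelihoods are turned into posterior probabilities over latent attention states; the two states and all numerical values are fabricated for illustration.

```python
import numpy as np

def forward_filter(pi, A, emission_lik):
    """HMM forward filtering: p(state_t | observations 1..t).
    pi           : (S,) initial state probabilities
    A            : (S, S) transition matrix, A[i, j] = p(j | i)
    emission_lik : (T, S) likelihood of each fixation under each state"""
    T, S = emission_lik.shape
    alpha = np.zeros((T, S))
    alpha[0] = pi * emission_lik[0]
    alpha[0] /= alpha[0].sum()
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * emission_lik[t]
        alpha[t] /= alpha[t].sum()        # normalise to avoid underflow
    return alpha

# Two invented attention states ('local' vs 'global' scanning) and
# fabricated per-fixation likelihoods, purely for illustration.
pi = np.array([0.5, 0.5])
A = np.array([[0.9, 0.1],
              [0.2, 0.8]])
lik = np.array([[0.8, 0.1],
                [0.7, 0.2],
                [0.1, 0.9]])
print(forward_filter(pi, A, lik))
```

An IOHMM, as suggested in the abstract, would make pi, A, and the emission distributions depend on observed inputs such as stimulus layout, rather than treating the state process as autonomous.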
96.
Three approaches to the analysis of main and interaction effect hypotheses in nonorthogonal designs were compared in a 2×2 design for data that were neither normal in form nor equal in variance. The approaches involved either least squares or robust estimators of central tendency and variability and/or a test statistic that either pools or does not pool sources of variance. Specifically, we compared the ANOVA F test using trimmed means and Winsorized variances, the Welch-James test with the usual least squares estimators of central tendency and variability, and the Welch-James test using trimmed means and Winsorized variances. As hypothesized, we found that the latter approach provided excellent Type I error control, whereas the former two did not. Financial support for this research was provided by grants to the first author from the Natural Sciences and Engineering Research Council of Canada (#OGP0015855) and the Social Sciences and Humanities Research Council (#410-95-0006). The authors would like to express their appreciation to the Associate Editor as well as the reviewers who provided valuable comments on an earlier version of this paper.
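The full 2×2 Welch-James statistic is more involved than can be shown briefly; as a hedged illustration of the robust ingredients named above (trimmed means and Winsorized variances), the sketch below implements the two-group analogue, Yuen's test, on made-up skewed, heteroscedastic data.

```python
import numpy as np
from scipy import stats

def yuen_test(x, y, trim=0.2):
    """Yuen's two-sample test on trimmed means with Winsorized variances
    (a two-group analogue of the robust Welch-James approach)."""
    def winsorized_var(a):
        w = np.asarray(stats.mstats.winsorize(a, limits=(trim, trim)))
        return np.var(w, ddof=1)
    nx, ny = len(x), len(y)
    hx = nx - 2 * int(np.floor(trim * nx))     # effective sample sizes
    hy = ny - 2 * int(np.floor(trim * ny))     # after trimming
    dx = (nx - 1) * winsorized_var(x) / (hx * (hx - 1))
    dy = (ny - 1) * winsorized_var(y) / (hy * (hy - 1))
    t = (stats.trim_mean(x, trim) - stats.trim_mean(y, trim)) / np.sqrt(dx + dy)
    df = (dx + dy) ** 2 / (dx ** 2 / (hx - 1) + dy ** 2 / (hy - 1))
    return t, df, 2 * stats.t.sf(abs(t), df)   # statistic, df, two-sided p

rng = np.random.default_rng(0)
x = rng.exponential(1.0, 30)                   # skewed, heteroscedastic
y = rng.exponential(2.0, 40)                   # toy data
print(yuen_test(x, y))
```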
97.
Hierarchical Bayes procedures for the two-parameter logistic item response model were compared for estimating item and ability parameters. Simulated data sets were analyzed via two joint and two marginal Bayesian estimation procedures. The marginal Bayesian estimation procedures yielded consistently smaller root mean square differences than the joint Bayesian estimation procedures for item and ability estimates. As the sample size and test length increased, the four Bayes procedures yielded essentially the same result. The authors wish to thank the Editor and anonymous reviewers for their insightful comments and suggestions.
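A hedged sketch of the two-parameter logistic (2PL) model and one possible joint log-posterior with simple normal and log-normal priors; these priors and the toy data are illustrative assumptions, not the hierarchical specification compared in the paper.

```python
import numpy as np

def p_2pl(theta, a, b):
    """Two-parameter logistic IRT model: P(correct | theta) with
    discrimination a and difficulty b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def joint_log_posterior(theta, a, b, responses):
    """Un-normalised joint log-posterior under illustrative priors:
    theta, b ~ N(0, 1) and log(a) ~ N(0, 1).  These priors are assumptions
    for the sketch, not the specification evaluated in the paper."""
    p = p_2pl(theta[:, None], a[None, :], b[None, :])       # persons x items
    loglik = np.sum(responses * np.log(p) + (1 - responses) * np.log(1.0 - p))
    logprior = (-0.5 * np.sum(theta ** 2)
                - 0.5 * np.sum(b ** 2)
                - 0.5 * np.sum(np.log(a) ** 2) - np.sum(np.log(a)))
    return loglik + logprior

# Simulated toy data: 5 examinees, 3 items.
rng = np.random.default_rng(1)
theta = rng.normal(size=5)
a = rng.lognormal(sigma=0.3, size=3)
b = rng.normal(size=3)
y = rng.binomial(1, p_2pl(theta[:, None], a[None, :], b[None, :]))
print(joint_log_posterior(theta, a, b, y))
```

Joint procedures work with a posterior like this over abilities and item parameters together, whereas marginal procedures integrate the abilities out before estimating the item parameters.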
98.
A size estimation (SE) paradigm and the Mueller-Lyer (ML) illusion were used to examine perceptual disturbances in schizophrenics. Thirty-five reliably diagnosed (DSM-III-R) schizophrenics were compared to 20 subjects with no history of psychiatric illness. Perceptual distortions found in previous studies of schizophrenics were confirmed only to a certain extent by the present results. More overestimators were found among the schizophrenics than among the normals on the SE task. The schizophrenics, above all the chronic patients, also proved to be more prone to the Mueller-Lyer illusion. One reason why the very clear differences between schizophrenics and normals found in previous examinations were not confirmed in the present study might be that a reliable diagnostic instrument was used in this kind of study for the first time.
99.
An observer is to make inference statements about a quantity p, called a propensity and bounded between 0 and 1, based on the observation that p does or does not exceed a constant c. The propensity p may have an interpretation as a proportion, as a long-run relative frequency, or as a personal probability held by some subject. Applications in medicine, engineering, political science, and, most especially, human decision making are indicated. Bayes solutions for the observer are obtained based on prior distributions in the mixture-of-beta-distributions family; these are then specialized to power-function prior distributions. Inference about log p and log odds is considered. Multiple-action problems are considered in which the focus of inference shifts to the process generating the propensities p, both when the process parameter is known to the subject and when it is unknown. Empirical Bayes techniques are developed for observer inference about c when the process parameter is known to the subject. A Bayes rule, a minimax rule, and a beta-minimax rule are constructed for the subject when he is uncertain about the process parameter. This research was partially supported by the Defense Advanced Research Projects Agency of the Department of Defense and was monitored by ONR under Contract No. N00014-77-C-0095. Any opinions, findings, conclusions or recommendations expressed herein are those of the author and do not necessarily reflect the views of the Defense Advanced Research Projects Agency, the Office of Naval Research, or Carnegie-Mellon University.
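As a toy illustration of inference from a single threshold observation, the sketch below updates a single Beta prior on the propensity p after learning only whether p exceeds c; the paper's richer beta-mixture and power-function priors, and its multiple-action extensions, are not reproduced here.

```python
from scipy import stats, integrate

def posterior_mean_given_threshold(alpha, beta, c, exceeds):
    """Posterior mean of a propensity p with a Beta(alpha, beta) prior after
    observing only whether p exceeds the cut-off c (truncated-beta update)."""
    prior = stats.beta(alpha, beta)
    lo, hi, mass = (c, 1.0, prior.sf(c)) if exceeds else (0.0, c, prior.cdf(c))
    # E[p | lo < p < hi] = (integral of p * prior.pdf(p) over (lo, hi)) / mass
    num, _ = integrate.quad(lambda p: p * prior.pdf(p), lo, hi)
    return num / mass

# Uniform Beta(1, 1) prior and the observation that p exceeds c = 0.6:
# the posterior mean is the midpoint of (0.6, 1), i.e. 0.8.
print(posterior_mean_given_threshold(1.0, 1.0, c=0.6, exceeds=True))
```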
100.
Suppose a collection of standard tests is given to all subjects in a random sample, but a different new test is given to each group of subjects in nonoverlapping subsamples. A simple method is developed for displaying the information that the data set contains about the correlational structure of the new tests. This is possible to some extent, even though each subject takes only one new test. The method uses plausible values of the partial correlations among the new tests given the standard tests in order to generate plausible simple correlations among the new tests and plausible multiple correlations between composites of the new tests and the standard tests. The real-data example included suggests that the method can be useful in practical problems.
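A minimal sketch of the arithmetic that turns a plausible value of the partial correlation between two new tests (given the standard tests) into the implied simple correlation, assuming standardized variables; the numerical inputs below are invented for illustration.

```python
import numpy as np

def implied_simple_correlation(R_zz, r_xz, r_yz, rho_partial):
    """Simple correlation between two new tests implied by a plausible value
    of their partial correlation given the standard tests (all variables
    standardized).  R_zz: standard-test correlation matrix; r_xz, r_yz:
    correlations of each new test with the standard tests."""
    R_inv = np.linalg.inv(R_zz)
    explained = r_xz @ R_inv @ r_yz     # covariance carried by the standard tests
    R2_x = r_xz @ R_inv @ r_xz          # squared multiple correlations of each
    R2_y = r_yz @ R_inv @ r_yz          # new test with the standard tests
    return explained + rho_partial * np.sqrt((1.0 - R2_x) * (1.0 - R2_y))

# Invented values: two standard tests and a plausible partial correlation
# of 0.3 between the two new tests.
R_zz = np.array([[1.0, 0.4],
                 [0.4, 1.0]])
r_xz = np.array([0.5, 0.3])
r_yz = np.array([0.2, 0.6])
print(implied_simple_correlation(R_zz, r_xz, r_yz, rho_partial=0.3))
```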