31.
One of the main objectives of many empirical studies in the social and behavioral sciences is to assess the causal effect of a treatment or intervention on the occurrence of a certain event. The randomized controlled trial is generally considered the gold standard for evaluating such causal effects. However, for ethical or practical reasons, social scientists are often bound to nonexperimental, observational designs. When the treatment and control groups differ with regard to variables that are related to the outcome, this may induce the problem of confounding. A variety of statistical techniques, such as regression, matching, and subclassification, are now available and routinely used to adjust for confounding due to measured variables. However, these techniques are not appropriate for dealing with time-varying confounding, which arises in situations where the treatment or intervention can be received at multiple time points. In this article, we explain the use of marginal structural models and inverse probability weighting to control for time-varying confounding in observational studies. We illustrate the approach with an empirical example of grade retention effects on mathematics development throughout primary school.
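To make the weighting step concrete, here is a minimal Python sketch of stabilized inverse probability weights for a binary treatment observed at two occasions, followed by a weighted outcome regression. The simulated data-generating process and all variable names (math_t0, retained_t1, and so on) are illustrative assumptions, not taken from the article's data or its exact model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression, LinearRegression

# Illustrative simulation: low-scoring pupils are more likely to be retained,
# and retention lowers the next score (time-varying confounding).
rng = np.random.default_rng(0)
n = 2_000
math_t0 = rng.normal(size=n)                                     # baseline score
retained_t1 = rng.binomial(1, 1 / (1 + np.exp(math_t0)))         # low scorers retained more
math_t1 = math_t0 - 0.5 * retained_t1 + rng.normal(size=n)
retained_t2 = rng.binomial(1, 1 / (1 + np.exp(math_t1))) * (1 - retained_t1)
math_t2 = math_t1 - 0.5 * retained_t2 + rng.normal(size=n)

def stabilized_weights(treat, X):
    """P(A_t) / P(A_t | history): marginal over conditional treatment probability."""
    p_cond = LogisticRegression().fit(X, treat).predict_proba(X)[:, 1]
    p_marg = treat.mean()
    return np.where(treat == 1, p_marg / p_cond, (1 - p_marg) / (1 - p_cond))

# Accumulate weights across the two treatment occasions.
w = stabilized_weights(retained_t1, math_t0.reshape(-1, 1)) * \
    stabilized_weights(retained_t2, np.column_stack([math_t0, math_t1, retained_t1]))

# Simple weighted marginal structural model for the final outcome.
ever_retained = ((retained_t1 + retained_t2) > 0).astype(float)
msm = LinearRegression().fit(ever_retained.reshape(-1, 1), math_t2, sample_weight=w)
print("IPW-estimated retention effect:", round(msm.coef_[0], 3))
```

The weighting reweights the sample so that, at each occasion, treatment is approximately independent of the measured history, after which an ordinary (weighted) regression on treatment alone estimates the marginal effect.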
32.
Ronald Cordero, Metaphilosophy, 2016, 47(4–5): 719–727
Logic is a central and highly useful part of philosophy. Its value is particularly evident when it comes to keeping our thinking about disjunctive probabilities clear. Because "or" has two meanings ("exactly one of these statements is true"; "at least one of these statements is true"), logic can show how the likelihood of a disjunction being true can be determined quite easily. To gauge the chance that one of two or more exclusive alternatives is true, one need only sum their respective likelihoods. And to know the chance that at least one of two or more compatible alternatives is true, one simply has to compute the chance that it is false that all of them are false.
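A worked example of both readings, with probabilities chosen here for illustration; the inclusive case additionally assumes the alternatives are independent so that "all false" factors into a product:

```latex
% Exclusive "or": exactly one alternative can hold, so probabilities add.
P(A \lor B) = P(A) + P(B) = 0.2 + 0.3 = 0.5
% Inclusive "or": "at least one true" is the negation of "all false".
% Assuming A and B independent:
P(A \lor B) = 1 - P(\neg A)\,P(\neg B) = 1 - 0.8 \times 0.7 = 0.44
```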
33.
This paper presents a practical implementation of multicriteria methodologies based on the UTA model by Jacquet-Lagrèze and Siskos and the Quasi-UTA model by Beuthe and Scannella, which are specified with a non-linear, but piecewise linear, additive utility function. In contrast with the general UTA model, the Quasi-UTA specification structures the partial utilities as recursive exponential functions of a single curvature parameter, which reduces the amount of information needed to build the utility function. The software MUSTARD implements several variants of these models. First, it offers the basic deterministic UTA disaggregation model, as well as its first programmed stochastic version. In both cases, the software proceeds stepwise and interactively, helping the decision maker to formulate the problem and state preferences between projects; in the stochastic case, the decision maker is also helped to build the criteria distributions. The Quasi-UTA specification can be introduced into this disaggregation model. Second, the software offers an aggregation model whereby the Quasi-UTA partial utility functions are built separately through specific questioning processes. The questions relating to deterministic criteria are of the 'direct rating' type, while those for stochastic criteria are of either the 'variable probability' or the 'variable outcome' type. The criteria weights can be assessed by the 'swing weight' method or by a UTA-II side-program. As an example as well as a test of the Quasi-UTA aggregation approach, the paper presents its application to a real problem of selecting road investment projects in Belgium. Several experts and civil servants were interviewed, and their individual utility functions derived. The projects are ranked according to their rate of return, computed on the basis of each project's certain-equivalent money value. Copyright © 2002 John Wiley & Sons, Ltd.
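One way to picture a partial utility driven by a single curvature parameter is the sketch below: breakpoint utilities whose increments grow geometrically (a recursive, exponential-type rule), linearly interpolated between breakpoints. This is an illustrative reading of the Quasi-UTA idea, not the authors' actual formula; the function name and parameterization are assumptions made here.

```python
import numpy as np

def quasi_uta_partial_utility(x, breakpoints, c):
    """Piecewise-linear partial utility whose breakpoint values grow
    geometrically under one curvature parameter c (illustrative only)."""
    k = len(breakpoints) - 1
    steps = c ** np.arange(k)           # recursive rule: step_{j+1} = c * step_j
    u = np.concatenate([[0.0], np.cumsum(steps)])
    u /= u[-1]                          # normalize so the utility runs from 0 to 1
    return np.interp(x, breakpoints, u)

bp = np.linspace(0, 100, 5)             # hypothetical criterion scale
for c in (0.5, 1.0, 2.0):               # c < 1 concave, c = 1 linear, c > 1 convex
    print(c, float(quasi_uta_partial_utility(50, bp, c)))
```

A single parameter per criterion, rather than a free value at every breakpoint, is what shrinks the amount of preference information the decision maker must supply.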
34.
Subjects judged the disutility of health conditions (e.g. blindness) using one of them (e.g. blindness+deafness) as a standard, with three elicitation methods: analog scale (AS: how bad is blindness compared to blindness+deafness?); magnitude estimation (ME: blindness+deafness is how many times as bad as blindness?); and person trade-off (PTO: how many people cured of blindness is as good as 10 people cured of blindness+deafness?). ME disutilities of the less bad condition were smallest, and AS disutilities were highest. Interleaving PTO with ME made PTO responses more like ME. AS disutilities were inconsistent with direct judgments of differences between pairs of conditions. ME and PTO judgments were internally inconsistent: e.g. the disutility of one-eye-blindness relative to blindness+deafness was larger than predicted from comparing each to blindness. Consistency training reduced inconsistency, increased agreement between AS and PTO, and transferred from one method to the other. The results support the use of consistency checks in utility elicitation. Copyright © 2001 John Wiley & Sons, Ltd.
35.
Traditionally, parameters of multiattribute utility models, representing a decision maker's preference judgements, are treated deterministically. This may be unrealistic, because the assessment of such parameters is potentially fraught with imprecision and error. We thus treat such parameters as stochastic and investigate how their associated imprecision and errors propagate through an additive multiattribute utility function in terms of the aggregate variance. Both a no-information case and a rank-order case regarding the attribute weights are considered, assuming a uniform distribution over the feasible region of attribute weights constrained by the respective information assumption. In general, as the number of attributes increases, the variance of the aggregate utility in both cases decreases and approaches the same limit, which depends only on the variances as well as the correlations among the single-attribute utilities. However, the marginal change in aggregate utility variance diminishes rather rapidly; hence decomposition as a variance reduction mechanism is generally useful but becomes relatively ineffective once the number of attributes exceeds about 10. Moreover, it was found that positively correlated utilities increase the aggregate utility variance; hence every effort should be made to avoid positive correlations between the single-attribute utilities. We also provide guidelines for determining under what conditions and to what extent a decision maker should decompose to obtain an aggregate utility variance that is smaller than that of holistic assessments. Extensions of the current model and empirical research to support some of our behavioural assumptions are discussed. © 1997 John Wiley & Sons, Ltd.
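The no-information case can be simulated directly: a uniform distribution over the weight simplex is a Dirichlet distribution with all concentration parameters equal to 1. The following Monte Carlo sketch, with illustrative values (independent single-attribute utilities with a common variance; not the paper's exact setup), shows the aggregate variance shrinking as the number of attributes grows.

```python
import numpy as np

rng = np.random.default_rng(0)

def aggregate_utility_variance(n_attr, n_draws=100_000, util_var=0.04):
    # No-information case: weights uniform over the simplex,
    # i.e. Dirichlet(1, ..., 1).
    w = rng.dirichlet(np.ones(n_attr), size=n_draws)
    # Independent single-attribute utilities with common variance (assumption).
    u = rng.normal(0.5, np.sqrt(util_var), size=(n_draws, n_attr))
    return np.var((w * u).sum(axis=1))

for n in (2, 5, 10, 20):
    print(n, round(aggregate_utility_variance(n), 5))
```

Running this shows the diminishing returns the abstract describes: the drop in variance from 2 to 5 attributes is much larger than the drop from 10 to 20.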
36.
Nowadays, utility theory and compromise programming (CP) are considered very different paradigms and methodologies for measuring preferences and for determining a decision maker's optimum on an efficient frontier. In this paper, however, we show that a utility function with separable variables (presented in the form of a Taylor series around the ideal point) is reducible to a weighted sum of CP distances. This linkage between utility and compromise (which rests on the assumption that the usual utility functions hold) leads to (i) a method for specifying and optimizing usual utility functions by an operational technique and (ii) a reformulation of standard CP with the advantage of determining the best CP solution from a utility perspective. © 1997 John Wiley & Sons, Ltd.
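A sketch of the linkage as we read it, in notation chosen here rather than taken from the paper: expand an additively separable utility to second order around the ideal point and the expansion becomes a constant minus weighted distances from that point.

```latex
% Additive-separable utility, second-order Taylor expansion around the
% ideal point x^*, with shortfalls d_i = x_i^* - x_i:
U(\mathbf{x}) = \sum_i u_i(x_i)
\approx U(\mathbf{x}^*) - \sum_i u_i'(x_i^*)\, d_i
  - \tfrac{1}{2} \sum_i \lvert u_i''(x_i^*)\rvert\, d_i^{2}.
% For increasing, concave u_i (u_i' > 0, u_i'' < 0), maximizing U is thus
% approximately minimizing a weighted sum of L1 and squared-L2
% compromise-programming distances from the ideal point.
```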
37.
We show that only two simple trade-off judgments are sufficient to determine whether the multiplicative multiattribute model assumes its additive form, regardless of the number of attributes in the model. This additivity condition offers a useful alternative to the test based on multiattribute lotteries commonly presented in textbooks. It can make the determination of additivity easier and more reliable. © 1997 John Wiley & Sons, Ltd.
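For context, the multiplicative model in question is the standard Keeney–Raiffa form, reproduced below from the general literature (the abstract itself does not state it); the paper's contribution is a two-judgment test of when it collapses to the additive case.

```latex
% Multiplicative multiattribute utility with master scaling constant k:
1 + k\,U(x_1,\dots,x_n) = \prod_{i=1}^{n} \bigl(1 + k\,k_i\,u_i(x_i)\bigr),
\qquad 1 + k = \prod_{i=1}^{n} (1 + k\,k_i).
% The model reduces to the additive form U = \sum_i k_i u_i(x_i)
% exactly when \sum_i k_i = 1, i.e. in the limit k \to 0.
```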
38.
The Savage–Dickey density ratio is a simple method for computing the Bayes factor for an equality constraint on one or more parameters of a statistical model. In regression analysis, this includes the important scenario of testing whether one or more of the covariates have an effect on the dependent variable. However, the Savage–Dickey ratio only provides the correct Bayes factor if the prior distribution of the nuisance parameters under the nested model is identical to the conditional prior under the full model given the equality constraint. This condition is violated for multiple regression models with a Jeffreys–Zellner–Siow prior, which is often used as a default prior in psychology. Besides linear regression models, the limitation of the Savage–Dickey ratio is especially relevant when analytical solutions for the Bayes factor are not available. This is the case for generalized linear models, non-linear models, or cognitive process models with regression extensions. As a remedy, the correct Bayes factor can be computed using a generalized version of the Savage–Dickey density ratio.  相似文献   
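The basic ratio is easy to demonstrate in a toy model with no nuisance parameters, where the prior-matching condition the abstract discusses holds trivially: the Bayes factor for H0: mu = 0 is the posterior density at 0 divided by the prior density at 0. A minimal sketch, assuming a normal model with known variance and a conjugate prior (values are illustrative, not from the paper):

```python
import numpy as np
from scipy import stats

# Savage–Dickey ratio for H0: mu = 0, with y_i ~ N(mu, sigma^2),
# sigma^2 known, and prior mu ~ N(0, tau^2).
rng = np.random.default_rng(1)
y = rng.normal(0.3, 1.0, size=50)       # simulated data
tau2, sigma2, n = 1.0, 1.0, len(y)

# Conjugate normal posterior for mu.
post_var = 1.0 / (n / sigma2 + 1.0 / tau2)
post_mean = post_var * (y.sum() / sigma2)

# BF_01 = posterior density at mu = 0 over prior density at mu = 0.
bf01 = (stats.norm.pdf(0, post_mean, np.sqrt(post_var))
        / stats.norm.pdf(0, 0, np.sqrt(tau2)))
print(f"BF01 = {bf01:.3f}")            # < 1 favors the unconstrained model
```

The paper's point is that once nuisance parameters enter (as in JZS-prior regression), this simple ratio is no longer the Bayes factor, and the generalized version is needed instead.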
39.
A Major Taboo in Medical Innovation: Chasing Quick Success and Instant Benefit   Cited: 8 (self-citations: 3; citations by others: 5)
Chasing quick success and instant benefit is an unhealthy phenomenon in medical innovation, and it has brought serious consequences for the development of medical science in China. It departs from a correct utilitarian view of science and constitutes a serious misplacement of the utility relations involved in medical innovation.
40.
Expected utility theory explains collective action as an attempt by individuals to maximize their gains. In contrast, my application of prospect theory to collective action suggests that people are motivated to participate by a fear of loss. These alternative rationalities are considered in the context of the successful cooperative effort of four economic groups in Chile during 1973–75, the first years of the Pinochet military regime. In this case, the logic of prospect theory better captures how actors decided whether or not to engage in collective action. Of the four groups that joined the 1973–75 economic coalition, only one (the mineowners) appears to have maximized its net asset level, as expected utility theory predicts. All four groups seem to have been motivated to cooperate because they found themselves in the domain of losses and expected that cooperation with other, even rival, economic groups might help them recoup their recent losses.