Similar articles (20 results)
1.
Choice probabilities are basic to much of the theory of individual choice behavior in mathematical psychology. On the other hand, consumer economics has relied primarily on preference relations and choice functions for its theories of individual choice. Although there are sizable literatures on the connections between choice probabilities and preference relations, and between preference relations and choice functions, little has been done—apart from their common ties to preference relations—to connect choice probabilities and choice functions. The latter connection is studied in this paper. A family of choice functions that depends on a threshold parameter is defined from a choice probability function. It is then shown what must be true of the choice probability function so that the choice functions satisfy three traditional rationality conditions. Conversely, it is shown what must be true of the choice functions so that the choice probability function satisfies a version of Luce's axiom for individual choice probabilities.
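The construction the abstract describes can be sketched in a toy model (an illustrative sketch, not the paper's formal definitions; the Luce-style weights and the function names are assumptions):

```python
def luce_probability(weights, menu):
    """Choice probabilities satisfying Luce's axiom:
    P(x | menu) = v(x) / sum of v(y) over y in the menu."""
    total = sum(weights[y] for y in menu)
    return {x: weights[x] / total for x in menu}

def threshold_choice(weights, menu, threshold):
    """Choice function induced by a choice probability function and a
    threshold parameter: select the items chosen with probability >= threshold."""
    probs = luce_probability(weights, menu)
    return {x for x, p in probs.items() if p >= threshold}
```

For example, with weights {'a': 4, 'b': 2, 'c': 2}, a threshold of 0.3 selects only 'a', while lowering it to 0.2 selects the whole menu; the paper's question is which rationality conditions such threshold-induced choice functions satisfy as the threshold varies.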

2.
Occasionally, people are called upon to estimate probabilities after an event has occurred. In hindsight, was this an outcome we could have expected? Could things easily have turned out differently? One strategy for performing post hoc probability judgements would be to mentally turn the clock back and reconstruct one's expectations before the event. But if asked about the probability of an alternative, counterfactual outcome, a simpler strategy is available, based on this outcome's perceived closeness to what actually happened. The article presents five studies exploring the relationship between counterfactual closeness and counterfactual probability. The first study indicates that post hoc probabilities typically refer to the counterfactual rather than the factual outcome. Studies 2-5 show that physical, temporal, or conceptual proximity plays a decisive role in post hoc probability assessments of counterfactual events. When margins are narrow, the probabilities of, for instance, winning a match (when losing), and of losing (when actually winning) may even be rated higher than the corresponding probabilities of what really happened. Closeness is also more often referred to, and rated to be a better reason for believing there is a “good chance” of the counterfactual rather than of the factual result occurring. Finally, the closeness of the alternative outcome in success and failure stories is shown to be significantly correlated to its rated probability.

3.
There are a number of reasons for being interested in uncertainty, and there are also a number of uncertainty formalisms. These formalisms are not unrelated. It is argued that they can all be reflected as special cases of the approach of taking probabilities to be determined by sets of probability functions defined on an algebra of statements. Thus, interval probabilities should be construed as maximum and minimum probabilities within a set of distributions, Glenn Shafer's belief functions should be construed as lower probabilities, etc. Updating probabilities introduces new considerations, and it is shown that the representation of belief as a set of probabilities conflicts in this regard with the updating procedures advocated by Shafer. The attempt to make subjectivistic probability plausible as a doctrine of rational belief by making it more flowery — i.e., by adding new dimensions — does not succeed. But, if one is going to represent beliefs by sets of distributions, those sets of distributions might as well be based in statistical knowledge, as they are in epistemological or evidential probability.

4.
Haines Brown. Axiomathes, 2014, 24(3): 291-312
The paper assumes that to be of practical interest process must be understood as physical action that takes place in the world rather than being an idea in the mind. It argues that if an ontology of process is to accommodate actuality, it must be represented in terms of relative probabilities. Folk physics cannot accommodate this, and so the paper appeals to scientific culture because it is an emergent knowledge of the world derived from action in it. Process is represented as a contradictory probability distribution that does not depend on a spatio-temporal frame. An actuality is a probability density that grounds the values of probabilities to constitute their distributions. Because probability is a conserved value, probability distributions are subject to the constraint of symmetry and must be zero-sum. An actuality is locked-in by other actualities to become a zero-sum symmetry of probability values. It is shown that the locking-in of actualities constructs spatio-temporal locality, lends actualities specificity, and makes them a contradiction. Localization is the basis for understanding empirical observation. Because becoming depends on its construction of being, processes exist as trajectories. The historical trajectories of evolution and revolution as well as the non-historical trajectory of strong emergence are how processes are observed to exist.

5.
Conclusion: Even those generally skeptical of propensity interpretations of probability must now grant the following two points. First, the above single-case propensity interpretation meets recognized formal conditions for being a genuine interpretation of probability. Second, this interpretation is not logically reducible to a hypothetical relative frequency interpretation, nor is it only vacuously different from such an interpretation. The main objection to this propensity interpretation must be not that it is too vague or vacuous, but that it is metaphysically too extravagant. It asserts not only that there are physical possibilities in nature, but further that nature itself contains innate tendencies toward these possibilities, tendencies which have the logical structure of probabilities. Thus the basic dispute between advocates of an actualist relative frequency interpretation and a single-case propensity interpretation is not a matter of epistemology, but metaphysics. The frequency theorist wishes to maintain that claims about physical probabilities are nothing more than claims about relative frequencies that will occur in the actual history of the world, be it infinite or not. It is a substantial, though hardly conclusive, argument for the propensity view that the mathematical structures commonly employed in studies of stochastic processes and statistical inference are richer than can be accommodated by a relative frequency interpretation. Whether it is possible to bridge this gap without going beyond an actualist metaphysics remains to be seen.

6.
Jan von Plato. Synthese, 1982, 53(3): 419-432
De Finetti's representation theorem is a special case of the ergodic decomposition of stationary probability measures. The problems of the interpretation of probabilities centred around de Finetti's theorem are extended to this more general situation. The ergodic decomposition theorem has a physical background in the ergodic theory of dynamical systems. Thereby the interpretations of probabilities in the cases of de Finetti's theorem and its generalization and in ergodic theory are systematically connected to each other. This paper is an extended version of footnote 5 of von Plato (1981).

7.
Epistemology and probability
John L. Pollock. Synthese, 1983, 55(2): 231-252
Probability is sometimes regarded as a universal panacea for epistemology. It has been supposed that the rationality of belief is almost entirely a matter of probabilities. Unfortunately, those philosophers who have thought about this most extensively have tended to be probability theorists first, and epistemologists only secondarily. In my estimation, this has tended to make them insensitive to the complexities exhibited by epistemic justification. In this paper I propose to turn the tables. I begin by laying out some rather simple and uncontroversial features of the structure of epistemic justification, and then go on to ask what we can conclude about the connection between epistemology and probability in the light of those features. My conclusion is that probability plays no central role in epistemology. This is not to say that probability plays no role at all. In the course of the investigation, I defend a pair of probabilistic acceptance rules which enable us, under some circumstances, to arrive at justified belief on the basis of high probability. But these rules are of quite limited scope. The effect of there being such rules is merely that probability provides one source for justified belief, on a par with perception, memory, etc. There is no way probability can provide a universal cure for all our epistemological ills.

8.
Girotto V, Gonzalez M. Cognition, 2002, 84(3): 353-359
Do individuals unfamiliar with probability and statistics need a specific type of data in order to draw correct inferences about uncertain events? Girotto and Gonzalez (Cognition 78 (2001) 247) showed that naive individuals solve frequency as well as probability problems, when they reason extensionally, in particular when probabilities are represented by numbers of chances. Hoffrage, Gigerenzer, Krauss, and Martignon (Cognition 84 (2002) 343) argued that numbers of chances are natural frequencies disguised as probabilities, though lacking the properties of true probabilities. They concluded that we failed to demonstrate that naive individuals can deal with true probabilities as opposed to natural frequencies. In this paper, we demonstrate that numbers of chances do represent probabilities, and that naive individuals do not confuse numbers of chances with frequencies. We conclude that there is no evidence for the claim that natural frequencies have a special cognitive status, nor for the evolutionary argument that the human mind is unable to deal with probabilities.

9.
Richard Jeffrey. Erkenntnis, 1996, 45(2-3): 327-335
From a point of view like de Finetti's, what is the judgmental reality underlying the objectivistic claim that a physical magnitude X determines the objective probability that a hypothesis H is true? When you have definite conditional judgmental probabilities for H given the various unknown values of X, a plausible answer is sufficiency, i.e., invariance of those conditional probabilities as your probability distribution over the values of X varies. A different answer, in terms of conditional exchangeability, is offered for use when such definite conditional probabilities are absent.

10.
Thomas Müller, Tomasz Placek. Synthese, 2001, 128(3): 343-379
Since the validity of Bell's inequalities implies the existence of joint probabilities for non-commuting observables, there is no universal consensus as to what the violation of these inequalities signifies. While the majority view is that the violation teaches us an important lesson about the possibility of explanations, if not about metaphysical issues, there is also a minimalist position claiming that the violation is to be expected from simple facts about probability theory. This minimalist position is backed by theorems due to A. Fine and I. Pitowsky. Our paper shows that the minimalist position cannot be sustained. To this end, we give a formally rigorous interpretation of joint probabilities in the combined modal and spatiotemporal framework of 'stochastic outcomes in branching space-time' (SOBST) (Kowalski and Placek, 1999; Placek, 2000). We show in this framework that the claim that there can be no joint probabilities for non-commuting observables is incorrect. The lesson from Fine's theorem is not that Bell's inequalities will be violated anyhow, but that an adequate model for the Bell/Aspect experiment must not define global joint probabilities. Thus we investigate the class of stochastic hidden variable models, which prima facie do not define such joint probabilities. The reason why these models fail supports the majority view: Bell's inequalities are not just a mathematical artifact.
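The gap between quantum correlations and any model with global joint probabilities can be checked numerically: every deterministic assignment of outcomes (the extreme points over which a global joint distribution would mix) keeps the CHSH combination within 2, whereas the singlet-state correlation E(x, y) = -cos(x - y) exceeds that bound. A minimal sketch (the analyser angles below are the textbook-optimal ones, not taken from this paper):

```python
import math
from itertools import product

def chsh(E, a, a2, b, b2):
    """CHSH combination S = E(a,b) + E(a,b') + E(a',b) - E(a',b')."""
    return E(a, b) + E(a, b2) + E(a2, b) - E(a2, b2)

def max_local_chsh():
    """Maximum |S| over deterministic outcome assignments (+/-1 to each
    setting); a global joint probability distribution is a mixture of
    these, so its |S| can never exceed this bound."""
    return max(
        abs(A1 * B1 + A1 * B2 + A2 * B1 - A2 * B2)
        for A1, A2, B1, B2 in product([-1, 1], repeat=4)
    )

# Singlet-state correlation for analyser angles x, y.
singlet = lambda x, y: -math.cos(x - y)
```

Here `max_local_chsh()` returns 2, while `chsh(singlet, 0, math.pi/2, math.pi/4, -math.pi/4)` has magnitude 2√2 ≈ 2.83: no assignment of global joint probabilities reproduces the quantum correlations.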

11.
Undermined     
A popular strategy for understanding the probabilities that arise in physics is to interpret them via reductionist accounts of chance—indeed, it is sometimes claimed that such accounts are uniquely well-suited to make sense of the probabilities in classical statistical mechanics. Here it is argued that reductionist accounts of chance carry a steep but unappreciated cost: when applied to physical theories of the relevant type, they inevitably distort the relations of probability that they take as input.

12.
Dennis Dieks. Synthese, 2007, 156(3): 427-439
According to the Doomsday Argument we have to rethink the probabilities we assign to a soon or not so soon extinction of mankind when we realize that we are living now, rather early in the history of mankind. Sleeping Beauty finds herself in a similar predicament: on learning the date of her first awakening, she is asked to re-evaluate the probabilities of her two possible future scenarios. In connection with Doom, I argue that it is wrong to assume that our ordinary probability judgements do not already reflect our place in history: we justify the predictive use we make of the probabilities yielded by science (or other sources of information) by our knowledge of the fact that we live now, a certain time before the possible occurrence of the events the probabilities refer to. Our degrees of belief should change drastically when we forget the date—importantly, this follows without invoking the “Self Indication Assumption”. Subsequent conditionalization on information about which year it is cancels this probability shift again. The Doomsday Argument is about such probability shifts, but tells us nothing about the concrete values of the probabilities—for these, experience provides the only basis. Essentially the same analysis applies to the Sleeping Beauty problem. I argue that Sleeping Beauty “thirders” should be committed to thinking that the Doomsday Argument is ineffective; whereas “halfers” should agree that doom is imminent—but they are wrong.

13.
The conjunction fallacy occurs when people judge a conjunctive statement B‐and‐A to be more probable than a constituent B, in contrast to the law of probability that P(B ∧ A) cannot exceed P(B) or P(A). Researchers see this fallacy as demonstrating that people do not follow probability theory when judging conjunctive probability. This paper shows that the conjunction fallacy can be explained by the standard probability theory equation for conjunction if we assume random variation in the constituent probabilities used in that equation. The mathematical structure of this equation is such that random variation will be most likely to produce the fallacy when one constituent has high probability and the other low, when there is positive conditional support between the constituents, when there are two rather than three constituents, and when people rank probabilities rather than give numerical estimates. The conjunction fallacy has been found to occur most frequently in exactly these situations. Copyright © 2008 John Wiley & Sons, Ltd.
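The mechanism the abstract describes can be illustrated with a small simulation (an illustrative sketch under assumed Gaussian noise, not the authors' model): the judged conjunction is computed as P(B)·P(A|B) from noisy constituent judgements and compared with an independently noisy judgement of P(B).

```python
import random

def fallacy_rate(p_b, p_a_given_b, noise=0.15, trials=20000, seed=1):
    """Fraction of trials in which a noisy judgement of the conjunction
    P(B and A) = P(B) * P(A|B) exceeds an independent noisy judgement
    of the constituent P(B)."""
    rng = random.Random(seed)
    clip = lambda p: min(1.0, max(0.0, p))  # keep judgements in [0, 1]
    count = 0
    for _ in range(trials):
        conj = clip(p_b + rng.gauss(0, noise)) * clip(p_a_given_b + rng.gauss(0, noise))
        if conj > clip(p_b + rng.gauss(0, noise)):
            count += 1
    return count / trials
```

With one low constituent and strong conditional support (e.g. p_b = 0.2, p_a_given_b = 0.95) the simulated fallacy rate is far higher than with two middling constituents (0.5, 0.5), matching the pattern the paper derives from the equation's structure.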

14.
Decision makers typically overweight small probabilities and underweight large probabilities. However, there are recent reports that when probability is presented in the form of relative frequencies, this typical pattern reverses. We tested this hypothesis by comparing decision making in two tasks: In one task, probability was stated numerically, and in the other task, it was conveyed through a visual representation. In the visual task, participants chose whether a "stochastic bullet" should be fired at either a large target for a small reward or a small target for a large reward. Participants' knowledge of probability in the visual task was the result of extensive practice firing bullets at targets. In the classical numerical task, participants chose between pairs of lotteries with probabilities and rewards matched to the probabilities and rewards in the visual task. We found that participants' probability-weighting functions were significantly different in the two tasks, but the pattern for the visual task was the typical, not the reversed, pattern.
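The typical pattern described here is usually modelled with an inverse-S-shaped weighting function; the one-parameter Tversky–Kahneman form below is a standard choice, used here purely for illustration (the paper may have fit a different functional form, and the gamma value is a conventional estimate, not one from this study):

```python
def weight(p, gamma=0.61):
    """Tversky-Kahneman probability weighting:
    w(p) = p**g / (p**g + (1 - p)**g)**(1 / g).
    With gamma < 1, small probabilities are overweighted and large ones
    underweighted; the reversed pattern corresponds to gamma > 1."""
    num = p ** gamma
    return num / (num + (1.0 - p) ** gamma) ** (1.0 / gamma)
```

For example, weight(0.05) ≈ 0.13 > 0.05 (overweighting a small probability) while weight(0.9) ≈ 0.71 < 0.9 (underweighting a large one); comparing fitted gamma across the two tasks is one way to express the difference the authors report.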

15.
The Adams family     
Douven I, Verbrugge S. Cognition, 2010, 117(3): 302-318

16.
Ellery Eells. Synthese, 1983, 57(3): 387-442
I argue that to the extent to which philosophical theories of objective probability have offered theoretically adequate conceptions of objective probability (in connection with such desiderata as causal and explanatory significance, applicability to single cases, etc.), they have failed to satisfy a methodological standard — roughly, a requirement to the effect that the conception offered be specified with the precision appropriate for a physical interpretation of an abstract formal calculus and be fully explicated in terms of concepts, objects or phenomena understood independently of the idea of physical probability. The significance of this, and of the suggested methodological standard, is then briefly discussed.

17.
An analysis of indefinite probability statements has been offered by Jackson and Pargetter (1973). We accept that this analysis will assign the correct probability values for indefinite probability claims. But it does so in a way which fails to reflect the epistemic state of a person who makes such a claim. We offer two alternative analyses: one employing de re (epistemic) probabilities, and the other employing de dicto (epistemic) probabilities. These two analyses appeal only to probabilities which are accessible to a person who makes an indefinite probability judgment, and yet we prove that the probabilities which either of them assigns will always be equivalent to those assigned by the Jackson and Pargetter analysis.

18.
19.
Aidan Lyon. Synthese, 2011, 182(3): 413-432
Some have argued that chance and determinism are compatible in order to account for the objectivity of probabilities in theories that are compatible with determinism, like Classical Statistical Mechanics (CSM) and Evolutionary Theory (ET). Contrarily, some have argued that chance and determinism are incompatible, and so such probabilities are subjective. In this paper, I argue that both of these positions are unsatisfactory. I argue that the probabilities of theories like CSM and ET are not chances, but also that they are not subjective probabilities either. Rather, they are a third type of probability, which I call counterfactual probability. The main distinguishing feature of counterfactual probability is the role it plays in conveying important counterfactual information in explanations. This distinguishes counterfactual probability from chance as a second concept of objective probability.

20.
A number of studies have shown that people exploit transitional probabilities between successive syllables to segment a stream of artificial continuous speech into words. It is often assumed that what is actually exploited are the forward transitional probabilities (given XY, the probability that X will be followed by Y), even though the backward transitional probabilities (the probability that Y has been preceded by X) were equally informative about word structure in the languages involved in those studies. In two experiments, we showed that participants were able to learn the words from an artificial speech stream when the only available cues were the backward transitional probabilities. Learning is as good under those conditions as when the only available cues are the forward transitional probabilities. Implications for some current models of word segmentation, particularly the simple recurrent networks and PARSER models, are discussed.
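Both statistics are straightforward to compute from a syllable stream (a minimal sketch; the toy syllables below are made up for illustration, not the stimuli used in the experiments):

```python
from collections import Counter

def transitional_probabilities(stream):
    """Return (forward, backward) dicts over adjacent syllable pairs:
    forward[(x, y)]  = P(next is y | current is x)   = count(xy) / count(x, _)
    backward[(x, y)] = P(previous was x | current is y) = count(xy) / count(_, y)"""
    pairs = list(zip(stream, stream[1:]))
    pair_counts = Counter(pairs)
    first_counts = Counter(x for x, _ in pairs)
    second_counts = Counter(y for _, y in pairs)
    forward = {p: c / first_counts[p[0]] for p, c in pair_counts.items()}
    backward = {p: c / second_counts[p[1]] for p, c in pair_counts.items()}
    return forward, backward
```

In a concatenation of fixed three-syllable "words", both statistics are high within a word and drop at word boundaries, which is what makes either one usable as a segmentation cue.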
