1.
Choice probabilities are basic to much of the theory of individual choice behavior in mathematical psychology. On the other hand, consumer economics has relied primarily on preference relations and choice functions for its theories of individual choice. Although there are sizable literatures on the connections between choice probabilities and preference relations, and between preference relations and choice functions, little has been done—apart from their common ties to preference relations—to connect choice probabilities and choice functions. The latter connection is studied in this paper. A family of choice functions that depends on a threshold parameter is defined from a choice probability function. It is then shown what must be true of the choice probability function so that the choice functions satisfy three traditional rationality conditions. Conversely, it is shown what must be true of the choice functions so that the choice probability function satisfies a version of Luce's axiom for individual choice probabilities.
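Schematically, the two constructions can be rendered as follows (notation mine; the paper's exact axioms may differ):

    % From a choice probability function p, define a family of choice
    % functions, one per threshold \lambda \in (0, 1]:
    C_\lambda(A) = \{\, x \in A : p(x, A) \ge \lambda \,\}
    % Conversely, for everywhere-positive probabilities, Luce's axiom is
    % equivalent to the existence of a ratio scale v > 0 with
    p(x, A) = \frac{v(x)}{\sum_{y \in A} v(y)}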

2.
Belief revision (BR) and truthlikeness (TL) emerged independently as two research programmes in formal methodology in the 1970s. A natural way of connecting BR and TL is to ask under what conditions the revision of a belief system by new input information leads the system towards the truth. It turns out that, for the AGM model of belief revision, the only safe case is the expansion of true beliefs by true input, but this is not very interesting or realistic as a model of theory change in science. The new accounts of non-prioritized belief revision do not seem more promising in this respect, and the alternative BR account of updating by imaging leads to other problems. Still, positive results about increasing truthlikeness by belief revision may be sought by restricting attention to special kinds of theories. Another approach is to link truthlikeness to epistemic matters by an estimation function which calculates expected degrees of truthlikeness relative to evidence. Then we can study how the expected truthlikeness of a theory changes when probabilities are revised by conditionalization or imaging. Again, we can ask under what conditions such changes lead our best theories towards the truth.
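The estimation function mentioned at the end has a standard Niiniluoto-style form, sketched here under the assumption of a finite partition of rival hypotheses (details may differ in the paper):

    % Expected truthlikeness of a theory g given evidence e, where
    % h_1, ..., h_n are mutually exclusive, jointly exhaustive hypotheses:
    \mathrm{ver}(g \mid e) = \sum_{i=1}^{n} P(h_i \mid e)\, \mathrm{Tr}(g, h_i)
    % Revising P by conditionalization or imaging changes ver(g | e); the
    % question is when such revisions move our best theories toward the truth.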

3.
Common probability theories only allow the deduction of probabilities by using previously known or presupposed probabilities. They do not, however, allow the derivation of probabilities from observed data alone. The question thus arises as to how probabilities in the empirical sciences, especially in medicine, may be arrived at. Carnap hoped to be able to answer this question by his theory of inductive probabilities. In the first four sections of the present paper the above mentioned problem is discussed in general. After a short presentation of Carnap's theory it is then shown that this theory cannot claim validity for arbitrary random processes. It is suggested that the theory be applied only to binomial and multinomial experiments. By application of de Finetti's theorem Carnap's inductive probabilities are interpreted as consecutive probabilities of the Bayesian kind. Through the introduction of a new axiom the decision parameter λ can be determined even if no a priori knowledge is given. Finally, it is demonstrated that the fundamental problem of Wald's decision theory, i.e., the determination of a plausible criterion where no a priori knowledge is available, can be solved for the cases of binomial and multinomial experiments.
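For concreteness, the λ-rule that the decision parameter governs can be sketched in a few lines of Python (function name and example values are mine):

    def carnap_lambda(n_j, n, k, lam):
        """Carnap's lambda-continuum: probability that the next observation
        is of type j, given n_j occurrences of type j among n observations
        drawn from k types. Small lam leans on the observed frequencies;
        large lam leans on the uniform logical prior 1/k."""
        return (n_j + lam / k) / (n + lam)

    # With lam = k this reduces to Laplace's rule of succession: 7 successes
    # in 10 binary trials (k = 2) give (7 + 1) / (10 + 2) = 2/3.
    print(carnap_lambda(7, 10, 2, 2.0))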

4.

Must probabilities be countably additive? On the one hand, arguably, requiring countable additivity is too restrictive. As de Finetti pointed out, there are situations in which it is reasonable to use merely finitely additive probabilities. On the other hand, countable additivity is fruitful. It can be used to prove deep mathematical theorems that do not follow from finite additivity alone. One of the most philosophically important examples of such a result is the Bayesian convergence to the truth theorem, which says that conditional probabilities converge to 1 for true hypotheses and to 0 for false hypotheses. In view of the long-standing debate about countable additivity, it is natural to ask in what circumstances finitely additive theories deliver the same results as the countably additive theory. This paper addresses that question and initiates a systematic study of convergence to the truth in a finitely additive setting. There is also some discussion of how the formal results can be applied to ongoing debates in epistemology and the philosophy of science.
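The convergence theorem at issue is easy to see in a toy simulation; a minimal Python sketch, with illustrative hypotheses, priors, and names:

    import random

    def posterior_of_truth(theta_true=0.7, hypotheses=(0.3, 0.7), n=500, seed=1):
        """Two rival hypotheses about a coin's bias; Bayesian updating on n
        i.i.d. flips. The countably additive convergence theorem says the
        posterior on the true hypothesis goes to 1 with probability 1."""
        rng = random.Random(seed)
        post = {h: 1.0 / len(hypotheses) for h in hypotheses}
        for _ in range(n):
            x = 1 if rng.random() < theta_true else 0
            lik = {h: (h if x == 1 else 1.0 - h) for h in hypotheses}
            z = sum(post[h] * lik[h] for h in hypotheses)
            post = {h: post[h] * lik[h] / z for h in hypotheses}
        return post[theta_true]

    print(posterior_of_truth())  # typically very close to 1.0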


5.
By virtue of what do alarm calls and facial expressions carry natural information? The answer I defend in this paper is that they carry natural information by virtue of changing the probabilities of various states of affairs, relative to background data. The Probabilistic Difference Maker Theory (PDMT) of natural information that I introduce here is inspired by Dretske's [1981] seminal analysis of natural information, but parts ways with it by eschewing the requirements that information transmission must be nomically underwritten, mind-independent, and knowledge-yielding. PDMT includes both a qualitative account of information transmission and a measure of natural information in keeping with the basic principles of Shannon's communication theory and Bayesian confirmation theory. It also includes a new account of the informational content of a signal, understood as the combination of the incremental and overall support that the signal provides for all states of affairs at the source. Finally, I compare and contrast PDMT with other probabilistic and non-probabilistic theories of natural information, most notably Millikan's [2013] recent theory of natural information as non-accidental pattern repetition.
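One natural way to quantify such probability changes, in the spirit of Shannon's theory and Bayesian confirmation, is the log probability ratio; a minimal sketch (PDMT's official measure may differ, and the names are mine):

    import math

    def incremental_support(p_s_given_r, p_s):
        """Log probability ratio: how much receiving signal r shifts the
        probability of source state s, in bits. Positive when r raises the
        probability of s relative to background data, negative when it
        lowers it."""
        return math.log2(p_s_given_r / p_s)

    # An alarm call raising P(predator) from 0.05 to 0.60 carries
    # log2(0.60 / 0.05) = about 3.58 bits of support for that state.
    print(incremental_support(0.60, 0.05))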

6.
Kripke’s theory of truth (Kripke, The Journal of Philosophy, 72(19), 690–716, 1975) has been very successful but shows well-known expressive difficulties; recently, Field has proposed to overcome them by adding a new conditional connective to it. In Field’s theories, desirable conditional and truth-theoretic principles are validated that Kripke’s theory does not yield. Some authors, however, are dissatisfied with certain aspects of Field’s theories, in particular their high complexity. I analyze Field’s models and pin down some reasons for discontent with them, focusing on the meaning of the new conditional and on the status of the principles so successfully recovered. Subsequently, I develop a semantics that improves on Kripke’s theory following Field’s program of adding a conditional to it, using some inductive constructions that include Kripke’s and feature a strong evaluation for conditionals. The new theory overcomes several problems of Kripke’s and, although weaker than Field’s proposals, it avoids the difficulties that affect them; at the same time, the new theory turns out to be quite simple. Moreover, the new construction can be used to model various conceptions of what a conditional connective is, in ways that are precluded to both Kripke’s and Field’s theories.
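For orientation, here is a minimal sketch of the kind of inductive construction at issue: Kripke's least fixed point under the Strong Kleene scheme, for a three-sentence toy language (encoding and names are mine; the conditional is not modeled):

    SENTENCES = {
        "snow": ("atom", True),    # a true non-semantic atom
        "t_snow": ("T", "snow"),   # says that "snow" is true
        "liar": ("not_T", "liar"), # says of itself that it is not true
    }

    def kripke_least_fixed_point(sentences):
        """Iterate the Kripkean jump: T(s) or not-T(s) receives a value only
        after s has settled; atoms settle at once. The liar never settles,
        so it stays gappy in the least fixed point."""
        value = {}  # name -> True/False; missing means ungrounded so far
        changed = True
        while changed:
            changed = False
            for name, (kind, arg) in sentences.items():
                if name in value:
                    continue
                if kind == "atom":
                    value[name] = arg
                    changed = True
                elif arg in value:  # kind is "T" or "not_T"
                    value[name] = value[arg] if kind == "T" else not value[arg]
                    changed = True
        return value

    print(kripke_least_fixed_point(SENTENCES))
    # {'snow': True, 't_snow': True} -- 'liar' remains ungrounded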

7.
In recent years, the notion of a reason has come to occupy a central place in both metaethics and normative theory more broadly. Indeed, many philosophers have come to view reasons as providing the basis of normativity itself. The common conception is that reasons are facts that count in favor of some act or attitude. More recently, philosophers have begun to appreciate a distinction between objective and subjective reasons, where (roughly) objective reasons are determined by the facts, while subjective reasons are determined by one's beliefs. My goal in this paper is to offer a plausible theory of subjective reasons. Although much attention has been focused on theories of objective reasons, very little has been offered in the literature regarding what sort of account of subjective reasons we should adopt; and what has been offered is rather perfunctory and requires filling out. Taking what has been said thus far as a starting point, I consider several putative theories of subjective reasons, offering objections and amendments along the way; I then settle on what I take to be a highly plausible account and defend it against objections.

8.
Joshua C. Thurow, Synthese 190(9): 1587–1603 (2013)
Paul Benacerraf’s argument that mathematical realism is apparently incompatible with mathematical knowledge has been widely thought to also show that a priori knowledge in general is problematic. Although many philosophers have rejected Benacerraf’s argument because it assumes a causal theory of knowledge, some maintain that Benacerraf nevertheless put his finger on a genuine problem, even though he didn’t state the problem in its most challenging form. After diagnosing what went wrong with Benacerraf’s argument, I argue that a new, more challenging, version of Benacerraf’s problem can be constructed. The new version—what I call the Defeater Version—of Benacerraf’s problem makes use of a no-defeater condition on knowledge and justification. I conclude by arguing that the best way to avoid the problem is to construct a theory of how a priori judgments reliably track the facts. I also suggest four different kinds of theories worth pursuing.

9.
It has become increasingly common for philosophers to distinguish between objective and subjective rightness, and there has been much discussion recently about what an adequate theory of subjective rightness looks like. In this article, I propose a new theory of subjective rightness. According to it, an action is subjectively right if and only if it minimizes expected objective wrongness. I explain this theory in detail and argue that it avoids many of the problems that other theories of subjective rightness face. I end by responding to some objections.
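The proposal has a direct decision-theoretic form; a minimal sketch in which the encoding and the numbers are mine, purely for illustration:

    def subjectively_right(actions, states, prob, wrong):
        """An action is subjectively right iff it minimizes expected
        objective wrongness: the sum over states of P(state) times the
        degree of objective wrongness of the action in that state."""
        ew = {a: sum(prob[s] * wrong[(a, s)] for s in states) for a in actions}
        best = min(ew.values())
        return [a for a in actions if ew[a] == best]

    # A Jackson-style medical case: operating is perfect in one state but
    # badly wrong in the other; waiting is mildly wrong in both.
    states = ["disease_A", "disease_B"]
    prob = {"disease_A": 0.5, "disease_B": 0.5}
    wrong = {("operate", "disease_A"): 0, ("operate", "disease_B"): 10,
             ("wait", "disease_A"): 2, ("wait", "disease_B"): 2}
    print(subjectively_right(["operate", "wait"], states, prob, wrong))
    # ['wait']: expected wrongness 2.0 beats 5.0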

10.
New paradoxes of risky decision making

11.
Simon Saunders, Synthese 114(3): 373–404 (1998)
A variety of ideas arising in decoherence theory, and in the ongoing debate over Everett's relative-state theory, can be linked to issues in relativity theory and the philosophy of time, specifically the relational theory of tense and of identity over time. These have been systematically presented in companion papers (Saunders 1995; 1996a); in what follows we shall consider the same circle of ideas, but specifically in relation to the interpretation of probability, and its identification with relations in the Hilbert space norm. The familiar objection that Everett's approach yields probabilities different from quantum mechanics is easily dealt with. The more fundamental question is how to interpret these probabilities consistent with the relational theory of change, and the relational theory of identity over time. I shall show that the relational theory needs nothing more than the physical, minimal criterion of identity as defined by Everett's theory, and that this can be transparently interpreted in terms of the ordinary notion of the chance occurrence of an event, as witnessed in the present. It is in this sense that the theory has empirical content.
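The identification of probability with relations in the Hilbert space norm is, schematically, the Born weight of a branch (notation mine, not Saunders's):

    % Decompose the normalized universal state into decoherent branches:
    \lvert \Psi \rangle = \sum_i c_i \lvert \psi_i \rangle ,
    \qquad \langle \psi_i \mid \psi_j \rangle = \delta_{ij}
    % The probability identified with branch i is its weight in the
    % Hilbert-space norm:
    p_i = \lvert c_i \rvert^2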

12.
The recent movement towards virtue-theoretic treatments of epistemological concepts can be understood in terms of the desire to eliminate epistemic luck. Significantly, however, it is argued that the two main varieties of virtue epistemology are responding to different types of epistemic luck. In particular, whilst proponents of reliabilism-based virtue theories have been focusing on the problem of what I call "veritic" epistemic luck, non-reliabilism-based virtue theories have instead been concerned with a very different type of epistemic luck, what I call "reflective" epistemic luck. It is argued that, prima facie at least, both forms of epistemic luck need to be responded to by any adequate epistemological theory. The problem, however, is that one can best eliminate veritic epistemic luck by adducing a so-called safety-based epistemological theory that need not be allied to a virtue-based account, and there is no fully adequate way of eliminating reflective epistemic luck. I thus conclude that this raises a fundamental difficulty for virtue-based epistemological theories, on either construal.

13.
Rational Choice, Deterrence, and Theoretical Integration
The old version of rational choice theory is that people engage in conscious and deliberate cost-benefit analysis such that they maximize the values and minimize the costs of their actions. The new version of rational choice theory is that people intuit the values and costs of an action; but because they are imperfect processors of information, they pursue what they perceive as most satisfying. The possibility that legal punishments deter is consistent with the new version of rational choice theory, which can be used to integrate deterrence with other criminological theories, such as strain and social learning. An integrated theory of deterrence is presented and tested with experimental data.
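The new version amounts to expected-cost bookkeeping over perceived quantities; an illustrative sketch (functional form and numbers are mine, and the integrated theory adds strain and social-learning terms not modeled here):

    def perceived_net_payoff(benefit, perceived_certainty, perceived_severity):
        """Deterrence as perceived cost-benefit: an act looks satisfying
        when its perceived rewards exceed its perceived punishment costs,
        weighted by how likely punishment is believed to be."""
        return benefit - perceived_certainty * perceived_severity

    # Raising the perceived certainty of punishment flips the choice:
    print(perceived_net_payoff(10, 0.1, 30))  #  7.0 -> act looks worthwhile
    print(perceived_net_payoff(10, 0.5, 30))  # -5.0 -> deterred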

14.
This paper presents a new theory of modal reasoning, i.e. reasoning about what may or may not be the case, and what must or must not be the case. It postulates that individuals construct models of the premises in which they make explicit only what is true. A conclusion is possible if it holds in at least one model, whereas it is necessary if it holds in all the models. The theory makes three predictions, which are corroborated experimentally. First, conclusions correspond to the true, but not the false, components of possibilities. Second, there is a key interaction: it is easier to infer that a situation is possible as opposed to impossible, whereas it is easier to infer that a situation is not necessary as opposed to necessary. Third, individuals make systematic errors of omission and of commission. We contrast the theory with theories based on formal rules.
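The theory's two core postulates translate directly into checks over a set of models; a minimal sketch in which the model encoding is mine:

    def possible(models, holds):
        """A conclusion is possible iff it holds in at least one model."""
        return any(holds(m) for m in models)

    def necessary(models, holds):
        """A conclusion is necessary iff it holds in every model."""
        return all(holds(m) for m in models)

    # "There is a circle, or a triangle, or both" yields three models, each
    # making explicit only what is true in that possibility:
    models = [{"circle"}, {"triangle"}, {"circle", "triangle"}]
    print(possible(models, lambda m: "circle" in m))   # True
    print(necessary(models, lambda m: "circle" in m))  # False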

15.
Some moral theories, such as objective forms of consequentialism, seem to fail to be practically useful: they are of little or no help in deciding what to do. Even if we do not think this constitutes a fatal flaw in such theories, we may nonetheless agree that being practically useful makes a moral theory a better theory, or so some have suggested. In this paper, I assess whether the uncontroversial respect in which a practically useful moral theory can be claimed to be better provides a ground, worth taking into account, for believing one theory rather than another. I argue that it does not. The upshot is that if there is a sound objection to theories such as objective consequentialism based on considerations of practical usefulness, the objection requires establishing that the truth about what we morally ought to do cannot be epistemically inaccessible to us. The value of practical usefulness has no bearing on the issue.

16.
A theory of attending and reinforcement in conditional discriminations is extended to working memory in delayed matching to sample by adding terms for disruption of attending during the retention interval. Like its predecessor, the theory assumes that reinforcers and disruptors affect the independent probabilities of attending to sample and comparison stimuli in the same way as the rate of overt free-operant responding, as suggested by Nevin and Grace, and that attending is translated into discriminative performance by the model of Davison and Nevin. The theory accounts for the effects of sample-stimulus discriminability and retention-interval disruption on the levels and slopes of forgetting functions, and for the diverse relations between accuracy and sensitivity to reinforcement reported in the literature. It also accounts for the effects of reinforcer probability in multiple schedules on the levels and resistance to change of forgetting functions; for the effects of reinforcer probabilities signaled within delayed-matching trials; and for the effects of reinforcer delay, sample duration, and intertrial-interval duration. The model accounts for some data that have been problematic for previous theories, and makes testably different predictions of the effects of reinforcer probabilities and disruptors on forgetting functions in multiple schedules and signaled trials.
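A schematic of how such a theory generates forgetting functions (all functional forms and parameter names below are illustrative stand-ins, not the actual Nevin, Davison, and Grace equations):

    import math

    def matching_accuracy(t, disruption, r, p0=0.95, d=0.9, chance=0.5):
        """The probability of attending to the sample decays over the
        retention interval t under disruption, buffered by reinforcer rate r
        (behavioral-momentum style); when attending fails, choice between
        comparisons drops to chance."""
        p_attend = p0 * math.exp(-disruption * t / r)
        return p_attend * d + (1.0 - p_attend) * chance

    # Richer reinforcement (larger r) gives a higher, shallower forgetting
    # function, as the theory predicts for multiple schedules:
    for r in (1.0, 4.0):
        print([round(matching_accuracy(t, 0.5, r), 3) for t in (0, 2, 4, 8)])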

17.
Leda Cosmides & John Tooby, Cognition 50(1–3): 41–77 (1994)
Cognitive psychology has an opportunity to turn itself into a theoretically rigorous discipline in which a powerful set of theories organize observations and suggest focused new hypotheses. This cannot happen, however, as long as intuition and folk psychology continue to set our research agenda. This is because intuition systematically blinds us to the full universe of problems our minds spontaneously solve, restricting our attention instead to a minute class of unrepresentative “high-level” problems. In contrast, evolutionarily rigorous theories of adaptive function are the logical foundation on which to build cognitive theories, because the architecture of the human mind acquired its functional organization through the evolutionary process. Theories of adaptive function specify what problems our cognitive mechanisms were designed by evolution to solve, thereby supplying critical information about what their design features are likely to be. This information can free cognitive scientists from the blinders of intuition and folk psychology, allowing them to construct experiments capable of detecting complex mechanisms they otherwise would not have thought to test for. The choice is not between no-nonsense empiricism and evolutionary theory; it is between folk theory and evolutionary theory.

18.
Hans Johann Glock, Synthese 148(2): 345–368 (2006)
My paper takes issue both with the standard view that the Tractatus contains a correspondence theory and with recent suggestions that it features a deflationary or semantic theory. Standard correspondence interpretations are mistaken, because they treat the isomorphism between a sentence and what it depicts as a sufficient condition of truth rather than of sense. The semantic/deflationary interpretation ignores passages that suggest some kind of correspondence theory. The official theory of truth in the Tractatus is an obtainment theory – a sentence is true iff the state of affairs it depicts obtains. This theory differs from deflationary theories in that it involves an ontology of states of affairs/facts; and it can be transformed into a type of correspondence theory: a sentence is true iff it corresponds to, i.e. depicts, an obtaining state of affairs (fact). Admittedly, unlike correspondence theories as commonly portrayed, this account does not involve a genuinely truth-making relation. It features a relation of correspondence, yet it is that of depicting, between a meaningful sentence and its sense – a possible state of affairs. What makes for truth is not that relation, but the obtaining of the depicted state of affairs. This does not disqualify the Tractatus from holding a correspondence theory, however, since the correspondence theories of Moore and Russell are committed to a similar position. Alternatively, the obtainment theory can be seen as a synthesis of correspondence, semantic and deflationary approaches. It does justice to the idea that what is true depends solely on what is the case, and it combines a semantic explanation of the relation between a sentence and what it says with a deflationary account of the agreement between what the sentence says and what obtains or is the case if it is true.
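Schematically, the obtainment theory can be written as a biconditional (notation mine):

    % With dep(s) the possible state of affairs depicted by a meaningful
    % sentence s, and O(x) read as "x obtains":
    T(s) \leftrightarrow O(\mathrm{dep}(s))
    % On this reading the truth-maker is the obtaining of dep(s), not the
    % depicting relation itself, which fixes sense rather than truth.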

19.
Theories are needed to explain and predict health behavior, as well as for the design and evaluation of interventions. Although there is a long history of developing, testing, applying, and refining health behavior theories, debates persist and the evidence has limitations: for example, the components of theories that predict change should be better elaborated, so that we can more easily understand what actually drives behavior change. Theories need to be empirically testable in two ways. First, they need to specify a set of changeable predictors to describe, explain, and predict behavior change; second, they should enable us to design effective interventions that produce exactly those changes in behavior that the theory predicts. To make this possible, theories must be specified in such a way that they can be rigorously tested and falsified. Moreover, for the design of theory-based interventions it must be possible to derive change techniques from the theory and to use them to generate changes in behavior. Based on eight state-of-the-science articles that make conceptual and empirical contributions to the current debate on health behavior theories, various approaches are discussed to gain further insights into explaining and changing health behaviors and into the iterative process of theory development.
