Similar Articles
20 similar articles retrieved.
1.
This paper identifies and criticizes certain fundamental commitments of virtue theories in epistemology. A basic question for virtues approaches is whether they represent a 'third force', a source of normativity different from both internalism and externalism. Virtues approaches so conceived are opposed. It is argued that virtue theories offer us nothing that can unify the internalist and externalist sub-components of their preferred success-state. Claims that character can unify a virtues-based axiology are overturned. Problems with the pluralism of virtue theories are identified: problems with pluralism and the nature of the self, and problems with pluralism and the goals of epistemology. Moral objections to virtue theory are identified, specifically both the idea that there can be a radical axiological priority to character and the anti-enlightenment tendencies in virtues approaches. Finally, some strengths of virtue theory are conceded, while the role of epistemic luck is identified as an important topic for future work.

2.
Colin Howson, Synthese, 2007, 156(3): 491-512
Many people regard utility theory as the only rigorous foundation for subjective probability, and even de Finetti thought the betting approach, supplemented by Dutch Book arguments, good only as an approximation to a utility-theoretic account. I think that there are good reasons to doubt this judgment, and I propose an alternative in which the probability axioms are consistency constraints on distributions of fair betting quotients. The idea itself is hardly new: it is in de Finetti and also in Ramsey. What is new is that probabilistic consistency and consequence are shown to be definable in a way formally analogous to the way these notions are defined in deductive (propositional) logic. The result is a free-standing logic which does not pretend to be a theory of rationality and is therefore immune to, among other charges, that of "logical omniscience".
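The consistency idea in this abstract can be made concrete with a small worked example. Below is a minimal Python sketch (mine, not Howson's) of the classic Dutch Book argument: an agent whose fair betting quotients violate additivity can be sold a pair of bets that guarantees a loss. The payoff convention and the numbers are illustrative assumptions.

```python
# Minimal Dutch Book sketch (illustrative, not from the paper). A bet on E at
# quotient q with stake S pays S*(1-q) to the bettor if E occurs and costs
# S*q otherwise.

def bet_payoff(event_occurs: bool, quotient: float, stake: float) -> float:
    """Net payoff to the bettor of a single bet."""
    return stake * (1 - quotient) if event_occurs else -stake * quotient

# Incoherent quotients: q(A) + q(not-A) = 1.2 > 1, violating additivity.
q_a, q_not_a = 0.7, 0.5

# The bookie sells the agent both bets at stake 1 each. Exactly one of
# A, not-A occurs, so the agent's total payoff is the same either way:
for a_occurs in (True, False):
    total = bet_payoff(a_occurs, q_a, 1.0) + bet_payoff(not a_occurs, q_not_a, 1.0)
    print(f"A={a_occurs}: agent's net payoff = {total:+.2f}")
# Prints -0.20 in both cases: a guaranteed loss, i.e. a Dutch Book.
```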

3.
The study of arousal and attention could be of prominent importance for elucidating both fundamental and practical aspects of the mind–brain puzzle. Defined as "general activation of mind" (Kahneman, Attention and Effort. Prentice-Hall, New Jersey, 1973) or "general operation of consciousness" (Thatcher and John, Functional Neuroscience: Foundations of Cognitive Processing. Erlbaum, Hillsdale, 1977), arousal can be considered a starting point for fundamental research on consciousness. A similar role can be assigned to attention, which can be defined by substituting the attribute "focused" for "general". Concerning practical applications, the empirically established correlation between neuronal oscillations and arousal/attention levels is widely used in research and in the clinic, including neurofeedback and brain–computer communication. However, the neurophysical mechanism underlying this correlation is still not sufficiently clear. In this paper, after reviewing some existing classical and quantum approaches, a transition-probability concept of arousal based on field–dipole quantum interactions and information entropy is elaborated. The resulting analytical expressions and numerical values correspond to classical empirical results for arousal and attention, including the characteristic frequency dependence and intervals. At the same time, the fundamental (substrate) role of the EEG spectrum is highlighted, whereby attention appears to be a bridge between arousal and the content of consciousness. Finally, some clinical implications, including the brain-rate parameter as an indicator of arousal and attention levels, are provided.

4.
Elaine Landry, Synthese, 2007, 158(1): 1-17
Recent semantic approaches to scientific structuralism, aiming to make precise the concept of shared structure between models, formally frame a model as a type of set-structure. This framework is then used to provide a semantic account of (a) the structure of a scientific theory, (b) the applicability of a mathematical theory to a physical theory, and (c) the structural realist's appeal to the structural continuity between successive physical theories. In this paper, I challenge the idea that, to be so used, the concept of a model and so the concept of shared structure between models must be formally framed within a single unified framework, set-theoretic or other. I first investigate the Bourbaki-inspired assumption that structures are types of set-structured systems, and next consider the extent to which this problematic assumption underpins both Suppes' and recent semantic views of the structure of a scientific theory. I then use this investigation to show that, when it comes to using the concept of shared structure, there is no need to agree with French that "without a formal framework for explicating this concept of 'structure-similarity' it remains vague, just as Giere's concept of similarity between models does ..." (French, 2000, Synthese, 125, pp. 103–120, p. 114). Neither concept is vague; either can be made precise by appealing to the concept of a morphism, but it is the context (and not any set-theoretic type) that determines the appropriate kind of morphism. I make use of French's (1999, From Physics to Philosophy, pp. 187–207. Cambridge: Cambridge University Press) own example from the development of quantum theory to show that, for both Weyl's and Wigner's programmes, it was the context of considering the 'relevant symmetries' that determined that the appropriate kind of morphism was the one that preserved the shared Lie-group structure of both the theoretical and phenomenological models. I wish to thank Katherine Brading, Anjan Chakravartty, Steven French, Martin Thomson-Jones, Antigone Nounou, Stathis Psillos, Dean Rickles, Mauricio Suarez and two anonymous referees for valuable comments and criticisms, and Gregory Janzen for editorial suggestions. Research for this paper was funded by a generous SSHRC grant, for which I am grateful.

5.
The reference class problem is your problem too
Alan Hájek, Synthese, 2007, 156(3): 563-585
The reference class problem arises when we want to assign a probability to a proposition (or sentence, or event) X, which may be classified in various ways, yet its probability can change depending on how it is classified. The problem is usually regarded as one specifically for the frequentist interpretation of probability and is often considered fatal to it. I argue that versions of the classical, logical, propensity and subjectivist interpretations also fall prey to their own variants of the reference class problem. Other versions of these interpretations apparently evade the problem. But I contend that they are all "no-theory" theories of probability - accounts that leave quite obscure why probability should function as a guide to life, a suitable basis for rational inference and action. The reference class problem besets those theories that are genuinely informative and that plausibly constrain our inductive reasonings and decisions. I distinguish a "metaphysical" and an "epistemological" reference class problem. I submit that we can dissolve the former problem by recognizing that probability is fundamentally a two-place notion: conditional probability is the proper primitive of probability theory. However, I concede that the epistemological problem remains.
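A tiny frequency computation makes the problem vivid. The sketch below uses invented data (not from Hájek's paper) to show how one individual's probability of an outcome shifts with the reference class chosen, and how writing the probability as explicitly conditional at least makes the choice of class visible.

```python
# Illustrative sketch with hypothetical data: the same individual receives
# different frequency-based probabilities depending on the reference class.

population = [
    # (smoker, runner, lived_past_80)
    (True,  True,  True), (True,  True,  True), (True,  True,  False),
    (True,  False, False), (True,  False, False), (True,  False, True),
    (False, True,  True), (False, True,  True), (False, False, True),
    (False, False, False),
]

def freq(outcome, reference_class):
    """Relative frequency of `outcome` within the chosen reference class."""
    members = [p for p in population if reference_class(p)]
    return sum(outcome(p) for p in members) / len(members)

def lives(p):
    return p[2]

print("P(long life | smoker) =", freq(lives, lambda p: p[0]))                    # 0.5
print("P(long life | runner) =", freq(lives, lambda p: p[1]))                    # 0.8
print("P(long life | smoker & runner) =", freq(lives, lambda p: p[0] and p[1]))  # ~0.67
# One person, three classes, three different "probabilities" - the
# conditional notation makes the class an explicit second argument.
```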

6.
Sydney Shoemaker has given a sophisticated theory of phenomenal content, motivated by the transparency of experience and by the possibility of spectrum inversion without illusion (1994, 2000, 2001, 2002). It centers on the idea that color experiences represent what he calls "appearance properties". I consider the different sorts of appearance properties that Shoemaker has suggested might enter into phenomenal content - occurrent appearance properties, dispositional appearance properties, and higher-order dispositional appearance properties - and argue that none of them are plausibly represented by color experiences. I argue that Shoemaker's theory faces a dilemma: either it makes misperception too difficult, or it does not truly accommodate veridical spectrum inversion. I then examine some alternative Russellian theories of phenomenal content that might be consistent with Shoemaker's motivations, including a different sort of proposal recently suggested by Shoemaker (forthcoming). I argue that these views are also lacking, for similar reasons as the appearance property view. Finally, I conclude that in order for a representationalist theory to properly accommodate spectrum inversion without illusion, phenomenal content must include an indexical element. Such a view requires the adoption of a broadly Fregean theory of phenomenal content, according to which sameness of phenomenal character does not entail sameness in extension. What phenomenally identical experiences have in common is not what they represent, but how they represent.

7.
Under the independence and competence assumptions of Condorcet's classical jury model, the probability of a correct majority decision converges to certainty as the jury size increases, a seemingly unrealistic result. Using Bayesian networks, we argue that the model's independence assumption requires that the state of the world (guilty or not guilty) is the latest common cause of all jurors' votes. But often - arguably in all courtroom cases and in many expert panels - the latest such common cause is a shared 'body of evidence' observed by the jurors. In the corresponding Bayesian network, the votes are direct descendants not of the state of the world, but of the body of evidence, which in turn is a direct descendant of the state of the world. We develop a model of jury decisions based on this Bayesian network. Our model permits the possibility of misleading evidence, even for a maximally competent observer, which cannot easily be accommodated in the classical model. We prove that (i) the probability of a correct majority verdict converges to the probability that the body of evidence is not misleading, a value typically below 1; (ii) depending on the required threshold of 'no reasonable doubt', it may be impossible, even in an arbitrarily large jury, to establish guilt of a defendant 'beyond any reasonable doubt'.
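The abstract's result (i) is easy to simulate. The following Monte Carlo sketch implements the shared-evidence Bayesian-network structure described above; the competence and evidence parameters are illustrative assumptions, not the paper's values.

```python
# Monte Carlo sketch of the shared-evidence jury model (parameters illustrative).
import random

def majority_correct(n_jurors, p_evidence_ok=0.85, p_competence=0.75, trials=4000):
    """Fraction of trials in which the majority verdict matches the true state."""
    correct = 0
    for _ in range(trials):
        guilty = random.random() < 0.5
        # The evidence points the right way with prob p_evidence_ok; otherwise
        # it is misleading even for a maximally competent observer.
        evidence_guilty = guilty if random.random() < p_evidence_ok else not guilty
        # Jurors are conditionally independent given the *evidence*, not the world.
        guilty_votes = sum(
            (evidence_guilty if random.random() < p_competence else not evidence_guilty)
            for _ in range(n_jurors)
        )
        if (guilty_votes > n_jurors / 2) == guilty:
            correct += 1
    return correct / trials

for n in (1, 11, 101, 501):
    print(n, round(majority_correct(n), 3))
# Accuracy climbs toward p_evidence_ok (0.85), not toward 1, matching result (i).
```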

8.
Journal of Philosophical Logic - We present a revenge argument for non-reflexive theories of semantic notions – theories which restrict the rule of assumption, or (equivalently) initial...

9.
A non-monotonic theory of probability is put forward and shown to have applicability in the quantum domain. It is obtained simply by replacing Kolmogorov's positivity axiom, which places the lower bound for probabilities at zero, with an axiom that reduces that lower bound to minus one. Kolmogorov's theory of probability is monotonic, meaning that the probability of A is less than or equal to that of B whenever A entails B. The new theory violates monotonicity, as its name suggests; yet many standard theorems are also theorems of the new theory, since Kolmogorov's other axioms are retained. What is of particular interest is that the new theory can accommodate quantum phenomena (photon polarization experiments) while preserving Boolean operations, unlike Kolmogorov's theory. Although non-standard notions of probability have been discussed extensively in the physics literature, they have received very little attention in the philosophical literature. One likely explanation for that difference is that their applicability is typically demonstrated in esoteric settings that involve technical complications. That barrier is effectively removed for non-monotonic probability theory by providing it with a homely setting in the quantum domain. Although the initial steps taken in this paper are quite substantial, there is much else to be done, such as demonstrating the applicability of non-monotonic probability theory to other quantum systems and elaborating the interpretive framework that is provisionally put forward here. Such matters will be developed in other works.
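The modified axiom set is simple enough to illustrate directly. The toy assignment below (my example, not the paper's) keeps additivity and total probability 1 while allowing a negative value, and exhibits the advertised failure of monotonicity.

```python
# Toy sketch of the modified axioms: atom "probabilities" may be negative
# (lower bound -1); additivity and total probability 1 are retained.
atoms = {"a": 0.9, "b": 0.6, "c": -0.5}       # one negative value
assert abs(sum(atoms.values()) - 1.0) < 1e-9  # total probability still 1

def prob(event):
    """Events are sets of atoms; P extends the atom values by additivity."""
    return sum(atoms[x] for x in event)

# Additivity over disjoint events survives ...
assert abs(prob({"a", "b"}) - (prob({"a"}) + prob({"b"}))) < 1e-9

# ... but monotonicity fails: {"a"} entails {"a", "c"}, yet P decreases.
print(prob({"a"}), prob({"a", "c"}))          # 0.9  0.4
```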

10.
Mongrel Gravity     
James Mattingly, Erkenntnis, 2009, 70(3): 379-395
It was recognized almost from the original formulation of general relativity that the theory was incomplete because it dealt only with classical, rather than quantum, matter. What must be done in order to complete the theory has been a subject of considerable debate over the last century, and here I just mention a few of the various options that have been suggested for a quantum theory of gravity. The aim of what follows is twofold. First, I address worries about the consistency and physical plausibility of hybrid theories of gravity - theories involving a classical gravitational field and quantum matter fields. Such worries are shown to be unfounded. These hybrid theories - mongrel gravity - in fact comprise the only current, actual theories of gravity that incorporate quantum matter, and they also offer legitimate promise as tools for discovering the full theory of gravity. So my second aim is to highlight these theories as providing an interesting example of scientific revolution in action. I begin to try to draw some philosophical lessons from mongrel gravity theories, but more importantly I try to convince philosophers of physics that they should pay more attention to them. "QFT in curved space-time, however, is not a unification. It is a mongrel, and as such deserves to be put down." (M. J. Duff)

11.
Hans Johann Glock, Synthese, 2006, 148(2): 345-368
My paper takes issue both with the standard view that the Tractatus contains a correspondence theory and with recent suggestions that it features a deflationary or semantic theory. Standard correspondence interpretations are mistaken, because they treat the isomorphism between a sentence and what it depicts as a sufficient condition of truth rather than of sense. The semantic/deflationary interpretation ignores passages that suggest some kind of correspondence theory. The official theory of truth in the Tractatus is an obtainment theory: a sentence is true iff the state of affairs it depicts obtains. This theory differs from deflationary theories in that it involves an ontology of states of affairs/facts, and it can be transformed into a type of correspondence theory: a sentence is true iff it corresponds to, i.e. depicts, an obtaining state of affairs (fact). Admittedly, unlike correspondence theories as commonly portrayed, this account does not involve a genuinely truth-making relation. It features a relation of correspondence, yet it is that of depicting, between a meaningful sentence and its sense - a possible state of affairs. What makes for truth is not that relation, but the obtaining of the depicted state of affairs. This does not disqualify the Tractatus from holding a correspondence theory, however, since the correspondence theories of Moore and Russell are committed to a similar position. Alternatively, the obtainment theory can be seen as a synthesis of correspondence, semantic and deflationary approaches. It does justice to the idea that what is true depends solely on what is the case, and it combines a semantic explanation of the relation between a sentence and what it says with a deflationary account of the agreement between what the sentence says and what obtains or is the case if it is true.

12.
Five hens, experienced in discrimination of two categories of multidimensional geometrical figures presented in fixed pairs in a simultaneous discrimination, were tested with familiar figures arranged as new pairs to assess the dependence of categorization performance on learned relational or configural cues. Test performance did not differ from training: relational or configural cues still influenced discrimination performance. It was suggested that - in accordance with exemplar theories - this influence depended on differences between pairs of probe exemplars that facilitate retrieval of learned category members. To test whether exemplar, feature or prototype theory was most suitable to explain categorization by chickens, the rates of pecking at exemplars were analysed using principal components analysis (PCA). The distribution of the exemplars' component loads on the single component obtained was examined in the light of the conditions dictated by the three types of theories on how representative category exemplars should be. The least constraining theory, i.e. the exemplar theory, was most suitable. Defining factors of classificatory behaviour are discussed with a special emphasis on the characteristics of category-defining stimulus attributes.
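The PCA step can be sketched in a few lines. The code below runs a single-component PCA on a synthetic hens-by-exemplars matrix of pecking rates and reads off each exemplar's loading; the data and the scikit-learn dependency are illustrative assumptions, not the study's materials.

```python
# Illustrative sketch of the analysis step named above: one-component PCA
# over pecking rates, inspecting each exemplar's component load.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_hens, n_exemplars = 5, 12
# Synthetic pecking rates: category A exemplars pecked more than category B.
rates = np.vstack([
    np.concatenate([rng.normal(8, 1, n_exemplars // 2),    # category A
                    rng.normal(3, 1, n_exemplars // 2)])   # category B
    for _ in range(n_hens)
])

pca = PCA(n_components=1)
pca.fit(rates)
loadings = pca.components_[0]            # one component load per exemplar
print(np.round(loadings, 2))
# The abstract reports comparing the distribution of such loads against the
# constraints that exemplar, feature and prototype theories each place on
# how representative individual category exemplars may be.
```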

13.
Alberto Peruzzi, Axiomathes, 2006, 16(4): 424-459
Among the main concerns of 20th century philosophy was that of the foundations of mathematics. But usually not recognized is the relevance of the choice of a foundational approach to the other main problems of 20th century philosophy, i.e., the logical structure of language, the nature of scientific theories, and the architecture of the mind. The tools used to deal with the difficulties inherent in such problems have largely relied on set theory and its "received view". There are specific issues, in philosophy of language, epistemology and philosophy of mind, where this dependence turns out to be misleading. The same issues suggest the gain in understanding coming from category theory, which is, therefore, more than just the source of a "non-standard" approach to the foundations of mathematics. But, even so conceived, it is the very notion of what a foundation has to be that is called into question. The philosophical meaning of mathematics is no longer confined to which first principles are assumed and which "ontological" interpretation is given to them in terms of some possibly updated version of logicism, formalism or intuitionism. What is central to any foundational project proper is the role of universal constructions that serve to unify the different branches of mathematics, as already made clear in 1969 by Lawvere. Such universal constructions are best expressed by means of adjoint functors and representability up to isomorphism. In this lies the relevance of a category-theoretic perspective, which leads to wide-ranging consequences. One such is the presence of functorial constraints on the syntax–semantics relationships; another is an intrinsic view of (constructive) logic, as arises in topoi and, subsequently, in more general fibrations. But as soon as theories and their models are described accordingly, a new look at the main problems of 20th century's philosophy becomes possible. The lack of any satisfactory solution to these problems in a purely logical and set-theoretic setting is the result of too circumscribed an approach, such as a static and punctiform view of objects and their elements, and a misconception of geometry and its historical changes before, during, and after the foundational "crisis", as if algebraic geometry and synthetic differential geometry - not to mention algebraic topology - were secondary sources for what concerns foundational issues. The objectivity of basic geometrical intuitions also acts against the recent version of structuralism proposed as 'the' philosophy of category theory. On the other hand, the need for a consistent and adequate conceptual framework in facing the difficulties met by pre-categorical theories of language and scientific knowledge not only provides the basic concepts of category theory with specific applications but also suggests further directions for their development (e.g., in approaching the foundations of physics or the mathematical models in the cognitive sciences). This 'virtuous' circle is by now largely admitted in theoretical computer science; the time is ripe to realise that the same holds for classical topics of philosophy. Text of a talk given at the Workshop and Symposium on the Ramifications of Category Theory, Florence, November 18–22, 2003. For further documentation on the conference, see

14.
Decisions can sometimes have a constructive role, so that the act of, for example, choosing one option over another creates a preference for that option. In this work we explore the constructive role of just articulating an impression of a presented visual stimulus, as opposed to making a choice (specifically, the judgments we employ are affective evaluations). Using quantum probability theory, we outline a cognitive model formalizing such a constructive process. We predict a simple interaction in how a second image is evaluated, following the presentation of a first image, depending on whether there is a rating for the first image or not. The interaction predicted by the quantum model was confirmed across three experiments and a variety of control manipulations. The advantages of using quantum probability theory to model the present results, compared with existing models of sequence order effects in judgment (e.g., Hogarth & Einhorn, 1992) or other theories of constructive processes when a choice is made, are discussed.
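The mechanism such a model relies on, that an intermediate judgment collapses the cognitive state and changes later probabilities, can be shown with two non-commuting projectors. The numpy sketch below is a generic quantum-probability illustration; the state and projector angles are arbitrary choices, not fitted to the authors' data.

```python
# Generic quantum-probability sketch of a constructive intermediate judgment.
import numpy as np

def projector(theta):
    """Rank-1 projector onto the unit vector at angle theta."""
    v = np.array([np.cos(theta), np.sin(theta)])
    return np.outer(v, v)

psi = np.array([1.0, 0.0])        # initial cognitive state
P1 = projector(np.pi / 4)         # "evaluates image 1 positively"
P2 = projector(np.pi / 3)         # "evaluates image 2 positively"

# No rating of image 1: evaluate image 2 directly.
p_direct = psi @ P2 @ psi

# Rating image 1 first collapses the state; then image 2 is evaluated.
p_sequential = 0.0
for P in (P1, np.eye(2) - P1):            # positive or negative first rating
    collapsed = P @ psi
    weight = collapsed @ collapsed        # probability of that first rating
    if weight > 1e-12:
        post = collapsed / np.sqrt(weight)
        p_sequential += weight * (post @ P2 @ post)

print(round(p_direct, 3), round(p_sequential, 3))   # 0.25 vs 0.5
# Articulating the first impression changes the second evaluation: the kind
# of interaction the abstract describes.
```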

15.
16.
Goodman published his "riddle" in the middle of the 20th century and many philosophers have attempted to solve it. These attempts almost all shared an assumption that, I shall argue, might be wrong, namely, the assumption that when we project from cases we have examined to cases we have not, what we project are predicates (and that this projectibility is an absolute property of some predicates). I shall argue that this assumption, shared by almost all attempts at a solution, looks wrong because, in the first place, what we project are generalizations and not predicates, and a generalization is projectible (or unprojectible) relative to a given context. In this paper I develop the idea of explainable–projectible generalizations versus unexplainable–unprojectible generalizations, relative to a specific context. My main claim is that we rationally project a generalization if and only if we rationally believe that there is something that explains the general phenomenon that the generalized statement in question asserts to obtain, and that a generalization is projectible if and only if its putative truth can be explained, whether we know that it can be or not.

17.
Oliver Schulte, Synthese, 1999, 118(3): 329-361
This paper analyzes the notion of a minimal belief change that incorporates new information. I apply the fundamental decision-theoretic principle of Pareto-optimality to derive a notion of minimal belief change, for two different representations of belief: first, for beliefs represented by a theory - a deductively closed set of sentences or propositions - and second, for beliefs represented by an axiomatic base for a theory. Three postulates exactly characterize Pareto-minimal revisions of theories, yielding a weaker set of constraints than the standard AGM postulates. The Levi identity characterizes Pareto-minimal revisions of belief bases: a change of belief base is Pareto-minimal if and only if the change satisfies the Levi identity (for "maxichoice" contraction operators). Thus for belief bases, Pareto-minimality imposes constraints that the AGM postulates do not. The Ramsey test is a well-known way of establishing connections between belief revision postulates and axioms for conditionals ("if p, then q"). Pareto-minimal theory change corresponds exactly to three characteristic axioms of counterfactual systems: a theory revision operator that satisfies the Ramsey test validates these axioms if and only if the revision operator is Pareto-minimal.
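The Levi identity itself is compact enough to demonstrate. The toy Python sketch below works over bare literals rather than a full logic, so the "contraction" is deliberately crude; it only illustrates the compositional shape revise(p) = expand(contract(not-p), p), not a real AGM operator.

```python
# Toy sketch of the Levi identity over bare literals (a deliberately tiny
# fragment; genuine AGM operators act on deductively closed theories).
def negate(lit: str) -> str:
    return lit[1:] if lit.startswith("~") else "~" + lit

def contract(base: set, sentence: str) -> set:
    """A crude maxichoice-style contraction: give up the sentence itself."""
    return base - {sentence}

def expand(base: set, sentence: str) -> set:
    return base | {sentence}

def revise(base: set, sentence: str) -> set:
    # Levi identity: revise by p = contract by ~p, then expand by p.
    return expand(contract(base, negate(sentence)), sentence)

base = {"rain", "~wind"}
print(sorted(revise(base, "wind")))   # ['rain', 'wind']: '~wind' was retracted
```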

18.
According to Roy Sorensen (Philosophical Studies 100 (2000): 175–191), an object cannot differ aesthetically from its mirror image. On his view, mirror-reversing an object - changing its left/right orientation - cannot bring about any aesthetic change. However, in arguing for this thesis Sorensen assumes that aesthetic properties supervene on intrinsic properties alone. This is a highly controversial assumption and nothing is offered in its support. Moreover, a plausible weakening of the assumption does not improve the argument. Finally, Sorensen's second argument is shown to be formally flawed. As a result, the case for the aesthetic irrelevancy of orientation still seems open.

19.
There are at least two general theories for building probabilistic-dynamical systems: one is Markov theory and the other is quantum theory. These two mathematical frameworks share many fundamental ideas, but they also differ in some key properties. On the one hand, Markov theory obeys the law of total probability, but quantum theory does not; on the other hand, quantum theory obeys the doubly stochastic law, but Markov theory does not. Therefore, the decision about whether to use a Markov or a quantum system depends on which of these laws is empirically obeyed in an application. This article derives two general methods for testing these theories that are parameter-free, and presents a new experimental test. The article concludes with a review of experimental findings from cognitive psychology that evaluate these two properties.
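Both properties are easy to check numerically. The sketch below builds an illustrative row-stochastic Markov matrix and a doubly stochastic matrix of squared unitary amplitudes, then shows the law of total probability holding for the former and failing for the latter when an intermediate measurement is inserted. All matrices are my own illustrative choices.

```python
# Contrast the two properties named above with illustrative matrices.
import numpy as np

T = np.array([[0.9, 0.1],
              [0.4, 0.6]])                     # Markov: row-stochastic only
print(T.sum(axis=1), T.sum(axis=0))            # rows [1. 1.], cols [1.3 0.7]

theta = np.pi / 5
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # a unitary (rotation)
Q = np.abs(U) ** 2                             # quantum transition probabilities
print(Q.sum(axis=1), Q.sum(axis=0))            # both [1. 1.]: doubly stochastic

# Law of total probability holds for Markov dynamics ...
p0 = np.array([0.5, 0.5])
print(p0 @ T)                                  # = sum_i p0[i] * T[i, :]

# ... but fails for quantum amplitudes: inserting an intermediate measurement
# destroys the interference terms and changes the final distribution.
psi0 = np.sqrt(p0)                             # amplitudes whose squares give p0
final_no_measure = np.abs(psi0 @ U) ** 2
final_with_measure = p0 @ Q
print(final_no_measure, final_with_measure)    # the two distributions differ
```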

20.
Theory of Meaning
Research into logical syntax provides us with knowledge of the structure of sentences, while logical semantics provides a window onto the truth of sentences. It is therefore natural to make sentences and truth the central concern when one approaches the theory of meaning logically. Although their theories of meaning differ greatly, both Michael Dummett's theory and Donald Davidson's are concerned with sentences and truth and are developed in terms of truth. Logical theories and methods first introduced by G. Frege underwent great developments during the past century and have played an important role in the expansion of these two scholars' theories of meaning. Translated by Ma Minghui from Zhexue Yanjiu 哲学研究 (Philosophical Research), 2006, (7): 53–61.
