Similar Documents
20 similar documents found.
1.
The error theory is a metaethical theory that maintains that normative judgments are beliefs that ascribe normative properties, and that these properties do not exist. In a recent paper, Bart Streumer argues that it is impossible to fully believe the error theory. Surprisingly, he claims that this is not a problem for the error theorist: even if we can’t fully believe the error theory, the good news is that we can still come close to believing the error theory. In this paper I show that Streumer’s arguments fail. First, I lay out Streumer’s argument for why we can’t believe the error theory. Then, I argue against the unbelievability of the error theory. Finally, I show that Streumer’s positive proposal that we can come close to believing the error theory is actually undermined by his own argument for why we can’t believe the error theory.

2.
We have no reason to believe that reasons do not exist. Contra Bart Streumer’s recent proposal, this has nothing to do with our incapacity to believe this error theory. Rather, it is because if we know that if a proposition is true, we have no reason to believe it, then we have no reason to believe this proposition. From a different angle: if we know that we have at best misleading reasons to believe a proposition, then we have no reason to believe it. This has two consequences. Firstly, coming close to believing the error theory is idle or pointless. Secondly, philosophers who argue that believing sweeping theories like determinism or physicalism is self-defeating because they are either false or believed for no reason pursue a worthwhile argumentative strategy.
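The nested conditional in the abstract's central claim is easier to parse in schematic form. The following formalization is my own gloss, not the author's notation; read R(p) as "we have reason to believe p" and K as "we know that":

```latex
% My own schematic gloss of the abstract's central claim, with
% R(p) = "we have reason to believe p" and K = "we know that".
\[
  K\bigl(p \rightarrow \neg R(p)\bigr) \;\Longrightarrow\; \neg R(p)
\]
```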

3.
This paper contributes to the debate on whether we can have reason to do what we are unable to do. I take as my starting point two papers recently published in Philosophical Studies, by Bart Streumer and Ulrike Heuer, which defend the two dominant opposing positions on this issue. Briefly, whereas Streumer argues that we cannot have reason to do what we are unable to do, Heuer argues that we can have reason to do what we are unable to do when we can get closer to success but cannot have reason to try to do what we are unable to do when we cannot get closer to success. In this paper, I reject both positions as they are presented, on the grounds that neither can accommodate an important category of reasons, which are the reasons to realise and to try to realise dimensions of value that lie at the boundary of what is realisable, specifically, genuinely valuable ideals. I defend a third view that we can have reason to do and to try to do what we are unable to do even when we cannot, in Heuer’s sense, get closer to success. Moreover, I argue that we can have reason to realise and to try to realise genuinely valuable ideals for their own sake and not simply for the sake of achieving mundane, realisable ends.

4.
David Faraci, Philosophia (2013) 41(3): 751-755
In “The possibility of morality,” Phil Brown considers whether moral error theory is best understood as a necessary or contingent thesis. Among other things, Brown contends that the argument from relativity, offered by John Mackie—error theory’s progenitor—supports a stronger modal reading of error theory. His argument is as follows: Mackie’s is an abductive argument that error theory is the best explanation for divergence in moral practices. Since error theory will likewise be the best explanation for similar divergences in possible worlds similar to our own, we may conclude that error theory is true at all such worlds, just as it is in the actual world. I contend that Brown’s argument must fail, as abductive arguments cannot support the modal conclusions he suggests. I then consider why this is the case, concluding that Brown has stumbled upon new and interesting evidence that agglomerating one’s beliefs can be epistemically problematic—an issue associated most famously with Henry Kyburg’s lottery paradox.
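The agglomeration worry that the abstract invokes is easy to make concrete with a little arithmetic. Below is a minimal numeric sketch of Kyburg's lottery paradox; the 1000-ticket lottery and the 0.99 acceptance threshold are illustrative choices of mine, not figures from the paper:

```python
# Minimal numeric sketch of Kyburg's lottery paradox: each individual
# "ticket i loses" belief clears a high acceptance threshold, yet the
# agglomerated belief "no ticket wins" is guaranteed false.
n = 1000                   # illustrative: fair lottery, exactly one winner
threshold = 0.99           # illustrative acceptance threshold

p_single_loss = 1 - 1 / n  # P(ticket i loses) = 0.999
accept_each = p_single_loss >= threshold

p_conjunction = 0.0        # P(no ticket wins), given exactly one winner
accept_conjunction = p_conjunction >= threshold

print(f"P(ticket i loses) = {p_single_loss:.3f}, accepted: {accept_each}")
print(f"P(no ticket wins) = {p_conjunction:.3f}, accepted: {accept_conjunction}")
```

Each conjunct is individually acceptable while the conjunction is certainly false, which is exactly the sense in which agglomerating beliefs can be epistemically problematic.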

5.

Researchers in cultural evolutionary theory (CET) have recently proposed the foundation of a new field of research in cultural evolution named ‘epistemic evolution’. Drawing on evolutionary epistemology’s early studies, this programme aims to study science as an evolutionary cultural process. The paper discusses the way CET’s study of science can contribute to the philosophical debate and, vice versa, how the philosophy of science can benefit from the adoption of a cultural evolutionary perspective. Here, I argue that CET’s main contribution to an evolutionary model of scientific growth comes from the application of ‘population thinking’ to science. Populationism offers a ‘variation-based’ understanding of scientists’ epistemic and socio-epistemic criteria that is able to better accommodate the variegated preferences that intervene in scientific epistemic decisions. A discussion of the so-called theory-choice context is offered as an example of the way a populationist approach can shed new light on the operation of scientists’ epistemic choices.


6.
If, as the new tenseless theory of time maintains, there are no tensed facts, then why do our emotional lives seem to suggest that there are? This question originates with Prior’s ‘Thank Goodness That’s Over’ problem, and still presents a significant challenge to the new B-theory of time. We argue that this challenge has more dimensions to it than has been appreciated by those involved in the debate so far. We present an analysis of the challenge, showing the different questions that a B-theorist must answer in order to meet it. The debate has focused on the question of what is the object of my relief when an unpleasant experience is past. We outline the prevailing response to this question. The additional, and neglected, questions are, firstly, ‘Why does the same event elicit different emotional responses from us depending on whether it is in the past, present, or future?’ and secondly, ‘Why do we care more about proximate future pain than about distant future pain?’ We give B-theory answers to these questions, which appeal to evolutionary considerations.

7.
Abstract

In this paper, I argue for three main claims. First, that there are two broad sorts of error theory about a particular region of thought and talk, eliminativist error theories and non-eliminativist error theories. Second, that an error theory about rule-following can only be an eliminativist view of rule-following, and therefore an eliminativist view of meaning and content on a par with Paul Churchland’s prima facie implausible eliminativism about the propositional attitudes. Third, that despite some superficial appearances to the contrary, non-eliminativist error theory does not provide a plausible vehicle for understanding the ‘sceptical solution’ to the sceptical paradox about rule-following developed in Saul Kripke’s Wittgenstein on Rules and Private Language.

8.
The debate about how to solve the paradox of fiction has largely been a debate between Kendall Walton and the so-called thought theorists. In recent years, however, Jenefer Robinson has argued, based on her affective appraisal theory of emotion, for a noncognitivist solution to the paradox as an alternative to the thought theorists’ solution and especially to Walton’s controversial solution. In this article, I argue that, despite appearances to the contrary, Robinson’s affective appraisal theory is compatible with Walton’s solution, at the core of which lies the thesis that there are quasi-emotions. Moreover, since Robinson’s theory is compatible with Walton’s solution, I show how it can be used as a model to empirically test whether quasi-emotions exist.

9.
Abstract

In this paper I argue against Jürgen Habermas’s theoretical dualism between ethics and morality. I do this by showing how his account of normativity is vitiated by an unnecessary superposition of a social-evolutionary and a theoretical-linguistic account of normativity, and that this brings about theoretical problems that in the end cannot be overcome. I also show that Rainer Forst’s attempt at salvaging Habermas’s distinction is equally doomed to failure, but that his attempt nevertheless invites new and more fruitful avenues for normative theory that are worth exploring. The conclusion of this paper is that traditional notions of ethics and morality can be preserved provided we heavily redefine their meanings and release them from some of the theoretical work they have been expected to accomplish, but that to complete this transition we also need to supersede Forst’s pluralization of normative contexts toward a theory of normative practices that in the end makes the distinction between ethics and morality workable but useless. I begin by first locating the debate about ethics and morality within the context of recent normative theory (§1), and proceed to examine the two main strategies through which Habermas has elaborated his idea of a sharp dualism between ethics and morality (§2). I then introduce a theoretical distinction between what I call a horizontal and a vertical integration of ethics and morality (§3) and contend that whilst only the horizontal is viable, Habermas decidedly prefers the idea of a vertical integration (§4). With this work done, I proceed to complete my critique of Habermas’s argument and show how, by recovering the pragmatist roots of his thought, an alternative solution based on a functionalist understanding of morality could be envisaged (§5). I then conclude by examining Rainer Forst’s attempt at salvaging Habermas’s account, and show that the failure of Forst’s attempts opens the way for new and more fruitful approaches to normative theory which are more likely to recover the pragmatist roots of Habermas’s thought (§6).

10.
In this article, against the background of a notion of ‘assembled’ truth, the evolutionary progressiveness of a theory is suggested as a novel and promising explanation for the success of science. A new version of realism in science, referred to as ‘naturalised realism’, is outlined. Naturalised realism is ‘fallibilist’ in the unique sense that it captures and mimics the self-corrective core of scientific knowledge and its progress. It is argued that naturalised realism disarms Kyle Stanford’s anti-realist ‘new induction’ threats by showing that ‘explanationism’ and his ‘epistemic instrumentalism’ are just two positions among many on a constantly evolving continuum of options between instrumentalism and full-blown realism. In particular, it is demonstrated that not only can naturalised realism redefine the terms of realist debate in such a way that no talk of miracles need enter the debate, but it also promises interesting defenses against inductive- and underdetermination-based anti-realist arguments.

11.
My goal in this paper is to advance a long-standing debate about the nature of moral rights. The debate focuses on the questions: In virtue of what do persons possess moral rights? What could explain the fact that they possess moral rights? The predominant sides in this debate are the status theory and the instrumental theory. I aim to develop and defend a new instrumental theory. I take as my point of departure the influential view of Joseph Raz, which for all its virtues is unable to meet the challenge to the instrumentalist that I will address: the problem of justifying the enforcement of rights. I then offer a new instrumental theory in which duties are grounded on individuals’ interests, and individuals’ rights exist in virtue of the duties owed to them. I argue that my theory enables the instrumentalist to give the right sort of justification for enforcing rights.

12.
ABSTRACT

This paper describes how Locke’s Two Treatises of Government was read in Britain from Josiah Tucker to Peter Laslett. It focuses in particular upon how Locke’s readers responded to his detailed and lengthy engagement with the patriarchalist political thought of Sir Robert Filmer. In the second half of the eighteenth century, the debate between Locke and Filmer continued to provide the framework within which political obligation was discussed. A hundred years later that had changed, to the point where Locke’s readers found it unintelligible that he argued against Filmer and not Hobbes. I explain this in terms of the development in nineteenth-century Britain of a new conception of the history of political philosophy, the product of interest in the Hegelian theory of the state. The story told here is offered as one example of how understandings of the history of philosophy are shaped by understandings of philosophy itself.

13.
Fernando Leal, Argumentation (2022) 36(4): 541-567

This paper deals in detail with a fairly recent philosophical debate centered around the ability of the theory of natural selection to account for those phenotypical changes which can be argued to make organisms better adapted to their environments. The philosopher and cognitive scientist Jerry Fodor started the debate by claiming that natural selection cannot do the job. He follows two main lines of argumentation: one is based on an alleged conceptual defect in the theory, the other on alleged empirical problems in it as well as empirical alternatives to it. Four philosophers and two biologists respond in ways that might easily be described as fallacious. The paper relies on the pragma-dialectical ideal model of critical discussion to offer a step-by-step analysis of the whole debate, which ran across four issues of the London Review of Books, from October 2007 through January 2008. This pragma-dialectical analysis is carried out by constant reference to the various questions (problems, issues) that arise in the debate. The analysis includes as much detail as possible, both in Fodor’s original argument and in the critics’ various comments, as well as in Fodor’s replies across two rounds of debate. Since a simple negative evaluation in terms of fallacies is out of the question in view of the proven argumentative accomplishments of the participants, an alternative explanation is offered: the undeniable derailments in strategic maneuvering are due to the fact that, whilst ostensibly discussing the theory of natural selection, Fodor’s detractors are worried by an underlying issue, namely the dangers of discussing the merits and demerits of natural selection as a theory of evolution in a venue as exposed to the general public as the London Review of Books, given the religiously inspired movements that threaten the teaching of evolutionary biology in schools.


14.
Kentaro Fujimoto, Synthese (2019) 196(3): 1045-1069

The conservativeness argument poses a dilemma to deflationism about truth, according to which a deflationist theory of truth must be conservative but no adequate theory of truth is conservative. The debate on the conservativeness argument has so far been framed in a specific formal setting, where theories of truth are formulated over arithmetical base theories. I will argue that the appropriate formal setting for evaluating the conservativeness argument is provided not by theories of truth over arithmetic but by those over subject matters ‘richer’ than arithmetic, such as set theory. The move to this new formal setting provides deflationists with better defence and brings a broader perspective to the debate.
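As background, the standard proof-theoretic notion at issue can be stated compactly; this is the textbook definition, not a formulation quoted from Fujimoto's paper. A theory of truth T, built over a base theory B by extending B's language L_B with a truth predicate, is conservative over B when it proves no new truth-free theorems:

```latex
% Standard proof-theoretic definition of conservativeness; background
% only, not quoted from the paper under discussion.
\[
  \text{$T$ is conservative over $B$} \iff
  \forall \varphi \in \mathrm{Sent}(L_B):\;
  \bigl( B + T \vdash \varphi \;\Rightarrow\; B \vdash \varphi \bigr)
\]
```

The dilemma then turns on the fact that deflationary truth is supposed to add no such theorems, while truth theories that look adequate typically do add some, for instance by proving the consistency of the base theory.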


15.
In 1884 Samuel Butler published a collection of essays entitled Remarks on George Romanes’ Mental Evolution, in which he attempted to show how Romanes’ idea of mental evolution presented similarities with his own theory of unconscious memory. By looking at Romanes’ work through Butler’s writing, this article will reevaluate some aspects of their works regarding the complex debate about memory, heredity, and instinct. The paper will explore the main differences and similarities between Romanes’ science and Butler’s writing on science, in terms of both their ideas and their content. It will then look into their different professional relationships with Darwin and how these relationships shaped the professional and public reception of their theories.

16.
Marion Vorms, Synthese (2013) 190(2): 293-319
Linkage (or genetic) maps are graphs, which are intended to represent the linear ordering of genes on the chromosomes. They are constructed on the basis of statistical data concerning the transmission of genes. The invention of this technique in 1913 was driven by Morgan’s group’s adoption of a set of hypotheses concerning the physical mechanism of heredity. These hypotheses were themselves grounded in Morgan’s defense of the chromosome theory of heredity, according to which chromosomes are the physical basis of genes. In this paper, I analyze the 1919 debate between William Castle and Morgan’s group about the construction of genetic maps. The official issue of the debate concerns the arrangement of genes on chromosomes. However, the disputants tend to conduct the discussion in terms of how one should model the data in order to draw predictions concerning the transmission of genes; the debate does not bear on the data themselves, nor does it focus on the hypotheses explaining those data. The main criteria the protagonists appeal to are simplicity and predictive efficacy. However, I show that both parties’ assessments of the simplicity and predictive efficacy of different ways of modeling the data themselves depend on background theoretical positions. I aim to clarify how preference for a given model and theoretical commitments interact.
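To make the statistical core of the technique concrete: the recombination frequency between two loci, estimated from cross data, is converted into an additive map distance. The sketch below uses made-up counts and Haldane's mapping function; it illustrates the general method of linkage mapping, not the specific calculations debated by Castle and Morgan's group:

```python
import math

# Toy sketch of the statistical core of linkage mapping (illustrative
# numbers, not the 1919 data): recombination frequency between two
# loci, estimated from cross data, is converted into a map distance.

def recombination_frequency(recombinants: int, total: int) -> float:
    """Fraction of offspring showing a crossover between two loci."""
    return recombinants / total

def haldane_distance_cm(r: float) -> float:
    """Haldane's mapping function: map distance in centimorgans,
    correcting for multiple (unobserved) crossovers between the loci."""
    return -50.0 * math.log(1.0 - 2.0 * r)

r_ab = recombination_frequency(recombinants=120, total=1000)  # r = 0.12
print(f"r(A,B) = {r_ab:.2f}  ->  {haldane_distance_cm(r_ab):.1f} cM")
```

Ordering genes so that these pairwise distances are (approximately) additive is what yields the linear arrangement on the map, which is precisely the feature at stake in the Castle-Morgan dispute.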

17.
The German ‘headscarf debate’ was sparked off by a dispute concerning a teacher who refused to remove her hijab at work. ‘Case Ludin’ brought the issue to national attention and eventually led to new legislation in half of Germany's 16 federated states. This article focuses on a critical analysis of a party-political debate around Case Ludin in the Baden-Württemberg parliament in 1998. The analysis shows that whilst party-politicians claimed to be concerned with issues of social justice as well as with the protection of constitutional rights and democratic values, the party-political arena of this debate has been preoccupied with the discursive construction of German national identity and its assumed incompatibility with Muslim identity. It comes to the conclusion that discourses used in this debate reproduce stereotypical images associating Islam with ‘gendered oppression’, political extremism and irreconcilable difference, and that these discourses continue to shape current debates in Germany and beyond.

18.
Over the centuries, especially since the Jansenist controversy, much ink has been spilled defending and criticizing Augustine’s contentious interpretation of the revealed doctrines of predestination and reprobation. Instead of attempting to trace the entire debate or adjudicate the exegetical questions, I attempt the more modest task of analyzing how Augustine’s massa damnata theory of election has been re-received in modern Catholic scholarship. Thus, leaving aside the historical and exegetical complexity of the issue, I argue for a particular conceptual appropriation of Augustine’s theory in line with a contemporary Catholic theology of grace and predestination.

19.
This paper is meant to link the philosophical debate concerning the underdetermination of theories by evidence with a rather significant socio-political issue that has been taking place in Canada over the past few years: the so-called ‘death of evidence’ controversy. It places this debate within a broader philosophical framework by discussing the connection between evidence and theory; by bringing out the role of epistemic values in the so-called scientific method; and by examining the role of social values in science. While it should be admitted that social values play an important role in science, the key question for anyone who advocates this view is: what and whose values? The way it is answered makes an important epistemic difference to how the relation between evidence and theory is appraised. I first review various arguments for the claim that evidence underdetermines theory and show their presuppositions and limitations, using conceptual analysis and historical examples. After broaching the relation between evidence and method in science by highlighting the need to incorporate epistemic values into the scientific method, I focus my discussion on recent arguments for the role of social values in science. Finally, I address the implications of the approach outlined for the current ‘death of evidence’ debate in Canada.

20.
Michiru Nagatsu, Synthese (2013) 190(12): 2267-2289
In this paper I examine Don Ross’s application of unificationism as a methodological criterion of theory appraisal in economics and cognitive science. Against Ross’s critique that explanations of the preference reversal phenomenon by the ‘heuristics and biases’ programme are ad hoc or ‘Ptolemaic’, I argue that the compatibility hypothesis, one of the explanations offered by this programme, is theoretically and empirically well-motivated. A careful examination of this hypothesis suggests several strengths of a procedural approach to modelling cognitive processes underlying individual decision making, compared to a multiple-agent approach which Ross promotes. I argue that the debate between economists and psychologists is both theoretical and empirical, but cannot be resolved by appealing to the ideal of unification.
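The compatibility hypothesis lends itself to a toy simulation. The sketch below is my own illustration; the bets, weights, and value function are invented, not drawn from Nagatsu or the original experiments. Weighting an attribute more heavily when it matches the response scale is already enough to produce the classic choice/pricing reversal between a high-probability ‘P-bet’ and a high-payoff ‘$-bet’:

```python
# Toy illustration of the compatibility hypothesis (my construction,
# not Nagatsu's model or the original experiments): inflating the
# monetary attribute in pricing tasks, because it matches the dollar
# response scale, produces a preference reversal.

p_bet = {"prob": 0.95, "payoff": 4.0}    # high chance, small prize
d_bet = {"prob": 0.30, "payoff": 16.0}   # low chance, large prize

def value(bet: dict, payoff_weight: float) -> float:
    """Crude weighted valuation: payoff_weight > 1 inflates the
    monetary attribute relative to probability."""
    return bet["prob"] * bet["payoff"] ** payoff_weight

# Choice task: probability is the more response-compatible attribute.
chosen = max((p_bet, d_bet), key=lambda b: value(b, payoff_weight=0.7))
# Pricing task: the dollar attribute matches the response scale.
priced = max((p_bet, d_bet), key=lambda b: value(b, payoff_weight=1.2))

print("chosen in choice task:", "P-bet" if chosen is p_bet else "$-bet")
print("priced higher in pricing task:", "P-bet" if priced is p_bet else "$-bet")
```

With these numbers the P-bet wins the choice task while the $-bet receives the higher price, mirroring the reversal pattern the ‘heuristics and biases’ programme set out to explain.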
