Similar Documents
20 similar documents found.
1.
2.
Raul Hakli & Sara Negri, Synthese, 2012, 187(3): 849-867
Various sources in the literature claim that the deduction theorem does not hold for normal modal or epistemic logic, whereas others present versions of the deduction theorem for several normal modal systems. It is shown here that the apparent problem arises from an objectionable notion of derivability from assumptions in an axiomatic system. When a traditional Hilbert-type system of axiomatic logic is generalized into a system for derivations from assumptions, the necessitation rule has to be modified in a way that restricts its use to cases in which the premiss does not depend on assumptions. This restriction is entirely analogous to the restriction on the rule of universal generalization in first-order logic. A necessitation rule with this restriction permits a proof of the deduction theorem in its usual formulation. Other suggestions presented in the literature to deal with the problem are reviewed, and the present solution is argued to be preferable to the other alternatives. A contraction- and cut-free sequent calculus equivalent to the Hilbert system for basic modal logic shows the standard failure argument to be untenable by proving the underivability of $\Box A$ from $A$.
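To make the restriction concrete, the rule and theorem at issue can be glossed as follows (a standard reconstruction, not the authors' exact formulation):

\[
  \frac{\vdash A}{\vdash \Box A}\ (\text{Nec, restricted})
  \qquad\qquad
  \Gamma, A \vdash B \;\Longrightarrow\; \Gamma \vdash A \to B \ (\text{deduction theorem})
\]

Without the restriction, the trivial derivation $A \vdash A$ would license $A \vdash \Box A$, and the deduction theorem would then deliver $\vdash A \to \Box A$, which is not a theorem of basic modal logic; blocking that step is what undermines the standard failure argument.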

3.
Timo Kajamies, Philosophia, 2009, 37(3): 525-534
In his topical article, Andrew Cling claims that the best extant formulation of the so-called epistemic regress problem rests on five assumptions that are too strong. Cling offers an improved version that rests on a different set of three core epistemic assumptions, each of which he argues for. Although I owe a great deal to Cling's ideas, I argue that the epistemic regress problem surfaces from more fundamental assumptions than those Cling offers. There are ultimately two core assumptions, in fact two contradictory strands within the concept of epistemic support, which jointly create a powerful challenge for our pursuit of paramount epistemic values.

4.
The Gestalt psychologists' view of restructuring and the associated phenomenon of insight is discussed and related to findings in modern cognitive psychology. In line with Ohlsson (1984b), it is assumed that search in semantic memory is an indispensable part of restructuring. However, in contrast to Ohlsson's (1984b) information-processing theory of restructuring and insight, the present paper focuses on the role of mental models. It is asserted that the Gestalt approach to problem solving is compatible with the idea that a mental model is manipulated. The paper discusses three assumptions about restructuring and insight, all of which are related to mental models: (a) restructuring involves manipulating a mental model; (b) the experience of insight is based on "seeing" something in a mental model; (c) restructuring aims at realizing structural balance in a mental model. Assessing the validity of these three assumptions is presented as a challenge for future research on human problem solving.

5.

This paper discusses the metaphilosophical assumptions that have dominated analytic philosophy of mind and how they gave rise to the central question that the best-known forms of non-reductivism available have sought to answer, namely: how can mind fit within nature? Its goal is to make room for forms of non-reductivism that have challenged the fruitfulness of this question and taken a different approach to the so-called "placement" problem. Rather than trying to solve the placement problem, the forms of non-reductivism discussed in this paper put pressure on the metaphilosophical assumptions that gave rise to the question of the place of mind in nature in the first instance.

6.
Recent research has reported successful training interventions that improve insight problem solving. In some ways this is surprising, because the processes involved in insight solutions are often assumed to be unconscious, whereas the training interventions focus on conscious cognitive strategies. We propose one mechanism that may help to explain this apparent disconnect. Recognition of a barrier to progress during insight problem solving may provide a point of access to the tacit constraining assumptions that have misled the solution process. We tested this proposal in an experiment that examined the effects of different training routines on problem solving. The experiment compared four training routines, focusing either on barriers and assumptions combined, barriers alone, assumptions alone, or goals, with two control conditions. Outcomes were measured using eleven spatial insight problems. The results indicated that training that combined focus on barriers and assumptions was significantly more effective than all other conditions, supporting the proposition that recognizing and reinterpreting barriers may assist in surfacing the unwarranted assumptions that prevent problem solving.

7.
McNemar's problem concerns the hypothesis of equal probabilities for the unlike pairs of correlated binary variables. We consider four different extensions of this problem, each testing the simultaneous equality of proportions of unlike pairs in c independent populations of correlated binary variables, but each under different assumptions and/or additional hypotheses. For each extension, both the likelihood ratio test and the goodness-of-fit chi-square test are given. When c = 1, all cases reduce to McNemar's problem. For c ≥ 2, however, the tests are quite different, depending on exactly how McNemar's hypothesis and alternatives are extended. An example illustrates how widely the results may differ, depending on which extended framework is appropriate.
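To ground the c = 1 base case, here is a minimal sketch of the classic single-population McNemar test; the counts and variable names are hypothetical illustration, and the paper's c-population extensions are not implemented here.

import numpy as np
from scipy.stats import chi2, binomtest

# Hypothetical paired binary outcomes: rows = response under condition A,
# columns = response under condition B (illustrative counts only).
table = np.array([[30, 12],
                  [25, 33]])
b = table[0, 1]  # unlike pairs of one kind (A = 0, B = 1)
c = table[1, 0]  # unlike pairs of the other kind (A = 1, B = 0)

# Asymptotic McNemar statistic: (b - c)^2 / (b + c) ~ chi-square with df = 1 under H0.
stat = (b - c) ** 2 / (b + c)
print("asymptotic p =", chi2.sf(stat, df=1))

# Exact version: under H0, b ~ Binomial(b + c, 1/2).
print("exact p =", binomtest(b, n=b + c, p=0.5).pvalue)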

8.
The “exchange paradox”—also referred to in the literature by a variety of other names, notably the “two-envelopes problem”—is notoriously difficult, and experts are not all agreed as to its resolution. Some of the various expressions of the problem are open to more than one interpretation; some are stated in such a way that assumptions are required in order to fill in missing information that is essential to any resolution. In three experiments, several versions of the problem were used, in each of which the information given was sufficient to determine an optimal choice strategy when one exists or to justify indifference between keeping and trading when it does not. College students who were presented with the various versions of the problem tended to base their choices on simple heuristics and gave little evidence of understanding the probabilistic implications of the differences in the problem statements.
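For readers new to the paradox, the standard (fallacious) expectation computation that drives it is worth displaying; this is the textbook setup, not a formulation taken from the study:

\[
  \mathbb{E}[\text{other} \mid \text{mine} = x]
  \;=\; \tfrac{1}{2}(2x) + \tfrac{1}{2}\left(\tfrac{x}{2}\right)
  \;=\; \tfrac{5}{4}\,x \;>\; x,
\]

which seems to recommend trading for every $x$ and, by symmetry, trading straight back. The flaw is treating the two $\tfrac{1}{2}$ probabilities as correct conditional on every observed amount, which no well-defined prior over envelope contents can support.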

9.
Thor Grünbaum, Synthese, 2018, 198(17): 4045-4068

This paper concerns local yet systematic problems of contrastive underdetermination of model choice in cognitive neuroscience debates about the so-called two visual systems hypothesis. The underdetermination problem is systematically generated by the way certain assumptions about the representationalist nature of computation are translated into experimental practice. The problem is that behavioural data underdetermine the choice between competing representational models. In this paper, I diagnose how these assumptions generate underdetermination problems in the choice between competing functional models of perception–action. Using the tools of philosophy of science, I describe the type of underdetermination and sketch a possible cure.


10.
As Bayesian methods become more popular among behavioral scientists, they will inevitably be applied in situations that violate the assumptions underpinning typical models used to guide statistical inference. With this in mind, it is important to know something about how robust Bayesian methods are to the violation of those assumptions. In this paper, we focus on the problem of contaminated data (such as data with outliers or conflicts present), with specific application to the problem of estimating a credible interval for the population mean. We evaluate five Bayesian methods for constructing a credible interval, using toy examples to illustrate the qualitative behavior of different approaches in the presence of contaminants, and an extensive simulation study to quantify the robustness of each method. We find that the “default” normal model used in most Bayesian data analyses is not robust, and that approaches based on the Bayesian bootstrap are only robust in limited circumstances. A simple parametric model based on Tukey’s “contaminated normal model” and a model based on the t-distribution were markedly more robust. However, the contaminated normal model had the added benefit of estimating which data points were discounted as outliers and which were not.
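As a toy illustration of the qualitative point, the following minimal sketch contrasts a normal-likelihood credible interval with a Student-t one on data containing a single outlier. The settings (known scales, a flat grid prior over the mean, df = 3) are assumptions for illustration, not the paper's actual model specifications.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(0.0, 1.0, 29), [15.0]])  # one contaminant

mu_grid = np.linspace(-5, 10, 4001)  # flat prior over this grid

def credible_interval(loglik_fn, level=0.95):
    """Grid-approximate the posterior over mu and return a central interval."""
    loglik = np.array([loglik_fn(mu) for mu in mu_grid])
    post = np.exp(loglik - loglik.max())
    post /= post.sum()
    cdf = np.cumsum(post)
    lo = mu_grid[np.searchsorted(cdf, (1 - level) / 2)]
    hi = mu_grid[np.searchsorted(cdf, 1 - (1 - level) / 2)]
    return lo, hi

normal_ci = credible_interval(lambda mu: stats.norm.logpdf(data, mu, 1.0).sum())
t_ci = credible_interval(lambda mu: stats.t.logpdf(data, df=3, loc=mu, scale=1.0).sum())

print("normal model 95% CI:", normal_ci)  # pulled toward the outlier
print("t model 95% CI:     ", t_ci)       # stays near the bulk of the data

The normal interval is dragged toward the contaminant while the t interval stays near the bulk of the data, matching the robustness ordering reported above.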

11.
Lewis's problem of temporary intrinsics is notoriously hard to understand, perhaps because it calls into question basic concepts that we take for granted. This paper attempts to make the problem accessible by stating it concisely and analyzing several possible solutions. In addition, it presents a simple and intelligible way of generating the Bradley regress.

12.
13.
Part of the general problem in the anthropology of Buddhism, as I demonstrate in this article, is that the theoretical significance of the fact that the category 'Buddhism' is a recent and Western invention has not been sufficiently appreciated. As a result, the anthropology of 'Sinhala Buddhism' continues to address the ahistorical and essentialist questions of who Buddhists are and who they are not. In my view, such questions can only serve to further entrench essentialist assumptions about 'authentic Buddhism'. In contrast, I explain how recent scholarship has challenged such established academic assumptions as what Buddhism is and who Buddhists are, and has proposed questions of a different kind.

14.
The existence of tradeoffs between speed and accuracy is an important interpretative problem in choice reaction time (RT) experiments. A recently suggested solution to this problem is the use of complete speed-accuracy tradeoff functions as the primary dependent variable in choice RT experiments, instead of a single mean RT and error rate. This paper reviews and compares existing procedures for generating empirical speed-accuracy tradeoff functions for use as dependent variables in choice RT experiments. Two major types of tradeoff function are identified, and their experimental designs and computational procedures are discussed and evaluated. Systematic disparities between the two tradeoff functions are demonstrated in both empirical and computer-simulated data. Although all existing procedures for generating speed-accuracy tradeoff functions involve empirically untested assumptions, one procedure requires less stringent assumptions and is less sensitive to sources of experimental and statistical error. This procedure involves plotting accuracy against RT over a set of experimental conditions in which subjects' criteria for speed versus accuracy are systematically varied.

15.
Discussions of the problem of evil presuppose and appeal to axiological and metaethical assumptions, but seldom pay adequate attention to those assumptions. I argue that certain theories of value are consistent with theistic answers to the argument from evil and that several other well-known theories of value, such as hedonism, are difficult, if not impossible, to reconcile with theism. Although moral realism is the subject of lively debate in contemporary philosophy, almost all standard discussions of the problem of evil presuppose the truth of moral realism. I explain the implications of several nonrealist theories of value for the problem of evil and argue that, if nonrealism is true, then we need to rethink and reframe the entire discussion about the problem of evil.

16.
The probability-distance hypothesis states that the probability with which one stimulus is discriminated from another is a function of some subjective distance between these stimuli. The analysis of this hypothesis within the framework of multidimensional Fechnerian scaling yields the following results. If the hypothetical subjective metric is internal (which means, roughly, that the distance between two stimuli equals the infimum of the lengths of all paths connecting them), then the underlying assumptions of Fechnerian scaling are satisfied and the metric in question coincides with the Fechnerian metric. Under the probability-distance hypothesis, the Fechnerian metric exists (i.e., the underlying assumptions of Fechnerian scaling are satisfied) if and only if the hypothetical subjective metric is internalizable, which means, roughly, that by a certain transformation it can be made to coincide in the small with an internal metric; and then this internal metric is the Fechnerian metric. The specialization of these results to unidimensional stimulus continua is closely related to the so-called Fechner problem proposed in the 1960s as a substitute for Fechner's original theory.
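In standard notation, the hypothesis and the internal-metric condition referenced above can be glossed as follows (a reconstruction from the general literature, not the paper's exact formulation): the discrimination probability $\psi(a,b)$ is an increasing function $F$ of distance, and a metric $D$ on the stimulus space $S$ is internal when it coincides with the infimum of the lengths of connecting paths, with length induced by $D$ itself:

\[
  \psi(a,b) = F\bigl(D(a,b)\bigr),
  \qquad
  D(a,b) = \inf_{\substack{\gamma : [0,1] \to S \\ \gamma(0) = a,\; \gamma(1) = b}} L_D(\gamma),
  \qquad
  L_D(\gamma) = \sup_{0 = t_0 < \cdots < t_n = 1} \sum_{i=1}^{n} D\bigl(\gamma(t_{i-1}), \gamma(t_i)\bigr).
\]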

17.
Edgar Schein (1969, 1978, 1989, 1990) has proposed three models of consultation based on assumptions inherent in different helping styles. The first two models, purchase-of-expertise and doctor-patient, focus on the content of organizational problems. The client gives the problem to the consultant to find and implement solutions. The process consultation model focuses on how organizational problems are solved. The client and consultant collaborate throughout the problem-solving effort to find workable solutions. Because of the nature of organizations' cultures and the underlying assumptions that determine how they operate, Schein has suggested that it is most efficacious for consultants to begin in the process consultation mode and involve the client in the investigation of the problem.

18.

This paper discusses the reasons for the persistence of the disease model of alcoholism in the face of increasing evidence contradicting its basic assumptions. Data from in-depth interviews with former problem drinkers are used to illustrate the ways in which the disease model: 1) shapes the empirical reality from which data for its support are derived, 2) implies methodological strategies which limit the possibility of its refutation, and 3) blinds its supporters to the meaning of anomalous data. Implications for the immediate future of the disease model are discussed.

19.
This article seeks to state, first, what traditionally has been assumed must be the case in order for an infinite epistemic regress to arise; it identifies three assumptions. It then discusses how Jeanne Peijnenburg and David Atkinson set up their argument for the claim that some infinite epistemic regresses can actually be completed, and hence that, in addition to foundationalism, coherentism, and infinitism, there is yet another (if only partial) solution to the traditional epistemic regress problem. The article argues that Peijnenburg and Atkinson fail to address the traditional regress problem, because they do not adopt all three of the assumptions that underlie it. It also points to a problem in the notion of making probable that Peijnenburg and Atkinson use in their account of justification.

20.
At a Christian university, 167 subjects completed questionnaires measuring worldview assumptions, religious problem solving, physical and emotional abuse, and subjects' beliefs about whether they had been "abused." Results indicated that worldview assumptions were not related to actual abuse histories. Instead, such assumptions were related to the subjects' beliefs that they had been abused. Subjects who believed that they had been abused had more negative views of the impersonal world, people, and themselves; they were also more likely to see events as random. Both actual abuse history and subjects' beliefs that they had been abused were related to religious problem-solving styles. Finally, problem-solving styles were related to various worldview assumptions. Results are discussed in terms of previous research on abuse and in the psychology of religion.
