Similar Literature
20 similar documents found (search time: 15 ms)
1.
Natural deduction systems were motivated by the desire to define the meaning of each connective by specifying how it is introduced and eliminated from inference. In one sense, this attempt fails, for it is well known that propositional logic rules (however formulated) underdetermine the classical truth tables. Natural deduction rules are too weak to enforce the intended readings of the connectives; they allow non-standard models. Two reactions to this phenomenon appear in the literature. One is to try to restore the standard readings, for example by adopting sequent rules with multiple conclusions. Another is to explore what readings the natural deduction rules do enforce. When the notion of a model of a rule is generalized, it is found that natural deduction rules express “intuitionistic” readings of their connectives. A third approach is presented here. The intuitionistic readings emerge when models of rules are defined globally, but the notion of a local model of a rule is also natural. Using this benchmark, natural deduction rules enforce exactly the classical readings of the connectives, while this is not true of axiomatic systems. This vindicates the historical motivation for natural deduction rules. One odd consequence of using the local model benchmark is that some systems of propositional logic are not complete for the semantics that their rules express. Parallels are drawn with incompleteness results in modal logic to help make sense of this.
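One standard illustration of why the rules underdetermine the classical truth tables (along the lines of Carnap's well-known observation; not drawn from the paper itself): the valuation that makes every formula true satisfies every natural deduction rule, yet violates the classical table for negation.

```latex
% Let v(\varphi) = T for every formula \varphi.
% Then every rule trivially preserves truth under v, e.g. modus ponens:
\frac{\varphi \qquad \varphi \to \psi}{\psi}
\qquad v(\varphi) = v(\varphi \to \psi) = v(\psi) = T
% But v is non-standard: v(A) = v(\neg A) = T,
% contradicting the classical truth table for \neg.
```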

2.
Prawitz proved a theorem, formalising ‘harmony’ in Natural Deduction systems, which showed that corresponding to any deduction there is one to the same effect but in which no formula occurrence is both the consequence of an application of an introduction rule and the major premise of an application of the related elimination rule. As Gentzen ordered the rules, certain rules in Classical Logic had to be excepted, but if we see the appropriate rules instead as rules for Contradiction, then we can extend the theorem to the classical case. Properly arranged, there is a thoroughgoing ‘harmony’ in the classical rules. Indeed, as we shall see, they are, all together, far more ‘harmonious’ in the general sense than has been commonly observed. As this paper will show, the appearance of disharmony has only arisen because of the illogical way in which natural deduction rules for Classical Logic have been presented.

3.
Methods available for the axiomatization of arbitrary finite-valued logics can be applied to obtain sound and complete intelim rules for all truth-functional connectives of classical logic including the Sheffer stroke (nand) and Peirce’s arrow (nor). The restriction to a single conclusion in standard systems of natural deduction requires the introduction of additional rules to make the resulting systems complete; these rules are nevertheless still simple and correspond straightforwardly to the classical absurdity rule. Omitting these rules results in systems for intuitionistic versions of the connectives in question.

4.
Journal of Applied Logic, 2015, 13(3): 188–196
The purpose of this brief note is to prove a limitative theorem for a generalization of the deduction theorem. I discuss the relationship between the deduction theorem and rules of inference. Often when the deduction theorem is claimed to fail, particularly in the case of normal modal logics, it is the result of a confusion over what the deduction theorem is trying to show. The classic deduction theorem is trying to show that all so-called ‘derivable rules’ can be encoded into the object language using the material conditional. The deduction theorem can be generalized in the sense that one can attempt to encode all types of rules into the object language. When a rule is encoded in this way I say that it is reflected in the object language. What I show, however, is that certain logics which reflect a certain kind of rule must be trivial. Therefore, my generalization of the deduction theorem does fail where the classic deduction theorem didn't.
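For orientation, the classic deduction theorem and the kind of encoding the abstract calls “reflection” can be stated as follows (a standard rendering, not the paper's exact formulation):

```latex
% Classic deduction theorem: derivability from an assumption
% is encoded by the material conditional.
\Gamma \cup \{A\} \vdash B
\quad\Longleftrightarrow\quad
\Gamma \vdash A \to B
% The generalization asks when an arbitrary rule
%   A_1, \ldots, A_n \,/\, B
% is "reflected" in the object language as a theorem:
\vdash A_1 \to (A_2 \to \cdots \to (A_n \to B)\cdots)
```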

5.

The best-known syntactic account of the logical constants is inferentialism . Following Wittgenstein’s thought that meaning is use, inferentialists argue that meanings of expressions are given by introduction and elimination rules. This is especially plausible for the logical constants, where standard presentations divide inference rules in just this way. But not just any rules will do, as we’ve learnt from Prior’s famous example of tonk, and the usual extra constraint is harmony. Where does this leave identity? It’s usually taken as a logical constant but it doesn’t seem harmonious: standardly, the introduction rule (reflexivity) only concerns a subset of the formulas canvassed by the elimination rule (Leibniz’s law). In response, Read [5, 8] and Klev [3] amend the standard approach. We argue that both attempts fail, in part because of a misconception regarding inferentialism and identity that we aim to identify and clear up.

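The rules at issue in the abstract above, sketched in standard natural-deduction notation: Prior's tonk pairs a disjunction-like introduction with a conjunction-like elimination, while identity's introduction rule (reflexivity) mentions only self-identities even though its elimination rule (Leibniz's law) governs arbitrary identity statements.

```latex
% Prior's tonk: any A proves A tonk B, which proves B.
\frac{A}{A \;\mathrm{tonk}\; B}\ (\mathrm{tonk\text{-}I})
\qquad
\frac{A \;\mathrm{tonk}\; B}{B}\ (\mathrm{tonk\text{-}E})
% Identity: reflexivity as introduction, Leibniz's law as elimination.
\frac{\strut}{t = t}\ (=\text{-I})
\qquad
\frac{t = u \qquad \varphi(t)}{\varphi(u)}\ (=\text{-E})
```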

6.
From the point of view of proof-theoretic semantics, it is argued that the sequent calculus with introduction rules on the assertion and on the assumption side represents deductive reasoning more appropriately than natural deduction. In taking consequence to be conceptually prior to truth, it can cope with non-well-founded phenomena such as contradictory reasoning. The fact that, in its typed variant, the sequent calculus has an explicit and separable substitution schema in form of the cut rule, is seen as a crucial advantage over natural deduction, where substitution is built into the general framework.

7.
8.
The introduction and elimination rules for material implication in natural deduction are not complete with respect to the implicational fragment of classical logic. A natural way to complete the system is through the addition of a new natural deduction rule corresponding to Peirce’s formula (((A → B) → A) → A). E. Zimmermann [6] has shown how to extend Prawitz’ normalization strategy to Peirce’s rule: applications of Peirce’s rule can be restricted to atomic conclusions. The aim of the present paper is to extend Seldin’s normalization strategy to Peirce’s rule by showing that every derivation Π in the implicational fragment can be transformed into a derivation Π′ such that no application of Peirce’s rule in Π′ occurs above applications of →-introduction and →-elimination. As a corollary of Seldin’s normalization strategy we obtain a form of Glivenko’s theorem for the classical {→}-fragment.
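A common natural-deduction rendering of the rule corresponding to Peirce's formula (a sketch; the paper's and Zimmermann's exact formulations may differ in detail): if A is derivable under the assumption A → B, discharge that assumption and conclude A outright.

```latex
% Peirce's formula:
((A \to B) \to A) \to A
% Corresponding rule: derive A from the assumption A \to B,
% then discharge the assumption (marked n) and conclude A.
\begin{array}{c}
[A \to B]^{n} \\
\vdots \\
A \\
\hline
A
\end{array}
\quad (\text{Peirce's rule},\, n)
```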

9.
M. W. Bunder, Studia Logica, 1982, 41(2–3): 95–108
The standard deduction theorem or introduction rule for implication for classical logic is also valid for intuitionistic logic, but just as with predicate logic, other rules of inference have to be restricted if the theorem is to hold for weaker implicational logics. In this paper we look in detail at special cases of the Gentzen rule for → and show that various subsets of these in effect constitute deduction theorems determining all the theorems of many well-known as well as less well-known implicational logics. In particular, systems of rules are given which are equivalent to the relevance logics E, R, T, P-W and P-W-I.

10.
Shawn Standefer, Studia Logica, 2019, 107(6): 1103–1134

Two common forms of natural deduction proof systems are found in the Gentzen–Prawitz and Jaśkowski–Fitch systems. In this paper, I provide translations between proofs in these systems, pointing out the ways in which the translations highlight the structural rules implicit in the systems. These translations work for classical, intuitionistic, and minimal logic. I then provide translations for classical S4 proofs.


11.
The idea of an ‘inversion principle’, and the name itself, originated in the work of Paul Lorenzen in the 1950s as a method to generate new admissible rules within a certain syntactic context. Some fifteen years later, the idea was taken up by Dag Prawitz to devise a strategy of normalization for natural deduction calculi (an analogue of Gentzen's cut-elimination theorem for sequent calculi). Later, Prawitz used the inversion principle again, attributing to it a semantic role. Still working in natural deduction calculi, he formulated a general type of schematic introduction rules to be matched – thanks to the idea supporting the inversion principle – by a corresponding general schematic elimination rule. This was an attempt to provide a solution to the problem suggested by the often-quoted note of Gentzen, according to which ‘it should be possible to display the elimination rules as unique functions of the corresponding introduction rules on the basis of certain requirements’. Many people have since worked on this topic, which can appropriately be seen as the birthplace of what are now referred to as “general elimination rules”, recently studied thoroughly by Sara Negri and Jan von Plato. In this study, we retrace the main threads of this chapter of proof-theoretical investigation, using Lorenzen's original framework as a general guide.

12.
How natural is natural deduction? – Gentzen's system of natural deduction intends to fit logical rules to effective mathematical reasoning in order to overcome the artificiality of deductions in axiomatic systems (¶ 2). In spite of this reform, some of Gentzen's rules for natural deduction are criticised by psychologists and philosophers of natural language for remaining unnatural. The criticism focuses on the principle of extensionality and on the formalism of logic (¶ 3). After sketching the criticism of the main rules, I argue that the criteria of economy, simplicity, pertinence, etc., on which the objections are based, transcend the strict domain of logic and apply to arguments in general (¶ 4). ¶ 5 deals with Frege's critique of the concept of naturalness as regards logic. It is shown that this concept amounts to a regression into psychologism and is exposed to the same difficulties: relativity, lack of precision, and the error of arguing from ‘is’ to ‘ought’ (the naturalistic fallacy). Despite this, the concept of naturalness plays the role of a diffuse ideal which favours the construction of alternative deductive systems, in contrast to the platonic conception of logic (¶ 6).

13.
The analysis of atomic sentences and their subatomic components poses a special problem for proof-theoretic approaches to natural language semantics, as it is far from clear how their semantics could be explained by means of proofs rather than denotations. The paper develops a proof-theoretic semantics for a fragment of English within a type-theoretical formalism that combines subatomic systems for natural deduction [20] with constructive (or Martin-Löf) type theory [8, 9] by stating rules for the formation, introduction, elimination and equality of atomic propositions understood as types (or sets) of subatomic proof-objects. The formalism is extended with dependent types to admit an interpretation of non-atomic sentences. The paper concludes with applications to natural language including internally nested proper names, anaphoric pronouns, simple identity sentences, and intensional transitive verbs.

14.
Raul Hakli, Sara Negri, Synthese, 2012, 187(3): 849–867
Various sources in the literature claim that the deduction theorem does not hold for normal modal or epistemic logic, whereas others present versions of the deduction theorem for several normal modal systems. It is shown here that the apparent problem arises from an objectionable notion of derivability from assumptions in an axiomatic system. When a traditional Hilbert-type system of axiomatic logic is generalized into a system for derivations from assumptions, the necessitation rule has to be modified in a way that restricts its use to cases in which the premiss does not depend on assumptions. This restriction is entirely analogous to the restriction of the rule of universal generalization of first-order logic. A necessitation rule with this restriction permits a proof of the deduction theorem in its usual formulation. Other suggestions presented in the literature to deal with the problem are reviewed, and the present solution is argued to be preferable to the other alternatives. A contraction- and cut-free sequent calculus equivalent to the Hilbert system for basic modal logic shows the standard failure argument untenable by proving the underivability of □A from A.
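The restriction described above, and its first-order analogue, can be displayed side by side (a standard presentation sketching the abstract's point, not the paper's exact notation):

```latex
% Unrestricted necessitation, problematic in derivations from assumptions:
\frac{\Gamma \vdash A}{\Gamma \vdash \Box A}
% Restricted necessitation: the premiss must not depend on assumptions.
\frac{\vdash A}{\vdash \Box A}
% This is analogous to universal generalization, where the variable x
% must not occur free in the assumptions \Gamma:
\frac{\Gamma \vdash A}{\Gamma \vdash \forall x\, A}
\quad (x \notin \mathrm{FV}(\Gamma))
% With the restriction in place, the deduction theorem holds as usual:
\Gamma \cup \{A\} \vdash B \;\Longleftrightarrow\; \Gamma \vdash A \to B
```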

15.
Free Semantics     
Free Semantics is based on normalized natural deduction for the weak relevant logic DW and its near neighbours. This is motivated by the fact that in the determination of validity in truth-functional semantics, natural deduction is normally used. Due to normalization, the logic is decidable and hence the semantics can also be used to construct counter-models for invalid formulae. The logic DW is motivated as an entailment logic just weaker than the logic MC of meaning containment. DW is the logic focussed upon, but the results extend to MC. The semantics is called ‘free semantics’ since it is disjunctively and existentially free in that no disjunctive or existential witnesses are produced, unlike in truth-functional semantics. Such ‘witnesses’ are only assumed in generality and are not necessarily actual. The paper sets up the free semantics in a truth-functional style and gives a natural deduction interpretation of the meta-logical connectives. We then set out a familiar tableau-style system, but based on natural deduction proof rather than truth-functional semantics. A proof of soundness and completeness is given for a reductio system, which is a transform of the tableau system. The reductio system has positive and negative rules in place of the elimination and introduction rules of Brady’s normalized natural deduction system for DW. The elimination-introduction turning points become closures of threads of proof, which are at the points of contradiction for the reductio system.

16.
Peter Milne, Synthese, 1994, 100(1): 49–94
The thesis that, in a system of natural deduction, the meaning of a logical constant is given by some or all of its introduction and elimination rules has been developed recently in the work of Dummett, Prawitz, Tennant, and others, by the addition of harmony constraints. Introduction and elimination rules for a logical constant must be in harmony. By deploying harmony constraints, these authors have arrived at logics no stronger than intuitionist propositional logic. Classical logic, they maintain, cannot be justified from this proof-theoretic perspective. This paper argues that, while classical logic can be formulated so as to satisfy a number of harmony constraints, the meanings of the standard logical constants cannot all be given by their introduction and/or elimination rules; negation, in particular, comes under close scrutiny.

17.
18.
Abstract

Two experiments examined the learning of a set of Greek pronunciation rules through explicit and implicit modes of rule presentation. Experiment 1 compared the effectiveness of implicit and explicit modes of presentation in two modalities, visual and auditory. Subjects in the explicit or rule group were presented with the rule set, and those in the implicit or natural group were shown a set of Greek words, composed of letters from the rule set, linked to their pronunciations. Subjects learned the Greek words to criterion and were then given a series of tests which aimed to tap different types of knowledge. The results showed an advantage of explicit study of the rules. In addition, an interaction was found between mode of presentation and modality. Explicit instruction was more effective in the visual than in the auditory modality, whereas there was no modality effect for implicit instruction. Experiment 2 examined a possible reason for the advantage of the rule groups by comparing different combinations of explicit and implicit presentation in the study and learning phases. The results suggested that explicit presentation of the rules is only beneficial when it is followed by practice at applying them.

19.
The papers where Gerhard Gentzen introduced natural deduction and sequent calculi suggest that his conception of logic differs substantially from the now dominant views introduced by Hilbert, Gödel, Tarski, and others. Specifically, (1) the definitive features of natural deduction calculi allowed Gentzen to assert that his classical system NK is complete based purely on the sort of evidence that Hilbert called ‘experimental’, and (2) the structure of the sequent calculi LJ and LK allowed Gentzen to conceptualize completeness as a question about the relationships among a system's individual rules (as opposed to the relationship between a system as a whole and its ‘semantics’). Gentzen's conception of logic is compelling in its own right. It is also of historical interest, because it allows for a better understanding of the invention of natural deduction and sequent calculi.

20.
The well-known picture that sequent derivations without cuts and normal derivations “are the same” will be changed. Sequent derivations without maximum cuts (i.e. special cuts which correspond to maximum segments from natural deduction) will be considered. It will be shown that the natural deduction image of a sequent derivation without maximum cuts is a normal derivation, and the sequent image of a normal derivation is a derivation without maximum cuts. The main consequence of that property will be that sequent derivations without maximum cuts and normal derivations “are the same”.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号