Similar Documents
20 similar documents retrieved.
1.
The difference between the Henry “memory-drum” theory and our version is that ours includes an additional assumption that, after programming has occurred, the resultant representation can be stored in short-term memory. Otherwise, the essential ideas are the same in the two theories. Implications of the presently available data for the distinction between the theories are discussed. Regardless of how one evaluates our added assumption, it is clear that the essential insight of the Henry theory has fared very well in the 20 years since the theory first appeared in print.

2.
Henry Jackman. Erkenntnis, 1996, 44(3): 317–326.
Davidson has claimed that to conclude that reference is inscrutable, one must assume that "if some theory of truth ... is satisfactory in the light of all relevant evidence ... then any theory that is generated from the first theory by a permutation will also be satisfactory in the light of all relevant evidence." However, given that theories of truth are not directly read off the world, but rather serve as parts of larger theories of behavior, this assumption is far from self-evident. A proper understanding of the role truth theories play in theories of interpretation makes the inscrutability of reference much less widespread than Davidson suggests, and, as a result, the radical interpretation methodology is much less likely to saddle its defenders with counterintuitive cases of indeterminacy than is commonly supposed.
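The permutation premise Davidson appeals to rests on a standard model-theoretic construction. A minimal sketch in generic notation (my notation, not the paper's own) shows why a permuted reference scheme preserves every sentence's truth value:

```latex
% Given reference function R, predicate extensions ext, and any permutation \pi
% of the domain, define a permuted interpretation (illustrative notation):
R'(t) = \pi\bigl(R(t)\bigr), \qquad
\mathrm{ext}'(P) = \{\, \pi(x) : x \in \mathrm{ext}(P) \,\}.

% Atomic satisfaction is then preserved:
R'(t) \in \mathrm{ext}'(P)
  \;\iff\; \pi\bigl(R(t)\bigr) \in \pi\bigl[\mathrm{ext}(P)\bigr]
  \;\iff\; R(t) \in \mathrm{ext}(P).
```

Since the clauses for connectives and quantifiers are untouched, the permuted theory assigns the same truth conditions to whole sentences, which is why it fits purely truth-theoretic evidence equally well; Jackman's point is that serving within a larger theory of behavior is a further constraint that a permuted theory may fail to meet.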

3.
Psychophysiological theories on the development of essential hypertension are reviewed and evaluated. Two interconnected theories that relate behavior to essential hypertension and account for individual differences in susceptibility to disease are the "hyperreactivity" theory and the "symptom specificity" theory. The "hyperreactivity" theory identifies individual differences in autonomic nervous system reactivity as the pathophysiological mechanism, and the "symptom specificity" theory suggests that inflexible, stereotypical responding increases the risk of developing hypertension. Based on a literature review, these theories are examined. There exist both case/control and prospective studies on autonomic nervous system reactivity and the development of hypertension. It is concluded that a neurogenically mediated hyperreactivity to stress is a precursor, and not an effect, of hypertension. In case/control studies, tasks that call for active but not passive coping efforts are more efficient elicitors of reactivity differences between those at high and low risk of developing hypertension. In prospective studies, active tasks may also have a predictive advantage over passive tasks with respect to blood pressure development. In the early phase of hypertension, an increased cardiovascular reactivity is accompanied by increased neuroendocrine activation. In the later phase, heightened reactivity is confined to the cardiovascular system. This does not prove, but is consistent with, the notion that transient episodes of increased cardiac output translate into essential hypertension by causing vascular hypertrophy. Case/control studies suggest that an increased "symptom specificity", with stereotypical responding across multiple stressors, is independent of cardiovascular reactivity and a precursor of hypertension. The literature lacks prospective studies on the clinical relevance of stereotypical responding. It is suggested that the presence of both hyperreactivity and symptom specificity in a single individual increases the risk of developing essential hypertension.

4.
In this paper we present a new metaphysical theory of material objects. On our theory, objects are bundles of property instances, where those properties give the nature or essence of that object. We call the theory essential bundle theory. Property possession is not analysed as bundle-membership, as in traditional bundle theories, since accidental properties are not included in the object’s bundle. We have a different story to tell about accidental property possession. This move reaps many benefits. Essential bundle theory delivers a simple theory of the essential properties of material objects; an explanation of how object coincidence can arise; an actual-world ground for modal differences between coincident objects; a simple story about intrinsic properties; and a plausible account of certain ubiquitous cases of causal overdetermination.

5.
Over several decades, appraisal theory has emerged as a prominent theoretical framework explaining the elicitation and differentiation of emotions, and has stimulated a great deal of theorising and empirical research. Despite the large amount of research in this area, there are many aspects of appraisal theory and research that remain unclear or problematic. In this review, we identify a common assumption of many appraisal theories—the fixed appraisal set—and argue that this assumption, combined with a lack of explicit theorising about the predicted relationship between appraisals and emotions, leads to a lack of clarity in both appraisal models and the empirical testing of those models. We recommend that appraisal theorists move in a direction already taken by a small number of theorists, and adopt the starting assumption of a variable appraisal set. We further suggest that theories of concepts and categorisation may inform theorising about appraisal–emotion relationships.

6.
Bernbach (1967, 1971) has claimed that empirical findings of Type 2 d′ invariance in confidence-rated recall experiments confirm a prediction of his two-state theory of recognition and disprove strength theories of memory. In order to derive this prediction from his recognition theory, Bernbach (1967) found it necessary to add to the theory an assumption concerning recall processes. However, this recall assumption is untenable, as it leads to an empirically false prediction concerning accuracy on recognition and recall tests. Moreover, an alternative recall assumption proposed by Bernbach and Kupchak (1972) does not lead to a prediction of Type 2 d′ invariance. Thus, Bernbach's theory predicts the invariance of Type 2 d′ only with the aid of an untenable recall assumption. Consequently, empirical findings of Type 2 d′ invariance cannot be regarded as supportive of Bernbach's theory.
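For context, Type 2 d′ is the signal-detection index computed from confidence ratings conditioned on response accuracy; a standard textbook form (not Bernbach's specific derivation) is:

```latex
% Generic Type 2 d' for confidence-rated responses, with z(.) the inverse
% of the standard normal cumulative distribution function:
d'_2 = z\bigl(\Pr[\text{high confidence} \mid \text{response correct}]\bigr)
     - z\bigl(\Pr[\text{high confidence} \mid \text{response incorrect}]\bigr)
```

The invariance at issue is, roughly, the finding that this quantity remains constant across conditions that change overall recall accuracy.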

7.
8.
Geoffrey Hellman. Synthese, 1982, 53(3): 461–503.
Standard proofs of generalized Bell theorems, aiming to restrict stochastic, local hidden-variable theories for quantum correlation phenomena, employ as a locality condition the requirement of conditional stochastic independence. The connection between this and the no-superluminary-action requirement of the special theory of relativity has been a topic of controversy. In this paper, we introduce an alternative locality condition for stochastic theories, framed in terms of the models of such a theory (§2). It is a natural generalization of a light-cone determination condition that is essentially equivalent to mathematical conditions that have been used to derive Bell inequalities in the deterministic case. Further, it is roughly equivalent to a condition proposed by Bell that, in one investigation, needed to be supplemented with a much stronger assumption in order to yield an inequality violated by some quantum mechanical predictions. It is shown here that this reflects a very general situation: from the proposed locality condition, even adding the strict anticorrelation condition and the auxiliary hypotheses needed to derive experimentally useful (and theoretically telling) inequalities, no Bell-type inequality is derivable. (These independence claims are the burden of §4.) A certain limitation on the scope of the proposed stochastic locality condition is exposed (§5), but it is found to be rather minor. The conclusion is thus supported that conditional stochastic independence, however reasonable on other grounds, is essentially stronger than what is required by the special theory. Our results stand in apparent contradiction with a class of derivations purporting to obtain generalized Bell inequalities from locality alone. It is shown in Appendix (B) that such proofs do not achieve their goal. This fits with our conclusion that generalized Bell theorems are not straightforward generalizations of theorems restricting deterministic hidden-variable theories, and that, in fact, such generalizations do not exist. This leaves open the possibility that a satisfactory, non-deterministic account of the quantum correlation phenomena can be given within the framework of the special theory.
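For orientation, the conditional stochastic independence condition that standard proofs use, and the kind of inequality at stake, are usually written as follows (these are generic textbook forms, not Hellman's alternative condition of §2):

```latex
% Conditional stochastic independence (factorizability), for outcomes a, b,
% apparatus settings x, y, and hidden state \lambda:
p(a, b \mid x, y, \lambda) = p(a \mid x, \lambda)\, p(b \mid y, \lambda).

% A Bell-type (CHSH) inequality derivable from it for \pm 1-valued outcomes,
% with correlations averaged over the hidden-state distribution \rho:
\bigl| E(x_1, y_1) + E(x_1, y_2) + E(x_2, y_1) - E(x_2, y_2) \bigr| \le 2,
\qquad
E(x, y) = \int \Bigl(\sum_{a, b} a\, b\; p(a, b \mid x, y, \lambda)\Bigr) \rho(\lambda)\, d\lambda.
```

The paper's claim is that its weaker, model-theoretic locality condition does not support a derivation of this kind, even with the usual auxiliary hypotheses.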

9.
In this article, we examine the temporary and long-term consequences of the death of a parent or child on happiness. According to set-point theory, external conditions are expected to have only a short-term or limited influence on happiness. This directly contradicts the basic assumption of affective theories of happiness, which states that major life events have a lasting influence on subjective well-being (SWB). Moreover, we test whether the association between bereavement and happiness is equally strong across the life course. To test our hypotheses we make use of the fourth wave of the European Values Study. Our research findings demonstrate that people who lost their father, mother or child are more likely to feel unhappy than people without this experience. Ten years after the death of a parent or child we still find a significant difference in happiness between people who have and have not experienced this loss. The assumption of set-point theory, that major life events have only a temporary impact on SWB, is not supported by our data. Moreover, the association between bereavement and SWB strongly differs across the life course. We might even conclude that the age at which the loss occurred is more decisive for the strength of the association between bereavement and SWB than the duration of the loss.

10.
New paradoxes of risky decision making

11.
Most standard results on structure identification in first-order theories depend upon the correctness and completeness (in the limit) of the data that are provided to the learner. These assumptions are essential for the reliability of inductive methods and for their limiting success (convergence to the truth). The paper investigates inductive inference from (possibly) incorrect and incomplete data. It is shown that such methods can be reliable not in the sense of truth approximation, but in the sense that the methods converge to empirically adequate theories, i.e., theories that are consistent with all data (past and future) and complete with respect to a given complexity class of L-sentences. Adequate theories of bounded complexity can be inferred uniformly and effectively by polynomial-time learning algorithms. Adequate theories of unbounded complexity can be inferred pointwise by less efficient methods.
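As background for the "standard results" the opening sentence refers to, structure identification in the limit from correct, complete data is often illustrated by identification by enumeration. The sketch below is a generic toy version of that classical setting; the Theory representation and the candidates are invented for illustration, and this is not the paper's polynomial-time algorithm for empirically adequate theories.

```python
from typing import Callable, Iterable, Iterator, List, Optional

# A "theory" is crudely represented as a predicate over data points -- a toy
# stand-in for a set of first-order sentences, chosen only for illustration.
Theory = Callable[[int], bool]

def identify_by_enumeration(candidates: List[Theory],
                            data_stream: Iterable[int]) -> Iterator[Optional[Theory]]:
    """After each datum, conjecture the first enumerated candidate consistent
    with everything seen so far.  With correct and complete data this converges
    in the limit to a correct candidate if one occurs in the enumeration --
    the classical setting the abstract contrasts with learning from possibly
    incorrect, incomplete data."""
    seen: List[int] = []
    for datum in data_stream:
        seen.append(datum)
        conjecture = next((t for t in candidates if all(t(d) for d in seen)), None)
        yield conjecture

# Made-up candidates: "every observed number is even" vs. "every observed number is positive".
if __name__ == "__main__":
    candidates: List[Theory] = [lambda n: n % 2 == 0, lambda n: n > 0]
    for step, guess in enumerate(identify_by_enumeration(candidates, [2, 4, 7, 9])):
        print(step, "found a consistent candidate" if guess else "no candidate fits")
```

The paper's question is what remains of this picture when the data stream may contain errors and gaps; its answer is convergence to empirically adequate theories rather than to the truth.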

12.
It has been alleged against divine command theory (DCT) that we cannot justify our acceptance of it without giving it up. For if we provide moral reasons for our acceptance of God’s commands, then those reasons, and not God’s commands, must be our ultimate moral standard. Kai Nielsen has offered the most forceful version of this objection in his book, Ethics Without God. My principal aim is to show that Nielsen’s charge does not succeed. His argument crucially relies upon the assumption that the moral judgments one employs to justify acceptance of a normative theory are more fundamental to one’s moral outlook than the theory itself. I argue that this assumption presupposes a questionable foundationalist view of theory justification, and if we instead adopt a coherentist reflective equilibrium stance, we can thoughtfully evaluate DCT without abandoning it.

13.
Juli Eflin. Metaphilosophy, 2003, 34(1-2): 48–68.
Traditional epistemology has, in the main, presupposed that the primary task is to give a complete account of the concept knowledge and to state under what conditions it is possible to have it. In so doing, most accounts have been hierarchical, and all assume an idealized knower. The assumption of an idealized knower is essential for the traditional goal of generating an unassailable account of knowledge acquisition. Yet we, as individuals, fail to reach the ideal. Perhaps more important, we have epistemic goals not addressed in the traditional approach – among them, the ability to reach understanding in areas we deem important for our lives. Understanding is an epistemic concept. But how we obtain it has not traditionally been a focus. Developing an epistemic account that starts from a set of assumptions that differ from the traditional starting points will allow a different sort of epistemic theory, one on which generating understanding is a central goal and the idealized knower is replaced with an inquirer who is not merely fallible but working from a particular context with particular goals. Insight into how an epistemic account can include the particular concerns of an embedded inquirer can be found by examining the parallels between ethics and epistemology and, in particular, by examining the structure and starting points of virtue accounts. Here I develop several interrelated issues that contrast the goals and evaluative concepts that form the structure of both standard, traditional epistemological and ethical theories and virtue-centered theories. In the end, I sketch a virtue-centered epistemology that accords with who we are and how we gain understanding.

14.
15.

16.
K. Brad Wray. Synthese, 2013, 190(9): 1719–1729.
I aim to clarify the relationship between the success of a theory and the truth of that theory. This has been a central issue in the debates between realists and anti-realists. Realists assume that success is a reliable indicator of truth, but the details about the respects in which success is a reliable indicator or test of truth have been largely left to our intuitions. Lewis (Synthese 129:371–380, 2001) provides a clear proposal of how success and truth might be connected, comparing a test of success of our theories to medical tests with low rates of false positives and false negatives. But, contrary to what Lewis claims, I argue that it is not enough for the realist to undercut the claim that success is not a reliable indicator of truth. Rather, the realist must show that our current best theories are likely true. Further, I argue that tests in science are unlike medical tests in a number of important ways.
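The base-rate worry behind the demand that realists show our current best theories are likely true can be made explicit with the generic Bayesian calculation below (my gloss in my own notation, not Wray's or Lewis's formalism): even a success test with low false-positive and false-negative rates licenses little confidence if true theories are rare among the candidates being tested.

```latex
% Posterior probability that a theory is true given that it is successful, with
% sensitivity s = P(success | true), false-positive rate f = P(success | false),
% and base rate t = P(true):
P(\text{true} \mid \text{success}) = \frac{s\, t}{s\, t + f\,(1 - t)}.

% Illustrative (made-up) numbers: s = 0.95 and f = 0.05, but t = 0.01, give
P(\text{true} \mid \text{success})
  = \frac{0.95 \times 0.01}{0.95 \times 0.01 + 0.05 \times 0.99}
  \approx 0.16.
```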

17.
Within contemporary visual-information-processing psychology, two classes of selective-attention theories can be distinguished: position-not-special theories and position-special theories. The position-not-special theories postulate that attentional selection by colour, by form, and by position are equivalent selective operations. The position-special theories assume that selection by position is more basic or direct than selection by colour or by form. Examples of both types of theory are briefly described, and irrelevant and relevant evidence is critically discussed. It is concluded that the relevant evidence is directly compatible with the position-special views and that the position-not-special theories require additional extraneous assumptions. The position-special model presented in Van der Heijden (1992) is elaborated in further detail. It is shown that this model is compatible with two important and often substantiated assumptions of the position-not-special theories: the assumption that pre-attentive analysers organize the visual scene into objects against a background, and the assumption that visual-selective attention can be directed at objects isolated in this way. This position-special theory is a parsimonious theory because it can identify the mentalistic concept of selective attention with the materialistic concept of spatial position.

18.
王晓田 《心理学报》2019,51(4):407-414
This paper proposes five types of uncertainty in decision making, together with the behavioral and psychological mechanisms for coping with each: replacing weighted summation with fast-and-frugal heuristics to cope with information uncertainty; using intuition to cope with cognitive uncertainty; using values to predict choice preferences to cope with behavioral uncertainty; replacing probabilities with the weights of decision reference points to cope with outcome uncertainty; and trading time for time to reduce delay discounting and so cope with future uncertainty. The new behavioral economics should identify the psychological levers of behavioral nudges through a functional analysis of "why". Resolving uncertainty is itself an effective form of nudging, and simplifying the complex is the key to behavioral nudges.
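The first mechanism, replacing weighted summation with a simple heuristic, can be illustrated by contrasting a weighted-additive rule with a take-the-best-style lexicographic rule. The sketch below is a generic illustration of that contrast; the cues, weights, and cue ordering are made-up examples, not material from the paper.

```python
from typing import Dict, List

def weighted_additive(option: Dict[str, int], weights: Dict[str, float]) -> float:
    """Integrate all cues: sum of cue value times cue weight."""
    return sum(weights[cue] * value for cue, value in option.items())

def take_the_best(a: Dict[str, int], b: Dict[str, int],
                  cues_by_validity: List[str]) -> str:
    """Lexicographic heuristic: check cues in order of validity and decide
    on the first cue that discriminates, ignoring all remaining cues."""
    for cue in cues_by_validity:
        if a[cue] != b[cue]:
            return "a" if a[cue] > b[cue] else "b"
    return "tie"

# Hypothetical choice between two options described by binary cues.
option_a = {"salary": 1, "commute": 0, "stability": 1}
option_b = {"salary": 1, "commute": 1, "stability": 0}
weights = {"salary": 0.5, "commute": 0.3, "stability": 0.2}

print(weighted_additive(option_a, weights), weighted_additive(option_b, weights))  # 0.7 0.8
print(take_the_best(option_a, option_b, ["salary", "stability", "commute"]))       # a
```

The two rules can disagree, as here: full integration favours option_b, while the lexicographic rule decides on the single most valid discriminating cue and favours option_a, using far less information.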

19.
With reference to the Polish logico-philosophical tradition, two formal theories of language syntax are sketched and then compared with each other. The first theory is based on the assumption that the basic linguistic stratum is constituted by object-tokens (concrete objects perceived through the senses) and that the types of such objects (ideal objects) are derivative constructs. The other is founded on the opposite philosophical orientation. The two theories are equivalent. The main conclusion is that in syntactic research it is redundant to postulate the existence of abstract linguistic entities. Earlier, in a slightly different form, the idea was presented in [27] and signalled in [26] and [25]. To the memory of Jerzy Słupecki.

20.
The assumption that people possess a strategy repertoire for inferences has been raised repeatedly. The strategy selection learning theory specifies how people select strategies from this repertoire. The theory assumes that individuals select strategies proportional to their subjective expectations of how well the strategies solve particular problems; such expectations are assumed to be updated by reinforcement learning. The theory is compared with an adaptive network model that assumes people make inferences by integrating information according to a connectionist network. The network's weights are modified by error correction learning. The theories were tested against each other in 2 experimental studies. Study 1 showed that people substantially improved their inferences through feedback, which was appropriately predicted by the strategy selection learning theory. Study 2 examined a dynamic environment in which the strategies' performances changed. In this situation a quick adaptation to the new situation was not observed; rather, individuals got stuck on the strategy they had successfully applied previously. This "inertia effect" was most strongly predicted by the strategy selection learning theory.
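A minimal sketch of the selection mechanism the abstract describes: each strategy carries a subjective expectation, strategies are chosen with probability proportional to those expectations, and the chosen strategy's expectation is updated from the payoff it earns. The class and parameter names, the proportional-sampling implementation, and the delta-style update below are illustrative assumptions, not the published model specification.

```python
import random
from typing import Dict, Iterable

class StrategySelector:
    """Hold a subjective expectation for each strategy, choose strategies with
    probability proportional to those expectations, and nudge the chosen
    strategy's expectation toward the payoff it actually earned."""

    def __init__(self, strategies: Iterable[str],
                 initial_expectation: float = 1.0, learning_rate: float = 0.2):
        self.expectations: Dict[str, float] = {s: initial_expectation for s in strategies}
        self.learning_rate = learning_rate

    def choose(self) -> str:
        # Proportional (roulette-wheel) sampling over current expectations.
        total = sum(self.expectations.values())
        r = random.uniform(0.0, total)
        cumulative = 0.0
        for strategy, expectation in self.expectations.items():
            cumulative += expectation
            if r <= cumulative:
                return strategy
        return strategy  # floating-point edge case: fall back to the last strategy

    def update(self, strategy: str, payoff: float) -> None:
        # Simple reinforcement update: move the expectation toward the payoff.
        old = self.expectations[strategy]
        self.expectations[strategy] = old + self.learning_rate * (payoff - old)

# Hypothetical environment in which one strategy pays off more than the other.
selector = StrategySelector(["take_the_best", "weighted_additive"])
for _ in range(100):
    chosen = selector.choose()
    payoff = 1.0 if chosen == "take_the_best" else 0.4   # made-up payoffs
    selector.update(chosen, payoff)
print(selector.expectations)  # the better-paying strategy ends up with the higher expectation
```

In a stable environment the higher-paying strategy comes to dominate selection; the "inertia effect" reported in Study 2 corresponds to the expectations adapting only slowly once the payoffs change.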
