Similar literature (20 results)
1.
Much of perception, learning and high-level cognition involves finding patterns in data. But there are always infinitely many patterns compatible with any finite amount of data. How does the cognitive system choose 'sensible' patterns? A long tradition in epistemology, philosophy of science, and mathematical and computational theories of learning argues that patterns 'should' be chosen according to how simply they explain the data. This article reviews research exploring the idea that simplicity drives a wide range of cognitive processes. We outline mathematical theory, computational results and empirical data that underpin this viewpoint.
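A common way to make the simplicity principle precise is minimum description length (MDL): prefer the pattern that gives the shortest two-part code for the data. The sketch below is only an illustrative toy, not taken from the article; the polynomial-degree comparison, the Gaussian noise model, and the BIC-style approximation of code length are my own assumptions.

```python
import numpy as np

def description_length(x, y, degree):
    """Approximate two-part code length (in nats) for a polynomial pattern:
    a cost for the model's parameters plus a cost for the data given the model.
    The BIC-style approximation and Gaussian noise model are assumptions."""
    coeffs = np.polyfit(x, y, degree)
    residuals = y - np.polyval(coeffs, x)
    n, k = len(y), degree + 1
    sigma2 = max(np.mean(residuals ** 2), 1e-12)
    data_cost = 0.5 * n * np.log(2 * np.pi * sigma2) + 0.5 * n  # ~ negative log-likelihood
    model_cost = 0.5 * k * np.log(n)                            # parameter cost
    return data_cost + model_cost

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + rng.normal(scale=0.1, size=x.size)  # data generated by a simple pattern

# The simpler (linear) pattern yields the shorter description even though the
# higher-degree polynomial fits the sample slightly more closely.
for d in (1, 6):
    print(f"degree {d}: description length ~ {description_length(x, y, d):.1f}")
```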

2.
Perceptual scientists have recently enjoyed success in constructing mathematical theories for specific perceptual capacities, capacities such as stereovision, auditory localization, and color perception. Analysis of these theories suggests that they all share a common mathematical structure. If this is true, the elucidation of this structure, the study of its properties, the derivation of its consequences, and the empirical testing of its predictions are promising directions for perceptual research. We consider a candidate for the common structure, a candidate called an "observer". Observers, in essence, perform inferences; each observer has a characteristic class of perceptual premises, a characteristic class of perceptual conclusions, and its own functional relationship between these premises and conclusions. If observers indeed capture the structure common to perceptual capacities, then each capacity, regardless of its modality or manner of instantiation, can be described as some observer. In this paper we develop the definition of an observer. We first consider two examples of perceptual capacities: the measurement of visual motion, and the perception of depth from visual motion. In each case, we review a formal theory of the capacity and abstract its structural essence. From this essence we construct the definition of observer. We then exercise the definition in discussions of transduction, perceptual illusions, perceptual uncertainty, regularization theory, the cognitive penetrability of perception, and the theory neutrality of observation.
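As a reading aid, the skeleton below encodes just the three ingredients the abstract names for an observer (a class of premises, a class of conclusions, and a functional relationship between them) in Python. The type choices, the use of a probability distribution over conclusions, and the depth-from-motion toy example are my assumptions; the paper's formal definition is richer than this.

```python
from dataclasses import dataclass
from typing import Callable, Dict, FrozenSet

@dataclass(frozen=True)
class Observer:
    """Schematic observer: a class of premises, a class of conclusions,
    and an inference map from premises to a distribution over conclusions.
    (A simplification for illustration, not the paper's formal definition.)"""
    premises: FrozenSet[str]
    conclusions: FrozenSet[str]
    infer: Callable[[str], Dict[str, float]]

def motion_inference(premise: str) -> Dict[str, float]:
    # Toy mapping from a 2-D motion measurement to 3-D interpretations (assumed numbers).
    table = {
        "coherent 2D flow":   {"rigid 3D motion": 0.9, "nonrigid motion": 0.1},
        "incoherent 2D flow": {"rigid 3D motion": 0.2, "nonrigid motion": 0.8},
    }
    return table[premise]

depth_from_motion = Observer(
    premises=frozenset(["coherent 2D flow", "incoherent 2D flow"]),
    conclusions=frozenset(["rigid 3D motion", "nonrigid motion"]),
    infer=motion_inference,
)

print(depth_from_motion.infer("coherent 2D flow"))
```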

3.
We present an algorithmic model for the development of children's intuitive theories within a hierarchical Bayesian framework, where theories are described as sets of logical laws generated by a probabilistic context-free grammar. We contrast our approach with connectionist and other emergentist approaches to modeling cognitive development. While their subsymbolic representations provide a smooth error surface that supports efficient gradient-based learning, our symbolic representations are better suited to capturing children's intuitive theories but give rise to a harder learning problem, which can only be solved by exploratory search. Our algorithm attempts to discover the theory that best explains a set of observed data by performing stochastic search at two levels of abstraction: an outer loop in the space of theories and an inner loop in the space of explanations or models generated by each theory given a particular dataset. We show that this stochastic search is capable of learning appropriate theories in several everyday domains and discuss its dynamics in the context of empirical studies of children's learning.
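The two-level search described above can be pictured as an outer proposal loop over candidate theories and an inner loop that stochastically searches for the best explanation each theory offers for the data. The sketch below is a generic stand-in for that scheme; the bit-string "theories", the scoring function, and the acceptance rule are placeholders of my own, not the grammar-based laws or search moves used in the paper.

```python
import random

random.seed(1)
DATA = [1, 1, 0, 1, 1, 1, 0, 1]   # toy observations
THEORY_LEN = 4                     # a "theory" is just a bit string in this sketch

def inner_score(theory, data, n_models=20):
    """Inner loop: stochastically sample 'explanations' a theory generates
    and return the best explanation score found (placeholder logic)."""
    best = float("-inf")
    for _ in range(n_models):
        noise = random.uniform(-0.1, 0.1)
        fit = -sum((d - t) ** 2 for d, t in zip(data, theory * 2)) + noise
        best = max(best, fit)
    return best

def propose(theory):
    """Outer-loop proposal: flip one random bit of the current theory."""
    i = random.randrange(len(theory))
    return theory[:i] + [1 - theory[i]] + theory[i + 1:]

def search(steps=200):
    current = [random.randint(0, 1) for _ in range(THEORY_LEN)]
    current_score = inner_score(current, DATA)
    for _ in range(steps):
        candidate = propose(current)
        candidate_score = inner_score(candidate, DATA)
        # Accept improvements; occasionally accept worse theories to escape local optima.
        if candidate_score > current_score or random.random() < 0.05:
            current, current_score = candidate, candidate_score
    return current, current_score

print(search())
```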

4.
A theory from the behavioral and social sciences is presented from the structuralist point of view. A more comprehensive theory-net is outlined, some basic terms and core assumptions are formulated, and an expansion of the theory towards two intended applications is given. Finally, some results of a first empirical test of the theory are reported. The aim of the paper is to show that the structuralist account of scientific theories is not confined to mathematical theories from the natural sciences, but can also be applied to relatively informal constructions of the behavioral and social sciences.

5.
We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism—neural processes are computations in the generic sense. After that, we reject on empirical grounds the common assimilation of neural computation to either analog or digital computation, concluding that neural computation is sui generis. Analog computation requires continuous signals; digital computation requires strings of digits. But current neuroscientific evidence indicates that typical neural signals, such as spike trains, are graded like continuous signals but are constituted by discrete functional elements (spikes); thus, typical neural signals are neither continuous signals nor strings of digits. It follows that neural computation is sui generis. Finally, we highlight three important consequences of a proper understanding of neural computation for the theory of cognition. First, understanding neural computation requires a specially designed mathematical theory (or theories) rather than the mathematical theories of analog or digital computation. Second, several popular views about neural computation turn out to be incorrect. Third, computational theories of cognition that rely on non-neural notions of computation ought to be replaced or reinterpreted in terms of neural computation.

6.
The cognitive theory of multimedia learning and the integrated model of text and picture comprehension hold that, in multimedia learning, pictures and text are processed in early stages according to the dual-channel assumption, while at later stages information from the different representations is integrated to complete knowledge construction. This theoretical account has been supported by empirical research in neuroscience. Brain-imaging studies have found that picture and text processing differ at the early, pre-semantic stage, but share a common semantic system at the later information-integration stage. Event-related potential studies have likewise found that pictures elicit a distinctive N300 component at early stages, whereas both pictures and text elicit an N400 effect at later stages. Although existing research has established that pictures and text share a common semantic system, how information is integrated within this system requires further investigation.

7.
Many criminological theories about individual violation can be broadly classified into two groups: (1) those that attempt to explain the violation of personally held conventional norms and values (normative violations) and (2) those that attempt to explain the following of deviant norms and values (cultural deviance). We argue that each set of theories focuses on a unique dependent variable, but that social learning is one example of a theory that explicitly integrates these dependent variables under one explanatory heading. This important difference in what theories are trying to explain has been downplayed in criminology, with potentially serious consequences. The failure to provide accurate, focused measures of the correct dependent variables may have produced empirical findings that are weaker than expected. This article takes a step toward rectifying this important problem. It presents a discussion of the different sets of theories and of the attempt by social learning theory to explain both types of behavior; in addition, it spells out the implications for research of this difference.

8.
According to Bayesian theories in psychology and neuroscience, minds and brains are (near) optimal in solving a wide range of tasks. We challenge this view and argue that more traditional, non-Bayesian approaches are more promising. We make 3 main arguments. First, we show that the empirical evidence for Bayesian theories in psychology is weak. This weakness relates to the many arbitrary ways that priors, likelihoods, and utility functions can be altered in order to account for the data that are obtained, making the models unfalsifiable. It further relates to the fact that Bayesian theories are rarely better at predicting data compared with alternative (and simpler) non-Bayesian theories. Second, we show that the empirical evidence for Bayesian theories in neuroscience is weaker still. There are impressive mathematical analyses showing how populations of neurons could compute in a Bayesian manner but little or no evidence that they do. Third, we challenge the general scientific approach that characterizes Bayesian theorizing in cognitive science. A common premise is that theories in psychology should largely be constrained by a rational analysis of what the mind ought to do. We question this claim and argue that many of the important constraints come from biological, evolutionary, and processing (algorithmic) considerations that have no adaptive relevance to the problem per se. In our view, these factors have contributed to the development of many Bayesian "just so" stories in psychology and neuroscience; that is, mathematical analyses of cognition that can be used to explain almost any behavior as optimal.
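The flexibility worry in the first argument can be made concrete with a toy calculation: the same data support different "optimal" conclusions depending on which prior the modeler selects. The coin-bias setting and the numbers below are my own illustration, not an example from the article.

```python
from math import comb

def posterior_prob_biased(k, n, prior_biased, p_biased=0.7, p_fair=0.5):
    """Posterior probability that a coin is biased after k heads in n flips,
    under a chosen prior; shows how the conclusion tracks the prior."""
    like_biased = comb(n, k) * p_biased**k * (1 - p_biased)**(n - k)
    like_fair = comb(n, k) * p_fair**k * (1 - p_fair)**(n - k)
    num = prior_biased * like_biased
    return num / (num + (1 - prior_biased) * like_fair)

# Same data (7 heads in 10 flips), three different priors, three different verdicts.
for prior in (0.1, 0.5, 0.9):
    print(f"prior P(biased)={prior}: posterior = {posterior_prob_biased(7, 10, prior):.2f}")
```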

9.
There are two basically different sorts of theory of meaning, and the sort one adopts has a great deal to do with the view one takes of how language is learned. Corresponding to the two kinds of theory of meaning, two theories of language learning are delineated and the issues disputed between them are clarified. The logical question of whether each theory is intelligible in itself is discussed first; it is concluded that both are genuine empirical hypotheses, and the psychological grounds for preferring one or the other are then examined. While no final decision is reached, it is argued that there are grounds for preferring one of the theories.

10.
It is argued that the process of operationalizing a (linear) mathematical learning theory consists of a semantical and a 'structural' part. Another important aspect of the application of a theory is evaluating its empirical adequacy (goodness-of-fit, etc.). In this paper problems connected to operationalization and assumptions concerning parameter invariance and homogeneity are discussed, and illustrated in a contingent two-choice learning situation with partial reinforcement.
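For orientation, linear learning theories of the kind referred to here update a choice probability by a fixed proportion after each trial. The sketch below simulates the classic Bush-Mosteller linear-operator rule in a two-choice situation with partial reinforcement; the learning rate, the 80/20 reinforcement schedule, and the particular operator assignments are illustrative assumptions, not the paper's design.

```python
import random

def bush_mosteller(trials=500, theta=0.1, reinforce_prob_a=0.8, seed=0):
    """Simulate a linear-operator learner in a two-choice task.
    p is the probability of choosing option A; after each trial it is moved
    a fraction theta toward 1 or toward 0, depending on the outcome."""
    rng = random.Random(seed)
    p = 0.5
    history = []
    for _ in range(trials):
        choice_a = rng.random() < p
        # Partial reinforcement: A pays off 80% of the time, B 20% (assumed schedule).
        reinforced = rng.random() < (reinforce_prob_a if choice_a else 1 - reinforce_prob_a)
        if (choice_a and reinforced) or (not choice_a and not reinforced):
            p = p + theta * (1 - p)      # operator pushing toward choosing A
        else:
            p = p - theta * p            # operator pushing toward choosing B
        history.append(p)
    return history

print(f"final P(choose A) after learning: {bush_mosteller()[-1]:.2f}")
```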

11.
General goodness-of-fit tests for probabilistic response models are developed. The tests are applicable in psychophysics, in the theory of choice behavior and in mathematical learning theories. The necessary and sufficient constraints that a measurement model puts on the response probabilities are used for testing this model. In addition, representation theorems for some models are proved and the goodness of fit to experimental data is considered.
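As a much simpler concrete instance of the kind of test the abstract has in mind, the sketch below compares observed binary-response frequencies with the probabilities predicted by a hypothetical response model using a Pearson chi-square statistic; the model, the data, and the choice of test are my assumptions, not the general constraint-based tests developed in the paper.

```python
from scipy.stats import chi2

# Hypothetical response model: predicted choice probabilities for four stimuli.
predicted_p = [0.9, 0.7, 0.3, 0.1]
n_trials = 100                                  # trials per stimulus (assumed)
observed_choices = [86, 74, 28, 15]             # times the target response occurred (assumed)

# Pearson statistic over the (chose, did not choose) cells of each stimulus;
# each stimulus contributes one degree of freedom.
stat = 0.0
for p, obs in zip(predicted_p, observed_choices):
    exp_yes, exp_no = n_trials * p, n_trials * (1 - p)
    stat += (obs - exp_yes) ** 2 / exp_yes + ((n_trials - obs) - exp_no) ** 2 / exp_no

df = len(predicted_p)
pval = chi2.sf(stat, df)
print(f"chi-square = {stat:.2f}, df = {df}, p = {pval:.3f}")
```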

12.
Claudia E. Vanney, Zygon, 2015, 50(3): 736-756
Quantum mechanics (QM) studies physical phenomena on a microscopic scale. These phenomena are far beyond the reach of our observation, and the connection between QM's mathematical formalism and the experimental results is very indirect. Furthermore, quantum indeterminism defies common sense. Microphysical experiments have shown that, depending on the empirical context, electrons and quanta of light sometimes behave as waves and other times as particles, even though it is impossible to design an experiment that manifests both behaviors at the same time. Unlike in Newtonian physics, the properties of quantum systems (position, velocity, energy, time, etc.) are not all well-defined simultaneously. Moreover, quantum systems are not characterized by their properties, but by a wave function. Although one of the principles of the theory is the uncertainty principle, the evolution of the wave function is governed by the deterministic Schrödinger equation. But what is the wave function? Like other theories of the physical sciences, quantum theory assigns states to systems. The wave function is a particular mathematical representation of the quantum state of a physical system, which contains information about the possible states of the system and the respective probabilities of each state.
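For reference, the deterministic evolution law mentioned above is the time-dependent Schrödinger equation, and the probabilities encoded in the wave function are given by the Born rule; the notation below is standard and not specific to this article.

```latex
% Time-dependent Schrödinger equation: deterministic evolution of the wave function \Psi
i\hbar \, \frac{\partial}{\partial t}\Psi(x,t) = \hat{H}\,\Psi(x,t)
% Born rule: probability density for finding the system at position x at time t
p(x,t) = |\Psi(x,t)|^{2}
```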

13.
This review synthesizes relevant research dealing with the processes of learning and suggests its applications to compliance gaining. The two major issues addressed are: (1) to what degree can learning theories explain the acquisition of new attitudes and behaviors, and (2) to what degree are attitudinal and behavioral changes governed by learning theory principles? The learning theories discussed are grouped into three categories: stimulus-response or connectionist approaches; cognitive approaches; and stochastic, mathematical, and cybernetic approaches. The stimulus-response models, which encompass most of the research examined in this paper, are further broken down into four types: (1) classical conditioning, (2) contiguity models, (3) instrumental (or operant) conditioning and (4) models including drive and drive reduction. Principles and major research evidence from numerous learning theories are reviewed and analyzed, and suggestions are made as to how this evidence may aid in the construction of more complete theories of persuasion and attitude change.

14.
Building on previous work, I continue the arguments for scientific realism in the presence of a natural level structure of science. That structure results from a cognitive antireductionism that calls for the retention of mature theories even though they have been “superseded”. The level structure is based on “scientific truth” characterized by a theory's validity domain and the confirming empirical data. Reductionism (including fundamentalism) fails cognitively because of qualitative differences in the ontology and semantics of successive theories. This cognitive failure exists in spite of the mathematical success of theory reduction. The claim for scientific realism is strongly based on theory coherence between theories on adjacent levels. Level coherence consists of mathematical relations between levels, as well as of reductive explanations. The latter refers to questions that can be posed (but not answered) on a superseded level, but which can be answered (explained) on the superseding level. In view of the pluralism generated by cognitive antireductionism, theory coherence is claimed to be so compelling that it provides strong epistemic justification for a pluralistic scientific realism.

15.
16.
Christoph Lumer, Erkenntnis, 2005, 62(2): 235-262
In this paper an empirical theory about the nature of intention is sketched. After stressing the necessity of reckoning with intentions in the philosophy of action, a strategy for deciding empirically between competing theories of intention is set out and applied to criticize various philosophical theories of intention, among others that of Bratman. The hypothesis that intentions are optimality beliefs is defended on the basis of empirical decision theory. Present empirical decision theory, however, does not provide an empirically satisfying elaboration of the desirability concepts used in these optimality beliefs. Based on process theories of deliberation, two hypotheses for filling this gap are developed.

17.
This article addresses the question of which societal characteristics are likely to enhance subjective well-being (SWB). Empirical results bearing on four theories are presented: needs theory, goals theory, relative standards models, and cultural approaches. The theories are to a degree compatible, rather than completely contradictory. There is empirical support for each of the theories, but there are also data contradicting a simple formulation of each model, and no approach can by itself explain all of the extant findings. For both applied and theoretical reasons, it is imperative that we determine the types of societal characteristics that enhance subjective well-being. In this vein a model called Evaluation Theory is proposed, in which SWB depends on people's evaluations of self-relevant information. Attention is selective, and therefore the factors that determine its focus are likely to influence evaluations of events. Thus, appraisals are likely to be influenced by chronically accessible information, which in turn is influenced by the person's needs, goals, and culture. Currently salient information is seen as a key to life satisfaction judgments. The present paper describes numerous limitations in current research and suggests studies that will allow more definitive theories to emerge.

18.
Mark Colyvan, Erkenntnis, 1999, 51(2-3): 323-332
The Quine-Putnam indispensability argument urges us to place mathematical entities on the same ontological footing as (other) theoretical entities of empirical science. Recently this argument has attracted much criticism, and in this paper I address one criticism due to Elliott Sober. Sober argues that mathematical theories cannot share the empirical support accrued by our best scientific theories, since mathematical propositions are not being tested in the same way as the clearly empirical propositions of science. In this paper I defend the Quine-Putnam argument against Sober's objections.

19.
Using the mathematical frameworks of economic preference ranking, subjective probability, and rational learning through empirical evidence, the epistemological implications of teleological ethical intuitionism are pointed out, to the extent that the latter is based on cognitivist and objectivist concepts of value. The notions of objective value and objective norm are critically analysed with reference to epistemological criteria of intersubjectively shared valuative experience. It is concluded that one cannot meaningfully postulate general material theories of morality that could be tested, confirmed or refuted by intersubjective empirical evidence of preferences and values, however loosely the empirical evidence of values may be interpreted. This situation is explained with reference to the ways in which perceived values become systematically influenced by concomitants of individual valuative experience that have nothing to do with contingent subjective interests.

20.
The distinction between analytic and synthetic propositions, and with that the distinction between a priori and a posteriori truth, is being abandoned in much of analytic philosophy and the philosophy of most of the sciences. These distinctions should also be abandoned in the philosophy of mathematics. In particular, we must recognize the strong empirical component in our mathematical knowledge. The traditional distinction between logic and mathematics, on the one hand, and the natural sciences, on the other, should be dropped. Abstract mathematical objects, like transcendental numbers or Hilbert spaces, are theoretical entities on a par with electromagnetic fields or quarks. Mathematical theories are not primarily logical deductions from axioms obtained by reflection on concepts but, rather, are constructions chosen to solve some collection of problems while fitting smoothly into the other theoretical commitments of the mathematician who formulates them. In other words, a mathematical theory is a scientific theory like any other, no more certain but also no more devoid of content.
