61.
Spatio-temporal interactions between simple geometrical shapes typically elicit strong impressions of intentionality. Recent research has started to explore the link between attentional processes and the detection of interacting objects. Here, we asked whether visual attention is biased toward such interactions. We investigated probe discrimination performance in algorithmically generated animations that involved two chasing objects and two randomly moving objects. In Experiment 1, we observed a pronounced attention capture effect for chasing objects. Because reduced interobject spacing is an inherent feature of interacting objects, in Experiment 2 we designed randomly moving objects that were matched to the chasing objects with respect to interobject spacing at probe onset. In this experiment, the capture effect was completely attenuated. Therefore, we argue that reduced interobject spacing constitutes an efficient cue for guiding visual attention toward objects that interact intentionally.
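Chasing displays of this kind are commonly generated with a simple heat-seeking rule: on each frame the chaser turns toward the target's current position, while control objects follow a random walk in heading space. A minimal sketch in Python, assuming illustrative parameter values (speed, jitter, starting positions) that are not taken from the experiment itself:

```python
import math
import random

SPEED = 5.0    # displacement per frame (illustrative)
NOISE = 0.4    # heading jitter in radians (illustrative)
FRAMES = 100

def move(pos, heading):
    """Advance a position one step along a heading."""
    x, y = pos
    return (x + SPEED * math.cos(heading), y + SPEED * math.sin(heading))

def chase_heading(chaser, target):
    """Heat-seeking rule: head toward the target, plus random jitter."""
    dx = target[0] - chaser[0]
    dy = target[1] - chaser[1]
    return math.atan2(dy, dx) + random.uniform(-NOISE, NOISE)

def random_heading(prev_heading):
    """Random walk in heading space for the control objects."""
    return prev_heading + random.uniform(-NOISE, NOISE)

def simulate(frames=FRAMES):
    """Per-frame positions of a chaser, its target, and two random movers."""
    positions = {
        "chaser": (0.0, 0.0),
        "target": (100.0, 100.0),
        "rand1": (200.0, 0.0),
        "rand2": (0.0, 200.0),
    }
    headings = {k: random.uniform(0, 2 * math.pi) for k in positions}
    history = []
    for _ in range(frames):
        headings["chaser"] = chase_heading(positions["chaser"],
                                           positions["target"])
        for k in ("target", "rand1", "rand2"):
            headings[k] = random_heading(headings[k])
        positions = {k: move(positions[k], headings[k]) for k in positions}
        history.append(dict(positions))
    return history
```

Note that the spacing-matched control condition of Experiment 2 would require an additional constraint on the random movers' trajectories (equating interobject distance at probe onset), which this sketch omits.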
62.
Markus Pantsar, Synthese 191(17): 4201-4229 (2014)
Recent years have seen an explosion of empirical data concerning arithmetical cognition. In this paper, that data is taken to be philosophically important, and an outline for an empirically feasible epistemological theory of arithmetic is presented. The epistemological theory is based on the empirically well-supported hypothesis that our arithmetical ability is built on a protoarithmetical ability to categorize observations in terms of quantities, which we have already as infants and share with many nonhuman animals. It is argued here that arithmetical knowledge developed in such a way cannot be totally conceptual in the sense relevant to the philosophy of arithmetic, nor can arithmetic be understood as empirical. Rather, we need to develop a contextual a priori notion of arithmetical knowledge that preserves the special mathematical characteristics without ignoring the roots of arithmetical cognition. Such a contextual a priori theory is shown not to require any ontologically problematic assumptions, in addition to fitting well within a standard framework of general epistemology.
66.
Facial examiners make visual comparisons of face images to establish the identities of persons in police investigations. This study utilised eye-tracking and an individual differences approach to investigate whether these experts exhibit specialist viewing behaviours during identification, by comparing facial examiners with forensic fingerprint analysts and untrained novices across three tasks. These comprised face matching under unlimited (Experiment 1) and time-restricted viewing (Experiment 2), and with a feature-comparison protocol derived from examiner casework procedures (Experiment 3). Facial examiners exhibited individual differences in facial comparison accuracy and did not consistently outperform fingerprint analysts and novices. Their behaviour was also marked by similarities to the comparison groups in terms of how faces were viewed, as evidenced by eye movements, and how faces were perceived, based on the feature judgements and identification decisions they made. These findings further our understanding of how facial comparisons are performed and clarify the nature of examiner expertise.
67.
Scientists have shown that many non-human animals such as ants, dogs, or rats are very good at using smells to find their way through their environments. But are humans also capable of navigating through their environment based on olfactory cues? There is not much research on this topic, a gap that the present research seeks to bridge. We here provide one of the first empirical studies investigating the possibility of using olfactory cues as landmarks in human wayfinding. Forty subjects participated in a piloting study to determine the olfactory material for the main experiment. Then, 24 subjects completed a wayfinding experiment with 12 odors as orientation cues. Our results are astonishing: Participants were rather good at what we call "odor-based wayfinding." This indicates that the ability of humans to use olfactory cues for navigation is often underestimated. We discuss two different cognitive explanations and rule out the idea that our results are just an instance of sequential learning. Rather, we argue that humans can enrich their cognitive map of the environment with olfactory landmarks and may use them for wayfinding.
69.
Conditional inferences can be phrased with unspecific terms ("If a person is on a diet, then the person loses weight. A person is on a diet. The person loses weight") or specific terms ("If Anna is on a diet, then Anna loses weight. Anna is on a diet. Anna loses weight"). We investigate whether the specificity of terms affects people's acceptance of inferences. In Experiment 1, inferences with specific terms received higher acceptance ratings than inferences with unspecific terms. In Experiments 2 and 3, we used the same problems as in Experiment 1, but also problems with unspecific terms in the conditional and specific terms in the categorical premise, and vice versa. When the conditional and the categorical premise had the same specificity, results were as in Experiment 1. When their specificity mismatched, acceptance ratings were lower. Our results illustrate the importance of phrasing on reasoning.