A total of 347 search results were found (search time: 0 ms).
11.
Attentional capture refers to the phenomenon in which task-irrelevant stimuli involuntarily attract attention. Experiment 1 used a visual search task to examine the degree and mechanism of attentional capture by emotional faces that were irrelevant to the main task, and Experiment 2 further explored how temporal task demands affect attentional capture by irrelevant emotional faces. The results showed that, compared with other emotional faces, angry faces captured more attention, and this capture was modulated by holistic emotional processing. Temporal task demands affected attentional selection of the target stimuli, but the anger-superiority effect was unaffected by temporal task demands and may therefore reflect a relatively automatic processing mode.
12.
Abstract: Building on the cross-modal mapping effect between nonsense words and simple geometric shapes, this paper examines the relationship between the pronunciation of personal names and face shape in social perception. The study controlled the lip shape required to pronounce each name and the contour of each face, and used a simple matching paradigm to test whether a cross-modal mapping effect exists between name pronunciation and faces. The results showed that participants matched round-lip names to round faces and spread-lip names to angular faces at rates significantly above chance. These findings demonstrate a cross-modal mapping effect between name pronunciation and face shape, which not only broadens the scope of research on sound-shape cross-modal mapping but also offers some practical guidance for choosing names.
13.
Ratings of emotion in laterally presented faces: sex and handedness effects (total citations: 2; self-citations: 0; citations by others: 2)
Sixteen right-handed participants (8 male and 8 female students) and 16 left-handed participants (8 male and 8 female students) were presented with cartoon faces expressing emotions ranging from extremely positive to extremely negative. A forced-choice paradigm was used in which the participants were asked to rate the faces as either positive or negative. Compared to men, women rated faces more positively, especially in response to right visual field presentations. Women rated neutral and mildly positive faces more positively in the right than in the left visual field, whereas men rated these faces consistently across visual fields. Handedness did not affect the ratings of emotion. The data suggest a positive emotional bias of the left hemisphere in women.
14.
There is accumulating evidence that disgust plays an important role in prejudice toward individuals with obesity, but that research is primarily based on self-reported emotions. In four studies, we examined whether participants displayed a physiological marker of disgust (i.e. levator labii activity recorded using facial electromyography) in response to images of obese individuals, and whether these responses corresponded with their self-reported disgust to those images. All four studies showed the predicted self-reported disgust response toward images of obese individuals. Study 1 further showed that participants exhibited more levator activity to images of obese individuals than to neutral images. However, Studies 2–4 failed to provide any evidence that the targets’ body size affected levator responses. These findings suggest that disgust may operate at multiple levels, and that the disgust response to images of obese individuals may be more of a cognitive-conceptual one than a physiological one.
15.
16.
This experiment investigated social referencing as a form of discriminative learning in which maternal facial expressions signaled the consequences of the infant's behavior in an ambiguous context. Eleven 4- and 5-month-old infants and their mothers participated in a discrimination-training procedure using an ABAB design. Different consequences followed infants' reaching toward an unfamiliar object depending on the particular maternal facial expression. During the training phases, a joyful facial expression signaled positive reinforcement for the infant reaching for an ambiguous object, whereas a fearful expression signaled aversive stimulation for the same response. Baseline and extinction conditions were implemented as controls. Mothers' expressions acquired control over infants' approach behavior for all participants. All participants ceased to show discriminated responding during the extinction phase. The results suggest that 4- and 5-month-old infants can learn social referencing via discrimination training.
17.
The tendency to express emotions non-verbally is positively related to perception of emotions in oneself. This study examined its relationship to perception of emotions in others. In 40 healthy adults, EEG theta synchronization was used to indicate emotion processing following presentation of happy, angry, and neutral faces. Both positive and negative expressiveness were associated with higher emotional sensitivity, as shown by cortical responses to facial expressions during the early, unconscious processing stage. At the late, conscious processing stage, positive expressiveness was associated with higher sensitivity to happy faces but lower sensitivity to angry faces. Thus, positive expressiveness predisposes people to allocate fewer attentional resources for conscious perception of angry faces. In contrast, negative expressiveness was consistently associated with higher sensitivity. The effects of positive expressiveness occurred in cortical areas that deal with emotions, but the effects of negative expressiveness occurred in areas engaged in self-referential processes in the context of social relationships.
18.
Affective changes in response to motive-relevant stimuli are a defining feature of implicit motives. We therefore expected to find an effect of individual differences in the implicit need for affiliation (nAff) on corrugator supercilii activity, an indicator of affect, when participants were confronted with nonverbal indicators of a conversational partner’s withdrawal. Participants’ nAff was assessed with a Picture Story Exercise (PSE). They were then involved in an interaction with a smiling or a neutral experimenter while their corrugator activity was measured with electromyography (EMG). As expected, we found higher corrugator activity for people high in nAff compared to people low in nAff when the experimenter kept a neutral facial expression throughout the interaction but not when he/she was smiling.
19.
Using signal detection methods, possible effects of emotion type (happy, angry), gender of the stimulus face, and gender of the participant on the detection and response bias of emotion in briefly presented faces were investigated. Fifty-seven participants (28 men, 29 women) viewed 90 briefly presented faces (30 happy, 30 angry, and 30 neutral, each with 15 male and 15 female faces), answering yes if the face was perceived as emotional and no if it was not perceived as emotional. Sensitivity [d', z(hit rate) minus z(false alarm rate)] and response bias (β, likelihood ratio of "signal plus noise" vs. "noise") were measured for each face combination at each presentation time (6.25, 12.50, 18.75, 25.00, 31.25 ms). The d' values were higher for happy than for angry faces and higher for angry-male than for angry-female faces, and there were no effects of participant gender. Results also suggest a greater tendency for participants to judge happy-female faces as emotional, as shown by lower β values for these faces compared to the other emotion-gender combinations. This happy-female response bias suggests at least a partial explanation for happy-superiority effects in studies where performance is measured only as percent correct responses, and, more generally, indicates that women are expected to be happy.
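For readers unfamiliar with these indices, the sketch below shows how d' and β are conventionally computed from a hit rate and a false-alarm rate under the equal-variance Gaussian signal detection model quoted in the abstract. This is not the authors' analysis code, and the example rates are hypothetical placeholders.

```python
# Minimal illustrative sketch, assuming the standard equal-variance Gaussian
# signal detection model; NOT the authors' analysis code, and the example
# hit/false-alarm rates are hypothetical.
from math import exp
from scipy.stats import norm


def dprime_and_beta(hit_rate: float, fa_rate: float) -> tuple[float, float]:
    """Return (d', beta) for one emotion-gender condition.

    d'   = z(hit rate) - z(false-alarm rate)
    beta = likelihood ratio of "signal plus noise" vs. "noise" at the
           decision criterion, i.e. exp((z(FA)**2 - z(Hit)**2) / 2).
    """
    z_hit = norm.ppf(hit_rate)  # inverse-normal (z) transform of the hit rate
    z_fa = norm.ppf(fa_rate)    # inverse-normal (z) transform of the false-alarm rate
    d_prime = z_hit - z_fa
    beta = exp((z_fa ** 2 - z_hit ** 2) / 2)
    return d_prime, beta


# Hypothetical example: a high hit rate with a moderate false-alarm rate
# gives a positive d' and a beta below 1 (a liberal bias toward answering
# "emotional"), the pattern reported for happy-female faces.
print(dprime_and_beta(0.80, 0.30))
```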
20.
The present study investigated the effect of the perception of faces expressing shame on time perception in children aged 5 and 8 years, as well as in adults, as a function of their ability to recognize this emotional expression. The participants' ability to recognize the expression of shame among faces expressing different emotions was tested. They were then asked to perform a temporal bisection task involving both neutral and ashamed faces. The results showed that, from the age of 8 years, the participants who recognized the facial expressions of shame underestimated their presentation time compared to that of neutral faces. In contrast, no time distortion was observed in the children who did not recognize the ashamed faces or in those younger children who did recognize them. The results are discussed in terms of self-conscious emotions, which develop to involve an attentional mechanism.