Similar Articles
20 similar articles found (search time: 31 ms)
1.
Earlier research has indicated that some characteristics of facial expressions may be automatically processed. This study investigated automaticity as evidenced by involuntary interference in a word evaluation task. Compound stimuli, consisting of words superimposed on pictures of affective faces, were presented to subjects who were given the task of evaluating the affective valence of the words while disregarding the faces. Results of three experiments showed that word evaluation was influenced by the concurrently shown affective faces. Overall, negative words were found to require longer latencies, indicating that more processing resources are invested in negative than in positive stimuli. This speed advantage for positive words was modified by the faces. Negative words were facilitated, relative to positive ones, when shown with a negative expression (e.g. a sad face). Correspondingly, negative words were inhibited, relative to positive ones, when shown with a positive expression (e.g. a happy face). The results are consistent with automatic, involuntary semantic processing of affective facial expressions.

2.
We examined proactive and reactive control effects in the context of task-relevant happy, sad, and angry facial expressions on a face-word Stroop task. Participants identified the emotion expressed by a face that contained a congruent or incongruent emotional word (happy/sad/angry). Proactive control effects were measured in terms of the reduction in Stroop interference (the difference between incongruent and congruent trials) as a function of previous trial emotion and previous trial congruence. Reactive control effects were measured in terms of the reduction in Stroop interference as a function of current trial emotion and previous trial congruence. Negative emotions on the previous trial exerted a greater influence on proactive control than the positive emotion. Sad faces in the previous trial resulted in a greater reduction in Stroop interference for happy faces in the current trial. However, current trial angry faces showed stronger adaptation effects compared to happy faces. Thus, both proactive and reactive control mechanisms depend on the emotional valence of task-relevant stimuli.

3.
The revised Levels of Emotional Awareness Scale (LEAS) was administered to 315 pre-service teachers, from whom 60 participants each were selected for high- and low-emotional-awareness groups; they completed an emotional-face Stroop task (Study 1) and an emotional-word Stroop task (Study 2). (1) In Study 1, accuracy was highest for neutral faces, followed by happy faces, and lowest for sad faces; reaction times were longest for sad faces, followed by happy faces, and shortest for neutral faces. The high-awareness group responded more slowly than the low-awareness group only for happy and neutral faces, and the interference effect of negative faces exceeded that of positive faces. (2) In Study 2, reaction times were longest for negative words, significantly longer than for neutral and positive words. The high-awareness group responded more slowly than the low-awareness group only for positive and neutral words, and showed a larger interference effect for positive words. The results indicate that, relative to neutral stimuli, both the high- and low-awareness groups showed an attentional bias toward emotional stimuli, especially negative ones; compared with the low-awareness group, pre-service teachers with high emotional awareness showed an attentional bias not only toward negative but also toward positive emotional information.

4.
Deficient inhibition of return for emotional faces in depressed individuals
戴琴  冯正直 《心理学报》2009,41(12):1175-1188
This study examined the effect of depression on inhibition of return (IOR) for emotional faces. Using the Beck Depression Inventory, the Self-Rating Depression Scale, CCMD-3, and the Hamilton Depression Scale, 17 participants each were screened into a normal control group, a depression-remitted group, and a depressed-patient group, who completed a cue-target task with photographs of real emotional faces in both a behavioral experiment and an event-related potential (ERP) experiment. In the cue-target paradigm, the target appeared after the cue disappeared, and participants responded to the target's location. Behaviorally, at a stimulus onset asynchrony (SOA) of 14 ms, the control group showed an IOR effect for neutral faces, the remitted group showed IOR for all faces, and the patient group showed IOR for angry, sad, and neutral faces. At an SOA of 250 ms, all three groups showed deficient IOR for sad faces, most prominently the patient group, and the remitted group also showed deficient IOR for happy faces. At an SOA of 750 ms, the control group showed IOR for sad faces; the remitted group showed deficient IOR for happy and sad faces; and the patient group showed deficient IOR for sad faces but an IOR effect for angry faces. At the 750 ms SOA, the ERP data showed the following. In the control group, P3 amplitude to happy-face cues was larger than in the other groups, P1 amplitude to invalidly cued happy faces was smaller than to other faces, P1 amplitude to validly cued sad faces was smaller than to happy faces, P3 amplitude to validly cued happy faces was larger than in the patient group, and P3 amplitude to invalidly cued sad faces was larger than in the other groups. In the remitted group, P3 amplitude to sad-face cues was larger than to other faces, P3 amplitude to validly cued happy faces was larger than in the patient group, and P3 amplitude to invalidly cued sad faces was smaller than in the control group. In the patient group, P1 amplitude to sad-face cues was larger than in the other groups and P3 amplitude to sad-face cues was larger than to other faces, P3 amplitude to invalidly cued sad faces was smaller than in the control group, and P3 amplitude to validly cued happy faces was smaller than in the other groups. These findings suggest that depressed patients have deficient inhibition of return for negative stimuli; this deficit makes it difficult for depressed individuals to resist interference from negative events, leaving them troubled by negative emotional states, so they may experience more depressed mood, which in turn sustains and deepens the depression. Remitted individuals, by contrast, showed deficient IOR for both happy and sad faces, which allows them to register positive and negative stimuli alike and thereby maintain a particular cognitive and emotional balance.

5.
In two studies, we used a negative affective priming task with pictures of angry (Study 1), sad (Study 2), and happy faces (Studies 1 and 2) to measure attentional inhibition of emotional stimuli as a function of attachment style. Results showed that attachment avoidance was associated with a stronger inhibition of both angry and sad faces. This indicates that the regulatory strategies of avoidant individuals involve inhibition of different types of negative, but not positive, stimuli. Attachment anxiety, on the other hand, showed no association with inhibitory responding to negative stimuli, although we did find indications of impaired inhibitory processing of happy faces in Study 1. The results are discussed in relation to current evidence on avoidant affect-regulation strategies.

6.
Happy and sad facial expressions from the Chinese Facial Affective Picture System and positive and negative emotional words from a modern Chinese affective word system were used in a word-face paradigm with 60 undergraduates to examine the emotional-valence conflict effect and its gender differences. Results showed that (1) the main effect of facial emotion type was significant, with longer reaction times for sad than for happy faces; (2) the main effect of word-face valence conflict was significant, with shorter reaction times when word and face valence were congruent than when they were incongruent; and (3) the three-way interaction of facial emotion type, word-face valence congruence, and participant gender was significant: simple-effects analysis showed that, under valence-incongruent conditions, women judged sad expressions faster than men did.

7.
Threatening facial expressions can signal the approach of someone or something potentially dangerous. Past research has established that adults have an attentional bias for angry faces, visually detecting their presence more quickly than happy or neutral faces. Two new findings are reported here. First, evidence is presented that young children share this attentional bias. In five experiments, young children and adults were asked to find a picture of a target face among an array of eight distracter faces. Both age groups detected threat‐relevant faces – angry and frightened – more rapidly than non‐threat‐relevant faces (happy and sad). Second, evidence is presented that both adults and children have an attentional bias for negative stimuli overall. All negative faces were detected more quickly than positive ones in both age groups. As the first evidence that young children exhibit the same superior detection of threatening facial expressions as adults, this research provides important support for the existence of an evolved attentional bias for threatening stimuli.

9.
The present study investigated whether dysphoric individuals have a difficulty in disengaging attention from negative stimuli and/or reduced attention to positive information. Sad, neutral and happy facial stimuli were presented in an attention-shifting task to 18 dysphoric and 18 control participants. Reaction times to neutral shapes (squares and diamonds) and the event-related potentials to emotional faces were recorded. Dysphoric individuals did not show impaired attentional disengagement from sad faces or facilitated disengagement from happy faces. Right occipital lateralisation of P100 was absent in dysphoric individuals, possibly indicating reduced attention-related sensory facilitation for faces. Frontal P200 was largest for sad faces among dysphoric individuals, whereas controls showed larger amplitude to both sad and happy as compared with neutral expressions, suggesting that dysphoric individuals deployed early attention to sad, but not happy, expressions. Importantly, the results were obtained controlling for the participants' trait anxiety. We conclude that at least under some circumstances the presence of depressive symptoms can modulate early, automatic stages of emotional processing.

10.
An immense body of research demonstrates that emotional facial expressions can be processed unconsciously. However, it has been assumed that such processing takes place solely on a global valence-based level, allowing individuals to disentangle positive from negative emotions but not the specific emotion. In three studies, we investigated the specificity of emotion processing under conditions of limited awareness using a modified variant of an affective priming task. Faces with happy, angry, sad, fearful, and neutral expressions were presented as masked primes for 33 ms (Study 1) or 14 ms (Studies 2 and 3) followed by emotional target faces (Studies 1 and 2) or emotional adjectives (Study 3). Participants' task was to categorise the target emotion. In all three studies, discrimination of targets was significantly affected by the emotional primes beyond a simple positive versus negative distinction. Results indicate that specific aspects of emotions might be automatically disentangled in addition to valence, even under conditions of subjective unawareness.

12.
We investigated the source of the visual search advantage of some emotional facial expressions. An emotional face target (happy, surprised, disgusted, fearful, angry, or sad) was presented in an array of neutral faces. Detection was faster for happy targets, with angry and, especially, sad targets detected more poorly. Physical image properties (e.g., luminance) were ruled out as a potential source of these differences in visual search. In contrast, the search advantage is partly due to facilitated processing of affective content, as shown by an emotion identification task: happy expressions were identified faster than the other expressions and were less likely to be confused with neutral faces, whereas misjudgements occurred more often for angry and sad expressions. Nevertheless, the distinctiveness of some local features (e.g., teeth) that are consistently associated with emotional expressions plays the strongest role in the search advantage pattern. When the contribution of these features to visual search was factored out statistically, the advantage disappeared.

13.
The aim of the present study was to investigate the time course of the positive advantage in the expression classification of faces by recording event-related potentials (ERPs). Although neutral faces were classified more quickly than either happy or sad faces, a significant positive classification advantage (PCA)—that is, faster classification for happy than for sad faces—was found. In the ERP data, happy faces elicited a smaller N170 and a larger posterior N2 component than sad faces. The P3 was modulated by facial expression, with higher amplitudes and shorter latencies for both happy and neutral stimuli than for sad stimuli, and reaction times were significantly correlated with the amplitude and latency of the P3. Overall, these data show a robust PCA in expression classification, beginning once the stimulus has been recognized as a face, as indexed by the N170 component.

14.
The role of cognitive control mechanisms in reducing interference from emotionally salient distractors was investigated. In two experiments, participants performed a flanker task in which target-distractor affective compatibility and cognitive load were manipulated. Unlike in past studies, targets and distractors were presented at separate spatial locations and the cognitive load was not domain-specific. In Experiment 1, words (positive vs. negative) and faces (angry, happy, or neutral) were used as targets and distractors, respectively, whereas in Experiment 2, both targets (happy vs. angry) and distractors were faces. Findings showed interference from distractor processing only when cognitive load was high. The present findings indicate that, when targets and distractors are presented at different spatial locations, cognitive control mechanisms are involved in preventing interference from positive (Exp. 1) or negative distractors (Exp. 2). The role of stimulus valence and type is also discussed with regard to the different patterns of interference observed.

15.
Compared to neutral or happy stimuli, subliminal fear stimuli are known to be well processed through the automatic pathway. We examined whether fear stimuli are processed more strongly than other negative emotional stimuli using a modified subliminal affective priming paradigm. Twenty-six healthy subjects participated in two separate sessions. Fear, disgust, and neutral facial expressions served as primes, and 50% happy facial stimuli served as targets so that only stronger negative primes would reveal a priming effect. Participants were asked to appraise the affect of target faces in the affect appraisal session and the genuineness of target faces in the genuineness appraisal session. The genuineness instruction was developed to help participants be sensitive to potential threats. In the affect appraisal, participants judged 50% happy target faces significantly more 'unpleasant' when primed by fear faces than when primed by 50% happy control faces. In the genuineness appraisal, participants judged targets significantly more 'not genuine' when primed by fear and disgust faces than when primed by controls. These findings suggest differential priming effects between subliminal fear and disgust expressions, which may be modulated by a sensitive context of potential threat.

16.
Recent research suggests that emotion effects in word processing resemble those in other stimulus domains such as pictures or faces. The present study aims to provide more direct evidence for this notion by comparing emotion effects in word and face processing in a within-subject design. Event-related brain potentials (ERPs) were recorded as participants made decisions on the lexicality of emotionally positive, negative, and neutral German verbs or pseudowords, and on the integrity of intact happy, angry, and neutral faces or slightly distorted faces. Relative to neutral and negative stimuli both positive verbs and happy faces elicited posterior ERP negativities that were indistinguishable in scalp distribution and resembled the early posterior negativities reported by others. Importantly, these ERP modulations appeared at very different latencies. Therefore, it appears that similar brain systems reflect the decoding of both biological and symbolic emotional signals of positive valence, differing mainly in the speed of meaning access, which is more direct and faster for facial expressions than for words.

17.
The goal of this research was to examine the effects of facial expressions on the speed of sex recognition. Prior research revealed that sex recognition of female angry faces was slower than that of male angry faces, and that female happy faces are recognized faster than male happy faces. We aimed to replicate and extend the previous research by using a different set of facial stimuli and a different methodological approach, and by examining the effects of some previously unexplored expressions (such as crying) on the speed of sex recognition. In the first experiment, we presented facial stimuli of men and women displaying anger, fear, happiness, sadness, and crying, plus three control conditions expressing no emotion. Results showed that sex recognition of angry females was significantly slower than in any other condition, while sad, crying, happy, frightened, and neutral expressions did not affect the speed of sex recognition. In the second experiment, we presented angry, neutral, and crying expressions in blocks, and again only sex recognition of female angry expressions was slower than for all other expressions. The results are discussed in the context of the perceptual features of male and female facial configurations, evolutionary theory, and social learning.

18.
Antecedents: Given the contradictions among previous studies on age-related changes in attentional responses, an emotional Stroop task was used to compare young and older adults' responses to words and faces with emotional valence. Method: The word happy or sad was superimposed on faces expressing happiness or sadness, and the emotions expressed by the word and the face could match or not (cued and uncued trials, respectively). 85 young and 66 healthy older adults had to identify the faces and words separately, and the interference between the two types of stimuli was examined. Results: An interference effect was observed for both types of stimuli in both groups, with more interference for positive faces and words than for negative stimuli. Conclusions: Older adults had more difficulty than younger adults in focusing on positive uncued trials, whereas the samples did not differ on negative uncued trials.

19.
Happy, surprised, disgusted, angry, sad, fearful, and neutral facial expressions were presented extrafoveally (2.5° away from fixation) for 150 ms, followed by a probe word for recognition (Experiment 1) or a probe scene for affective valence evaluation (Experiment 2). Eye movements were recorded and gaze-contingent masking prevented foveal viewing of the faces. Results showed that (a) happy expressions were recognized faster than others in the absence of fixations on the faces, (b) the same pattern emerged when the faces were presented upright or upside-down, (c) happy prime faces facilitated the affective evaluation of emotionally congruent probe scenes, and (d) such priming effects occurred at 750 but not at 250 ms prime–probe stimulus–onset asynchrony. This reveals an advantage in the recognition of happy faces outside of overt visual attention, and suggests that this recognition advantage relies initially on featural processing and involves processing of positive affect at a later stage.

20.
Research has shown that neutral faces are better recognized when they had been presented with happy rather than angry expressions at study, suggesting that emotional signals conveyed by facial expressions influenced the encoding of novel facial identities in memory. An alternative explanation, however, would be that the influence of facial expression resulted from differences in the visual features of the expressions employed. In this study, this possibility was tested by manipulating facial expression at study versus test. In line with earlier studies, we found that neutral faces were better recognized when they had been previously encountered with happy rather than angry expressions. On the other hand, when neutral faces were presented at study and participants were later asked to recognize happy or angry faces of the same individuals, no influence of facial expression was detected. As the two experimental conditions involved exactly the same amount of changes in the visual features of the stimuli between study and test, the results cannot be simply explained by differences in the visual properties of different facial expressions and may instead reside in their specific emotional meaning. The findings further suggest that the influence of facial expression is due to disruptive effects of angry expressions rather than facilitative effects of happy expressions. This study thus provides additional evidence that facial identity and facial expression are not processed completely independently.
