Similar Documents
Found 19 similar documents (search time: 140 ms)
1.
This study used three experiments to examine the influence of group information on facial expression recognition. Results showed: (1) The emotional states of surrounding faces affected recognition of the target face's emotion; response times were significantly shorter, and recognition accuracy higher, when the two emotions were congruent than when they were incongruent. (2) Group information moderated the influence of surrounding faces' emotions on the target face, thereby affecting facial expression recognition. Specifically, under the group condition, recognition of the target facial expression was affected by the emotional states of the surrounding faces: compared with the incongruent case, when the surrounding and target emotions were congruent, that is, when they matched the expectation (built from perceptual cues) that group members' emotions are consistent, recognition was more accurate and faster; under the non-group condition, individuals were unaffected by the surrounding faces' emotional states. These results indicate that individuals can recognize facial emotions based on the social relationships among interacting people: when a group is present, they form the expectation that group members share consistent emotions, which in turn influences facial expression recognition.

2.
The body and the face are sensitive cues for emotion recognition. As in the early visual processing of facial expressions, the P1 component is more sensitive to negative body expressions such as fear and anger, reflecting rapid, unconscious processing of bodily threat information. Emotional bodies and faces also undergo similar configural processing: both evoke comparable N170 components over visual cortex in the occipitotemporal region, although the underlying neural bases are not identical. During configural encoding, the N170 and the vertex positive potential (VPP) are more pronounced for facial expressions than for body expressions. In the later processing stages of facial and body expressions, the early posterior negativity (EPN) reflects attention-directed processing of the visual encoding of faces and bodies, while the subsequent P3 and late positive component (LPC) represent higher-level cognitive processing of complex emotional information in parieto-frontal cortex. Body expressions additionally elicit an N190 component associated with the extrastriate body area, which is sensitive to the emotional and action information conveyed by the body. Future research should further examine the influence of action on emotion perception and the processing mechanisms of dynamic face-body emotions.

3.
Integrating emotional information from faces and voices is an important skill for social interaction and has recently attracted attention in psychology and neuroscience. Existing research has fairly systematically examined the behavioral manifestations of, and the factors affecting, bimodal emotional integration, and has answered the "when" and "where" questions of cognitive neuroscience. However, two key questions still lack systematic study: Can facial and vocal emotional information be integrated into a unified emotional object? And how does the brain integrate bimodal emotional information into one? This project therefore plans to systematically manipulate the emotional salience of face and voice stimuli and the task demands, introduce dynamic face-voice stimuli to increase external validity, combine behavioral and electrophysiological techniques, and mine the data from multiple angles, in particular through neural oscillation (time-frequency and coherence) analyses, in order to examine systematically whether dynamic facial and vocal emotional information can be integrated into a unified emotional object and to clarify the mechanism of bimodal emotional integration at the level of neural oscillations.

4.
The body and the face are important cues for expressing and recognizing emotion. Compared with facial expressions, the processing of body expressions is distinctive in compensating for emotional information, perceiving motion and action information, and generating adaptive behavior. The neural bases of emotional body and face processing may be adjacent or partially overlapping, but they are also dissociable; the EBA, FBA, SPL, and IPL are among the specific brain regions associated with body expression processing. Future work should systematically investigate the neural bases underlying the processing of facial, bodily, and vocal emotional cues, explore cross-cultural differences in bodily emotion processing, and examine the characteristics of body expression processing in patients with emotional disorders.

5.
This study examined the priming effects of different types of cue expressions during the perception and imagery of emotional faces, focusing on differences in how easily different expression types can be imaged. Happy, angry, and neutral expressions from 20 actors in the NimStim database served as primes; participants then either viewed a picture of a different expression from the same actor or, following a color cue, formed a mental image of a given expression, while judging the expression type in both cases. Priming effects were found in both the perception and the imagery task: a preceding cue face primed subsequent faces of the same valence and inhibited faces of the opposite valence and neutral faces. After balancing the possible priming effects of different face types, positive, negative, and neutral expressions proved equally easy to image.

6.
Subliminal affective priming effects of faces differing in pleasantness: evidence from ERPs (cited 2 times: 0 self-citations, 2 by others)
吕勇, 张伟娜, 沈德立. 《心理学报》2010, 42(9): 929-938
Event-related potentials were used to study subliminal affective priming. The experimental factor was the pleasantness of the subliminally presented priming face, with high and low levels. Participants judged the emotion of a neutral target face. Results showed that participants' emotion judgments of the target exhibited a priming effect tending toward the valence of the prime; N1 and P2 amplitudes were significantly larger when low-pleasantness faces served as primes than when high-pleasantness faces did; and the subliminal priming effect of faces differing in pleasantness arose because the prime influenced perceptual processing of the target.

7.
In an expression visual-search task, a target emotional face is presented among distractor faces and participants must locate it quickly and respond. Using facial expression pictures as materials, this experiment compared search performance under induced emotional states with and without emotion-regulation strategies, and between two different regulation strategies, while also examining the effects of the target expression's features and the searcher's current emotional state. Results showed: (1) Search for smiling faces was significantly faster than for angry faces, indicating a detection advantage for positive information. (2) Performance in a neutral emotional state was significantly better than in positive or negative states, indicating that emotional arousal interferes with the search process. (3) Both cognitive reappraisal and expressive suppression were significantly better than no regulation strategy. Thus, expression-search performance depends on whether a regulation strategy is used, the features of the search target, and the searcher's emotional state.

8.
In everyday life, effective emotion recognition often depends on integrating information across channels (e.g., face and voice). Reviewing the relevant research, this paper argues that facial and vocal emotional information interact as early as the perceptual stage, with primary sensory cortices encoding both kinds of information; at the later decision stage, higher regions such as the amygdala and temporal lobe complete the cognitive evaluation and integration of emotional content; in addition, functional coupling of neural oscillations across multiple frequency bands facilitates cross-channel emotional integration. Future research should examine whether this integration is related to emotional conflict, whether incongruent emotional information holds an advantage during integration, and how neural oscillations in different frequency bands facilitate the integration of facial and vocal emotional information, so as to better understand its neurodynamic basis.

9.
Abstract: The multisensory-channel effect in facial emotion recognition refers to the combined influence on facial expression recognition of the various channels involved in stimulus processing. Using behavioral experiments, event-related potentials, and neuroimaging techniques, researchers have studied the multiple channels through which information is acquired during this process: body expressions, emotional voices, and specific odors systematically influence the recognition of facial emotion. A series of studies has explored the time course, underlying mechanisms, and associated brain regions of the multichannel effect. Future research could incorporate brain-network techniques, together with new methods from other disciplines, to examine in finer detail the role played by the physical properties of the information in these channels. Keywords: facial expression; multichannel effect; facial emotion recognition; body expression; emotional voice; olfactory signal

10.
吴彬星, 张智君, 孙雨生. 《心理学报》2015, 47(10): 1201-1212
Current theories of the relationship between face gender and expression remain incomplete, yet considerable evidence indicates that face familiarity is closely related to the processing of both. Using the Garner paradigm, this study examined the relationship between face gender and expression under different levels of face familiarity, across four experiments. In Experiment 1, face identities were unfamiliar and non-repeating, each stimulus appearing only once in the control and orthogonal blocks of the Garner paradigm (low familiarity). Experiment 2 was identical except that face identities repeated (medium familiarity). In Experiment 3, identities were unfamiliar and non-repeating, but each stimulus appeared many times within the control and orthogonal blocks (high familiarity). Experiment 4 increased familiarity through face learning, to test directly whether increased familiarity changes the gender-expression relationship. Results showed that for unfamiliar faces, expression unidirectionally influenced the processing of face gender; as familiarity increased, the influence between face gender and expression became bidirectional. Thus, face familiarity moderates the mutual influence between face gender and expression.

11.
When faces appear as a group, the cognitive-neural system automatically integrates their emotional information and extracts the mean emotion, a process termed ensemble coding of crowd facial emotion. The keys to revealing its processing mechanism are its dissociation from low-level holistic representation, its relationship to individual-face representation, and its neural characteristics, but no systematic model has yet been established. Future work should combine eye-tracking, electrophysiological, and neuroimaging techniques, and incorporate attention, memory, and social cues, to further investigate its cognitive-neural mechanisms and influencing factors; it should also attend to special populations with cognitive-affective disorders and trace its developmental trajectory across the lifespan.

12.
The present research explored the effect of social empathy on processing emotional facial expressions. Previous evidence suggested a close relationship between emotional empathy and both the ability to detect facial emotions and the attentional mechanisms involved. A multi-measure approach was adopted: we investigated the association between trait empathy (Balanced Emotional Empathy Scale) and individuals' performance (response times; RTs), attentional mechanisms (eye movements; number and duration of fixations), correlates of cortical activation (event-related potential (ERP) N200 component), and facial responsiveness (facial zygomatic and corrugator activity). Trait empathy was found to affect face detection performance (reduced RTs), attentional processes (more scanning eye movements in specific areas of interest), the ERP salience effect (increased N200 amplitude), and electromyographic activity (more facial responses). A second important result was the demonstration of strong, direct correlations among these measures. We suggest that empathy may function as a social facilitator of the processes underlying the detection of facial emotion, and a general "facial response effect" is proposed to explain these results. We assume that empathy influences cognitive processing and facial responsiveness, such that empathic individuals are more skilful in processing facial emotion.

13.
According to cognitive and neural theories of emotion, attentional processing of innate threat stimuli, such as angry facial expressions, is prioritised over neutral stimuli. To test this hypothesis, the present study used a modified version of the rapid serial visual presentation (RSVP) paradigm to investigate the effect of emotional face stimuli on the attentional blink (AB). The target stimuli were schematic faces which depicted threatening (angry), positive or neutral facial expressions. Results showed that performance accuracy was enhanced (i.e., the AB was reduced) on trials in which the second target was an angry face, rather than a neutral face. Results extend previous research by demonstrating that angry faces reduce the AB, and that this effect is found for schematic facial expressions. These findings further support the proposal that, when there is competition for attentional resources, threat stimuli are given higher priority in processing compared with non-threatening stimuli.

14.
胡治国  刘宏艳 《心理科学》2015,(5):1087-1094
Accurately recognizing facial expressions is important for successful social interaction, and facial expression recognition is affected by emotional context. This paper first reviews the facilitating effects of emotional context on facial expression recognition, chiefly the emotional congruence effect within the visual channel and cross-channel emotional integration; it then reviews the interfering effects of emotional context, chiefly the emotional conflict effect and semantic interference; next, it covers the influence of emotional context on recognizing neutral and ambiguous faces, chiefly context-induced emotion effects and subliminal affective priming; finally, it summarizes the existing research and offers suggestions for future work.

15.
Recent developments in functional imaging techniques, such as Positron Emission Tomography (PET) and functional Magnetic Resonance Imaging (fMRI), allow us to characterize more precisely the functional neuroanatomy mediating emotional responding. This corpus of studies has led to the development of affective neuroscience. First, we present a summary of the studies aimed at understanding the underlying mechanisms of the emotional response, which were conducted prior to the use of brain imaging techniques. Then, this paper reviews the studies investigating the neural substrates implicated in the processing of facial expressions and those implicated in the production of experimentally induced emotional responses. This review of the literature includes a meta‐analysis of eight studies using PET and one fMRI study reporting the neural correlates of experimentally induced emotions in healthy individuals. The methods and results of these studies are described through figures drawn from the reported Talairach's coordinates depicting the cerebral regions activated in relation to different experimental conditions. The implications of the results and the role of the cerebral structures that have been identified are discussed. As regards the studies on the neural bases of the processing of facial expressions of emotion, there are separable neural circuits that are involved in mediating responding to differing categories of facial expressions of emotion. Fearful expressions have relatively consistently been found to activate the amygdala, as, occasionally, have sad and happy expressions. The anterior insula and the putamen seem to be particularly involved in disgust expression recognition, whereas the facial expression of anger seems to be predominantly associated with anterior cingulate and orbitofrontal cortex activity.
Among the cerebral structures that have appeared to be activated by experimentally induced emotions, the anterior cingulate cortex seems to play a specific role in representing subjective emotional responses.

16.
Despite a wealth of knowledge about the neural mechanisms behind emotional facial expression processing, little is known about how they relate to individual differences in social cognition abilities. We studied individual differences in the event-related potentials (ERPs) elicited by dynamic facial expressions. First, we assessed the latent structure of the ERPs, reflecting structural face processing in the N170, and the allocation of processing resources and reflexive attention to emotionally salient stimuli, in the early posterior negativity (EPN) and the late positive complex (LPC). Then we estimated brain–behavior relationships between the ERP factors and behavioral indicators of facial identity and emotion-processing abilities. Structural models revealed that the participants who formed faster structural representations of neutral faces (i.e., shorter N170 latencies) performed better at face perception (r = –.51) and memory (r = –.42). The N170 amplitude was not related to individual differences in face cognition or emotion processing. The latent EPN factor correlated with emotion perception (r = .47) and memory (r = .32), and also with face perception abilities (r = .41). Interestingly, the latent factor representing the difference in EPN amplitudes between the two neutral control conditions (chewing and blinking movements) also correlated with emotion perception (r = .51), highlighting the importance of tracking facial changes in the perception of emotional facial expressions. The LPC factor for negative expressions correlated with the memory for emotional facial expressions. The links revealed between the latency and strength of activations of brain systems and individual differences in processing socio-emotional information provide new insights into the brain mechanisms involved in social communication.

17.
Evidence suggests that social skills are affected by childhood mild traumatic brain injury (mTBI), but the neural and affective substrates of these difficulties are still underexplored. In particular, nothing is known about consequences on the perception of emotional facial expressions, despite its critical role in social interactions and the importance of the preschool period in the development of this ability. This study thus aimed to investigate the electrophysiological correlates of emotional facial expressions processing after early mTBI. To this end, 18 preschool children (mean age 53 ± 8 months) who sustained mTBI and 15 matched healthy controls (mean age 55 ± 11 months) were presented with pictures of faces expressing anger, happiness, or no emotion (neutral) while event-related potentials (ERP) were recorded. The main results revealed that P1 amplitude was higher for happy faces than for angry faces, and that N170 latency was shorter for emotional faces than for neutral faces in the control group only. These findings suggest that preschool children who sustain mTBI do not present the early emotional effects that are observed in healthy preschool children at visuospatial and visual expertise stages. This study provides new evidence regarding the consequences of childhood mTBI on socioemotional processing, by showing alterations of emotional facial expressions processing, an ability known to underlie social competence and appropriate social interactions.

18.
Purpose: Event-related brain potentials (ERPs) were used to investigate the neural correlates of emotion processing in 5- to 8-year-old children who do and do not stutter. Methods: Participants were presented with an audio contextual cue followed by images of threatening (angry/fearful) and neutral facial expressions from similarly aged peers. Three conditions differed in audio-image pairing: neutral context-neutral expression (neutral condition), negative context-threatening expression (threat condition), and reappraisal context-threatening expression (reappraisal condition). These conditions reflected social stimuli that are ecologically valid to the everyday life of children. Results: P100, N170, and late positive potential (LPP) ERP components were elicited over parietal and occipital electrodes. The threat condition elicited an increased LPP mean amplitude compared to the neutral condition across our participants, suggesting increased emotional reactivity to threatening facial expressions. In addition, LPP amplitude decreased during the reappraisal condition, evidence of emotion regulation. No group differences were observed in the mean amplitude of ERP components between children who do and do not stutter. Furthermore, dimensions of childhood temperament and stuttering severity were not strongly correlated with LPP elicitation. Conclusion: These findings are suggestive that, at this young age, children who stutter exhibit typical brain activation underlying emotional reactivity and regulation to social threat from peer facial expressions.

19.
Recent studies have shown that cueing eye gaze can affect the processing of visual information, a phenomenon called the gaze-orienting effect (visual-GOE). Emerging evidence has shown that cueing eye gaze also affects the processing of auditory information (auditory-GOE). However, it is unclear whether the auditory-GOE is modulated by emotion. We conducted three behavioural experiments to investigate whether cueing eye gaze influenced the orientation judgement to a sound, and whether the effect was modulated by facial expressions. The current study used four facial expressions (angry, fearful, happy, and neutral), manipulated the display type of the facial expressions, and varied the sequence of gaze and emotional expressions. Participants were required to judge the sound orientation after the facial expression and gaze cues. The results showed that the orientation judgement of sound was influenced by gaze direction in all three experiments: judgements were faster when the face was oriented to the target location (congruent trials) than when it was oriented away from the target location (incongruent trials). The modulation of emotion on the auditory-GOE was observed only when the gaze shift was followed by the facial expression (Experiment 3); there the auditory-GOE was significantly greater for angry faces than for neutral faces. These findings indicate that the auditory-GOE exists widely as a social phenomenon and is modulated by facial expression. A gaze shift before the presentation of emotion was the key factor enabling emotional modulation in an auditory-target gaze-orienting task. Our findings suggest that the integration of facial expressions and eye gaze is context-dependent.
