Related Articles (20 results)
1.
Previous studies have revealed that decoding of facial expressions is a specific component of face comprehension and that semantic information might be processed separately from the basic stage of face perception. In order to explore event-related potentials (ERPs) related to recognition of facial expressions and the effect of the semantic content of the stimulus, we analyzed 20 normal subjects. Faces with three prototypical emotional expressions (fear, happiness, and sadness) and with three morphed expressions were presented in random order. The neutral stimuli represented the control condition. Whereas ERP profiles were similar with respect to an early negative ERP (N170), differences in peak amplitude were observed later between incongruous (morphed) expressions and congruous (prototypical) ones. In fact, the results demonstrated that the emotional morphed faces elicited a negative peak at about 360 ms, mainly distributed over the posterior site. The electrophysiological activity observed may represent a specific cognitive process underlying decoding of facial expressions in case of semantic anomaly detection. The evidence is in favor of the similarity of this negative deflection with the N400 ERP effect elicited in linguistic tasks. A domain-specific semantic module is proposed to explain these results.

2.
Is facial expression recognition marked by specific event-related potential (ERP) effects? Are conscious and unconscious elaborations of emotional facial stimuli qualitatively different processes? In Experiment 1, ERPs elicited by supraliminal stimuli were recorded when 21 participants viewed emotional facial expressions of four emotions and a neutral stimulus. Two ERP components (N2 and P3) were analyzed for their peak amplitude and latency measures. First, emotional face-specificity was observed for the negative deflection N2, whereas P3 was not affected by the content of the stimulus (emotional or neutral). A more posterior distribution of ERPs was found for N2. Moreover, a lateralization effect was revealed for negative (right lateralization) and positive (left lateralization) facial expressions. In Experiment 2 (20 participants), 1-ms subliminal stimulation was carried out. Unaware information processing was revealed to be quite similar to aware information processing for peak amplitude but not for latency. In fact, unconscious stimulation produced a more delayed peak variation than conscious stimulation.
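Analyses like the one above quantify a component as the most extreme deflection within a fixed post-stimulus window. A minimal NumPy sketch of such an extraction (the function name, window bounds, and sampling grid below are illustrative, not taken from the study):

```python
import numpy as np

def peak_in_window(erp, times, t_start, t_end, polarity):
    """Return (amplitude, latency) of the extreme deflection inside a window.

    erp:      (n_samples,) averaged waveform for one electrode (microvolts)
    times:    (n_samples,) post-stimulus latencies in ms
    polarity: -1 to search for a negative peak (e.g. N2), +1 for a positive one (e.g. P3)
    """
    mask = (times >= t_start) & (times < t_end)
    segment = erp[mask]
    idx = np.argmax(polarity * segment)  # most extreme sample in the chosen direction
    return segment[idx], times[mask][idx]
```

Peak latency then falls out of the same call, which is how amplitude and latency can be compared across aware and unaware conditions.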

3.
The author analyzed the role of consciousness in emotional face comprehension. The author recorded psychophysiological measures of event-related potentials (ERPs), elicited by supraliminal and subliminal stimuli when participants viewed emotional facial expressions of 4 emotions or neutral stimuli. The author analyzed an ERP emotion-specific effect (N200 peak variation; temporal interval was 180-300 ms poststimulus) in terms of peak amplitude and latency variables. The results indicated 4 important findings. First, there was an emotion-specific ERP deflection only for emotional stimuli, not for neutral stimuli. Second, unaware information processing was quite similar to aware processing in terms of peak morphology, but not in terms of latency. In fact, unconscious stimulation produced a more delayed peak variation than did conscious stimulation. Third, the valence of the facial stimulus (positive or negative) was decoded both supraliminally and subliminally, as shown by differences in peak deflection between negative high-arousal (fear and anger) and low-arousal (happiness, sadness, and neutral) stimuli. Finally, the author found a more posterior distribution of the ERP as a function of the emotional content of the stimulus. Cortical lateralization (right or left) was not correlated with conscious or unconscious stimulation. The author discussed the functional significance of her results in terms of supraliminal and subliminal conditions.

4.
The emotional content of stimuli influences cognitive performance. In two experiments, we investigated the time course and mechanisms of emotional influences on visual word processing in various tasks by recording event-related brain potentials (ERPs). The stimuli were verbs of positive, negative, and neutral valence. In Experiment 1, where lexical decisions had to be performed on single verbs, both positive and negative verbs were processed more quickly than neutral verbs and elicited a distinct ERP component, starting around 370 msec. In Experiment 2, the verbs were embedded in a semantic context provided by single nouns. Likewise, structural, lexical, and semantic decisions for positive verbs were accelerated, and an ERP effect with a scalp distribution comparable to that in Experiment 1 now started about 200 msec earlier. These effects may signal an automatic allocation of attentional resources to emotionally arousing words, since they were not modulated by different task demands. In contrast, a later ERP effect of emotion was restricted to lexical and semantic decisions and, thus, appears to indicate more elaborated, task-dependent processing of emotional words.

5.
The study investigates cross-modal simultaneous processing of emotional tone of voice and emotional facial expression by event-related potentials (ERPs), using a wide range of different emotions (happiness, sadness, fear, anger, surprise, and disgust). Auditory emotional stimuli (a neutral word pronounced in an affective tone) and visual patterns (emotional facial expressions) were matched in congruous (the same emotion in face and voice) and incongruous (different emotions) pairs. Subjects (N=31) were required to watch and listen to the stimuli in order to comprehend them. Repeated measures ANOVAs showed a positive ERP deflection (P2) with a more posterior distribution. This P2 effect may represent a marker of cross-modal integration, modulated as a function of the congruous/incongruous condition. Indeed, it shows a larger peak in response to congruous stimuli than to incongruous ones. It is suggested that P2 can be a cognitive marker of multisensory processing, independent of the emotional content.

6.
Event-related potential (ERP) studies have shown that emotional stimuli elicit greater amplitude late positive-polarity potentials (LPPs) than neutral stimuli. This effect has been attributed to arousal, but emotional stimuli are also more semantically coherent than uncategorized neutral stimuli. ERPs were recorded during encoding of positive, negative, uncategorized neutral, and categorized neutral words. Differences in LPP amplitude elicited by emotional versus uncategorized neutral stimuli were evident from 450 to 1000 ms. From 450 to 700 ms, LPP effects at midline and right hemisphere frontal electrodes indexed arousal, whereas LPP effects at left hemisphere centro-parietal electrodes indexed semantic cohesion. This dissociation helps specify the processes underlying emotional stimulus encoding, and suggests the need to control for semantic cohesion in emotional information processing studies.
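Slow components such as the LPP are usually scored not as a single peak but as the mean voltage over a latency window (here 450-700 ms or 450-1000 ms), averaged across trials of a condition. A minimal sketch, assuming a simple (trials x samples) array per condition and electrode (the function name and array layout are ours, not the paper's):

```python
import numpy as np

def mean_amplitude(epochs, times, t_start, t_end):
    """Mean voltage over trials within a latency window.

    epochs: (n_trials, n_samples) single-trial data for one condition/electrode
    times:  (n_samples,) post-stimulus latencies in ms
    """
    mask = (times >= t_start) & (times < t_end)
    return epochs[:, mask].mean()
```

Condition differences (e.g. emotional minus uncategorized neutral) are then simple subtractions of these window means before statistical testing.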

7.
Purpose: Event-related brain potentials (ERPs) were used to investigate the neural correlates of emotion processing in 5- to 8-year-old children who do and do not stutter. Methods: Participants were presented with an audio contextual cue followed by images of threatening (angry/fearful) and neutral facial expressions from similarly aged peers. Three conditions differed in audio-image pairing: neutral context-neutral expression (neutral condition), negative context-threatening expression (threat condition), and reappraisal context-threatening expression (reappraisal condition). These conditions reflected social stimuli that are ecologically valid to the everyday life of children. Results: P100, N170, and late positive potential (LPP) ERP components were elicited over parietal and occipital electrodes. The threat condition elicited an increased LPP mean amplitude compared to the neutral condition across our participants, suggesting increased emotional reactivity to threatening facial expressions. In addition, LPP amplitude decreased during the reappraisal condition, evidence of emotion regulation. No group differences were observed in the mean amplitude of ERP components between children who do and do not stutter. Furthermore, dimensions of childhood temperament and stuttering severity were not strongly correlated with LPP elicitation. Conclusion: These findings suggest that, at this young age, children who stutter exhibit typical brain activation underlying emotional reactivity and regulation to social threat from peer facial expressions.

8.
Using ERP methodology, we examined the time course of brain activity in 16 participants recognizing dynamic facial expressions under different attentional loads. The results showed that the N170 component was affected by neither attentional load nor expression valence. Under low attentional load, at the early stage of dynamic expression processing, negative expressions elicited a significantly larger EPN (early posterior negativity) than neutral and positive expressions; at the later, higher-level analysis stage, both positive and negative expressions elicited a clear LPP (late positive potential), with negative expressions eliciting a larger LPP than positive ones. Under high attentional load, however, expression recognition elicited no clear EPN or LPP. These results indicate that recognition of dynamic facial expressions is strongly modulated by attentional resources: an attentional bias toward expressions emerges only as attentional resources increase, with the processing advantage for negative expressions being the most pronounced and enduring, whereas that for positive expressions is comparatively weak and transient.

9.
Recent research suggests that emotion effects in word processing resemble those in other stimulus domains such as pictures or faces. The present study aims to provide more direct evidence for this notion by comparing emotion effects in word and face processing in a within-subject design. Event-related brain potentials (ERPs) were recorded as participants made decisions on the lexicality of emotionally positive, negative, and neutral German verbs or pseudowords, and on the integrity of intact happy, angry, and neutral faces or slightly distorted faces. Relative to neutral and negative stimuli both positive verbs and happy faces elicited posterior ERP negativities that were indistinguishable in scalp distribution and resembled the early posterior negativities reported by others. Importantly, these ERP modulations appeared at very different latencies. Therefore, it appears that similar brain systems reflect the decoding of both biological and symbolic emotional signals of positive valence, differing mainly in the speed of meaning access, which is more direct and faster for facial expressions than for words.

10.
Threatening, friendly, and neutral faces were presented to test the hypothesis of the facilitated perceptual processing of threatening faces. Dense sensor event-related brain potentials were measured while subjects viewed facial stimuli. Subjects had no explicit task for emotional categorization of the faces. Assessing early perceptual stimulus processing, threatening faces elicited an early posterior negativity compared with nonthreatening neutral or friendly expressions. Moreover, at later stages of stimulus processing, facial threat also elicited augmented late positive potentials relative to the other facial expressions, indicating the more elaborate perceptual analysis of these stimuli. Taken together, these data demonstrate the facilitated perceptual processing of threatening faces. Results are discussed within the context of an evolved module of fear (A. Ohman & S. Mineka, 2001).

11.
Despite a wealth of knowledge about the neural mechanisms behind emotional facial expression processing, little is known about how they relate to individual differences in social cognition abilities. We studied individual differences in the event-related potentials (ERPs) elicited by dynamic facial expressions. First, we assessed the latent structure of the ERPs, reflecting structural face processing in the N170, and the allocation of processing resources and reflexive attention to emotionally salient stimuli, in the early posterior negativity (EPN) and the late positive complex (LPC). Then we estimated brain–behavior relationships between the ERP factors and behavioral indicators of facial identity and emotion-processing abilities. Structural models revealed that the participants who formed faster structural representations of neutral faces (i.e., shorter N170 latencies) performed better at face perception (r = –.51) and memory (r = –.42). The N170 amplitude was not related to individual differences in face cognition or emotion processing. The latent EPN factor correlated with emotion perception (r = .47) and memory (r = .32), and also with face perception abilities (r = .41). Interestingly, the latent factor representing the difference in EPN amplitudes between the two neutral control conditions (chewing and blinking movements) also correlated with emotion perception (r = .51), highlighting the importance of tracking facial changes in the perception of emotional facial expressions. The LPC factor for negative expressions correlated with the memory for emotional facial expressions. The links revealed between the latency and strength of activations of brain systems and individual differences in processing socio-emotional information provide new insights into the brain mechanisms involved in social communication.

12.
Emotions can be recognized whether conveyed by facial expressions, linguistic cues (semantics), or prosody (voice tone). However, few studies have empirically documented the extent to which multi-modal emotion perception differs from uni-modal emotion perception. Here, we tested whether emotion recognition is more accurate for multi-modal stimuli by presenting stimuli with different combinations of facial, semantic, and prosodic cues. Participants judged the emotion conveyed by short utterances in six channel conditions. Results indicated that emotion recognition is significantly better in response to multi-modal versus uni-modal stimuli. When stimuli contained only one emotional channel, recognition tended to be higher in the visual modality (i.e., facial expressions, semantic information conveyed by text) than in the auditory modality (prosody), although this pattern was not uniform across emotion categories. The advantage for multi-modal recognition may reflect the automatic integration of congruent emotional information across channels which enhances the accessibility of emotion-related knowledge in memory.

13.
Event-related potential (ERP) methodology was used to examine the time course of the influence of emotional prosody on facial expression recognition. Voice-face pairs of congruent or incongruent valence were constructed, and participants judged whether the valence of the emotional voice and the facial expression matched. Behavioral results showed faster responses to congruent voice-face pairs. ERP results showed that, at 70-130 ms and 220-450 ms, facial expressions in the incongruent condition elicited more negative waveforms than those in the congruent condition; at 450-750 ms, facial expressions in the incongruent condition elicited a larger late positive component than those in the congruent condition. These findings indicate that emotional prosody exerts cross-modal influences on multiple stages of facial expression recognition.

14.
Extracting meaning from faces to understand other people's mental states and intentions, and to quickly adjust our actions accordingly, is a vital aspect of our social interactions. However, not all emotionally relevant attributes of a person are directly observable from facial features or expressions. In this study event-related brain potentials were used to investigate the effects of affective information about a person's biography that cannot be derived from the visual appearance of the face. Faces of well-known and initially unfamiliar persons with neutral expressions were associated with negative, positive or neutral biographical information. For well-known faces, early event-related brain potential (ERP) modulations induced by emotional knowledge, their scalp topographies and time course strongly resemble the effects frequently reported for emotional facial expressions even though here, access to stored semantic knowledge is required. These results demonstrate that visually opaque affective knowledge is extracted at high speed and modulates sensory processing in the visual cortex.

15.
The degree to which emotional aspects of stimuli are processed automatically is controversial. Here, we assessed the automatic elicitation of emotion-related brain potentials (ERPs) to positive, negative, and neutral words and facial expressions in an easy and superficial face-word discrimination task, for which the emotional valence was irrelevant. Both emotional words and facial expressions impacted ERPs already between 50 and 100 ms after stimulus onset, possibly reflecting rapid relevance detection. Following this initial processing stage only emotionality in faces but not in words was associated with an early posterior negativity (EPN). Therefore, when emotion is irrelevant in a task which requires superficial stimulus analysis, automatically enhanced sensory encoding of emotional content appears to occur only for evolutionary prepared emotional stimuli, as reflected in larger EPN amplitudes to faces, but not to symbolic word stimuli.

16.
Facial expressions play a key role in affective and social behavior. However, the temporal dynamics of the brain responses to emotional faces still remain unclear; in particular, an open question is at what stage of face processing expressions might influence encoding and recognition memory. To try to answer this question, we recorded the event-related potentials (ERPs) elicited in an old/new recognition task. A novel aspect of the present design was that, whereas faces were presented during the study phase with either a happy, fearful or neutral expression, they were always neutral during the memory retrieval task. The ERP results showed three main findings: an enhanced early fronto-central positivity for faces encoded as fearful, both during the study and the retrieval phase; during encoding, subsequent memory (the Dm effect) was influenced by emotion; and at retrieval, the early components P100 and N170 were modulated by the emotional expression of the face at the encoding phase. Finally, the later ERP components related to recognition memory were modulated by the previously encoded facial expressions. Overall, these results suggest that face recognition is modulated by top-down influences from brain areas associated with emotional memory, enhancing encoding and retrieval in particular for fearful emotional expressions.

17.
Although positive and negative images enhance the visual processing of young adults, recent work suggests that a life-span shift in emotion processing goals may lead older adults to avoid negative images. To examine this tendency for older adults to regulate their intake of negative emotional information, the current study investigated age-related differences in the perceptual boost received by probes appearing over facial expressions of emotion. Visually-evoked event-related potentials were recorded from the scalp over cortical regions associated with visual processing as a probe appeared over facial expressions depicting anger, sadness, happiness, or no emotion. The activity of the visual system in response to each probe was operationalized in terms of the P1 component of the event-related potentials evoked by the probe. For young adults, the visual system was more active (i.e., greater P1 amplitude) when the probes appeared over any of the emotional facial expressions. However, for older adults, the visual system displayed reduced activity when the probe appeared over angry facial expressions.

18.
To clarify the emotional bias of individuals with high trait anxiety by probing how they process emotional stimuli at the pre-attentive stage, this study used a deviant-standard reverse oddball paradigm to examine the effect of trait anxiety on the pre-attentive processing of facial expressions. The results showed that in the low trait-anxiety group, sad faces elicited a significantly larger early EMMN than happy faces, whereas in the high trait-anxiety group, the early EMMN did not differ between happy and sad faces. Moreover, the EMMN amplitude to happy faces was significantly larger in the high trait-anxiety group than in the low trait-anxiety group. These results indicate that personality traits are an important factor in the pre-attentive processing of facial expressions. Unlike typical participants, individuals with high trait anxiety process happy and sad faces similarly at the pre-attentive stage and may have difficulty effectively distinguishing happy from sad emotional faces.

19.
The immediate and long-term neural correlates of linguistic processing deficits reported following paediatric and adolescent traumatic brain injury (TBI) are poorly understood. Therefore, the current research investigated event-related potentials (ERPs) elicited during a semantic picture-word priming experiment in two groups of highly functioning individuals matched for various demographic variables and behavioural language performance. Participants in the TBI group had a recorded history of paediatric or adolescent TBI involving injury mechanisms associated with diffuse white matter pathology, while participants in the control group never sustained any insult to the brain. A comparison of N400 Mean Amplitudes elicited during three experimental conditions with varying semantic relatedness between the prime and target stimuli (congruent, semantically related, unrelated) revealed a significantly smaller N400 response in the unrelated condition in the TBI group, indicating residual linguistic processing deviations when processing demands required the quick detection of a between-category (unrelated) violation of semantic expectancy.

20.
The present study investigates the perception of facial expressions of emotion, and explores the relation between the configural properties of expressions and their subjective attribution. Stimuli were a male and a female series of morphed facial expressions, interpolated between prototypes of seven emotions (happiness, sadness, fear, anger, surprise and disgust, and neutral) from Ekman and Friesen (1976). Topographical properties of the stimuli were quantified using the Facial Expression Measurement (FACEM) scheme. Perceived dissimilarities between the emotional expressions were elicited using a sorting procedure and processed with multidimensional scaling. Four dimensions were retained in the reconstructed facial-expression space, with positive and negative expressions opposed along D1, while the other three dimensions were interpreted as affective attributes distinguishing clusters of expressions categorized as "Surprise-Fear," "Anger," and "Disgust." Significant relationships were found between these affective attributes and objective facial measures of the stimuli. The findings support a componential explanatory scheme for expression processing, wherein each component of a facial stimulus conveys an affective value separable from its context, rather than a categorical-gestalt scheme. The findings further suggest that configural information is closely involved in the decoding of affective attributes of facial expressions. Configural measures are also suggested as a common ground for dimensional as well as categorical perception of emotional faces.
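Multidimensional scaling of a dissimilarity matrix, as used above, can be illustrated with the classical (Torgerson) variant, which embeds pairwise dissimilarities into a low-dimensional space via double centering and an eigendecomposition. This is a generic sketch, not the specific MDS procedure of the study (which is not named in the abstract); the function name is ours:

```python
import numpy as np

def classical_mds(D, k):
    """Embed an (n, n) dissimilarity matrix D into k dimensions.

    Returns an (n, k) coordinate matrix whose pairwise Euclidean
    distances approximate D (exactly, if D is Euclidean of rank <= k).
    """
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # double-centered Gram matrix
    w, V = np.linalg.eigh(B)                 # eigenvalues in ascending order
    order = np.argsort(w)[::-1][:k]          # keep the k largest eigenvalues
    scale = np.sqrt(np.clip(w[order], 0.0, None))
    return V[:, order] * scale
```

Retaining four dimensions, as in the study, corresponds to `k=4`; each retained axis can then be interpreted against external measures such as the FACEM facial-topography variables.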
