Similar Articles
1.
Event-related brain potentials were measured in 7- and 12-month-old infants to examine the development of processing happy and angry facial expressions. In 7-month-olds a larger negativity to happy faces was observed at frontal, central, temporal and parietal sites (Experiment 1), whereas 12-month-olds showed a larger negativity to angry faces at occipital sites (Experiment 2). These data suggest that processing of these facial expressions undergoes development between 7 and 12 months: while 7-month-olds exhibit heightened sensitivity to happy faces, 12-month-olds resemble adults in their heightened sensitivity to angry faces. In Experiment 3 infants' visual preference was assessed behaviorally, revealing that the differences in ERPs observed at 7 and 12 months do not simply reflect differences in visual preference.

2.
Event-related brain potentials (ERPs) were recorded to assess the processing time course of ambiguous facial expressions with a smiling mouth but neutral, fearful, or angry eyes, in comparison with genuinely happy faces (a smile and happy eyes) and non-happy faces (neutral, fearful, or angry mouth and eyes). Participants judged whether the faces looked truly happy or not. Electroencephalographic recordings were made from 64 scalp electrodes to generate ERPs. The neural activation patterns showed early P200 sensitivity (differences between negative and positive or neutral expressions) and EPN sensitivity (differences between positive and neutral expressions) to emotional valence. In contrast, sensitivity to ambiguity (differences between genuine and ambiguous expressions) emerged only in later LPP components. Discrimination of emotional vs. neutral affect occurs between 180 and 430 ms from stimulus onset, whereas the detection and resolution of ambiguity takes place between 470 and 720 ms. In addition, while blended expressions involving a smile with angry eyes can be identified as not happy in the P200 (175–240 ms) component, smiles with fearful or neutral eyes produce the same ERP pattern as genuinely happy faces, thus revealing poor discrimination.
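ERP component effects like those above are typically quantified as the mean voltage within a fixed post-stimulus time window. The following is a minimal sketch of that computation for a single-channel epoch; the sampling rate, baseline offset, and synthetic data are illustrative assumptions, not the study's actual pipeline.

```python
# Minimal sketch: mean-amplitude measure for an ERP component time window.
# All numbers (250 Hz sampling, 0.2 s baseline, 5 uV deflection) are
# illustrative assumptions, not values from the study.

def mean_amplitude(epoch, srate, t0, win_start, win_end):
    """Average voltage in [win_start, win_end] seconds relative to stimulus onset.

    epoch : list of voltages (uV), one value per sample
    srate : sampling rate in Hz
    t0    : time (s) of the first sample relative to stimulus onset (negative
            if the epoch starts before onset)
    """
    i0 = int(round((win_start - t0) * srate))
    i1 = int(round((win_end - t0) * srate))
    window = epoch[i0:i1]
    return sum(window) / len(window)

# Toy epoch: 300 samples at 250 Hz, starting 0.2 s before onset.
srate, t0 = 250, -0.2
epoch = [0.0] * 300
# Place a 5 uV deflection inside the P200 window (175-240 ms, per the abstract).
for i in range(int((0.175 - t0) * srate), int((0.240 - t0) * srate)):
    epoch[i] = 5.0

p200 = mean_amplitude(epoch, srate, t0, 0.175, 0.240)
```

Condition differences (e.g., angry-eyes vs. genuinely happy faces) would then be tested on such per-trial or per-participant mean amplitudes.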

3.
Event-related potentials (ERPs) were used to examine how reward anticipation influences the recognition of facial emotion in humans. Using a cue-target paradigm, ERP data were recorded while participants performed an emotion-discrimination task on positive, neutral, and negative faces under reward-anticipation and no-reward conditions. Behaviorally, responses were faster in the reward-anticipation condition than in the no-reward condition, and faster to emotional than to neutral faces. In the ERP data, reward cues elicited more positive P1, P2, and P300 components than no-reward cues. The P1 and N170 amplitudes and the N300 elicited by target stimuli were all modulated by reward anticipation, with targets eliciting more positive ERPs in the reward-anticipation condition. The P1, N170, and VPP components were not affected by facial emotion, whereas the N300 amplitude over fronto-central sites differentiated the processing of emotional (positive and negative) faces from neutral faces. Importantly, the N300 amplitude showed an interaction between reward anticipation and emotion: the positive- and negative-emotion processing effects and the negativity bias were differentially affected by reward anticipation. The positive-emotion processing effect was unaffected by reward anticipation, whereas the negative-emotion processing effect and the negativity bias were significantly larger in the reward-anticipation condition than in the no-reward condition. These results indicate that reward anticipation can modulate the processing of facial emotion, and that its modulatory effect differs across processing stages. Motivational information regulates the allocation of attentional resources, promoting the negativity bias in the processing of facial emotion.

4.
In the present study we examined the neural correlates of facial emotion processing in the first year of life using ERP measures and cortical source analysis. EEG data were collected cross‐sectionally from 5‐ (N = 49), 7‐ (N = 50), and 12‐month‐old (N = 51) infants while they were viewing images of angry, fearful, and happy faces. The N290 component was larger in amplitude in response to fearful and happy than angry faces in all posterior clusters, and showed a larger response to fearful faces than to the other two emotions only over the right occipital area. The P400 and Nc components were larger in amplitude in response to angry than happy and fearful faces over central and frontal scalp. Cortical source analysis of the N290 component revealed greater cortical activation in the right fusiform face area in response to fearful faces. This effect started to emerge at 5 months and became well established at 7 months, but it disappeared at 12 months. The P400 and Nc components were primarily localized to the PCC/Precuneus, where heightened responses to angry faces were observed. The current results suggest that the detection of a fearful face in the infant brain can occur shortly (~200–290 ms) after stimulus onset, and that this process may rely on the face network and develop substantially between 5 and 7 months of age. The current findings also suggest that the differential processing of angry faces occurs later, in the P400/Nc time window, which recruits the PCC/Precuneus and is associated with the allocation of infants' attention.

5.
Past research has found evidence for face and emotional expression processing differences between individuals with Asperger's syndrome (AS) and neurotypical (NT) controls at both the neurological and behavioural levels. The aim of the present study was to examine the neurophysiological basis of emotional expression processing in children and adults with AS relative to age- and gender-matched NT controls. High-density event-related potentials were recorded during explicit processing of happy, sad, angry, scared, and neutral faces. Adults with AS were found to exhibit delayed P1 and N170 latencies and smaller N170 amplitudes in comparison to control subjects for all expressions. This may reflect impaired holistic and configural processing of faces in AS adults. However, these differences were not observed between AS and control children. This may result from incomplete development of the neuronal generators of these ERP components and/or early intervention.

6.
Threatening facial expressions can signal the approach of someone or something potentially dangerous. Past research has established that adults have an attentional bias for angry faces, visually detecting their presence more quickly than happy or neutral faces. Two new findings are reported here. First, evidence is presented that young children share this attentional bias. In five experiments, young children and adults were asked to find a picture of a target face among an array of eight distracter faces. Both age groups detected threat‐relevant faces – angry and frightened – more rapidly than non‐threat‐relevant faces (happy and sad). Second, evidence is presented that both adults and children have an attentional bias for negative stimuli overall. All negative faces were detected more quickly than positive ones in both age groups. As the first evidence that young children exhibit the same superior detection of threatening facial expressions as adults, this research provides important support for the existence of an evolved attentional bias for threatening stimuli.

7.
We investigated the influence of happy and angry expressions on memory for new faces. Participants were presented with happy and angry faces in an intentional or incidental learning condition and were later asked to recognise the same faces displaying a neutral expression. They also had to remember what the initial expressions of the faces had been. Remember/know/guess judgements were made both for identity and expression memory. Results showed that faces were better recognised when presented with a happy rather than an angry expression, but only when learning was intentional. This was mainly due to an increase of the "remember" responses for happy faces when encoding was intentional rather than incidental. In contrast, memory for emotional expressions was not different for happy and angry faces whatever the encoding conditions. We interpret these findings according to the social meaning of emotional expressions for the self.

8.
Recent research suggests that emotion effects in word processing resemble those in other stimulus domains such as pictures or faces. The present study aims to provide more direct evidence for this notion by comparing emotion effects in word and face processing in a within-subject design. Event-related brain potentials (ERPs) were recorded as participants made decisions on the lexicality of emotionally positive, negative, and neutral German verbs or pseudowords, and on the integrity of intact happy, angry, and neutral faces or slightly distorted faces. Relative to neutral and negative stimuli both positive verbs and happy faces elicited posterior ERP negativities that were indistinguishable in scalp distribution and resembled the early posterior negativities reported by others. Importantly, these ERP modulations appeared at very different latencies. Therefore, it appears that similar brain systems reflect the decoding of both biological and symbolic emotional signals of positive valence, differing mainly in the speed of meaning access, which is more direct and faster for facial expressions than for words.

9.
Typical adults mimic facial expressions within 1000 ms, but adults with autism spectrum disorder (ASD) do not. These rapid facial reactions (RFRs) are associated with the development of social-emotional abilities. Such interpersonal matching may be caused by motor mirroring or emotional responses. Using facial electromyography (EMG), this study evaluated mechanisms underlying RFRs during childhood and examined possible impairment in children with ASD. Experiment 1 found RFRs to happy and angry faces (not fear faces) in 15 typically developing children from 7 to 12 years of age. RFRs of fear (not anger) in response to angry faces indicated an emotional mechanism. In 11 children (8-13 years of age) with ASD, Experiment 2 found undifferentiated RFRs to fear expressions and no consistent RFRs to happy or angry faces. However, as children with ASD aged, matching RFRs to happy faces increased significantly, suggesting the development of processes underlying matching RFRs during this period in ASD.

10.
Human attention is selective, focusing on some aspects of events at the expense of others. In particular, angry faces engage attention. Most studies have used pictures of young faces, even when comparing young and older age groups. Two experiments asked (1) whether task-irrelevant faces of young and older individuals with happy, angry, and neutral expressions disrupt performance on a face-unrelated task, (2) whether interference varies for faces of different ages and different facial expressions, and (3) whether young and older adults differ in this regard. Participants gave speeded responses on a number task while irrelevant faces appeared in the background. Both age groups were more distracted by own than other-age faces. In addition, young participants' responses were slower for angry than happy faces, whereas older participants' responses were slower for happy than angry faces. Factors underlying age-group differences in interference from emotional faces of different ages are discussed.

11.
The ability to rapidly detect facial expressions of anger and threat over other salient expressions has adaptive value across the lifespan. Although studies have demonstrated this threat superiority effect in adults, surprisingly little research has examined the development of this process over the childhood period. In this study, we examined the efficiency of children's facial processing in visual search tasks. In Experiment 1, children (N=49) aged 8 to 11 years were faster and more accurate in detecting angry target faces embedded in neutral backgrounds than vice versa, and they were slower in detecting the absence of a discrepant face among angry than among neutral faces. This search pattern was unaffected by an increase in matrix size. Faster detection of angry than neutral deviants may reflect that angry faces stand out more among neutral faces than vice versa, or that detection of neutral faces is slowed by the presence of surrounding angry distracters. When keeping the background constant in Experiment 2, children (N=35) aged 8 to 11 years were faster and more accurate in detecting angry than sad or happy target faces among neutral background faces. Moreover, children with higher levels of anxiety were quicker to find both angry and sad faces whereas low anxious children showed an advantage for angry faces only. Results suggest a threat superiority effect in processing facial expressions in young children as in adults and that increased sensitivity for negative faces may be characteristic of children with anxiety problems.
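Search efficiency of the kind examined here (e.g., whether performance changes with matrix size) is conventionally summarised as the slope of mean reaction time against set size, in ms per item. Below is a minimal sketch of that computation; the set sizes and RTs are made-up illustrative numbers, not data from the study.

```python
# Minimal sketch: visual-search efficiency as the RT x set-size slope
# (ms per item), via ordinary least squares. All numbers are illustrative
# assumptions, not values reported in the study.

def search_slope(set_sizes, mean_rts):
    """Least-squares slope of mean RT (ms) against display set size."""
    n = len(set_sizes)
    mx = sum(set_sizes) / n
    my = sum(mean_rts) / n
    num = sum((x - mx) * (y - my) for x, y in zip(set_sizes, mean_rts))
    den = sum((x - mx) ** 2 for x in set_sizes)
    return num / den

# A near-flat slope (close to 0 ms/item) suggests parallel "pop-out" search;
# a steeper slope suggests serial, item-by-item scanning.
angry_slope = search_slope([4, 9, 16], [620.0, 628.0, 641.0])
neutral_slope = search_slope([4, 9, 16], [700.0, 810.0, 955.0])
```

A threat superiority effect would appear as a flatter slope for angry targets than for neutral or happy targets.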

12.
Using a visual search paradigm, we investigated how age affected attentional bias to emotional facial expressions. In Experiments 1 and 2, participants searched for a discrepant facial expression in a matrix of otherwise homogeneous faces. Both younger and older adults showed a more effective search when the discrepant face was angry rather than happy or neutral. However, when the angry faces served as non-target distractors, younger adults' search was less effective than happy or neutral distractor conditions. In contrast, older adults showed a more efficient search with angry distractors than happy or neutral distractors, indicating that older adults were better able to inhibit angry facial expressions. In Experiment 3, we found that even a top-down search goal could not override the angry face superiority effect in guiding attention. In addition, RT distribution analyses supported that both younger and older adults performed the top-down angry face search qualitatively differently from the top-down happy face search. The current research indicates that threat face processing involves automatic attentional shift and a controlled attentional process. The current results suggest that age only influenced the controlled attentional process.

13.
Emotional facial expressions are powerful social cues. Here we investigated how emotional expression affects the interpretation of eye gaze direction. Fifty-two observers judged where faces were looking by moving a slider on a measuring bar to the respective position. The faces displayed either an angry, happy, fearful or a neutral expression and were looking either straight at the observer, or were rotated 2°, 4°, 6° or 8° to the left and right. We found that happy faces were interpreted as directed closer to the observer, while fearful and angry faces were interpreted as directed further away. Judgments were most accurate for neutral faces, followed by happy, angry and fearful faces. These findings are discussed on the background of the “self-referential positivity bias”, suggesting that happy faces are preferably interpreted as directed towards the self while negative emotions are interpreted as directed further away.

14.
Using the item-method directed forgetting paradigm (i.e. intentionally forgetting specified information), we examined directed forgetting of facial identity as a function of facial expression and the sex of the expresser and perceiver. Participants were presented with happy and angry male and female faces cued for either forgetting or remembering, and were then asked to recognise previously studied faces from among a series of neutral faces. For each recognised test face, participants also recalled the face’s previously displayed emotional expression. We found that angry faces were more resistant to forgetting than were happy faces. Furthermore, angry expressions on male faces and happy expressions on female faces were recognised and recalled better than vice versa. Signal detection analyses revealed that male faces gave rise to a greater sensitivity than female faces did, and male participants, but not female participants, showed greater sensitivity to male faces than to female faces. Several theoretical implications are discussed.
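The sensitivity measure referred to in such signal detection analyses is typically d', the difference between the z-transformed hit and false-alarm rates. A minimal stdlib-only sketch follows; the rates and the 1/(2N) extreme-rate correction are illustrative assumptions, not values or procedures reported in the study.

```python
# Minimal sketch: recognition sensitivity d' from hit and false-alarm
# rates, as in a standard signal detection analysis. The rates below are
# illustrative assumptions, not data from the study.
from statistics import NormalDist

def d_prime(hit_rate, fa_rate, n_trials=None):
    """d' = z(hit rate) - z(false-alarm rate).

    If n_trials is given, clamp rates into [1/(2N), 1 - 1/(2N)] -- a common
    correction (assumed here) so that rates of exactly 0 or 1 stay finite.
    """
    if n_trials:
        lo, hi = 1 / (2 * n_trials), 1 - 1 / (2 * n_trials)
        hit_rate = min(max(hit_rate, lo), hi)
        fa_rate = min(max(fa_rate, lo), hi)
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(fa_rate)

# Hypothetical rates: higher d' for angry than happy faces would mirror
# the reported resistance of angry faces to forgetting.
d_angry = d_prime(0.85, 0.15)
d_happy = d_prime(0.70, 0.25)
```

Because d' separates sensitivity from response bias, it is the appropriate statistic when groups might differ in their overall willingness to respond "old".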

15.
Social phobia has been associated with an attentional bias for angry faces. This study aimed at further characterising this attentional bias by investigating reaction times, heart rates, and ERPs while social phobics, spider phobics, and controls identified either the colour or the emotional quality of angry, happy, or neutral schematic faces. The emotional expression of angry faces did not interfere with the processing of their colour in social phobics, and heart rate, N170 amplitude and parietal late positive potentials (LPPs) of these subjects were also no different from those of non-phobic subjects. However, social phobics showed generally larger P1 amplitudes than non-phobic controls with spider phobic subjects in between. No general threat advantage for angry faces was found. All groups identified neutral schematic faces faster and showed larger late positive amplitudes to neutral than to emotional faces. Furthermore, in all groups the N170 was modulated by the emotional quality of faces. This effect was most pronounced in the emotion identification task.

16.
Interpersonal theories suggest that depressed individuals are sensitive to signs of interpersonal rejection, such as angry facial expressions. The present study examined memory bias for happy, sad, angry, and neutral facial expressions in stably dysphoric and stably nondysphoric young adults. Participants' gaze behavior (i.e., fixation duration, number of fixations, and distance between fixations) while viewing these facial expressions was also assessed. Using signal detection analyses, the dysphoric group had better accuracy on a surprise recognition task for angry faces than the nondysphoric group. Further, mediation analyses indicated that greater breadth of attentional focus (i.e., distance between fixations) accounted for enhanced recall of angry faces among the dysphoric group. There were no differences between dysphoria groups in gaze behavior or memory for sad, happy, or neutral facial expressions. Findings from this study identify a specific cognitive mechanism (i.e., breadth of attentional focus) that accounts for biased recall of angry facial expressions in dysphoria. This work also highlights the potential for integrating cognitive and interpersonal theories of depression.

17.
We establish attentional capture by emotional distractor faces presented as a "singleton" in a search task in which the emotion is entirely irrelevant. Participants searched for a male (or female) target face among female (or male) faces and indicated whether the target face was tilted to the left or right. The presence (vs. absence) of an irrelevant emotional singleton expression (fearful, angry, or happy) in one of the distractor faces slowed search reaction times compared to the singleton absent or singleton target conditions. Facilitation for emotional singleton targets was found for the happy expression but not for the fearful or angry expressions. These effects were found irrespective of face gender and the failure of a singleton neutral face to capture attention among emotional faces rules out a visual odd-one-out account for the emotional capture. The present study thus establishes irrelevant, emotional, attentional capture.

18.
Two experiments are reviewed that demonstrate effects of brain laterality on human classical conditioning. Pictures of facial emotional expressions were used as conditioned stimuli (CSs) together with shock as unconditioned stimulus (UCS). Bilateral electrodermal responses were recorded as dependent measures. In the first experiment, one group was conditioned to an angry face, and one group to a happy face. During extinction, the face-CSs were presented to the right hemisphere on half of the trials and to the left hemisphere on the other half of the trials. Results showed that the right hemisphere was superior in showing persisting effects of learning, and especially to the angry CS+. In the second experiment, lateralized presentations of the angry and happy faces were made during acquisition, with foveal presentations during extinction. Once again, the angry face elicited greater skin conductance responses (SCRs) during extinction in the group that had this stimulus presented to the right hemisphere during acquisition. It is concluded that emotional conditioning is differentially regulated by the two hemispheres of the brain.

19.
Threatening facial expressions like anger can signal potential danger. Past research has established that both adults and children have an attentional bias for angry faces, visually detecting their presence more quickly than happy or neutral faces. More recent research has suggested that specific features of angry faces (such as the downward-pointing “V” shaped brow) are the effective stimulus in their rapid detection. However, research examining this issue has only been done with adults. In the current research, we examine the detection of the features of the downward-pointing “V” in both adults and preschool children using a touchscreen visual search procedure. In two experiments, both adults and children detected the downward-pointing “V” more quickly than an upward-pointing “V”. As the first evidence that young children exhibit the same superior detection of the features of threatening facial expressions that adults do, this research provides important support for the existence of an evolved attentional bias for threatening stimuli.

20.
In the present experiment, sex differences in hemispheric asymmetry during classical conditioning to emotional stimuli are reported. 125 subjects (62 females and 63 males) were shown a slide of a happy face in the right (or left) visual half field (VHF), and simultaneously a slide of an angry face in the left (or right) VHF. Eight groups were formed by the combination of male and female subjects; left and right VHF positions of the angry/happy faces; and the administration/omission of the shock unconditioned stimulus (UCS). Dependent measures were skin conductance responses recorded from both hands. The results during extinction showed a significantly larger SCR magnitude for the shock compared to the no-shock groups only for the female subjects. CS position during conditioning was also important in revealing differential responding to either the happy or angry faces. A right hemisphere effect was found for the angry face CS for both the male and female subjects, although with a greater difference for the females.
