Similar Articles
1.
We investigated how age of faces and emotion expressed in faces affect young (n=30) and older (n=20) adults' visual inspection while viewing faces and judging their expressions. Overall, expression identification was better for young than older faces, suggesting that interpreting expressions in young faces is easier than in older faces, even for older participants. Moreover, there were age-group differences in misattributions of expressions, in that young participants were more likely to label disgusted faces as angry, whereas older adults were more likely to label angry faces as disgusted. In addition to effects of emotion expressed in faces, age of faces affected visual inspection of faces: Both young and older participants spent more time looking at own-age than other-age faces, with longer looking at own-age faces predicting better own-age expression identification. Thus, cues used in expression identification may shift as a function of emotion and age of faces, in interaction with age of participants.

2.
We systematically examined the impact of emotional stimuli on time perception in a temporal reproduction paradigm where participants reproduced the duration of a facial emotion stimulus using an oval-shaped stimulus, or vice versa. Experiment 1 asked participants to reproduce the duration of an angry face (or the oval) presented for 2,000 ms. Experiment 2 included a range of emotional expressions (happy, sad, angry, and neutral faces as well as the oval stimulus) presented for different durations (500, 1,500, and 2,000 ms). We found that participants over-reproduced the durations of happy and sad faces using the oval stimulus. By contrast, there was a trend of under-reproduction when the duration of the oval stimulus was reproduced using the angry face. We suggest that increased attention to a facial emotion produces the relativity of time perception.

3.
1. Introduction. Correctly identifying other people's facial expressions of emotions is important to human social interaction in all societies. Many studies suggest that the identification of facial expressions in particular, and the perceptual processing of emotional information in general, is carried out mainly by the right hemisphere of the brain [1–7]. Damage to the right hemisphere generally produces more significant impairment in recognition of all facial expressions of emotion than damage to the left hemisphere…

4.
Sex differences in face recognition and influence of facial affect   (total citations: 1; self-citations: 0; citations by others: 1)
To study sex differences in the recognition of human faces with different facial expressions, 65 female and 64 male participants learned to associate names with various male and female neutral faces. During the recall phase, participants were then asked to name the same persons depicting different emotional expressions (neutral, happy, angry, and fearful). Females were faster than males at naming male faces, and males were faster than females at naming female faces. All participants were faster at naming neutral or happy female faces than neutral or happy male faces. These results suggest that opposite-sex faces require less processing time than same-sex faces, which is consistent with an evolutionary account.

6.
The current study tested whether the perception of angry faces is cross-culturally privileged over that of happy faces, by comparing perception of the offset of emotion in a dynamic flow of expressions. Thirty Chinese and 30 European-American participants saw movies that morphed an anger expression into a happy expression of the same stimulus person, or vice versa. Participants were asked to stop the movie at the point where they ceased seeing the initial emotion. As expected, participants cross-culturally continued to perceive anger longer than happiness. Moreover, anger was perceived longer in in-group than in out-group faces. The effects were driven by female rather than male targets. Results are discussed with reference to the important role of context in emotion perception.

8.
Facial information and attention to facial displays are distributed over spatial as well as temporal domains. Thus far, research on selective attention to (dis)approving faces in the context of social anxiety has concentrated primarily on the spatial domain. Using a rapid serial visual presentation (RSVP) paradigm, the present study examined the temporal characteristics of visual attention for happy and angry faces in high- (n=16) and low-socially anxious individuals (n=17), to test whether socially anxious individuals are also characterized by threat-confirming attentional biases in the temporal domain. Results indicated that presenting angry faces as the first target (T1) did not impair the detection of the emotional expression of the second target (T2). Yet, participants generally showed superior detection of the emotional expression of T2 if T2 was an angry face. Casting doubt on the role of such an attenuated attentional blink for angry faces in social anxiety, no evidence emerged to indicate that this effect was relatively strong in high-socially anxious individuals. Finally, the presentation of an angry face as T2 resulted in relatively hampered identification of a happy T1. Again, this "backward blink" was not especially pronounced in high-socially anxious individuals. The present anger superiority effects are consistent with evolutionary models stressing the importance of being especially vigilant for signals of dominance. Since the effects were not especially pronounced in high-anxious individuals, the present study adds to previous findings indicating that socially anxious individuals are not characterized by a bias in the (explicit) detection of emotional expressions [Philippot, P., & Douilliez, C. (2005). Social phobics do not misinterpret facial expression of emotion. Behaviour Research and Therapy, 43, 639-652].

9.
This study investigated the effects of emotional response on an inhibitory task, the Stroop‐like day‐night task, in which participants are presented with two pictures and are requested to inhibit naming what the card shown to them represents and instead state what the other card represents. Specifically, 35 4‐ to 6‐year‐old children and 15 young adults were administered the emotion‐related happy‐sad task and the emotion‐unrelated up‐down task using the same stimulus set (happy and sad cartoon faces). The results suggested that vulnerability to errors in the happy‐sad task was not derived from increased inhibitory demand, but that the happy‐sad task is more inhibitory‐demanding in terms of response time. These results suggested that the happy‐sad task elicits interference more than other variants of this task, not because the task involves emotional stimuli per se but because the task involves both emotional stimuli and emotional responses.

10.
The present study investigated whether facial expressions modulate visual attention in 7-month-old infants. First, infants' looking duration to individually presented fearful, happy, and novel facial expressions was compared to looking duration to a control stimulus (scrambled face). The face with a novel expression was included to examine the hypothesis that the earlier findings of greater allocation of attention to fearful as compared to happy faces could be due to the novelty of fearful faces in infants' rearing environment. The infants looked longer at the fearful face than at the control stimulus, whereas no such difference was found between the other expressions and the control stimulus. Second, a gap/overlap paradigm was used to determine whether facial expressions affect the infants' ability to disengage their fixation from a centrally presented face and shift attention to a peripheral target. It was found that infants disengaged their fixation significantly less frequently from fearful faces than from control stimuli and happy faces. Novel facial expressions did not have a similar effect on attention disengagement. Thus, it seems that adult-like modulation of the disengagement of attention by threat-related stimuli can be observed early in life, and that the influence of emotionally salient (fearful) faces on visual attention is not simply attributable to the novelty of these expressions in infants' rearing environment.

11.
In this paper, the role of self-reported anxiety and degree of conscious awareness as determinants of the selective processing of affective facial expressions is investigated. In two experiments, an attentional bias toward fearful facial expressions was observed, although this bias was apparent only for those reporting high levels of trait anxiety and only when the emotional face was presented in the left visual field. This pattern was especially strong when the participants were unaware of the presence of the facial stimuli. In Experiment 3, a patient with right-hemisphere brain damage and visual extinction was presented with photographs of faces and fruits on unilateral and bilateral trials. On bilateral trials, it was found that faces produced less extinction than did fruits. Moreover, faces portraying a fearful or a happy expression tended to produce less extinction than did neutral expressions. This suggests that emotional facial expressions may be less dependent on attention to achieve awareness. The implications of these results for understanding the relations between attention, emotion, and anxiety are discussed.

12.
The role of holistic or parts-based processing in face identification has been explored mostly with neutral faces. In the current study, we investigated the nature of processing (holistic vs. parts) in recognition memory for faces with emotional expressions. The experiment comprised two phases: a learning phase and a test phase. In the learning phase, participants learned face–name associations of happy, neutral, and sad faces. The test phase consisted of a two-choice recognition test (whole face, eyes, or mouth) given either immediately or after a 24-hour delay. Results indicate that emotional faces were remembered better than neutral faces and performance was better with whole faces as compared to isolated parts. The performance in immediate and delayed recognition interacted with emotional information. Sad eyes and happy mouths were remembered better in the delayed recognition condition. These results suggest that in addition to holistic processing, specific parts–emotion combinations play a critical role in delayed recognition memory.

13.
The present study was designed to examine the operation of depression-specific biases in the identification or labeling of facial expression of emotions. Participants diagnosed with major depression or social phobia and control participants were presented with faces that expressed increasing degrees of emotional intensity, slowly changing from a neutral to a full-intensity happy, sad, or angry expression. The authors assessed individual differences in the intensity of facial expression of emotion that was required for the participants to accurately identify the emotion being expressed. The depressed participants required significantly greater intensity of emotion than did the social phobic and the control participants to correctly identify happy expressions, and less intensity to identify sad than angry expressions. In contrast, social phobic participants needed less intensity to correctly identify the angry expressions than did the depressed and control participants, and less intensity to identify angry than sad expressions. Implications of these results for interpersonal functioning in depression and social phobia are discussed.

14.
We used the remember-know procedure (Tulving, 1985) to test the behavioural expression of memory following indirect and direct forms of emotional processing at encoding. Participants (N=32) viewed a series of facial expressions (happy, fearful, angry, and neutral) while performing tasks involving either indirect (gender discrimination) or direct (emotion discrimination) emotion processing. After a delay, participants completed a surprise recognition memory test. Our results revealed that indirect encoding of emotion produced enhanced memory for fearful faces whereas direct encoding of emotion produced enhanced memory for angry faces. In contrast, happy faces were better remembered than neutral faces after both indirect and direct encoding tasks. These findings suggest that fearful and angry faces benefit from a recollective advantage when they are encoded in a way that is consistent with the predictive nature of their threat. We propose that the broad memory advantage for happy faces may reflect a form of cognitive flexibility that is specific to positive emotions.

15.
Recent research suggests that emotion effects in word processing resemble those in other stimulus domains such as pictures or faces. The present study aims to provide more direct evidence for this notion by comparing emotion effects in word and face processing in a within-subject design. Event-related brain potentials (ERPs) were recorded as participants made decisions on the lexicality of emotionally positive, negative, and neutral German verbs or pseudowords, and on the integrity of intact happy, angry, and neutral faces or slightly distorted faces. Relative to neutral and negative stimuli both positive verbs and happy faces elicited posterior ERP negativities that were indistinguishable in scalp distribution and resembled the early posterior negativities reported by others. Importantly, these ERP modulations appeared at very different latencies. Therefore, it appears that similar brain systems reflect the decoding of both biological and symbolic emotional signals of positive valence, differing mainly in the speed of meaning access, which is more direct and faster for facial expressions than for words.

16.
Sixteen clinically depressed patients and sixteen healthy controls were presented with a set of emotional facial expressions and were asked to identify the emotion portrayed by each face. They were subsequently given a recognition memory test for these faces. There was no difference between the groups in terms of their ability to identify emotion from faces. All participants identified emotional expressions more accurately than neutral expressions, with happy expressions being identified most accurately. During the recognition memory phase the depressed patients demonstrated superior memory for sad expressions, and inferior memory for happy expressions, relative to neutral expressions. Conversely, the controls demonstrated superior memory for happy expressions, and inferior memory for sad expressions, relative to neutral expressions. These results are discussed in terms of the cognitive model of depression proposed by Williams, Watts, MacLeod, and Mathews (1997).

17.
We investigated whether and how emotional facial expressions affect sustained attention in face tracking. In a multiple-identity and object tracking paradigm, participants tracked multiple target faces that continuously moved around together with several distractor faces, and subsequently reported where each target face had moved to. The emotional expression (angry, happy, and neutral) of the target and distractor faces was manipulated. Tracking performance was better when the target faces were angry rather than neutral, whereas angry distractor faces did not affect tracking. The effect persisted when the angry faces were presented upside-down and when surface features of the faces were irrelevant to the ongoing task. There was only suggestive and weak evidence for a facilitatory effect of happy targets and a distraction effect of happy distractors in comparison to neutral faces. The results show that angry expressions on the target faces can facilitate sustained attention on the targets via increased vigilance, yet this effect likely depends on both emotional information and visual features of the angry faces.

18.
The attentional blink (AB) is the impaired ability to detect a second target (T2) when it follows shortly after the first (T1) among distractors in a rapid serial visual presentation (RSVP). Given questions about the automaticity of age differences in emotion processing, the current study examined whether emotion cues differentially impact the AB elicited in older and younger adults. Twenty-two younger (18–22 years) and 22 older adult participants (62–78 years) reported on the emotional content of target face stimulus pairs embedded in a RSVP of scrambled-face distractor images. Target pairs included photo-realistic faces of angry, happy, and neutral expressions. The order of emotional and neutral stimuli as T1 or T2 and the degree of temporal separation within the RSVP systematically varied. Target detection accuracy was used to operationalise the AB. Although older adults displayed a larger AB than younger adults, no age differences emerged in the impact of emotion on the AB. Angry T1 faces increased the AB of both age groups. Neither emotional T2 attenuated the AB. Negative facial expressions held the attention of younger and older adults in a comparable manner, exacerbating the AB and supporting a negativity bias instead of a positivity effect in older adults.

19.
We examined hemispheric specialization in a lateralized Stroop facial identification task. A 2 (presentation side: left or right visual field [LVF or RVF]) × 2 (picture emotion: happy or angry) × 3 (emotion of distractor word: happy, angry, or blank) factorial design placed the right hemispheric specialization for emotional expression processing and the left hemispheric specialization for verbal processing in conflict. Faces (from ) and emotion words were briefly displayed, and participants responded with keypresses corresponding to the picture emotion. As predicted, greater Stroop interference in identification accuracy was found with incongruent displays of facial expression in the LVF and emotion words in the RVF, and females exhibited less Stroop interference. Reaction times were moderated by emotion and visual field.

20.
There is considerable evidence indicating that people are primed to monitor social signals of disapproval. Thus far, studies on selective attention have concentrated predominantly on the spatial domain, whereas the temporal consequences of identifying socially threatening information have received only scant attention. Therefore, this study focused on temporal attention costs and examined how the presentation of emotional expressions affects subsequent identification of task-relevant information. High (n = 30) and low (n = 31) socially anxious women were exposed to a dual-target rapid serial visual presentation (RSVP) paradigm. Emotional faces (neutral, happy, angry) were presented as the first target (T1) and neutral letter stimuli (p, q, d, b) as the second target (T2). Irrespective of social anxiety, the attentional blink was relatively large when angry faces were presented as T1. This apparent prioritized processing of angry faces is consistent with evolutionary models, stressing the importance of being especially attentive to potential signals of social threat.
