Similar Articles (20 results)
1.
This paper reports three studies in which stronger orienting to perceived eye gaze direction was revealed when observers viewed faces showing fearful or angry, compared with happy or neutral, emotional expressions. Gaze-related spatial cueing effects to laterally presented fearful faces and centrally presented angry faces were also modulated by the anxiety level of participants, with high- but not low-state anxious individuals revealing enhanced shifts of attention. In contrast, both high- and low-state anxious individuals demonstrated enhanced orienting to averted gaze when viewing laterally presented angry faces. These results provide novel evidence for the rapid integration of facial expression and gaze direction information, and for the regulation of gaze-cued attention by both the emotion conveyed in the perceived face and the degree of anxiety experienced by the observer.

2.
S. Bentin and L. Y. Deouell (2000) have suggested that face recognition is achieved through a special-purpose neural mechanism, and its existence can be identified by a specific event-related potential (ERP) correlate, the N170 effect. In the present study, the authors explored the structural significance of N170 by comparing normal vs. morphed stimuli. They used a morphing procedure that allows a fine modification of some perceptual details (first-order relations). The authors also aimed to verify the independence of face identification from other cognitive mechanisms, such as comprehension of emotional facial expressions, by applying an emotion-by-emotion analysis to examine the emotional effect on N170 ERP variation. They analyzed the peak amplitude and latency variables in the temporal window of 120-180 ms. The ERP correlate showed a classic N170 ERP effect, more negative and more posteriorly distributed for morphed faces compared with normal faces. In addition, they found a lateralization effect, with a greater right-side distribution of the N170, but not directly correlated to the morphed or normal conditions. Two cognitive codes, structural and expression, are discussed, and the results are compared with the multilevel model proposed by V. Bruce and A. W. Young (1986, 1998).

3.
According to the sociometer hypothesis, individuals with low self-esteem experience increased negative affect in response to negative social stimuli, even when these stimuli are not perceived consciously. Using an affective priming paradigm, the present study examined whether trait self-esteem would moderate mood following briefly presented facial expressions. Results from 43 undergraduates revealed that, after controlling for baseline mood, anxiety and depression, the degree of negative affect experienced by the participants following exposure to expressions of anger and disgust varied as a function of their self-esteem. Implications for individuals with low self-esteem and our understanding of the link between self-esteem and negative affect are discussed.

4.
Selective attention to emotional faces following recovery from depression
This study was designed to examine attentional biases in the processing of emotional faces in currently and formerly depressed participants and healthy controls. Using a dot-probe task, the authors presented faces expressing happy or sad emotions paired with emotionally neutral faces. Whereas both currently and formerly depressed participants selectively attended to the sad faces, the control participants selectively avoided the sad faces and oriented toward the happy faces, a positive bias that was not observed for either of the depressed groups. These results indicate that attentional biases in the processing of emotional faces are evident even after individuals have recovered from a depressive episode. Implications of these findings for understanding the roles of cognitive and interpersonal functioning in depression are discussed.

5.
We investigated the perception of emotional stimuli in anxious individuals and non-anxious cohorts. Signal detection theory analysis was applied to the discrimination of emotionally charged faces at several points along a continuum of emotional intensity. This design permitted the derivation of multiple measures of sensitivity and response bias for fearful and for happy faces. Anxious individuals lacked a conservative bias in judging fearful stimuli and a liberal bias in judging positive stimuli compared with non-anxious individuals. In addition, anxious participants had lower perceptual sensitivity (d′) than non-anxious participants for mildly threatening stimuli, as well as a trend towards lower perceptual sensitivity for moderately positive stimuli. These results suggest that the processing of threat information in anxiety is affected by sensitivity and bias differently at different levels of affective intensity.
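The sensitivity (d′) and response-bias measures mentioned in this abstract come from standard signal detection theory, computed from hit and false-alarm rates. The sketch below is illustrative only, not the authors' analysis code; the function name, the trial counts, and the choice of a log-linear correction are all assumptions.

```python
from statistics import NormalDist

def sdt_measures(hits, misses, false_alarms, correct_rejections):
    """Compute sensitivity (d') and response bias (criterion c) from raw
    trial counts. A log-linear correction (adding 0.5 to each cell) avoids
    infinite z-scores when a rate is exactly 0 or 1."""
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))  # c > 0: conservative bias
    return d_prime, criterion

# Hypothetical counts for one observer judging "fearful vs. not fearful"
# at a single intensity level (numbers are made up for illustration).
d, c = sdt_measures(hits=40, misses=10, false_alarms=15, correct_rejections=35)
```

A negative criterion here would correspond to a liberal bias (a tendency to report the target emotion), a positive one to the conservative bias the abstract describes.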

6.
7.
The study aimed to determine if the memory bias for negative faces previously demonstrated in depression and dysphoria generalises from long- to short-term memory. A total of 29 dysphoric (DP) and 22 non-dysphoric (ND) participants were presented with a series of faces and asked to identify the emotion portrayed (happiness, sadness, anger, or neutral affect). Following a delay, four faces were presented (the original plus three distractors) and participants were asked to identify the target face. Half of the trials assessed memory for facial emotion, and the remaining trials examined memory for facial identity. At encoding, no group differences were apparent. At memory testing, relative to ND participants, DP participants exhibited impaired memory for all types of facial emotion and for facial identity when the faces featured happiness, anger, or neutral affect, but not sadness. DP participants exhibited impaired identity memory for happy faces relative to angry, sad, and neutral, whereas ND participants exhibited enhanced facial identity memory when faces were angry. In general, memory for faces was not related to performance at encoding. However, in DP participants only, memory for sad faces was related to sadness recognition at encoding. The results suggest that the negative memory bias for faces in dysphoria does not generalise from long- to short-term memory.

8.
Poor decision making during adolescence occurs most frequently when situations are emotionally charged. However, relatively few studies have measured the development of cognitive control in response to emotional stimuli in this population. This study used both affective (emotional faces) and non-affective (letter) stimuli in two different flanker tasks to assess the ability to ignore task-irrelevant but distracting information, in 25 adults and 25 adolescents. On the non-emotional (letter) flanker task, the presence of incongruent flanking letters increased the number of errors, and also slowed participants’ ability to identify a central letter. Adolescents committed more errors than adults, but there were no age-related differences for the reaction time interference effect in the letter condition. Post-hoc testing revealed that age-related differences on the task were driven by the younger adolescents (11-14 years); adults and older adolescents (15-17 years) were equally accurate in the letter condition. In contrast, on the emotional face flanker task, not only were adolescents less accurate than adults but they were also more distracted by task-irrelevant fearful faces as evidenced by greater reaction time interference effects. Our findings suggest that the ability to self-regulate in adolescents, as evidenced by the ability to suppress irrelevant information on a flanker task, is more difficult when stimuli are affective in nature. The ability to ignore irrelevant flankers appears to mature earlier for non-affective stimuli than for affective stimuli.
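The "reaction time interference effect" in a flanker task is conventionally the difference between mean RT on incongruent trials and mean RT on congruent trials. A minimal sketch, with the function name and RT values invented for illustration (they are not data from the study):

```python
def mean(xs):
    return sum(xs) / len(xs)

def flanker_interference(rt_incongruent_ms, rt_congruent_ms):
    """RT interference effect: mean incongruent RT minus mean congruent RT.
    Larger values indicate more distraction by task-irrelevant flankers."""
    return mean(rt_incongruent_ms) - mean(rt_congruent_ms)

# Hypothetical single-participant RTs in milliseconds.
interference = flanker_interference([612, 598, 640], [548, 560, 571])  # 57.0 ms
```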

9.
The goal of this review is to critically examine contradictory findings in the study of visual search for emotionally expressive faces. Several key issues are addressed: Can emotional faces be processed preattentively and guide attention? What properties of these faces influence search efficiency? Is search moderated by the emotional state of the observer? The authors argue that the evidence is consistent with claims that (a) preattentive search processes are sensitive to and influenced by facial expressions of emotion, (b) attention guidance is influenced by a dynamic interplay of emotional and perceptual factors, and (c) visual search for emotional faces is influenced by the emotional state of the observer to some extent. The authors also argue that the way in which contextual factors interact to determine search performance needs to be explored further to draw sound conclusions about the precise influence of emotional expressions on search efficiency. Methodological considerations (e.g., set size, distractor background, task set) and ecological limitations of the visual search task are discussed. Finally, specific recommendations are made for future research directions.

10.
Aging and attentional biases for emotional faces
We examined age differences in attention to and memory for faces expressing sadness, anger, and happiness. Participants saw a pair of faces, one emotional and one neutral, and then a dot probe that appeared in the location of one of the faces. In two experiments, older adults responded faster to the dot if it was presented on the same side as a neutral face than if it was presented on the same side as a negative face. Younger adults did not exhibit this attentional bias. Interactions of age and valence were also found for memory for the faces, with older adults remembering positive better than negative faces. These findings reveal that in their initial attention, older adults avoid negative information. This attentional bias is consistent with older adults' generally better emotional well-being and their tendency to remember negative less well than positive information.

11.
The ability to rapidly detect facial expressions of anger and threat over other salient expressions has adaptive value across the lifespan. Although studies have demonstrated this threat superiority effect in adults, surprisingly little research has examined the development of this process over the childhood period. In this study, we examined the efficiency of children's facial processing in visual search tasks. In Experiment 1, children (N=49) aged 8 to 11 years were faster and more accurate in detecting angry target faces embedded in neutral backgrounds than vice versa, and they were slower in detecting the absence of a discrepant face among angry than among neutral faces. This search pattern was unaffected by an increase in matrix size. Faster detection of angry than neutral deviants may reflect that angry faces stand out more among neutral faces than vice versa, or that detection of neutral faces is slowed by the presence of surrounding angry distracters. When keeping the background constant in Experiment 2, children (N=35) aged 8 to 11 years were faster and more accurate in detecting angry than sad or happy target faces among neutral background faces. Moreover, children with higher levels of anxiety were quicker to find both angry and sad faces whereas low anxious children showed an advantage for angry faces only. Results suggest a threat superiority effect in processing facial expressions in young children as in adults and that increased sensitivity for negative faces may be characteristic of children with anxiety problems.

12.
Using the item-method directed forgetting paradigm (i.e. intentionally forgetting specified information), we examined directed forgetting of facial identity as a function of facial expression and the sex of the expresser and perceiver. Participants were presented with happy and angry male and female faces cued for either forgetting or remembering, and were then asked to recognise previously studied faces from among a series of neutral faces. For each recognised test face, participants also recalled the face’s previously displayed emotional expression. We found that angry faces were more resistant to forgetting than were happy faces. Furthermore, angry expressions on male faces and happy expressions on female faces were recognised and recalled better than vice versa. Signal detection analyses revealed that male faces gave rise to a greater sensitivity than female faces did, and male participants, but not female participants, showed greater sensitivity to male faces than to female faces. Several theoretical implications are discussed.

13.
We establish attentional capture by emotional distractor faces presented as a "singleton" in a search task in which the emotion is entirely irrelevant. Participants searched for a male (or female) target face among female (or male) faces and indicated whether the target face was tilted to the left or right. The presence (vs. absence) of an irrelevant emotional singleton expression (fearful, angry, or happy) in one of the distractor faces slowed search reaction times compared to the singleton absent or singleton target conditions. Facilitation for emotional singleton targets was found for the happy expression but not for the fearful or angry expressions. These effects were found irrespective of face gender and the failure of a singleton neutral face to capture attention among emotional faces rules out a visual odd-one-out account for the emotional capture. The present study thus establishes irrelevant, emotional, attentional capture.

14.
White, M. (2002). Perception, 31(6), 675-682.
In a face photo in which the two eyes have been moved up into the forehead region, configural spatial relations are altered more than categorical relations; in a photo in which only one eye is moved up, categorical relations are altered more. Matching the identities of two faces was slower when an unaltered photo was paired with a two-eyes-moved photo than when paired with a one-eye-moved photo, implicating configural relations in face identification. But matching the emotional expressions of the same faces was slower when an unaltered photo was paired with a one-eye-moved photo than when paired with a two-eyes-moved photo, showing that expression recognition uses categorically coded relations. The findings also indicate that changing spatial-relational information affects the perceptual encoding of identities and expressions rather than their memory representations.

15.
Attentional biases for emotional faces were investigated in high, medium, and low anxiety groups (N = 54) using a probe detection task. Four types of facial expression (threat, sad, happy, neutral) were used to examine the specificity of the bias. Attentional bias measures were derived from manual reaction times (RTs) to probes and the direction of initial eye movement (EM) to the faces. The RT data indicated enhanced vigilance for threat rather than neutral faces in high and medium, but not low, state anxiety. The bias for negative faces appeared to be a combined function of stimulus threat value and the individual's anxiety level. The RT bias did not seem to depend on overt orienting, as many participants made few EMs. However, those who made frequent EMs to the faces showed concordance between the RT and EM bias measures, and so the RT measure of attentional bias for negative versus positive faces at 500 ms appears to provide a valid index of the direction of initial orienting to emotional stimuli. There was no evidence of an anxiety-related bias for happy faces (predicted by the emotionality hypothesis), nor a dysphoria-related bias for sad faces. However, increased dysphoria scores were associated with reduced attentiveness to happy faces.
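The RT-based attentional bias measure used in dot-probe studies like this one is typically a difference score: RT when the probe replaces the neutral face minus RT when it replaces the emotional face. A minimal sketch under that assumption (the function name and RT values are invented for illustration):

```python
def mean(xs):
    return sum(xs) / len(xs)

def dot_probe_bias(rt_probe_at_neutral_ms, rt_probe_at_emotional_ms):
    """Attentional bias score for a dot-probe task. Positive values mean
    faster responses when the probe replaces the emotional face, i.e.
    vigilance toward it; negative values indicate avoidance."""
    return mean(rt_probe_at_neutral_ms) - mean(rt_probe_at_emotional_ms)

# Hypothetical trials for one high-anxious participant (ms).
bias = dot_probe_bias([531, 542, 525], [498, 505, 489])  # positive: vigilance
```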

16.
Recent studies of the face in the crowd effect, the faster detection of angry than of happy faces in visual search, suggest that for schematic faces it reflects perceptual features such as inward-pointing lines rather than emotional expressions. Removing a potential confound, Experiments 1–2 replicate the preferential detection of stimuli with inward pointing lines, but Experiment 2a indicates that a surrounding circle is required for the effect to emerge. Experiments 3–7 failed to find evidence for faster detection of schematic faces comprising only the elements critical for the faster detection of angry faces according to a low level visual feature account, inward tilted brows and upturned mouth. Faster detection of anger was evident if eyes or eyes and noses were added, but only if their placement was consistent with the first order relations among these elements in a human face. Drawing the critical elements in thicker, higher contrast lines also led to an anger advantage, but this was smaller than that seen for the complete faces. The present results suggest that, while able to support faster target detection, a prevalence of inward pointing lines is not sufficient to explain the detection advantage of angry schematic faces.

17.
The present study contributes to the ongoing debate over the extent to which attentive resources are required for emotion perception. Although fearful facial expressions are strong competitors for attention, we predict that the magnitude of this effect may be modulated by anxiety. To test this hypothesis, healthy volunteers who varied in their self-reported levels of trait and state anxiety underwent an attentional blink task. Both fearful and happy facial expressions were subject to a strong attentional blink effect for low-anxious individuals. For those reporting high anxiety, a blink occurred for both fearful and happy facial expressions, but the magnitude of the attentional blink was significantly reduced for the fearful expressions. This supports the proposals that emotion perception is not fully automatic and that anxiety is related to a reduced ability to inhibit the processing of threat-related stimuli. Thus, individual differences in self-reported anxiety are an important determinant of the attentional control of emotional processing.

18.
Past research suggests an aging-related positivity effect in orienting to faces. However, these studies have eschewed direct comparison of orienting when positive and negative faces are presented simultaneously, thereby potentially underestimating the degree to which emotional valence influences such effects. In the current study younger and older adults viewed face pairs for 1000 ms, and upon face-pair offset indicated the location of a dot that appeared in the former location of one of the faces, to assess attentional orienting. When shown negative–neutral pairs, both age groups were biased to attend to negative faces, but when shown positive–negative pairs only younger adults showed a bias toward negative; older adults showed a lack of orienting toward either emotional face. Results suggest younger adults have a negativity bias in attention orienting regardless of the valence of nearby stimuli, whereas older adults show an absence of this bias when positive information is present.

19.
J. B. Halberstadt and P. M. Niedenthal (2001) reported that explanations of target individuals' emotional states biased memory for their facial expressions in the direction of the explanation. The researchers argued for, but did not test, a 2-stage model of the explanation effect, such that verbal explanation increases attention to facial features at the expense of higher level featural configuration, making the faces vulnerable to conceptual reintegration in terms of available emotion categories. The current 4 experiments provided convergent evidence for the "featural shift" hypothesis by examining memory for both faces and facial features following verbal explanation. Featural attention was evidenced by verbalizers' better memory for features relative to control participants and reintegration by a weaker explanation bias for features and configurally altered faces than for whole, unaltered faces. The results have implications for emotion, attribution, language, and the interaction of implicit and explicit processing.

20.
When a briefly presented and then masked visual object is identified, it impairs the identification of a second target for several hundred milliseconds. This phenomenon is known as the attentional blink or attentional dwell time. The present study is an attempt to investigate the role of salient emotional information in shifts of covert visual attention over time. Two experiments were conducted using the dwell time paradigm, in which two successive targets are presented at different locations with a variable stimulus onset asynchrony (SOA). In the first experiment, real emotional faces (happy/sad) were presented as the first target, and letters (L/T) were presented as the second target. The order of stimulus presentation was reversed in the second experiment. In the first experiment, identification of the letters preceded by happy faces showed better performance than did those preceded by sad faces at SOAs less than 200 msec. Similarly, happy faces were identified better than sad faces were at short SOAs in Experiment 2. The results show that the time course of visual attention is dependent on emotional content of the stimuli. The findings indicate that happy faces are associated with distributed attention or broad scope of attention and require fewer attentional resources than do sad faces.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号