Similar Articles
20 similar articles found (search time: 171 ms)
1.
Facial attributes such as race, sex, and age can interact with emotional expressions; however, only a couple of studies have investigated the nature of the interaction between facial age cues and emotional expressions, and these have produced inconsistent results. Additionally, these studies have not addressed the mechanism(s) driving the influence of facial age cues on emotional expression or vice versa. In the current study, participants categorised young and older adult faces expressing happiness and anger (Experiment 1) or sadness (Experiment 2) by their age and their emotional expression. Age cues moderated categorisation of happiness vs. anger and sadness in the absence of an influence of emotional expression on age categorisation times. This asymmetrical interaction suggests that facial age cues are obligatorily processed prior to emotional expressions. A categorisation advantage was found for happiness expressed on young faces relative to both anger and sadness, which are negative in valence but differ in their congruence with old-age stereotypes or their structural overlap with age cues. This suggests that the observed influence of facial age cues on emotion perception is due to the congruence between relatively positive evaluations of young faces and happy expressions.

2.
Two experiments competitively test 3 potential mechanisms (negativity inhibiting responses, feature-based accounts, and evaluative context) for the response latency advantage for recognizing happy expressions by investigating how the race of a target can moderate the strength of the effect. Both experiments indicate that target race modulates the happy face advantage, such that European American participants displayed the happy face advantage for White target faces, but displayed a response latency advantage for angry (Experiments 1 and 2) and sad (Experiment 2) Black target faces. This pattern of findings is consistent with an evaluative context mechanism and inconsistent with negativity inhibition and feature-based accounts of the happy face advantage. Thus, the race of a target face provides an evaluative context in which facial expressions are categorized.

3.
Research on the interaction of emotional expressions with social category cues in face processing has focused on whether specific emotions are associated with single-category identities, thus overlooking the influence of intersectional identities. Instead, we examined how quickly people categorise intersectional targets by their race, gender, or emotional expression. In Experiment 1, participants categorised Black and White faces displaying angry, happy, or neutral expressions by either race or gender. Emotion influenced responses to men versus women only when gender was made salient by the task. Similarly, emotion influenced responses to Black versus White targets only when participants categorised by race. In Experiment 2, participants categorised faces by emotion so that neither category was more salient. As predicted, responses to Black women differed from those to both Black men and White women. Thus, examining race and gender separately is insufficient for understanding how emotion and social category cues are processed.

4.
The threat of appearing prejudiced and race-based attentional biases
The current work tested whether external motivation to respond without prejudice toward Blacks is associated with biased patterns of selective attention that reflect a threat response to Black individuals. In a dot-probe attentional bias paradigm, White participants with low and high external motivation to respond without prejudice toward Blacks (i.e., low-EM and high-EM individuals, respectively) were presented with pairs of White and Black male faces that bore either neutral or happy facial expressions; on each trial, the faces were displayed for either 30 ms or 450 ms. The findings were consistent with those of previous research on threat and attention: High-EM participants revealed an attentional bias toward neutral Black faces presented for 30 ms, but an attentional bias away from neutral Black faces presented for 450 ms. These attentional biases were eliminated, however, when the faces displayed happy expressions. These findings suggest that high levels of external motivation to avoid prejudice result in anxious arousal in response to Black individuals, and that this response affects even basic attentional processes.
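For readers unfamiliar with the dot-probe measure, attentional bias is conventionally indexed as the difference in reaction time between trials in which the probe replaces the White face and trials in which it replaces the Black face, computed separately for each presentation duration; positive scores indicate attention drawn toward the Black face. A minimal sketch of that computation, using hypothetical column names and made-up trial data rather than anything from the original study:

```python
import pandas as pd

# Hypothetical trial-level dot-probe data (not the article's data).
# rt: reaction time in ms; probe_at: which face the probe replaced;
# duration: face-pair presentation time in ms (30 or 450).
trials = pd.DataFrame({
    "duration": [30, 30, 30, 450, 450, 450],
    "probe_at": ["black", "white", "black", "black", "white", "white"],
    "rt":       [498, 540, 512, 530, 505, 525],
})

# Mean RT per duration x probe location.
mean_rt = trials.pivot_table(index="duration", columns="probe_at", values="rt")

# Bias score = RT(probe replaces White face) - RT(probe replaces Black face);
# positive values indicate attention drawn toward the Black face.
bias = mean_rt["white"] - mean_rt["black"]
print(bias)  # e.g. positive at 30 ms (vigilance), negative at 450 ms (avoidance)
```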

5.
Visual working memory (WM) for face identities is enhanced when faces express negative versus positive emotion. To determine the stage at which emotion exerts its influence on memory for person information, we isolated expression (angry/happy) to the encoding phase (Experiment 1; neutral test faces) or retrieval phase (Experiment 2; neutral study faces). WM was only enhanced by anger when expression was present at encoding, suggesting that retrieval mechanisms are not influenced by emotional expression. To examine whether emotional information is discarded on completion of encoding or sustained in WM, in Experiment 3 an emotional word categorisation task was inserted into the maintenance interval. Emotional congruence between word and face supported memory for angry but not for happy faces, suggesting that negative emotional information is preferentially sustained during WM maintenance. Our findings demonstrate that negative expressions exert sustained and beneficial effects on WM for faces that extend beyond encoding.

6.
A new model of mental representation is applied to social cognition: the attractor field model. Using the model, the authors predicted and found a perceptual advantage but a memory disadvantage for faces displaying evaluatively congruent expressions. In Experiment 1, participants completed a same/different perceptual discrimination task involving morphed pairs of angry-to-happy Black and White faces. Pairs of faces displaying evaluatively incongruent expressions (i.e., happy Black, angry White) were more likely to be labeled as similar and were less likely to be accurately discriminated from one another than faces displaying evaluatively congruent expressions (i.e., angry Black, happy White). Experiment 2 replicated this finding and showed that objective discriminability of stimuli moderated the impact of attractor field effects on perceptual discrimination accuracy. In Experiment 3, participants completed a recognition task for angry and happy Black and White faces. Consistent with the attractor field model, memory accuracy was better for faces displaying evaluatively incongruent expressions. Theoretical and practical implications of these findings are discussed.
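Angry-to-happy morph continua of the kind used in Experiment 1 are usually built with landmark-based morphing software; a crude pixel-wise blend between the two expression endpoints nevertheless conveys the basic idea. The sketch below is illustrative only (file names are hypothetical, the images are assumed to be same-sized and aligned, and this is not the authors' stimulus pipeline):

```python
import numpy as np
from PIL import Image

# Hypothetical endpoint images of the same identity, pre-aligned and equal in size.
angry = np.asarray(Image.open("face_angry.png"), dtype=np.float32)
happy = np.asarray(Image.open("face_happy.png"), dtype=np.float32)

def blend(alpha: float) -> Image.Image:
    """Pixel-wise blend: alpha = 0 returns the angry endpoint, alpha = 1 the happy endpoint."""
    mix = (1.0 - alpha) * angry + alpha * happy
    return Image.fromarray(mix.astype(np.uint8))

# An 11-step angry-to-happy continuum (0%, 10%, ..., 100% happy).
continuum = [blend(a) for a in np.linspace(0.0, 1.0, 11)]
continuum[5].save("face_morph_50.png")  # the ambiguous midpoint
```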

7.
The purpose of the present study was to examine the time course of race and expression processing to determine how these cues influence early perceptual as well as explicit categorization judgments. Despite their importance in social perception, little research has examined how social category information and emotional expression are processed over time. Moreover, although models of face processing suggest that the two cues should be processed independently, this has rarely been directly examined. Event-related brain potentials were recorded as participants made race and emotion categorization judgments of Black and White men posing either happy, angry, or neutral expressions. Our findings indicate that race and emotion cues are processed independently and in parallel, relatively early in processing.

8.
Adults show better memory for ambiguous faces of their own race than for ambiguous faces of another race, even when the faces are identical and differentiated only by extraneous cues to racial category. We investigated whether similar context effects operate early in development. Young children raised in predominantly White environments were presented with computer-generated White-Black morphed faces, each paired with either the White or the Black face that contributed to its construction, and were told that the two faces in each pair were siblings. The children's subsequent recognition memory was more accurate for faces that had been paired with White siblings than for faces that had been paired with Black siblings. The same effect did not obtain when the ambiguous faces were paired with White or Black faces that did not contribute to their construction and did not look like siblings. These findings suggest that face memory in children is not driven exclusively by visual information present in faces and instead depends on an interplay of categorical and perceptual information about race and relationships.

9.
Sex differences in face recognition and influence of facial affect
To study sex differences in the recognition of human faces with different facial expressions, 65 female and 64 male participants learned to associate names with various male and female neutral faces. During the recall phase, participants were then asked to name the same persons depicting different emotional expressions (neutral, happy, angry, and fearful). Females were faster than males at naming male faces, and males were faster than females at naming female faces. All participants were faster at naming neutral or happy female faces than neutral or happy male faces. These results suggest that opposite-sex faces require less processing time than same-sex faces, which is consistent with an evolutionary account.

10.
The goal of this research was to examine the effects of facial expressions on the speed of sex recognition. Prior research revealed that sex recognition of female angry faces was slower than that of male angry faces, and that female happy faces were recognized faster than male happy faces. We aimed to replicate and extend the previous research by using a different set of facial stimuli and a different methodological approach, and by examining the effects of previously unexplored expressions (such as crying) on the speed of sex recognition. In the first experiment, we presented facial stimuli of men and women displaying anger, fear, happiness, sadness, and crying, plus three control conditions expressing no emotion. Results showed that sex recognition of angry female faces was significantly slower than sex recognition in any other condition, while sad, crying, happy, frightened, and neutral expressions did not affect the speed of sex recognition. In the second experiment, we presented angry, neutral, and crying expressions in blocks, and again only sex recognition of female angry expressions was slower than in all other conditions. The results are discussed in the context of perceptual features of male and female facial configurations, evolutionary theory, and social learning.

11.
Multiple facial cues, such as facial expression and face gender, simultaneously influence facial trustworthiness judgement in adults. The current work examined the effect of multiple facial cues on trustworthiness judgement across age groups. In Experiment 1, 8- and 10-year-olds and adults judged trustworthiness from happy and neutral adult faces (female and male). Experiment 2 included both adult and child faces wearing happy, angry, and neutral expressions; 9-, 11-, and 13-year-olds and adults rated facial trustworthiness on a 7-point Likert scale. The results of Experiments 1 and 2 revealed that facial expression and face gender independently affected facial trustworthiness judgement in children aged 10 and below, but simultaneously affected judgement in children aged 11 and above, adolescents, and adults. There was no own-age bias in children or adults. The results showed that children younger than 10 do not process multiple facial cues in the same manner as older children and adults when judging trustworthiness. The current findings provide evidence for the stable-feature account, but not for the own-age bias account or the expertise account.

12.
Prior research has shown that race influences perceptions of facial expressions, with hostility detected earlier on young male Black than White faces. This study examined whether the interplay of race and age would moderate perceptions of hostility by having participants evaluate facial expressions of multiply-categorizable targets. Using a facial emotion change-detection task, we assessed evaluations of onset/offset of anger and happiness on faces of young and old Black and White men. Significant age by race interactions were observed: while participants perceived anger as lasting longer and appearing sooner on old compared to young White faces, this relationship was reversed for Black faces, with participants perceiving anger lasting longer and appearing sooner on young compared to old Black faces. Similar results were found for perceived happiness. These results suggest that perception during cross-categorization may be more complex than the simple additive function proposed by the double-jeopardy hypothesis, such that co-activation of other stereotypes may sometimes confer a protective benefit against bias.

13.
Using the item-method directed forgetting paradigm (i.e. intentionally forgetting specified information), we examined directed forgetting of facial identity as a function of facial expression and the sex of the expresser and perceiver. Participants were presented with happy and angry male and female faces cued for either forgetting or remembering, and were then asked to recognise previously studied faces from among a series of neutral faces. For each recognised test face, participants also recalled the face's previously displayed emotional expression. We found that angry faces were more resistant to forgetting than were happy faces. Furthermore, angry expressions on male faces and happy expressions on female faces were recognised and recalled better than vice versa. Signal detection analyses revealed that male faces gave rise to a greater sensitivity than female faces did, and male participants, but not female participants, showed greater sensitivity to male faces than to female faces. Several theoretical implications are discussed.
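Sensitivity in signal detection analyses of recognition memory is conventionally d' = z(hit rate) - z(false-alarm rate), where hits are studied faces correctly called "old" and false alarms are unstudied faces incorrectly called "old". A minimal computation is sketched below; the counts are purely illustrative and are not the article's data:

```python
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity index d' = z(hit rate) - z(false-alarm rate).
    A small log-linear correction keeps rates away from 0 and 1."""
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# Illustrative counts for male vs. female study faces (hypothetical numbers).
print(d_prime(hits=42, misses=8, false_alarms=10, correct_rejections=40))   # male faces
print(d_prime(hits=35, misses=15, false_alarms=14, correct_rejections=36))  # female faces
```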

14.
The influence of facial expressions of emotion on perceptions of affective sentence meaning was investigated by pairing happy, angry, surprised, and sad faces of “teachers” with sentences of varying affective tone. Ninety-five students judged the overall meaning communicated by these paired stimuli. The design allowed exploration of unique facial-verbal combination effects, overall cue integration effects, and sex differences. Clear effects of cue combinations emerged. Perceived sincerity was found to be a function of the consistency of evaluative (positivity) but not dominance cues. Perceived positivity was an interactive function of both evaluative cues and dominance cues. Perceived dominance was affected by the interaction of evaluative cues. The subtleties of cue combination were clarified through open-ended dependent measures. Also, as expected, females were found to be more sensitive than males to verbal-nonverbal cue conflict in perceptions of sincerity. However, no other sex differences were found. The findings were discussed with regard to the need for a firm empirical base upon which to integrate verbal and nonverbal research traditions in the communication of affective meaning.

15.
We investigated the effect of subliminally presented happy or angry faces on evaluative judgments when participants' facial muscles were either free to mimic or blocked. We hypothesized and showed that subliminally presented happy expressions lead to more positive judgments of cartoons than angry expressions only when facial muscles were not blocked. These results reveal the influence of socially driven embodied processes on affective judgments and also have potential implications for phenomena such as emotional contagion.

16.
Facial cues of threat such as anger and other-race membership are detected preferentially in visual search tasks. However, it remains unclear whether these facial cues interact in visual search. If both cues equally facilitate search, a symmetrical interaction would be predicted: anger cues should facilitate detection of other-race faces, and cues of other-race membership should facilitate detection of anger. Past research investigating this race by emotional expression interaction in categorisation tasks revealed an asymmetrical interaction, suggesting that cues of other-race membership may facilitate the detection of angry faces but not vice versa. Using the same stimuli and procedures across two search tasks, we asked participants to search for targets defined by either race or emotional expression. Contrary to the results revealed in the categorisation paradigm, cues of anger facilitated detection of other-race faces, whereas differences in race did not differentially influence detection of emotion targets.

17.
Faces convey a variety of socially relevant cues that have been shown to affect recognition, such as age, sex, and race, but few studies have examined the interactive effect of these cues. White participants of two distinct age groups were presented with faces that differed in race, age, and sex in a face recognition paradigm. Replicating the other-race effect, young participants recognized young own-race faces better than young other-race faces. However, recognition performance did not differ across old faces of different races (Experiments 1, 2A). In addition, participants showed an other-age effect, recognizing White young faces better than White old faces. Sex affected recognition performance only when age was not varied (Experiment 2B). Overall, older participants showed a similar recognition pattern (Experiment 3) as young participants, displaying an other-race effect for young, but not old, faces. However, they recognized young and old White faces on a similar level. These findings indicate that face cues interact to affect recognition performance such that age and sex information reliably modulate the effect of race cues. These results extend accounts of face recognition that explain recognition biases (such as the other-race effect) as a function of dichotomous ingroup/outgroup categorization, in that outgroup characteristics are not simply additive but interactively determine recognition performance.

18.
Infants can form object categories based on perceptual cues, but their ability to form categories based on differential experience is less clear. Here we examined whether infants filter through perceptual differences among faces from different other-race classes and represent them as a single other-race class different only from own-race faces. We used a familiarization/novelty-preference procedure to investigate category formation for two other-race face classes (Black vs. Asian) by White 6- and 9-month-olds. The data indicated that while White 6-month-olds categorically represented the distinction between Black and Asian faces, White 9-month-olds formed a broad other-race category inclusive of Black and Asian faces, but exclusive of own-race White faces. The findings provide evidence that narrowing can occur for mental processes other than discrimination: category formation is also affected. The results suggest that frequency of experience with own-race versus other-race classes of faces may propel infants to contrast own-race faces with other-race faces, but not different classes of other-race faces with each other.

19.
Empirical evidence shows an effect of gaze direction on cueing spatial attention, regardless of the emotional expression shown by a face, whereas a combined effect of gaze direction and facial expression has been observed on individuals' evaluative judgments. In 2 experiments, the authors investigated whether gaze direction and facial expression affect spatial attention depending upon the presence of an evaluative goal. Disgusted, fearful, happy, or neutral faces gazing left or right were followed by positive or negative target words presented either at the spatial location looked at by the face or at the opposite spatial location. Participants responded to target words based on affective valence (i.e., positive/negative) in Experiment 1 and on letter case (lowercase/uppercase) in Experiment 2. Results showed that participants responded much faster to targets presented at the spatial location looked at by disgusted or fearful faces but only in Experiment 1, when an evaluative task was used. The present findings clearly show that negative facial expressions enhance the attentional shifts due to eye-gaze direction, provided that there was an explicit evaluative goal present.

20.
We investigated the influence of happy and angry expressions on memory for new faces. Participants were presented with happy and angry faces in an intentional or incidental learning condition and were later asked to recognise the same faces displaying a neutral expression. They also had to remember what the initial expressions of the faces had been. Remember/know/guess judgements were made both for identity and expression memory. Results showed that faces were better recognised when presented with a happy rather than an angry expression, but only when learning was intentional. This was mainly due to an increase of the "remember" responses for happy faces when encoding was intentional rather than incidental. In contrast, memory for emotional expressions was not different for happy and angry faces whatever the encoding conditions. We interpret these findings according to the social meaning of emotional expressions for the self.

