Similar Articles
1.
This study assessed the speed of recognition of facial emotional expressions (happy and angry) as a function of violent video game play. Color photos of calm facial expressions were morphed into either an angry or a happy facial expression. Participants were asked to make a speeded identification of the emotion (happiness or anger) during the morph. Typically, happy faces are identified faster than angry faces (the happy-face advantage). Results indicated that playing a violent video game led to a reduction in the happy-face advantage. Implications of these findings are discussed with respect to current models of aggressive behavior.
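Not part of the original abstract: a minimal sketch, using entirely hypothetical reaction times, of how the happy-face advantage described above could be quantified as the per-participant difference between mean RTs for angry and happy morphs and then compared across game conditions.

```python
import numpy as np
from scipy import stats

# Hypothetical per-participant mean reaction times (ms) for identifying the
# emotion in happy vs. angry morphs; illustrative values, not the study's data.
rt_happy_violent = np.array([612.0, 598.0, 630.0, 641.0, 605.0])
rt_angry_violent = np.array([625.0, 615.0, 642.0, 650.0, 618.0])
rt_happy_nonviolent = np.array([590.0, 602.0, 611.0, 623.0, 595.0])
rt_angry_nonviolent = np.array([648.0, 661.0, 655.0, 670.0, 640.0])

# Happy-face advantage per participant: RT(angry) - RT(happy).
adv_violent = rt_angry_violent - rt_happy_violent
adv_nonviolent = rt_angry_nonviolent - rt_happy_nonviolent

# A smaller mean advantage in the violent-game condition would mirror the reported reduction.
t, p = stats.ttest_ind(adv_violent, adv_nonviolent)
print(f"advantage: violent = {adv_violent.mean():.1f} ms, "
      f"nonviolent = {adv_nonviolent.mean():.1f} ms, t = {t:.2f}, p = {p:.3f}")
```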

2.
The current longitudinal study (N = 107) examined mothers' facial emotion recognition using reaction time and their infants' affect-based attention at 5, 7, and 14 months of age using eyetracking. Our results, examining maternal and infant responses to angry, fearful, and happy facial expressions, show that only maternal responses to angry facial expressions were robustly and positively linked across time points, indexing a consistent trait-like response to social threat among mothers. However, neither maternal responses to happy or fearful facial expressions nor infant responses to any of the three facial emotions showed such consistency, pointing to the changeable nature of facial emotion processing, especially among infants. In general, infants' attention toward negative emotions (i.e., anger and fear) at earlier time points was linked to their affect-biased attention for these emotions at 14 months but showed greater dynamic change across time. Moreover, our results provide limited evidence for developmental continuity in processing negative emotions and for the bidirectional interplay of infant affect-biased attention and maternal facial emotion recognition. This pattern of findings suggests that infants' affect-biased attention to facial expressions of emotion is characterized by dynamic change.

3.
We used the remember-know procedure (Tulving, 1985) to test the behavioural expression of memory following indirect and direct forms of emotional processing at encoding. Participants (N = 32) viewed a series of facial expressions (happy, fearful, angry, and neutral) while performing tasks involving either indirect (gender discrimination) or direct (emotion discrimination) emotion processing. After a delay, participants completed a surprise recognition memory test. Our results revealed that indirect encoding of emotion produced enhanced memory for fearful faces, whereas direct encoding of emotion produced enhanced memory for angry faces. In contrast, happy faces were better remembered than neutral faces after both indirect and direct encoding tasks. These findings suggest that fearful and angry faces benefit from a recollective advantage when they are encoded in a way that is consistent with the predictive nature of their threat. We propose that the broad memory advantage for happy faces may reflect a form of cognitive flexibility that is specific to positive emotions.

4.
The goal of this study was to explore the ability of violent men to recognise facial affect. In contrast to traditional approaches to this research question, we took the effects of the models' sex and different types of violent behaviour into consideration. Data obtained from 71 violent men revealed that they recognised facial expressions of fear (p = .019) and disgust (p = .013) more accurately when displayed by female than male models. The opposite was found for angry faces (p = .006), while the models' sex did not affect the recognition of sad, happy and surprised facial expressions or neutral faces. Furthermore, sexual coercion perpetrators were more accurate than other violent men in the recognition of female facial disgust (p = .006). These results are discussed in the context of social learning theory, and the hypothesis that female facial expressions of disgust could be subtle cues to their sexual infidelity that motivate sexual coercion in some men.

5.
To reveal how individuals high in trait aggression recognise angry and fearful threatening facial expressions, and the electrophysiological mechanisms involved, this study used the Buss-Perry Aggression Questionnaire to select 26 high-trait-aggression and 27 low-trait-aggression participants, and used a face recognition paradigm to compare ERPs between the two groups while they identified threatening facial expressions. For both angry and fearful expressions, the high-trait-aggression group showed significantly shorter N170 latencies and significantly larger P200 amplitudes than the low-trait-aggression group. These results indicate that individuals high in trait aggression are highly sensitive to angry and fearful threatening facial expressions, and that this sensitivity arises in the early and middle stages of facial expression recognition rather than the late stage: in the early, pre-attentive stage, high-trait-aggression individuals preferentially attend to angry and fearful threatening expressions, and in the middle, attentive stage they readily identify them.

6.
In a face-in-the-crowd setting, the authors examined visual search for photographically reproduced happy, angry, and fearful target faces among neutral distractor faces in 3 separate experiments. Contrary to the hypothesis, happy targets were consistently detected more quickly and accurately than angry and fearful targets, as were directed compared with averted targets. There was no consistent effect of social anxiety. A facial emotion recognition experiment suggested that the happy search advantage could be due to the ease of processing happy faces. In the final experiment with perceptually controlled schematic faces, the authors reported more effective detection of angry than happy faces. This angry advantage was most obvious for highly socially anxious individuals when their social fear was experimentally enhanced.

7.
Facial emotions are important for human communication. Unfortunately, traditional facial emotion recognition tasks do not inform about how respondents might behave towards others expressing certain emotions. Approach‐avoidance tasks do measure behaviour, but only on one dimension. In this study 81 participants completed a novel Facial Emotion Response Task. Images displaying individuals with emotional expressions were presented in random order. Participants simultaneously indicated how communal (quarrelsome vs. agreeable) and how agentic (dominant vs. submissive) they would be in response to each expression. We found that participants responded differently to happy, angry, fearful, and sad expressions in terms of both dimensions of behaviour. Higher levels of negative affect were associated with less agreeable responses specifically towards happy and sad expressions. The Facial Emotion Response Task might complement existing facial emotion recognition and approach‐avoidance tasks.

8.
This study addressed the relative reliance on face and body configurations for different types of emotion-related judgements: emotional state and motion intention. Participants viewed images of people with either emotionally congruent (both angry or fearful) or incongruent (angry/fearful; fearful/angry) faces and bodies. Congruent conditions provided baseline responses. Incongruent conditions revealed relative reliance on face and body information for different judgements. Body configurations influenced motion-intention judgements more than facial configurations: incongruent pairs with angry bodies were more frequently perceived as moving forward than those with fearful bodies; pairs with fearful bodies were more frequently perceived as moving away. In contrast, faces influenced emotional-state judgements more, but bodies moderated ratings of face emotion. Thus, both face and body configurations influence emotion perception, but the type of evaluation required influences their relative contributions. These findings highlight the importance of considering both the face and body as important sources of emotion information.

9.
The author's purpose in this study was to assess the relationship between self-reported aggression and "seeing" anger in others. Eighty-four undergraduate participants completed a self-report questionnaire about their own aggression (i.e., aggressive attitude, verbal aggression, and physical aggression), as well as measures of resiliency and locus of control. They also responded to a series of photographs depicting facial expressions of happy, sad, angry, and fearful emotions. The results indicated that individuals reporting higher levels of overall aggression also misidentified anger in the facial expressions when this was not the emotion presented (errors of commission). No significant differences appeared between individuals reporting high and low levels of aggression in terms of underreporting anger (errors of omission). The author also found significant correlations among identification of anger from photographs, resiliency, and locus of control. The findings of the study have important implications for understanding the relationship between aggression and one's perception of anger in others.

10.
In this study, we investigated a new instrument: the Southampton Test of Empathy for Preschoolers (STEP). The test incorporated 8 video vignettes of children in emotional scenarios, assessing a child's ability to understand (STEP-UND) and share (STEP-SHA) in the emotional experience of a story protagonist. Each vignette included 4 emotions (angry, happy, fearful, sad) that reflected emotion judgments based on the protagonist's facial expression, situation, verbal cues, and desire. The STEP was administered to 39 preschool children, and internal reliability, concurrent validity, and construct validity were addressed. The results showed good internal consistency. They also highlighted moderate concurrent validity with parent-rated empathy, a measure of facial indices, and construct validity with teacher-rated prosocial behavior.

11.
Facial expressions serve as cues that encourage viewers to learn about their immediate environment. In studies assessing the influence of emotional cues on behavior, fearful and angry faces are often combined into one category, such as "threat-related," because they share similar emotional valence and arousal properties. However, these expressions convey different information to the viewer. Fearful faces indicate the increased probability of a threat, whereas angry expressions embody a certain and direct threat. This conceptualization predicts that a fearful face should facilitate processing of the environment to gather information to disambiguate the threat. Here, we tested whether fearful faces facilitated processing of neutral information presented in close temporal proximity to the faces. In Experiment 1, we demonstrated that, compared with neutral faces, fearful faces enhanced memory for neutral words presented in the experimental context, whereas angry faces did not. In Experiment 2, we directly compared the effects of fearful and angry faces on subsequent memory for emotional faces versus neutral words. We replicated the findings of Experiment 1 and extended them by showing that participants remembered more faces from the angry face condition relative to the fear condition, consistent with the notion that anger differs from fear in that it directs attention toward the angry individual. Because these effects cannot be attributed to differences in arousal or valence processing, we suggest they are best understood in terms of differences in the predictive information conveyed by fearful and angry facial expressions.

12.
Research on early childhood predictors of violent behaviors in early adulthood is limited. The current study investigated whether individual, family, and community risk factors from 18 to 42 months of age were predictive of violent criminal arrests during late adolescence and early adulthood using a sample of 310 low-income male participants living in an urban community. In addition, differences in trajectories of overt conduct problems (CP), hyperactivity/attention problems (HAP), and co-occurring patterns of CP and HAP from age 1½ to 10 years were investigated in regard to their relationship to violent and nonviolent behaviors, depression, and anxiety at age 20. Results of multivariate analyses indicated that early childhood family income, home environment, emotion regulation, oppositional behavior, and minority status were all significant in distinguishing violent offending boys from those with no criminal records. In addition, trajectories of early childhood CP, but not attention deficit hyperactivity disorder, were significantly related to self-reports of violent behavior, depressive symptoms, and anxiety symptoms. Implications for the prevention of early childhood risk factors associated with adolescent and adult violent behavior for males are discussed.

13.
The important ability to discriminate facial expressions of emotion develops early in human ontogeny. In the present study, 7-month-old infants' event-related potentials (ERPs) in response to angry and fearful emotional expressions were measured. The angry face evoked a larger negative component (Nc) at fronto-central leads between 300 and 600 ms after stimulus onset when compared to the amplitude of the Nc to the fearful face. Furthermore, over posterior channels, the angry expression elicited an N290 that was larger in amplitude and a P400 that was smaller in amplitude than for the fearful expression. This is the first study that shows that the ability of infants to discriminate angry and fearful facial expressions can be measured at the electrophysiological level. These data suggest that 7-month-olds allocated more attentional resources to the angry face as indexed by the Nc. Implications of this result may be that the social signal values were perceived differentially, not merely as "negative". Furthermore, it is possible that the angry expression might have been more arousing and discomforting for the infant compared with the fearful expression.
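Not from the paper itself: a minimal sketch of how a mean Nc amplitude in the 300-600 ms post-stimulus window might be computed from an averaged ERP trace; the sampling rate and waveform below are hypothetical placeholders.

```python
import numpy as np

# Hypothetical averaged ERP at a fronto-central electrode: epoch from
# -100 ms to 800 ms relative to face onset, 1000 Hz sampling, values in microvolts.
times = np.arange(-100, 800) / 1000.0     # sample times in seconds
erp_angry = np.random.randn(times.size)   # placeholder waveform, not real data

def mean_amplitude(erp, times, tmin, tmax):
    """Mean voltage within a time window (in seconds), e.g. the 0.3-0.6 s Nc window."""
    window = (times >= tmin) & (times <= tmax)
    return erp[window].mean()

nc_angry = mean_amplitude(erp_angry, times, 0.300, 0.600)
print(f"Nc mean amplitude (300-600 ms): {nc_angry:.2f} microvolts")
```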

14.
While the recognition of emotional expressions has been extensively studied, the behavioural response to these expressions has not. In the interpersonal circumplex, behaviour is defined in terms of communion and agency. In this study, we examined behavioural responses to both facial and postural expressions of emotion. We presented 101 Romanian students with facial and postural stimuli involving individuals ('targets') expressing happiness, sadness, anger, or fear. Using an interpersonal grid, participants simultaneously indicated how communal (i.e., quarrelsome or agreeable) and agentic (i.e., dominant or submissive) they would be towards people displaying these expressions. Participants were agreeable‐dominant towards targets showing happy facial expressions and primarily quarrelsome towards targets with angry or fearful facial expressions. Responses to targets showing sad facial expressions were neutral on both dimensions of interpersonal behaviour. Postural versus facial expressions of happiness and anger elicited similar behavioural responses. Participants responded in a quarrelsome‐submissive way to fearful postural expressions and in an agreeable way to sad postural expressions. Behavioural responses to the various facial expressions were largely comparable to those previously observed in Dutch students. Observed differences may be explained by participants' cultural background. Responses to the postural expressions largely matched responses to the facial expressions.

15.
The present study utilized a short‐term longitudinal research design to examine the hypothesis that shyness in preschoolers is differentially related to different aspects of emotion processing. Using teacher reports of shyness and performance measures of emotion processing, including (1) facial emotion recognition, (2) non‐facial emotion recognition, and (3) emotional perspective‐taking, we examined 337 Head Start attendees twice at a 24‐week interval. Results revealed significant concurrent and longitudinal relationships between shyness and facial emotion recognition, and either minimal or non‐existent relationships between shyness and the other aspects of emotion processing. Correlational analyses of concurrent assessments revealed that shyness predicted poorer facial emotion recognition scores for negative emotions (sad, angry, and afraid), but not a positive emotion (happy). Analyses of change over time, on the other hand, revealed that shyness predicted change in facial emotion recognition scores for all four measured emotions. Facial emotion recognition scores did not predict changes in shyness. Results are discussed with respect to expanding the scope of research on shyness and emotion processing to include time‐dependent studies that allow for the specification of developmental processes.

16.
To examine the effect of crowding priming on the recognition of threatening facial expressions, 28 undergraduates completed angry-neutral and fearful-neutral expression recognition tasks under crowded versus non-crowded priming conditions. Signal detection analyses showed that crowding priming reduced discrimination sensitivity for angry expressions without affecting the response criterion, and affected neither the sensitivity nor the criterion for fearful expressions. Subjectively reported intensity of angry expressions was significantly higher under crowding priming than in the non-crowded condition, whereas the reported intensity of fearful and neutral expressions was unaffected by crowding priming. These results indicate that crowding priming reduces perceptual sensitivity for discriminating angry expressions.
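Not part of the abstract: a minimal sketch of the signal detection computation it refers to, deriving sensitivity (d') and response criterion (c) from hit and false-alarm counts; the trial counts below are hypothetical.

```python
from scipy.stats import norm

def dprime_and_criterion(hits, misses, false_alarms, correct_rejections):
    """Signal detection indices with a log-linear correction so that
    hit/false-alarm rates of 0 or 1 do not produce infinite z-scores."""
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z_hit, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
    d_prime = z_hit - z_fa                # discrimination sensitivity
    criterion = -0.5 * (z_hit + z_fa)     # response bias
    return d_prime, criterion

# Hypothetical counts for angry-vs-neutral trials under crowding priming.
d, c = dprime_and_criterion(hits=32, misses=8, false_alarms=12, correct_rejections=28)
print(f"d' = {d:.2f}, c = {c:.2f}")
```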

17.
The goal of this research was to examine the effects of facial expressions on the speed of sex recognition. Prior research revealed that sex recognition of female angry faces was slower than that of male angry faces, and that female happy faces were recognized faster than male happy faces. We aimed to replicate and extend the previous research by using a different set of facial stimuli and a different methodological approach, and also by examining the effects of some previously unexplored expressions (such as crying) on the speed of sex recognition. In the first experiment, we presented facial stimuli of men and women displaying anger, fear, happiness, sadness, and crying, plus three control conditions expressing no emotion. Results showed that sex recognition of angry females was significantly slower than sex recognition in any other condition, while sad, crying, happy, frightened and neutral expressions did not affect the speed of sex recognition. In the second experiment, we presented angry, neutral and crying expressions in blocks, and again only sex recognition of female angry expressions was slower than that of all other expressions. The results are discussed in the context of perceptual features of male and female facial configurations, evolutionary theory, and social learning.

18.
Carr, M. B., & Lutjemeier, J. A. (2005). Adolescence, 40(159), 601-619.
Associations among facial affect recognition, empathy, and self-reported delinquency were studied in a sample of 29 male youth offenders at a probation placement facility. Youth offenders were asked to recognize facial expressions of emotions from adult faces, child faces, and cartoon faces. Youth offenders also responded to a series of statements on emotional empathy, and provided self-reported acts of delinquency. Findings revealed a moderate positive relationship between the ability to recognize the angry expression in adult faces and self-reported acts of delinquent behavior, which included physical violence, theft, and vandalism. Findings also revealed a moderate inverse relationship between the ability to recognize facial expressions of emotions in child faces and self-reported acts of physical violence. With respect to specific facial expressions of emotions in child faces, a moderate inverse relationship was found between the ability to recognize the fearful expression and self-reported acts of physical violence. A moderate positive relationship was found between the ability to recognize the fearful expression in child faces and the ability to empathize with the emotional experiences of others. Strong and moderate links were found between the negative expressions fearful and sad, and angry and sad, respectively. Additionally, a strong inverse relationship was found between the ability to empathize with the emotional experiences of others and self-reported acts of delinquent behavior. Lastly, a strong positive relationship was found between covert and overt self-reported acts of delinquent behavior. Results from this exploratory investigation suggest a link between facial affect recognition, empathy, and delinquency. Findings have important implications for educators and counselors who work with youth offenders within probation placement facilities.

19.
The current work examined contributions of emotion-resembling facial cues to impression formation. There exist common facial cues that make people look emotional, male or female, and from which we derive personality inferences. We first conducted a Pilot Study to assess these effects. We found that neutral female faces were rated as more submissive, affiliative, naïve, honest, cooperative, babyish, fearful, and happy, and as less angry, than neutral male faces. In our Primary Study, we then "warped" these same neutral faces over their corresponding anger and fear displays so that the resultant facial appearance cues structurally resembled emotion while retaining a neutral visage (e.g., no wrinkles, furrows, creases, etc.). The gender effects found in the Pilot Study were replicated in the Primary Study, suggesting clear stereotype-driven impressions. Critically, ratings of the neutral-over-fear warps versus neutral-over-anger warps also revealed a profile similar to the gender-based ratings, revealing perceptually driven impressions directly attributable to emotion overgeneralisation.
