Similar Articles
20 similar articles found (search time: 15 ms)
1.
2.
This research provides a systematic analysis of the nonverbal expression of pride. Study 1 manipulated behavioral movements relevant to pride (e.g., expanded posture and head tilt) to identify the most prototypical pride expression and determine the specific components that are necessary and sufficient for reliable recognition. Studies 2 and 3 tested whether the 2 conceptually and empirically distinct facets of pride ("authentic" and "hubristic"; J. L. Tracy & R. W. Robins, 2007a) are associated with distinct nonverbal expressions. Results showed that neither the prototypical pride expression nor several recognizable variants were differentially associated with either facet, suggesting that for the most part, authentic and hubristic pride share the same signal. Together these studies indicate that pride can be reliably assessed from nonverbal behaviors. In the Appendix, the authors provide guidelines for a pride behavioral coding scheme, akin to the Emotion Facial Action Coding System (EMFACS; P. Ekman & E. Rosenberg, 1997) for assessing "basic" emotions from observable nonverbal behaviors.
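A behavioral coding scheme of the kind described in the Appendix can be thought of as a checklist that maps coded nonverbal components onto an emotion label. The sketch below is a minimal illustration only: the component names and the minimum-count rule are assumptions for the example, not the authors' published coding criteria.

```python
# Minimal sketch of a pride behavioral coding check.
# Component labels and the threshold are hypothetical, not the published scheme.

PRIDE_COMPONENTS = {
    "small_smile",
    "head_tilted_back",
    "expanded_posture",
    "arms_akimbo_or_raised",
}

def code_as_pride(observed: set[str], minimum: int = 3) -> bool:
    """Return True if at least `minimum` pride-relevant components were coded as present."""
    return len(PRIDE_COMPONENTS & observed) >= minimum

# Example: slight smile + head tilted back + expanded chest, no arm gesture coded.
print(code_as_pride({"small_smile", "head_tilted_back", "expanded_posture"}))  # True
```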

3.
Synthetic images of facial expression were used to assess whether judges can correctly recognize emotions exclusively on the basis of configurations of facial muscle movements. A first study showed that static, synthetic images modeled after a series of photographs that are widely used in facial expression research yielded recognition rates and confusion patterns comparable to posed photos. In a second study, animated synthetic images were used to examine whether schematic facial expressions consisting entirely of theoretically postulated facial muscle configurations can be correctly recognized. Recognition rates for the synthetic expressions were far above chance, and the confusion patterns were comparable to those obtained with posed photos. In addition, the effect of static versus dynamic presentation of the expressions was studied. Dynamic presentation increased overall recognition accuracy and reduced confusions between unrelated emotions.

4.
Recent research has shown that pride, like the "basic" emotions of anger, disgust, fear, happiness, sadness, and surprise, has a distinct, nonverbal expression that can be recognized by adults (J. L. Tracy & R. W. Robins, 2004b). In 2 experiments, the authors examined whether young children can identify the pride expression and distinguish it from expressions of happiness and surprise. Results suggest that (a) children can recognize pride at above-chance levels by age 4 years; (b) children recognize pride as well as they recognize happiness; (c) pride recognition, like happiness and surprise recognition, improves from age 3 to 7 years; and (d) children's ability to recognize pride cannot be accounted for by the use of a process of elimination (i.e., an exclusion rule) to identify an unknown entity. These findings have implications for the development of emotion recognition and children's ability to perceive and communicate pride.
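"Above-chance" comparisons like those reported here are typically evaluated against the guessing rate implied by the number of response options. As a hedged illustration only, with invented trial counts and an assumed chance level of one third (three response options), an exact one-sided binomial test could be computed as follows:

```python
from math import comb

def p_above_chance(hits: int, trials: int, chance: float) -> float:
    """Exact one-sided binomial test: probability of >= hits correct under pure guessing."""
    return sum(comb(trials, k) * chance**k * (1 - chance)**(trials - k)
               for k in range(hits, trials + 1))

# Hypothetical data (not from the study): 30 of 40 pride trials correct,
# three response options, so chance = 1/3.
print(f"p = {p_above_chance(30, 40, 1/3):.2e}")  # far below .05: recognition above chance
```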

5.
We examine the conditions under which the distinct positive emotions of hope versus pride facilitate more or less fluid cognitive processing. Using individuals' naturally occurring time of day preferences (i.e., morning vs. evening hours), we show that specific positive emotions can differentially influence processing resources. We argue that specific positive emotions are more likely to influence processing and behavior during nonoptimal times of day, when association-based processing is more likely. We show in three experiments that hope, pride, and a neutral state differentially influence fluid processing on cognitive tasks. Incidental hope facilitates fluid processing during nonoptimal times of day (compared with pride and neutral), improving performance on tasks requiring fluid intelligence (Experiment 1) and increasing valuation estimates on tasks requiring that preferences be constructed on the spot (Experiments 2 and 3). We also provide evidence that these differences in preference and valuation occur through a process of increased imagination (Experiment 3). We contribute to emotion theory by showing that different positive emotions have different implications for processing during nonoptimal times of day.

6.
We investigated whether emotional information from facial expression and hand movement quality was integrated when identifying the expression of a compound stimulus showing a static facial expression combined with emotionally expressive dynamic manual actions. The emotions (happiness, neutrality, and anger) expressed by the face and hands were either congruent or incongruent. In Experiment 1, the participants judged whether the stimulus person was happy, neutral, or angry. Judgments were mainly based on the facial expressions, but were affected by manual expressions to some extent. In Experiment 2, the participants were instructed to base their judgment on the facial expression only. An effect of hand movement expressive quality was observed for happy facial expressions. The results conform with the proposal that perception of facial expressions of emotions can be affected by the expressive qualities of hand movements.

7.
Previous research has found that 2- to 4-month-old infants display a behavioural pattern similar to adult expressions of shyness and related emotions (coyness, bashfulness, embarrassment). In the present study, 6 video-clips of this pattern and 10 of control patterns varying on a number of features and contexts were presented to 37 judges in a free-labelling task and in a rating task. Two examples of the target pattern were perceived as expressing primarily shyness and related emotions, three were perceived as expressing primarily happiness with varying degrees of these emotions, and one as expressing several other emotions as well as shyness and related ones. Yet, judges perceived shyness and related emotions almost exclusively in the target pattern, across different contexts and judgement tasks. The three clips perceived as most 'shy' were also used in a between-judges session to control for priming effects. Overall, results suggest that young infants can be perceived as shy, coy, bashful or embarrassed, in particular when their expressive behaviour resembles the relevant adult expressions. Implications for the early development of these emotions are considered. Copyright © 2005 John Wiley & Sons, Ltd.

8.
Recognizing emotion in faces: developmental effects of child abuse and neglect   (cited 12 times: 0 self-citations, 12 by others)
The contributions of (a) experience and learning versus (b) internal predispositions to the recognition of emotional signals are difficult to investigate because children are virtually always exposed to complex emotional experiences from birth. The recognition of emotion among physically abused and physically neglected preschoolers was assessed in order to examine the effects of atypical experience on emotional development. In Experiment 1, children matched a facial expression to an emotional situation. Neglected children had more difficulty discriminating emotional expressions than did control or physically abused children. Physically abused children displayed a response bias for angry facial expressions. In Experiment 2, children rated the similarity of facial expressions. Control children viewed discrete emotions as dissimilar, neglected children saw fewer distinctions between emotions, and physically abused children showed the most variance across emotions. These results suggest that to the extent that children's experience with the world varies, so too will their interpretation and understanding of emotional signals.

9.
Posed stimuli dominate the study of nonverbal communication of emotion, but concerns have been raised that the use of posed stimuli may inflate recognition accuracy relative to spontaneous expressions. Here, we compare recognition of emotions from spontaneous expressions with that of matched posed stimuli. Participants made forced-choice judgments about the expressed emotion and whether the expression was spontaneous, and rated expressions on intensity (Experiments 1 and 2) and prototypicality (Experiment 2). Listeners were able to accurately infer emotions from both posed and spontaneous expressions, from auditory, visual, and audiovisual cues. Furthermore, perceived intensity and prototypicality were found to play a role in the accurate recognition of emotion, particularly from spontaneous expressions. Our findings demonstrate that perceivers can reliably recognise emotions from spontaneous expressions, and that depending on the comparison set, recognition levels can even be equivalent to those of posed stimulus sets.
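Forced-choice data of this kind are usually summarised as a confusion matrix of intended versus chosen emotion, with per-emotion hit rates on the diagonal. The snippet below is an illustrative sketch with invented responses, not the study's data:

```python
from collections import Counter

# Hypothetical (intended_emotion, chosen_emotion) pairs from a forced-choice task.
responses = [
    ("anger", "anger"), ("anger", "disgust"),
    ("fear", "fear"), ("fear", "surprise"),
    ("happiness", "happiness"), ("happiness", "happiness"),
]

confusion = Counter(responses)  # counts per (intended, chosen) cell
emotions = sorted({label for pair in responses for label in pair})

for intended in emotions:
    total = sum(confusion[(intended, chosen)] for chosen in emotions)
    if total:  # skip labels never used as intended emotions
        hit_rate = confusion[(intended, intended)] / total
        print(f"{intended}: hit rate = {hit_rate:.2f} (n = {total})")
```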

10.
The recognition of emotional facial expressions is often subject to contextual influence, particularly when the face and the context convey similar emotions. We investigated whether spontaneous, incidental affective theory of mind inferences made while reading vignettes describing social situations would produce context effects on the identification of same-valenced emotions (Experiment 1) as well as differently-valenced emotions (Experiment 2) conveyed by subsequently presented faces. Crucially, we found an effect of context on reaction times in both experiments while, in line with previous work, we found evidence for a context effect on accuracy only in Experiment 1. This demonstrates that affective theory of mind inferences made at the pragmatic level of a text can automatically, contextually influence the perceptual processing of emotional facial expressions in a separate task even when those emotions are of a different valence. Thus, our novel findings suggest that language acts as a contextual influence on the recognition of emotional facial expressions for both same and different valences.

11.
Prior research suggested that pride is recognized only when a head and facial expression (e.g., tilted head with a slight smile) is combined with a postural expression (e.g., expanded body and arm gestures). However, these studies used static photographs. In the present research, participants labeled the emotion conveyed by four dynamic cues to pride, presented as video clips: head and face alone, body posture alone, voice alone, and an expression in which head and face, body posture, and voice were presented simultaneously. Participants attributed pride to the head and face alone, even when postural or vocal information was absent. Pride can be conveyed without body posture or voice.

12.
Past studies found that, for preschoolers, a story specifying a situational cause and behavioural consequence is a better cue to fear and disgust than is the facial expression of those two emotions, but the facial expressions used were static. Two studies (Study 1: N = 68, 36–68 months; Study 2: N = 72, 49–90 months) tested whether this effect could be reversed when the expressions were dynamic and included facial, postural, and vocal cues. Children freely labelled emotions in three conditions: story, still face, and dynamic expression. Story remained a better cue than still face or dynamic expression for fear and disgust and also for the later emerging emotions of embarrassment and pride.

13.
Perceiving emotions correctly is foundational to the development of interpersonal skills. Five-month-old infants' abilities to recognize, discriminate and categorize facial expressions of smiling were tested in three coordinated experiments. Infants were habituated to four degrees of smiling modeled by the same or different people; following habituation, infants were presented with a new degree of smile worn by the same and by a new person (Experiment 1), a new degree of smile and a fearful expression worn by the same person (Experiment 2) or a new degree of smile and a fearful expression worn by new people (Experiment 3). Infants showed significant novelty preferences for the new person smiling and for the fearful expressions over the new degree of smiling. These findings indicate that infants at 5 months can categorize the facial expression of smiling in static faces, and yet recognize the same person despite changes in facial expression; this is the youngest age at which these abilities have been demonstrated. The findings are discussed in light of the significance of emotion expression face processing in social interaction and infants' categorization of faces.
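Habituation studies like this typically express post-habituation looking as a novelty-preference score: looking time to the novel display divided by total looking time, where 0.5 indicates no preference. The numbers below are invented for illustration and are not data from the study:

```python
def novelty_preference(novel_s: float, familiar_s: float) -> float:
    """Proportion of looking time directed at the novel display; 0.5 means no preference."""
    return novel_s / (novel_s + familiar_s)

# Hypothetical infant: 7.2 s looking at the fearful face vs 4.8 s at the new degree of smile.
score = novelty_preference(7.2, 4.8)
print(f"novelty preference = {score:.2f}")  # 0.60: looks longer at the novel expression
```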

14.
The relationship between knowledge of American Sign Language (ASL) and the ability to encode facial expressions of emotion was explored. Participants were 55 college students, half of whom were intermediate-level students of ASL and half of whom had no experience with a signed language. In front of a video camera, participants posed the affective facial expressions of happiness, sadness, fear, surprise, anger, and disgust. These facial expressions were randomized onto stimulus tapes that were then shown to 60 untrained judges who tried to identify the expressed emotions. Results indicated that hearing subjects knowledgeable in ASL were generally more adept than were hearing nonsigners at conveying emotions through facial expression. Results have implications for better understanding the nature of nonverbal communication in hearing and deaf individuals.

15.
Research into emotional communication to date has largely focused on facial and vocal expressions. In contrast, recent studies by Hertenstein, Keltner, App, Bulleit, and Jaskolka (2006) and Hertenstein, Holmes, McCullough, and Keltner (2009) exploring nonverbal communication of emotion discovered that people could identify anger, disgust, fear, gratitude, happiness, love, sadness and sympathy from the experience of being touched on either the arm or body by a stranger, without seeing the touch. The study showed that strangers were unable to communicate the self-focused emotions embarrassment, envy and pride, or the universal emotion surprise. Literature relating to touch indicates that the interpretation of a tactile experience is significantly influenced by the relationship between the touchers (Coan, Schaefer, & Davidson, 2006). The present study compared the ability of romantic couples and strangers to communicate emotions solely via touch. Results showed that both strangers and romantic couples were able to communicate universal and prosocial emotions, whereas only romantic couples were able to communicate the self-focused emotions envy and pride.

16.
Narratives are not only about events, but also about the emotions those events elicit. Understanding a narrative involves not just the affective valence of implied emotional states, but the formation of an explicit mental representation of those states. In turn, this representation provides a mechanism that particularizes emotion and modulates its display, which then allows emotional expression to be modified according to particular contexts. This includes understanding that a character may feel an emotion but inhibit its display or even express a deceptive emotion. We studied how 59 school-aged children with head injury and 87 normally-developing age-matched controls understand real and deceptive emotions in brief narratives. Children with head injury showed less sensitivity than controls to how emotions are expressed in narratives. While they understood the real emotions in the text, and could recall what provoked the emotion and the reason for concealing it, they were less able than controls to identify deceptive emotions. Within the head injury group, factors such as an earlier age at head injury and frontal lobe contusions were associated with poor understanding of deceptive emotions. The results are discussed in terms of the distinction between emotions as felt and emotions as a cognitive framework for understanding other people's actions and mental states. We conclude that children with head injury understand emotional communication, the spontaneous externalization of real affect, but not emotive communication, the conscious, strategic modification of affective signals to influence others through deceptive facial expressions.

17.
18.
Emerging Insights Into the Nature and Function of Pride   (cited 3 times: 0 self-citations, 3 by others)
ABSTRACT— Pride, a "self-conscious" emotion involving complex self-evaluative processes, is a fundamental human emotion. Recent research provides new insights into its nature and function. Like the "basic" emotions, pride is associated with a distinct, universally recognized, nonverbal expression, which is spontaneously displayed during pride experiences. Yet, pride differs from the basic emotions in its dependency on self-evaluations and in its complex structure, which comprises two theoretically and conceptually distinct facets that have divergent personality correlates and cognitive antecedents. In this article, we summarize findings from the growing body of research on pride and highlight the implications of this research for a broader understanding of emotions and social behavior.

19.
The aim of this study was to investigate the causes of the own-race advantage in facial expression perception. In Experiment 1, we investigated Western Caucasian and Chinese participants' perception and categorization of facial expressions of six basic emotions that included two pairs of confusable expressions (fear and surprise; anger and disgust). People were slightly better at identifying facial expressions posed by own-race members (mainly in anger and disgust). In Experiment 2, we asked whether the own-race advantage was due to differences in the holistic processing of facial expressions. Participants viewed composite faces in which the upper part of one expression was combined with the lower part of a different expression. The upper and lower parts of the composite faces were either aligned or misaligned. Both Chinese and Caucasian participants were better at identifying the facial expressions from the misaligned images, showing interference on recognizing the parts of the expressions created by holistic perception of the aligned composite images. However, this interference from holistic processing was equivalent across expressions of own-race and other-race faces in both groups of participants. Whilst the own-race advantage in recognizing facial expressions does seem to reflect the confusability of certain emotions, it cannot be explained by differences in holistic processing.
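Composite-face designs commonly index holistic interference as an alignment effect: accuracy for misaligned composites minus accuracy for aligned composites, computed separately for own-race and other-race faces. The accuracies below are invented for illustration, not the reported results:

```python
def alignment_effect(acc_misaligned: float, acc_aligned: float) -> float:
    """Holistic-interference index; larger values mean stronger interference from aligned composites."""
    return acc_misaligned - acc_aligned

# Hypothetical proportions correct, purely illustrative.
own_race = alignment_effect(acc_misaligned=0.82, acc_aligned=0.71)
other_race = alignment_effect(acc_misaligned=0.80, acc_aligned=0.69)
print(f"own-race: {own_race:.2f}, other-race: {other_race:.2f}")
# Comparable effects across face race would mirror the pattern described in the abstract.
```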

20.
A great deal of what we know about the world has not been learned via first-hand observation but thanks to others' testimony. A crucial issue is to know which kinds of cues people use to evaluate information provided by others. In this context, recent studies in adults and children underline that informants' facial expressions could play an essential role. To test the importance of others' emotions in vocabulary learning, we used two avatars expressing happiness, anger or neutral emotions when proposing different verbal labels for an unknown object. Experiment 1 revealed that adult participants were significantly more likely than chance to choose the label suggested by the avatar displaying a happy face over the label suggested by the avatar displaying an angry face. Experiment 2 extended these results by showing that both adults and children as young as 3 years old showed this effect. These data suggest that decision making concerning newly acquired information depends on the informant's expressions of emotion, a finding that is consistent with the idea that behavioural intents have facial signatures that can be used to detect another's intention to cooperate.
