Similar Documents
1.
In two studies, subjects judged a set of facial expressions of emotion by providing labels of their own choice to describe the stimuli (free-choice condition), by choosing a label from a list of emotion words, or by choosing a story from a list of emotion stories (fixed-choice conditions). In the free-choice condition, levels of agreement between subjects on the predicted emotion categories for six basic emotions were significantly greater than chance levels, and comparable to those shown in fixed-choice studies. As predicted, there was little to no agreement on a verbal label for contempt. Agreement on contempt was greatly improved when subjects were allowed to identify the expression in terms of an antecedent event for that emotion rather than in terms of a single verbal label, a finding that could not be attributed to the methodological artifact of exclusion in a fixed-choice paradigm. These findings support two conclusions: (1) that the labels used in fixed-choice paradigms accurately reflect the verbal categories people use when free labeling facial expressions of emotion, and (2) that lexically ambiguous emotions, such as contempt, are understood in terms of their situational meanings. This research was supported in part by a Research Scientist Award from the National Institute of Mental Health (MH 06091) to Paul Ekman.

2.
The effects of Parkinson's disease (PD) on spontaneous and posed facial activity and on the control of facial muscles were assessed by comparing 22 PD patients with 22 controls. Facial activity was analysed using the Facial Action Coding System (FACS; Ekman & Friesen, 1978). As predicted, PD patients showed reduced levels of spontaneous and posed facial expression in reaction to unpleasant odours compared to controls. PD patients were less successful than controls in masking or intensifying negative facial expressions. PD patients were also less able than controls to imitate specific facial muscle movements, but did not differ in the ability to pose emotional facial expressions. These results suggest that not only is spontaneous facial activity disturbed in PD, but also to some degree the ability to pose facial expressions, to mask facial expressions with other expressions, and to deliberately move specific muscles in the face.

3.
The claim that specific discrete emotions can be universally recognized from human facial expressions is based mainly on the study of expressions that were posed. The current study (N=50) examined recognition of emotion from 20 spontaneous expressions from Papua New Guinea photographed, coded, and labeled by P. Ekman (1980). For the 16 faces with a single predicted label, endorsement of that label ranged from 4.2% to 45.8% (mean 24.2%). For 4 faces with 2 predicted labels (blends), endorsement of one or the other ranged from 6.3% to 66.6% (mean 38.8%). Of the 24 labels Ekman predicted, 11 were endorsed at an above-chance level, and 13 were not. Spontaneous expressions do not achieve the level of recognition achieved by posed expressions.
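The above-chance comparison in this abstract can be made concrete with a one-sided binomial test. The sketch below is illustrative only: the endorsement count and the chance rate are invented, since the abstract reports percentages but not the number of response labels offered.

```python
# Minimal sketch: does endorsement of the predicted label exceed chance?
# All numbers are hypothetical stand-ins, not the study's raw data.
from scipy.stats import binomtest

n_judges = 50        # N = 50, as reported in the abstract
endorsements = 21    # hypothetical count choosing the predicted label (42%)
chance_rate = 1 / 6  # assumed chance level for six basic-emotion options

result = binomtest(endorsements, n_judges, chance_rate, alternative="greater")
print(f"observed rate = {endorsements / n_judges:.1%}, p = {result.pvalue:.4f}")
```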

4.
Component theory (C. Smith & H. Scott, 1997) predicts that presence of component movements (action units) alters the decoded meaning of a basic emotional expression. We tested whether the meaning of the basic expression of anger varied when different components were present in the expression. Participants were asked to label variants of anger from Ekman and Friesen's Pictures of Facial Affect using 15 anger terms, and invariance of labeling was tested by manipulating the judgment task. Data were analyzed using consensus analysis, multidimensional scaling, and numerical scaling. Components did not result in consensus about fine distinctions in the meanings of the anger expressions. Varying the type of task strongly affected results. We believe this occurred because language elicits different categorization processes than does nonverbal evaluation of facial expressions.

5.
The view that certain facial expressions of emotion are universally agreed on has been challenged by studies showing that the forced-choice paradigm may have artificially forced agreement. This article addressed this methodological criticism by offering participants the opportunity to select a "none of these terms are correct" option from a list of emotion labels in a modified forced-choice paradigm. The results show that agreement on the emotion label for particular facial expressions is still greater than chance, that artifactual agreement on incorrect emotion labels is obviated, that participants select the "none" option when asked to judge a novel expression, and that adding 4 more emotion labels does not change the pattern of agreement reported in universality studies. Although the original forced-choice format may have been prone to artifactual agreement, the modified forced-choice format appears to remedy that problem.

6.
The present study investigates the perception of facial expressions of emotion, and explores the relation between the configural properties of expressions and their subjective attribution. Stimuli were a male and a female series of morphed facial expressions, interpolated between prototypes of seven emotions (happiness, sadness, fear, anger, surprise, disgust, and neutral) from Ekman and Friesen (1976). Topographical properties of the stimuli were quantified using the Facial Expression Measurement (FACEM) scheme. Perceived dissimilarities between the emotional expressions were elicited using a sorting procedure and processed with multidimensional scaling. Four dimensions were retained in the reconstructed facial-expression space, with positive and negative expressions opposed along D1, while the other three dimensions were interpreted as affective attributes distinguishing clusters of expressions categorized as "Surprise-Fear," "Anger," and "Disgust." Significant relationships were found between these affective attributes and objective facial measures of the stimuli. The findings support a componential explanatory scheme for expression processing, wherein each component of a facial stimulus conveys an affective value separable from its context, rather than a categorical-gestalt scheme. The findings further suggest that configural information is closely involved in the decoding of affective attributes of facial expressions. Configural measures are also suggested as a common ground for dimensional as well as categorical perception of emotional faces.
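The pipeline sketched in this abstract (sorting-based dissimilarities fed into multidimensional scaling) can be illustrated in a few lines. The matrix below is random placeholder data, not the study's; the study's four retained dimensions are mirrored in n_components.

```python
# Minimal MDS sketch on a precomputed dissimilarity matrix (placeholder data).
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
n_stimuli = 14                      # illustrative: 7 emotions x 2 posers
d = rng.random((n_stimuli, n_stimuli))
dissim = (d + d.T) / 2              # symmetrise judge-derived dissimilarities
np.fill_diagonal(dissim, 0.0)       # an expression is identical to itself

mds = MDS(n_components=4, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dissim)  # one 4-D point per expression
print(coords.shape, f"stress = {mds.stress_:.3f}")
```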

7.
Visual-field bias in the judgment of facial expression of emotion
The left and right hemispheres of the brain are differentially related to the processing of emotions. Although there is little doubt that the right hemisphere is relatively superior for processing negative emotions, controversy exists over the hemispheric role in the processing of positive emotions. Eighty right-handed normal male participants were examined for visual-field (left-right) differences in the perception of facial expressions of emotion. Facial composite (RR, LL) and hemifacial (R, L) sets depicting emotion expressions of happiness and sadness were prepared. Pairs of such photographs were presented bilaterally for 150 ms, and participants were asked to select the photographs that looked more expressive. A left visual-field superiority (a right-hemisphere function) was found for sad facial emotion. A hemispheric advantage in the perception of happy expression was not found.

8.
Facial expression and emotional stimuli were varied orthogonally in a 3 × 4 factorial design in order to test whether facial expression is necessary or sufficient to influence emotional experience. Subjects watched a film eliciting fear, sadness, or no emotion, while holding their facial muscles in the position characteristic of fear or sadness, or in an effortful but nonemotional grimace; those in a fourth group received no facial instructions. The subjects believed that the study concerned subliminal perception and that the facial positions were necessary to prevent physiological recording artifacts. The films had powerful effects on reported emotions, the facial expressions none. Correlations between facial expression and reported emotion were zero. Sad and fearful subjects showed distinctive patterns of physiological arousal. Facial expression also tended to affect physiological responses in a manner consistent with an effort hypothesis.

9.
Speech and song are universal forms of vocalization that may share aspects of emotional expression. Research has focused on parallels in acoustic features, overlooking facial cues to emotion. In three experiments, we compared moving facial expressions in speech and song. In Experiment 1, vocalists spoke and sang statements, each with five emotions. Vocalists exhibited emotion-dependent movements of the eyebrows and lip corners that transcended speech–song differences. Vocalists' jaw movements were coupled to their acoustic intensity, exhibiting differences across emotion and speech–song. Vocalists' emotional movements extended beyond vocal sound to include large sustained expressions, suggesting a communicative function. In Experiment 2, viewers judged silent videos of vocalists' facial expressions prior to, during, and following vocalization. Emotional intentions were identified accurately for movements during and after vocalization, suggesting that these movements support the acoustic message. Experiment 3 compared emotional identification in voice-only, face-only, and face-and-voice recordings. Emotions in voice-only singing were identified poorly, yet were identified accurately in all other conditions, confirming that facial expressions conveyed emotion more accurately than the voice in song but were equivalent to it in speech. Collectively, these findings highlight broad commonalities in the facial cues to emotion in speech and song, while also revealing differences in perception and acoustic-motor production.

10.
Individual differences in young children's frustration responses set the stage for myriad developmental outcomes and represent an area of intense empirical interest. Emotion regulation is hypothesized to comprise the interplay of complex behaviors, such as facial expressions, and activation of concurrent underlying neural systems. At present, however, the literature has mostly examined children's observed emotion regulation behaviors and assumed underlying brain activation through separate investigations, resulting in theoretical gaps in our understanding of how children regulate emotion in vivo. Our goal was to elucidate links between young children's emotion regulation-related neural activation, facial muscular movements, and parent-rated temperamental emotion regulation. Sixty-five children (ages 3-7) completed a frustration-inducing computer task while lateral prefrontal cortex (LPFC) activation and concurrent facial expressions were recorded. Negative facial expressions with eye constriction were inversely associated with both parent-rated temperamental emotion regulation and concurrent LPFC activation. Moreover, we found evidence that positive expressions with eye constriction during frustration may be associated with stronger LPFC activation. Results suggest a correspondence between facial expressions and LPFC activation that may explicate how children regulate emotion in real time.

11.
Facial emotions are important for human communication. Unfortunately, traditional facial emotion recognition tasks do not inform about how respondents might behave towards others expressing certain emotions. Approach-avoidance tasks do measure behaviour, but only on one dimension. In this study, 81 participants completed a novel Facial Emotion Response Task. Images displaying individuals with emotional expressions were presented in random order. Participants simultaneously indicated how communal (quarrelsome vs. agreeable) and how agentic (dominant vs. submissive) they would be in response to each expression. We found that participants responded differently to happy, angry, fearful, and sad expressions in terms of both dimensions of behaviour. Higher levels of negative affect were associated with less agreeable responses specifically towards happy and sad expressions. The Facial Emotion Response Task might complement existing facial emotion recognition and approach-avoidance tasks.

12.
We investigated attachment differences in the perception of facial emotion expressions. Participants completed a dimensional assessment of adult attachment and recognition accuracy tasks for positive and negative facial emotion expressions. Consistently, avoidant participants who were in romantic relationships, in comparison to singles, had lower decoding accuracy for facial expressions of positive emotions. The results were in line with the hypothesis that being in a relationship functions as a naturalistic prime of avoidant persons' defensive tendency to ignore affiliative signals, facial expressions of positive emotion in this instance. The results inform emerging research on attachment and emotion perception by highlighting the role of perceivers' motivated social cognitions.

13.
The Chimpanzee Facial Action Coding System (ChimpFACS) is an objective, standardized observational tool for measuring facial movement in chimpanzees based on the well-known human Facial Action Coding System (FACS; P. Ekman & W. V. Friesen, 1978). This tool enables direct structural comparisons of facial expressions between humans and chimpanzees in terms of their common underlying musculature. Here the authors provide data on the first application of the ChimpFACS to validate existing categories of chimpanzee facial expressions using discriminant function analyses. The ChimpFACS validated most existing expression categories (6 of 9) and, where the predicted group memberships were poor, the authors discuss potential problems with ChimpFACS and/or existing categorizations. The authors also report the prototypical movement configurations associated with these 6 expression categories. For all expressions, unique combinations of muscle movements were identified, and these are illustrated as peak intensity prototypical expression configurations. Finally, the authors suggest a potential homology between these prototypical chimpanzee expressions and human expressions based on structural similarities. These results contribute to our understanding of the evolution of emotional communication by suggesting several structural homologies between the facial expressions of chimpanzees and humans and facilitating future research.
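The validation logic described here (coding each expression event as a vector of action units, then testing whether a discriminant analysis recovers the a priori categories) can be sketched as follows. The data are random placeholders, not ChimpFACS codes; only the category count of 9 follows the abstract.

```python
# Minimal discriminant-analysis sketch with placeholder action-unit data.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_events, n_aus = 200, 15                  # illustrative event and AU counts
X = rng.integers(0, 2, (n_events, n_aus)).astype(float)  # AU present/absent
y = rng.integers(0, 9, n_events)           # 9 a priori expression categories

lda = LinearDiscriminantAnalysis()
scores = cross_val_score(lda, X, y, cv=5)  # agreement with a priori categories
print(f"mean cross-validated classification accuracy: {scores.mean():.2f}")
```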

14.
Facial expressions are one example of emotional behavior that illustrates the importance of emotions to both basic survival and social interaction. Basic facial responses to stimuli such as sweet and bitter taste are important for species fitness and are governed by simple rules. Even at this basic level, facial responses have communicative value to other species members. During evolution, simple facial responses were extended for use in more complex nonverbal communication; these responses are labile. The perception and production of facial expressions are cognitive processes, and numerous subcortical and cortical areas contribute to these operations. We suggest that no specific emotion center exists over and above cognitive systems in the brain, and that emotion should not be divorced from cognition.

15.
Multi-label tasks confound age differences in perceptual and cognitive processes. We examined age differences in emotion perception with a technique that did not require verbal labels. Participants matched the emotion expressed by a target to two comparison stimuli, one neutral and one emotional. Angry, disgusted, fearful, happy, and sad facial expressions of varying intensity were used. Although older adults took longer to respond than younger adults, younger adults outperformed older adults only for the lowest-intensity disgust and fear expressions. Some participants also completed an identity-matching task in which target stimuli were matched on personal identity instead of emotion. Although irrelevant to the judgment, expressed emotion still created interference. All participants were less accurate when the apparent difference in expressive intensity of the matched stimuli was large, suggesting that salient emotion cues increased the difficulty of identity matching. Age differences in emotion perception were limited to very low intensity expressions.

16.
Facial behaviors of medal winners of the judo competition at the 2004 Athens Olympic Games were coded with P. Ekman and W. V. Friesen's (1978) Facial Action Coding System (FACS) and interpreted using their Emotion FACS dictionary. Winners' spontaneous expressions were captured immediately when they completed medal matches, when they received their medal from a dignitary, and when they posed on the podium. The 84 athletes who contributed expressions came from 35 countries. The findings strongly supported the notion that expressions occur in relation to emotionally evocative contexts in people of all cultures, that these expressions correspond to the facial expressions of emotion considered to be universal, that expressions provide information that can reliably differentiate the antecedent situations that produced them, and that expressions that occur without inhibition are different from those that occur in social and interactive settings.

17.
The effects of Asian and Caucasian facial morphology were examined by having Canadian children categorize pictures of facial expressions of basic emotions. The pictures were selected from the Japanese and Caucasian Facial Expressions of Emotion set developed by D. Matsumoto and P. Ekman (1989). Sixty children between the ages of 5 and 10 years were presented with short stories and an array of facial expressions, and were asked to point to the expression that best depicted the specific emotion experienced by the characters. The results indicated that expressions of fear and surprise were better categorized from Asian faces, whereas expressions of disgust were better categorized from Caucasian faces. These differences stemmed from specific confusions between expressions.

18.
Pictures of facial expressions of emotion are used in a wide range of experiments. The last decade has seen an increase in the number of studies presenting local sets of emotion stimuli. However, only a few existing sets contain pictures of Latin Americans, despite the growing attention emotion research is receiving in this region. Here we present the development and validation of the Universidad Nacional de Cordoba, Expresiones de Emociones Faciales (UNCEEF), a Facial Action Coding System (FACS)-verified set of pictures of Argentineans expressing the six basic emotions, plus neutral expressions. FACS scores, recognition rates, Hu scores, and discrimination indices are reported. Evidence of convergent validity was obtained using the Pictures of Facial Affect in an Argentine sample. However, recognition accuracy was greater for UNCEEF. The importance of local sets of emotion pictures is discussed.
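Of the statistics this abstract reports, the Hu score (Wagner's 1993 unbiased hit rate) is the least self-explanatory: it corrects raw recognition rates for how often each response label is used overall. A minimal sketch with a made-up confusion matrix:

```python
# Unbiased hit rate Hu = a^2 / (b * c), where a = correct responses for a
# category, b = stimuli presented in that category, c = total uses of its label.
# The confusion matrix is invented; rows = emotion displayed, cols = label chosen.
import numpy as np

confusion = np.array([
    [18,  1,  1],   # e.g., happiness stimuli
    [ 2, 15,  3],   # e.g., sadness stimuli
    [ 1,  4, 15],   # e.g., fear stimuli
], dtype=float)

hits = np.diag(confusion)
row_totals = confusion.sum(axis=1)  # stimuli presented per category
col_totals = confusion.sum(axis=0)  # times each label was chosen
hu = hits**2 / (row_totals * col_totals)
print(np.round(hu, 3))              # one Hu value per emotion category
```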

19.
This study examined recognition of facial expressions of emotion among women diagnosed with borderline personality disorder (BPD; n = 21), compared to a group of women with histories of childhood sexual abuse and no current or prior diagnosis of BPD (n = 21) and a group of women with no history of sexual abuse or BPD (n = 20). Facial recognition was assessed with a slide set developed by Ekman and Matsumoto (Japanese and Caucasian Facial Expressions of Emotion and Neutral Faces, 1992), expanded and improved from previous slide sets, and scored with a coding system that allowed free responses rather than the more typical fixed-response format. Results indicated that borderline individuals were primarily accurate perceivers of others' emotions and showed a tendency toward heightened sensitivity in recognizing fear, specifically. Results are discussed in terms of emotional appraisal ability and emotion dysregulation among individuals with BPD.
