Similar Articles
20 similar articles found (search time: 31 ms)
1.
Recently, investigators have challenged long-standing assumptions that facial expressions of emotion follow specific emotion-eliciting events and relate to other emotion-specific responses. We address these challenges by comparing spontaneous facial expressions of anger, sadness, laughter, and smiling with concurrent, "on-line" appraisal themes from narrative data, and by examining whether coherence between facial and appraisal components was associated with increased experience of emotion. Consistent with claims that emotion systems are loosely coupled, facial expressions of anger and sadness co-occurred to a moderate degree with the expected appraisal themes, and when this happened, the experience of emotion was stronger. The results for the positive emotions were more complex, but lend credence to the hypothesis that laughter and smiling are distinct. Smiling co-occurred with appraisals of pride, but never occurred with appraisals of anger. In contrast, laughter occurred more often with appraisals of anger, a finding consistent with recent evidence linking laughter to the dissociation or undoing of negative emotion.

2.
In the leading model of face perception, facial identity and facial expressions of emotion are recognized by separate mechanisms. In this report, we provide evidence supporting the independence of these processes by documenting an individual with severely impaired recognition of facial identity yet normal recognition of facial expressions of emotion. NM, a 40-year-old prosopagnosic, showed severely impaired performance on five of six tests of facial identity recognition. In contrast, she performed in the normal range on four different tests of emotion recognition. Because the tests of identity recognition and emotion recognition assessed her abilities in a variety of ways, these results provide solid support for models in which identity recognition and emotion recognition are performed by separate processes.

3.
The objectives of this study were to propose a method of presenting dynamic facial expressions to experimental subjects, in order to investigate human perception of avatars' facial expressions at different levels of emotional intensity. The investigation concerned how perception varies according to the strength of facial expression, as well as according to an avatar's gender. To accomplish these goals, we generated a male and a female virtual avatar with five levels of intensity of happiness and anger using a morphing technique. We then recruited 16 normal healthy subjects and measured each subject's emotional reaction by scoring affective arousal and valence after showing them the avatar's face. Through this study, we were able to investigate human perceptual characteristics evoked by male and female avatars' graduated facial expressions of happiness and anger. In addition, we were able to identify that a virtual avatar's facial expression could affect human emotion in different ways according to the avatar's gender and the intensity of its facial expressions. However, we could also see that virtual faces have some limitations because they are not real: subjects recognized the expressions well, but were not influenced to the same extent. Although a virtual avatar has some limitations in conveying emotion through facial expressions, this study is significant in that it shows the potential to manipulate emotional intensity by controlling a virtual avatar's facial expression linearly using a morphing technique. Therefore, it is predicted that this technique may be used for assessing emotional characteristics of humans, and may be of particular benefit for work with people with emotional disorders through a presentation of dynamic expressions of various emotional intensities.
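The graded-intensity manipulation described above can be illustrated with a minimal sketch: linear morphing reduces, at its simplest, to alpha blending between a neutral face image and a full-intensity expression image. This is an illustration of the general technique, not the authors' software; the function name, array shapes, and five-level choice are assumptions for the example.

```python
import numpy as np

def morph_levels(neutral, expressive, n_levels=5):
    """Linearly interpolate between a neutral face image and a
    full-intensity expression image, returning n_levels graded frames.
    Both inputs are float arrays of identical shape (e.g. H x W or H x W x 3)."""
    # Evenly spaced blend weights, e.g. 0.2, 0.4, 0.6, 0.8, 1.0 for five levels.
    alphas = np.linspace(1.0 / n_levels, 1.0, n_levels)
    return [(1.0 - a) * neutral + a * expressive for a in alphas]

# Toy 2x2 "images": pixel intensity grows linearly toward the expressive frame.
neutral = np.zeros((2, 2))
happy = np.ones((2, 2))
frames = morph_levels(neutral, happy, n_levels=5)
```

Real morphing systems also warp facial geometry between the two endpoints, but the intensity scale itself is this same linear interpolation.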

4.
Recent research indicates that (a) the perception and expression of facial emotion are lateralized to a great extent in the right hemisphere, and (b) whereas facial expressions of emotion embody universal signals, culture-specific learning moderates the expression and interpretation of these emotions. In the present article, we review the literature on laterality and universality, and propose that, although some components of facial expressions of emotion are governed biologically, others are culturally influenced. We suggest that the left side of the face is more expressive of emotions, is more uninhibited, and displays culture-specific emotional norms. The right side of the face, on the other hand, is less susceptible to cultural display norms and exhibits more universal emotional signals.

5.
Most previous research reporting emotion-recognition deficits in schizophrenia has used posed facial expressions of emotion and chronic-schizophrenia patients. In contrast, the present research examined the ability of patients with acute paranoid and nonparanoid (disorganized) schizophrenia to recognize genuine as well as posed facial expressions of emotion. Evidence of an emotion-recognition deficit in schizophrenia was replicated, but only when posed facial expressions were used. For genuine expressions of emotion, the paranoid-schizophrenia group was more accurate than controls, nonparanoid-schizophrenia patients, and depressed patients. Future research clearly needs to consider the posed versus genuine nature of the emotional stimuli used and the type of schizophrenia patients examined.

6.
We investigated attachment differences in the perception of facial emotion expressions. Participants completed a dimensional assessment of adult attachment and recognition accuracy tasks for positive and negative facial emotion expressions. Consistently, avoidant participants who were in romantic relationships, in comparison to singles, had lower decoding accuracy for facial expressions of positive emotions. The results were in line with the hypothesis that being in a relationship functions as a naturalistic prime of avoidant persons' defensive tendency to ignore affiliative signals, facial expressions of positive emotion in this instance. The results inform emerging research on attachment and emotion perception by highlighting the role of perceivers' motivated social cognitions.

7.
E. Kotsoni, M. de Haan, & M. H. Johnson, Perception, 2001, 30(9), 1115-1125
Recent research indicates that adults show categorical perception of facial expressions of emotion. It is not known whether this is a basic characteristic of perception that is present from the earliest weeks of life, or whether it is one that emerges more gradually with experience in perceiving and interpreting expressions. We report two experiments designed to investigate whether young infants, like adults, show categorical perception of facial expressions. 7-month-old infants were shown photographic quality continua of interpolated (morphed) facial expressions derived from two prototypes of fear and happiness. In the first experiment, we used a visual-preference technique to identify the infants' category boundary between happiness and fear. In the second experiment, we used a combined familiarisation-visual-preference technique to compare infants' discrimination of pairs of expressions that were equally physically different but that did or did not cross the emotion-category boundary. The results suggest that 7-month-old infants (i) show evidence of categorical perception of facial expressions of emotion, and (ii) show persistent interest in looking at fearful expressions.

8.
Some evidence suggests that the cerebellum participates in the complex network processing emotional facial expression. To evaluate the role of the cerebellum in recognising facial expressions we delivered transcranial direct current stimulation (tDCS) over the cerebellum and prefrontal cortex. A facial emotion recognition task was administered to 21 healthy subjects before and after cerebellar tDCS; we also tested subjects with a visual attention task and a visual analogue scale (VAS) for mood. Anodal and cathodal cerebellar tDCS both significantly enhanced sensory processing in response to negative facial expressions (anodal tDCS, p=.0021; cathodal tDCS, p=.018), but left positive emotion and neutral facial expressions unchanged (p>.05). tDCS over the right prefrontal cortex left facial expressions of both negative and positive emotion unchanged. These findings suggest that the cerebellum is specifically involved in processing facial expressions of negative emotion.

10.
We examined how the recognition of facial emotion was influenced by manipulation of both spatial and temporal properties of 3-D point-light displays of facial motion. We started with the measurement of 3-D position of multiple locations on the face during posed expressions of anger, happiness, sadness, and surprise, and then manipulated the spatial and temporal properties of the measurements to obtain new versions of the movements. In two experiments, we examined recognition of these original and modified facial expressions: in experiment 1, we manipulated the spatial properties of the facial movement, and in experiment 2 we manipulated the temporal properties. The results of experiment 1 showed that exaggeration of facial expressions relative to a fixed neutral expression resulted in enhanced ratings of the intensity of that emotion. The results of experiment 2 showed that changing the duration of an expression had a small effect on ratings of emotional intensity, with a trend for expressions with shorter durations to have lower ratings of intensity. The results are discussed within the context of theories of encoding as related to caricature and emotion.
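The spatial-exaggeration manipulation described above is, in caricature terms, a scaling of each marker's displacement from the neutral pose. A minimal sketch follows; the function name, the (n_markers, 3) array layout, and the gain value are assumptions for illustration, not the authors' code.

```python
import numpy as np

def exaggerate(points, neutral, gain=1.5):
    """Scale each marker's 3-D displacement from the neutral pose by `gain`.
    gain > 1 exaggerates the expression, 0 < gain < 1 attenuates it,
    and gain = 1 reproduces the original movement.
    points, neutral: float arrays of shape (n_markers, 3)."""
    return neutral + gain * (points - neutral)

# Toy example: one marker displaced by (0, 1, 0) from its neutral position.
neutral = np.zeros((1, 3))
frame = np.array([[0.0, 1.0, 0.0]])
doubled = exaggerate(frame, neutral, gain=2.0)
```

Applied frame by frame to a recorded sequence, this changes only the spatial amplitude of the motion while leaving its timing untouched, which is what separates experiment 1's manipulation from experiment 2's.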

11.
Facial expressions of emotion are nonverbal behaviors that allow us to interact efficiently in social life and respond to events affecting our welfare. This article reviews 21 studies, published between 1932 and 2015, examining the production of facial expressions of emotion by blind people. It particularly discusses the impact of visual experience on the development of this behavior from birth to adulthood. After a discussion of three methodological considerations, the review of studies reveals that blind subjects demonstrate differing capacities for producing spontaneous expressions and voluntarily posed expressions. Seventeen studies provided evidence that blind and sighted people spontaneously produce the same pattern of facial expressions, even if some variations can be found, reflecting facial and body movements specific to blindness or differences in intensity and control of emotions in some specific contexts. This suggests that lack of visual experience does not have a major impact when this behavior is generated spontaneously in real emotional contexts. In contrast, eight studies examining voluntary expressions indicate that blind individuals have difficulty posing emotional expressions. The opportunity for prior visual observation seems to affect performance in this case. Finally, we discuss three new directions for research to provide additional, stronger evidence for the debate over the innate versus culture-constant learned character of the production of emotional facial expressions by blind individuals: the link between perception and production of facial expressions, the impact of display rules in the absence of vision, and the role of other channels in the expression of emotions in the context of blindness.

12.
Interpreting and responding appropriately to facial expressions of emotion are important aspects of social skills. Some children, adolescents, and adults with various psychological and psychiatric disorders recognize facial expressions less proficiently than their peers in the general population. We wished to determine if such deficits existed in a group of 133 children and adolescents with emotional and behavioral disorders (EBD). The subjects were receiving in-patient psychiatric services for at least one of substance-related disorders, adjustment disorders, anxiety disorders, mood disorders or disruptive behavior disorders. After being read stories describing various emotional reactions, all subjects were tested for their ability to recognize the 6 basic facial expressions of emotion depicted in Ekman and Friesen's (1976) normed photographs. Overall, they performed well on this task at levels comparable to those occurring in the general population. Accuracy increased with age, irrespective of gender, ethnicity, or clinical diagnosis. After adjusting for age effects, the subjects diagnosed with either adjustment disorders, mood disorders, or disruptive behavior disorders were significantly more accurate at identifying anger than those without those diagnoses. In addition, subjects with mood disorders identified sadness significantly more accurately than those without this diagnosis, although the effect was greatest with younger children.

13.
It has been proposed that self-face representations are involved in interpreting facial emotions of others. We experimentally primed participants' self-face representations. In Study 1, we assessed eye tracking patterns and performance on a facial emotion discrimination task, and in Study 2, we assessed emotion ratings between self and nonself groups. Results show that experimental priming of self-face representations increases visual exploration of faces, facilitates the speed of facial expression processing, and increases the emotional distance between expressions. These findings suggest that the ability to interpret facial expressions of others is intimately associated with the representations we have of our own faces.

14.
The authors aimed to examine the possible association between (a) accurately reading emotion in facial expressions and (b) social and academic competence among elementary school-aged children. Participants were 840 7-year-old children who completed a test of the ability to read emotion in facial expressions. Teachers rated children's social and academic behavior using behavioral rating scales. The authors found that children who had more difficulty identifying emotion in faces also were more likely to have more problems overall and, more specifically, with peer relationships among boys and with learning difficulties among girls. Findings suggest that nonverbal receptive skill plays a significant role in children's social and academic adjustment.

15.
We examined interference effects of emotionally associated background colours during fast valence categorisations of negative, neutral and positive expressions. According to implicitly learned colour–emotion associations, facial expressions were presented with colours that either matched the valence of these expressions or not. Experiment 1 included infrequent non-matching trials and Experiment 2 a balanced ratio of matching and non-matching trials. Besides general modulatory effects of contextual features on the processing of facial expressions, we found differential effects depending on the valence of target facial expressions. Whereas performance accuracy was mainly affected for neutral expressions, performance speed was specifically modulated by emotional expressions, indicating some susceptibility of emotional expressions to contextual features. Experiment 3 used two further colour–emotion combinations, but revealed only marginal interference effects, most likely due to missing colour–emotion associations. The results are discussed with respect to inherent processing demands of emotional and neutral expressions and their susceptibility to contextual interference.

16.
The present study investigated emotion recognition accuracy and its relation to social adjustment in 7-10 year-old children. The ability to recognize basic emotions from facial and vocal expressions was measured and compared to peer popularity and to teacher-rated social competence. The results showed that emotion recognition was related to these measures of social adjustment, but the gender of a child and emotion category affected this relationship. Emotion recognition accuracy was significantly related to social adjustment for the girls, but not for the boys. For the girls, especially the recognition of surprise was related to social adjustment. Together, these results suggest that the ability to recognize others' emotional states from nonverbal cues is an important socio-cognitive ability for school-aged girls.

17.
Emotional facial expressions provide important insights into various valenced feelings of humans. Recent cross-species neuroscientific advances offer insights into molecular foundations of mammalian affects and hence, by inference, the related emotional/affective facial expressions in humans. This is premised on deep homologies based on affective neuroscience studies of valenced primary emotional systems across species. Thus, emerging theoretical perspectives suggest that ancient cross-species emotional systems are intimately linked not only to emotional action patterns evident in all mammals, but also, by inference, to distinct emotional facial expressions studied intensively in humans. Thus, the goal of the present theoretical work was to relate categories of human emotional facial expressions—especially of anger, fear, joy and sadness—to respective underlying primary cross-mammalian emotional circuits. This can potentially provide coherent theoretical frameworks for the eventual molecular study of emotional facial expressions in humans.

18.
Multi-label tasks confound age differences in perceptual and cognitive processes. We examined age differences in emotion perception with a technique that did not require verbal labels. Participants matched the emotion expressed by a target to two comparison stimuli, one neutral and one emotional. Angry, disgusted, fearful, happy, and sad facial expressions of varying intensity were used. Although older adults took longer to respond than younger adults, younger adults only outmatched older adults for the lowest intensity disgust and fear expressions. Some participants also completed an identity matching task in which target stimuli were matched on personal identity instead of emotion. Although irrelevant to the judgment, expressed emotion still created interference. All participants were less accurate when the apparent difference in expressive intensity of the matched stimuli was large, suggesting that salient emotion cues increased difficulty of identity matching. Age differences in emotion perception were limited to very low intensity expressions.

19.
Looking at another person’s facial expression of emotion can trigger the same neural processes involved in producing the expression, and such responses play a functional role in emotion recognition. Disrupting individuals’ facial action, for example, interferes with verbal emotion recognition tasks. We tested the hypothesis that facial responses also play a functional role in the perceptual processing of emotional expressions. We altered the facial action of participants with a gel facemask while they performed a task that involved distinguishing target expressions from highly similar distractors. Relative to control participants, participants in the facemask condition demonstrated inferior perceptual discrimination of facial expressions, but not of nonface stimuli. The findings suggest that somatosensory/motor processes involving the face contribute to the visual perceptual—and not just conceptual—processing of facial expressions. More broadly, our study contributes to growing evidence for the fundamentally interactive nature of the perceptual inputs from different sensory modalities.

20.
The ability of the human face to communicate emotional states via facial expressions is well known, and past research has established the importance and universality of emotional facial expressions. However, recent evidence has revealed that facial expressions of emotion are most accurately recognized when the perceiver and expresser are from the same cultural ingroup. The current research builds on this literature and extends this work. Specifically, we find that mere social categorization, using a minimal-group paradigm, can create an ingroup emotion-identification advantage even when the culture of the target and perceiver is held constant. Follow-up experiments show that this effect is supported by differential motivation to process ingroup versus outgroup faces and that this motivational disparity leads to more configural processing of ingroup faces than of outgroup faces. Overall, the results point to distinct processing modes for ingroup and outgroup faces, resulting in differential identification accuracy for facial expressions of emotion.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)