Similar Articles
20 similar articles found.
1.
Detection of emotional facial expressions has been shown to be more efficient than detection of neutral expressions. However, it remains unclear whether this effect is attributable to visual or emotional factors. To investigate this issue, we conducted two experiments using the visual search paradigm with photographic stimuli. We included a single target facial expression of anger or happiness in presentations of crowds of neutral facial expressions. The anti-expressions of anger and happiness were also presented. Although anti-expressions produced changes in visual features comparable to those of the emotional facial expressions, they expressed relatively neutral emotions. The results consistently showed that reaction times (RTs) for detecting emotional facial expressions (both anger and happiness) were shorter than those for detecting anti-expressions. The RTs for detecting the expressions were negatively related to experienced emotional arousal. These results suggest that efficient detection of emotional facial expressions is not attributable to their visual characteristics but rather to their emotional significance.

2.
The relationship between knowledge of American Sign Language (ASL) and the ability to encode facial expressions of emotion was explored. Participants were 55 college students, half of whom were intermediate-level students of ASL and half of whom had no experience with a signed language. In front of a video camera, participants posed the affective facial expressions of happiness, sadness, fear, surprise, anger, and disgust. These facial expressions were randomized onto stimulus tapes that were then shown to 60 untrained judges who tried to identify the expressed emotions. Results indicated that hearing subjects knowledgeable in ASL were generally more adept than were hearing nonsigners at conveying emotions through facial expression. Results have implications for better understanding the nature of nonverbal communication in hearing and deaf individuals.

3.
The six basic emotions (disgust, anger, fear, happiness, sadness, and surprise) have long been considered discrete categories that serve as the primary units of the emotion system. Yet recent evidence has indicated underlying connections among them. Here we tested the underlying relationships among the six basic emotions using a perceptual learning procedure. This technique has the potential of causally changing participants' emotion detection ability. We found that training on detecting a facial expression improved performance not only on the trained expression but also on other expressions. Such a transfer effect was consistently demonstrated between disgust and anger detection as well as between fear and surprise detection in two experiments (Experiment 1A, n = 70; Experiment 1B, n = 42). Notably, training on any of the six emotions could improve happiness detection, while sadness detection could only be improved by training on sadness itself, suggesting the uniqueness of happiness and sadness. In an emotion recognition test using a large sample of Chinese participants (n = 1748), the confusion between disgust and anger as well as between fear and surprise was further confirmed. Taken together, our study demonstrates that the "basic" emotions share some common psychological components, which might be the more basic units of the emotion system.

4.
A possible relationship between recognition of facial affect and aberrant eye movement was examined in patients with schizophrenia. A Japanese version of standard pictures of facial affect was prepared. These pictures of basic emotions (surprise, anger, happiness, disgust, fear, sadness) were shown to 19 schizophrenic patients and 20 healthy controls who identified emotions while their eye movements were measured. The proportion of correct identifications of 'disgust' was significantly lower for schizophrenic patients, their eye fixation time was significantly longer for all pictures of facial affect, and their eye movement speed was slower for some facial affects (surprise, fear, and sadness). One index, eye fixation time for "happiness," showed a significant difference between the high- and low-dosage antipsychotic drug groups. Some expected facial affect recognition disorder was seen in schizophrenic patients responding to the Japanese version of affect pictures, but there was no correlation between facial affect recognition disorder and aberrant eye movement.

5.
Deficits in facial emotion recognition occur frequently after stroke, with adverse social and behavioural consequences. The aim of this study was to investigate the neural underpinnings of the recognition of emotional expressions, in particular of the distinct basic emotions (anger, disgust, fear, happiness, sadness and surprise). A group of 110 ischaemic stroke patients with lesions in (sub)cortical areas of the cerebrum was included. Emotion recognition was assessed with the Ekman 60 Faces Test of the FEEST. Patient data were compared to data of 162 matched healthy controls (HCs). For the patients, whole-brain voxel-based lesion–symptom mapping (VLSM) on 3-Tesla MRI images was performed. Results showed that patients performed significantly worse than HCs on overall recognition of emotions, and specifically on disgust, fear, sadness and surprise. VLSM showed significant lesion–symptom associations for FEEST total in the right fronto-temporal region. Additionally, VLSM for the distinct emotions showed, apart from overlapping brain regions (insula, putamen and Rolandic operculum), also regions related to specific emotions: middle and superior temporal gyrus (anger); caudate nucleus (disgust); superior corona radiata white-matter tract, superior longitudinal fasciculus and middle frontal gyrus (happiness); and inferior frontal gyrus (sadness). Our findings help in understanding how lesions in specific brain regions can selectively affect the recognition of the basic emotions.

6.
Behavioural problems are a key feature of frontotemporal lobar degeneration (FTLD). Also, FTLD patients show impairments in emotion processing. Specifically, the perception of negative emotional facial expressions is affected. Generally, however, negative emotional expressions are regarded as more difficult to recognize than positive ones, which thus may have been a confounding factor in previous studies. Also, ceiling effects are often present on emotion recognition tasks using full-blown emotional facial expressions. In the present study with FTLD patients, we examined the perception of sadness, anger, fear, happiness, surprise and disgust at different emotional intensities on morphed facial expressions to take task difficulty into account. Results showed that our FTLD patients were specifically impaired at the recognition of the emotion anger. Also, the patients performed worse than the controls on recognition of surprise, but performed at control levels on disgust, happiness, sadness and fear. These findings corroborate and extend previous results showing deficits in emotion perception in FTLD.

7.
Recognition of emotional facial expressions is a central area in the psychology of emotion. This study presents two experiments. The first experiment analyzed recognition accuracy for basic emotions including happiness, anger, fear, sadness, surprise, and disgust. 30 pictures (5 for each emotion) were displayed to 96 participants to assess recognition accuracy. The results showed that recognition accuracy varied significantly across emotions. The second experiment analyzed the effects of contextual information on recognition accuracy. Information congruent and not congruent with a facial expression was displayed before presenting pictures of facial expressions. The results of the second experiment showed that congruent information improved facial expression recognition, whereas incongruent information impaired such recognition.

8.
Findings from subjects with unilateral brain damage, as well as from normal subjects studied with tachistoscopic paradigms, argue that emotion is processed differently by each brain hemisphere. An open question concerns the extent to which such lateralised processing might occur under natural, free-viewing conditions. To explore this issue, we asked 28 normal subjects to discriminate emotions expressed by pairs of faces shown side-by-side, with no time or viewing constraints. Images of neutral expressions were shown paired with morphed images of very faint emotional expressions (happiness, surprise, disgust, fear, anger, or sadness). We found a surprising and robust laterality effect: When discriminating negative emotional expressions, subjects performed significantly better when the emotional face was to the left of the neutral face; conversely, when discriminating positive expressions, subjects performed better when the emotional face was to the right. We interpret this valence-specific laterality effect as consistent with the idea that the right hemisphere is specialised to process negative emotions, whereas the left is specialised to process positive emotions. The findings have important implications for how humans perceive facial emotion under natural conditions.

9.
Emotion theorists assume certain facial displays to convey information about the expresser's emotional state. In contrast, behavioral ecologists assume them to indicate behavioral intentions or action requests. To test these contrasting positions, over 2,000 online participants were presented with facial expressions and asked what they revealed: feeling states, behavioral intentions, or action requests. The majority of the observers chose feeling states as the message of facial expressions of disgust, fear, sadness, happiness, and surprise, supporting the emotions view. Only the anger display tended to elicit more choices of behavioral intention or action request, partially supporting the behavioral ecology view. The results support the view that facial expressions communicate emotions, with emotions being multicomponential phenomena that comprise feelings, intentions, and wishes.

10.
Hosie, J. A., Gray, C. D., Russell, P. A., Scott, C., & Hunter, N. (1998). Motivation and Emotion, 22(4), 293–313.
This paper reports the results of three tasks comparing the development of the understanding of facial expressions of emotion in deaf and hearing children. Two groups of hearing and deaf children of elementary school age were tested for their ability to match photographs of facial expressions of emotion, and to produce and comprehend emotion labels for the expressions of happiness, sadness, anger, fear, disgust, and surprise. Accuracy data showed comparable levels of performance for deaf and hearing children of the same age. Happiness and sadness were the most accurately matched expressions and the most accurately produced and comprehended labels. Anger was the least accurately matched expression and the most poorly comprehended emotion label. Disgust was the least accurately labeled expression; however, deaf children were more accurate at labeling this expression, and also at labeling fear, than hearing children. Error data revealed that children confused anger with disgust, and fear with surprise. However, the younger groups of deaf and hearing children also showed a tendency to confuse the negative expressions of anger, disgust, and fear with sadness. The results suggest that, despite possible differences in the early socialisation of emotion, deaf and hearing children share a common understanding of the emotions conveyed by distinctive facial expressions.

11.
To study different aspects of facial emotion recognition, valid methods are needed, and the more widespread methods have some limitations. We propose a more ecological method that consists of presenting dynamic faces and measuring verbal reaction times. We presented 120 video clips depicting a gradual change from a neutral expression to a basic emotion (anger, disgust, fear, happiness, sadness and surprise), and recorded hit rates and reaction times of verbal labelling of emotions. Verbal responses to the six basic emotions differed in both hit rates and reaction times, in the order happiness > surprise > disgust > anger > sadness > fear (emotions earlier in this ordering were labelled more accurately and more quickly). Our data are generally in accordance with previous findings, but this method differentiates responses to the six basic emotions better than previous experiments did.

12.
The present study investigates the perception of facial expressions of emotion, and explores the relation between the configural properties of expressions and their subjective attribution. Stimuli were a male and a female series of morphed facial expressions, interpolated between prototypes of seven expressions (happiness, sadness, fear, anger, surprise, disgust, and neutral) from Ekman and Friesen (1976). Topographical properties of the stimuli were quantified using the Facial Expression Measurement (FACEM) scheme. Perceived dissimilarities between the emotional expressions were elicited using a sorting procedure and processed with multidimensional scaling. Four dimensions were retained in the reconstructed facial-expression space, with positive and negative expressions opposed along D1, while the other three dimensions were interpreted as affective attributes distinguishing clusters of expressions categorized as "Surprise-Fear," "Anger," and "Disgust." Significant relationships were found between these affective attributes and objective facial measures of the stimuli. The findings support a componential explanatory scheme for expression processing, wherein each component of a facial stimulus conveys an affective value separable from its context, rather than a categorical-gestalt scheme. The findings further suggest that configural information is closely involved in the decoding of affective attributes of facial expressions. Configural measures are also suggested as a common ground for dimensional as well as categorical perception of emotional faces.
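Multidimensional scaling of sorted dissimilarities, as described above, can be sketched with classical (Torgerson) MDS. The four expression labels and the dissimilarity values below are purely illustrative assumptions, not data from the study:

```python
import numpy as np

# Hypothetical dissimilarity matrix between four facial expressions
# (illustrative values only, not the study's data).
labels = ["happiness", "surprise", "fear", "anger"]
D = np.array([
    [0.0, 0.6, 0.9, 1.0],
    [0.6, 0.0, 0.4, 0.8],
    [0.9, 0.4, 0.0, 0.5],
    [1.0, 0.8, 0.5, 0.0],
])

def classical_mds(D, k=2):
    """Classical (Torgerson) MDS: double-center the squared
    dissimilarities, then embed via the top-k eigenvectors."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    B = -0.5 * J @ (D ** 2) @ J           # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)        # ascending eigenvalues
    order = np.argsort(vals)[::-1][:k]    # keep the k largest
    scale = np.sqrt(np.clip(vals[order], 0, None))
    return vecs[:, order] * scale         # n x k coordinates

coords = classical_mds(D, k=2)
for name, (x, y) in zip(labels, coords):
    print(f"{name:10s} {x:+.2f} {y:+.2f}")
```

With more expressions and more retained dimensions (the study kept four), the same procedure yields the kind of facial-expression space the abstract refers to.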

13.
The Emotion Recognition Task is a computer-generated paradigm for measuring the recognition of six basic facial emotional expressions: anger, disgust, fear, happiness, sadness, and surprise. Video clips of increasing length were presented, starting with a neutral face that changes into a facial expression of different intensities (20%-100%). The present study describes methodological aspects of the paradigm and its applicability in healthy participants (N=58; 34 men; ages between 22 and 75), specifically focusing on differences in recognition performance between the six emotion types and age-related change. The results showed that happiness was the easiest emotion to recognize, while fear was the most difficult. Moreover, older adults performed worse than young adults on anger, sadness, fear, and happiness, but not on disgust and surprise. These findings indicate that this paradigm is probably more sensitive than emotion perception tasks using static images, suggesting it is a useful tool in the assessment of subtle impairments in emotion perception.
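The graded-intensity stimuli such tasks rely on amount to a linear blend between a neutral image and a full expression. A minimal sketch, with toy arrays standing in for face photographs (nothing here comes from the paradigm's actual stimuli):

```python
import numpy as np

def morph(neutral, expression, intensity):
    """Linearly blend a neutral face toward a full emotional expression.
    intensity is in [0, 1]; 0.2 -> 20% expression, 1.0 -> 100%."""
    return (1.0 - intensity) * neutral + intensity * expression

# Toy 2x2 grayscale "images" standing in for face photographs.
neutral = np.array([[0.5, 0.5], [0.5, 0.5]])
full    = np.array([[0.9, 0.1], [0.3, 0.7]])

# The paradigm's intensity steps: 20% to 100% in 20% increments.
frames = {level: morph(neutral, full, level)
          for level in (0.2, 0.4, 0.6, 0.8, 1.0)}
```

Real morphing software warps facial geometry as well as blending pixel values, but the intensity parameter plays the same role as in this pixel-only sketch.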

14.
Facial expressions of happiness, excitement, surprise, fear, anger, disgust, sadness, and calm were presented stereoscopically to create pairwise perceptual conflict. Dominance of one expression over another was the most common result, but basic emotions (happiness, fear, etc.) failed to dominate non-basic emotions (excitement, calm). Instead, extremely pleasant or extremely unpleasant emotions dominated less valenced emotions (e.g., surprise). Blends of the presented pairs also occurred, mainly when the emotions were adjacent in a circumplex structure of emotion. Blends were most common among negatively valenced emotions, such as fear, anger, and disgust.

15.
Age differences in emotion recognition from lexical stimuli and facial expressions were examined in a cross-sectional sample of adults aged 18 to 85 (N = 357). Emotion-specific response biases differed by age: Older adults were disproportionately more likely to incorrectly label lexical stimuli as happiness, sadness, and surprise and to incorrectly label facial stimuli as disgust and fear. After these biases were controlled, findings suggested that older adults were less accurate at identifying emotions than were young adults, but the pattern differed across emotions and task types. The lexical task showed stronger age differences than the facial task, and for lexical stimuli, age groups differed in accuracy for all emotional states except fear. For facial stimuli, in contrast, age groups differed only in accuracy for anger, disgust, fear, and happiness. Implications for age-related changes in different types of emotional processing are discussed.

16.
Using three tasks ("naming and copying," "selection and construction," and "free drawing"), this study examined the advantage effects of facial-feature cues such as the mouth, eyes, and eyebrows in young children's emotion recognition. The results showed that: (1) children recognized happiness, sadness, and anger earlier than surprise, performed best at recognizing happiness, and ages 3-4 were a period of rapid development in recognizing sadness and surprise; (2) no typical "mouth advantage" emerged during emotion recognition; rather, different emotions had different representative facial cues: the mouth was the dominant region when recognizing happiness and surprise, whereas the eyes and eyebrows were dominant when recognizing sadness and anger; (3) task order produced significant differences: prior naming, copying, and construction tasks significantly improved children's free-drawing performance.

17.
18.
We investigated whether categorical perception and dimensional perception can co-occur while decoding emotional facial expressions. In Experiment 1, facial continua with endpoints consisting of four basic emotions (i.e., happiness–fear and anger–disgust) were created by a morphing technique. Participants rated each facial stimulus using a categorical strategy and a dimensional strategy. The results show that the happiness–fear continuum was divided into two clusters based on valence, even when using the dimensional strategy. Moreover, the faces were arrayed in order of the physical changes within each cluster. In Experiment 2, we found a category boundary within other continua (i.e., surprise–sadness and excitement–disgust) with regard to the arousal and valence dimensions. These findings indicate that categorical perception and dimensional perception co-occurred when emotional facial expressions were rated using a dimensional strategy, suggesting a hybrid theory of categorical and dimensional accounts.

19.
20.
This study investigated how the cultural match or mismatch between observer and perceiver can affect the accuracy of judgements of facial emotion, and how acculturation can affect cross-cultural recognition accuracy. The sample consisted of 51 Caucasian-Australians, 51 people of Chinese heritage living in Australia (PCHA) and 51 Mainland Chinese. Participants were required to identify the emotions of happiness, sadness, fear, anger, surprise and disgust displayed in photographs of Caucasian and Chinese faces. The PCHA group also responded to an acculturation measure that assessed their adoption of Australian cultural values and adherence to heritage (Chinese) cultural values. Counter to the hypotheses, the Caucasian-Australian and PCHA groups were found to be significantly more accurate at identifying both the Chinese and Caucasian facial expressions than the Mainland Chinese group. Adoption of Australian culture was found to predict greater accuracy in recognising the emotions displayed on Caucasian faces for the PCHA group.


Copyright©北京勤云科技发展有限公司  京ICP备09084417号