Similar References
20 similar references found.
1.
The six basic emotions (disgust, anger, fear, happiness, sadness, and surprise) have long been considered discrete categories that serve as the primary units of the emotion system. Yet recent evidence has indicated underlying connections among them. Here we tested the underlying relationships among the six basic emotions using a perceptual learning procedure. This technique has the potential to causally change participants' emotion detection ability. We found that training on detecting a facial expression improved performance not only on the trained expression but also on other expressions. Such a transfer effect was consistently demonstrated between disgust and anger detection as well as between fear and surprise detection in two experiments (Experiment 1A, n = 70; Experiment 1B, n = 42). Notably, training on any of the six emotions could improve happiness detection, while sadness detection could only be improved by training on sadness itself, suggesting the uniqueness of happiness and sadness. In an emotion recognition test using a large sample of Chinese participants (n = 1748), the confusion between disgust and anger as well as between fear and surprise was further confirmed. Taken together, our study demonstrates that the “basic” emotions share some common psychological components, which might be the more basic units of the emotion system.

2.
3.
Abstract

Facial expressions of happiness, excitement, surprise, fear, anger, disgust, sadness, and calm were presented stereoscopically to create pairwise perceptual conflict. Dominance of one expression over another was the most common result, but basic emotions (happiness, fear, etc.) failed to dominate non-basic emotions (excitement, calm). Instead, extremely pleasant or extremely unpleasant emotions dominated less valenced emotions (e.g. surprise). Blends of the presented pairs also occurred, mainly when the emotions were adjacent according to a circumplex structure of emotion. Blends were most common among negatively valenced emotions, such as fear, anger, and disgust.

4.
Hosie, J. A., Gray, C. D., Russell, P. A., Scott, C., & Hunter, N. (1998). Motivation and Emotion, 22(4), 293-313.
This paper reports the results of three tasks comparing the development of the understanding of facial expressions of emotion in deaf and hearing children. Two groups of hearing and deaf children of elementary school age were tested for their ability to match photographs of facial expressions of emotion, and to produce and comprehend emotion labels for the expressions of happiness, sadness, anger, fear, disgust, and surprise. Accuracy data showed comparable levels of performance for deaf and hearing children of the same age. Happiness and sadness were the most accurately matched expressions and the most accurately produced and comprehended labels. Anger was the least accurately matched expression and the most poorly comprehended emotion label. Disgust was the least accurately labeled expression; however, deaf children were more accurate at labeling this expression, and also at labeling fear, than hearing children. Error data revealed that children confused anger with disgust, and fear with surprise. However, the younger groups of deaf and hearing children also showed a tendency to confuse the negative expressions of anger, disgust, and fear with sadness. The results suggest that, despite possible differences in the early socialisation of emotion, deaf and hearing children share a common understanding of the emotions conveyed by distinctive facial expressions.

5.
Recent research has shown that pride, like the "basic" emotions of anger, disgust, fear, happiness, sadness, and surprise, has a distinct, nonverbal expression that can be recognized by adults (J. L. Tracy & R. W. Robins, 2004b). In 2 experiments, the authors examined whether young children can identify the pride expression and distinguish it from expressions of happiness and surprise. Results suggest that (a) children can recognize pride at above-chance levels by age 4 years; (b) children recognize pride as well as they recognize happiness; (c) pride recognition, like happiness and surprise recognition, improves from age 3 to 7 years; and (d) children's ability to recognize pride cannot be accounted for by the use of a process of elimination (i.e., an exclusion rule) to identify an unknown entity. These findings have implications for the development of emotion recognition and children's ability to perceive and communicate pride.

6.
ABSTRACT. The authors investigated the association of traditional and cyber forms of bullying and victimization with emotion perception accuracy and emotion perception bias. Four basic emotions were considered (i.e., happiness, sadness, anger, and fear); 526 middle school students (280 females; M age = 12.58 years, SD = 1.16 years) were recruited, and emotionality was controlled. Results indicated no significant findings for girls. Boys with higher levels of traditional bullying did not show any deficit in perception accuracy of emotions, but they were prone to identify happiness and fear in faces when a different emotion was expressed; in addition, male cyberbullying was related to greater accuracy in recognizing fear. In terms of the victims, cyber victims had a global problem in recognizing emotions and a specific problem in processing anger and fear. It was concluded that emotion perception accuracy and bias were associated with bullying and victimization for boys not only in traditional settings but also in the electronic ones. Implications of these findings for possible intervention are discussed.

7.
To study different aspects of facial emotion recognition, valid methods are needed, and the most widespread methods have some limitations. We propose a more ecological method that consists of presenting dynamic faces and measuring verbal reaction times. We presented 120 video clips depicting a gradual change from a neutral expression to a basic emotion (anger, disgust, fear, happiness, sadness and surprise), and recorded hit rates and reaction times of verbal labelling of emotions. Our results showed that verbal responses to the six basic emotions differed in hit rates and reaction times: happiness > surprise > disgust > anger > sadness > fear (that is, responses to emotions earlier in this ordering were more accurate and faster). Generally, our data are in accordance with previous findings, but our differentiation of responses is finer than that reported in previous experiments on the six basic emotions.
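For readers curious how such verbal-labelling data are typically scored, the sketch below computes per-emotion hit rates and mean reaction times from a hypothetical trial list; the tuple layout and numbers are illustrative assumptions, not the authors' actual analysis pipeline.

```python
# Minimal sketch (assumed data layout): per-emotion hit rate and mean RT
# for correct verbal labels. The trial tuples are hypothetical.
from collections import defaultdict

# Each trial: (target_emotion, labelled_emotion, reaction_time_ms)
trials = [
    ("happiness", "happiness", 640),
    ("fear", "surprise", 910),
    ("disgust", "disgust", 780),
    # ... one tuple per video clip presented
]

stats = defaultdict(lambda: {"hits": 0, "total": 0, "rt_sum": 0.0})
for target, labelled, rt in trials:
    s = stats[target]
    s["total"] += 1
    if labelled == target:      # hit: the verbal label matches the target expression
        s["hits"] += 1
        s["rt_sum"] += rt       # RTs are usually summarised over correct trials only

for emotion, s in stats.items():
    hit_rate = s["hits"] / s["total"]
    mean_rt = s["rt_sum"] / s["hits"] if s["hits"] else float("nan")
    print(f"{emotion}: hit rate = {hit_rate:.2f}, mean RT = {mean_rt:.0f} ms")
```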

8.
Theory of mind studies of emotion usually focus on children's ability to predict other people's feelings. This study examined children's spontaneous references to mental states in explaining others' emotions. Children (4-, 6- and 10-year-olds, n = 122) were told stories and asked to explain both typical and atypical emotional reactions of characters. Because atypical emotional reactions are unexpected, we hypothesized that children would be more likely to refer to mental states, such as desires and beliefs, in explaining them than when explaining typical emotions. From the development of lay theories of emotion, we derived the prediction that older children would refer more often to mental states than younger children. The developmental shift from a desire-psychology to a belief-psychology led to the expectation that references to desires would increase at an earlier age than references to beliefs. Our findings confirmed these expectations only partly, because the nature of the emotion (happiness, anger, sadness or fear) interacted with these factors. Whereas anger, happiness and sadness mainly evoked desire references, fear evoked more belief references, even in 4-year-olds. The fact that other factors besides age can also play an influential role in children's mental state reasoning is discussed.

9.
The ability to recognize and label emotional facial expressions is an important aspect of social cognition. However, existing paradigms to examine this ability present only static facial expressions, suffer from ceiling effects or have limited or no norms. A computerized test, the Emotion Recognition Task (ERT), was developed to overcome these difficulties. In this study, we examined the effects of age, sex, and intellectual ability on emotion perception using the ERT. In this test, emotional facial expressions are presented as morphs gradually expressing one of the six basic emotions from neutral to four levels of intensity (40%, 60%, 80%, and 100%). The task was administered in 373 healthy participants aged 8–75. In children aged 8–17, only small developmental effects were found for the emotions anger and happiness, in contrast to adults who showed age‐related decline on anger, fear, happiness, and sadness. Sex differences were present predominantly in the adult participants. IQ only minimally affected the perception of disgust in the children, while years of education were correlated with all emotions but surprise and disgust in the adult participants. A regression‐based approach was adopted to present age‐ and education‐ or IQ‐adjusted normative data for use in clinical practice. Previous studies using the ERT have demonstrated selective impairments on specific emotions in a variety of psychiatric, neurologic, or neurodegenerative patient groups, making the ERT a valuable addition to existing paradigms for the assessment of emotion perception.
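Regression-based norming of the kind mentioned above is usually implemented by predicting a person's expected score from demographics and standardizing the residual. The sketch below illustrates that general idea with made-up coefficients, predictors, and variable names; it is not the ERT's published norms.

```python
# Minimal sketch of regression-based normative scoring (illustrative numbers only).
import numpy as np

def fit_norms(age, education, score):
    """Fit score ~ age + education on a normative sample; return coefficients and residual SD."""
    X = np.column_stack([np.ones_like(age), age, education])
    beta, *_ = np.linalg.lstsq(X, score, rcond=None)
    resid_sd = np.std(score - X @ beta, ddof=X.shape[1])
    return beta, resid_sd

def z_score(beta, resid_sd, age, education, observed):
    """Standardized deviation of an observed score from the demographically expected score."""
    expected = beta[0] + beta[1] * age + beta[2] * education
    return (observed - expected) / resid_sd

# Hypothetical normative sample
rng = np.random.default_rng(0)
age = rng.uniform(18, 75, 300)
edu = rng.uniform(8, 20, 300)
score = 30 - 0.05 * age + 0.2 * edu + rng.normal(0, 2, 300)

beta, sd = fit_norms(age, edu, score)
print(z_score(beta, sd, age=70, education=10, observed=24))  # e.g. a single emotion subscore
```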

10.
Abstract

Young children's early understanding of emotion was investigated by examining their use of emotion terms such as happy, sad, mad, and cry. Five children's emotion language was examined longitudinally from the age of 2 to 5 years, and as a comparison their reference to pains via such terms as burn, sting, and hurt was also examined. In Phase 1 we confirmed and extended prior findings demonstrating that by 2 years of age terms for the basic emotions of happiness, sadness, anger, and fear are commonly used by children, as are terms for such related states as crying and hurting. At this early age children produce such terms to refer to self and to others, and to past and future as well as to present states. Over the years from 2 to 5 children's emotion vocabulary expands, their discussion of hypothetical emotions gets underway, and the complexity of their emotion utterances increases. In Phase 2 our analyses go beyond children's production of emotion terms to analyses of their conception of emotion. We focus especially on when children use emotion terms to refer to subjective experiential states of persons. From their earliest uses of these terms in our data children

11.
Whether different emotions correspond to different physiological responses has long been controversial. Nummenmaa et al. (2014), using their emBODY tool, found that each emotion has its own distinct bodily sensation map (BSM). In the present study, Chinese university students used the emBODY tool to draw BSMs for four emotions (happiness, love, fear, and anxiety) and were also asked to verbally report the bodily sensations reflected in their BSMs. The results showed that the four emotions were associated with distinct bodily sensations, as reflected both in differences between the BSMs and in the subjective reports. Qualitative analysis revealed that the bodily sensations captured by the BSMs included not only physiological responses but also cognitions, feelings, and action tendencies, providing new evidence on the relationship between emotion and the body. Potential limitations of BSMs as a research tool include participants' inconsistent interpretations of increased versus decreased activity in body regions and the fact that only one side of the body is presented; future research should address these issues.
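A common way to quantify differences between bodily sensation maps of this kind is a mass-univariate comparison across silhouette pixels. The sketch below illustrates that general approach on simulated data; the array shapes, pixel count, and correction are assumptions, and this is not the emBODY toolbox itself.

```python
# Minimal sketch: pixelwise comparison of bodily sensation maps (BSMs) between two emotions.
# Shapes and thresholds are illustrative; this is not the emBODY implementation.
import numpy as np
from scipy import stats

n_subjects, n_pixels = 40, 5000            # one flattened body silhouette per subject
rng = np.random.default_rng(1)
bsm_happiness = rng.normal(0.3, 1.0, (n_subjects, n_pixels))   # positive values: reported increased activity
bsm_fear = rng.normal(0.0, 1.0, (n_subjects, n_pixels))        # negative values: reported decreased activity

# Paired t-test at every pixel, then a simple Bonferroni threshold
t, p = stats.ttest_rel(bsm_happiness, bsm_fear, axis=0)
significant = p < (0.05 / n_pixels)
print(f"{significant.sum()} of {n_pixels} pixels differ between the two maps")
```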

12.
Using three tasks ("naming and copying", "selective construction", and "free drawing"), this study examined the advantage effects of facial emotion cues such as the mouth, eyes, and eyebrows in young children's emotion recognition. The results showed that: (1) children recognized happiness, sadness, and anger earlier than surprise, performed best at recognizing happiness, and ages 3-4 were a period of rapid development in recognizing sadness and surprise; (2) no typical "mouth advantage" effect appeared in children's emotion recognition; instead, different emotions had different characteristic facial cues, with the mouth being the dominant region for recognizing happiness and surprise and the eyes and eyebrows being dominant for recognizing sadness and anger; (3) task order produced significant differences: completing the emotion naming, copying, and construction tasks first significantly improved children's free-drawing performance.

13.
In this study, deaf children's understanding of their own emotions was compared with that of hearing peers. Twenty-six deaf children (mean age 11 years) and 26 hearing children, matched for age and gender, were presented with various tasks that tap into their emotion awareness and regulation (coping) regarding the four basic emotions (happiness, anger, sadness, and fear). The findings suggest that deaf children have no difficulties in identifying their own basic emotions and the elicitors, or multiple emotions of opposite valence (happy and sad). Yet, they did show an impaired capacity to differentiate between their own emotions within the negative spectrum, which suggests a more generic evaluation of the situation. Deaf children's emotion regulation strategies showed a strong preference for approaching the situation at hand, but almost no deaf child reported the use of an avoidant tactic in order to diminish the negative impact of the situation. Overall, deaf children's emotion regulation strategies seemed less effective than those of their hearing peers. The implications for deaf children's emotional development are discussed.

14.
This article documents the neglect of love in many contemporary emotion theories, despite its prominence in the lay psychology of emotion. We argue that love should be considered a basic emotion, like anger, sadness, happiness, and fear. We discuss the criteria that various theorists use to distinguish basic from nonbasic emotions, and we marshal arguments and evidence from a variety of sources suggesting that love fits the criteria for basicness. We conclude that a number of controversies over the status of love can be resolved by distinguishing between the momentary surge form of love, a basic emotion having properties similar to joy, sadness, fear, etc., and relational love, a bond that develops between people, associated with states that include not only surge love, but many other emotions such as distress and anxiety. Finally, we suggest that “love” is the broad, everyday name for emotions related to three interrelated behavioral systems discussed by Bowlby (1979): attachment, caregiving, and sex.

15.
Information about the emotions experienced by observers when they witness crimes would have important theoretical and practical implications, but to date no study has broadly assessed such emotional reactions. This study addressed this gap in the literature. Observers in seven countries viewed seven videos portraying actual crimes and rated their emotional reactions to each using 14 emotion scales. Observers reported significantly high levels of negative emotions including anger, contempt, disgust, fear and sadness-related emotions, and anger, contempt and disgust were the most salient emotions experienced by viewers across all countries. Unexpectedly, witnesses also reported significantly high levels of positive emotions (compared to not feeling the emotion at all). Country moderated the emotion ratings; post-hoc analyses indicated that masculine-oriented cultures reported less nervousness, surprise, excitement, fear and embarrassment than feminine cultures.

16.
There are two contrasting hypotheses that attempt to explain how emotion perception might be organised in the brain. One suggests that all emotions are lateralised to the right hemisphere, whereas the other suggests that emotions may be differently lateralised according to valence. Here these two theories are contrasted, in addition to considering the role of emotional intensity in explaining possible differences in strength of lateralisation across emotions. Participants completed a Chimeric Faces Test for each of the six basic emotions: anger, disgust, fear, happiness, sadness and surprise. All emotions showed significant lateralisation to the right hemisphere; however, differences in strength of lateralisation within the right hemisphere were found. Stronger patterns of right hemisphere lateralisation were found for positive emotions and for emotions of higher intensity. The results support the right-hemisphere hypothesis, but suggest that there may be variability in organisation within the right hemisphere across different types of emotion.
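As a rough illustration of how a chimeric-faces laterality score can be derived, the sketch below computes a simple left/right bias index from per-trial choices; the trial coding and formula are generic assumptions and may differ from the exact measure used in this study.

```python
# Minimal sketch of a laterality index for a chimeric faces task.
# The trial format is hypothetical; the scoring follows the usual
# (left - right) / total convention, not necessarily the authors' exact measure.

def laterality_index(choices):
    """choices: list of 'L' / 'R' indicating whether the chimera with the emotional
    half in the viewer's left or right hemifield was judged more emotional."""
    left = choices.count("L")
    right = choices.count("R")
    return (left - right) / (left + right)   # +1 = complete left-hemiface (right-hemisphere) bias

# Example: 36 trials for one emotion
example = ["L"] * 24 + ["R"] * 12
print(laterality_index(example))   # 0.33 -> right-hemisphere advantage for this emotion
```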

17.
Haptics plays an important role in emotion perception. However, most studies of the affective aspects of haptics have investigated emotional valence rather than emotional categories. In the present study, we explored the associations of different textures with six basic emotions: fear, anger, happiness, disgust, sadness and surprise. Participants touched twenty-one different textures and evaluated them using six emotional scales. Additionally, we explored whether individual differences in participants' levels of alexithymia are related to the intensity of emotions associated with touching the textures. Alexithymia is a trait related to difficulties in identifying, describing and communicating emotions to others. The findings show that people associated touching different textures with distinct emotions. Textures associated with each of the basic emotions were identified. The study also revealed that a higher alexithymia level corresponds to a higher intensity of associations between textures and the emotions of disgust, anger and sadness.

18.
Age differences in emotion recognition from lexical stimuli and facial expressions were examined in a cross-sectional sample of adults aged 18 to 85 (N = 357). Emotion-specific response biases differed by age: Older adults were disproportionately more likely to incorrectly label lexical stimuli as happiness, sadness, and surprise and to incorrectly label facial stimuli as disgust and fear. After these biases were controlled, findings suggested that older adults were less accurate at identifying emotions than were young adults, but the pattern differed across emotions and task types. The lexical task showed stronger age differences than the facial task, and for lexical stimuli, age groups differed in accuracy for all emotional states except fear. For facial stimuli, in contrast, age groups differed only in accuracy for anger, disgust, fear, and happiness. Implications for age-related changes in different types of emotional processing are discussed.
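One standard way to control emotion-specific response biases of this kind is Wagner's (1993) unbiased hit rate, which discounts accuracy for labels a person uses indiscriminately. The sketch below illustrates that general correction on a made-up confusion matrix; it is not necessarily the adjustment the authors applied.

```python
# Minimal sketch: unbiased hit rate (Wagner, 1993) from a stimulus x response confusion matrix.
# The counts below are hypothetical; this is a generic bias correction for illustration.
import numpy as np

emotions = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]
# confusion[i, j] = number of times stimulus emotion i received response label j
confusion = np.array([
    [12, 3, 1, 0, 2, 2],
    [4, 11, 1, 0, 2, 2],
    [1, 1, 9, 0, 2, 7],
    [0, 0, 0, 19, 0, 1],
    [2, 1, 2, 0, 13, 2],
    [0, 0, 4, 1, 1, 14],
])

hits = np.diag(confusion).astype(float)
stimulus_totals = confusion.sum(axis=1)     # how often each emotion was presented
response_totals = confusion.sum(axis=0)     # how often each label was used (captures the bias)
unbiased_hit_rate = hits**2 / (stimulus_totals * response_totals)

for emo, hu in zip(emotions, unbiased_hit_rate):
    print(f"{emo}: Hu = {hu:.2f}")
```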

19.
This study compares the ability of children aged from 6 to 11 to freely produce emotional labels based on detailed scenarios (labelling task), and their ability to depict basic emotions in their human figure drawing (subsequent drawing task). This comparison assesses the relevance of the use of a human figure drawing task in order to test children's comprehension of basic emotions. Such a comparison has never been undertaken up to now, the two tasks being seen as belonging to relatively separate fields of investigation. Results indicate corresponding developmental patterns for both tasks and a clear-cut gap between simple emotions (happiness and sadness) and complex emotions (anger, fear, and disgust) in the ability to label and to depict basic emotions. These results suggest that a drawing task can be used to assess children's understanding of basic emotions. Results are discussed according to the development of perceptual skills and the development of emotion conceptualization.

20.
Reading the non-verbal cues from faces to infer the emotional states of others is central to our daily social interactions from very early in life. Despite the relatively well-documented ontogeny of facial expression recognition in infancy, our understanding of the development of this critical social skill throughout childhood into adulthood remains limited. To this end, using a psychophysical approach we implemented the QUEST threshold-seeking algorithm to parametrically manipulate the quantity of signals available in faces normalized for contrast and luminance displaying the six emotional expressions, plus neutral. We thus determined observers' perceptual thresholds for effective discrimination of each emotional expression from 5 years of age up to adulthood. Consistent with previous studies, happiness was most easily recognized with minimum signals (35% on average), whereas fear required the maximum signals (97% on average) across groups. Overall, recognition improved with age for all expressions except happiness and fear, for which all age groups including the youngest remained within the adult range. Uniquely, our findings characterize the recognition trajectories of the six basic emotions into three distinct groupings: expressions that show a steep improvement with age – disgust, neutral, and anger; expressions that show a more gradual improvement with age – sadness, surprise; and those that remain stable from early childhood – happiness and fear, indicating that the coding for these expressions is already mature by 5 years of age. Altogether, our data provide for the first time a fine-grained mapping of the development of facial expression recognition. This approach significantly increases our understanding of the decoding of emotions across development and offers a novel tool to measure impairments for specific facial expressions in developmental clinical populations.
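QUEST-style threshold seeking is a Bayesian adaptive procedure: a posterior over candidate thresholds is updated after every trial, and the next stimulus is placed where the posterior is highest. The sketch below is a simplified illustration of that idea with an assumed psychometric function and a simulated observer; it is not the exact QUEST implementation or the parameters used in this study.

```python
# Minimal sketch of a Bayesian adaptive threshold procedure in the spirit of QUEST.
# Psychometric function, prior, and parameters are simplified assumptions.
import numpy as np

np.random.seed(0)
signal_levels = np.linspace(1, 100, 100)          # % of facial signal available
posterior = np.ones_like(signal_levels)           # flat prior over candidate thresholds

def p_correct(signal, threshold, slope=0.1, guess=1/7, lapse=0.02):
    """Weibull-like psychometric function: probability of a correct expression label."""
    p = 1.0 - np.exp(-10 ** (slope * (signal - threshold)))
    return guess + (1.0 - guess - lapse) * p

def simulated_observer(signal, true_threshold=60.0):
    """Simulated participant, used here only to demonstrate the procedure."""
    return np.random.rand() < p_correct(signal, true_threshold)

for trial in range(40):
    posterior /= posterior.sum()
    test_level = signal_levels[np.argmax(posterior)]      # place the trial at the posterior mode
    correct = simulated_observer(test_level)
    likelihood = p_correct(test_level, signal_levels)     # outcome likelihood for each candidate threshold
    posterior *= likelihood if correct else (1.0 - likelihood)

estimate = signal_levels[np.argmax(posterior)]
print(f"Estimated signal threshold: {estimate:.1f}% of the face's signal")
```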
