Similar Articles
 20 similar articles found
1.
Six actors attempted to communicate, by facial expression, seven assumedly basic emotions (pleasure, surprise, fear, hate, sorrow, disgust, and interest) and all pairwise blends, e.g., fear + sorrow. One hundred and eighty-two subjects (divided into groups across the six actors) judged pictures of these emotions by three methods: (1) mapping, placing the pictures on coordinate systems with denoted axes; (2) identification; and (3) sorting similar emotions into the same pile, followed by multidimensional scaling and cluster analyses. Recognition of the emotions was fairly good, though not equally good for all emotions and their blends; the actors' ability to express emotions also varied. Emotions of opposite hedonic tone did not blend well. Interest seemed to lend poignancy to the basic emotion with which it was blended rather than to constitute an emotion in itself. Expressions seemed to be more easily identified if the actor did not try to feel the emotion too deeply.

2.
    
The current study tested whether men and women receive different degrees of social punishment for violating norms of emotional expression. Participants watched videos of male and female targets (whose reactions were pre-tested to be equivalent in expressivity and valence) viewing either a positive or negative slideshow, with their emotional reaction to the slideshow manipulated to be affectively congruent, affectively incongruent, or flat. Participants then rated the target on a number of social evaluation measures. Displaying an incongruent emotional expression, relative to a congruent one, harmed judgments of women more than men. Women are expected to be more emotionally expressive than men, making an incongruent expression more deviant for women. These results highlight the importance of social norms in construing another person’s emotion displays, which can subsequently determine acceptance or rejection of that person.

3.
    
The purpose of the present investigation was to assess whether interpersonal closeness facilitates earlier emotion detection as the emotional expression unfolds. Female undergraduate participants were paired with either a close friend or an acquaintance (n = 92 pairs). Participants viewed morphed movies of their partner and a stranger gradually shifting from a neutral to either a sad, angry, or happy expression. As predicted, findings indicate a closeness advantage. Close friends detected the onset of their partners’ angry and sad expressions earlier than acquaintances. Additionally, close friends were more accurate than acquaintances in identifying angry and sad expressions at the onset, particularly in non-vignette conditions when these expressions were devoid of context. These findings suggest that closeness does indeed facilitate emotional perception, particularly in ambiguous situations for negative emotions.

4.
    
Most people automatically withdraw from socially threatening situations. However, people high in trait anger could be an exception to this rule, and may even display an eagerness to approach hostile situations. To test this hypothesis, we asked 118 participants to complete an approach-avoidance task, in which participants made approach or avoidance movements towards faces with an angry or happy expression, and a direct or averted eye gaze. As expected, higher trait anger predicted faster approach (than avoidance) movements towards angry faces. Crucially, this effect occurred only for angry faces with a direct eye gaze, presumably because they pose a specific social threat, in contrast to angry faces with an averted gaze. No parallel effects were observed for happy faces, indicating that the effects of trait anger were specific to hostile stimuli. These findings suggest that people high in trait anger may automatically approach hostile interaction partners.

5.
    
Adults perceive emotional expressions categorically, with discrimination being faster and more accurate between expressions from different emotion categories (i.e. blends with two different predominant emotions) than between two stimuli from the same category (i.e. blends with the same predominant emotion). The current study sought to test whether facial expressions of happiness and fear are perceived categorically by pre-verbal infants, using a new stimulus set that was shown to yield categorical perception in adult observers (Experiments 1 and 2). These stimuli were then used with 7-month-old infants (N = 34) using a habituation and visual preference paradigm (Experiment 3). Infants were first habituated to an expression of one emotion, then presented with the same expression paired with a novel expression either from the same emotion category or from a different emotion category. After habituation to fear, infants displayed a novelty preference for pairs of between-category expressions, but not within-category ones, showing categorical perception. However, infants showed no novelty preference when they were habituated to happiness. Our findings provide evidence for categorical perception of emotional expressions in pre-verbal infants, while the asymmetrical effect challenges the notion of a bias towards negative information in this age group.

6.
Children with autism, mental retardation, or language disorders, and children in a clinical control group, were shown photographs of human female, orangutan, and canine (boxer) faces expressing happiness, sadness, anger, surprise, and a neutral expression. For each species of face, children were asked to identify the happy, sad, angry, or surprised expressions. In Experiment 1, error patterns suggested that children with autism were attending to features of the lower face when making judgements about emotional expressions. Experiment 2 supported this impression. When recognizing facial emotion, children without autism performed better when viewing the full face than when viewing the upper or lower face alone. Children with autism performed no better when viewing the full face than when viewing partial faces, and performed no better than chance when viewing the upper face alone. The results are discussed with respect to differences in the manner in which children with and without autism process social information communicated by the face.

7.
Using signal detection methods, possible effects of emotion type (happy, angry), gender of the stimulus face, and gender of the participant on the detection of, and response bias towards, emotion in briefly presented faces were investigated. Fifty-seven participants (28 men, 29 women) viewed 90 briefly presented faces (30 happy, 30 angry, and 30 neutral, each with 15 male and 15 female faces), answering yes if the face was perceived as emotional and no if it was not. Sensitivity [d', z(hit rate) minus z(false alarm rate)] and response bias (β, likelihood ratio of "signal plus noise" vs. "noise") were measured for each face combination at each presentation time (6.25, 12.50, 18.75, 25.00, 31.25 ms). The d' values were higher for happy than for angry faces and higher for angry-male than for angry-female faces, and there were no effects of gender-of-participant. Results also suggest a greater tendency for participants to judge happy-female faces as emotional, as shown by lower β values for these faces compared to the other emotion-gender combinations. This happy-female response bias suggests at least a partial explanation for happy-superiority effects in studies where performance is measured only as percent correct responses, and, more generally, that women are expected to be happy.
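The sensitivity and bias indices in the abstract above follow the standard equal-variance signal-detection definitions: d' = z(hit rate) − z(false alarm rate), and β is the likelihood ratio of the signal-plus-noise and noise distributions at the criterion. A minimal sketch of how these could be computed from raw trial counts — the counts and the log-linear correction are illustrative assumptions, not the study's procedure:

```python
import math
from statistics import NormalDist

def sdt_measures(hits, misses, false_alarms, correct_rejections):
    """Return (d', beta) under the equal-variance Gaussian SDT model.

    A log-linear correction (+0.5 per cell) keeps the hit and
    false-alarm rates away from 0 and 1, where z would be undefined.
    """
    z = NormalDist().inv_cdf
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z_hit, z_fa = z(hit_rate), z(fa_rate)
    d_prime = z_hit - z_fa                         # sensitivity
    beta = math.exp((z_fa**2 - z_hit**2) / 2.0)    # likelihood-ratio bias
    return d_prime, beta

# Example: 12 "yes" responses to 15 emotional faces, 3 to 15 neutral faces
d_prime, beta = sdt_measures(hits=12, misses=3,
                             false_alarms=3, correct_rejections=12)
print(f"d' = {d_prime:.2f}, beta = {beta:.2f}")
```

A β below 1 marks a liberal tendency to answer "yes" — the pattern the study reports for happy-female faces — while β above 1 marks a conservative one.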

8.
    
A preference for static face patterns is observed in newborns and disappears around 3 months after birth. A previous study has demonstrated that 5-month-old infants prefer schematic faces only when the internal features are moving, suggesting that face-specific movement enhances infants' preference. The present study investigates the facilitative effect of the movement of internal facial features on infants' preference. To examine infants' preference, we used animated face patterns consisting of a head-shaped contour and three disk blobs. The inner blobs expanded and contracted to represent the opening and closing of the eyes and mouth, and were constrained to open and close only in a biologically possible vertical direction resembling the facial muscle structure. We compared infants' preferential looking time for this vertically moving (VM) face pattern with their looking time for a horizontally moving (HM) face pattern in which blobs transformed at the same speed in a biologically impossible, horizontal direction. In Experiment 1, 7- to 8-month-olds preferred the VM to the HM, but 5- to 6-month-olds did not. However, the preference was diminished in both cases when the moving face patterns were presented without contour (Experiment 2). Our results suggest that internal facial features with vertical movements promote face preference in 7- to 8-month-olds. Copyright © 2011 John Wiley & Sons, Ltd.

9.
Using a task-free viewing paradigm, positive, neutral, and negative emotional faces were selected as materials and divided into three areas of interest (eyes, nose, and mouth) to examine Chinese participants' eye-movement fixation patterns when processing own-race versus Caucasian emotional faces. The results showed that when viewing own-race (positive, neutral, and negative) emotional faces, Chinese participants fixated on the eyes and nose significantly more often, and for longer, than on the mouth. When viewing Caucasian (positive, neutral, and negative) emotional faces, Chinese participants likewise fixated on the eyes and nose significantly more often, and for longer, than on the mouth, and additionally fixated on the nose significantly more often, and for longer, than on the eyes. These results indicate that when processing own-race facial expressions, Chinese participants treated the nose and eyes as the main fixation regions, whereas when processing Caucasian facial expressions they treated the nose as the single most important fixation region; that is, Chinese participants' fixation patterns differ between own-race and other-race facial expressions.

10.
    
Multiple facial cues, such as facial expression and face gender, simultaneously influence facial trustworthiness judgements in adults. The current work examined the effect of multiple facial cues on trustworthiness judgement across age groups. In Experiment 1, 8- and 10-year-olds and adults judged trustworthiness from happy and neutral adult faces (female and male). Experiment 2 included both adult and child faces wearing happy, angry, and neutral expressions; 9-, 11-, and 13-year-olds and adults rated facial trustworthiness on a 7-point Likert scale. The results of Experiments 1 and 2 revealed that facial expression and face gender affected facial trustworthiness judgements independently in children aged 10 and below, but simultaneously in children aged 11 and above, adolescents, and adults. There was no own-age bias in children or adults. The results show that children younger than 10 do not process multiple facial cues in the same manner as older children and adults when judging trustworthiness. The current findings provide evidence for the stable-feature account, but not for the own-age bias account or the expertise account.

11.
    
Exposure to fearful facial expressions enhances vision at low spatial frequencies and impairs vision at high spatial frequencies. This perceptual trade-off is thought to be a consequence of a fear-related activation of the magnocellular visual pathway to the amygdala. In this study we examined the generality of the effect of emotion on low-level visual perception by assessing participants' orientation sensitivity to low and high spatial-frequency targets following exposure to disgust, fear, and neutral facial expressions. The results revealed that exposure to fear and disgust expressions has opposing effects on early vision: fearful expressions enhanced low spatial-frequency vision and impaired high spatial-frequency vision, while disgust expressions, like neutral expressions, impaired low spatial-frequency vision and enhanced high spatial-frequency vision. Thus we show that the effect of exposure to fear on visual perception is not a general emotional effect, but rather one that may depend on amygdala activation, or one that may be specific to fear.

12.
Ratings of emotion in laterally presented faces: sex and handedness effects
Sixteen right-handed participants (8 male and 8 female students) and 16 left-handed participants (8 male and 8 female students) were presented with cartoon faces expressing emotions ranging from extremely positive to extremely negative. A forced-choice paradigm was used in which the participants were asked to rate the faces as either positive or negative. Compared to men, women rated faces more positively, especially in response to right visual field presentations. Women rated neutral and mildly positive faces more positively in the right than in the left visual field, whereas men rated these faces consistently across visual fields. Handedness did not affect the ratings of emotion. The data suggest a positive emotional bias of the left hemisphere in women.

13.
Developmental characteristics of and influences on the ability to discriminate emotional expressions
乔建中 (Qiao Jianzhong), 《心理科学》 (Psychological Science), 1998, 21(1): 52-56
This experiment examined how students of different ages discriminate various types of emotional expression, exploring the developmental characteristics of expression-discrimination ability and its relationship to individual emotional development. The results showed that: (1) the ability to discriminate bodily expressions is the core factor underlying age differences in expression discrimination; (2) the ability to discriminate facial expressions develops early and is already quite mature by primary school, whereas the ability to discriminate bodily expressions develops later, reaching the same level only by the college years; and (3) the development of bodily-expression discrimination is linked to the stage-specific characteristics of individual emotional development and to the principal social-adaptation problems of each stage.

14.
    
Sutherland and Young's perspective is a timely and rigorous examination of trait impressions based on facial cues. We propose three strategies to further advance the field: incorporating natural language processing, including diverse facial stimuli, and re-interpreting developmental data.

15.
    
We tested the hypothesis that exposure to babyish faces can serve a social surrogacy function, such that even limited exposure to babyish faces can fulfill social belongingness needs. We manipulated the sex and facial maturity of a target face seen in an imagined social interaction, on a between-participants basis. Regardless of target sex, individuals indicated greater satisfaction of social belongingness needs following an imagined interaction with a babyish face, compared to a mature adult face. These results indicate that brief exposure to babyish (relative to mature) faces, even without an extensive interaction, can lead to the satisfaction of social belongingness needs.

16.
Face-based social perception refers to the process by which perceivers infer personality traits and other attributes of a face's owner from facial information. Emotional expression is one of the key cues in face-based social perception. Expressions can influence social perception directly through their own local features and configural information, and can also do so by inducing emotion in the perceiver or by conveying behavioural tendencies. Given that combinations of multiple expression types and particular expressions (such as feigned expressions) occur frequently in everyday life, and that perceivers' trait judgements are inherently subjective, future research should strengthen the study of how multiple expression types affect face-based social perception, and should further treat perceiver characteristics as a research variable.

17.
    
This study examined the relationship between the intensity of emotional expressions in facial stimuli and receivers' decoding accuracy for six basic emotions: anger, disgust, fear, happiness, sadness, and surprise. A laboratory experiment was conducted using the forced-choice method, in which the intensity of each stimulus was manipulated at every 10% interval using the morphing technique. To explore whether a linear relationship would be observed when the intensity was finely manipulated at 10% intervals, a hierarchical multiple regression analysis was performed. The mean percentage of correct responses for each stimulus was the dependent variable, and the linear, quadratic, and cubic terms of the stimulus intensity were the independent variables. The results showed that the linear model was not adopted as the final model for all facial expressions; that is, the effect of the squared term of intensity was significant for anger, disgust, fear, and sadness, while the effect of the cubic term of intensity was significant for happiness and surprise. Our findings indicate that a higher intensity of emotional expression does not yield higher decoding accuracy.
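The hierarchical regression described above enters polynomial terms of intensity step by step and asks whether each term adds explanatory power over the simpler model. A minimal sketch of that comparison, using hypothetical accuracy values at each 10% morph step (the numbers are illustrative, not the study's data):

```python
import numpy as np

# Hypothetical mean % correct at morph intensities 10%..100%;
# a saturating (non-linear) pattern is assumed for illustration.
intensity = np.arange(0.1, 1.01, 0.1)
accuracy = np.array([22, 35, 52, 66, 76, 83, 88, 91, 92, 92]) / 100.0

def r_squared(x, y, degree):
    """R^2 of a least-squares polynomial fit of the given degree."""
    coeffs = np.polyfit(x, y, degree)
    pred = np.polyval(coeffs, x)
    ss_res = np.sum((y - pred) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return 1.0 - ss_res / ss_tot

# Hierarchical comparison: does each higher-order term improve fit?
for degree, label in [(1, "linear"), (2, "quadratic"), (3, "cubic")]:
    print(f"{label}: R^2 = {r_squared(intensity, accuracy, degree):.3f}")
```

For curved data like this, the jump in R² from the linear to the quadratic model mirrors the study's finding that a significant squared term rules out a purely linear intensity-accuracy relationship (the published analysis tested the increment with significance tests, which this sketch omits).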

18.
    
We tested the effects of mask use and the other-race effect on (a) face recognition, (b) recognition of facial expressions, and (c) social distance. Caucasian subjects were tested in a matching-to-sample paradigm with either masked or unmasked Caucasian and Asian faces. Participants performed best at recognizing an unmasked face and worst at recognizing a masked face that they had seen earlier without a mask. Accuracy was poorer for Asian faces than for Caucasian faces. The second experiment presented Asian or Caucasian faces with emotional expressions, with and without masks. Participants' emotion recognition performance decreased for masked faces. Emotions ranked from most to least accurately recognized were: happy, neutral, disgusted, fearful. Performance was poorer for Asian stimuli than for Caucasian stimuli. In Experiment 3 the same participants indicated the social distance they would prefer with each pictured person. They preferred a wider distance with unmasked faces than with masked faces. Preferred distance, from farthest to closest, was: disgusted, fearful, neutral, happy. They preferred a wider social distance for Asian than for Caucasian faces. Altogether, the findings indicated that during the COVID-19 pandemic mask wearing decreased recognition of faces and emotional expressions, negatively impacting communication among people from different ethnicities.

19.
20.
Various studies have indicated that children with ADHD have difficulties with the facial interpretation of affect. Research in the adult ADHD population is scarce overall. The present study explores how adults with ADHD react to a simple attention task and an emotion-containing task. Thirty adults clinically diagnosed with ADHD and 30 non-ADHD controls completed a computer-based task using a set of facial expressions standardized for Chile. The task was composed of two parts: first, a simple attention task with facial expressions; and second, a facial-expression-labelling task. Reaction time (RT) and accuracy (Acc) of responses were evaluated. Participants with ADHD responded significantly faster and were significantly less accurate in both tasks compared to controls. Across both groups, emotion-specific errors increased for the facial expression of anger. Additionally, the ADHD group was significantly faster in responding to anger, but not to happiness or neutral expressions. The impulsivity commonly associated with ADHD may account for the faster RT and lower Acc. Moreover, happiness may be more pleasant to identify than anger. These results are consistent with studies that have recorded greater Acc for positive emotions than for negative emotions.


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号