Similar Articles
20 similar articles found.
1.
An immense body of research demonstrates that emotional facial expressions can be processed unconsciously. However, it has been assumed that such processing takes place solely on a global valence-based level, allowing individuals to disentangle positive from negative emotions but not the specific emotion. In three studies, we investigated the specificity of emotion processing under conditions of limited awareness using a modified variant of an affective priming task. Faces with happy, angry, sad, fearful, and neutral expressions were presented as masked primes for 33 ms (Study 1) or 14 ms (Studies 2 and 3) followed by emotional target faces (Studies 1 and 2) or emotional adjectives (Study 3). Participants' task was to categorise the target emotion. In all three studies, discrimination of targets was significantly affected by the emotional primes beyond a simple positive versus negative distinction. Results indicate that specific aspects of emotions might be automatically disentangled in addition to valence, even under conditions of subjective unawareness.

3.
Recognition of emotional facial expressions is a central area in the psychology of emotion. This study presents two experiments. The first experiment analyzed recognition accuracy for basic emotions including happiness, anger, fear, sadness, surprise, and disgust. Thirty pictures (five per emotion) were displayed to 96 participants to assess recognition accuracy. The results showed that recognition accuracy varied significantly across emotions. The second experiment analyzed the effects of contextual information on recognition accuracy. Contextual information either congruent or incongruent with a facial expression was displayed before presenting pictures of facial expressions. The results of the second experiment showed that congruent information improved facial expression recognition, whereas incongruent information impaired such recognition.

4.
胡治国, 刘宏艳. 《心理科学》 (Journal of Psychological Science), 2015, (5): 1087-1094
Accurate recognition of facial expressions is essential for successful social interaction, and such recognition is influenced by emotional context. This review first describes how emotional context can facilitate facial expression recognition, chiefly through within-modality emotional congruency effects and cross-modal emotional integration effects. It then describes how emotional context can impair recognition, chiefly through emotional conflict effects and semantic interference effects. Next, it examines how emotional context affects the recognition of neutral and ambiguous faces, mainly through context-induced emotion effects and subliminal affective priming effects. Finally, the existing research is summarized and directions for future work are proposed.

5.
王亚鹏, 董奇. 《心理科学》 (Journal of Psychological Science), 2006, 29(6): 1512-1514
This paper reviews the brain mechanisms of emotion processing from three perspectives: neuroimaging studies of emotional valence, of facial expression recognition, and of experimentally induced emotion. Existing findings indicate that the cortical regions engaged in processing emotions of different valence overlap substantially. Studies of facial expression recognition show that distinct neural circuits mediate responses to different facial expressions, and studies of induced emotion show that the anterior cingulate cortex plays a central role in representing experimentally induced emotional states. The paper closes by noting open problems in current emotion research and the significance of pursuing research on the brain mechanisms of emotion in China.

6.
Previous studies have demonstrated that emotions are automatically processed. Even with subliminal presentations, subjects involuntarily mimic specific facial expressions, are influenced by the valence of a preceding emotion during judgments, and exhibit slowed responses to personally meaningful emotions; these effects are due to reflexive mimicry, unconscious carryover of valence, and attentional capture, respectively. However, perception-action effects indicate that rapid processing should involve deep, semantic-level representations of emotion (e.g., "fear"), even in the absence of a clinical emotion disorder. To test this hypothesis, we developed an emotional Stroop task (Emostroop) in which subjects responded nonverbally to emotion words superimposed over task-irrelevant images of faces displaying congruent or incongruent emotional expressions. Subjects reliably responded more slowly to incongruent than to congruent stimuli, and this interference was related to trait measures of emotionality. Rapid processing of facial emotions spontaneously activates semantic, content-rich representations at the level of the specific emotion.

7.
Our facial expressions give others the opportunity to access our feelings, and constitute an important nonverbal tool for communication. Many recent studies have investigated emotional perception in adults, and our knowledge of the neural processes involved in emotions is increasingly precise. Young children also use faces to express their internal states and perceive emotions in others, but little is known about the neurodevelopment of expression recognition. The goal of the current study was to characterize the normal development of facial emotion perception. We recorded ERPs in 82 children 4 to 15 years of age during an implicit processing task with emotional faces. Task and stimuli were the same as those used and validated in an adult study; we focused on the components that showed sensitivity to emotions in adults (P1, N170, and frontal slow wave). An effect of the emotion expressed by faces was seen on the P1 in the youngest children. With increasing age this effect disappeared while an emotional sensitivity emerged on the N170. Early emotional processing in young children differed from that observed in the adolescents, whose pattern approached that of adults. In contrast, the later frontal slow wave, although showing typical age effects, was more positive for neutral and happy faces across age groups. Thus, despite the precocious utilization of facial emotions, the neural processing involved in the perception of emotional faces develops in a staggered fashion throughout childhood, with the adult pattern appearing only late in adolescence.

9.
Visual-field bias in the judgment of facial expression of emotion   Cited by: 2 (self-citations: 0; by others: 2)
The left and right hemispheres of the brain are differentially related to the processing of emotions. Although there is little doubt that the right hemisphere is relatively superior for processing negative emotions, controversy exists over the hemispheric role in the processing of positive emotions. Eighty right-handed normal male participants were examined for visual-field (left-right) differences in the perception of facial expressions of emotion. Facial composite (RR, LL) and hemifacial (R, L) sets depicting emotion expressions of happiness and sadness were prepared. Pairs of such photographs were presented bilaterally for 150 ms, and participants were asked to select the photographs that looked more expressive. A left visual-field superiority (a right-hemisphere function) was found for sad facial emotion. A hemispheric advantage in the perception of happy expression was not found.

10.
The relationship between knowledge of American Sign Language (ASL) and the ability to encode facial expressions of emotion was explored. Participants were 55 college students, half of whom were intermediate-level students of ASL and half of whom had no experience with a signed language. In front of a video camera, participants posed the affective facial expressions of happiness, sadness, fear, surprise, anger, and disgust. These facial expressions were randomized onto stimulus tapes that were then shown to 60 untrained judges who tried to identify the expressed emotions. Results indicated that hearing subjects knowledgeable in ASL were generally more adept than were hearing nonsigners at conveying emotions through facial expression. Results have implications for better understanding the nature of nonverbal communication in hearing and deaf individuals.

11.
Current theories of emotion perception posit that basic facial expressions signal categorically discrete emotions or affective dimensions of valence and arousal. In both cases, the information is thought to be directly "read out" from the face in a way that is largely immune to context. In contrast, the three studies reported here demonstrated that identical facial configurations convey strikingly different emotions and dimensional values depending on the affective context in which they are embedded. This effect is modulated by the similarity between the target facial expression and the facial expression typically associated with the context. Moreover, by monitoring eye movements, we demonstrated that characteristic fixation patterns previously thought to be determined solely by the facial expression are systematically modulated by emotional context already at very early stages of visual processing, even by the first time the face is fixated. Our results indicate that the perception of basic facial expressions is not context invariant and can be categorically altered by context at early perceptual levels.

12.
In Study 1, we examined the moderating impact of alexithymia (i.e., a difficulty identifying and describing feelings to other people and an externally oriented cognitive style) on the automatic processing of affective information. The affective priming paradigm was used, and high alexithymia scorers showed lower priming effects when congruent or incongruent pairs involving a nonverbal prime (an angry face) and a verbal target were presented. The results held after controlling for participants' negative affectivity. The same effects were replicated in Studies 2 and 3, with trait anxiety and depression entered as additional covariates. In Study 3, no moderating impact of alexithymia was found for verbal-facial pairs, suggesting that the results cannot be merely explained in terms of transcoding limitations for high alexithymia scorers. Overall, the present results suggest that alexithymia could be related to a difficulty in processing and automatically using high-arousal emotional information to respond to concomitant behavioural demands.

13.
Research on facial expression recognition has long centred on the structural features of the face itself, but recent studies show that recognition is also influenced by the context in which a face appears (e.g., language, body posture, natural and social scenes), and this contextual influence is especially strong when the expressions to be identified resemble one another. This review first surveys recent research on how verbal context, body movements, natural scenes, and social scenes affect the recognition of facial expressions. It then analyzes how factors such as cultural background, age, and anxiety level modulate these context effects. Finally, it argues that future research should include child participants, extend the range of emotion categories studied, and attend to facial emotion perception in real-life settings.

14.
We investigated whether emotional information from facial expression and hand movement quality was integrated when identifying the expression of a compound stimulus showing a static facial expression combined with emotionally expressive dynamic manual actions. The emotions (happiness, neutrality, and anger) expressed by the face and hands were either congruent or incongruent. In Experiment 1, the participants judged whether the stimulus person was happy, neutral, or angry. Judgments were mainly based on the facial expressions, but were affected by manual expressions to some extent. In Experiment 2, the participants were instructed to base their judgment on the facial expression only. An effect of hand movement expressive quality was observed for happy facial expressions. The results conform with the proposal that perception of facial expressions of emotions can be affected by the expressive qualities of hand movements.

15.
Emotional cues contain important information about the intentions and feelings of others. Despite a wealth of research into children's understanding of facial signals of emotions, little research has investigated the developmental trajectory of interpreting affective cues in the voice. In this study, 48 children ranging between 5 and 10 years were tested using forced-choice tasks with non-verbal vocalizations and emotionally inflected speech expressing different positive, neutral and negative states. Children as young as 5 years were proficient in interpreting a range of emotional cues from vocal signals. Consistent with previous work, performance was found to improve with age. Furthermore, the two tasks, examining recognition of non-verbal vocalizations and emotionally inflected speech, respectively, were sensitive to individual differences, with high correspondence of performance across the tasks. From this demonstration of children's ability to recognize emotions from vocal stimuli, we also conclude that this auditory emotion recognition task is suitable for a wide age range of children, providing a novel, empirical way to investigate children's affect recognition skills.

16.
Recent research indicates that (a) the perception and expression of facial emotion are lateralized to a great extent in the right hemisphere, and (b) whereas facial expressions of emotion embody universal signals, culture-specific learning moderates the expression and interpretation of these emotions. In the present article, we review the literature on laterality and universality, and propose that, although some components of facial expressions of emotion are governed biologically, others are culturally influenced. We suggest that the left side of the face is more expressive of emotions, is more uninhibited, and displays culture-specific emotional norms. The right side of the face, on the other hand, is less susceptible to cultural display norms and exhibits more universal emotional signals.

17.
Some theoretical and applied implications of individual differences in nonverbal expressiveness were investigated in a medical setting. In Study I, the abilities of 21 physicians to express different emotions through voice tone were assessed and related to physician personality and to actual patient ratings of the physician. Study II replicated Study I using visual as well as vocal cues (i.e., videotapes) of a new sample of physicians, and added a study of physician greetings. It was found that: (1) aspects of expressive ability were reliably correlated with a cluster of personality traits, thus supporting the notion that nonverbal affective style may be a window to inner dispositions; and (2) expressive ability was related to patient satisfaction with the interpersonal manner of their physicians and to the judged likeability of the physician's greeting, thus providing evidence for the importance of this ability for social interaction.

18.
The utility of recognising emotion expressions for coordinating social interactions is well documented, but less is known about how continuously changing emotion displays are perceived. The nonlinear dynamic systems view of emotions suggests that mixed emotion expressions in the middle of displays of changing expressions may be decoded differently depending on the expression origin. Hysteresis is when an impression (e.g., disgust) persists well after changes in facial expressions that favour an alternative impression (e.g., anger). In expression changes based on photographs (Study 1) and avatar images (Studies 2a-c, 3), we found hystereses particularly in changes between emotions that are perceptually similar (e.g., anger-disgust). We also consistently found uncertainty (neither emotion contributing to the mixed expression was perceived), which was more prevalent in expression sequences than in static images. Uncertainty occurred particularly in changes between emotions that are perceptually dissimilar, such as changes between happiness and negative emotions. This suggests that the perceptual similarity of emotion expressions may determine the extent to which hysteresis and uncertainty occur. Both hysteresis and uncertainty effects support our premise that emotion decoding is state dependent, a characteristic of dynamic systems. We propose avenues to test possible underlying mechanisms.

19.
Hess, U. Emotion (Washington, D.C.), 2003, 3(1): 76-80; discussion 92-96
P. Rozin and A. B. Cohen (2003) contend that confusion is an emotion because it is valenced, it has a distinct facial expression, and it has a distinct internal state. On the basis of these criteria, they call for further study of this unstudied state and challenge emotion researchers to consider "confusion" to be an emotion. The author agrees with Rozin and Cohen (2003) that confusion is an affective state, is valenced, has an (internal) object, may be expressed facially, and that laypersons may, under certain circumstances, consider it an emotion. However, its expression is likely to be an expressive component of emotions for which goal obstruction is central. Further, confusion may not be as commonly considered an emotion by laypersons as Rozin and Cohen contend. Finally, confusion is not unstudied; it is simply that, most of the time, those studying it are not emotion researchers.

20.
In this study we used an affective priming task to address the issue of whether the processing of emotional facial expressions occurs automatically, independent of attention or attentional resources. Participants had to attend either to the emotion expression of the prime face or to a nonemotional feature of the prime face, the glasses. When participants attended to the glasses (emotion unattended), they had to report whether the face wore glasses or not (the glasses, easy condition) or whether the glasses were rounded or squared (the shape, difficult condition). Affective priming, measured on valence decisions on target words, was mainly defined as interference from incongruent rather than facilitation from congruent trials. Significant priming effects were observed only in the emotion and glasses tasks but not in the shape task. When the key-response mapping increased in complexity, taxing working memory load, affective priming effects were reduced equally for the three types of tasks. Thus, attentional load and working memory load contributed additively to the observed reduction in affective priming. These results cast some doubt on the automaticity of processing emotional facial expressions.


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号