Similar Articles
1.
Using a categorical-perception emotion-recognition paradigm, this study examined perceptual bias and perceptual sensitivity toward ambiguous happy-angry and happy-sad facial expressions in children high and low in shyness. Results showed that (1) relative to low-shyness children, high-shyness children tended to perceive ambiguous happy-angry faces as angry and ambiguous happy-sad faces as sad; and (2) the two groups did not differ significantly in the slope at the category boundary for either the happy-angry or the happy-sad continuum. These findings suggest that high-shyness children show a hostile attribution bias and a stronger empathic response to sadness, but no heightened sensitivity to the categorical shift in happy-angry and happy-sad expressions.
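The two indices reported here are conventionally estimated by fitting a logistic psychometric function to each participant's identification responses: the curve's midpoint (the category boundary, or point of subjective equality) indexes perceptual bias, and its slope at that point indexes perceptual sensitivity. The sketch below illustrates that standard computation; the response data and parameter values are invented for illustration, not taken from the study.

```python
# Illustrative sketch: estimating perceptual bias (category boundary /
# PSE) and perceptual sensitivity (slope at the boundary) from
# identification data on a happy-angry morph continuum. All data and
# starting values are made up.
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, x0, k):
    """Psychometric function: P(respond 'angry') at morph level x.
    x0 is the category boundary (PSE); k is the slope (sensitivity)."""
    return 1.0 / (1.0 + np.exp(-k * (x - x0)))

# Morph levels (0 = 100% happy, 1 = 100% angry) and illustrative
# proportions of "angry" responses for one participant.
levels = np.linspace(0.0, 1.0, 11)
p_angry = np.array([0.02, 0.03, 0.05, 0.10, 0.30, 0.55,
                    0.80, 0.92, 0.96, 0.98, 0.99])

(x0, k), _ = curve_fit(logistic, levels, p_angry, p0=[0.5, 10.0])

# A boundary below 0.5 means ambiguous faces are more often judged
# angry (the bias reported for high-shyness children); a shallower
# slope k would indicate lower perceptual sensitivity.
print(f"category boundary (PSE): {x0:.3f}, slope: {k:.2f}")
```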

2.
Human faces are among the most important visual stimuli that we encounter at all ages. This importance partly stems from the face as a conveyer of information on the emotional state of other individuals. Previous research has demonstrated specific scanning patterns in response to threat-related compared to non-threat-related emotional expressions. This study investigated how visual scanning patterns toward faces which display different emotional expressions develop during infancy. The visual scanning patterns of 4-month-old and 7-month-old infants and adults when looking at threat-related (i.e., angry and fearful) versus non-threat-related (i.e., happy, sad, and neutral) emotional faces were examined. We found that infants as well as adults displayed an avoidant looking pattern in response to threat-related emotional expressions, with reduced dwell times and relatively fewer fixations on the inner features of the face. In addition, adults showed a pattern of eye contact avoidance when looking at threat-related emotional expressions that was not yet present in infants. Thus, whereas a general avoidant reaction to threat-related facial expressions appears to be present from very early in life, the avoidance of eye contact might be a learned response toward others' anger and fear that emerges later during development.
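Dwell times and fixation counts of the kind reported here are typically derived by assigning individual fixations to areas of interest (AOIs). A minimal sketch of that bookkeeping, with invented field names, coordinates, and sample data, and a rectangular AOI standing in for the inner features of the face:

```python
# Minimal sketch: aggregating dwell time and fixation count inside a
# rectangular area of interest (AOI). Field names, coordinates, and the
# sample fixations are invented for illustration.
from dataclasses import dataclass

@dataclass
class Fixation:
    x: float         # gaze position in pixels
    y: float
    duration: float  # fixation duration in ms

def aoi_stats(fixations, left, top, right, bottom):
    """Return (fixation count, total dwell time) within the AOI."""
    inside = [f for f in fixations
              if left <= f.x <= right and top <= f.y <= bottom]
    return len(inside), sum(f.duration for f in inside)

# AOI roughly covering the inner features (eyes, nose, mouth) of a
# centrally presented face; values are placeholders.
fixations = [Fixation(512, 360, 240.0), Fixation(300, 100, 180.0)]
count, dwell = aoi_stats(fixations, left=400, top=250, right=620, bottom=500)
print(count, dwell)  # -> 1 240.0
```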

3.
Schizophrenia patients have deficits in cognitive control as well as in a number of emotional domains. The antisaccade task is a measure of cognitive control that requires the inhibition of a reflex-like eye movement to a peripheral stimulus. Antisaccade performance has been shown to be modulated by the emotional content of the peripheral stimuli, with emotional stimuli leading to higher error rates than neutral stimuli, reflecting an implicit emotion processing effect. The aim of the present study was to investigate the impact of threat-related emotional facial stimuli on antisaccade performance in schizophrenia patients, first-degree relatives of schizophrenia patients, and healthy controls. Fifteen patients, 22 relatives, and 26 controls, matched for gender, age, and verbal intelligence, carried out an antisaccade task with pictures of faces displaying disgusted, fearful, and neutral expressions as peripheral stimuli. We observed higher antisaccade error rates in schizophrenia patients compared to first-degree relatives and controls; relatives and controls did not differ significantly from each other. Antisaccade error rate was influenced by the emotional nature of the stimuli: participants had higher antisaccade error rates in response to fearful faces compared to neutral and disgusted faces. As this emotional influence on cognitive control did not differ between groups, we conclude that implicit processing of emotional faces is intact in patients with schizophrenia and those at risk for the illness.

4.
Four experiments investigated priming of emotion recognition using a range of emotional stimuli, including facial expressions, words, pictures, and nonverbal sounds. In each experiment, a prime-target paradigm was used with related, neutral, and unrelated pairs. In Experiment 1, facial expression primes preceded word targets in an emotion classification task. Related primes facilitated emotional word targets, with no inhibition from unrelated primes. Experiment 2 reversed these primes and targets and found the same pattern of results, demonstrating bidirectional priming between facial expressions and words; it also found priming of facial expression targets by picture primes. Experiment 3 demonstrated that priming occurs not just between pairs of stimuli that co-occur frequently in the environment (for example, nonverbal sounds and facial expressions), but also between stimuli that co-occur less frequently and are linked mainly by their emotional category (for example, nonverbal sounds and printed words). This shows the importance of the prime and target sharing a common emotional category, rather than of their previous co-occurrence. Experiment 4 extended the findings by showing that there are category-based effects as well as valence effects in emotional priming, supporting a categorical view of emotion recognition.

5.
Culture and the categorization of emotions.

6.
This study investigated whether observers' facial reactions to the emotional facial expressions of others represent an affective or a cognitive response to these emotional expressions. Three hypotheses were contrasted: (1) facial reactions to emotional facial expressions are due to mimicry as part of an affective empathic reaction; (2) facial reactions to emotional facial expressions are a reflection of shared affect due to emotion induction; and (3) facial reactions to emotional facial expressions are determined by cognitive load depending on task difficulty. Two experiments were conducted varying type of task, presentation of stimuli, and task difficulty. The results show that depending on the nature of the rating task, facial reactions to facial expressions may be either affective or cognitive. Specifically, evidence for facial mimicry was only found when individuals made judgements regarding the valence of an emotional facial expression. Other types of judgements regarding facial expressions did not seem to elicit mimicry but may lead to facial responses related to cognitive load.

7.
Memories of objects are biased toward what is typical of the category to which they belong. Prior research on memory for emotional facial expressions has demonstrated a bias towards an emotional expression prototype (e.g., slightly happy faces are remembered as happier). We investigate an alternate source of bias in memory for emotional expressions: the central tendency bias, which skews reconstruction of a memory trace towards the center of the distribution for a particular attribute. This bias has been attributed to a Bayesian combination of an imprecise memory for a particular object with prior information about its category. Until now, studies examining the central tendency bias have focused on simple stimuli; we extend this work to socially relevant, complex, emotional facial expressions. We morphed facial expressions on a continuum from sad to happy. Different ranges of emotion were used in four experiments in which participants viewed individual expressions and, after a variable delay, reproduced each face by adjusting a morph to match it. Estimates were biased toward the center of the presented stimulus range, and the bias increased at longer memory delays, consistent with the Bayesian prediction that as trace memory loses precision, category knowledge is given more weight. The central tendency effect persisted within and across emotion categories (sad, neutral, and happy). This article expands the scope of work on inductive category effects to memory for complex, emotional stimuli.
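The Bayesian combination described above has a simple closed form under Gaussian assumptions: the reconstructed value is a precision-weighted average of the noisy memory trace and the category prior, and as trace precision decays with delay, the prior's weight grows. The sketch below is a minimal illustration under those assumptions, not the authors' fitted model; all parameter values are invented.

```python
# Minimal sketch of the Bayesian central-tendency account (illustrative
# only; parameter values are invented). The reconstruction is a
# precision-weighted average of the noisy memory trace and a category
# prior centered on the middle of the stimulus range.

def reconstruct(stimulus, prior_mean=0.5, prior_sd=0.25,
                trace_sd=0.05, delay=1.0, noise_growth=0.1):
    """Posterior mean for one remembered morph value in [0, 1]."""
    sd_t = trace_sd + noise_growth * delay   # trace noise grows with delay
    w_trace = (1 / sd_t**2) / (1 / sd_t**2 + 1 / prior_sd**2)
    return w_trace * stimulus + (1 - w_trace) * prior_mean

# A sad-end face (0.2) drifts toward the center (0.5) as the delay
# grows, reproducing the larger bias at longer memory delays.
for delay in (0.5, 1.0, 3.0):
    print(delay, round(reconstruct(0.2, delay=delay), 3))
```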

8.
In primates, dominance/submission relationships are generally automatically and nonaggressively established in face-to-face confrontations. Researchers have argued that this process involves an explicit psychological stress-manipulation mechanism: Striding with a threatening expression, while keeping direct eye contact, outstresses rivals so that they submissively avert their gaze. In contrast, researchers have proposed a reflexive and implicit modulation of face-to-face confrontation in humans, on the basis of evidence that dominant and submissive individuals exhibit vigilant and avoidant responses, respectively, to facial anger in masked emotional Stroop tasks. However, these tasks do not provide an ecologically valid index of gaze behavior. Therefore, we directly measured gaze responses to masked angry, happy, and neutral facial expressions with a saccade-latency paradigm and found that increased dominance traits predict a more prolonged gaze to (or reluctance to avert gaze from) masked anger. Furthermore, greater non-dominance-related reward sensitivity predicts more persistent gaze to masked happiness. These results strongly suggest that implicit and reflexive mechanisms underlie dominant and submissive gaze behavior in face-to-face confrontations.

9.
Recognition of facial expressions has traditionally been investigated by presenting facial expressions without any context information. However, we rarely encounter an isolated facial expression; usually, we perceive a person's facial reaction as part of the surrounding context. In the present study, we addressed the question of whether emotional scenes influence the explicit recognition of facial expressions. In three experiments, participants were required to categorize facial expressions (disgust, fear, happiness) that were shown against backgrounds of natural scenes with either a congruent or an incongruent emotional significance. A significant interaction was found between facial expressions and the emotional content of the scenes, showing a response advantage for facial expressions accompanied by congruent scenes. This advantage was robust against increasing task load. Taken together, the results show that the surrounding scene is an important factor in recognizing facial expressions.

10.
Although expressions of facial emotion hold a special status in attention relative to other complex objects, whether they summon our attention automatically and against our intentions remains a debated issue. Studies supporting the strong view that attentional capture by facial expressions of emotion is entirely automatic reported that a unique (singleton) emotional face distractor interfered with search for a target that was also unique on a different dimension. Participants could therefore search for the odd-one-out face to locate the target, and attentional capture by irrelevant emotional faces might be contingent on the adoption of an implicit set for singletons. Here, confirming this hypothesis, an irrelevant emotional face captured attention when the target was the unique face with a discrepant orientation, both when this orientation was unpredictable and when it remained constant. By contrast, no such capture was observed when the target could not be found by monitoring displays for a discrepant face and participants had to search for a face with a specific orientation. Our findings show that attentional capture by emotional faces is not purely stimulus driven, and thereby resolve the apparent inconsistency that prevails in the literature on the automaticity of attentional capture by emotional faces.

11.
胡治国, 刘宏艳. 《心理科学》 (Psychological Science), 2015, (5): 1087-1094
Accurately recognizing facial expressions is important for successful social interaction, and facial expression recognition is influenced by emotional context. This review first describes the facilitating effects of emotional context on facial expression recognition, chiefly the emotional congruency effect within the visual modality and the cross-modal emotional integration effect; it then describes the interfering effects of emotional context, chiefly the emotional conflict effect and the semantic interference effect; next, it reviews the influence of emotional context on the recognition of neutral and ambiguous faces, chiefly the context-driven emotion induction effect and the subliminal affective priming effect; finally, it summarizes and evaluates existing research and offers suggestions for future work.

12.
Volitional attentional control has been found to rely on prefrontal neuronal circuits. According to the attentional control theory of anxiety, impairment in the volitional control of attention is a prominent feature in anxiety disorders. The present study investigated this assumption in socially anxious individuals using an emotional saccade task with facial expressions (happy, angry, fearful, sad, neutral). The gaze behavior of participants was recorded during the emotional saccade task, in which participants performed either pro- or antisaccades in response to peripherally presented facial expressions. The results show that socially anxious persons have difficulty inhibiting reflexive shifts of attention to facial expressions: they made more erroneous prosaccades to all facial expressions when an antisaccade was required. These findings indicate impaired attentional control in social anxiety. Overall, the present study shows a deficit of socially anxious individuals in attentional control, for example in inhibiting reflexive orienting to neutral as well as to emotional facial expressions. This deficit may be due to a dysfunction of the prefrontal areas involved in attentional control.

13.
The present research explored the effect of social empathy on processing emotional facial expressions. Previous evidence suggested a close relationship between emotional empathy and both the ability to detect facial emotions and the attentional mechanisms involved. A multi-measure approach was adopted: we investigated the association between trait empathy (Balanced Emotional Empathy Scale) and individuals' performance (response times; RTs), attentional mechanisms (eye movements; number and duration of fixations), correlates of cortical activation (the event-related potential (ERP) N200 component), and facial responsiveness (zygomatic and corrugator activity). Trait empathy was found to affect face detection performance (reduced RTs), attentional processes (more scanning eye movements in specific areas of interest), the ERP salience effect (increased N200 amplitude), and electromyographic activity (more facial responses). A second important result was the demonstration of strong, direct correlations among these measures. We suggest that empathy may function as a social facilitator of the processes underlying the detection of facial emotion, and a general “facial response effect” is proposed to explain these results. We assume that empathy influences both cognitive processing and facial responsiveness, such that empathic individuals are more skilful at processing facial emotion.

14.
Two experiments investigated categorical perception (CP) effects for affective facial expressions and linguistic facial expressions from American Sign Language (ASL) for Deaf native signers and hearing non-signers. Facial expressions were presented in isolation (Experiment 1) or in an ASL verb context (Experiment 2). Participants performed ABX discrimination and identification tasks on morphed affective and linguistic facial expression continua. The continua were created by morphing end-point photo exemplars into 11 images, changing linearly from one expression to another in equal steps. For both affective and linguistic expressions, hearing non-signers exhibited better discrimination across category boundaries than within categories for both experiments, thus replicating previous results with affective expressions and demonstrating CP effects for non-canonical facial expressions. Deaf signers, however, showed significant CP effects only for linguistic facial expressions. Subsequent analyses indicated that order of presentation influenced signers’ response time performance for affective facial expressions: viewing linguistic facial expressions first slowed response time for affective facial expressions. We conclude that CP effects for affective facial expressions can be influenced by language experience.
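The equal-step continua described here amount to linear interpolation between two end-point images with weights 0.0, 0.1, ..., 1.0. The sketch below shows that weighting scheme as a simple pixel-level cross-fade; real face-morphing software also warps facial geometry between the end points, so this is only schematic, and the file names are placeholders.

```python
# Rough sketch of an 11-step, equal-interval morph continuum between
# two end-point expression photos (which must share the same
# dimensions). Real morphing also warps facial geometry; this pixel
# cross-fade is only schematic, and the file names are placeholders.
import numpy as np
from PIL import Image

a = np.asarray(Image.open("endpoint_a.png").convert("RGB"), dtype=float)
b = np.asarray(Image.open("endpoint_b.png").convert("RGB"), dtype=float)

for i in range(11):                  # 11 images in equal steps
    alpha = i / 10.0                 # blend weight: 0.0, 0.1, ..., 1.0
    frame = (1 - alpha) * a + alpha * b
    Image.fromarray(frame.astype(np.uint8)).save(f"morph_{i:02d}.png")
```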

15.
Recent studies have shown that cueing eye gaze can affect the processing of visual information, a phenomenon called the gaze-orienting effect (visual-GOE). Emerging evidence has shown that cueing eye gaze also affects the processing of auditory information (auditory-GOE), but it is unclear whether the auditory-GOE is modulated by emotion. We conducted three behavioural experiments to investigate whether cueing eye gaze influenced orientation judgements about a sound, and whether the effect was modulated by facial expressions. The study used four facial expressions (angry, fearful, happy, and neutral), manipulated how the expressions were displayed, and varied the order of the gaze cue and the emotional expression. Participants were required to judge the sound's orientation after the facial expression and gaze cue. The results showed that orientation judgements about the sound were influenced by gaze direction in all three experiments: judgements were faster when the face was oriented toward the target location (congruent trials) than when it was oriented away from it (incongruent trials). The modulation of emotion on the auditory-GOE was observed only when the gaze shift preceded the facial expression (Experiment 3), where the auditory-GOE was significantly greater for angry faces than for neutral faces. These findings indicate that the auditory-GOE is a widespread social phenomenon that is modulated by facial expression, and that a gaze shift occurring before the emotional expression is the key condition for emotional modulation in an auditory gaze-orienting task. Our findings suggest that the integration of facial expressions and eye gaze is context-dependent.

16.
李静华, 郑勇. 《心理科学》 (Psychological Science), 2014, 37(1): 40-47
Using two tasks, a behavioural dot-probe experiment and an ERP emotional Stroop experiment, this study examined attentional bias and its neural mechanisms in individuals differing in implicit and explicit aggression. The results showed that high explicit aggressors exhibited an attentional bias toward angry faces; their N100 amplitude to angry faces was significantly lower than to neutral faces, indicating weaker attention allocation and regulation; compared with low explicit aggressors, high explicit aggressors showed smaller P300 amplitudes, suggesting deficits in attention-related cognitive processing, and smaller N400 amplitudes, suggesting more fluent semantic encoding of angry faces; the explicit and implicit aggression groups also differed at the FCz and Cz electrode sites. These findings provide behavioural and cognitive-neuroscience evidence that implicit and explicit aggression are independent constructs.

17.
Research demonstrates that women experience disgust more readily and with more intensity than men. The experience of disgust is associated with increased attention to disgust-related stimuli, but no prior study has examined sex differences in attention to disgust facial expressions. We hypothesised that women, compared to men, would demonstrate increased attention to disgust facial expressions. Participants (n = 172) completed an eye tracking task to measure visual attention to emotional facial expressions. Results indicated that women spent more time attending to disgust facial expressions compared to men. Unexpectedly, we found that men spent significantly more time attending to neutral faces compared to women. The findings indicate that women’s increased experience of emotional disgust also extends to attention to disgust facial stimuli. These findings may help to explain sex differences in the experience of disgust and in diagnoses of anxiety disorders in which disgust plays an important role.

18.
Theoretical accounts suggest an increased and automatic neural processing of emotional, especially threat-related, facial expressions and emotional prosody. In line with this assumption, several functional imaging studies showed activation to threat-related faces and voices in subcortical and cortical brain areas during attentional distraction or unconscious stimulus processing. Furthermore, electrophysiological studies provided evidence for automatic early brain responses to emotional facial expressions and emotional prosody. However, there is increasing evidence that available cognitive resources modulate brain responses to emotional signals from faces and voices, even though conflicting findings may occur depending on contextual factors, specific emotions, sensory modality, and neuroscientific methods used. The current review summarizes these findings and suggests that further studies should combine information from different sensory modalities and neuroscientific methods such as functional neuroimaging and electrophysiology. Furthermore, it is concluded that the variable saliency and relevance of emotional social signals on the one hand and available cognitive resources on the other hand interact in a dynamic manner, making absolute boundaries of the automatic processing of emotional information from faces and voices unlikely.

19.
Adults perceive emotional facial expressions categorically. In this study, we explored categorical perception in 3.5-year-olds by creating a morphed continuum of emotional faces and tested preschoolers’ discrimination and identification of them. In the discrimination task, participants indicated whether two examples from the continuum “felt the same” or “felt different.” In the identification task, images were presented individually and participants were asked to label the emotion displayed on the face (e.g., “Does she look happy or sad?”). Results suggest that 3.5-year-olds have the same category boundary as adults. They were more likely to report that the image pairs felt “different” at the image pair that crossed the category boundary. These results suggest that 3.5-year-olds perceive happy and sad emotional facial expressions categorically as adults do. Categorizing emotional expressions is advantageous for children if it allows them to use social information faster and more efficiently.
