Similar literature
20 similar documents retrieved
1.
Attending versus ignoring a stimulus can later determine how it will be affectively evaluated. Here, we asked whether attentional states could also modulate subsequent sensitivity to facial expressions of emotion. In a dual-task procedure, participants first rapidly searched for a gender-defined face among two briefly displayed neutral faces. Then a test face with the previously attended or ignored face’s identity was presented, and participants judged whether it was emotionally expressive (happy, angry, or fearful) or neutral. Intensity of expression in the test face was varied so that an expression detection threshold could be determined. When fearful or angry expressions were judged, expression sensitivity was worse for faces bearing the same identity as a previously ignored versus attended face. When happy expressions were judged, sensitivity was unaffected by prior attention. These data support the notion that the motivational value of stimuli may be reduced by processes associated with selective ignoring.

2.
Is facial expression recognition marked by specific event-related potential (ERP) effects? Are conscious and unconscious elaborations of emotional facial stimuli qualitatively different processes? In Experiment 1, ERPs elicited by supraliminal stimuli were recorded when 21 participants viewed emotional facial expressions of four emotions and a neutral stimulus. Two ERP components (N2 and P3) were analyzed for their peak amplitude and latency measures. First, emotional face-specificity was observed for the negative deflection N2, whereas P3 was not affected by the content of the stimulus (emotional or neutral). A more posterior distribution of ERPs was found for N2. Moreover, a lateralization effect was revealed for negative (right lateralization) and positive (left lateralization) facial expressions. In Experiment 2 (20 participants), 1-ms subliminal stimulation was carried out. Unaware information processing was revealed to be quite similar to aware information processing for peak amplitude but not for latency. In fact, unconscious stimulation produced a more delayed peak variation than conscious stimulation.
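The peak amplitude and latency measures analyzed in this entry are typically extracted by scanning the averaged waveform within a component-specific time window. The following Python sketch is purely illustrative and not code from the study; the 2-ms sampling step, the 200-350 ms (N2) and 300-500 ms (P3) windows, and the random stand-in waveform are all assumptions.

    import numpy as np

    def peak_measure(erp, times, window, polarity):
        # Return (peak amplitude, peak latency in ms) of a component within a window.
        # polarity: 'neg' for negative-going components (e.g., N2), 'pos' for P3.
        mask = (times >= window[0]) & (times <= window[1])
        segment, seg_times = erp[mask], times[mask]
        idx = segment.argmin() if polarity == 'neg' else segment.argmax()
        return segment[idx], seg_times[idx]

    # Hypothetical averaged waveform: 2-ms samples from -100 to 698 ms
    times = np.arange(-100, 700, 2.0)
    erp = np.random.randn(times.size)  # stand-in for a real grand average (microvolts)

    n2_amp, n2_lat = peak_measure(erp, times, (200, 350), 'neg')
    p3_amp, p3_lat = peak_measure(erp, times, (300, 500), 'pos')
    print(f"N2: {n2_amp:.2f} uV at {n2_lat:.0f} ms; P3: {p3_amp:.2f} uV at {p3_lat:.0f} ms")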

3.
The aim of this study was to investigate the causes of the own-race advantage in facial expression perception. In Experiment 1, we investigated Western Caucasian and Chinese participants’ perception and categorization of facial expressions of six basic emotions that included two pairs of confusable expressions (fear and surprise; anger and disgust). People were slightly better at identifying facial expressions posed by own-race members (mainly in anger and disgust). In Experiment 2, we asked whether the own-race advantage was due to differences in the holistic processing of facial expressions. Participants viewed composite faces in which the upper part of one expression was combined with the lower part of a different expression. The upper and lower parts of the composite faces were either aligned or misaligned. Both Chinese and Caucasian participants were better at identifying the facial expressions from the misaligned images, showing interference on recognizing the parts of the expressions created by holistic perception of the aligned composite images. However, this interference from holistic processing was equivalent across expressions of own-race and other-race faces in both groups of participants. Whilst the own-race advantage in recognizing facial expressions does seem to reflect the confusability of certain emotions, it cannot be explained by differences in holistic processing.

4.
Using a visual search paradigm, we investigated how a top-down goal modified attentional bias for threatening facial expressions. In two experiments, participants searched for a facial expression either based on stimulus characteristics or a top-down goal. In Experiment 1, participants searched for a discrepant facial expression in a homogenous crowd of faces. Consistent with previous research, we obtained a shallower response time (RT) slope when the target face was angry than when it was happy. In Experiment 2, participants searched for a specific type of facial expression (allowing a top-down goal). When the display included a target, we found a shallower RT slope for the angry than for the happy face search. However, when an angry or happy face was present in the display in opposition to the task goal, we obtained equivalent RT slopes, suggesting that the mere presence of an angry face in opposition to the task goal did not support the well-known angry face superiority effect. Furthermore, RT distribution analyses supported the special status of an angry face only when it was combined with the top-down goal. On the basis of these results, we suggest that a threatening facial expression may guide attention as a high-priority stimulus in the absence of a specific goal; however, in the presence of a specific goal, the efficiency of facial expression search is dependent on the combined influence of a top-down goal and the stimulus characteristics.
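The RT slope referred to in this entry is the increase in response time per additional item in the display, usually estimated with a linear fit of mean correct RT against set size; a shallower slope indicates more efficient search. Below is a minimal, illustrative Python sketch; the set sizes and mean RTs are invented, not data from the study.

    import numpy as np

    def search_slope(set_sizes, mean_rts):
        # Least-squares slope (ms per item) and intercept (ms) of RT vs. set size.
        slope, intercept = np.polyfit(set_sizes, mean_rts, 1)
        return slope, intercept

    set_sizes = np.array([4, 8, 12])
    angry_target_rts = np.array([620.0, 660.0, 700.0])  # hypothetical means, ms
    happy_target_rts = np.array([640.0, 760.0, 880.0])

    angry_slope, _ = search_slope(set_sizes, angry_target_rts)
    happy_slope, _ = search_slope(set_sizes, happy_target_rts)
    print(f"angry-target slope: {angry_slope:.1f} ms/item")  # shallower search function
    print(f"happy-target slope: {happy_slope:.1f} ms/item")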

5.
We investigated whether lines and shapes that present face-like features would be associated with emotions. In Experiment 1, participants associated concave, convex, or straight lines with the words happy or sad. Participants found it easiest to associate the concave line with happy and the convex line with sad. In Experiment 2, participants rated (valence, pleasantness, liking, and tension) and categorised (valence and emotion words) two convex and concave lines that were paired with six distinct pairs of eyes. The presence of eyes affected participants’ valence ratings and response latencies; more congruent eye–mouth matches produced more consistent ratings and faster reaction times. In Experiment 3, we examined whether dots that resembled eyes would be associated with emotional words. Participants found it easier to match certain sets of dots with specific emotions. These results suggest that facial gestures that are associated with specific emotions can be captured using relatively simple shapes and lines.

6.
1 Introduction. Correctly identifying other people's facial expressions of emotions is important to human social interaction in all societies. Many studies suggest that the identification of facial expressions in particular, and the perceptual processing of emotional information, is carried out mainly by the right hemisphere of the brain [1-7]. Damage to the right hemisphere generally produces more significant impairment in recognition of all facial expressions of emotion than damage to the left hemisp…

7.
The left and right hemispheres of the brain are differentially related to the processing of emotions. Although there is little doubt that the right hemisphere is relatively superior for processing negative emotions, controversy exists over the hemispheric role in the processing of positive emotions. Eighty right-handed normal male participants were examined for visual-field (left-right) differences in the perception of facial expressions of emotion. Facial composite (RR, LL) and hemifacial (R, L) sets depicting emotion expressions of happiness and sadness were prepared. Pairs of such photographs were presented bilaterally for 150 ms, and participants were asked to select the photographs that looked more expressive. A left visual-field superiority (a right-hemisphere function) was found for sad facial emotion. A hemispheric advantage in the perception of happy expression was not found.

8.
Visual-field bias in the judgment of facial expression of emotion (cited 2 times: 0 self-citations, 2 by others)

9.
The aim of the current study was to examine how emotional expressions displayed by the face and body influence the decision to approach or avoid another individual. In Experiment 1, we examined approachability judgments provided to faces and bodies presented in isolation that were displaying angry, happy, and neutral expressions. Results revealed that angry expressions were associated with the most negative approachability ratings, for both faces and bodies. The effect of happy expressions was shown to differ for faces and bodies, with happy faces judged more approachable than neutral faces, whereas neutral bodies were considered more approachable than happy bodies. In Experiment 2, we sought to examine how we integrate emotional expressions depicted in the face and body when judging the approachability of face-body composite images. Our results revealed that approachability judgments given to face-body composites were driven largely by the facial expression. In Experiment 3, we then aimed to determine how the categorization of body expression is affected by facial expressions. This experiment revealed that body expressions were less accurately recognized when the accompanying facial expression was incongruent than when neutral. These findings suggest that the meaning extracted from a body expression is critically dependent on the valence of the associated facial expression.

10.
This study investigated whether sensitivity to and evaluation of facial expressions varied with repeated exposure to non-prototypical facial expressions for a short presentation time. A morphed facial expression was presented for 500 ms repeatedly, and participants were required to indicate whether each facial expression was happy or angry. We manipulated the distribution of presentations of the morphed facial expressions for each facial stimulus. Some of the individuals depicted in the facial stimuli expressed anger frequently (i.e., anger-prone individuals), while the others expressed happiness frequently (i.e., happiness-prone individuals). After being exposed to the faces of anger-prone individuals, the participants became less sensitive to those individuals’ angry faces. Further, after being exposed to the faces of happiness-prone individuals, the participants became less sensitive to those individuals’ happy faces. We also found a relative increase in the social desirability of happiness-prone individuals after exposure to the facial stimuli.

11.

This paper describes a method to measure the sensitivity of an individual to different facial expressions. It shows that individual participants are more sensitive to happy than to fearful expressions and that the differences are statistically significant using the model-comparison approach. Sensitivity is measured by asking participants to discriminate between an emotional facial expression and a neutral expression of the same face. The expression was diluted to different degrees by combining it in different proportions with the neutral expression using morphing software. Sensitivity is defined as measurement of the proportion of neutral expression in a stimulus required for participants to discriminate the emotional expression on 75% of presentations. Individuals could reliably discriminate happy expressions diluted with a greater proportion of the neutral expression compared with that required for discrimination of fearful expressions. This tells us that individual participants are more sensitive to happy compared with fearful expressions. Sensitivity is equivalent when measured on two different testing sessions, and greater sensitivity to happy expressions is maintained with short stimulus durations and stimuli generated using different morphing software. Increased sensitivity to happy compared with fear expressions was affected at smaller image sizes for some participants. Application of the approach for use with clinical populations, as well as understanding the relative contribution of perceptual processing and affective processing in facial expression recognition, is discussed.
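The 75% discrimination threshold described in this entry can be estimated by measuring accuracy at several morph levels and locating where the psychometric function crosses 0.75. The sketch below only illustrates that idea and is not the authors' procedure; the morph levels and accuracies are invented, and a simple linear interpolation stands in for a full psychometric-function fit. In this toy example the happy threshold is lower, i.e., a larger proportion of neutral expression can be tolerated, mirroring the pattern of greater sensitivity to happy expressions reported above.

    import numpy as np

    # Hypothetical data: proportion of the emotional expression in each morph,
    # and the proportion of presentations on which it was told apart from neutral.
    emotion_proportion = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6])
    accuracy_happy = np.array([0.52, 0.60, 0.74, 0.86, 0.95, 0.99])
    accuracy_fear = np.array([0.50, 0.55, 0.63, 0.72, 0.84, 0.93])

    def threshold_75(accuracy, levels):
        # Morph level at which accuracy reaches 75% (assumes accuracy rises with level).
        return np.interp(0.75, accuracy, levels)

    for label, acc in (("happy", accuracy_happy), ("fearful", accuracy_fear)):
        thr = threshold_75(acc, emotion_proportion)
        print(f"{label}: 75% threshold at {thr:.2f} emotion "
              f"(tolerates {1 - thr:.2f} neutral in the morph)")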


12.
We systematically examined the impact of emotional stimuli on time perception in a temporal reproduction paradigm where participants reproduced the duration of a facial emotion stimulus using an oval-shaped stimulus or vice versa. Experiment 1 asked participants to reproduce the duration of an angry face (or the oval) presented for 2,000 ms. Experiment 2 included a range of emotional expressions (happy, sad, angry, and neutral faces as well as the oval stimulus) presented for different durations (500, 1,500, and 2,000 ms). We found that participants over-reproduced the durations of happy and sad faces using the oval stimulus. By contrast, there was a trend of under-reproduction when the duration of the oval stimulus was reproduced using the angry face. We suggest that increased attention to a facial emotion produces the relativity of time perception.
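Over- and under-reproduction in such paradigms is commonly summarized as the ratio of reproduced to target duration, with values above 1 indicating over-reproduction. A minimal illustrative sketch follows; the condition labels and trial durations are invented, not data from the study.

    import numpy as np

    target_ms = 2000.0  # presented duration

    # Hypothetical reproduced durations (ms) per condition
    reproduced = {
        "happy face reproduced with oval": np.array([2150, 2300, 2210, 2080]),
        "sad face reproduced with oval": np.array([2120, 2260, 2190, 2050]),
        "oval reproduced with angry face": np.array([1850, 1790, 1900, 1760]),
    }

    for condition, durations in reproduced.items():
        ratio = durations.mean() / target_ms
        tendency = "over" if ratio > 1 else "under"
        print(f"{condition}: mean ratio {ratio:.2f} ({tendency}-reproduction)")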

13.
By asking participants to judge the relationship between the emotional valence of simultaneously presented visual and auditory information, this study examined how audiovisual emotional information is integrated. In Experiment 1 the valence of the spoken words did not conflict with the valence of the prosody; in Experiment 2 the two conflicted. Both experiments consistently found that when the facial expression was positive, participants judged the relationship between the audiovisual emotional cues more accurately. Experiment 2 further found that when the facial expression was negative, participants judged the audiovisual relationship more quickly on the basis of semantic cues than on the basis of prosodic cues. These results suggest that when visual and auditory information are presented simultaneously, the visual information may be processed first and may influence the subsequent processing of the audiovisual relationship.

14.
To clarify the emotional bias of individuals with high trait anxiety by examining how they process emotional stimuli at the pre-attentive stage, this study used a deviant-standard reverse oddball paradigm to investigate the effect of trait anxiety on the pre-attentive processing of facial expressions. The results showed that in the low trait anxiety group, the early expression-related mismatch negativity (EMMN) elicited by sad faces was significantly larger than that elicited by happy faces, whereas in the high trait anxiety group the early EMMN did not differ between happy and sad faces. Moreover, the EMMN amplitude for happy faces was significantly larger in the high trait anxiety group than in the low trait anxiety group. These results indicate that personality traits are an important factor influencing the pre-attentive processing of facial expressions. Unlike typical participants, individuals with high trait anxiety show a similar pre-attentive processing pattern for happy and sad faces and may have difficulty effectively distinguishing between them.

15.
The aim of this study was to examine the moderating role of emotional awareness in the relationship between emotion regulation strategies and emotional information processing. A total of 120 female students regulated emotions while watching an unpleasant film. Before and after emotion induction, participants completed a set of tasks that required matching facial expressions. The results demonstrated that participants who were high in emotional awareness showed a significantly smaller increase in error responses (i.e., incorrect matches) than participants who were low in emotional awareness. However, this effect was observed only in the suppression (i.e., inhibition of an emotionally expressive behavior), masking (i.e., concealing the experienced emotion behind a happy expression), and control (i.e., no regulation) conditions. Among reappraisers, who were instructed to adopt a neutral attitude toward the film, there was no significant increase in error responses, regardless of whether they were high or low in emotional awareness. This study shows that the potentially damaging impact of negative emotions on the processing of emotional information can be prevented by high emotional awareness or by the implementation of reappraisal as an emotion regulation strategy.

16.
Three experiments progressively examined how an opponent's happy, neutral, and angry facial expressions influence individuals' cooperative behavior in the prisoner's dilemma game, along with the mediating and moderating roles of related variables. Experiment 1 showed that an opponent's happy expression elicited a higher level of cooperation than an angry expression, and that both happy and neutral expressions produced higher expectations of cooperation than angry expressions; the expectation of cooperation mediated the relationship between facial expression and cooperative behavior. Experiment 2 used instructions to manipulate participants' intuitive versus deliberative decision-making mode and found that the results of Experiment 1 appeared only under the intuitive condition, not under the deliberative condition; overall, participants cooperated more in the intuitive mode than in the deliberative mode. Experiment 3 used a stricter time-pressure paradigm to manipulate the decision-making mode and largely replicated the results of Experiment 2, except that happy expressions also elicited more cooperation than neutral expressions. Based on these results, a moderated mediation model was constructed to capture the complex relationships among others' facial expressions, expectations of cooperation, cooperative behavior, and individuals' decision-making mode.

17.
The present study was designed to examine the operation of depression-specific biases in the identification or labeling of facial expressions of emotion. Participants diagnosed with major depression, participants diagnosed with social phobia, and control participants were presented with faces that expressed increasing degrees of emotional intensity, slowly changing from a neutral to a full-intensity happy, sad, or angry expression. The authors assessed individual differences in the intensity of facial expression of emotion that was required for the participants to accurately identify the emotion being expressed. The depressed participants required significantly greater intensity of emotion than did the social phobic and the control participants to correctly identify happy expressions and less intensity to identify sad than angry expressions. In contrast, social phobic participants needed less intensity to correctly identify the angry expressions than did the depressed and control participants and less intensity to identify angry than sad expressions. Implications of these results for interpersonal functioning in depression and social phobia are discussed.

18.
The perception of tactile stimuli on the face is modulated if subjects concurrently observe a face being touched; this effect is termed "visual remapping of touch" or the VRT effect. Given the high social value of this mechanism, we investigated whether it might be modulated by specific key information processed in face-to-face interactions: facial emotional expression. In two separate experiments, participants received tactile stimuli, near the perceptual threshold, either on their right, left, or both cheeks. Concurrently, they watched several blocks of movies depicting a face with a neutral, happy, or fearful expression that was touched or just approached by human fingers (Experiment 1). Participants were asked to distinguish between unilateral and bilateral felt tactile stimulation. Tactile perception was enhanced when viewing touch toward a fearful face compared with viewing touch toward the other two expressions. In order to test whether this result can be generalized to other negative emotions or whether it is a fear-specific effect, we ran a second experiment, where participants watched movies of faces, touched or approached by fingers, with either a fearful or an angry expression (Experiment 2). In line with the first experiment, tactile perception was enhanced when subjects viewed touch toward a fearful face and not toward an angry face. Results of the present experiments are interpreted in light of different mechanisms underlying the recognition of different emotions, with a specific involvement of the somatosensory system when viewing a fearful expression and a resulting fear-specific modulation of the VRT effect.

19.
We examined dysfunctional memory processing of facial expressions in relation to alexithymia. Individuals with high and low alexithymia, as measured by the Toronto Alexithymia Scale (TAS-20), participated in a visual search task (Experiment 1A) and a change-detection task (Experiments 1B and 2), to assess differences in their visual short-term memory (VSTM). In the visual search task, the participants were asked to judge whether all facial expressions (angry and happy faces) in the search display were the same or different. In the change-detection task, they had to decide whether all facial expressions changed between two successive displays. We found individual differences only in the change-detection task. Individuals with high alexithymia showed lower sensitivity for the happy faces compared to the angry faces, while individuals with low alexithymia showed sufficient recognition for both facial expressions. Experiment 2 examined whether individual differences were observed during the early storage or later retrieval stage of the VSTM process using a single-probe paradigm. We found no effect of the single probe, indicating that individual differences occurred at the storage stage. The present results provide new evidence that individuals with high alexithymia show specific impairment in VSTM processes (especially the storage stage) related to happy but not to angry faces.
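Sensitivity in a change-detection task of this kind is often quantified with a signal-detection index such as d', computed from hit and false-alarm rates; the entry does not specify which index was used, so this is only one common possibility, shown here as an illustrative sketch with invented trial counts.

    from scipy.stats import norm

    def d_prime(hits, misses, false_alarms, correct_rejections):
        # Signal-detection sensitivity d' from trial counts, with a log-linear
        # correction so hit and false-alarm rates never reach exactly 0 or 1.
        hit_rate = (hits + 0.5) / (hits + misses + 1.0)
        fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
        return norm.ppf(hit_rate) - norm.ppf(fa_rate)

    # Hypothetical counts for one high-alexithymia participant
    print("happy-face blocks:", round(d_prime(28, 12, 10, 30), 2))  # lower sensitivity
    print("angry-face blocks:", round(d_prime(34, 6, 6, 34), 2))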

20.
Are emotions perceived automatically? Two psychological refractory period experiments were conducted to ascertain whether emotion perception requires central attentional resources. Task 1 required an auditory discrimination (tone vs. noise), whereas Task 2 required a discrimination between happy and angry faces. The difficulty of Task 2 was manipulated by varying the degree of emotional expression. The stimulus onset asynchrony (SOA) between Task 1 and Task 2 was also varied. Experiment 1 revealed additive effects of SOA and Task 2 emotion-perception difficulty. Experiment 2 replicated the additive relationship with a stronger manipulation of emotion-perception difficulty. According to locus-of-slack logic, our participants did not process emotional expressions while central resources were devoted to Task 1. We conclude that emotion perception is not fully automatic.
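The locus-of-slack logic invoked here turns on whether the Task 2 difficulty effect is additive with SOA or underadditive (smaller at short SOAs, where extra perceptual processing could be absorbed into the waiting period before central processing begins). A minimal sketch of that check, using invented cell means rather than the experiments' data:

    # Hypothetical mean Task 2 RTs (ms) in a 2 (SOA) x 2 (emotion-perception difficulty) design
    rt = {
        ("short_soa", "easy"): 820.0, ("short_soa", "hard"): 900.0,
        ("long_soa", "easy"): 560.0, ("long_soa", "hard"): 640.0,
    }

    effect_short = rt[("short_soa", "hard")] - rt[("short_soa", "easy")]
    effect_long = rt[("long_soa", "hard")] - rt[("long_soa", "easy")]

    print(f"difficulty effect at short SOA: {effect_short:.0f} ms")
    print(f"difficulty effect at long SOA:  {effect_long:.0f} ms")
    # Equal effects (underadditivity ~0 ms) indicate additivity: the difficulty
    # manipulation is not absorbed into slack, so emotion perception appears to
    # wait for central resources rather than proceeding automatically.
    print(f"underadditivity: {effect_long - effect_short:.0f} ms")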
