Similar Articles
20 similar articles found (search time: 31 ms)
1.
This study used a facial emotion detection task, with participants screened for high and low trait anxiety via the State-Trait Anxiety Inventory, to examine how scene context influences the processing of facial expressions of different emotional valences and intensities, and to explore the role trait anxiety plays in this process. The results showed: (1) The effect of scene context on emotion detection differed across valences. For happy faces, at the 100%, 80%, and 20% intensity levels, detection accuracy was significantly higher when the scene was congruent with the facial emotion than when it was incongruent; for fearful faces, higher detection accuracy under congruent than incongruent conditions was found at the 80%, 60%, 40%, and 20% intensity levels. (2) For the high trait-anxiety group, detection accuracy did not differ significantly between congruent and incongruent conditions, i.e., no significant scene effect emerged; the low trait-anxiety group showed a significant difference, i.e., a significant scene effect. These results indicate that: (1) for facial expressions of low emotional intensity, detection of both happy and fearful faces is more susceptible to scene context; (2) scene context affects the detection of medium-intensity fearful faces more readily than that of medium-intensity happy faces; (3) trait anxiety moderates the influence of scene context on facial emotion detection, with high trait-anxiety individuals being less affected by scene information during emotion recognition.

2.
This study assessed the speed of recognition of facial emotional expressions (happy and angry) as a function of violent video game play. Color photos of calm facial expressions were morphed to either an angry or a happy facial expression. Participants were asked to make a speeded identification of the emotion (happiness or anger) during the morph. Typically, happy faces are identified faster than angry faces (the happy-face advantage). Results indicated that playing a violent video game led to a reduction in the happy-face advantage. Implications of these findings are discussed with respect to current models of aggressive behavior.
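The morphing manipulation described above can be sketched as simple pixel-wise linear interpolation between a calm and an emotional photograph. This is an illustration only, under the assumption of pre-aligned, same-sized images; published morphing procedures typically also warp facial landmarks.

```python
import numpy as np

def morph(calm: np.ndarray, emotional: np.ndarray, alpha: float) -> np.ndarray:
    """Blend two aligned face images; alpha=0 -> calm, alpha=1 -> fully emotional."""
    assert calm.shape == emotional.shape
    return (1.0 - alpha) * calm + alpha * emotional

# Placeholder "images": a 5-step morph sequence from calm to angry
calm_face = np.zeros((64, 64))
angry_face = np.ones((64, 64))
sequence = [morph(calm_face, angry_face, a) for a in np.linspace(0.0, 1.0, 5)]
```

Presenting such a sequence frame by frame yields the gradual calm-to-emotional transition that participants responded to.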

3.
The present study investigates the perception of facial expressions of emotion, and explores the relation between the configural properties of expressions and their subjective attribution. Stimuli were a male and a female series of morphed facial expressions, interpolated between prototypes of seven expressions (happiness, sadness, fear, anger, surprise, disgust, and neutral) from Ekman and Friesen (1976). Topographical properties of the stimuli were quantified using the Facial Expression Measurement (FACEM) scheme. Perceived dissimilarities between the emotional expressions were elicited using a sorting procedure and processed with multidimensional scaling. Four dimensions were retained in the reconstructed facial-expression space, with positive and negative expressions opposed along D1, while the other three dimensions were interpreted as affective attributes distinguishing clusters of expressions categorized as "Surprise-Fear," "Anger," and "Disgust." Significant relationships were found between these affective attributes and objective facial measures of the stimuli. The findings support a componential explanatory scheme for expression processing, wherein each component of a facial stimulus conveys an affective value separable from its context, rather than a categorical-gestalt scheme. The findings further suggest that configural information is closely involved in the decoding of affective attributes of facial expressions. Configural measures are also suggested as a common ground for dimensional as well as categorical perception of emotional faces.
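The multidimensional-scaling step above can be sketched with classical (Torgerson) MDS applied to a dissimilarity matrix. The study's exact MDS variant and software are not specified here, so this NumPy-only version is an assumption based on the standard classical algorithm; the toy 7×7 matrix stands in for the elicited dissimilarities among the seven expressions.

```python
import numpy as np

def classical_mds(dissim: np.ndarray, n_components: int) -> np.ndarray:
    """Classical MDS: embed items so Euclidean distances approximate dissimilarities."""
    n = dissim.shape[0]
    d2 = dissim ** 2
    J = np.eye(n) - np.ones((n, n)) / n       # centering matrix
    B = -0.5 * J @ d2 @ J                     # double-centered Gram matrix
    eigvals, eigvecs = np.linalg.eigh(B)      # ascending eigenvalues
    order = np.argsort(eigvals)[::-1][:n_components]
    scale = np.sqrt(np.clip(eigvals[order], 0.0, None))
    return eigvecs[:, order] * scale          # one row of coordinates per item

# Toy symmetric dissimilarity matrix for 7 expressions (6 emotions + neutral)
rng = np.random.default_rng(0)
d = rng.random((7, 7))
dissim = (d + d.T) / 2
np.fill_diagonal(dissim, 0.0)
coords = classical_mds(dissim, 4)             # a 4-D expression space
```

Each row of `coords` is one expression's position in the reconstructed 4-D space, whose axes can then be interpreted against facial measures such as FACEM.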

4.
This study investigated how target sex, target age, and expressive ambiguity influence emotion perception. Undergraduate participants (N = 192) watched morphed video clips of eight child and eight adult facial expressions shifting from neutral to either sadness or anger. Participants were asked to stop the video clip when they first saw an emotion appear (perceptual sensitivity) and were asked to identify the emotion that they saw (accuracy). Results indicate that female participants identified sad expressions sooner in female targets than in male targets. Participants were also more accurate identifying angry facial expressions by male children than by female children. Findings are discussed in terms of the effects of ambiguity, gender, and age on the perception of emotional expressions.

5.
The ability of the human face to communicate emotional states via facial expressions is well known, and past research has established the importance and universality of emotional facial expressions. However, recent evidence has revealed that facial expressions of emotion are most accurately recognized when the perceiver and expresser are from the same cultural ingroup. The current research builds on this literature and extends this work. Specifically, we find that mere social categorization, using a minimal-group paradigm, can create an ingroup emotion-identification advantage even when the culture of the target and perceiver is held constant. Follow-up experiments show that this effect is supported by differential motivation to process ingroup versus outgroup faces and that this motivational disparity leads to more configural processing of ingroup faces than of outgroup faces. Overall, the results point to distinct processing modes for ingroup and outgroup faces, resulting in differential identification accuracy for facial expressions of emotion.

6.
Expression influences the recognition of familiar faces
Face recognition has been assumed to be independent of facial expression. We used familiar and unfamiliar faces that were morphed from a happy to an angry expression within a given identity. Participants performed speeded two-choice decisions according to whether or not a face was familiar. Consistent with earlier findings, reaction times for classifications of unfamiliar faces were independent of facial expressions. In contrast, expression clearly influenced the recognition of familiar faces, with fastest recognition for moderately happy expressions. This suggests that representations of familiar faces for recognition preserve some information about typical emotional expressions.

7.
Individuals with borderline personality disorder (BPD) have been hypothesized to exhibit significant problems associated with emotional sensitivity. The current study examined emotional sensitivity (i.e., low threshold for recognition of emotional stimuli) in BPD by comparing 20 individuals with BPD and 20 normal controls on their accuracy in identifying emotional expressions. Results demonstrated that, as facial expressions morphed from neutral to maximum intensity, participants with BPD correctly identified facial affect at an earlier stage than did healthy controls. Participants with BPD were more sensitive than healthy controls in identifying emotional expressions in general, regardless of valence. These findings could not be explained by participants with BPD responding faster with more errors. Overall, results appear to support the contention that heightened emotional sensitivity may be a core feature of BPD.

8.
Little is known regarding how attention to emotional stimuli is affected during simultaneously performed exercise. Attentional biases to emotional face stimuli were assessed in 34 college students (17 women) using the dot-probe task during counterbalanced conditions of moderate- (heart rate at 45% peak oxygen consumption) and high-intensity exercise (heart rate at 80% peak oxygen consumption) compared with seated rest. The dot-probe task consisted of 1 emotional face (pleasant or unpleasant) paired with a neutral face for 1,000 ms; 256 trials (128 trials for each valence) were presented during each condition. Each condition lasted approximately 10 min. Participants were instructed to perform each trial of the dot-probe task as quickly and accurately as possible during the exercise and rest conditions. During moderate-intensity exercise, participants exhibited significantly greater attentional bias scores to pleasant compared with unpleasant faces (p < .01), whereas attentional bias scores to emotional faces did not differ at rest or during high-intensity exercise (p > .05). In addition, the attentional bias to unpleasant faces was significantly reduced during moderate-intensity exercise compared with that during rest (p < .05). These results provide behavioral evidence that during exercise at a moderate intensity, there is a shift in attention allocation toward pleasant emotional stimuli and away from unpleasant emotional stimuli. Future work is needed to determine whether acute exercise may be an effective treatment approach to reduce negative bias or enhance positive bias in individuals diagnosed with mood or anxiety disorders, or whether attentional bias during exercise predicts adherence to exercise.
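The dot-probe bias scores above are conventionally computed as the mean reaction time when the probe replaces the neutral face minus the mean reaction time when it replaces the emotional face, so positive values indicate attention drawn toward the emotional face. The paper's exact formula is not given here, so this follows that standard convention.

```python
from statistics import mean

def attentional_bias(rt_probe_at_neutral, rt_probe_at_emotional):
    """Dot-probe bias score in ms; positive = attention toward the emotional face."""
    return mean(rt_probe_at_neutral) - mean(rt_probe_at_emotional)

# Hypothetical trials: faster responses when the probe replaces pleasant
# faces indicate a bias toward pleasant stimuli.
bias_pleasant = attentional_bias([520, 530, 510], [480, 470, 490])
```

In this toy example the score is positive, i.e., attention was allocated toward the pleasant faces, which is the pattern the study reports under moderate-intensity exercise.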

9.
Extracting meaning from faces to understand other people's mental states and intentions, and to quickly adjust our actions accordingly, is a vital aspect of our social interactions. However, not all emotionally relevant attributes of a person are directly observable from facial features or expressions. In this study, event-related brain potentials were used to investigate the effects of affective information about a person's biography that cannot be derived from the visual appearance of the face. Faces of well-known and initially unfamiliar persons with neutral expressions were associated with negative, positive, or neutral biographical information. For well-known faces, early event-related potential (ERP) modulations induced by emotional knowledge, their scalp topographies, and time course strongly resemble the effects frequently reported for emotional facial expressions, even though here access to stored semantic knowledge is required. These results demonstrate that visually opaque affective knowledge is extracted at high speed and modulates sensory processing in the visual cortex.

10.
Deception has been reported to be influenced by task-relevant emotional information from an external stimulus. However, it remains unclear how task-irrelevant emotional information would influence deception. In the present study, facial expressions of different valence and emotion intensity were presented to participants, who were asked to make either truthful or deceptive gender judgments according to the preceding cues. We observed an influence of facial expression intensity on individuals' cognitive cost of deceiving (the mean difference between individuals' truthful and deceptive response times): a larger cost was observed for high-intensity faces than for low-intensity faces. These results provide insights into how the automatic attraction of attention evoked by task-irrelevant emotional information in facial expressions influences individuals' cognitive cost of deceiving.

11.
The objectives of this study were to propose a method of presenting dynamic facial expressions to experimental subjects, in order to investigate human perception of an avatar's facial expressions at different levels of emotional intensity. The investigation concerned how perception varies according to the strength of the facial expression, as well as according to an avatar's gender. To accomplish these goals, we generated a male and a female virtual avatar with five levels of intensity of happiness and anger using a morphing technique. We then recruited 16 normal healthy subjects and measured each subject's emotional reaction by scoring affective arousal and valence after showing them the avatar's face. Through this study, we were able to investigate human perceptual characteristics evoked by male and female avatars' graduated facial expressions of happiness and anger. In addition, we were able to identify that a virtual avatar's facial expression could affect human emotion in different ways according to the avatar's gender and the intensity of its facial expressions. However, we could also see that virtual faces have some limitations because they are not real: subjects recognized the expressions well but were not influenced by them to the same extent. Although a virtual avatar has some limitations in conveying its emotion using facial expressions, this study is significant in that it shows that a new potential exists to use or manipulate emotional intensity by controlling a virtual avatar's facial expression linearly using a morphing technique. Therefore, it is predicted that this technique may be used for assessing emotional characteristics of humans, and may be of particular benefit for work with people with emotional disorders through a presentation of dynamic expressions of various emotional intensities.

12.
Although facial information is distributed over spatial as well as temporal domains, thus far research on selective attention to disapproving faces has concentrated predominantly on the spatial domain. This study examined the temporal characteristics of visual attention towards facial expressions by presenting a Rapid Serial Visual Presentation (RSVP) paradigm to high (n=33) and low (n=34) socially anxious women. Neutral letter stimuli (p, q, d, b) were presented as the first target (T1), and emotional faces (neutral, happy, angry) as the second target (T2). Irrespective of social anxiety, the attentional blink was attenuated for emotional faces. Emotional faces as T2 did not influence identification accuracy of a preceding (neutral) target. The relatively low threshold for the (explicit) identification of emotional expressions is consistent with the view that emotional facial expressions are processed relatively efficiently.

13.
Previous studies have revealed that decoding of facial expressions is a specific component of face comprehension and that semantic information might be processed separately from the basic stage of face perception. In order to explore event-related potentials (ERPs) related to recognition of facial expressions and the effect of the semantic content of the stimulus, we analyzed 20 normal subjects. Faces with three prototypical emotional expressions (fear, happiness, and sadness) and with three morphed expressions were presented in random order. The neutral stimuli represented the control condition. Whereas ERP profiles were similar with respect to an early negative ERP (N170), differences in peak amplitude were observed later between incongruous (morphed) expressions and congruous (prototypical) ones. In fact, the results demonstrated that the emotional morphed faces elicited a negative peak at about 360 ms, mainly distributed over the posterior site. The electrophysiological activity observed may represent a specific cognitive process underlying decoding of facial expressions in the case of semantic anomaly detection. The evidence is in favor of the similarity of this negative deflection with the N400 ERP effect elicited in linguistic tasks. A domain-specific semantic module is proposed to explain these results.

14.
Some theories of emotion emphasise a close relationship between interoception and subjective experiences of emotion. In this study, we used facial expressions to examine whether interoceptive sensibility modulated emotional experience in a social context. Interoceptive sensibility was measured using the heartbeat detection task. To estimate individual emotional sensitivity, we made morphed photos that ranged between a neutral and an emotional facial expression (i.e., anger, sadness, disgust, and happiness). Recognition rates of particular emotions from these photos were calculated and considered as emotional sensitivity thresholds. Our results indicate that participants with accurate interoceptive awareness are sensitive to the emotions of others, especially for expressions of sadness and happiness. We also found that false responses to sad faces were closely related to an individual's degree of social anxiety. These results suggest that interoceptive awareness modulates the intensity of the subjective experience of emotion and affects individual traits related to emotion processing.
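The heartbeat detection task mentioned above is commonly scored as a counting-accuracy index (e.g., the Schandry index). The abstract names the task but not its scoring formula, so the following is an assumption based on common practice: per-interval accuracy is one minus the normalized counting error, averaged across intervals.

```python
def heartbeat_perception_score(recorded_counts, reported_counts):
    """Mean heartbeat-counting accuracy across intervals; 1.0 = perfect counting."""
    scores = [
        1 - abs(recorded - reported) / recorded
        for recorded, reported in zip(recorded_counts, reported_counts)
    ]
    return sum(scores) / len(scores)

# Hypothetical data: actual vs. reported heartbeats over three counting intervals
score = heartbeat_perception_score([40, 55, 70], [38, 50, 70])
```

Participants scoring near 1.0 would be classed as having accurate interoceptive awareness, the group the study found to be more sensitive to others' emotions.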

15.
Although affective facial pictures are widely used in emotion research, standardised affective stimuli sets are rather scarce, and the existing sets have several limitations. We therefore conducted a validation study of 490 pictures of human facial expressions from the Karolinska Directed Emotional Faces database (KDEF). Pictures were evaluated on emotional content and were rated on an intensity and arousal scale. Results indicate that the database contains a valid set of affective facial pictures. Hit rates, intensity, and arousal of the 20 best KDEF pictures for each basic emotion are provided in an appendix.

16.
Facial stimuli are widely used in behavioural and brain science research to investigate emotional facial processing. However, some studies have demonstrated that dynamic expressions elicit stronger emotional responses compared to static images. To address the need for more ecologically valid and powerful facial emotional stimuli, we created Dynamic FACES, a database of morphed videos (n = 1026) from younger, middle-aged, and older adults displaying naturalistic emotional facial expressions (neutrality, sadness, disgust, fear, anger, happiness). To assess adult age differences in emotion identification of dynamic stimuli and to provide normative ratings for this modified set of stimuli, healthy adults (n = 1822, age range 18–86 years) categorised for each video the emotional expression displayed, rated the expression distinctiveness, estimated the age of the face model, and rated the naturalness of the expression. We found few age differences in emotion identification when using dynamic stimuli. Only for angry faces did older adults show lower levels of identification accuracy than younger adults. Further, older adults outperformed middle-aged adults in the identification of sadness. The use of dynamic facial emotional stimuli has previously been limited, but Dynamic FACES provides a large database of high-resolution naturalistic, dynamic expressions across adulthood. Information on using Dynamic FACES for research purposes can be found at http://faces.mpib-berlin.mpg.de.

17.
Previous studies have suggested a link between the processing of the emotional expression of a face and how attractive it appears. In two experiments we investigated the interrelationship between attractiveness and happiness. In Experiment 1 we presented morphed faces varying in attractiveness and happiness and asked participants to choose the more attractive of two simultaneously presented faces. In the second experiment we used the same stimuli as in Experiment 1 and asked participants to choose the happier face. The results of Experiment 1 revealed that the evaluation of attractiveness is strongly influenced by the intensity of a smile expressed on a face: A happy facial expression could even compensate for relative unattractiveness. Conversely, the findings of Experiment 2 showed that facial attractiveness also influences the evaluation of happiness: It was easier to choose the happier of two faces if the happier face was also more attractive. We discuss the interrelationship of happiness and attractiveness with regard to evolutionary relevance of positive affective status and rewarding effects.

18.
The present work represents the first study to investigate the relationship between adult attachment avoidance and anxiety and automatic affective responses to basic facial emotions. Subliminal affective priming methods allowed for the assessment of unconscious affective reactions. An affective priming task using masked sad and happy faces, both of which are approach‐related facial expressions, was administered to 30 healthy volunteers. Participants also completed the Relationship Scales Questionnaire and measures of anxiety and depression. Attachment avoidance was negatively associated with affective priming due to sad (but not happy) facial expressions. This association occurred independently of attachment anxiety, depressivity, and trait anxiety. Attachment anxiety was not correlated with priming due to sad or happy facial expressions. The present results are consistent with the assumption that attachment avoidance moderates automatic affective reaction to sad faces. Our data indicate that avoidant attachment is related to a low automatic affective responsivity to sad facial expressions.

19.
Three experiments examined 3- and 5-year-olds’ recognition of faces in constant and varied emotional expressions. Children were asked to identify repeatedly presented target faces, distinguishing them from distractor faces, during an immediate recognition test and during delayed assessments after 10 min and one week. Emotional facial expression remained neutral (Experiment 1) or varied between immediate and delayed tests: from neutral to smile and anger (Experiment 2), from smile to neutral and anger (Experiment 3, condition 1), or from anger to neutral and smile (Experiment 3, condition 2). In all experiments, immediate face recognition was not influenced by emotional expression for either age group. Delayed face recognition was most accurate for faces in identical emotional expression. For 5-year-olds, delayed face recognition (with varied emotional expression) was not influenced by which emotional expression had been displayed during the immediate recognition test. Among 3-year-olds, accuracy decreased when facial expressions varied from neutral to smile and anger but was constant when facial expressions varied from anger or smile to neutral, smile or anger. Three-year-olds’ recognition was facilitated when faces initially displayed smile or anger expressions, but this was not the case for 5-year-olds. Results thus indicate a developmental progression in face identity recognition with varied emotional expressions between ages 3 and 5.

20.
This study investigated whether sensitivity to and evaluation of facial expressions varied with repeated exposure to non-prototypical facial expressions for a short presentation time. A morphed facial expression was presented for 500 ms repeatedly, and participants were required to indicate whether each facial expression was happy or angry. We manipulated the distribution of presentations of the morphed facial expressions for each facial stimulus. Some of the individuals depicted in the facial stimuli expressed anger frequently (i.e., anger-prone individuals), while the others expressed happiness frequently (i.e., happiness-prone individuals). After being exposed to the faces of anger-prone individuals, the participants became less sensitive to those individuals’ angry faces. Further, after being exposed to the faces of happiness-prone individuals, the participants became less sensitive to those individuals’ happy faces. We also found a relative increase in the social desirability of happiness-prone individuals after exposure to the facial stimuli.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司)  京ICP备09084417号