Similar documents
20 similar documents found (search time: 31 ms)
1.
This experiment examines how emotion is perceived by using facial and vocal cues of a speaker. Three levels of facial affect were presented using a computer-generated face. Three levels of vocal affect were obtained by recording the voice of a male amateur actor who spoke a semantically neutral word in different simulated emotional states. These two independent variables were presented to subjects in all possible permutations—visual cues alone, vocal cues alone, and visual and vocal cues together—which gave a total set of 15 stimuli. The subjects were asked to judge the emotion of the stimuli in a two-alternative forced choice task (either HAPPY or ANGRY). The results indicate that subjects evaluate and integrate information from both modalities to perceive emotion. The influence of one modality was greater to the extent that the other was ambiguous (neutral). The fuzzy logical model of perception (FLMP) fit the judgments significantly better than an additive model, which weakens theories based on an additive combination of modalities, categorical perception, and influence from only a single modality.
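The contrast between the FLMP and an additive model can be sketched numerically. A minimal sketch in Python, assuming the standard two-alternative form of the FLMP (multiplicative integration of degrees of support, normalized by the Luce choice rule); the support values below are illustrative, not the study's data:

```python
def flmp(v, a):
    """FLMP prediction for P(happy): multiply the visual (v) and
    auditory (a) degrees of support, then normalize (Luce choice rule)."""
    return (v * a) / (v * a + (1 - v) * (1 - a))

def additive(v, a):
    """Additive model prediction: a simple average of the two supports."""
    return (v + a) / 2

# With an ambiguous (neutral) voice, a = 0.5, the FLMP lets the face
# dominate the judgment, while the additive model pulls it toward 0.5:
print(flmp(0.9, 0.5))      # 0.9 -> the face fully determines the response
print(additive(0.9, 0.5))  # 0.7
```

This illustrates the abstract's key finding: under the multiplicative rule, the influence of one modality grows as the other becomes ambiguous, which an additive combination cannot reproduce.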

2.
Lexical decision and word-naming experiments were conducted to examine influences of emotions in visual word recognition. Emotional states of happiness and sadness were induced with classical music. In the first two experiments, happy and sad participants (and neutral-emotion participants in Experiment 2) made lexical decisions about letter-strings, some of which were words with meanings strongly associated with the emotions happiness, love, sadness, and anger. Emotional state of the perceiver was associated with facilitation of response to words categorically related to that emotion (i.e. happy and sad words). However, such facilitation was not observed for words that were related by valence, but not category, to the induced emotions (i.e. love and anger words). Evidence for categorical influences of emotional state in word recognition was also observed in a third experiment that employed a word-naming task. Together the results support a categorical emotions model of the influences of emotion in information processing (Niedenthal, Setterlund, & Jones, 1994). Moreover, the result of the word-naming experiment suggests that the effects of emotion are evident at very early stages in cognitive processing.

3.
A free-vision chimeric facial emotion judgment task and a tachistoscopic face-recognition reaction time task were administered to 20 male right-handed subjects. The tachistoscopic task involved judgments of whether a poser in the centrally presented full-face photograph was the same or different poser than in a profile photograph presented in the left or right visual field (LVF, RVF). The free-vision task was that used by J. Levy, W. Heller, M. Banich, and L. Burton (1983, Brain and Cognition, 2, 404-419) and involved judging which of two chimeric faces appeared happier, in which the two chimeras were mirror images of each other and each chimera consisted of a smiling half-face joined at the midline to a neutral half-face of the same poser. For the tachistoscopic task, subjects were divided into groups of Fast and Slow responders by a median split of the mean reaction times. For the Fast subjects, judgments were faster in the LVF than in the RVF, and there was a significant interaction between visual field and profile direction, such that responses were faster for medially oriented profiles; i.e., LVF responses were faster for right-facing than for left-facing profiles, with the reverse relationship in the RVF. The Slow responders did not show these effects. Only the Fast group showed the bias for choosing the chimera with the smile on the left as happier, and mean response speed and the LVF advantage on the tachistoscopic test correlated with the leftward bias on the free-vision task for all subjects combined. It was suggested that overall response speed on the face-matching task reflected the extent to which specialized and more efficient right hemisphere functions were activated.
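The Fast/Slow grouping described above is a standard median split. A minimal sketch, assuming per-subject mean reaction times in milliseconds (the function name and the data are hypothetical, not taken from the study):

```python
import statistics

def median_split(mean_rts):
    """Divide subjects into Fast and Slow groups at the median of
    their mean reaction times (ties go to the Slow group)."""
    med = statistics.median(mean_rts)
    fast = [rt for rt in mean_rts if rt < med]
    slow = [rt for rt in mean_rts if rt >= med]
    return fast, slow

fast, slow = median_split([412, 385, 530, 478, 601, 440])
print(fast)  # [412, 385, 440]
print(slow)  # [530, 478, 601]
```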

4.
We establish attentional capture by emotional distractor faces presented as a "singleton" in a search task in which the emotion is entirely irrelevant. Participants searched for a male (or female) target face among female (or male) faces and indicated whether the target face was tilted to the left or right. The presence (vs. absence) of an irrelevant emotional singleton expression (fearful, angry, or happy) in one of the distractor faces slowed search reaction times compared to the singleton absent or singleton target conditions. Facilitation for emotional singleton targets was found for the happy expression but not for the fearful or angry expressions. These effects were found irrespective of face gender, and the failure of a singleton neutral face to capture attention among emotional faces rules out a visual odd-one-out account for the emotional capture. The present study thus establishes attentional capture by irrelevant emotional expressions.

5.
Recent research has looked at whether the expectancy of an emotion can account for subsequent valence-specific laterality effects of prosodic emotion, though no research has examined this effect for facial emotion. In the study here (n=58), we investigated this issue using two tasks: an emotional face perception task and a novel word task that involved categorising positive and negative words. In the face perception task a valence-specific laterality effect was found for surprise (positive) and anger (negative) faces in the control but not expectancy condition. Interestingly, lateralisation differed for face gender, revealing a left hemisphere advantage for male faces and a right hemisphere advantage for female faces. In the word task, an affective priming effect was found, with higher accuracy when valence of picture prime and word target were congruent. Target words were also responded to faster when presented to the LVF versus RVF in the expectancy but not control condition. These findings suggest that expecting an emotion influences lateralised processing, but that this differs with the perceptual/experiential dimension of the task; further, hemispheric processing of emotional expressions appears to differ with the gender of the face.

6.
In this study, we examined whether integration of visual and auditory information about emotions requires limited attentional resources. Subjects judged whether a voice expressed happiness or fear, while trying to ignore a concurrently presented static facial expression. As an additional task, the subjects had to add two numbers together rapidly (Experiment 1), count the occurrences of a target digit in a rapid serial visual presentation (Experiment 2), or judge the pitch of a tone as high or low (Experiment 3). The visible face had an impact on judgments of the emotion of the heard voice in all the experiments. This cross-modal effect was independent of whether or not the subjects performed a demanding additional task. This suggests that integration of visual and auditory information about emotions may be a mandatory process, unconstrained by attentional resources.

7.
《Brain and cognition》2011,75(3):324-331
Recent research has looked at whether the expectancy of an emotion can account for subsequent valence-specific laterality effects of prosodic emotion, though no research has examined this effect for facial emotion. In the study here (n = 58), we investigated this issue using two tasks: an emotional face perception task and a novel word task that involved categorising positive and negative words. In the face perception task a valence-specific laterality effect was found for surprise (positive) and anger (negative) faces in the control but not expectancy condition. Interestingly, lateralisation differed for face gender, revealing a left hemisphere advantage for male faces and a right hemisphere advantage for female faces. In the word task, an affective priming effect was found, with higher accuracy when valence of picture prime and word target were congruent. Target words were also responded to faster when presented to the LVF versus RVF in the expectancy but not control condition. These findings suggest that expecting an emotion influences lateralised processing, but that this differs with the perceptual/experiential dimension of the task; further, hemispheric processing of emotional expressions appears to differ with the gender of the face.

8.
The present study investigated whether counter-regulation in affective processing is triggered by emotions. Automatic attention allocation to valent stimuli was measured in the context of positive and negative affective states. Valence biases were assessed by comparing the detection of positive versus negative words in a visual search task (Experiment 1) or by comparing interference effects of positive and negative distractor words in an emotional Stroop task (Experiment 2). Imagining a hypothetical emotional situation (Experiment 1) or watching romantic versus depressing movie clips (Experiment 2) increased attention allocation to stimuli that were opposite in valence to the current emotional state. Counter-regulation is assumed to reflect a basic mechanism underlying implicit emotion regulation.

9.
It is possible that the visual discrimination of emotion categories and emotion word vocabulary develop via common emotion-specific processes. In contrast, it is possible that they develop with vocabulary development more generally. This study contrasts these two possibilities. Twenty-three 26-month-olds participated in a visual perceptual discrimination task involving emotional facial expressions. After familiarization to a 100% happy face, toddlers were tested for their visual preference for a novel sad face in a side-by-side presentation paired with the familiar happy face. Parental report was used to quantify production of emotion words and vocabulary generally. Visual preference for the novel emotion (sad) in the discrimination task correlated with emotion word vocabulary size but not with overall vocabulary size.

10.
Visual field differences for the recognition of emotional expression were investigated using a tachistoscopic procedure. Cartoon line drawings of five adult male characters, each with five emotional expressions ranging from extremely positive to extremely negative, were used as stimuli. Single stimuli were presented unilaterally for 85 msec. Subjects (N = 20) were asked to compare this target face to a subsequent centrally presented face and to decide whether the emotional expressions of the two faces, or the character represented by the two faces, were the same or different. Significant left visual field (LVF) superiorities for both character and emotional expression recognition were found. Subsequent analyses demonstrated the independence of these effects. The LVF superiority for emotional judgments was related to the degree of affective expression, but that for character recognition was not. The results of this experiment are consistent with experimental and clinical literature which has indicated a right hemispheric superiority for face recognition and for processing emotional stimuli. The asymmetry for emotion recognition is interpreted as being an expression of the right hemisphere's synthetic and integrative characteristics, its holistic nature, and its use of imagic associations.

11.
The present paper investigated recognition errors in affective judgement of facial emotional expressions. Twenty-eight females and sixteen males participated in the study. The results showed that in both males and females emotional displays could be correctly classified, but females had a higher rate of correct classification; males were more likely to have difficulty distinguishing one emotion from another. Females rated emotions identically regardless of whether the emotion was displayed by a male or female face. Furthermore, the two-factor structure of emotion, based on a valence and an arousal dimension, was only present for female subjects. These results further extend our knowledge about gender differences in affective information processing.

12.
The influences of sex and lateralized visual hemispace bias in the judgment of the emotional valence of faces during a free viewing condition are evaluated. 73 subjects (aged 18 to 52 yr.) viewed videotaped facial expressions of emotion in normal and mirror-reversed orientation and classified each face as a positive, negative, or neutral expression. There was a significant interaction between the sex of the rater and the orientation of the face that influenced the proportion of correct classifications. Male and female perceivers did not differ in the accuracy of their affect judgments for faces viewed in normal orientation, whereas reversal of the orientation of the faces resulted in a significant enhancement of accuracy judgments for the males but not the females. The results suggest greater cerebral lateralization of perceptual processes in males.

13.
14.
In a sample of 325 college students, we examined how context influences judgments of facial expressions of emotion, using a newly developed facial affect recognition task in which emotional faces are superimposed upon emotional and neutral contexts. This research used a larger sample size than previous studies, included more emotions, varied the intensity level of the expressed emotion to avoid potential ceiling effects from very easy recognition, did not explicitly direct attention to the context, and aimed to understand how recognition is influenced by non-facial information, both situationally-relevant and situationally-irrelevant. Both accuracy and RT varied as a function of context. For all facial expressions of emotion other than happiness, accuracy increased when the emotion of the face and context matched, and decreased when they mismatched. For all emotions, participants responded faster when the emotion of the face and image matched and slower when they mismatched. Results suggest that the judgment of the facial expression is itself influenced by the contextual information instead of both being judged independently and then combined. Additionally, the results have implications for developing models of facial affect recognition and indicate that there are factors other than the face that can influence facial affect recognition judgments.

15.
The present study investigated whether the processing characteristics of categorizing emotional facial expressions are different from those of categorizing facial age and sex information. Given that emotions change rapidly, it was hypothesized that processing facial expressions involves a more flexible task set that causes less between-task interference than the task sets involved in processing age or sex of a face. Participants switched between three tasks: categorizing a face as looking happy or angry (emotion task), young or old (age task), and male or female (sex task). Interference between tasks was measured by global interference and response interference. Both measures revealed patterns of asymmetric interference. Global between-task interference was reduced when a task was mixed with the emotion task. Response interference, as measured by congruency effects, was larger for the emotion task than for the nonemotional tasks. The results support the idea that processing emotional facial expression constitutes a more flexible task set that causes less interference (i.e., task-set "inertia") than processing the age or sex of a face.

16.
Previous studies have demonstrated that emotions are automatically processed. Even with subliminal presentations, subjects involuntarily mimic specific facial expressions, are influenced by the valence of a preceding emotion during judgments, and exhibit slowed responses to personally meaningful emotions; these effects are due to reflexive mimicry, unconscious carryover of valence, and attentional capture, respectively. However, perception-action effects indicate that rapid processing should involve deep, semantic-level representations of emotion (e.g., “fear”), even in the absence of a clinical emotion disorder. To test this hypothesis, we developed an emotional Stroop task (Emostroop) in which subjects responded nonverbally to emotion words superimposed over task-irrelevant images of faces displaying congruent or incongruent emotional expressions. Subjects reliably responded more slowly to incongruent than to congruent stimuli, and this interference was related to trait measures of emotionality. Rapid processing of facial emotions spontaneously activates semantic, content-rich representations at the level of the specific emotion.
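The interference measure in congruency designs like the Emostroop is typically the reaction-time difference between incongruent and congruent trials. A minimal sketch, with illustrative RTs rather than the paper's data:

```python
from statistics import mean

def congruency_effect(congruent_rts, incongruent_rts):
    """Stroop-style interference: mean RT cost of incongruent trials
    relative to congruent ones (positive = slowing, i.e. interference)."""
    return mean(incongruent_rts) - mean(congruent_rts)

# Slower responses when the face's expression contradicts the word:
print(congruency_effect([520, 540, 560], [590, 610, 600]))  # 60
```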

17.
The present study investigated whether the processing characteristics of categorizing emotional facial expressions are different from those of categorizing facial age and sex information. Given that emotions change rapidly, it was hypothesized that processing facial expressions involves a more flexible task set that causes less between-task interference than the task sets involved in processing age or sex of a face. Participants switched between three tasks: categorizing a face as looking happy or angry (emotion task), young or old (age task), and male or female (sex task). Interference between tasks was measured by global interference and response interference. Both measures revealed patterns of asymmetric interference. Global between-task interference was reduced when a task was mixed with the emotion task. Response interference, as measured by congruency effects, was larger for the emotion task than for the nonemotional tasks. The results support the idea that processing emotional facial expression constitutes a more flexible task set that causes less interference (i.e., task-set “inertia”) than processing the age or sex of a face.

18.
We present here new evidence of cross-cultural agreement in the judgement of facial expression. Subjects in 10 cultures performed a more complex judgment task than has been used in previous cross-cultural studies. Instead of limiting the subjects to selecting only one emotion term for each expression, this task allowed them to indicate that multiple emotions were evident and the intensity of each emotion. Agreement was very high across cultures about which emotion was the most intense. The 10 cultures also agreed about the second most intense emotion signaled by an expression and about the relative intensity among expressions of the same emotion. However, cultural differences were found in judgments of the absolute level of emotional intensity.

19.
Visual working memory (WM) for face identities is enhanced when faces express negative versus positive emotion. To determine the stage at which emotion exerts its influence on memory for person information, we isolated expression (angry/happy) to the encoding phase (Experiment 1; neutral test faces) or retrieval phase (Experiment 2; neutral study faces). WM was only enhanced by anger when expression was present at encoding, suggesting that retrieval mechanisms are not influenced by emotional expression. To examine whether emotional information is discarded on completion of encoding or sustained in WM, in Experiment 3 an emotional word categorisation task was inserted into the maintenance interval. Emotional congruence between word and face supported memory for angry but not for happy faces, suggesting that negative emotional information is preferentially sustained during WM maintenance. Our findings demonstrate that negative expressions exert sustained and beneficial effects on WM for faces that extend beyond encoding.

20.
Using a cue–target paradigm, participants performed a face-gender discrimination task, and sex differences in inhibition of return (IOR) for face-gender targets were examined from the perspective of attentional orienting. Reaction times to face targets at validly cued locations were significantly longer than to face targets at invalidly cued locations, showing a clear inhibition-of-return effect. Further analysis showed that male participants' IOR magnitude did not differ between same-sex and opposite-sex face targets, indicating that men's IOR for face stimuli is unaffected by face gender; in contrast, women in the follicular phase showed significantly smaller IOR for opposite-sex face targets than women in the luteal phase, indicating that women's IOR for face targets is related to an interaction between face gender and menstrual-cycle phase. These results provide further partial evidence for the blind mechanism of inhibition of return, and we speculate that the influence of face gender on IOR may depend on the observer's sensitivity to sexual information.
