Similar documents
20 similar documents found (search time: 15 ms)
1.
Emotionally intoned sentences (happy, sad, angry, and neutral voices) were dichotically paired with monotone sentences. A left ear advantage was found for recognizing emotional intonation, while a simultaneous right ear advantage was found for recognizing the verbal content of the sentences. The results indicate a right hemispheric superiority in recognizing emotional stimuli. These findings are most reasonably attributed to differential lateralization of emotional functions, rather than to subject strategy effects. No evidence was found to support a hypothesis that each hemisphere is involved in processing different types of emotion.

2.
Valence-specific laterality effects have been frequently obtained in facial emotion perception but not in vocal emotion perception. We report a dichotic listening study further examining whether valence-specific laterality effects generalise to vocal emotions. Based on previous literature, we tested whether valence-specific laterality effects were dependent on blocked presentation of the emotion conditions, on the naturalness of the emotional stimuli, or on listener sex. We presented happy and sad sentences, paired with neutral counterparts, dichotically in an emotion localisation task, with vocal stimuli being preceded by verbal labels indicating target emotions. The measure was accuracy. When stimuli of the same emotion were presented as a block, a valence-specific laterality effect was demonstrated, but only in original stimuli and not morphed stimuli. There was a separate interaction with listener sex. We interpret our findings as suggesting that the valence-specific laterality hypothesis is supported only in certain circumstances. We discuss modulating factors, and we consider whether the mechanisms underlying those factors may be attentional or experiential in nature.

3.
The valence hypothesis suggests that the right hemisphere is specialised for negative emotions and the left hemisphere is specialised for positive emotions (Silberman & Weingartner, 1986). It is unclear to what extent valence-specific effects in facial emotion perception depend upon the gender of the perceiver. To explore this question, 46 participants completed a free-view lateralised emotion perception task which involved judging which of two faces expressed a particular emotion. Eye fixations of 24 of the participants were recorded using an eye tracker. A significant valence-specific laterality effect was obtained, with positive emotions more accurately identified when presented to the right of centre, and negative emotions more accurately identified when presented to the left of centre. The valence-specific laterality effect did not depend on the gender of the perceiver. Analysis of the eye-tracking data showed that males made more fixations while recognising the emotions, and that the left eye of the face was fixated substantially more than the right eye during emotion perception. Finally, in a control condition where both faces were identical but expressed a faint emotion, the participants were significantly more likely to select the right side when the emotion label was positive. This finding adds to evidence suggesting that valence effects in facial emotion perception are caused not only by the perception of the emotion but also by other processes.

4.
The present study examined the reliability of a dichotic listening task using nonverbal stimuli. Twenty undergraduate students (all right-handed native English speakers) had to report whether they had heard a target emotion. The task used English words (bower, dower, power, tower) pronounced with an angry, happy, neutral, or sad emotional tone. Results showed a relatively high level of test-retest reliability for the laterality effect. In addition, a significant gender by ear of presentation interaction was obtained. The interaction reflected the fact that a strong left ear advantage was found in females but not in males. The findings indicate that the task used here should be considered a reliable means to assess the lateralization of emotions. Issues concerning the relation between gender and laterality are addressed in the discussion.

5.
6.
Older adults have greater difficulty than younger adults perceiving vocal emotions. To better characterise this effect, we explored its relation to age differences in sensory, cognitive, and emotional functioning. Additionally, we examined the role of speaker age and listener sex. Participants (N = 163) aged 19–34 years and 60–85 years categorised neutral sentences spoken by ten younger and ten older speakers with a happy, neutral, sad, or angry voice. Acoustic analyses indicated that expressions from younger and older speakers denoted the intended emotion with similar accuracy. As expected, younger participants outperformed older participants, and this effect was statistically mediated by an age-related decline in both optimism and working memory. Additionally, age differences in emotion perception were larger for younger than for older speakers, and the advantage for perceiving younger over older speakers was greater in younger than in older participants. Lastly, a female perception benefit was less pervasive in the older than in the younger group. Together, these findings suggest that the role of age in emotion perception is multi-faceted. It is linked to emotional and cognitive change, to processing biases that benefit young and own-age expressions, and to the different aptitudes of women and men.

7.
Recent research has examined whether the expectancy of an emotion can account for subsequent valence-specific laterality effects of prosodic emotion, though no research has examined this effect for facial emotion. In the study here (n=58), we investigated this issue using two tasks: an emotional face perception task and a novel word task that involved categorising positive and negative words. In the face perception task, a valence-specific laterality effect was found for surprise (positive) and anger (negative) faces in the control but not the expectancy condition. Interestingly, lateralisation differed by face gender, revealing a left hemisphere advantage for male faces and a right hemisphere advantage for female faces. In the word task, an affective priming effect was found, with higher accuracy when the valence of the picture prime and word target were congruent. Target words were also responded to faster when presented to the LVF versus the RVF in the expectancy but not the control condition. These findings suggest that expecting an emotion influences laterality processing, but that this differs in terms of the perceptual/experience dimension of the task. Furthermore, hemispheric processing of emotional expressions appears to differ with the gender of the face.

8.
Empirical tests of the "right hemisphere dominance" versus "valence" theories of emotion processing are confounded by known sex differences in lateralization. Moreover, information about the sex of the person posing an emotion might be processed differently by men and women because of an adaptive male bias to notice expressions of threat and vigilance in other male faces. The purpose of this study was to investigate whether sex of poser and emotion displayed influenced lateralization in men and women by analyzing "laterality quotient" scores on a test which depicts vertically split chimeric faces, formed with one half showing a neutral expression and the other half showing an emotional expression. We found that men (N = 50) were significantly more lateralized for emotions indicative of vigilance and threat (happy, sad, angry, and surprised) in male faces relative to female faces and compared to women (N = 44). These data indicate that sex differences in functional cerebral lateralization for facial emotion may be specific to the emotion presented and the sex of face presenting it.

9.
《Brain and cognition》2011,75(3):324-331

10.
The left and right hemispheres of the brain are differentially related to the processing of emotions. Although there is little doubt that the right hemisphere is relatively superior for processing negative emotions, controversy exists over the hemispheric role in the processing of positive emotions. Eighty right-handed normal male participants were examined for visual-field (left-right) differences in the perception of facial expressions of emotion. Facial composite (RR, LL) and hemifacial (R, L) sets depicting emotion expressions of happiness and sadness were prepared. Pairs of such photographs were presented bilaterally for 150 ms, and participants were asked to select the photographs that looked more expressive. A left visual-field superiority (a right-hemisphere function) was found for sad facial emotion. A hemispheric advantage in the perception of happy expression was not found.

11.
Visual-field bias in the judgment of facial expression of emotion (total citations: 2; self-citations: 0; citations by others: 2)

12.
Spatial frequencies have been shown to play an important role in face identification, but very few studies have investigated the role of spatial frequency content in identifying different emotions. In the present study we investigated the role of spatial frequency in identifying happy and sad facial expressions. Two experiments were conducted to investigate (a) the role of specific spatial frequency content in emotion identification, and (b) hemispheric asymmetry in emotion identification. Given the links between global processing, happy emotions, and low frequencies, we hypothesized that low spatial frequencies would be important for identifying the happy expression. Correspondingly, we also hypothesized that high spatial frequencies would be important for identifying the sad expression, given the links between local processing, sad emotions, and high spatial frequencies. As expected, we found that identification of the happy expression was dependent on low spatial frequencies and identification of the sad expression was dependent on high spatial frequencies. There was a hemispheric asymmetry in identification of the sad expression, especially in the right hemisphere, possibly mediated by high spatial frequency content. Results indicate the importance of spatial frequency content in the identification of happy and sad emotional expressions and point to the mechanisms involved in emotion identification.

13.
This study assessed both left- and right-hemisphere functions simultaneously by dichotically pairing two-syllable words that differed only in the initial stop consonant and were spoken in different emotional tones. Seventy-two right-handed normally achieving children, 12 boys and 12 girls at each of grades 1, 3, and 5, were instructed to detect either the presence of a specific word or of a specific emotion. In addition, 30 right-handed learning disabled (LD) children (age-matched to the normal controls) were assessed to determine whether LD children distribute verbal and nonverbal functions to different hemispheres. Results indicated that both control and LD children demonstrated an overall right ear advantage (REA) for word stimuli and a left ear advantage (LEA) for emotional stimuli, and that emotional stimuli were easier to process than word stimuli; however, LD children were less accurate than their control counterparts in processing both types of stimuli. ‘Complementary specialization,’ as assessed through the distribution of laterality effects, was found to be greater for control children than for LD children. However, the lack of consistency in complementary specialization found among the three developmental grade levels may indicate that independent brain mechanisms underlying verbal and emotional processing have yet to be fully established in children. Further, in contrast to adult findings, a larger LEA was obtained for the emotion ‘happy’ than for the emotion ‘sad.’ It was concluded that whereas independent hemisphere processing for words and emotions is somewhat prevalent for control children, LD children might not be as strongly lateralized for opposite-hemisphere processing of these functions.

14.
The present study investigated the possible role of ceiling effects in producing laterality effects of small magnitude in dichotic emotion detection. Twenty-two right-handed undergraduate students participated in the present experiment. They were required to detect the presence of a target emotion in expressions spoken in happy, sad, angry, and neutral tones presented dichotically. Stimuli were adjusted to 70 dB and occurred simultaneously with a white noise mask that had an intensity of 65, 70, 80, or 85 dB. Results showed a left ear advantage (LEA) for the 65 dB mask and a right ear advantage for the 85 dB mask, but only after two testing sessions. The possible existence of a generalized right ear bias that might affect the observed LEA for non-verbal tasks is discussed. Alternative explanations and limitations of the present experiment are also presented.

15.
Older adults perceive less intense negative emotion in facial expressions compared to younger counterparts. Prior research has also demonstrated that mood alters facial emotion perception. Nevertheless, there is little evidence evaluating the interactive effects of age and mood on emotion perception. This study investigated the effects of sad mood on younger and older adults’ perception of emotional and neutral faces. Participants rated the intensity of stimuli while listening to sad music and in silence. Measures of mood were administered. Younger and older participants rated sad faces as displaying stronger sadness when they experienced sad mood. While younger participants showed no influence of sad mood on happiness ratings of happy faces, older adults rated happy faces as conveying less happiness when they experienced sad mood. This study demonstrates how emotion perception can change when a controlled mood induction procedure is applied to alter mood in younger and older participants.

16.
To inform how emotions in speech are implicitly processed and registered in memory, we compared how emotional prosody, emotional semantics, and both cues in tandem prime decisions about conjoined emotional faces. Fifty-two participants rendered facial affect decisions (Pell, 2005a), indicating whether a target face represented an emotion (happiness or sadness) or not (a facial grimace), after passively listening to happy, sad, or neutral prime utterances. Emotional information from primes was conveyed by: (1) prosody only; (2) semantic cues only; or (3) combined prosody and semantic cues. Results indicated that prosody, semantics, and combined prosody–semantic cues facilitate emotional decisions about target faces in an emotion-congruent manner. However, the magnitude of priming did not vary across tasks. Our findings highlight that emotional meanings of prosody and semantic cues are systematically registered during speech processing, but with similar effects on associative knowledge about emotions, which is presumably shared by prosody, semantics, and faces.

17.
Patients with lesions to either the right or left hemisphere and control subjects were asked to judge the similarity of pairs of photographs of a person displaying different emotions, and of pairs of emotion words. The results were submitted to a multidimensional scaling analysis. Right-hemisphere-damaged subjects were found to be more impaired at perceiving facial emotions than were left-hemisphere-damaged subjects or controls, and this impairment was not confined to the perception of a subset of facial emotions nor to judging emotional valence (Pleasantness versus Unpleasantness). Rather, subtle impairments in perceiving a wide range of facial emotions were found, mostly concerning differentiation of the Positive-Negative and Attention-Rejection dimensions, and concerning the strategies the subjects used to make their judgments. The right-hemisphere-damaged subjects performed comparably to controls in their ratings of emotion words, suggesting that their ability to conceptualize emotional states was intact and that their impairment was strictly in the perception of emotion.

18.
The perception of visual aftereffects has long been recognized, and these aftereffects reveal a relationship between perceptual categories. Thus, emotional expression aftereffects can be used to map the categorical relationships among emotion percepts. One might expect a symmetric relationship among categories, but an evolutionary, functional perspective predicts an asymmetrical relationship. In a series of 7 experiments, the authors tested these predictions. Participants fixated on a facial expression, then briefly viewed a neutral expression, then reported the apparent facial expression of the 2nd image. Experiment 1 revealed that happy and sad are opposites of one another; each evokes the other as an aftereffect. The 2nd and 3rd experiments revealed that fixating on any negative emotion yields an aftereffect perceived as happy, whereas fixating on a happy face results in the perception of a sad aftereffect. This suggests an asymmetric relationship among categories. Experiments 4-7 explored the mechanism driving this effect. The evolutionary and functional explanations for the category asymmetry are discussed.

19.

20.
Research on emotion processing in the visual modality suggests a processing advantage for emotionally salient stimuli, even at early sensory stages; however, results concerning the auditory correlates are inconsistent. We present two experiments that employed a gating paradigm to investigate emotional prosody. In Experiment 1, participants heard successively building segments of Jabberwocky “sentences” spoken with happy, angry, or neutral intonation. After each segment, participants indicated the emotion conveyed and rated their confidence in their decision. Participants in Experiment 2 also heard Jabberwocky “sentences” in successive increments, with half discriminating happy from neutral prosody, and half discriminating angry from neutral prosody. Participants in both experiments identified neutral prosody more rapidly and accurately than happy or angry prosody. Confidence ratings were greater for neutral sentences, and error patterns also indicated a bias for recognising neutral prosody. Taken together, results suggest that enhanced processing of emotional content may be constrained by stimulus modality.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司)  京ICP备09084417号