Similar Documents
1.
Functional hemispheric specialization in recognizing faces expressing emotions was investigated in 18 normal hearing and 18 congenitally deaf children aged 13-14 years. Three kinds of faces were presented: happy, to express positive emotions; sad, to express negative emotions; and neutral. The subjects' task was to recognize the test face exposed for 20 msec in the left or right visual field. The subjects answered by pointing at the exposed stimulus on a response card that contained three different faces. Errors committed when faces were exposed in the left and right visual fields were analyzed. In the control group the right hemisphere dominated in the case of sad and neutral faces; there were no significant differences in recognition of happy faces. The differentiated pattern of hemispheric organization in normal hearing persons supports the hypothesis that positive and negative emotions expressed by faces are processed differently. The observed hemispheric asymmetry was a result of two factors: (1) processing of faces as complex patterns requiring visuo-spatial analysis, and (2) processing of the emotions they contain. Functional hemispheric asymmetry was not observed in the group of deaf children for any kind of emotion expressed in the presented faces. The results suggest that lack of auditory experience influences the organization of functional hemispheric specialization. It can be supposed that in deaf children the analysis of information contained in emotional faces takes place in both hemispheres.

2.
Previous studies of cerebral asymmetry for the perception of American Sign Language (ASL) have used only static representations of signs; in this study we present moving signs. Congenitally deaf, native ASL signers identified moving signs, static representations of signs, and English words. The stimuli were presented rapidly by motion picture to each visual hemifield. Normally hearing English speakers also identified the English words. Consistent with previous findings, both the deaf and the hearing subjects showed a left-hemisphere advantage for the English words; likewise, the deaf subjects showed a right-hemisphere advantage for the statically presented signs. With the moving signs, the deaf showed no lateral asymmetry. The shift from right dominance to a more balanced hemispheric involvement with the change from static to moving signs is consistent with Kimura's position that the left hemisphere predominates in the analysis of skilled motor sequencing (Kimura, 1976). The results also indicate that ASL may be more bilaterally represented than is English and that the spatial component of language stimuli can greatly influence lateral asymmetries.

3.
Cerebral laterality was examined for third-, fourth-, and fifth-grade deaf and hearing subjects. The experimental task involved the processing of word and picture stimuli presented singly to the right and left visual hemifields. The analyses indicated that the deaf children were faster than the hearing children in overall processing efficiency, and that they performed differently in regard to hemispheric lateralization. The deaf children processed the stimuli more efficiently in the right hemisphere, while the hearing children demonstrated a left-hemisphere proficiency. This finding is discussed in terms of the hypothesis that cerebral lateralization is influenced by auditory processing.

4.
Through computational modeling, here we examine whether visual and task characteristics of writing systems alone can account for lateralization differences in visual word recognition between different languages without assuming influence from left hemisphere (LH) lateralized language processes. We apply a hemispheric processing model of face recognition to visual word recognition; the model implements a theory of hemispheric asymmetry in perception that posits low spatial frequency biases in the right hemisphere and high spatial frequency (HSF) biases in the LH. We show two factors that can influence lateralization: (a) Visual similarity among words: The more similar the words in the lexicon look visually, the more HSF/LH processing is required to distinguish them, and (b) Requirement to decompose words into graphemes for grapheme-phoneme mapping: Alphabetic reading (involving grapheme-phoneme conversion) requires more HSF/LH processing than logographic reading (no grapheme-phoneme mapping). These factors may explain the difference in lateralization between English and Chinese orthographic processing.
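The spatial-frequency account in this abstract can be made concrete with a small simulation. The sketch below is not the authors' model; it is a minimal toy illustration of factor (a), assuming only numpy and scipy: when items in a hypothetical "lexicon" share the same coarse layout and differ only in fine detail, a high-pass (HSF/LH-biased) channel separates them better than a low-pass (LSF/RH-biased) channel. All function names and parameter values here are illustrative assumptions.

```python
# Toy illustration (not the authors' model): visually similar "word images"
# are easier to discriminate from their high-spatial-frequency (HSF) content
# than from their low-spatial-frequency (LSF) content.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)

def make_lexicon(n_words=20, size=64):
    """Toy word images: one shared coarse layout plus item-specific fine detail."""
    shared_coarse = gaussian_filter(rng.standard_normal((size, size)), sigma=8)
    return np.array([shared_coarse + 0.3 * rng.standard_normal((size, size))
                     for _ in range(n_words)])

def lsf(img, sigma=4):
    """Low-pass channel (rough stand-in for the RH/LSF bias)."""
    return gaussian_filter(img, sigma)

def hsf(img, sigma=4):
    """High-pass channel (rough stand-in for the LH/HSF bias)."""
    return img - gaussian_filter(img, sigma)

def mean_pairwise_distance(imgs):
    """Average Euclidean distance over all item pairs: a crude discriminability proxy."""
    n = len(imgs)
    dists = [np.linalg.norm(imgs[i] - imgs[j])
             for i in range(n) for j in range(i + 1, n)]
    return float(np.mean(dists))

lexicon = make_lexicon()
print("LSF channel discriminability:", mean_pairwise_distance([lsf(w) for w in lexicon]))
print("HSF channel discriminability:", mean_pairwise_distance([hsf(w) for w in lexicon]))
# Because the items differ only in fine detail, the HSF channel yields larger
# pairwise distances, mirroring the claim that a visually similar lexicon
# pushes word recognition toward HSF/LH processing.
```

Running the script prints a larger discriminability value for the high-pass channel, which is the intuition behind the paper's claim that denser visual similarity in a writing system recruits more HSF/LH resources.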

5.
This study explores the use of two types of facial expressions, linguistic and affective, in a lateralized recognition accuracy test with hearing and deaf subjects. The linguistic expressions represent unfamiliar facial expressions for the hearing subjects, whereas they serve as meaningful linguistic emblems for deaf signers. Hearing subjects showed left visual field advantages for both types of signals, while deaf subjects' visual field asymmetries were greatly influenced by the order of presentation. The results suggest that for hearing persons, the right hemisphere may predominate in the recognition of all forms of facial expression. For deaf signers, hemispheric specialization for the processing of facial signals may be influenced by the different functions these signals serve in this population. The use of noncanonical facial signals in laterality paradigms is encouraged, as it provides an additional avenue of exploration into the underlying determinants of hemispheric specialization for recognition of facial expression.

6.
The aim of this study was to determine the influence of sex on hemispheric asymmetry and cooperation in a face recognition task. We used a masked priming paradigm in which the prime stimulus was centrally presented; it could be a bisymmetric face or a hemi-face in which facial information was presented in the left or the right visual field and projected to the right or the left hemisphere. The target stimulus was always a bisymmetric face presented centrally. Faces were selected from Minear and Park's (2004) database. Fifty-two right-handed students (26 men, 26 women) participated in this experiment, in which accuracy (percentage of correct responses) and reaction times (RTs in ms) were measured. Although accuracy data showed that the percentage of correct recognition, when prime and target matched, was equivalent in men and women, men's RTs were longer than women's in all conditions. Accuracy and RTs showed that men are more strongly lateralized than women, with right hemispheric dominance. These results suggest that men are as good at face recognition as women, but that there are functional differences between the two sexes. The findings are discussed in terms of functional cerebral networks distributed over both hemispheres and of interhemispheric transmission.

7.
The cerebral lateralization pattern for speech production in normal hearing and congenitally deaf children was studied using the dual-task paradigm. Performance under the verbal task conditions showed the predicted left hemispheric dominance for speech production in the normal hearing children. No developmental trends in asymmetry were found, suggesting that speech lateralization is present in normal 3-year-old children. These data support the developmental invariance hypothesis of cerebral organization. Deaf children showed more symmetrical patterns of cerebral control for speech production. No developmental trends in functional brain organization were observed among prepubescent deaf children.

8.
This study examines two phenomena related to face perception, both of which depend on experience and holistic processing: perceivers process faces more efficiently in the right hemisphere of the brain (a hemispheric asymmetry), and they typically show greater recognition accuracy for members of their racial ingroup (a cross-race recognition deficit). The current study tests the possibility that these two effects are related. If asymmetry depends on experience, it should be particularly evident with (more familiar) ingroup faces; if cross-race recognition relies on holistic processing, it should be particularly evident for faces presented to the right hemisphere. Black and White participants viewed Black and White faces presented to either the left or right visual field. As predicted, participants showed a more pronounced asymmetry for ingroup (rather than outgroup) faces, and cross-race recognition deficits were more pronounced for stimuli presented to the left (rather than the right) visual field.

9.
In Experiment 1, neither hearing nor prelingually deaf signing adolescents showed marked lateralization for lexical decision, but, unlike the hearing, the deaf were not impaired by the introduction of pseudohomophones. In Experiment 2, semantic categorization produced a left hemisphere advantage in the hearing for words but not pictures, whereas in the deaf, words and signs, but not pictures, showed a right hemisphere advantage. In Experiment 3, the lexical decision and semantic categorization findings were confirmed, and both groups showed a right hemisphere advantage for a face/nonface decision task. The possible effect of initial language acquisition on the development of hemispheric lateralization for language is discussed.

10.
11.
There is evidence that automatic visual attention favors the right side. This study investigated whether this lateral asymmetry interacts with the right hemisphere dominance for visual location processing and the left hemisphere dominance for visual shape processing. Volunteers were tested in a location discrimination task and a shape discrimination task. The target stimuli (S2) could occur in the left or right hemifield. They were preceded by an ipsilateral, contralateral or bilateral prime stimulus (S1). The attentional effect produced by the right S1 was larger than that produced by the left S1. This lateral asymmetry was similar between the two tasks, suggesting that the hemispheric asymmetries of visual mechanisms do not contribute to it. The finding that it was basically due to a longer reaction time to the left S2 than to the right S2 in the contralateral S1 condition suggests that the inhibitory component of attention is laterally asymmetric.

12.
A group of congenitally deaf adults and a group of hearing adults, both fluent in sign language, were tested to determine cerebral lateralization. In the most revealing task, subjects were given a series of trials in which they were first presented with a videotaped sign and then with a word exposed tachistoscopically to the right visual field or left visual field, and were required to judge whether or not the word corresponded to the sign. The results suggested that the comparison processes involved in the decision were performed more efficiently by the left hemisphere for hearing subjects and by the right hemisphere for deaf subjects. However, the deaf subjects performed as well as the hearing subjects in the left hemisphere, suggesting that the deaf are not impeded by their auditory-speech handicap from developing the left hemisphere for at least some types of linguistic processing.

13.
Evidence supporting individual variations in patterns of hemispheric involvement in the recognition of visuo-spatial and verbal stimuli among dextrals is reported. In Experiment 1, subjects' asymmetry scores on a task that was nonlateralized for the group as a whole were significantly correlated with their asymmetry scores on right-hemisphere-specialized tasks, including face recognition. In Experiment 2, subjects' asymmetry scores on a task that was nonlateralized for the group as a whole were significantly correlated with their asymmetry scores on a left-hemisphere-specialized word recognition task. These results suggest that individual dextrals' asymmetry scores on lateralized tasks are a joint function of a subject's underlying hemispheric specialization for that task and stable individual variations in asymmetric hemispheric reliance.

14.
Recently, we reported a strong right visual field/left hemisphere advantage for motion processing in deaf signers and a slight reverse asymmetry in hearing nonsigners (Bosworth & Dobkins, 1999). This visual field asymmetry in deaf signers may be due to auditory deprivation or to experience with a visual-manual language, American Sign Language (ASL). In order to separate these two possible sources, in this study we added a third group, hearing native signers, who have normal hearing and have learned ASL from their deaf parents. As in our previous study, subjects performed a direction-of-motion discrimination task at different locations across the visual field. In addition to investigating differences in left vs. right visual field asymmetries across subject groups, we also asked whether performance differences exist for superior vs. inferior visual fields and peripheral vs. central visual fields. Replicating our previous study, a robust right visual field advantage was observed in deaf signers, but not in hearing nonsigners. Like deaf signers, hearing signers also exhibited a strong right visual field advantage, suggesting that this effect is related to experience with sign language. These results suggest that perceptual processes required for the acquisition and comprehension of language (motion processing in the case of ASL) are recruited by the left, language-dominant, hemisphere. Deaf subjects also exhibited an inferior visual field advantage that was significantly larger than that observed in either hearing group. In addition, there was a trend for deaf subjects to perform relatively better on peripheral than on central stimuli, while both hearing groups showed the reverse pattern. Because deaf signers differed from hearing signers and nonsigners in these domains, the inferior and peripheral visual field advantages observed in deaf subjects are presumably related to auditory deprivation. Finally, these visual field asymmetries were not modulated by attention for any subject group, suggesting they are a result of sensory, and not attentional, factors.

15.
Groups of deaf subjects, exposed to tachistoscopic bilateral presentation of English words and American Sign Language (ASL) signs, showed weaker right visual half-field (VHF) superiority for words than hearing comparison groups with both a free-recall and a matching response. Deaf subjects showed better, though nonsignificant, recognition of left VHF signs with bilateral presentation of signs but shifted to superior right VHF response to signs when word-sign combinations were presented. Cognitive strategies and hemispheric specialization for ASL are discussed as possible factors affecting half-field asymmetry.

16.
Recent findings suggest a right hemispheric dominance in gaze-triggered shifts of attention. The aim of this study was to clarify which hemisphere is dominant in the gaze processing that mediates attentional shifts. A target localization task, with preceding non-predictive gaze cues presented to each visual field, was undertaken by 44 healthy subjects, and reaction time (RT) was measured. A face identification task was also given to determine hemispheric dominance in face processing for each subject. RT differences between valid and invalid cues were larger when cues were presented in the left rather than the right visual field. This held true regardless of individual hemispheric dominance in face processing. Together, these results indicate right hemispheric dominance in gaze-triggered reflexive shifts of attention in normal healthy subjects.

17.
Previous findings have demonstrated that hemispheric organization in deaf users of American Sign Language (ASL) parallels that of the hearing population, with the left hemisphere showing dominance for grammatical linguistic functions and the right hemisphere showing specialization for non-linguistic spatial functions. The present study addresses two further questions: first, do extra-grammatical discourse functions in deaf signers show the same right-hemisphere dominance observed for discourse functions in hearing subjects; and second, do discourse functions in ASL that employ spatial relations depend upon more general intact spatial cognitive abilities? We report findings from two right-hemisphere damaged deaf signers, both of whom show disruption of discourse functions in the absence of any disruption of grammatical functions. The exact nature of the disruption differs for the two subjects, however. Subject AR shows difficulty in maintaining topical coherence, while SJ shows difficulty in employing spatial discourse devices. Further, the two subjects are equally impaired on non-linguistic spatial tasks, indicating that spared spatial discourse functions can occur even when more general spatial cognition is disrupted. We conclude that, as in the hearing population, discourse functions involve the right hemisphere; that distinct discourse functions can be dissociated from one another in ASL; and that brain organization for linguistic spatial devices is driven by its functional role in language processing, rather than by its surface, spatial characteristics.

18.
A visual hemifield experiment investigated hemispheric specialization among hearing children and adults and prelingually, profoundly deaf youngsters who were exposed intensively to Cued Speech (CS). Of interest was whether deaf CS users, who undergo a development of phonology and grammar of the spoken language similar to that of hearing youngsters, would display similar laterality patterns in the processing of written language. Semantic, rhyme, and visual judgement tasks were used. In the visual task, no VF advantage was observed. An RVF (left hemisphere) advantage was obtained for both the deaf and the hearing subjects for the semantic task, supporting Neville's claim that the acquisition of competence in the grammar of language is critical in establishing the specialization of the left hemisphere for language. For the rhyme task, however, an RVF advantage was obtained for the hearing subjects, but not for the deaf ones, suggesting that different neural resources are recruited by deaf and hearing subjects. Hearing the sounds of language may be necessary to develop left lateralised processing of rhymes.

19.
Face recognition and word reading are thought to be mediated by relatively independent cognitive systems lateralised to the right and left hemispheres, respectively. If this is the case, we should expect a higher incidence of face recognition problems in patients with right hemisphere injury and a higher incidence of reading problems in patients with left hemisphere injury. We tested this hypothesis in a group of 31 patients with unilateral right or left hemisphere infarcts in the territory of the posterior cerebral arteries. In most domains tested (e.g., visual attention, object recognition, visuo-construction, motion perception), we found that both patient groups performed significantly worse than a matched control group. In particular, we found a significant number of face recognition deficits in patients with left hemisphere injury and a significant number of word reading deficits in patients with right hemisphere injury. This suggests that face recognition and word reading may be mediated by more bilaterally distributed neural systems than is commonly assumed.

20.
Conflicting evidence has appeared in the literature concerning hemispheric asymmetry in the perception of rhythm. The present study investigated the effects of rhythm length on relative cerebral dominance. Twenty-four subjects were presented with sequences of one to four time intervals bounded by light flashes. The subjects' task was to determine if two such sequences were the same or different. The first rhythm was presented in both visual fields and the second only to one visual field. Reaction times and number of errors were recorded. It was found that increasing rhythm length resulted in a shift in cerebral dominance from the left to the right hemisphere. An interpretation of these findings was suggested in terms of the preferred mode of processing of each hemisphere, analytic vs. holistic.
