Similar Articles
20 similar articles found (search time: 171 ms)
1.
Recently, we reported a strong right visual field/left hemisphere advantage for motion processing in deaf signers and a slight reverse asymmetry in hearing nonsigners (Bosworth & Dobkins, 1999). This visual field asymmetry in deaf signers may be due to auditory deprivation or to experience with a visual-manual language, American Sign Language (ASL). In order to separate these two possible sources, in this study we added a third group, hearing native signers, who have normal hearing and have learned ASL from their deaf parents. As in our previous study, subjects performed a direction-of-motion discrimination task at different locations across the visual field. In addition to investigating differences in left vs right visual field asymmetries across subject groups, we also asked whether performance differences exist for superior vs inferior visual fields and peripheral vs central visual fields. Replicating our previous study, a robust right visual field advantage was observed in deaf signers, but not in hearing nonsigners. Like deaf signers, hearing signers also exhibited a strong right visual field advantage, suggesting that this effect is related to experience with sign language. These results suggest that perceptual processes required for the acquisition and comprehension of language (motion processing in the case of ASL) are recruited by the left, language-dominant, hemisphere. Deaf subjects also exhibited an inferior visual field advantage that was significantly larger than that observed in either hearing group. In addition, there was a trend for deaf subjects to perform relatively better on peripheral than on central stimuli, while both hearing groups showed the reverse pattern. Because deaf signers differed from both hearing groups in these respects, the inferior and peripheral visual field advantages observed in deaf subjects are presumably related to auditory deprivation. Finally, these visual field asymmetries were not modulated by attention for any subject group, suggesting that they result from sensory, rather than attentional, factors.
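To make the group comparison concrete, here is a minimal sketch of how a right-visual-field advantage index of the kind compared above might be computed. All data values, names, and the exact index formula are illustrative assumptions, not taken from Bosworth & Dobkins.

from statistics import mean

# Hypothetical per-subject proportion-correct scores on a direction-of-motion
# task, keyed by (visual field, eccentricity). Illustrative numbers only.
accuracy = {
    ("left", "central"): [0.82, 0.85, 0.80],
    ("right", "central"): [0.88, 0.86, 0.84],
    ("left", "peripheral"): [0.71, 0.68, 0.74],
    ("right", "peripheral"): [0.78, 0.80, 0.76],
}

def field_advantage(right_scores, left_scores):
    """Normalized laterality index: positive values indicate a right visual
    field (left hemisphere) advantage, negative a left field advantage."""
    r, l = mean(right_scores), mean(left_scores)
    return (r - l) / (r + l)

for ecc in ("central", "peripheral"):
    idx = field_advantage(accuracy[("right", ecc)], accuracy[("left", ecc)])
    print(f"{ecc}: RVF advantage index = {idx:+.3f}")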

2.
Cerebral laterality was examined for third-, fourth-, and fifth-grade deaf and hearing subjects. The experimental task involved the processing of word and picture stimuli presented singly to the right and left visual hemifields. The analyses indicated that the deaf children were faster than the hearing children in overall processing efficiency and that they differed in hemispheric lateralization: the deaf children processed the stimuli more efficiently in the right hemisphere, while the hearing children demonstrated a left-hemisphere proficiency. This finding is discussed in terms of the hypothesis that cerebral lateralization is influenced by auditory processing.

3.
We investigated the relative roles of the left versus right hemisphere in the comprehension of American Sign Language (ASL). Nineteen lifelong signers with unilateral brain lesions [11 left hemisphere damaged (LHD) and 8 right hemisphere damaged (RHD)] performed three tasks: an isolated single-sign comprehension task, a sentence-level comprehension task involving simple one-step commands, and a sentence-level comprehension task involving more complex multiclause/multistep commands. Eighteen of the participants were deaf; one RHD subject was hearing and bilingual (ASL and English). Performance was examined in relation to two factors: whether the lesion was in the right or left hemisphere, and whether the temporal lobe was involved. The LHD group performed significantly worse than the RHD group on all three tasks, confirming left hemisphere dominance for sign language comprehension. The group with left temporal lobe involvement was significantly impaired on all tasks, whereas each of the other three groups performed at better than 95% correct on the single-sign and simple sentence comprehension tasks, with performance falling off only on the complex sentence comprehension items. A comparison with previously published data suggests that the degree of difficulty exhibited by the deaf RHD group on the complex sentences is comparable to that observed in hearing RHD subjects. Based on these findings, we hypothesize (i) that deaf and hearing individuals have a similar degree of lateralization of language comprehension processes and (ii) that language comprehension depends primarily on the integrity of the left temporal lobe.

4.
A visual hemifield experiment investigated hemispheric specialization among hearing children and adults and prelingually, profoundly deaf youngsters who had been exposed intensively to Cued Speech (CS). Of interest was whether deaf CS users, who undergo a development of the phonology and grammar of the spoken language similar to that of hearing youngsters, would display similar laterality patterns in the processing of written language. Semantic, rhyme, and visual judgment tasks were used. In the visual task, no visual field advantage was observed. An RVF (left hemisphere) advantage was obtained for both the deaf and the hearing subjects in the semantic task, supporting Neville's claim that the acquisition of competence in the grammar of a language is critical in establishing the specialization of the left hemisphere for language. For the rhyme task, however, an RVF advantage was obtained for the hearing subjects but not for the deaf ones, suggesting that different neural resources are recruited by deaf and hearing subjects. Hearing the sounds of language may be necessary to develop left-lateralized processing of rhymes.

5.
This study explores the use of two types of facial expressions, linguistic and affective, in a lateralized recognition accuracy test with hearing and deaf subjects. The linguistic expressions represent unfamiliar facial expressions for the hearing subjects, whereas they serve as meaningful linguistic emblems for deaf signers. Hearing subjects showed left visual field advantages for both types of signals, while deaf subjects' visual field asymmetries were greatly influenced by the order of presentation. The results suggest that for hearing persons, the right hemisphere may predominate in the recognition of all forms of facial expression. For deaf signers, hemispheric specialization for the processing of facial signals may be influenced by the different functions these signals serve in this population. The use of noncanonical facial signals in laterality paradigms is encouraged, as it provides an additional avenue of exploration into the underlying determinants of hemispheric specialization for the recognition of facial expression.

6.
The purpose of this study was to investigate hemispheric functional asymmetry in 18 normally hearing children and 18 congenitally deaf children aged 13-14 years. The task was identification of a visual stimulus (a 3-letter word or a photograph of a face) presented in either the left or right visual field. The children responded by pointing to the target stimulus on a response card that contained four different words or three different faces. The percentages of errors for presentations to the two visual fields were analyzed to determine hemispheric dominance. The pattern of hemispheric differences for the hearing children was consistent with that from previous investigations. The results for the deaf children differed from those of the hearing children: in word perception we observed a right hemisphere advantage, and in face recognition a lack of hemispheric differences. These results point to a lack of auditory experience affecting the functional organization of the two hemispheres. It is suggested that the necessity to make use of visuo-spatial information in the process of communication causes right hemisphere dominance in verbal tasks. This may influence the perception of other visuo-spatial stimuli, yielding a lack of hemispheric asymmetry in face recognition.

7.
Functional hemispheric specialization in recognizing faces expressing emotions was investigated in 18 normally hearing and 18 congenitally deaf children aged 13-14 years. Three kinds of faces were presented: happy, expressing positive emotion; sad, expressing negative emotion; and neutral. The subjects' task was to recognize the test face, presented for 20 msec in the left or right visual field; they responded by pointing to the stimulus on a response card that contained three different faces. The errors committed for presentations of faces in the left and right visual fields were analyzed. In the control group, the right hemisphere dominated for sad and neutral faces; there were no significant differences in the recognition of happy faces. This differentiated pattern of hemispheric organization in normally hearing persons supports the hypothesis that positive and negative emotions expressed by faces are processed differently. The observed hemispheric asymmetry was a result of two factors: (1) processing of faces as complex patterns requiring visuo-spatial analysis, and (2) processing of the emotions they contain. Functional hemispheric asymmetry was not observed in the group of deaf children for any kind of emotion expressed in the presented faces. The results suggest that lack of auditory experience influences the organization of functional hemispheric specialization. It may be supposed that in deaf children, the analysis of information contained in emotional faces takes place in both hemispheres.

8.
In two experiments, deaf and hearing subjects learned paired associate lists in which rated visual imagery and signability (a measure of the ease with which a word can be represented as a gestural sign) were orthogonally varied. Visual presentation of three alternating study-recall trials resulted in significant positive effects of imagery for both deaf and hearing subjects, whereas signability facilitated recall only for deaf subjects. Examination of the relation between item attributes and reported learning strategy indicated that both deaf and hearing subjects used imaginal mediators more frequently for high-imagery than low-imagery pairs. A gestural sign strategy was reported almost exclusively by deaf subjects, particularly for high-signability pairs. These results suggest that an examination of the effects of sign language variables will contribute to an understanding of the qualitative differences in the associative learning of the deaf and hearing.

9.
Two closely related semantic processing tasks were studied under identical procedural conditions in order to examine lateral visual field effects on reaction times. In Experiment 1, reaction times did not differ as a function of visual field when subjects decided whether a laterally presented word was a member of the category named by a foveally presented word (category membership task). On the other hand, reaction times were faster for right than for left visual field presentations when subjects decided whether two words, one lateral and one foveal, belonged to the same category (category matching task), although this advantage did not appear immediately. In Experiment 2, the laterality effect in the category matching task was studied as a function of word familiarity. Two groups of subjects performed the matching task for two blocks of trials, one group receiving the same word list in each block and the other receiving different lists. No visual field differences in reaction times were observed for either group during the first block of trials, but a distinct right field advantage appeared for both during the second block. The data from these experiments suggest that category matching strategies rely upon structures or processes localized in the left hemisphere, although their influence is not immediate. Category membership strategies, on the other hand, do not depend upon such localized structures.

10.
Hemispheric differences for orthographic and phonological processing   (cited 5 times: 3 self-citations, 2 by others)
The role of hemispheric differences in the encoding of words was assessed by requiring subjects to match tachistoscopically presented word pairs on the basis of their rhyming or visual similarity. The interference between a word pair's orthography and phonology produced matching errors that were differentially affected by the visual field/hemisphere of projection and the sex of the subject. In general, right visual field/left hemisphere presentations yielded fewer errors when word pairs shared similar phonology under rhyme matching and similar orthography under visual matching. Left visual field/right hemisphere presentations yielded fewer errors when word pairs were phonologically dissimilar under rhyme matching and orthographically dissimilar under visual matching. Males made more errors and demonstrated substantially stronger hemispheric effects than females. These patterns suggest that visual field/hemispheric differences in orthographic and phonological encoding occur during the initial stages of word processing and are more pronounced for male than for female subjects.

11.
Previous studies of cerebral asymmetry for the perception of American Sign Language (ASL) have used only static representations of signs; in this study we presented moving signs. Congenitally deaf, native ASL signers identified moving signs, static representations of signs, and English words. The stimuli were presented rapidly by motion picture to each visual hemifield. Normally hearing English speakers also identified the English words. Consistent with previous findings, both the deaf and the hearing subjects showed a left-hemisphere advantage for the English words; likewise, the deaf subjects showed a right-hemisphere advantage for the statically presented signs. With the moving signs, the deaf showed no lateral asymmetry. The shift from right-hemisphere dominance to more balanced hemispheric involvement with the change from static to moving signs is consistent with Kimura's position that the left hemisphere predominates in the analysis of skilled motor sequencing (Kimura, 1976). The results also indicate that ASL may be more bilaterally represented than English and that the spatial component of language stimuli can greatly influence lateral asymmetries.

12.
In Experiment 1, neither hearing nor prelingually deaf signing adolescents showed marked lateralization for lexical decision, but, unlike the hearing subjects, the deaf were not impaired by the introduction of pseudohomophones. In Experiment 2, semantic categorization produced a left hemisphere advantage in the hearing subjects for words but not pictures, whereas in the deaf subjects, words and signs, but not pictures, showed a right hemisphere advantage. In Experiment 3, the lexical decision and semantic categorization findings were confirmed, and both groups showed a right hemisphere advantage for a face/nonface decision task. The possible effect of initial language acquisition on the development of hemispheric lateralization for language is discussed.

13.
The nature of hemispheric processing in the prelingually deaf was examined in a picture-letter matching task. It was hypothesized that linguistic competence in the deaf would be associated with normal or near-normal laterality (i.e., a left hemisphere advantage for analytic linguistic tasks). Subjects were shown a simple picture of a common object (e.g., lamp), followed by brief unilateral presentation of a manually signed or orthographic letter, and they had to indicate as quickly as possible whether the letter was present in the spelling of the object's label. While hearing subjects showed a marked left hemisphere advantage, no such superiority was found for either a linguistically skilled or unskilled group of deaf students. In the skilled group, however, there was a suggestion of a right hemisphere advantage for manually signed letters. It was concluded that while hemispheric asymmetry of function does not develop normally in the deaf, the absence of this normal pattern does not preclude the development of the analytic skills needed to deal with the structure of language.

14.
A divided visual field priming paradigm was used to observe how adults with a history of developmental language disorder (DLD) access lexically ambiguous words. The results show that sustained semantic access to subordinate word meanings (such as BANK-RIVER), which is seen in control subjects, is disrupted in the right cerebral hemisphere for this special population of readers. In the left hemisphere, only the most dominant meaning of the ambiguous word shows sustained priming in both controls and DLD participants. Therefore, for the DLD readers the subordinate meanings of words are not primed in either hemisphere and thus may not be available during online processing and integration of discourse. This right hemisphere lexical access deficit might contribute to the language comprehension difficulties exhibited by adult readers with a history of DLD.
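A minimal sketch of the priming measure implied by this design: the priming effect for a given hemisphere and meaning type is the mean reaction-time saving for related relative to unrelated targets. All reaction times, labels, and the helper name priming_effect are hypothetical.

from statistics import mean

# Hypothetical reaction times (ms), keyed by (visual field, meaning dominance,
# prime-target relatedness). Illustrative numbers only.
rt = {
    ("LVF", "subordinate", "related"): [612, 598, 605],
    ("LVF", "subordinate", "unrelated"): [648, 655, 640],
    ("RVF", "dominant", "related"): [570, 565, 580],
    ("RVF", "dominant", "unrelated"): [620, 615, 630],
}

def priming_effect(field, meaning):
    """Positive values mean related primes sped up target responses."""
    return (mean(rt[(field, meaning, "unrelated")])
            - mean(rt[(field, meaning, "related")]))

print(f"RVF/left hemisphere, dominant meaning: {priming_effect('RVF', 'dominant'):+.1f} ms")
print(f"LVF/right hemisphere, subordinate meaning: {priming_effect('LVF', 'subordinate'):+.1f} ms")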

15.
Rhyming and the right hemisphere   (cited 2 times: 0 self-citations, 2 by others)
Subjects determined whether two successively presented and orthographically different words rhymed with each other. The first word was presented at fixation and the second was presented either to the left or to the right of fixation, either alone (unilateral presentation) or accompanied by a distractor word in the other visual hemifield (bilateral presentation). Subjects were more accurate when the words did not rhyme, when presentation was unilateral, and when the target was flashed to the right visual hemifield. It was predicted that bilateral presentation would produce interference when information from both visual fields was processed by one hemisphere (callosal relay), but not when each of the two hemispheres performed the task independently (direct access). That is, callosal relay tasks should show greater laterality effects with bilateral presentations, whereas direct access tasks should show similar laterality effects with both bilateral and unilateral presentations. Greater laterality effects were observed for bilaterally presented rhyming words, but nonrhyming words showed similar laterality effects for both bilateral and unilateral presentations. These results suggest that judgment of nonrhyming words can be performed by either hemisphere, but that judgment of rhyming words requires callosal relay to the left hemisphere. The absence of a visual field difference with nonrhyming word pairs further suggests that judgment of nonrhyming word pairs may be accomplished by the right hemisphere when presentation is to the left visual field.
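The prediction logic in this abstract lends itself to a short sketch: compute the laterality effect (right minus left visual field accuracy) separately for unilateral and bilateral presentation, and infer callosal relay when the bilateral effect is clearly larger. The accuracy values and the decision threshold below are hypothetical.

def laterality_effect(rvf_acc, lvf_acc):
    """Right-minus-left visual field accuracy difference."""
    return rvf_acc - lvf_acc

def classify(unilateral, bilateral, threshold=0.05):
    """unilateral and bilateral are (RVF accuracy, LVF accuracy) pairs.
    Callosal relay predicts the bilateral laterality effect to exceed the
    unilateral one; direct access predicts similar effects in both."""
    uni = laterality_effect(*unilateral)
    bi = laterality_effect(*bilateral)
    return "callosal relay" if bi - uni > threshold else "direct access"

# Rhyming pairs: the laterality effect grows under bilateral presentation.
print(classify(unilateral=(0.82, 0.78), bilateral=(0.80, 0.66)))  # callosal relay
# Nonrhyming pairs: similar laterality effects in both conditions.
print(classify(unilateral=(0.90, 0.88), bilateral=(0.87, 0.85)))  # direct access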

16.
Sign language displays all the complex linguistic structure found in spoken languages, but conveys its syntax in large part by manipulating spatial relations. This study investigated whether deaf signers who rely on a visual-spatial language nonetheless show a principled cortical separation for language and nonlanguage visual-spatial functioning. Four unilaterally brain-damaged deaf signers, fluent in American Sign Language (ASL) before their strokes, served as subjects. Three had damage to the left hemisphere and one had damage to the right hemisphere. They were administered selected tests of nonlanguage visual-spatial processing. The pattern of performance of the four patients across this series of tests suggests that deaf signers show hemispheric specialization for nonlanguage visual-spatial processing similar to that of hearing, speaking individuals. The patients with damage to the left hemisphere, in general, appropriately processed visual-spatial relationships, whereas the patient with damage to the right hemisphere showed consistent and severe visual-spatial impairment. The language behavior of these patients showed much the opposite pattern, however. Indeed, the most striking separation between linguistic and nonlanguage visual-spatial functions occurred in the left-hemisphere patient who was most severely aphasic for sign language. Her signing was grossly impaired, yet her visual-spatial capacities across the series of tests were surprisingly normal. These data suggest that the two cerebral hemispheres of congenitally deaf signers can develop separate functional specialization for nonlanguage visual-spatial processing and for language processing, even though sign language is conveyed in large part via visual-spatial manipulation.

17.
To investigate hemispheric differences in the timing of word priming, the modulation of event-related potentials by semantic word relationships was examined in each cerebral hemisphere. Primes and targets, either categorically (silk-wool) or associatively (needle-sewing) related, were presented to the left or right visual field in a go/no-go lexical decision task. The results revealed significant reaction-time and physiological differences in both visual fields only for associatively related word pairs, but an electrophysiological difference also tended to reach significance for categorically related words when presented in the left visual field. ERP waveforms showed a different time-course of associative priming effects according to the field of presentation. In the right visual field/left hemisphere, both N400 and Late Positive Component (LPC/P600) were modulated by semantic relatedness, while only a late effect was present in the left visual field/right hemisphere.

18.
Left-Hemisphere Dominance for Motion Processing in Deaf Signers   (cited 4 times: 0 self-citations, 4 by others)
Evidence from neurophysiological studies in animals as well as humans has demonstrated robust changes in neural organization and function following early-onset sensory deprivation. Unfortunately, the perceptual consequences of these changes remain largely unexplored. The study of deaf individuals who have been auditorily deprived since birth and who rely on a visual language (i.e., American Sign Language, ASL) for communication affords a unique opportunity to investigate the degree to which perception in the remaining, intact senses (e.g., vision) is modified as a result of altered sensory and language experience. We studied visual motion perception in deaf individuals and compared their performance with that of hearing subjects. Thresholds and reaction times were obtained for a motion discrimination task, in both central and peripheral vision. Although deaf and hearing subjects had comparable absolute scores on this task, a robust and intriguing difference was found regarding relative performance for left-visual-field (LVF) versus right-visual-field (RVF) stimuli: Whereas hearing subjects exhibited a slight LVF advantage, the deaf exhibited a strong RVF advantage. Thus, for deaf subjects, the left hemisphere may be specialized for motion processing. These results suggest that perceptual processes required for the acquisition and comprehension of language (motion processing, in the case of ASL) are recruited (or "captured") by the left, language-dominant hemisphere.

19.
Popular theory on the tendency to cradle an infant to the left side points to the specialization of the right hemisphere for the perception and expression of emotion. J. S. Sieratzki and B. Woll (1996) recently suggested that more emphasis be placed on the auditory modality, specifically focusing on the role of prosodic information. In this study, the direction of the lateral cradling bias in a group of profoundly deaf children, a group of deaf adults, and a control group of adults with no hearing impairment was investigated. The authors found a strong leftward cradling bias in all groups, a bias that was, if anything, stronger in the deaf participants. Given that people who are profoundly deaf, especially those who have been deaf from birth, have not been exposed to auditory prosody, the data do not support the suggestion that such prosodic information is the basis for the leftward cradling bias.
