Similar documents
A total of 20 similar documents were found.
1.
Recently, we reported a strong right visual field/left hemisphere advantage for motion processing in deaf signers and a slight reverse asymmetry in hearing nonsigners (Bosworth & Dobkins, 1999). This visual field asymmetry in deaf signers may be due to auditory deprivation or to experience with a visual-manual language, American Sign Language (ASL). In order to separate these two possible sources, in this study we added a third group, hearing native signers, who have normal hearing and have learned ASL from their deaf parents. As in our previous study, subjects performed a direction-of-motion discrimination task at different locations across the visual field. In addition to investigating differences in left vs right visual field asymmetries across subject groups, we also asked whether performance differences exist for superior vs inferior visual fields and peripheral vs central visual fields. Replicating our previous study, a robust right visual field advantage was observed in deaf signers, but not in hearing nonsigners. Like deaf signers, hearing signers also exhibited a strong right visual field advantage, suggesting that this effect is related to experience with sign language. These results suggest that perceptual processes required for the acquisition and comprehension of language (motion processing in the case of ASL) are recruited by the left, language-dominant, hemisphere. Deaf subjects also exhibited an inferior visual field advantage that was significantly larger than that observed in either hearing group. In addition, there was a trend for deaf subjects to perform relatively better on peripheral than on central stimuli, while both hearing groups showed the reverse pattern. Because deaf signers differed from both hearing signers and hearing nonsigners in these respects, the inferior and peripheral visual field advantages observed in deaf subjects are presumably related to auditory deprivation. Finally, these visual field asymmetries were not modulated by attention for any subject group, suggesting that they are a result of sensory, rather than attentional, factors.
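The abstract does not state how the visual field advantages were quantified; the sketch below is a minimal illustration of one common way to summarize them, assuming per-location accuracy scores from a direction-of-motion task. The location layout and accuracy values are hypothetical, not data from the study.

```python
# Minimal sketch: summarizing visual-field advantages from per-location accuracy.
# The location layout and accuracy values below are hypothetical illustrations.
from statistics import mean

# accuracy[(horizontal_field, vertical_field, eccentricity)] = proportion correct
accuracy = {
    ("left",  "superior", "central"):    0.78,
    ("left",  "inferior", "central"):    0.81,
    ("right", "superior", "central"):    0.83,
    ("right", "inferior", "central"):    0.86,
    ("left",  "superior", "peripheral"): 0.70,
    ("left",  "inferior", "peripheral"): 0.75,
    ("right", "superior", "peripheral"): 0.76,
    ("right", "inferior", "peripheral"): 0.80,
}

def field_mean(axis_index, value):
    """Mean accuracy over all locations whose key matches `value` on one axis."""
    return mean(acc for key, acc in accuracy.items() if key[axis_index] == value)

# Positive values indicate an advantage for the first-named field.
rvf_advantage        = field_mean(0, "right") - field_mean(0, "left")
inferior_advantage   = field_mean(1, "inferior") - field_mean(1, "superior")
peripheral_advantage = field_mean(2, "peripheral") - field_mean(2, "central")

print(f"RVF advantage:        {rvf_advantage:+.3f}")
print(f"Inferior advantage:   {inferior_advantage:+.3f}")
print(f"Peripheral advantage: {peripheral_advantage:+.3f}")
```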

2.
The case of human deafness constitutes a unique opportunity to examine possible consequences for perceptual processing due to altered sensory experiences. We tested whether deaf individuals, in contrast to hearing individuals, are more susceptible to visual distraction from peripheral than from central face versus object stimuli. The participants were required to classify the gender of a target male or female symbol presented either alone (low perceptual load) or together with three filler symbols (high perceptual load), while ignoring gender-congruent or -incongruent face versus object distractors presented at central or peripheral positions. The gender classifications were affected by distractor gender under low, but not under high, perceptual load in hearing participants. In contrast, the responses of deaf participants were similarly influenced by distractor gender under both levels of perceptual load. There was no evidence for generally enhanced attention to the visual periphery in deaf individuals. Our results indicate that auditory deprivation may result in enhanced attentional capacities under high perceptual load.
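As a rough illustration of how distractor interference under each perceptual-load condition might be quantified, the sketch below computes a congruency effect (incongruent minus congruent response time) per group and load level. The reaction-time values are invented for the example and are not the study's data.

```python
# Sketch: congruency effect (distractor interference) by group and perceptual load.
# Mean reaction times (ms) below are invented for illustration only.
from statistics import mean

rt = {  # (group, load, distractor congruency) -> mean RTs (ms) per participant
    ("hearing", "low",  "congruent"):   [612, 598, 605],
    ("hearing", "low",  "incongruent"): [648, 655, 640],
    ("hearing", "high", "congruent"):   [701, 695, 710],
    ("hearing", "high", "incongruent"): [704, 699, 712],
    ("deaf",    "low",  "congruent"):   [620, 615, 610],
    ("deaf",    "low",  "incongruent"): [658, 662, 649],
    ("deaf",    "high", "congruent"):   [690, 688, 702],
    ("deaf",    "high", "incongruent"): [725, 730, 718],
}

for group in ("hearing", "deaf"):
    for load in ("low", "high"):
        effect = mean(rt[(group, load, "incongruent")]) - mean(rt[(group, load, "congruent")])
        print(f"{group:>7}, {load:>4} load: congruency effect = {effect:+.1f} ms")
```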

3.
Temporal processing in deaf signers
The auditory and visual modalities differ in their capacities for temporal analysis, and speech relies on more rapid temporal contrasts than does sign language. We examined whether congenitally deaf signers show enhanced or diminished capacities for processing rapidly varying visual signals in light of the differences in sensory and language experience of deaf and hearing individuals. Four experiments compared rapid temporal analysis in deaf signers and hearing subjects at three different levels: sensation, perception, and memory. Experiment 1 measured critical flicker frequency thresholds and Experiment 2, two-point thresholds to a flashing light. Experiments 3-4 investigated perception and memory for the temporal order of rapidly varying nonlinguistic visual forms. In contrast to certain previous studies, specifically those investigating the effects of short-term sensory deprivation, no significant differences between deaf and hearing subjects were found at any level. Deaf signers do not show diminished capacities for rapid temporal analysis, in comparison to hearing individuals. The data also suggest that the deficits in rapid temporal analysis reported previously for children with developmental language delay cannot be attributed to lack of experience with speech processing and production.
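The abstract only states that critical flicker frequency thresholds were measured. As an assumption-laden sketch, a simple adaptive 1-up/1-down staircase like the one below is one generic way such a threshold could be estimated; the simulated observer, step size, and starting frequency are arbitrary choices for illustration and are not the procedure used in the study.

```python
# Sketch: estimating a flicker-fusion threshold with a simple 1-up/1-down staircase.
# The simulated observer and all parameters are hypothetical.
import random

random.seed(0)
TRUE_CFF_HZ = 42.0          # hypothetical observer's fusion threshold

def observer_sees_flicker(freq_hz):
    """Simulated observer: reports flicker below threshold, with some noise."""
    noise = random.gauss(0.0, 1.5)
    return freq_hz + noise < TRUE_CFF_HZ

freq = 30.0                 # start well below fusion, so flicker is clearly visible
step = 2.0
reversals = []
last_response = None

while len(reversals) < 8:
    sees_flicker = observer_sees_flicker(freq)
    if last_response is not None and sees_flicker != last_response:
        reversals.append(freq)
    last_response = sees_flicker
    # Flicker still visible -> raise frequency; fused -> lower it.
    freq += step if sees_flicker else -step

estimate = sum(reversals[-6:]) / len(reversals[-6:])
print(f"Estimated critical flicker frequency: {estimate:.1f} Hz")
```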

4.
This study investigated serial recall by congenitally, profoundly deaf signers for visually specified linguistic information presented in their primary language, American Sign Language (ASL), and in printed or fingerspelled English. There were three main findings. First, differences in the serial-position curves across these conditions distinguished the changing-state stimuli from the static stimuli. These differences were a recency advantage and a primacy disadvantage for the ASL signs and fingerspelled English words, relative to the printed English words. Second, the deaf subjects, who were college students and graduates, used a sign-based code to recall ASL signs, but not to recall English words; this result suggests that well-educated deaf signers do not translate into their primary language when the information to be recalled is in English. Finally, mean recall of the deaf subjects for ordered lists of ASL signs and fingerspelled and printed English words was significantly less than that of hearing control subjects for the printed words; this difference may be explained by the particular efficacy of a speech-based code used by hearing individuals for retention of ordered linguistic information and by the relatively limited speech experience of congenitally, profoundly deaf individuals.
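A serial-position curve of the kind the primacy/recency comparison rests on is just the proportion of correct recalls at each list position; the sketch below shows that computation over invented recall data (list length, trials, and scores are hypothetical).

```python
# Sketch: building a serial-position curve (proportion correct per list position).
# Each inner list marks, per position, whether the item was recalled in its
# correct serial position. All data are invented for illustration.
from statistics import mean

trials = [  # one boolean list per 7-item trial (hypothetical data)
    [True,  True,  False, False, False, True,  True],
    [True,  False, False, False, True,  True,  True],
    [True,  True,  False, False, False, False, True],
    [False, True,  False, False, True,  True,  True],
]

list_length = len(trials[0])
curve = [mean(trial[pos] for trial in trials) for pos in range(list_length)]

for pos, p in enumerate(curve, start=1):
    print(f"position {pos}: {p:.2f}")

# Crude primacy/recency summaries: mean accuracy on the first vs last two positions.
print("primacy (positions 1-2):", round(mean(curve[:2]), 2))
print("recency (positions 6-7):", round(mean(curve[-2:]), 2))
```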

5.
The aim of this work was to explore the nature of the elementary operations (engage, move, disengage, and filtering) of spatial attention in deaf experts in sign language. Good communication skills require deaf people to rapidly shift attention between at least two separate spatial locations, the facial expression and the hand signs of the speaker. The overtraining imposed by the demands of sign language might have modified certain characteristics of these spatial attention operations. To test this, a spatial orienting task was used in two experiments. Experiment 1 showed that deaf subjects reoriented their attention to the target location faster than hearing subjects on invalid trials. Experiment 2 indicated that inhibition of return decays faster in deaf than in hearing people. These results suggest that deaf subjects can disengage their attention faster than hearing subjects, facilitating the search for relevant information across more spatial locations.
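A minimal sketch of how the two effects referred to here are usually computed from mean reaction times, assuming a standard spatial-cueing layout (a validity effect at short SOAs, inhibition of return at long SOAs); the SOA values and RTs are illustrative only, not the study's data.

```python
# Sketch: validity effect and inhibition of return (IOR) from mean RTs (ms).
# All SOA and RT values are hypothetical and only illustrate the arithmetic.
mean_rt = {
    # (SOA in ms, cue condition): mean RT in ms
    (100, "valid"): 350, (100, "invalid"): 395,
    (800, "valid"): 410, (800, "invalid"): 385,
}

# Short SOA: attention is still at the cued location, so valid < invalid.
validity_effect = mean_rt[(100, "invalid")] - mean_rt[(100, "valid")]

# Long SOA: responses at the cued location are slowed -> IOR (valid > invalid).
ior = mean_rt[(800, "valid")] - mean_rt[(800, "invalid")]

print(f"Validity effect at 100 ms SOA: {validity_effect} ms")
print(f"IOR at 800 ms SOA:             {ior} ms")
```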

6.
Studies have reported a right visual field (RVF) advantage for coherent motion detection by deaf and hearing signers but not non-signers. Yet two studies [Bosworth, R. G., & Dobkins, K. R. (2002). Visual field asymmetries for motion processing in deaf and hearing signers. Brain and Cognition, 49, 170-181; Samar, V. J., & Parasnis, I. (2005). Dorsal stream deficits suggest hidden dyslexia among deaf poor readers: Correlated evidence from reduced perceptual speed and elevated coherent motion detection thresholds. Brain and Cognition, 58, 300-311] reported a small, non-significant RVF advantage for deaf signers when short duration motion stimuli were used (200-250 ms). Samar and Parasnis (2005) reported that this small RVF advantage became significant when non-verbal IQ was statistically controlled. This paper presents extended analyses of the correlation between non-verbal IQ and visual field asymmetries in the data set of Samar and Parasnis (2005). We speculate that this correlation might plausibly be driven by individual differences either in age of acquisition of American Sign Language (ASL) or in the degree of neurodevelopmental insult associated with various etiologies of deafness. Limited additional analyses are presented that indicate a need for further research on the cause of this apparent IQ-laterality relationship. Some potential implications of this relationship for lateralization studies of deaf signers are discussed. Controlling non-verbal IQ may improve the reliability of short duration coherent motion tasks to detect adaptive dorsal stream lateralization due to exposure to ASL in deaf research participants.
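One standard way to "statistically control" non-verbal IQ when relating it to a laterality measure is a partial correlation. The sketch below implements the textbook first-order partial-correlation formula on made-up vectors; the variable names and values are hypothetical and are not the Samar and Parasnis data.

```python
# Sketch: first-order partial correlation r(x, y | z), used here to illustrate
# controlling non-verbal IQ. All data vectors are invented for the example.
from math import sqrt

def pearson_r(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sqrt(sum((x - ma) ** 2 for x in a))
    sb = sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def partial_r(x, y, z):
    """Correlation between x and y with z partialled out of both."""
    rxy, rxz, ryz = pearson_r(x, y), pearson_r(x, z), pearson_r(y, z)
    return (rxy - rxz * ryz) / sqrt((1 - rxz ** 2) * (1 - ryz ** 2))

# Hypothetical example: RVF-advantage scores, motion thresholds, non-verbal IQ.
rvf_advantage = [0.05, 0.12, -0.02, 0.08, 0.15, 0.01, 0.09, 0.11]
motion_threshold = [14.0, 11.5, 16.2, 12.8, 10.9, 15.5, 12.1, 11.8]
nonverbal_iq = [98, 112, 95, 105, 118, 99, 108, 110]

print("zero-order r:     ", round(pearson_r(rvf_advantage, motion_threshold), 3))
print("r controlling IQ: ", round(partial_r(rvf_advantage, motion_threshold, nonverbal_iq), 3))
```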

7.
An organism's survival depends on the ability to rapidly orient attention to unanticipated events in the world. Yet the conditions needed to elicit such involuntary capture remain in doubt. Especially puzzling are spatial cueing experiments, which have consistently shown that involuntary shifts of attention to highly salient distractors are not determined by stimulus properties, but instead are contingent on attentional control settings induced by task demands. Do we always need to be set for an event to be captured by it, or is there a class of events that draw attention involuntarily even when unconnected to task goals? Recent results indicate that a task-irrelevant event will capture attention on its first presentation, suggesting that salient stimuli that violate contextual expectations might capture attention automatically. Here, we investigated the role of contextual expectation by examining whether an irrelevant motion cue that was presented only rarely (~3–6% of trials) would capture attention when observers had an active set for a specific target colour. The motion cue had no effect when presented frequently, but when rare it produced a pattern of interference consistent with attentional capture. The critical dependence on the frequency with which the irrelevant motion singleton was presented is consistent with early theories of involuntary orienting to novel stimuli. We suggest that attention will be captured by salient stimuli that violate expectations, whereas top-down goals appear to modulate capture by stimuli that broadly conform to contextual expectations.
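The critical manipulation here is how often the irrelevant motion cue appears. As a hedged sketch, the snippet below builds a randomized trial schedule in which a motion singleton cue occurs on roughly 4% of trials while every trial retains a colour-defined target; the proportions and condition labels are illustrative assumptions, not the study's design parameters.

```python
# Sketch: generating a trial schedule with a rare (~4%) irrelevant motion cue.
# Trial counts, proportions, and labels are illustrative assumptions only.
import random

random.seed(1)
N_TRIALS = 480
RARE_CUE_PROPORTION = 0.04          # study used ~3-6%; 4% chosen arbitrarily here

n_rare = round(N_TRIALS * RARE_CUE_PROPORTION)
motion_cue_present = [True] * n_rare + [False] * (N_TRIALS - n_rare)
random.shuffle(motion_cue_present)

trials = [
    {
        "trial": i + 1,
        "target_colour": random.choice(["red", "green"]),  # observers hold a colour set
        "motion_singleton_cue": cue,
    }
    for i, cue in enumerate(motion_cue_present)
]

n_cued = sum(t["motion_singleton_cue"] for t in trials)
print(f"{n_cued} of {N_TRIALS} trials contain the rare motion cue")
```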

8.
There are well-known differences in resolution and performance across the visual field with performance generally better for the lower than the upper visual hemifield. Here we attempted to assess how transient attention summoned by a peripheral precue affects performance across the visual field. Four different attentional precueing tasks were used, varying in difficulty and attentional load. When a single discrimination target was presented (experiments 1 and 2), precues that summon transient attention had very little, if any, effect upon performance. However, when the target was presented among distractors (experiments 3 and 4), the precue had a substantial effect upon discrimination performance. The results showed that asymmetries in visual resolution between the upper and lower hemifields become more pronounced with increasing eccentricity. Furthermore, when the observers performed a precued acuity task with distractors, involving the judgment of the relative position of a small disk within a larger one, there was an asymmetry in the transient attentional effect on discrimination performance; the benefits of transient attention were larger in the upper than in the lower hemifield. Areas in the visual field where visual performance is generally worse thus appear to receive the largest attentional boost when needed. Possible ecological explanations for this are discussed.

9.
Left-Hemisphere Dominance for Motion Processing in Deaf Signers
Evidence from neurophysiological studies in animals as well as humans has demonstrated robust changes in neural organization and function following early-onset sensory deprivation. Unfortunately, the perceptual consequences of these changes remain largely unexplored. The study of deaf individuals who have been auditorily deprived since birth and who rely on a visual language (i.e., American Sign Language, ASL) for communication affords a unique opportunity to investigate the degree to which perception in the remaining, intact senses (e.g., vision) is modified as a result of altered sensory and language experience. We studied visual motion perception in deaf individuals and compared their performance with that of hearing subjects. Thresholds and reaction times were obtained for a motion discrimination task, in both central and peripheral vision. Although deaf and hearing subjects had comparable absolute scores on this task, a robust and intriguing difference was found regarding relative performance for left-visual-field (LVF) versus right-visual-field (RVF) stimuli: Whereas hearing subjects exhibited a slight LVF advantage, the deaf exhibited a strong RVF advantage. Thus, for deaf subjects, the left hemisphere may be specialized for motion processing. These results suggest that perceptual processes required for the acquisition and comprehension of language (motion processing, in the case of ASL) are recruited (or "captured") by the left, language-dominant hemisphere.

10.
A chimpanzee (Pan troglodytes) performed a visual search task using a modified matching-to-sample procedure in which a sample stimulus was followed by the search display, which contained one stimulus identical to the sample (target) and several uniform stimuli different from the sample (distractors). On cued trials, while the subject was observing the sample, a white square (precue) appeared at the location where the target was to be presented (valid trials), or elsewhere (invalid trials). The validity of the precue (correspondence between the cued and the target locations) was changed from 0% to 100% across conditions. Cost-benefit analyses were performed on the difference between valid and noncued trials (benefit) and between invalid and noncued trials (cost). Under the high-validity conditions, the response times were shorter when the cued location corresponded to the target location than when the precue did not appear. When the cued location did not correspond to the target location, on the other hand, the subject took longer to select the target than on noncued trials. When the validity of the precue was relatively low, however, cost of the invalid trials disappeared, while benefit of the valid trials remained. These results confirmed the two-process (automatic and attentional) theory of priming in human information processing; the advance information had the same effects on a chimpanzee's visual search performance as on humans'.
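The cost-benefit analysis described here is simple arithmetic on mean response times (benefit = noncued minus valid; cost = invalid minus noncued). A minimal sketch under assumed, invented RT values:

```python
# Sketch: cost-benefit analysis of precueing, following the logic in the abstract.
# Benefit = RT(noncued) - RT(valid); cost = RT(invalid) - RT(noncued).
# The mean RTs (ms) per cue-validity condition are invented for illustration.
mean_rt_ms = {
    "valid":   {"100%": 520, "80%": 545, "50%": 570, "0%": 575},
    "noncued": {"100%": 610, "80%": 605, "50%": 600, "0%": 602},
    "invalid": {"100%": 690, "80%": 665, "50%": 603, "0%": 600},
}

for validity in ("100%", "80%", "50%", "0%"):
    benefit = mean_rt_ms["noncued"][validity] - mean_rt_ms["valid"][validity]
    cost = mean_rt_ms["invalid"][validity] - mean_rt_ms["noncued"][validity]
    print(f"cue validity {validity:>4}: benefit = {benefit:4d} ms, cost = {cost:4d} ms")
```

With the invented numbers above, cost shrinks toward zero at low cue validity while benefit remains, mirroring the pattern the abstract reports.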

11.
Parafoveal attention in congenitally deaf and hearing young adults
This reaction-time study compared the performance of 20 congenitally and profoundly deaf, and 20 hearing college students on a parafoveal stimulus detection task in which centrally presented prior cues varied in their informativeness about stimulus location. In one condition, subjects detected a parafoveally presented circle with no other information being present in the visual field. In another condition, spatially complex and task-irrelevant foveal information was present which the subjects were instructed to ignore. The results showed that although both deaf and hearing people utilized cues to direct attention to specific locations and had difficulty in ignoring foveal information, deaf people were more proficient in redirecting attention from one spatial location to another in the presence of irrelevant foveal information. These results suggest that differences exist in the development of attentional mechanisms in deaf and hearing people. Both groups showed an overall right visual-field advantage in stimulus detection which was attenuated when the irrelevant foveal information was present. These results suggest a left-hemisphere superiority for detection of parafoveally presented stimuli independent of cue informativeness for both groups.

12.
Previous findings have demonstrated that hemispheric organization in deaf users of American Sign Language (ASL) parallels that of the hearing population, with the left hemisphere showing dominance for grammatical linguistic functions and the right hemisphere showing specialization for non-linguistic spatial functions. The present study addresses two further questions: first, do extra-grammatical discourse functions in deaf signers show the same right-hemisphere dominance observed for discourse functions in hearing subjects; and second, do discourse functions in ASL that employ spatial relations depend upon more general intact spatial cognitive abilities? We report findings from two right-hemisphere damaged deaf signers, both of whom show disruption of discourse functions in the absence of any disruption of grammatical functions. The exact nature of the disruption differs for the two subjects, however. Subject AR shows difficulty in maintaining topical coherence, while SJ shows difficulty in employing spatial discourse devices. Further, the two subjects are equally impaired on non-linguistic spatial tasks, indicating that spared spatial discourse functions can occur even when more general spatial cognition is disrupted. We conclude that, as in the hearing population, discourse functions involve the right hemisphere; that distinct discourse functions can be dissociated from one another in ASL; and that brain organization for linguistic spatial devices is driven by its functional role in language processing, rather than by its surface, spatial characteristics.

13.
A series of experiments explored habituation and dishabituation to repeated auditory distractors. Participants memorised lists of visually presented items in silence or while ignoring continuously presented auditory distractors. No habituation could be observed, in that the size of the auditory distractor effect did not decrease during the experiment. However, there was evidence for attentional orienting when novel auditory material was presented after a long period of repetitive stimulation, in that a change of distractors was associated with a temporary decrease in recall performance. The results are most consistent with theoretical accounts that claim that the auditory distractor effect is caused primarily by automatic interference, but that still allow attention to play a limited role in the short-term maintenance of information.
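Habituation here would show up as a shrinking auditory-distractor effect over the course of the session. The sketch below computes the effect (quiet minus distractor recall) per block from invented recall scores; the block structure and values are hypothetical.

```python
# Sketch: auditory distractor effect (quiet - distractor recall) per block.
# A decreasing effect across blocks would indicate habituation; the recall
# proportions below are invented for illustration.
recall = {  # block -> {condition: mean proportion of items recalled}
    1: {"quiet": 0.72, "distractor": 0.58},
    2: {"quiet": 0.71, "distractor": 0.57},
    3: {"quiet": 0.73, "distractor": 0.59},
    4: {"quiet": 0.70, "distractor": 0.57},
}

effects = {block: scores["quiet"] - scores["distractor"] for block, scores in recall.items()}
for block, effect in effects.items():
    print(f"block {block}: distractor effect = {effect:.2f}")

# A roughly constant effect (as in this made-up example) is the "no habituation"
# pattern the authors report.
print("range of effects:", round(max(effects.values()) - min(effects.values()), 2))
```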

14.
15.
It has been suggested that personally significant (PS) information interferes with performance only when presented within the focus of attention. However, this claim was never tested by a systematic manipulation of attention, but only by using correlative measures of its locus. We addressed this issue in two experiments, utilizing a cued visual search paradigm that allowed us to directly manipulate attention and to measure behavioral and physiological responses. One of the stimuli in the search display had a higher luminance value (i.e., was cued), and, orthogonally, one of the stimuli could be a PS or neutral name. When the cue did not predict target location, PS distractors mildly interfered with task performance, regardless of the cue's location. However, when the cue predicted target location, responses were facilitated for cued targets, indicating that attention was shifted to the cue. Importantly, PS distractors interfered with task performance and elicited enhanced orienting responses only when they were cued. This implies that PS information affects performance only when presented within but not outside the focus of attention.

16.
刘幸娟  张阳  张明 《心理科学》2011,34(3):558-564
Location-based inhibition of return (IOR) refers to slowed responses to targets appearing at previously attended locations. This study examined whether the time course and magnitude of IOR in a detection task are affected by auditory deprivation. In Experiment 1, deaf participants and normally hearing participants showed the same IOR time course and magnitude. In Experiment 2, in which the central cue-back was removed, hearing participants showed no IOR at an SOA of 350 ms whereas deaf participants did, indicating that deaf participants disengage attention faster than hearing participants. Deaf participants also responded faster to peripheral targets than hearing participants, suggesting enhanced peripheral attentional resources in deaf individuals. These results indicate that spatial attention in deaf participants is more efficient and more strategic.

17.
Two experiments were conducted on short-term recall of printed English words by deaf signers of American Sign Language (ASL). Compared with hearing subjects, deaf subjects recalled significantly fewer words when ordered recall of words was required, but not when free recall was required. Deaf subjects tended to use a speech-based code in probed recall for order, and the greater the reliance on a speech-based code, the more accurate the recall. These results are consistent with the hypothesis that a speech-based code facilitates the retention of order information.

18.
ERPs were recorded from deaf and hearing native signers and from hearing subjects who acquired ASL late or not at all as they viewed ASL signs that formed sentences. The results were compared across these groups and with those from hearing subjects reading English sentences. The results suggest that there are constraints on the organization of the neural systems that mediate formal languages and that these are independent of the modality through which language is acquired. These include different specializations of anterior and posterior cortical regions in aspects of grammatical and semantic processing, and a bias for the left hemisphere to mediate aspects of mnemonic functions in language. Additionally, the results suggest that the nature and timing of sensory and language experience significantly impact the development of the language systems of the brain. Effects of the early acquisition of ASL include an increased role for the right hemisphere and for parietal cortex, and this occurs in both hearing and deaf native signers. An increased role of posterior temporal and occipital areas occurs in deaf native signers only and thus may be attributable to auditory deprivation.

19.
This study aimed to evaluate the type of attentional selection (location- and/or object-based) triggered by two different types of central noninformative cues: eye gaze and arrows. Two rectangular objects were presented in the visual field, and subjects' attention was directed to one end of a rectangle via the observation of noninformative directional arrows or eye gaze. Similar experiments with peripheral cues have shown an object-based effect: faster target identification when the target is presented on the cued object as compared to the uncued object, even when the distance between target and cue is the same. The three reported experiments aimed to compare the location- and object-based attentional orienting observed with arrows and eye gaze, in order to dissociate the orienting mechanisms underlying the two types of orienting cues. Results showed similar cueing effects on the cued versus oppositely cued locations for the two cue types, replicating several studies with nonpredictive gaze and arrow cues. However, a pure object-based effect occurred only when an arrow cue was presented, whereas a pure location-based effect was only found for eye-gaze cues. It is suggested that attention is nonspecifically directed to nearby objects when a noninformative arrow is used as a cue, whereas it is selectively directed to a specific cued location when noninformative eye gaze is used. This may be mediated by theory of mind mechanisms.
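In the two-rectangle paradigm this abstract describes, location-based and object-based components are usually separated by comparing three target positions relative to the cue: the cued end, the uncued end of the same rectangle, and the equidistant end of the other rectangle. The sketch below shows that arithmetic on invented mean RTs, separately for arrow and gaze cues; the numbers are hypothetical, merely arranged to echo the reported dissociation.

```python
# Sketch: separating location-based and object-based cueing effects in the
# two-rectangle paradigm. Mean RTs (ms) are invented for illustration.
mean_rt = {
    # cue type -> {target position relative to cue: mean RT in ms}
    "arrow": {"cued": 430, "uncued_same_object": 436, "uncued_other_object": 458},
    "gaze":  {"cued": 418, "uncued_same_object": 452, "uncued_other_object": 455},
}

for cue, rt in mean_rt.items():
    # Location-based effect: cost of moving to the far end of the cued object.
    location_effect = rt["uncued_same_object"] - rt["cued"]
    # Object-based effect: extra cost of an equidistant location on the other object.
    object_effect = rt["uncued_other_object"] - rt["uncued_same_object"]
    print(f"{cue:>5}: location-based = {location_effect} ms, "
          f"object-based = {object_effect} ms")
```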

20.
A large body of literature has characterized unimodal monolingual and bilingual lexicons and how neighborhood density affects lexical access; however, there have been relatively few studies that generalize these findings to bimodal (M2) second language (L2) learners of sign languages. The goal of the current study was to investigate parallel language activation in M2L2 learners of sign language and to characterize the influence of spoken language and sign language neighborhood density on the activation of ASL signs. A priming paradigm was used in which the neighbors of the sign target were activated with a spoken English word, and activation of targets in sparse and dense neighborhoods was compared. Neighborhood density effects in the auditory primed lexical decision task were then compared to previous reports of native deaf signers who were only processing sign language. Results indicated reversed neighborhood density effects in M2L2 learners relative to those in deaf signers, such that there were inhibitory effects of handshape density and facilitatory effects of location density. Additionally, increased inhibition for signs in dense handshape neighborhoods was greater for high proficiency L2 learners. These findings support recent models of the hearing bimodal bilingual lexicon, which posit lateral links between spoken language and sign language lexical representations.
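A hedged sketch of how neighborhood-density effects in a primed lexical decision task might be summarized: priming is computed as unrelated-prime minus related-prime RT, separately for sparse and dense handshape and location neighborhoods. All values are invented; they are arranged only to illustrate the reversed pattern (handshape inhibition, location facilitation) described above, not to reproduce the study's data.

```python
# Sketch: priming effects (unrelated - related prime RT, in ms) by neighborhood
# density. Positive = facilitation, negative = inhibition. Values are invented.
mean_rt = {
    # (neighborhood dimension, density, prime type) -> mean lexical-decision RT (ms)
    ("handshape", "sparse", "related"):   832, ("handshape", "sparse", "unrelated"): 851,
    ("handshape", "dense",  "related"):   879, ("handshape", "dense",  "unrelated"): 858,
    ("location",  "sparse", "related"):   845, ("location",  "sparse", "unrelated"): 852,
    ("location",  "dense",  "related"):   815, ("location",  "dense",  "unrelated"): 856,
}

for dimension in ("handshape", "location"):
    for density in ("sparse", "dense"):
        priming = (mean_rt[(dimension, density, "unrelated")]
                   - mean_rt[(dimension, density, "related")])
        label = "facilitation" if priming > 0 else "inhibition"
        print(f"{dimension:>9}, {density:>6}: {priming:+4d} ms ({label})")
```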
