Similar Documents
1.
A normally hearing left-handed patient familiar with American Sign Language (ASL) was assessed under sodium amytal conditions and with left cortical stimulation in both oral speech and signed English. Lateralization was mixed but complementary in each language mode: right hemisphere perfusion severely disrupted motoric aspects of both types of language expression, whereas left hemisphere perfusion specifically disrupted features of grammatical and semantic usage in each mode of expression. Both semantic and syntactic aspects of oral and signed responses were altered during left posterior temporal-parietal stimulation. Findings are discussed in terms of the neurological organization of ASL and linguistic organization in cases of early left hemisphere damage.

2.
ERPs were recorded from deaf and hearing native signers and from hearing subjects who acquired ASL late or not at all as they viewed ASL signs that formed sentences. The results were compared across these groups and with those from hearing subjects reading English sentences. The results suggest that there are constraints on the organization of the neural systems that mediate formal languages and that these are independent of the modality through which language is acquired. These include different specializations of anterior and posterior cortical regions in aspects of grammatical and semantic processing and a bias for the left hemisphere to mediate aspects of mnemonic functions in language. Additionally, the results suggest that the nature and timing of sensory and language experience significantly impact the development of the language systems of the brain. Effects of the early acquisition of ASL include an increased role for the right hemisphere and for parietal cortex, and this occurs in both hearing and deaf native signers. An increased role of posterior temporal and occipital areas occurs in deaf native signers only and thus may be attributable to auditory deprivation.

3.
The neural network supporting aspects of syntactic, prosodic, and semantic information processing is specified on the basis of two experiments using functional magnetic resonance imaging (fMRI). In these two studies, the presence or absence of lexical-semantic and syntactic information is systematically varied in spoken language stimuli. Inferior frontal and temporal brain areas in the left and the right hemisphere are identified as supporting different aspects of auditory language processing. Two additional experiments using event-related brain potentials investigate the possible interaction of syntactic and prosodic information, on the one hand, and of syntactic and semantic information, on the other. While the first two information types were shown to interact early during processing, the latter two do not. Implications for models of auditory language comprehension are discussed.

4.
Data from lesion studies suggest that the ability to perceive speech sounds, as measured by auditory comprehension tasks, is supported by temporal lobe systems in both the left and right hemisphere. For example, patients with left temporal lobe damage and auditory comprehension deficits (i.e., Wernicke's aphasics) nonetheless comprehend isolated words better than one would expect if their speech perception system had been largely destroyed (70-80% accuracy). Further, when comprehension fails in such patients, their errors are more often semantically based than phonemically based. The question addressed by the present study is whether this ability of the right hemisphere to process speech sounds is a result of plastic reorganization following chronic left hemisphere damage, or whether the ability exists in undamaged language systems. We sought to test these possibilities by studying auditory comprehension during acute left versus right hemisphere deactivation in Wada procedures. A series of 20 patients undergoing clinically indicated Wada procedures were asked to listen to an auditorily presented stimulus word and then point to its matching picture on a card that contained the target picture, a semantic foil, a phonemic foil, and an unrelated foil. The task was performed under three conditions: baseline, during left carotid injection of sodium amytal, and during right carotid injection of sodium amytal. Overall, left hemisphere injection led to a significantly higher error rate than right hemisphere injection. However, consistent with lesion work, the majority (75%) of these errors were semantic in nature. These findings suggest that auditory comprehension deficits are predominantly semantic in nature even following acute left hemisphere disruption. This, in turn, supports the hypothesis that the right hemisphere is capable of speech sound processing in the intact brain.
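
As a side note on the error analysis: with a four-picture array (target plus three foils), errors scattered at random would land on the semantic foil about a third of the time, so a reported semantic-error share like 75% can be tested against that baseline. Below is a minimal sketch of such a check in Python; the error counts are invented for illustration and are not the study's data.

```python
# Illustrative sketch (not the study's code): tally picture-matching
# errors from a Wada comprehension task and ask whether semantic errors
# dominate. The counts below are hypothetical.
from scipy.stats import binomtest

# Hypothetical pooled error tallies during left carotid injection
errors = {"semantic": 30, "phonemic": 7, "unrelated": 3}
total = sum(errors.values())

print(f"semantic errors: {errors['semantic'] / total:.0%} of all errors")

# With three foil types, random errors would pick the semantic foil
# about 1/3 of the time; test the semantic count against that baseline.
result = binomtest(errors["semantic"], n=total, p=1 / 3, alternative="greater")
print(f"binomial test vs. 1/3 baseline: p = {result.pvalue:.4f}")
```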

5.
The effects of slowed speech on auditory comprehension in aphasia
The present study investigates the effects of slowed speech on auditory comprehension in aphasia. Specifically, an attempt was made to isolate the effects of added time on comprehension at three stages of language processing: auditory perception, by increasing the duration of the vowel segments in each word; word recognition and semantic analysis, by adding silences between words; and syntactic analysis, by adding silences at constituent phrase boundaries. Sentences were also read at a slow rate to assess the effects of naturally slowed speech on sentence comprehension. Test sentences consisted of simple active and passive declarative sentences and complex sentences with embedded medial and final relative clauses. Sentences were either semantically reversible or nonreversible. Thirty-four aphasic patients who varied in both severity and type of aphasia were tested on a picture verification task. Results indicated that slowing facilitated language comprehension significantly only in the syntactic condition. Neither syntactic complexity nor semantic reversibility interacted with slowed speech to facilitate auditory language comprehension. Further, it was only the Wernicke's aphasics who showed significant improvement with time added at constituent boundaries. These results suggest that time alone does not facilitate language comprehension in aphasia, but rather that it is the interaction of time with syntactic processing which improves comprehension.
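
The silence-insertion manipulation described above is straightforward to picture as a waveform edit. The sketch below shows one hedged way to implement it; the sample rate, boundary times, and pause duration are assumptions for illustration, not the study's actual stimulus parameters.

```python
# Illustrative waveform edit: splice silent pauses into a speech signal
# at constituent phrase boundaries. All parameters are hypothetical.
import numpy as np

def insert_pauses(signal, sr, boundaries, pause_s=0.5):
    """Return a copy of `signal` with `pause_s` seconds of silence
    inserted at each boundary time (in seconds)."""
    silence = np.zeros(int(sr * pause_s), dtype=signal.dtype)
    pieces, start = [], 0
    for t in sorted(boundaries):
        cut = int(sr * t)
        pieces.extend([signal[start:cut], silence])
        start = cut
    pieces.append(signal[start:])
    return np.concatenate(pieces)

sr = 16_000                       # assumed sample rate
speech = np.random.randn(3 * sr)  # stand-in for a 3-s sentence recording
slowed = insert_pauses(speech, sr, boundaries=[0.9, 1.8])
print(len(slowed) / sr, "s after padding")  # 4.0: two 0.5-s pauses added
```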

6.
The neural organization of a native language, acquired by the direct method, and of a second language, studied at school, was investigated in a bilingual patient after unilateral electroconvulsive therapy of the kind used in psychiatry. It was found that in this type of bilingual the right hemisphere supports the formation of deep semantic structures of the native language, while the left hemisphere is responsible for the formation of the second language's deep structures and of the surface structures of both languages. An effect of the language learning method on the cerebral organization of bilingualism is postulated.

7.
Using event-related potentials (ERPs), this study examined the behavioral characteristics and patterns of hemispheric involvement in non-literal (metaphoric) comprehension among adults with high-functioning autism. Behaviorally, the adults with high-functioning autism showed the longest reaction times for novel metaphoric sentences, and their reaction times for both types of metaphoric sentences were longer than those of typical adults, although the two groups did not differ in error rates. Electrophysiologically, the high-functioning autism group showed the largest N400 amplitude for novel metaphoric sentences, with no hemispheric lateralization; for conventional metaphoric sentences, the N400 amplitude was larger over the left hemisphere than the right; and N400 amplitudes for both metaphor types were larger than those of typical participants. In conclusion, adults with high-functioning autism are capable of metaphor comprehension and invest extra effort in understanding novel metaphors, but abnormal right-hemisphere function persists, and both hemispheres remain weaker than those of typical individuals in overall function or neural connectivity.
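
The N400 comparisons reported here rest on a conventional ERP measure: average the epochs of each condition, baseline-correct, and take the mean voltage in a window around 400 ms. A minimal sketch under assumed recording parameters (500 Hz sampling, a 300-500 ms window, simulated single-channel data) follows; it is not the study's analysis pipeline.

```python
# Conventional N400 amplitude measure on simulated epochs
# (illustrative; not the study's analysis pipeline).
import numpy as np

sr = 500                              # assumed EEG sampling rate (Hz)
t = np.arange(-0.2, 0.8, 1 / sr)      # epoch: -200 ms to 800 ms

def n400_amplitude(epochs):
    """Mean voltage 300-500 ms post-onset of the baseline-corrected ERP."""
    erp = epochs.mean(axis=0)         # average across trials
    erp = erp - erp[t < 0].mean()     # subtract pre-stimulus baseline
    return float(erp[(t >= 0.3) & (t <= 0.5)].mean())

# Simulated epochs (trials x samples) for two sentence types, with a
# larger negativity around 400 ms for novel metaphors
rng = np.random.default_rng(0)
bump = np.exp(-((t - 0.4) / 0.08) ** 2)
novel = rng.normal(0, 2, (40, t.size)) - 6 * bump
conventional = rng.normal(0, 2, (40, t.size)) - 3 * bump

print(f"novel metaphor N400: {n400_amplitude(novel):.2f} µV")
print(f"conventional metaphor N400: {n400_amplitude(conventional):.2f} µV")
```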

8.
An event-related fMRI study of syntactic and semantic violations
We used event-related functional magnetic resonance imaging to identify brain regions involved in syntactic and semantic processing. Healthy adult males read well-formed sentences randomly intermixed with sentences which either contained violations of syntactic structure or were semantically implausible. Reading anomalous sentences, as compared to well-formed sentences, yielded distinct patterns of activation for the two violation types. Syntactic violations elicited significantly greater activation than semantic violations primarily in superior frontal cortex. Semantically incongruent sentences elicited greater activation than syntactic violations in the left hippocampal and parahippocampal gyri, the angular gyri bilaterally, the right middle temporal gyrus, and the left inferior frontal sulcus. These results demonstrate that syntactic and semantic processing result in nonidentical patterns of activation, including greater frontal engagement during syntactic processing and larger increases in temporal and temporo-parietal regions during semantic analyses.
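
Analyses like this one ultimately reduce to voxelwise general linear model contrasts between condition regressors. The sketch below illustrates the core computation (fit betas, test a syntactic-minus-semantic contrast) on toy data; a real pipeline would add HRF convolution, nuisance regressors, and multiple-comparison correction, and nothing here reproduces the authors' code.

```python
# Schematic voxelwise GLM contrast on toy data (illustrative only).
import numpy as np

rng = np.random.default_rng(1)
n_scans = 200

# Toy design matrix: [intercept, syntactic violation, semantic violation];
# the binary columns stand in for HRF-convolved event regressors.
X = np.column_stack([
    np.ones(n_scans),
    rng.random(n_scans) < 0.2,
    rng.random(n_scans) < 0.2,
]).astype(float)

# Simulated voxel that responds more strongly to syntactic violations
y = X @ np.array([100.0, 3.0, 1.0]) + rng.normal(0, 1, n_scans)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
dof = n_scans - X.shape[1]
sigma2 = resid @ resid / dof

c = np.array([0.0, 1.0, -1.0])       # syntactic-minus-semantic contrast
t_stat = (c @ beta) / np.sqrt(sigma2 * c @ np.linalg.inv(X.T @ X) @ c)
print(f"contrast t({dof}) = {t_stat:.2f}")
```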

9.
Previous laterality studies have implicated the right hemisphere in the processing of metaphors; however, it is not clear whether this result is due to metaphoricity per se or to another aspect of semantic processing. Three divided visual field experiments varied metaphorical and literal sentence familiarity. We found a right hemisphere advantage for unfamiliar sentences containing distant semantic relationships, and a left hemisphere advantage for familiar sentences containing close semantic relationships, regardless of whether the sentences were metaphorical or literal. This pattern of results is consistent with theories postulating predominantly left hemisphere processing of close semantic relationships and predominantly right hemisphere processing of distant semantic relationships.

10.
Regional cerebral blood flow (rCBF) was measured by the xenon-133 inhalation method in 10 cerebrally healthy subjects at rest and during linguistic activation tests. These consisted of a comprehension test (binaural listening to a narrative text) and a speech test (making sentences from a list of words presented orally at 30-s intervals). The comprehension task induced a moderate increase in the mean right CBF and in both inferior parietal areas, whereas the speech test resulted in a diffuse increase in the mean CBF of both hemispheres, predominating regionally in both inferior parietal areas, the left opercular area, and the right upper motor and premotor areas. It is proposed that the activation pattern induced by linguistic stimulation depends not only on specific factors, such as the syntactic and semantic aspects of language, but also on the content of the material presented and the attention required by the test situation.

11.
The study of the neural basis of syntactic processing has greatly benefited from neuroimaging techniques. Research on syntactic processing in bilinguals has used a variety of techniques, mainly functional magnetic resonance imaging (fMRI) and event-related potentials (ERP). This paper reports a functional near-infrared spectroscopy (fNIRS) study of syntactic processing in highly proficient young adult speakers of Portuguese as a mother tongue (L1) and French as a second language (L2). Participants made syntactic judgments of visually presented sentences, which either did or did not contain noun-verb agreement violations. The results showed that syntactic processing in both languages produced significant activation in anterior frontal regions of the left hemisphere and in posterior superior temporal areas of the right hemisphere, with more prominent activation for L2 in some areas. These findings corroborate previously reported neuroimaging evidence and show the suitability of fNIRS for studying syntactic processing in the bilingual brain.
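
For readers unfamiliar with fNIRS, activation estimates are derived by converting optical-density changes at two wavelengths into oxy- and deoxyhemoglobin concentration changes via the modified Beer-Lambert law. The sketch below shows that conversion with placeholder extinction coefficients and pathlength values chosen only for illustration; real analyses use tabulated constants.

```python
# Modified Beer-Lambert conversion used in fNIRS preprocessing, sketched
# with placeholder constants (real analyses use tabulated coefficients).
import numpy as np

# Extinction coefficients, rows = wavelengths (~760 nm, ~850 nm),
# columns = [HbO, HbR]; values are illustrative only.
E = np.array([[1.4, 3.8],    # 760 nm: HbR absorbs more
              [2.5, 1.8]])   # 850 nm: HbO absorbs more

d = 3.0      # assumed source-detector separation (cm)
dpf = 6.0    # assumed differential pathlength factor

def delta_hb(delta_od):
    """Solve (E * d * dpf) @ dHb = dOD for [dHbO, dHbR]."""
    return np.linalg.solve(E * d * dpf, delta_od)

d_hbo, d_hbr = delta_hb(np.array([0.010, 0.018]))  # hypothetical OD changes
print(f"dHbO = {d_hbo:+.2e}, dHbR = {d_hbr:+.2e} (arbitrary units)")
```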

12.
The functional specificity of different brain areas recruited in auditory language processing was investigated by means of event-related functional magnetic resonance imaging (fMRI) while subjects listened to speech input varying in the presence or absence of semantic and syntactic information. There were two sentence conditions containing syntactic structure, i.e., normal speech (consisting of function and content words) and syntactic speech (consisting of function words and pseudowords), and two word-list conditions, i.e., real words and pseudowords. The processing of auditory language in general correlates with significant activation in the primary auditory cortices and in adjacent compartments of the superior temporal gyrus bilaterally. Processing of normal speech appeared to have a special status: frontal activation was absent in this condition but present in the three other conditions. This difference may point toward a certain automaticity of the linguistic processes used during normal speech comprehension. The three other conditions were correlated with activation in both left and right frontal cortices. An increase of activation in the planum polare bilaterally and in the deep portion of the left frontal operculum was found exclusively when syntactic processes were in focus. Thus, the present data may be taken to suggest an involvement of the left frontal and bilateral temporal cortex in processing syntactic information during comprehension.

13.
A right-handed patient with two left hemisphere lesions, a small one in the prefrontal lobe and a larger one in the temporal lobe, presents an unusual syndrome: a massive deficit for oral language (expression and comprehension) contrasting with fairly good preservation of written language (expression and comprehension). The processing of isolated words and sentences was extensively tested with repetition and dictation tasks. The patient performs rather well with nouns, verbs, and adjectives, poorly with adverbs and function words, and completely fails with nonsense words. A remarkable feature of his repetition is the frequency of semantic paraphasias. Thus, this patient exhibits behavior rather similar to deep dyslexia, hence the possible label "deep dysphasia." The paper presents a "preunderstanding" hypothesis to account for such behaviors.

14.
We investigated the relative roles of the left versus right hemisphere in the comprehension of American Sign Language (ASL). Nineteen lifelong signers with unilateral brain lesions [11 left hemisphere damaged (LHD) and 8 right hemisphere damaged (RHD)] performed three tasks: an isolated single-sign comprehension task, a sentence-level comprehension task involving simple one-step commands, and a sentence-level comprehension task involving more complex multiclause/multistep commands. Eighteen of the participants were deaf; one RHD subject was hearing and bilingual (ASL and English). Performance was examined in relation to two factors: whether the lesion was in the right or left hemisphere and whether the temporal lobe was involved. The LHD group performed significantly worse than the RHD group on all three tasks, confirming left hemisphere dominance for sign language comprehension. The group with left temporal lobe involvement was significantly impaired on all tasks, whereas each of the other three groups performed at better than 95% correct on the single-sign and simple sentence comprehension tasks, with performance falling off only on the complex sentence comprehension items. A comparison with previously published data suggests that the degree of difficulty exhibited by the deaf RHD group on the complex sentences is comparable to that observed in hearing RHD subjects. Based on these findings we hypothesize (i) that deaf and hearing individuals have a similar degree of lateralization of language comprehension processes and (ii) that language comprehension depends primarily on the integrity of the left temporal lobe.

15.
Fourteen native speakers of German heard normal sentences, sentences lacking dynamic pitch variation (flattened speech), or sentences consisting exclusively of intonation contour (degraded speech). Participants were instructed to listen carefully to the sentences and to perform a rehearsal task. Passive listening to flattened speech compared to normal speech produced strong brain responses in right cortical areas, particularly in the posterior superior temporal gyrus (pSTG). Passive listening to degraded speech compared to either normal or flattened speech particularly involved fronto-opercular and subcortical (putamen, caudate nucleus) regions bilaterally. Additionally, the Rolandic operculum (premotor cortex) in the right hemisphere subserved the processing of pure sentence intonation. When participants explicitly rehearsed sentence intonation, we found several activation foci in the left inferior frontal gyrus (Broca's area), the left inferior precentral sulcus, and the left Rolandic fissure. The data allow several suggestions. First, both flattened and degraded speech evoked differential brain responses in the pSTG, particularly in the planum temporale (PT) bilaterally, indicating that this region mediates the integration of slowly and rapidly changing acoustic cues during comprehension of spoken language. Second, the bilateral circuit active while participants received degraded speech reflects general effort allocation. Third, the differential findings for passive perception and explicit rehearsal of intonation contour suggest a right fronto-lateral network for processing and a left fronto-lateral network for producing prosodic information. Finally, it appears that brain areas which subserve speech (frontal operculum) and premotor functions (Rolandic operculum) jointly support the processing of intonation contour in spoken sentence comprehension.

16.
As we listen to speech, our ability to understand what was said requires us to retrieve and bind together individual word meanings into a coherent discourse representation. This so-called semantic unification is a fundamental cognitive skill, and its development relies on the integration of neural activity throughout widely distributed functional brain networks. In this proof-of-concept study, we examine, for the first time, how these functional brain networks develop in children. Twenty-six children (ages 4-17) listened to well-formed sentences and sentences containing a semantic violation while EEG was recorded. Children with stronger vocabulary showed N400 effects that were more concentrated at centroparietal electrodes, and greater EEG phase synchrony (phase lag index; PLI) between right centroparietal and bilateral frontocentral electrodes in the delta frequency band (1-3 Hz) 1.27-1.53 s after listening to well-formed sentences compared to sentences containing a semantic violation. These effects related specifically to individual differences in receptive vocabulary, perhaps pointing to greater recruitment with development of functional brain networks important for top-down semantic unification. Less skilled children showed greater delta phase synchrony for violation sentences 3.41-3.64 s after critical word onset. This later effect was partly driven by individual differences in nonverbal reasoning, perhaps pointing to nonverbal compensatory processing to extract meaning from speech in children with less developed vocabulary. We suggest that functional brain network communication, as measured by momentary changes in the phase synchrony of EEG oscillations, develops throughout the school years to support language comprehension in different ways depending on children's verbal and nonverbal skill levels.
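
The phase lag index used in this study has a compact standard definition (Stam and colleagues, 2007): band-pass the two signals, extract instantaneous phases with the Hilbert transform, and take the absolute mean sign of the phase difference. A minimal sketch with assumed sampling parameters and simulated signals, not the study's pipeline:

```python
# Minimal phase lag index (PLI) for one electrode pair in the delta band,
# following the standard definition; data and parameters are illustrative.
import numpy as np
from scipy.signal import butter, hilbert, sosfiltfilt

def pli(x, y, sr, band=(1.0, 3.0)):
    """PLI = |mean(sign(sin(phase_x - phase_y)))| within `band`."""
    sos = butter(4, band, btype="bandpass", fs=sr, output="sos")
    phx = np.angle(hilbert(sosfiltfilt(sos, x)))
    phy = np.angle(hilbert(sosfiltfilt(sos, y)))
    return float(np.abs(np.mean(np.sign(np.sin(phx - phy)))))

# Stand-in signals: a shared 2 Hz rhythm with a consistent phase lag
sr = 250.0
t = np.arange(0, 10, 1 / sr)
rng = np.random.default_rng(2)
x = np.sin(2 * np.pi * 2 * t) + rng.normal(0, 0.5, t.size)
y = np.sin(2 * np.pi * 2 * t - 0.6) + rng.normal(0, 0.5, t.size)
print(f"delta-band PLI = {pli(x, y, sr):.2f}")  # near 1 for a stable lag
```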

17.
Sixteen right-handed adult males with localized insult to either the right or left hemisphere and five control subjects without brain damage read aloud target sentences embedded in paragraphs, while intoning their voices in either a declarative, interrogative, happy, or sad mode. Acoustical analysis of the speech wave was performed. Right-anterior (pre-Rolandic) and right-central (pre- and post-Rolandic) brain-damaged patients spoke with less pitch variation and restricted intonational range across emotional and nonemotional domains, while patients with right posterior (post-Rolandic) damage had exaggerated pitch variation and intonational range across both domains. No such deficits were found in patients with left posterior damage, whose prosody was similar to that of normal control subjects. It is suggested that damage to the right hemisphere alone may result in a primary disturbance of speech prosody that may be independent of the disturbances in affect often noted in right-brain-damaged populations.
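
The acoustic measures at issue, pitch variation and intonational range, are statistics over the fundamental-frequency (F0) contour. The toy sketch below estimates an F0 contour with a simple autocorrelation tracker on synthetic voicing and reports its standard deviation and range; the frame size and F0 search limits are illustrative assumptions, and real analyses would use a robust pitch tracker such as Praat's.

```python
# Toy F0-contour analysis for "pitch variation" and "intonational range"
# (illustrative; not the study's acoustic analysis).
import numpy as np

def f0_autocorr(frame, sr, fmin=75.0, fmax=300.0):
    """Estimate one frame's F0 from its autocorrelation peak."""
    frame = frame - frame.mean()
    ac = np.correlate(frame, frame, mode="full")[frame.size - 1:]
    lo, hi = int(sr / fmax), int(sr / fmin)
    return sr / (lo + int(np.argmax(ac[lo:hi])))

sr = 16_000
t = np.arange(0, 1.0, 1 / sr)
f0_true = 120 + 30 * np.sin(2 * np.pi * 1.5 * t)     # gliding pitch, 90-150 Hz
voice = np.sin(2 * np.pi * np.cumsum(f0_true) / sr)  # synthetic voiced signal

frames = voice.reshape(-1, 640)                      # 40-ms analysis frames
contour = np.array([f0_autocorr(f, sr) for f in frames])
print(f"F0 standard deviation (pitch variation): {contour.std():.1f} Hz")
print(f"F0 range (intonational range): {np.ptp(contour):.1f} Hz")
```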

18.
Whether the brain's speech-production system is also involved in speech comprehension is a topic of much debate. Research has focused on whether motor areas are involved in listening, but overlap between speaking and listening might occur not only at primary sensory and motor levels, but also at linguistic levels (where semantic, lexical, and syntactic processes occur). Using functional MRI adaptation during speech comprehension and production, we found that the brain areas involved in semantic, lexical, and syntactic processing are mostly the same for speaking and for listening. Effects of primary processing load (indicative of sensory and motor processes) overlapped in auditory cortex and left inferior frontal cortex, but not in motor cortex, where processing load affected activity only in speaking. These results indicate that the linguistic parts of the language system are used for both speaking and listening, but that the motor system does not seem to provide a crucial contribution to listening.
