Similar Articles
20 similar articles retrieved (search time: 15 ms)
1.
In studies on auditory speech perception, participants are often asked to perform active tasks, e.g. decide whether the perceived sound is a speech sound or not. However, information about the stimulus, inherent in such tasks, may induce expectations that cause altered activations not only in the auditory cortex, but also in frontal areas such as inferior frontal gyrus (IFG) and motor cortices, even in the absence of an explicit task. To investigate this, we applied spectral mixes of a flute sound and either vowels or specific music instrument sounds (e.g. trumpet) in an fMRI study, in combination with three different instructions. The instructions either revealed no information about stimulus features, or explicit information about either the music instrument or the vowel features. The results demonstrated that, besides an involvement of posterior temporal areas, stimulus expectancy modulated in particular a network comprising IFG and premotor cortices during this passive listening task.

2.
When table tennis players anticipate the course of the ball while preparing their motor responses, they not only observe their opponents striking the ball but also listen to events such as the sound of racket–ball contact. Because visual stimuli can be detected more easily when accompanied by a sound, we assumed that complementary sensory audiovisual information would influence the anticipation of biological motion, especially when the racket–ball contact is not presented visually, but has to be inferred from continuous movement kinematics and an abrupt sound. Twenty-six observers were examined with fMRI while watching point-light displays (PLDs) of an opposing table tennis player. Their task was to anticipate the resultant ball flight. The sound was presented complementary to the veracious event or at a deviant time point in its kinematics. Results showed that participants performed best in the complementary condition. Using a region-of-interest approach, fMRI data showed that complementary audiovisual stimulation elicited higher activation in the left temporo-occipital middle temporal gyrus (MTGto), the left primary motor cortex, and the right anterior intraparietal sulcus (aIPS). Both hemispheres also revealed higher activation in the ventral premotor cortex (vPMC) and the pars opercularis of the inferior frontal gyrus (BA 44). Ranking the behavioral effect of complementary versus conflicting audiovisual information over participants revealed an association between the complementary information and higher activation in the right vPMC. We conclude that the recruitment of movement representations in the auditory and visual modalities in the vPMC can be influenced by task-relevant cross-modal audiovisual interaction.

3.
The present functional magnetic resonance imaging study examined the neural response to familiar and unfamiliar, sport and non-sport environmental sounds in expert and novice athletes. Results revealed differential neural responses dependent on sports expertise. Experts had greater neural activation than novices in focal sensorimotor areas such as the supplementary motor area, and pre- and postcentral gyri. Novices showed greater activation than experts in widespread areas involved in perception (i.e. supramarginal, middle occipital, and calcarine gyri; precuneus; inferior and superior parietal lobules), and motor planning and processing (i.e. inferior frontal, middle frontal, and middle temporal gyri). These between-group neural differences also appeared as an expertise effect within specific conditions. Experts showed greater activation than novices during the sport familiar condition in regions responsible for auditory and motor planning, including the inferior frontal gyrus and the parietal operculum. Novices only showed greater activation than experts in the supramarginal gyrus and pons during the non-sport unfamiliar condition, and in the middle frontal gyrus during the sport unfamiliar condition. These results are consistent with the view that expert athletes are attuned to only the most familiar, highly relevant sounds and tune out unfamiliar, irrelevant sounds. Furthermore, the finding that athletes show activation in areas known to be involved in action planning when passively listening to sounds suggests that auditory perception of action can lead to the re-instantiation of neural areas involved in producing these actions, especially if someone has expertise performing the actions.

4.
Motor functions of the Broca's region
Broca's region in the dominant cerebral hemisphere is known to mediate the production of language but also contributes to comprehension. This region evolved only in humans and is constituted of Brodmann's areas 44 and 45 in the inferior frontal gyrus. There is, however, evidence that Broca's region overlaps, at least in part, with the ventral premotor cortex. We summarize the evidence that the motor related part of Broca's area is localized in the opercular portion of the inferior frontal cortex, mainly in area 44 of Brodmann. According to our own data, there seems to be a homology between Brodmann area 44 in humans and the monkey area F5. The non-language related motor functions of Broca's region comprise complex hand movements, associative sensorimotor learning and sensorimotor integration. Brodmann's area 44 is also a part of a specialized parieto-premotor network and interacts significantly with the neighboring premotor areas.

5.
We examined the role of motor affordances of objects for working memory retention processes. Three experiments are reported in which participants passively viewed pictures of real world objects or had to retain the objects in working memory for a comparison with an S2 stimulus. Brain activation was recorded by means of functional magnetic resonance imaging (fMRI). Retaining information about objects for which hand actions could easily be retrieved (manipulable objects) in working memory activated the hand region of the ventral premotor cortex (PMC) contralateral to the dominant hand. Conversely, nonmanipulable objects activated the left inferior frontal gyrus. This suggests that working memory for objects with motor affordance is based on motor programs associated with their use. An additional study revealed that motor program activation can be modulated by task demands: Holding manipulable objects in working memory for an upcoming motor comparison task was associated with left ventral PMC activation. However, retaining the same objects for a subsequent size comparison task led to activation in posterior brain regions. This suggests that the activation of hand motor programs is under top-down control, and that these programs can be flexibly adapted to various task demands. It is argued that hand motor programs may serve a similar working memory function as speech motor programs for verbalizable working memory contents, and that the premotor system mediates the temporal integration of motor representations with other task-relevant representations in support of goal-oriented behavior.

6.
Many believe that the ability to understand the actions of others is made possible by mirror neurons and a network of brain areas known as the action-observation network (AON). Despite nearly two decades of research into mirror neurons and the AON, however, there is little evidence that they enable the inference of the intention of observed actions. Instead, theories of action selection during action execution indicate that a ventral pathway, linking middle temporal gyrus with the anterior inferior frontal gyrus, might encode these abstract features during action observation. Here I propose that action understanding requires more than merely the AON, and might be achieved through interactions between a ventral pathway and the dorsal AON.

7.
Despite a widespread familiarity with the often compelling urge to yawn after perceiving someone else yawn, an understanding of the neural mechanism underlying contagious yawning remains incomplete. In the present auditory fMRI study, listeners used a 4-point scale to indicate how much they felt like yawning following the presentation of a yawn, breath, or scrambled yawn sound. Not only were yawn sounds given significantly higher ratings, a trait positively correlated with each individual’s empathy measure, but relative to control stimuli, random effects analyses revealed enhanced hemodynamic activity in the right posterior inferior frontal gyrus (pIFG) in response to hearing yawns. Moreover, pIFG activity was greatest for yawn stimuli associated with high as opposed to low yawn ratings and for control sounds associated with equally high yawn ratings. These results support a relationship between contagious yawning and empathy and provide evidence for pIFG involvement in contagious yawning. A supplemental figure for this study may be downloaded from http://cabn.psychonomic-journals.org/content/supplemental.

8.
In this paper we examine the evidence for human brain areas dedicated to visual or auditory word form processing by comparing cortical activation for auditory word repetition, reading, picture naming, and environmental sound naming. Both reading and auditory word repetition activated left lateralised regions in the frontal operculum (Broca's area), posterior superior temporal gyrus (Wernicke's area), posterior inferior temporal cortex, and a region in the mid superior temporal sulcus relative to baseline conditions that controlled for sensory input and motor output processing. In addition, auditory word repetition increased activation in a lateral region of the left mid superior temporal gyrus; critically, however, this area is not specific to auditory word processing, as it is also activated in response to environmental sounds. There were no reading specific activations, even in the areas previously claimed as visual word form areas: activations were either common to reading and auditory word repetition or common to reading and picture naming. We conclude that there is no current evidence for cortical sites dedicated to visual or auditory word form processing.

9.
Sentence comprehension is a complex task that involves both language-specific processing components and general cognitive resources. Comprehension can be made more difficult by increasing the syntactic complexity or the presentation rate of a sentence, but it is unclear whether the same neural mechanism underlies both of these effects. In the current study, we used event-related functional magnetic resonance imaging (fMRI) to monitor neural activity while participants heard sentences containing a subject-relative or object-relative center-embedded clause presented at three different speech rates. Syntactically complex object-relative sentences activated left inferior frontal cortex across presentation rates, whereas sentences presented at a rapid rate recruited frontal brain regions such as anterior cingulate and premotor cortex, regardless of syntactic complexity. These results suggest that dissociable components of a large-scale neural network support the processing of syntactic complexity and speech presented at a rapid rate during auditory sentence processing.

10.
Fourteen native speakers of German heard normal sentences, sentences lacking dynamic pitch variation (flattened speech), or sentences consisting exclusively of intonation contour (degraded speech). Participants were to listen carefully to the sentences and to perform a rehearsal task. Passive listening to flattened speech compared to normal speech produced strong brain responses in right cortical areas, particularly in the posterior superior temporal gyrus (pSTG). Passive listening to degraded speech compared to either normal or flattened speech particularly involved fronto-opercular and subcortical (putamen, caudate nucleus) regions bilaterally. Additionally, the Rolandic operculum (premotor cortex) in the right hemisphere subserved processing of neat sentence intonation. When participants explicitly rehearsed sentence intonation, we found several activation foci in the left inferior frontal gyrus (Broca's area), the left inferior precentral sulcus, and the left Rolandic fissure. The data allow several suggestions: First, both flattened and degraded speech evoked differential brain responses in the pSTG, particularly in the planum temporale (PT) bilaterally, indicating that this region mediates integration of slowly and rapidly changing acoustic cues during comprehension of spoken language. Second, the bilateral circuit active whilst participants receive degraded speech reflects general effort allocation. Third, the differential finding for passive perception and explicit rehearsal of intonation contour suggests a right fronto-lateral network for processing and a left fronto-lateral network for producing prosodic information. Finally, it appears that brain areas which subserve speech (frontal operculum) and premotor functions (Rolandic operculum) coincidently support the processing of intonation contour in spoken sentence comprehension.

11.
Positron emission tomography was used to investigate whether the motor-iconic basis of certain forms in American Sign Language (ASL) partially alters the neural systems engaged during lexical retrieval. Most ASL nouns denoting tools and ASL verbs referring to tool-based actions are produced with a handshape representing the human hand holding a tool and with an iconic movement depicting canonical tool use, whereas the visual iconicity of animal signs is more idiosyncratic and inconsistent across signs. We investigated whether the motor-iconic relation between a sign and its referent alters the neural substrate for lexical retrieval in ASL. Ten deaf native ASL signers viewed photographs of tools/utensils or of actions performed with or without an implement and were asked to overtly produce the ASL sign for each object or action. The control task required subjects to judge the orientation of unknown faces. Compared to the control task, naming tools engaged left inferior and middle frontal gyri, bilateral parietal lobe, and posterior inferotemporal cortex. Naming actions performed with or without a tool engaged left inferior frontal gyrus, bilateral parietal lobe, and posterior middle temporal gyrus at the temporo-occipital junction (area MT). When motor-iconic verbs were compared with non-iconic verbs, no differences in neural activation were found. Overall, the results indicate that even when the form of a sign is indistinguishable from a pantomimic gesture, the neural systems underlying its production mirror those engaged when hearing speakers name tools or tool-based actions with speech.

12.
Su Dequan, Zeng Hong, Chen Qi, Ye Haosheng. Acta Psychologica Sinica (心理学报), 2016, (12): 1499–1506
Drug-related cues can induce craving in drug-dependent individuals, whereas healthy people do not develop craving in response to such cues. Fifteen abstinent heroin users and 12 healthy participants with no history of substance abuse took part in the experiment, and their neural activity was recorded while they viewed drug-related cues and control cues. Results showed that drug cues evoked activity in more brain regions in the abstinent group, including the cingulate gyrus and the precuneus. Under control action cues, both groups showed broadly consistent activity in temporal and parietal regions. Under drug-use action cues, the abstinent group showed significant activity in the bilateral middle temporal gyrus, bilateral inferior parietal lobule, left superior parietal lobule, and right inferior frontal gyrus, consistent with the regions activated by the control action cues; the healthy group showed no significant activation apart from the occipito-temporal junction. These results indicate that drug-use action cues activated mirror-neuron-system regions in abstinent heroin users, including the middle temporal gyrus, inferior parietal lobule, and inferior frontal gyrus. These regions are highly sensitive to different types of drug-related cues and may participate in the rapid, automatic processing of drug-use action cues through mental simulation of the drug-use actions.

13.
Generalizability theory is widely used in psychological assessment practice. When a budget constraint applies, generalizability theory must consider how to design a measurement procedure that is both relatively reliable and relatively feasible, which requires estimating optimal sample sizes by some means. The Lagrange multiplier method is a relatively mature approach for estimating optimal sample sizes under budget constraints in generalizability theory. This article discusses several factors that influence optimal sample-size estimation under budget constraints, such as the effect of rounding against the total budget, and offers suggestions for further improvement, such as deriving a unified formula for the Lagrange multiplier method.
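For the simplest case the Lagrange-multiplier allocation mentioned above has a closed form: minimizing the summed error variance Σ σ²_f/n_f subject to a budget Σ c_f·n_f = B yields n_f = B·√(σ²_f/c_f) / Σ_g √(c_g·σ²_g). A minimal sketch follows; the facet names, variance components, and unit costs are illustrative assumptions, not values from the article, and the flooring step mirrors the rounding loss the abstract mentions:

```python
import math

def optimal_sample_sizes(variances, costs, budget):
    """Budget-constrained sample-size allocation via Lagrange multipliers.

    Minimizes sum(sigma2_f / n_f) subject to sum(c_f * n_f) = budget,
    giving the closed form n_f = budget * sqrt(sigma2_f / c_f) / K,
    where K = sum over facets g of sqrt(c_g * sigma2_g).
    """
    k = sum(math.sqrt(costs[f] * variances[f]) for f in variances)
    exact = {f: budget * math.sqrt(variances[f] / costs[f]) / k
             for f in variances}
    # Sample sizes must be integers; flooring keeps the design within
    # budget, at the cost of leaving part of the budget unspent.
    rounded = {f: max(1, math.floor(n)) for f, n in exact.items()}
    spent = sum(costs[f] * rounded[f] for f in rounded)
    return exact, rounded, spent
```

With illustrative variance components {items: 4.0, occasions: 1.0}, unit costs {items: 1.0, occasions: 4.0}, and a budget of 100, the exact allocation is 50 items and 12.5 occasions; flooring to 50 and 12 spends 98 of the 100 budget units, which illustrates why rounding against the total budget matters.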

14.
Positron emission tomography was used to investigate whether retrieval of perceptual knowledge from long-term memory activates unique cortical regions associated with the modality and/or attribute type retrieved. Knowledge about the typical color, size, and sound of common objects and animals was probed, in response to written words naming the objects. Relative to a nonsemantic control task, all the attribute judgments activated similar left temporal and frontal regions. Visual (color, size) knowledge selectively activated the right posterior inferior temporal (PIT) cortex, whereas sound judgments elicited selective activation in the left posterior superior temporal gyrus and the adjacent parietal cortex. All of the attribute judgments activated a left PIT region, but color retrieval generated more activation in this area. Size judgments activated the right medial parietal cortex. These results indicate that the retrieval of perceptual semantic information activates not only a general semantic network, but also cortical areas specialized for the modality and attribute type of the knowledge retrieved.

15.
Transcranial magnetic stimulation studies have so far reported the results of mapping the primary motor cortex (M1) for hand and tongue muscles in stuttering disorder. This study was designed to evaluate the feasibility of repetitive navigated transcranial magnetic stimulation (rTMS) for locating the M1 for laryngeal muscle and premotor cortical area in the caudal opercular part of inferior frontal gyrus, corresponding to Broca’s area in stuttering subjects by applying new methodology for mapping these motor speech areas. Sixteen stuttering and eleven control subjects underwent rTMS motor speech mapping using modified patterned rTMS. The subjects performed visual object naming task during rTMS applied to the (a) left M1 for laryngeal muscles for recording corticobulbar motor-evoked potentials (CoMEP) from cricothyroid muscle and (b) left premotor cortical area in the caudal opercular part of inferior frontal gyrus while recording long latency responses (LLR) from cricothyroid muscle. In control subjects, CoMEP latency was 11.75 ± 2.07 ms and CoMEP amplitude was 294.47 ± 208.87 µV; in stuttering subjects, CoMEP latency was 12.13 ± 0.75 ms and CoMEP amplitude was 504.64 ± 487.93 µV. LLR latency was 52.8 ± 8.6 ms in control subjects and 54.95 ± 4.86 ms in stuttering subjects. No significant differences were found in CoMEP latency, CoMEP amplitude, or LLR latency between stuttering subjects and fluent control speakers. These results indicate that there are probably no differences between stuttering subjects and controls in the functional anatomy of the pathway used for transmission of information from the premotor cortex to the M1 representation of the laryngeal muscles, and from there via the corticobulbar tract to the laryngeal muscles.

16.
Hearing Spaces     
In this paper I argue that empty space can be heard. This position contrasts with the generally held view that the only things that can be heard are sounds, their properties, echoes, and perhaps sound sources. Specifically, I suggest that when sounds reverberate in enclosed environments we auditorily represent the volume of space surrounding us. Clearly, we can learn the approximate size of an enclosed space through hearing a sound reverberate within it, and so any account that denies that we hear empty space must instead show how beliefs about volumes of space can be derived indirectly from what is heard. That is, if space is not auditorily represented when we hear sounds reverberate, what is? I consider whether hearing reverberation can be thought of as hearing a distinct sound, hearing echoes, or hearing a property of a sound. I argue that experiences of reverberation cannot be reduced to the perception of any of these types and that therefore empty space is represented in auditory perceptual content. In the final section I outline two ways in which space might be represented.

17.
Phonological developmental dyslexics remain impaired in phonetic categorical perception (CP) even in adulthood. We studied the brain correlates of CP in dyslexics and controls using a block design fMRI protocol and stimuli from a phonetic continuum between natural /Pa/ and /Ta/ syllables. Subjects performed a pseudo-passive listening task which does not imply voluntary categorical judgment. In the control group, categorical deviant stimuli elicited specific activations in the left angular gyrus, the right inferior frontal gyrus and the right superior cingulate cortex. These regions were not activated in the dyslexic group, in which activation was observed for acoustic but not phonetic changes in stimuli. Failures to activate key regions for language perception and auditory attention in dyslexics might account for persistent deficits in phonological awareness and reading tasks.

18.
The sound “OM” is believed to bring mental peace and calm. The cortical activation associated with listening to the sound “OM” in contrast to a similar non-meaningful sound (TOM) and to a meaningful Hindi word (AAM) has been investigated using functional magnetic resonance imaging (fMRI). The behaviour-interleaved gradient technique was employed in order to avoid interference from scanner noise. The results reveal that listening to the “OM” sound in contrast to the meaningful Hindi word condition activates areas of bilateral cerebellum, left middle frontal gyrus (dorsolateral middle frontal/BA 9), right precuneus (BA 5) and right supramarginal gyrus (SMG). Listening to the “OM” sound in contrast to the non-meaningful sound condition leads to cortical activation in bilateral middle frontal (BA 9), right middle temporal (BA 37), right angular gyrus (BA 40), right SMG and right superior middle frontal gyrus (BA 8). The conjunction analysis reveals that the common neural regions activated in listening to the “OM” sound during both conditions are the middle frontal (left dorsolateral middle frontal cortex) and right SMG. The results correspond to the fact that listening to the “OM” sound recruits neural systems implicated in emotional empathy.

19.
Does language comprehension depend, in part, on neural systems for action? In previous studies, motor areas of the brain were activated when people read or listened to action verbs, but it remains unclear whether such activation is functionally relevant for comprehension. In the experiments reported here, we used off-line theta-burst transcranial magnetic stimulation to investigate whether a causal relationship exists between activity in premotor cortex and action-language understanding. Right-handed participants completed a lexical decision task, in which they read verbs describing manual actions typically performed with the dominant hand (e.g., "to throw," "to write") and verbs describing nonmanual actions (e.g., "to earn," "to wander"). Responses to manual-action verbs (but not to nonmanual-action verbs) were faster after stimulation of the hand area in left premotor cortex than after stimulation of the hand area in right premotor cortex. These results suggest that premotor cortex has a functional role in action-language understanding.

20.
A crosslinguistic, positron emission tomography (PET) study was conducted to determine the influence of linguistic experience on the perception of segmental (consonants and vowels) and suprasegmental (tones) information. Chinese and English subjects (10 per group) were presented binaurally with lists consisting of five Chinese monosyllabic morphemes (speech) or low-pass-filtered versions of the same stimuli (nonspeech). The first and last items were targeted for comparison; the time interval between target tones was filled with irrelevant distractor tones. A speeded-response, selective attention paradigm required subjects to make discrimination judgments of the target items while ignoring intervening distractor tones. PET scans were acquired for five tasks presented twice: one passive listening to pitch (nonspeech) and four active (speech = consonant, vowel, and tone; nonspeech = pitch). Significant regional changes in blood flow were identified from comparisons of group-averaged images of active tasks relative to passive listening. Chinese subjects show increased activity in left premotor cortex, pars opercularis, and pars triangularis across the four tasks. English subjects, on the other hand, show increased activity in left inferior frontal gyrus regions only in the vowel task and in right inferior frontal gyrus regions in the pitch task. Findings suggest that functional circuits engaged in speech perception depend on linguistic experience. All linguistic information signaled by prosodic cues engages left-hemisphere mechanisms. Storage and executive processes of working memory that are implicated in phonological processing are mediated in discrete regions of the left frontal lobe.
