Similar Literature
20 similar documents found
1.
Rapid adaptation to auditory-visual spatial disparity
The so-called ventriloquism aftereffect is a remarkable example of rapid adaptive changes in spatial localization caused by visual stimuli. After exposure to a consistent spatial disparity of auditory and visual stimuli, localization of sound sources is systematically shifted to correct for the deviation of the sound from visual positions during the previous adaptation period. In the present study, this aftereffect was induced by presenting, within 17 min, 1800 repetitive noise or pure-tone bursts in combination with synchronized flashing light spots at a 20° disparity, in total darkness. Post-adaptive sound localization, measured by a method of manual pointing, was significantly shifted by 2.4° (noise), 3.1° (1 kHz tones), or 5.8° (4 kHz tones) compared with the pre-adaptation condition. There was no transfer across frequencies; that is, shifts in localization were insignificant when the frequencies used for adaptation and the post-adaptation localization test were different. It is hypothesized that these aftereffects may rely on shifts in neural representations of auditory space with respect to those of visual space, induced by intersensory spatial disparity, and may thus reflect a phenomenon of neural short-term plasticity.

2.
A period of exposure to trains of simultaneous but spatially offset auditory and visual stimuli can induce a temporary shift in the perception of sound location. This phenomenon, known as the 'ventriloquist aftereffect', reflects a realignment of auditory and visual spatial representations such that they approach perceptual alignment despite their physical spatial discordance. Such dynamic changes to sensory representations are likely to underlie the brain's ability to accommodate inter-sensory discordance produced by sensory errors (particularly in sound localization) and variability in sensory transduction. It is currently unknown, however, whether these plastic changes induced by adaptation to spatially disparate inputs occur automatically or whether they are dependent on selectively attending to the visual or auditory stimuli. Here, we demonstrate that robust auditory spatial aftereffects can be induced even in the presence of a competing visual stimulus. Importantly, we found that when attention is directed to the competing stimuli, the pattern of aftereffects is altered. These results indicate that attention can modulate the ventriloquist aftereffect.

3.
Despite the often encountered affirmation that vision completely dominates other modalities in intersensory conflict, there are cases where discordant auditory information affects the localization of a visual signal. Experiment I shows that "auditory capture" occurs with a visual input reduced to a single luminous point in complete darkness, but not with a textured background. The task was to point at a flashing luminous point alternately in the presence of a synchronous sound coming from a source situated 15° to one side ("conflict trials," designed to measure immediate reaction to conflict) and in its absence ("test trials," to measure aftereffects). Adaptive immediate reactions and aftereffects were observed in the dark, but not with a textured background. In Experiment II, on the other hand, "visual capture" of auditory localization was observed at the levels of both measures in the dark and with the textured background. That visual texture affects the degree of auditory capture of vision, but not the degree of visual capture of audition, was confirmed at the level of aftereffects in Experiment III, where bisensory monitoring was substituted for pointing during exposure to conflict. The empirical finding eliminates apparent contradictions in the literature on ventriloquism, but cannot itself be explained in terms either of relative accuracy of visual and auditory localization or attentional adjustments.

4.
In this study, we show that the contingent auditory motion aftereffect is strongly influenced by visual motion information. During an induction phase, participants listened to rightward-moving sounds with falling pitch alternated with leftward-moving sounds with rising pitch (or vice versa). Auditory aftereffects (i.e., a shift in the psychometric function for unimodal auditory motion perception) were bigger when a visual stimulus moved in the same direction as the sound than when no visual stimulus was presented. When the visual stimulus moved in the opposite direction, aftereffects were reversed and thus became contingent upon visual motion. When visual motion was combined with a stationary sound, no aftereffect was observed. These findings indicate that there are strong perceptual links between the visual and auditory motion-processing systems.
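The "shift in the psychometric function" used above as the measure of the aftereffect can be made concrete with a toy cumulative-Gaussian model. This is an illustrative sketch with made-up parameter values, not the study's data or analysis:

```python
import math

def p_right(x, pse, slope):
    """Cumulative-Gaussian psychometric function: probability of a
    'rightward motion' response for a stimulus moving at x (deg/s;
    negative = leftward). pse is the point of subjective equality."""
    return 0.5 * (1.0 + math.erf((x - pse) / (slope * math.sqrt(2.0))))

# Hypothetical pre- and post-adaptation fits: adaptation to rightward
# motion shifts the PSE rightward, so a physically rightward stimulus
# is now more often judged as leftward or stationary.
pre_pse, post_pse, slope = 0.0, 3.0, 2.0
aftereffect = post_pse - pre_pse  # size of the aftereffect, in deg/s

print(p_right(0.0, pre_pse, slope))   # 0.5: ambiguous before adaptation
print(p_right(0.0, post_pse, slope))  # < 0.5: biased after adaptation
print(aftereffect)
```

The aftereffect is simply the horizontal displacement between the two fitted curves; a reversed (contingent) aftereffect corresponds to a PSE shift of the opposite sign.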

5.
Multisensory-mediated auditory localization
Multisensory integration is a powerful mechanism for maximizing sensitivity to sensory events. We examined its effects on auditory localization in healthy human subjects. The specific objective was to test whether the relative intensity and location of a seemingly irrelevant visual stimulus would influence auditory localization in accordance with the inverse effectiveness and spatial rules of multisensory integration that have been developed from neurophysiological studies with animals [Stein and Meredith, 1993 The Merging of the Senses (Cambridge, MA: MIT Press)]. Subjects were asked to localize a sound in one condition in which a neutral visual stimulus was either above threshold (supra-threshold) or at threshold. In both cases the spatial disparity of the visual and auditory stimuli was systematically varied. The results reveal that stimulus salience is a critical factor in determining the effect of a neutral visual cue on auditory localization. Visual bias and, hence, perceptual translocation of the auditory stimulus appeared when the visual stimulus was supra-threshold, regardless of its location. However, this was not the case when the visual stimulus was at threshold. In this case, the influence of the visual cue was apparent only when the two cues were spatially coincident and resulted in an enhancement of stimulus localization. These data suggest that the brain uses multiple strategies to integrate multisensory information.

6.
In the ventriloquism aftereffect, brief exposure to a consistent spatial disparity between auditory and visual stimuli leads to a subsequent shift in subjective sound localization toward the positions of the visual stimuli. Such rapid adaptive changes probably play an important role in maintaining the coherence of spatial representations across the various sensory systems. In the research reported here, we used event-related potentials (ERPs) to identify the stage in the auditory processing stream that is modulated by audiovisual discrepancy training. Both before and after exposure to synchronous audiovisual stimuli that had a constant spatial disparity of 15°, participants reported the perceived location of brief auditory stimuli that were presented from central and lateral locations. In conjunction with a sound localization shift in the direction of the visual stimuli (the behavioral ventriloquism aftereffect), auditory ERPs as early as 100 ms poststimulus (N100) were systematically modulated by the disparity training. These results suggest that cross-modal learning was mediated by a relatively early stage in the auditory cortical processing stream.

7.
Correctly integrating sensory information across different modalities is a vital task, yet there are illusions which cause the incorrect localization of multisensory stimuli. A common example of these phenomena is the "ventriloquism effect". In this illusion, the localization of auditory signals is biased by the presence of visual stimuli. For instance, when a light and sound are simultaneously presented, observers may erroneously locate the sound closer to the light than its actual position. While this phenomenon has been studied extensively in azimuth at a single depth, little is known about the interactions of stimuli at different depth planes. In the current experiment, virtual acoustics and stereo-image displays were used to test the integration of visual and auditory signals across azimuth and depth. The results suggest that greater variability in the localization of sounds in depth may lead to a greater bias from visual stimuli in depth than in azimuth. These results offer interesting implications for understanding multisensory integration.
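The reliability-weighting idea in this abstract (noisier auditory depth estimates yield stronger visual bias) is commonly formalized as maximum-likelihood cue combination, in which each cue is weighted by its inverse variance. A minimal sketch under that assumption, with illustrative numbers rather than the study's parameters:

```python
def fuse_estimates(x_vis, var_vis, x_aud, var_aud):
    """Maximum-likelihood fusion of two independent Gaussian position
    estimates. Each cue's weight is proportional to its inverse
    variance, so the noisier cue is pulled toward the more reliable
    one, and the fused estimate has lower variance than either cue."""
    w_vis = (1.0 / var_vis) / (1.0 / var_vis + 1.0 / var_aud)
    w_aud = 1.0 - w_vis
    fused = w_vis * x_vis + w_aud * x_aud
    fused_var = 1.0 / (1.0 / var_vis + 1.0 / var_aud)
    return fused, fused_var

# Azimuth: auditory localization is fairly precise, so the percept
# stays relatively close to the true sound position.
print(fuse_estimates(0.0, 1.0, 10.0, 4.0))   # light at 0, sound at 10

# Depth: auditory localization is much noisier, so the same conflict
# is largely "captured" by the visual stimulus.
print(fuse_estimates(0.0, 1.0, 10.0, 25.0))
```

On this account, the stronger ventriloquism in depth needs no separate mechanism: it falls out of the larger auditory variance along that dimension.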

8.
The attention network test (ANT) assesses efficiency across alerting, orienting, and executive components of visual attention. This study examined approaches to assessing auditory attention networks, and performance was compared to the visual ANT. Results showed (1) alerting was sufficiently elicited in a pitch discrimination and sound localization task, although these effects were unrelated, (2) weak orienting of attention was elicited through pitch discrimination, which varied based on ISI and conflict level, but robust orienting of attention was found through sound localization, and (3) executive control was sufficiently assessed in both pitch discrimination and sound localization tasks, but these effects were unrelated. Correlation analysis suggested that, unlike alerting and orienting, the auditory executive control functions in the sound localization task tap a shared attention network system. Overall, the results suggest that auditory ANT measures are largely task and modality specific, with sound localization offering potential to assess all three attention networks in a single task.

9.
Previous research has demonstrated that the localization of auditory or tactile stimuli can be biased by the simultaneous presentation of a visual stimulus from a different spatial position. We investigated whether auditory localization judgments could also be affected by the presentation of spatially displaced tactile stimuli, using a procedure designed to reveal perceptual interactions across modalities. Participants made left-right discrimination responses regarding the perceived location of sounds, which were presented either in isolation or together with tactile stimulation to the fingertips. The results demonstrate that the apparent location of a sound can be biased toward tactile stimulation when it is synchronous, but not when it is asynchronous, with the auditory event. Directing attention to the tactile modality did not increase the bias of sound localization toward synchronous tactile stimulation. These results provide the first demonstration of the tactile capture of audition.

10.
Previous research has demonstrated that the localization of auditory or tactile stimuli can be biased by the simultaneous presentation of a visual stimulus from a different spatial position. We investigated whether auditory localization judgments could also be affected by the presentation of spatially displaced tactile stimuli, using a procedure designed to reveal perceptual interactions across modalities. Participants made left-right discrimination responses regarding the perceived location of sounds, which were presented either in isolation or together with tactile stimulation to the fingertips. The results demonstrate that the apparent location of a sound can be biased toward tactile stimulation when it is synchronous, but not when it is asynchronous, with the auditory event. Directing attention to the tactile modality did not increase the bias of sound localization toward synchronous tactile stimulation. These results provide the first demonstration of the tactile capture of audition.

11.
Subjects were tested for both ear-hand and eye-hand co-ordination before and after monitoring a synchronous series of noise bursts and of light flashes coming from the same spatial position, but with the virtual position of the flashes displaced 15° laterally by prisms. Attention was forced on both stimuli by the instruction to detect occasional reductions in intensity. No subject reported noticing the spatial discrepancy. Nevertheless, ear-hand co-ordination was shifted in the direction of the prismatic displacement, and eye-hand co-ordination in the opposite direction. Both shifts were observed with instructions suggesting that the sound and the light came from one single source, with instructions suggesting two separate sources, and also with no information regarding the spatial relationship of sound and light. It is concluded that the resolution of auditory-visual spatial conflict involves recalibrations of both visual and auditory data and that these alterations last long enough to be detected as after-effects.

12.
Several studies have shown that handedness has an impact on visual spatial abilities. Here we investigated the effect of laterality on auditory space perception. Participants (33 right-handers, 20 left-handers) completed two tasks of sound localization. In a dark, anechoic, and sound-proof room, sound stimuli (broadband noise) were presented via 21 loudspeakers mounted horizontally (from 80° on the left to 80° on the right). Participants had to localize the target either by using a swivel hand-pointer or by head-pointing. Individual lateral preferences of eye, ear, hand, and foot were obtained using a questionnaire. With both pointing methods, participants showed a bias in sound localization that was to the side contralateral to the preferred hand, an effect that was unrelated to their overall precision. This partially parallels findings in the visual modality, as left-handers typically have a more rightward bias in visual line bisection compared with right-handers. Despite the differences in neural processing of auditory and visual spatial information, these findings show similar effects of lateral preference on auditory and visual spatial perception. This suggests that supramodal neural processes are involved in the mechanisms generating laterality in space perception.

13.
It is well known that discrepancies in the location of synchronized auditory and visual events can lead to mislocalizations of the auditory source, so-called ventriloquism. In two experiments, we tested whether such cross-modal influences on auditory localization depend on deliberate visual attention to the biasing visual event. In Experiment 1, subjects pointed to the apparent source of sounds in the presence or absence of a synchronous peripheral flash. They also monitored for target visual events, either at the location of the peripheral flash or in a central location. Auditory localization was attracted toward the synchronous peripheral flash, but this was unaffected by where deliberate visual attention was directed in the monitoring task. In Experiment 2, bilateral flashes were presented in synchrony with each sound, to provide competing visual attractors. When these visual events were equally salient on the two sides, auditory localization was unaffected by which side subjects monitored for visual targets. When one flash was larger than the other, auditory localization was slightly but reliably attracted toward it, but again regardless of where visual monitoring was required. We conclude that ventriloquism largely reflects automatic sensory interactions, with little or no role for deliberate spatial attention.

14.
The categorization and identification of previously ignored visual or auditory stimuli is typically slowed down, a phenomenon that has been called the negative priming effect and can be explained by the episodic retrieval of response-inadequate prime information and/or an inhibitory model. A similar after-effect has been found in visuospatial tasks: participants are slowed down in localizing a visual stimulus that appears at a previously ignored location. In the auditory modality, however, such an after-effect of ignoring a sound at a specific location has never been reported. Instead, participants are impaired in their localization performance when the sound at the previously ignored location changes identity, a finding which is compatible with the so-called feature-mismatch hypothesis. Here, we describe the properties of auditory spatial in contrast to visuospatial negative priming and report two experiments that specify the nature of this auditory after-effect. Experiment 1 shows that the detection of identity-location mismatches is a genuinely auditory phenomenon that can be replicated even when the sound sources are invisible. Experiment 2 reveals that the detection of sound-identity mismatches in the probe depends on the processing demands in the prime. This finding implies that the localization of irrelevant sound sources is not the inevitable consequence of processing the auditory prime scenario but depends on the difficulty of the target search process among distractor sounds.

15.
Viewing a distorted face induces large aftereffects in the appearance of an undistorted face. The authors examined the processes underlying this adaptation by comparing how selective the aftereffects are for different dimensions of the images, including size, spatial frequency content, contrast, and color. Face aftereffects had weaker selectivity for changes in the size, contrast, or color of the images and stronger selectivity for changes in contrast polarity or spatial frequency. This pattern could arise if the adaptation is contingent on the perceived similarity of the stimuli as faces. Consistent with this, changing contrast polarity or spatial frequency had larger effects on the perceived identity of a face, and aftereffects were also selective for different individual faces. These results suggest that part of the sensitivity changes underlying the adaptation may arise at visual levels closely associated with the representation of faces.

16.
It is suggested that aftereffects caused by swept change of sound level (Reinhardt-Rutland & Anstis, Note 1) may contribute to auditory motion aftereffects (Grantham & Wightman, 1979). The latter appear to show selectivity for frequency; they are substantial if the frequency is 0.5 kHz but not if it is 2 kHz. An experiment was carried out to show to what extent there is such selectivity for aftereffects from swept sound-level change; this showed that they are substantial over a much wider range of frequencies than auditory motion aftereffects. It is concluded that the frequency selectivity of auditory motion aftereffects might be explained by frequency selectivity for inter-aural phase differences.

17.
When you are looking for an object, does hearing its characteristic sound make you find it more quickly? Our recent results supported this possibility by demonstrating that when a cat target, for example, was presented among other objects, a simultaneously presented "meow" sound (containing no spatial information) reduced the manual response time for visual localization of the target. To extend these results, we determined how rapidly an object-specific auditory signal can facilitate target detection in visual search. On each trial, participants fixated a specified target object as quickly as possible. The target's characteristic sound speeded the saccadic search time within 215-220 msec and also guided the initial saccade toward the target, compared with presentation of a distractor's sound or with no sound. These results suggest that object-based auditory-visual interactions rapidly increase the target object's salience in visual search.

18.
Studies of auditory localization revealed that where a subject hears a sound is dependent on both his perceived head position and the auditory cues at his ears. If an error is induced between his true and registered head posture, then errors in his auditory localizations of corresponding size and time course result. The presence of visual information prevents the development of postural errors and, consequently, prevents the development of errors in auditory localization, too. These observations are related to the oculogravic illusion and are interpreted as one aspect of the functioning of a spatial reference system involved in the maintenance of the constancies of auditory and visual detection.

19.
Previous research on visual contingent aftereffects has been concerned with examining the effects of various parameters (e.g., spatial frequency and luminance) on the adaptation to, and decay of, contingent aftereffects. The current study tested the viability of using visual contingent aftereffects in a display context. Using established characteristics of contingent aftereffects, a program of contingent aftereffect adaptation was designed. Studies were conducted to determine if subjects who were adapted to see visual contingent aftereffects invoked by a visual display could achieve more rapid or certain identification of a display under low luminance conditions. The results confirmed (a) that contingent aftereffects can improve performance on a visual discrimination task requiring information from a display and (b) that contingent aftereffects are more enhanced at low levels of illumination.

20.
Four experiments explored possible roles for working memory in sound localization. In each experiment, the angular error of localization was assessed when performed alone, or concurrently with a working-memory task. The role of the phonological slave systems in auditory localization was ruled out by Experiments 1 and 2, while an engagement of central resources was suggested by the results of Experiment 3. Experiment 4 examined the involvement of visuo-spatial systems in auditory localization and revealed impairment of localization by the concurrent spatial working-memory task. A comparison of dual-task decrement across all four studies suggests that localization places greater demand on central than on spatial resources.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司)  京ICP备09084417号