Similar Literature
20 similar records found (search time: 31 ms)
1.
The categorization and identification of previously ignored visual or auditory stimuli are typically slowed down, a phenomenon that has been called the negative priming effect and can be explained by the episodic retrieval of response-inadequate prime information and/or an inhibitory model. A similar after-effect has been found in visuospatial tasks: participants are slowed down in localizing a visual stimulus that appears at a previously ignored location. In the auditory modality, however, such an after-effect of ignoring a sound at a specific location has never been reported. Instead, participants are impaired in their localization performance when the sound at the previously ignored location changes identity, a finding that is compatible with the so-called feature-mismatch hypothesis. Here, we describe the properties of auditory spatial negative priming in contrast to its visuospatial counterpart and report two experiments that specify the nature of this auditory after-effect. Experiment 1 shows that the detection of identity-location mismatches is a genuinely auditory phenomenon that can be replicated even when the sound sources are invisible. Experiment 2 reveals that the detection of sound-identity mismatches in the probe depends on the processing demands in the prime. This finding implies that the localization of irrelevant sound sources is not the inevitable consequence of processing the auditory prime scenario but depends on the difficulty of the target search process among distractor sounds.

2.
The attention network test (ANT) assesses efficiency across alerting, orienting, and executive components of visual attention. This study examined approaches to assessing auditory attention networks, and performance was compared to the visual ANT. Results showed (1) alerting was sufficiently elicited in a pitch discrimination and sound localization task, although these effects were unrelated, (2) weak orienting of attention was elicited through pitch discrimination, which varied based on ISI and conflict level, but robust orienting of attention was found through sound localization, and (3) executive control was sufficiently assessed in both pitch discrimination and sound localization tasks, but these effects were unrelated. Correlation analysis suggested that, unlike alerting and orienting, sound localization auditory executive control functions tap a shared attention network system. Overall, the results suggest that auditory ANT measures are largely task and modality specific, with sound localization offering potential to assess all three attention networks in a single task.

3.
Previous research has demonstrated that the localization of auditory or tactile stimuli can be biased by the simultaneous presentation of a visual stimulus from a different spatial position. We investigated whether auditory localization judgments could also be affected by the presentation of spatially displaced tactile stimuli, using a procedure designed to reveal perceptual interactions across modalities. Participants made left-right discrimination responses regarding the perceived location of sounds, which were presented either in isolation or together with tactile stimulation to the fingertips. The results demonstrate that the apparent location of a sound can be biased toward tactile stimulation when it is synchronous, but not when it is asynchronous, with the auditory event. Directing attention to the tactile modality did not increase the bias of sound localization toward synchronous tactile stimulation. These results provide the first demonstration of the tactile capture of audition.

4.
Previous research has demonstrated that the localization of auditory or tactile stimuli can be biased by the simultaneous presentation of a visual stimulus from a different spatial position. We investigated whether auditory localization judgments could also be affected by the presentation of spatially displaced tactile stimuli, using a procedure designed to reveal perceptual interactions across modalities. Participants made left-right discrimination responses regarding the perceived location of sounds, which were presented either in isolation or together with tactile stimulation to the fingertips. The results demonstrate that the apparent location of a sound can be biased toward tactile stimulation when it is synchronous, but not when it is asynchronous, with the auditory event. Directing attention to the tactile modality did not increase the bias of sound localization toward synchronous tactile stimulation. These results provide the first demonstration of the tactile capture of audition.

5.
Studies of auditory localization revealed that where a subject hears a sound is dependent on both his perceived head position and the auditory cues at his ears. If an error is induced between his true and registered head posture, then errors in his auditory localizations of corresponding size and time course result. The presence of visual information prevents the development of postural errors and, consequently, prevents the development of errors in auditory localization, too. These observations are related to the oculogravic illusion and are interpreted as one aspect of the functioning of a spatial reference system involved in the maintenance of the constancies of auditory and visual detection.

6.
In the ventriloquism aftereffect, brief exposure to a consistent spatial disparity between auditory and visual stimuli leads to a subsequent shift in subjective sound localization toward the positions of the visual stimuli. Such rapid adaptive changes probably play an important role in maintaining the coherence of spatial representations across the various sensory systems. In the research reported here, we used event-related potentials (ERPs) to identify the stage in the auditory processing stream that is modulated by audiovisual discrepancy training. Both before and after exposure to synchronous audiovisual stimuli that had a constant spatial disparity of 15°, participants reported the perceived location of brief auditory stimuli that were presented from central and lateral locations. In conjunction with a sound localization shift in the direction of the visual stimuli (the behavioral ventriloquism aftereffect), auditory ERPs as early as 100 ms poststimulus (N100) were systematically modulated by the disparity training. These results suggest that cross-modal learning was mediated by a relatively early stage in the auditory cortical processing stream.

7.
Correctly integrating sensory information across different modalities is a vital task, yet there are illusions which cause the incorrect localization of multisensory stimuli. A common example of these phenomena is the "ventriloquism effect". In this illusion, the localization of auditory signals is biased by the presence of visual stimuli. For instance, when a light and sound are simultaneously presented, observers may erroneously locate the sound closer to the light than its actual position. While this phenomenon has been studied extensively in azimuth at a single depth, little is known about the interactions of stimuli at different depth planes. In the current experiment, virtual acoustics and stereo-image displays were used to test the integration of visual and auditory signals across azimuth and depth. The results suggest that greater variability in the localization of sounds in depth may lead to a greater bias from visual stimuli in depth than in azimuth. These results offer interesting implications for understanding multisensory integration.
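
The pattern reported above is consistent with reliability-weighted cue combination, in which the noisier estimate is pulled more strongly toward the more reliable one. The following Python sketch illustrates that standard weighting rule; it is not the analysis used in the study, and all variance values are hypothetical.

    # Reliability-weighted audio-visual integration (illustrative sketch only).
    # Assumes independent Gaussian cue noise; all numbers are hypothetical.
    def integrate(auditory_pos, visual_pos, var_auditory, var_visual):
        """Return the combined position estimate and the weight given to vision."""
        w_visual = var_auditory / (var_auditory + var_visual)  # noisier audition -> larger visual weight
        combined = w_visual * visual_pos + (1.0 - w_visual) * auditory_pos
        return combined, w_visual

    # Azimuth: auditory localization comparatively precise -> moderate visual bias.
    print(integrate(auditory_pos=0.0, visual_pos=5.0, var_auditory=2.0, var_visual=1.0))
    # Depth: auditory localization far more variable -> near-complete visual capture.
    print(integrate(auditory_pos=0.0, visual_pos=5.0, var_auditory=20.0, var_visual=1.0))

Under this assumption, the greater auditory variability in depth directly translates into the larger visual bias the abstract describes.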

8.
The effect of a background sound on the auditory localization of a single sound source was examined. Nine loudspeakers were arranged crosswise in the horizontal and the median vertical plane. They ranged from -20 degrees to +20 degrees, with the center loudspeaker at 0 degrees azimuth and elevation. Using vertical and horizontal centimeter scales, listeners verbally estimated the position of a 500-ms broadband noise stimulus being presented at the same time as a 2-s background sound, emitted by one of the four outer loudspeakers. When the background sound consisted of continuous broadband noise, listeners consistently shifted the apparent target positions away from the background sound locations. This auditory contrast effect, which is consistent with earlier findings, equally occurred in both planes. But when the background sound was changed to a pulse train of noise bursts, the contrast effect decreased in the horizontal plane and increased in the vertical plane. This discrepancy might be due to general differences in the processing of interaural and spectral localization information.

9.
In order to pinpoint the location of a sound source, we make use of a variety of spatial cues that arise from the direction-dependent manner in which sounds interact with the head, torso and external ears. Accurate sound localization relies on the neural discrimination of tiny differences in the values of these cues and requires that the brain circuits involved be calibrated to the cues experienced by each individual. There is growing evidence that the capacity for recalibrating auditory localization continues well into adult life. Many details of how the brain represents auditory space and of how those representations are shaped by learning and experience remain elusive. However, it is becoming increasingly clear that the task of processing auditory spatial information is distributed over different regions of the brain, some working hierarchically, others independently and in parallel, and each apparently using different strategies for encoding sound source location.
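
As a point of reference (not taken from the review above), the best known of these binaural cues, the interaural time difference, can be approximated with the classical spherical-head (Woodworth) formula; the head radius and speed of sound used in this sketch are nominal values.

    # Interaural time difference under the spherical-head (Woodworth) approximation.
    # Illustrative only; head radius and speed of sound are nominal values.
    import math

    def itd_seconds(azimuth_deg, head_radius_m=0.0875, speed_of_sound_m_s=343.0):
        """Approximate ITD for a source at the given azimuth (0 = straight ahead)."""
        theta = math.radians(azimuth_deg)
        return (head_radius_m / speed_of_sound_m_s) * (theta + math.sin(theta))

    print(f"{itd_seconds(10.0) * 1e6:.0f} microseconds")  # roughly 90 microseconds at 10 degrees
    print(f"{itd_seconds(90.0) * 1e6:.0f} microseconds")  # roughly 660 microseconds at 90 degrees

The sub-millisecond scale of these values is what the review means by the neural discrimination of tiny cue differences.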

10.
Several studies have shown that handedness has an impact on visual spatial abilities. Here we investigated the effect of laterality on auditory space perception. Participants (33 right-handers, 20 left-handers) completed two tasks of sound localization. In a dark, anechoic, and sound-proof room, sound stimuli (broadband noise) were presented via 21 loudspeakers mounted horizontally (from 80° on the left to 80° on the right). Participants had to localize the target either by using a swivel hand-pointer or by head-pointing. Individual lateral preferences of eye, ear, hand, and foot were obtained using a questionnaire. With both pointing methods, participants showed a bias in sound localization that was to the side contralateral to the preferred hand, an effect that was unrelated to their overall precision. This partially parallels findings in the visual modality as left-handers typically have a more rightward bias in visual line bisection compared with right-handers. Despite the differences in neural processing of auditory and visual spatial information these findings show similar effects of lateral preference on auditory and visual spatial perception. This suggests that supramodal neural processes are involved in the mechanisms generating laterality in space perception.
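
To make the distinction between the reported bias and overall precision concrete, here is a minimal sketch with hypothetical pointing data: the bias is the mean signed error, while precision is the spread of the errors around that mean.

    # Separating localization bias (constant error) from precision (variable error).
    # Data are hypothetical; negative azimuths are to the left, positive to the right.
    from statistics import mean, stdev

    targets   = [-40.0, -20.0, 0.0, 20.0, 40.0]     # loudspeaker azimuths (deg)
    responses = [-42.5, -23.0, -2.0, 18.5, 37.0]    # pointing responses with a slight leftward shift

    errors = [r - t for r, t in zip(responses, targets)]
    bias = mean(errors)        # signed constant error: the direction of the shift
    precision = stdev(errors)  # variable error: how consistent the pointing is

    print(f"bias = {bias:+.1f} deg, precision (SD) = {precision:.1f} deg")

A group can thus show a reliable contralateral bias even when individual pointing precision is unaffected, which is the dissociation the abstract reports.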

11.
Multisensory-mediated auditory localization
Multisensory integration is a powerful mechanism for maximizing sensitivity to sensory events. We examined its effects on auditory localization in healthy human subjects. The specific objective was to test whether the relative intensity and location of a seemingly irrelevant visual stimulus would influence auditory localization in accordance with the inverse effectiveness and spatial rules of multisensory integration that have been developed from neurophysiological studies with animals [Stein and Meredith, 1993 The Merging of the Senses (Cambridge, MA: MIT Press)]. Subjects were asked to localize a sound in conditions in which a neutral visual stimulus was either above threshold (supra-threshold) or at threshold. In both cases the spatial disparity of the visual and auditory stimuli was systematically varied. The results reveal that stimulus salience is a critical factor in determining the effect of a neutral visual cue on auditory localization. Visual bias and, hence, perceptual translocation of the auditory stimulus appeared when the visual stimulus was supra-threshold, regardless of its location. However, this was not the case when the visual stimulus was at threshold. In this case, the influence of the visual cue was apparent only when the two cues were spatially coincident and resulted in an enhancement of stimulus localization. These data suggest that the brain uses multiple strategies to integrate multisensory information.

12.
Two experiments are reported with identical auditory stimulation in three-dimensional space but with different instructions. Participants localized a cued sound (Experiment 1) or identified a sound at a cued location (Experiment 2). A distractor sound at another location had to be ignored. The prime distractor and the probe target sound were manipulated with respect to sound identity (repeated vs. changed) and location (repeated vs. changed). The localization task revealed a symmetric pattern of partial repetition costs: Participants were impaired on trials with identity-location mismatches between the prime distractor and probe target; that is, when either the sound was repeated but not the location or vice versa. The identification task revealed an asymmetric pattern of partial repetition costs: Responding was slowed down when the prime distractor sound was repeated as the probe target, but at another location; identity changes at the same location were not impaired. Additionally, there was evidence of retrieval of incompatible prime responses in the identification task. It is concluded that feature binding of auditory prime distractor information takes place regardless of whether the task is to identify or locate a sound. Instructions determine the kind of identity-location mismatch that is detected. Identity information predominates over location information in auditory memory.

13.
In a previous paper, experiments were reported which demonstrated that human subjects can judge accurately the azimuthal direction of sounds using a tactile localization device. It was also demonstrated that a tactile analogue of selective auditory attention was possible with this system. Three additional experiments, reported here, indicate that subjects are also able to judge the distance of the sound source and can concurrently judge both the azimuthal direction and distance of a source. Comparisons were made between conditions where head movements were permitted (active) and conditions where the head was held still (passive), and between normal auditory judgments and tactile judgments. Active tactile performance was essentially similar to auditory performance. Active performance was superior to passive in both directional and distance judgment, but different components of the motor-sensory complex were found to contribute to active superiority in the two tasks. The implication of these experiments for the design of auditory prosthetic devices is discussed.

14.
In auditory localization experiments, where the subject observes from a fixed position, both relative sound intensity and arrival time at the two ears determine the extent of localization performance. The present experiment investigated the role of binaural cues in a different context, the sound-position discrimination task, where the subject is free to move and interact with the sound source. The role of binaural cues was investigated in rats by producing an interaural imbalance through unilateral removal of the middle auditory ossicle (incus) prior to discrimination training. Discrete trial go-right/go-left sound-position discrimination of unilaterally incudectomised rats was then compared with that of normal rats and of rats with the incus of both sides removed. While bilateral incus removal affected binaural intensity and arrival times, the symmetry of sound input between the two ears was preserved. Percentage of correct responses and videotaped observations of sound approach and exploration showed that the unilateral rats failed to localize the sounding speaker. Rats with symmetrical binaural input (normal and bilaterally incudectomised rats) accurately discriminated sound position for the duration of the experiment. Previously reported monaural localization based upon following the intensity gradient to the sound source was not observed in the unilaterally incudectomised rats of the present experiment. It is concluded that sound-position discrimination depends upon the use of binaural cues.

15.
Exposure to synchronous but spatially discordant auditory and visual inputs produces, beyond immediate cross-modal biases, adaptive recalibrations of the respective localization processes that manifest themselves in aftereffects. Such recalibrations probably play an important role in maintaining the coherence of spatial representations across the various spatial senses. The present study is part of a research program focused on the way recalibrations generalize to stimulus values different from those used for adaptation. Considering the case of sound frequency, we recently found that, in contradiction with an earlier report, auditory aftereffects generalize nearly entirely across two octaves. In this new experiment, participants were adapted to an 18° auditory-visual discordance with either 400 or 6400 Hz tones, and their subsequent sound localization was tested across this whole four-octave frequency range. Substantial aftereffects, decreasing significantly with increasing difference between test and adapter frequency, were obtained at all combinations of adapter and test frequency. Implications of these results concerning the functional site at which visual recalibration of auditory localization might take place are discussed.

16.
Rapid adaptation to auditory-visual spatial disparity
The so-called ventriloquism aftereffect is a remarkable example of rapid adaptive changes in spatial localization caused by visual stimuli. After exposure to a consistent spatial disparity of auditory and visual stimuli, localization of sound sources is systematically shifted to correct for the deviation of the sound from visual positions during the previous adaptation period. In the present study, this aftereffect was induced by presenting, within 17 min, 1800 repetitive noise or pure-tone bursts in combination with synchronized flashing light spots at a 20° disparity, in total darkness. Post-adaptive sound localization, measured by a method of manual pointing, was significantly shifted 2.4° (noise), 3.1° (1 kHz tones), or 5.8° (4 kHz tones) compared with the pre-adaptation condition. There was no transfer across frequencies; that is, shifts in localization were insignificant when the frequencies used for adaptation and the post-adaptation localization test were different. It is hypothesized that these aftereffects may rely on shifts in neural representations of auditory space with respect to those of visual space, induced by intersensory spatial disparity, and may thus reflect a phenomenon of neural short-term plasticity.
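
As a numerical illustration of how such a frequency-specific aftereffect is quantified (with hypothetical numbers, not the study's data): the shift is the post- minus pre-adaptation mean pointing response, computed separately for each test frequency.

    # Frequency-specific ventriloquism aftereffect: post minus pre localization shift.
    # All numbers are hypothetical and only illustrate the computation.
    pre  = {"1 kHz": [0.3, -0.5, 0.1], "4 kHz": [0.2, 0.0, -0.1]}   # pointing errors (deg) before adaptation
    post = {"1 kHz": [3.2,  2.8, 3.4], "4 kHz": [0.4, 0.1,  0.2]}   # after adapting with 1 kHz tones

    for freq in pre:
        shift = sum(post[freq]) / len(post[freq]) - sum(pre[freq]) / len(pre[freq])
        print(f"{freq}: aftereffect = {shift:+.1f} deg")
    # Only the adapted frequency shows a shift, mirroring the reported lack of transfer.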

17.
Studies of reactions to audiovisual spatial conflict (alias “ventriloquism”) are generally presented as informing on the processes of intermodal coordination. However, most of the literature has failed to isolate genuine perceptual effects from voluntary postperceptual adjustments. A new approach, based on psychophysical staircases, is applied to the case of the immediate visual bias of auditory localization. Subjects have to judge the apparent origin of stereophonically controlled sound bursts as left or right of a median reference line. Successive trials belong to one of two staircases, starting respectively at extreme left and right locations, and are moved progressively toward the median on the basis of the subjects’ responses. Response reversals occur for locations farther away from center when a central lamp is flashed in synchrony with the bursts than without flashes (Experiment 1), revealing an attraction of the sounds toward the flashes. The effect cannot originate in voluntary postperceptual decision, since the occurrence of response reversal implies that the subject is uncertain concerning the direction of the target sound. The attraction is contingent on sound-flash synchronization, for early response reversals did no longer occur when the inputs from the two modalities were desynchronized (Experiment 2). Taken together, the results show that the visual bias of auditory localization observed repeatedly in less controlled conditions is due partly at least to an automatic attraction of the apparent location of sound by spatially discordant but temporally correlated visual inputs.  相似文献   
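
The staircase logic described above lends itself to a compact simulation. The following Python sketch is not the authors' procedure or code; the pull and noise parameters are hypothetical, and it only illustrates why an attraction toward a central flash makes response reversals occur farther from the median.

    # Staircase sketch: step a sound toward the median plane until the listener's
    # left/right judgment reverses. Pull and noise parameters are hypothetical.
    import random

    def run_staircase(start_deg, step_deg=2.0, pull_toward_flash=0.0, noise_sd=3.0):
        """Return the azimuth at which the first response reversal occurs."""
        position = start_deg
        step = -step_deg if start_deg > 0 else step_deg    # move toward the median plane
        first_response = None
        while True:
            # A synchronous central flash pulls the perceived azimuth toward 0 degrees.
            percept = (1.0 - pull_toward_flash) * position + random.gauss(0.0, noise_sd)
            response = "right" if percept > 0 else "left"
            if first_response is None:
                first_response = response
            elif response != first_response:
                return position                             # first reversal ends the run
            position += step

    random.seed(0)
    for pull in (0.0, 0.6):
        reversals = [abs(run_staircase(30.0, pull_toward_flash=pull)) for _ in range(200)]
        print(f"pull={pull}: mean reversal azimuth = {sum(reversals) / len(reversals):.1f} deg")
    # With the simulated flash attraction, reversals occur farther from center on average.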

18.
When you are looking for an object, does hearing its characteristic sound make you find it more quickly? Our recent results supported this possibility by demonstrating that when a cat target, for example, was presented among other objects, a simultaneously presented “meow” sound (containing no spatial information) reduced the manual response time for visual localization of the target. To extend these results, we determined how rapidly an object-specific auditory signal can facilitate target detection in visual search. On each trial, participants fixated a specified target object as quickly as possible. The target’s characteristic sound speeded the saccadic search time within 215–220 msec and also guided the initial saccade toward the target, compared with presentation of a distractor’s sound or with no sound. These results suggest that object-based auditory-visual interactions rapidly increase the target object’s salience in visual search.

19.
Four experiments explored possible roles for working memory in sound localization. In each experiment, the angular error of localization was assessed when performed alone, or concurrently with a working-memory task. The role of the phonological slave systems in auditory localization was ruled out by Experiments 1 and 2, while an engagement of central resources was suggested by the results of Experiment 3. Experiment 4 examined the involvement of visuo-spatial systems in auditory localization and revealed impairment of localization by the concurrent spatial working-memory task. A comparison of dual-task decrement across all four studies suggests that localization places greater demand on central than on spatial resources.

20.
It is unclear from current accounts of working memory which, if any, of its components might be involved in our ability to specify the location of a sound source. A series of studies were performed to assess the degree of interference in localization of broadband noise, by a concurrent articulatory suppression (articulatory loop—Experiment 1), serial recall (phonological store and articulatory loop—Experiment 2), and Paced Visual Serial Addition Test (central executive—Experiment 3). No significant disruption of auditory localization was revealed by the first two experiments, ruling out a role for the phonological loop in auditory localization. In Experiment 3, a large degree of error was exhibited in localization, when performed concurrently with the addition task, indicating a requirement for central resources. This suggestion is confirmed by comparison of localization performance across all three studies, which demonstrates a clear deterioration in performance as the demand of concurrent tasks on central resources increases. Finally, concurrent localization was shown to disrupt the primacy portion of the serial position curve, as well as performance on the Paced Visual Serial Addition Test.

