Related Articles
1.
Thresholds for auditory motion detectability were measured in a darkened anechoic chamber while subjects were adapted to horizontally moving sound sources of various velocities. All stimuli were 500-Hz lowpass noises presented at a level of 55 dBA. The threshold measure employed was the minimum audible movement angle (MAMA)--that is, the minimum angle a horizontally moving sound must traverse to be just discriminable from a stationary sound. In an adaptive, two-interval forced-choice procedure, trials occurred every 2-5 sec (Experiment 1) or every 10-12 sec (Experiment 2). Intertrial time was "filled" with exposure to the adaptor--a stimulus that repeatedly traversed the subject's front hemifield at ear level (distance: 1.7 m) at a constant velocity (-150 degrees/sec to +150 degrees/sec) during a run. Average MAMAs in the control condition, in which the adaptor was stationary (0 degrees/sec), were 2.4 degrees (Experiment 1) and 3.0 degrees (Experiment 2). Three out of 4 subjects in each experiment showed significantly elevated MAMAs (by up to 60%) with some adaptors, relative to the control condition. However, there were large intersubject differences in the shape of the MAMA versus adaptor velocity functions. This loss of sensitivity to motion that most subjects show after exposure to moving signals is probably one component underlying the auditory motion aftereffect (Grantham, 1989), in which judgments of the direction of moving sounds are biased in the direction opposite to that of a previously presented adaptor.
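
As a worked illustration of the adaptive, two-interval forced-choice procedure mentioned in the abstract above, the sketch below simulates a 2-down/1-up staircase converging on a minimum audible movement angle. The simulated listener, step size, and stopping rule are illustrative assumptions, not details taken from the study.

```python
import math
import random

def p_correct(angle_deg, true_mama=2.5, slope=1.0):
    """Hypothetical 2IFC observer: 50% guessing floor rising toward 100%
    as the traversed angle exceeds an assumed 'true' MAMA of 2.5 deg."""
    d = (angle_deg - true_mama) / slope
    return 0.5 + 0.5 / (1.0 + math.exp(-d))

def run_staircase(start_deg=8.0, step_deg=0.5, n_reversals=12, seed=1):
    """2-down/1-up adaptive track (converges near 70.7% correct).
    Returns the mean angle over the last 8 reversals as the MAMA estimate."""
    rng = random.Random(seed)
    angle, correct_streak, direction = start_deg, 0, -1
    reversals = []
    while len(reversals) < n_reversals:
        correct = rng.random() < p_correct(angle)
        if correct:
            correct_streak += 1
            if correct_streak == 2:          # two correct in a row -> make it harder
                correct_streak = 0
                if direction == +1:
                    reversals.append(angle)  # turning point
                direction = -1
                angle = max(0.25, angle - step_deg)
        else:                                # one error -> make it easier
            correct_streak = 0
            if direction == -1:
                reversals.append(angle)
            direction = +1
            angle += step_deg
    return sum(reversals[-8:]) / 8.0

print(f"Estimated MAMA: {run_staircase():.2f} deg")
```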

2.
In this paper, the auditory motion aftereffect (aMAE) was studied, using real moving sound as both the adapting and the test stimulus. The sound was generated by a loudspeaker mounted on a robot arm that was able to move quietly in three-dimensional space. A total of 7 subjects with normal hearing were tested in three experiments. The results from Experiment 1 showed a robust and reliable negative aMAE in all the subjects. After listening to a sound source moving repeatedly to the right, a stationary sound source was perceived to move to the left. The magnitude of the aMAE tended to increase with adapting velocity up to the highest velocity tested (20°/sec). The aftereffect was largest when the adapting and the test stimuli had similar spatial location and frequency content. Offsetting the locations of the adapting and the test stimuli by 20° reduced the size of the effect by about 50%. A similar decline occurred when the frequency of the adapting and the test stimuli differed by one octave. Our results suggest that the human auditory system possesses specialized mechanisms for detecting auditory motion in the spatial domain.

3.
A horizontally moving sound was presented to an observer seated in the center of an anechoic chamber. The sound, either a 500-Hz low-pass noise or a 6300-Hz high-pass noise, repeatedly traversed a semicircular arc in the observer's front hemifield at ear level (distance: 1.5 m). At 10-sec intervals this adaptor was interrupted, and a 750-msec moving probe (a 500-Hz low-pass noise) was presented from a horizontal arc 1.6 m in front of the observer. During a run, the adaptor was presented at a constant velocity (-200 degrees to +200 degrees/sec), while probes with velocities varying from -10 degrees to +10 degrees/sec were presented in a random order. Observers judged the direction of motion (left or right) of each probe. As in the case of stimuli presented over headphones (Grantham & Wightman, 1979), an auditory motion aftereffect (MAE) occurred: subjects responded "left" to probes more often when the adaptor moved right than when it moved left. When the adaptor and probe were spectrally the same, the MAE was greater than when they were from different spectral regions; the magnitude of this difference depended on adaptor speed and was subject-dependent. It is proposed that there are two components underlying the auditory MAE: (1) a generalized bias to respond that probes move in the direction opposite to that of the adaptor, independent of their spectra; and (2) a loss of sensitivity to the velocity of moving sounds after prolonged exposure to moving sounds having the same spectral content.

4.
The effects of stimulus motion on time perception were examined in five experiments. Subjects judged the durations (6–18 sec) of a series of computer-generated visual displays comprising varying numbers of simple geometrical forms. In Experiment 1, subjects reproduced the duration of displays consisting of stationary or moving (at 20 cm/sec) stimulus figures. In Experiment 2, subjects reproduced the durations of stimuli that were either stationary, moving slowly (at 10 cm/sec), or moving fast (at 30 cm/sec). In Experiment 3, subjects used the production method to generate specified durations for stationary, slow, and fast displays. In Experiments 4 and 5, subjects reproduced the duration of stimuli that moved at speeds ranging from 0 to 45 cm/sec. Each experiment showed that stimulus motion lengthened perceived time. In general, faster speeds lengthened perceived time to a greater degree than slower speeds. Varying the number of stimuli appearing in the displays had only limited effects on time judgments. Other findings indicated that shorter intervals tended to be overestimated and longer intervals underestimated (Vierordt’s law), an effect which applied to both stationary and moving stimuli. The results support a change model of perceived time, which maintains that intervals associated with more changes are perceived to be longer than intervals with fewer changes.
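
The Vierordt pattern noted in the abstract above (short intervals overestimated, long intervals underestimated) is commonly summarized by a linear regression of reproduced duration on actual duration; the formulation below uses generic coefficients, not values estimated from this study.

```latex
% Reproduced duration R as a linear function of actual duration D:
R(D) = aD + b, \qquad 0 < a < 1, \; b > 0.
% Indifference point (where reproduction is veridical), from R(D^{*}) = D^{*}:
D^{*} = \frac{b}{1-a}, \qquad
R(D) > D \ \text{for } D < D^{*}, \qquad
R(D) < D \ \text{for } D > D^{*}.
```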

5.
A fundamental problem in the study of spatial perception concerns whether and how vision might acquire information about the metric structure of surfaces in three-dimensional space from motion and from stereopsis. Theoretical analyses have indicated that stereoscopic perceptions of metric relations in depth require additional information about egocentric viewing distance; and recent experiments by James Todd and his colleagues have indicated that vision acquires only affine but not metric structure from motion—that is, spatial relations ambiguous with regard to scale in depth. The purpose of the present study was to determine whether the metric shape of planar stereoscopic forms might be perceived from congruence under planar rotation. In Experiment 1, observers discriminated between similar planar shapes (ellipses) rotating in a plane with varying slant from the frontal-parallel plane. Experimental conditions varied the presence versus absence of binocular disparities, magnification of the disparity scale, and moving versus stationary patterns. Shape discriminations were accurate in all conditions with moving patterns and were near chance in conditions with stationary patterns; neither the presence nor the magnification of binocular disparities had any reliable effect. In Experiment 2, accuracy decreased as the range of rotation decreased from 80° to 10°. In Experiment 3, small deviations from planarity of the motion produced large decrements in accuracy. In contrast with the critical role of motion in shape discrimination, motion hindered discriminations of the binocular disparity scale in Experiment 4. In general, planar motion provides an intrinsic metric scale that is independent of slant in depth and of the scale of binocular disparities. Vision is sensitive to this intrinsic optical metric.
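
For readers unfamiliar with the affine-versus-metric distinction drawn in the abstract above, the standard statement of the ambiguity (a textbook formulation, not taken from this paper) is that structure recovered from motion under parallel projection is determined only up to a stretch and shift in depth:

```latex
% Relief (affine) ambiguity: image coordinates are fixed, while depth is
% recovered only up to an unknown scaling \alpha and offset \beta along
% the line of sight.
(x,\, y,\, z) \;\longmapsto\; (x,\, y,\, \alpha z + \beta), \qquad \alpha \neq 0.
% Parallelism and (for \alpha > 0) depth order -- affine structure -- survive
% this mapping; angles and length ratios in depth -- metric structure -- do not.
```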

6.
There is growing interest in the effect of sound on visual motion perception. One model involves the illusion created when two identical objects moving towards each other on a two-dimensional visual display can be seen to either bounce off or stream through each other. Previous studies show that the large bias normally seen toward the streaming percept can be modulated by the presentation of an auditory event at the moment of coincidence. However, no reports to date provide sufficient evidence to indicate whether the sound bounce-inducing effect is due to a perceptual binding process or merely to an explicit inference resulting from the transient auditory stimulus resembling a physical collision of two objects. In the present study, we used a novel experimental design in which a subliminal sound was presented either 150 ms before, at, or 150 ms after the moment of coincidence of two disks moving towards each other. The results showed that there was an increased perception of bouncing (rather than streaming) when the subliminal sound was presented at or 150 ms after the moment of coincidence compared to when no sound was presented. These findings provide the first empirical demonstration that activation of the human auditory system without reaching consciousness affects the perception of an ambiguous visual motion display.

7.
In three visual search experiments, the processes involved in the efficient detection of motion-form conjunction targets were investigated. Experiment 1 was designed to estimate the relative contributions of stationary and moving nontargets to the search rate. Search rates were primarily determined by the number of moving nontargets; stationary nontargets sharing the target form also exerted a significant effect, but this was only about half as strong as that of moving nontargets; stationary nontargets not sharing the target form had little influence. In Experiments 2 and 3, the effects of display factors influencing the visual (form) quality of moving items (movement speed and item size) were examined. Increasing the speed of the moving items (> 1.5 degrees/sec) facilitated target detection when the task required segregation of the moving from the stationary items. When no segregation was necessary, increasing the movement speed impaired performance: With large display items, motion speed had little effect on target detection, but with small items, search efficiency declined when items moved faster than 1.5 degrees/sec. This pattern indicates that moving nontargets exert a strong effect on the search rate (Experiment 1) because of the loss of visual quality for moving items above a certain movement speed. A parallel-continuous processing account of motion-form conjunction search is proposed, which combines aspects of Guided Search (Wolfe, 1994) and attentional engagement theory (Duncan & Humphreys, 1989).
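
The "search rate" estimates discussed in the abstract above are, in practice, slopes of response time against the number of items of a given type. The sketch below shows that computation with invented response times and a plain least-squares fit; none of the numbers come from the study.

```python
# Hypothetical illustration of estimating a visual-search "search rate":
# the slope (ms per item) of response time against display size.
def ols_slope_intercept(x, y):
    """Ordinary least squares for y = slope * x + intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx, my - (sxy / sxx) * mx

# Invented mean correct RTs (ms) at three display sizes, for displays padded
# with moving vs. stationary nontargets (chosen so the stationary slope is
# roughly half the moving slope, mirroring the abstract's description).
display_size = [4, 8, 16]
rt_moving_nontargets = [620, 700, 860]
rt_stationary_nontargets = [612, 652, 732]

for label, rts in [("moving", rt_moving_nontargets),
                   ("stationary", rt_stationary_nontargets)]:
    slope, intercept = ols_slope_intercept(display_size, rts)
    print(f"{label:>10} nontargets: {slope:.1f} ms/item (intercept {intercept:.0f} ms)")
```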

8.
While “recalibration by pairing” is now generally held to be the main process responsible for adaptation to intermodal discordance, the conditions under which pairing of heteromodal data occurs in spite of a discordance have not been studied systematically. The question has been explored in the case of auditory-visual discordance. Subjects pointed at auditory targets before and after exposure to auditory and visual data from sources 20° apart in azimuth, in conditions varying by (a) the degree of realism of the context, and (b) the synchronization between auditory and visual data. In Experiment 1, the exposure conditions combined the sound of a percussion instrument (bongos) with either the image on a video monitor of the hands of the player (semirealistic situation) or diffuse light modulated by the sound (nonrealistic situation). Experiment 2 featured a voice and either the image of the face of the speaker or light modulated by the voice, and in both situations either sound and image were exactly synchronous or the sound was made to lag by 0.35 sec. Desynchronization was found to reduce adaptation significantly, while degree of realism failed to produce an effect. Answers to a question asked at the end of the testing regarding the location of the sound source suggested that the apparent fusion of the auditory and visual data (the phenomenon called “ventriloquism”) was not affected by the conditions in the same way as adaptation. In Experiment 3, subjects were exposed to the experimental conditions of Experiment 2 and were asked to report their impressions of fusion by pressing a key. The results contribute to the suggestion that pairing of registered auditory and visual locations, the hypothetical process at the basis of recalibration, may be a different phenomenon from conscious fusion.

9.
In the present investigation, the effects of spatial separation on the interstimulus onset intervals (ISOIs) that produce auditory and visual apparent motion were compared. In Experiment 1, subjects were tested on auditory apparent motion. They listened to 50-msec broadband noise pulses that were presented through two speakers separated by one of six different values between 0 degrees and 160 degrees. On each trial, the sounds were temporally separated by 1 of 12 ISOIs from 0 to 500 msec. The subjects were instructed to categorize their perception of the sounds as "single," "simultaneous," "continuous motion," "broken motion," or "succession." They also indicated the proper temporal sequence of each sound pair. In Experiments 2 and 3, subjects were tested on visual apparent motion. Experiment 2 included a range of spatial separations from 6 degrees to 80 degrees; Experiment 3 included separations from .5 degrees to 10 degrees. The same ISOIs were used as in Experiment 1. When the separations were equal, the ISOIs at which auditory apparent motion was perceived were smaller than the values that produced the same experience in vision. Spatial separation affected only visual apparent motion. For separations less than 2 degrees, the ISOIs that produced visual continuous motion were nearly equal to those which produced auditory continuous motion. For larger separations, the ISOIs that produced visual apparent motion increased.

10.
In this study, we show that the contingent auditory motion aftereffect is strongly influenced by visual motion information. During an induction phase, participants listened to rightward-moving sounds with falling pitch alternated with leftward-moving sounds with rising pitch (or vice versa). Auditory aftereffects (i.e., a shift in the psychometric function for unimodal auditory motion perception) were bigger when a visual stimulus moved in the same direction as the sound than when no visual stimulus was presented. When the visual stimulus moved in the opposite direction, aftereffects were reversed and thus became contingent upon visual motion. When visual motion was combined with a stationary sound, no aftereffect was observed. These findings indicate that there are strong perceptual links between the visual and auditory motion-processing systems.
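
The aftereffect measure referred to above, a shift in the psychometric function for auditory motion, can be quantified as the change in the velocity at which "rightward" responses cross 50% (the point of subjective stationarity, PSS). A minimal sketch with invented response proportions, not the study's data:

```python
def pss(velocities, p_right):
    """Velocity at which p('rightward') crosses 0.5, by linear interpolation."""
    for (v0, p0), (v1, p1) in zip(zip(velocities, p_right),
                                  zip(velocities[1:], p_right[1:])):
        if p0 <= 0.5 <= p1:
            return v0 + (0.5 - p0) * (v1 - v0) / (p1 - p0)
    raise ValueError("no 50% crossing in the tested range")

velocities = [-8, -4, -2, 0, 2, 4, 8]                     # deg/s, negative = leftward
baseline   = [0.02, 0.10, 0.30, 0.50, 0.72, 0.90, 0.98]   # proportion "rightward"
post_adapt = [0.01, 0.05, 0.15, 0.32, 0.55, 0.80, 0.95]   # after rightward adaptation

# Positive shift: after rightward adaptation, a physically rightward-moving
# sound is required in order to be heard as stationary.
shift = pss(velocities, post_adapt) - pss(velocities, baseline)
print(f"Aftereffect (PSS shift): {shift:+.2f} deg/s")
```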

11.
The effects of stimulus duration and spatial separation on the illusion of apparent motion in the auditory modality were examined. Two narrow-band noise sources (40 dB, A-weighted) were presented through speakers separated in space by 2.5°, 5°, or 10°, centered about the subject’s midline. The duration of each stimulus was 5, 10, or 50 msec. On each trial, the sound pair was temporally separated by 1 of 10 interstimulus onset intervals (ISOIs): 0, 2, 4, 6, 8, 10, 15, 20, 50, or 70 msec. Five subjects were tested in nine trial blocks; each block represented a particular spatial-separation-duration combination. Within a trial block, each ISOI was presented 30 times, in random order. Subjects were instructed to listen to the stimulus sequence and classify their perception of the sound into one of five categories: single sound, simultaneous sounds, continuous motion, broken motion, or successive sounds. Each subject was also required to identify the location of the first-occurring stimulus (left or right). The percentage of continuous-motion responses was significantly affected by the ISOI [F(9,36) = 5.67, p < .001], the duration × ISOI interaction [F(18,72) = 3.54, p < .0001], and the separation × duration × ISOI interaction [F(36,144) = 1.51, p < .05]. The results indicate that a minimum duration is required for the perception of auditory apparent motion. Little or no motion was reported at durations of 10 msec or less. At a duration of 50 msec, motion was reported most often for ISOIs of 20–50 msec. The effect of separation appeared to be limited to durations and ISOIs during which little motion was perceived.

12.
The judged vanishing point of a target undergoing apparent motion in a horizontal, vertical, or oblique direction was examined. In Experiment 1, subjects indicated the vanishing point by positioning a crosshair. Judged vanishing point was displaced forward in the direction of motion, with the magnitude of displacement being largest for horizontal motion, intermediate for oblique motion, and smallest for vertical motion. In addition, the magnitude of displacement increased with faster apparent velocities. In Experiment 2, subjects judged whether a stationary probe presented after the moving target vanished was at the same location where the moving target vanished. Probes were located along the axis of motion, and probes located beyond the vanishing point evidenced a higher probability of a "same" response than did probes behind the vanishing point. In Experiment 3, subjects judged whether a stationary probe presented after the moving target vanished was located on a straight-line extension of the path of motion of the moving target. Probes below the path of motion evidenced a higher probability of a "same" response than did probes above the path of motion for horizontal and ascending oblique motion; probes above the path of motion evidenced a higher probability of a "same" response than did probes below the path of motion for descending oblique motion. Overall, the pattern of results suggests that the magnitude of displacement increases as proximity to a horizontal axis increases, and that in some conditions there may be a component analogous to a gravitational influence incorporated into the mental representation.

13.
In representational momentum (RM), the final position of a moving target is mislocalized in the direction of motion. Here, the effect of a concurrent sound on visual RM was demonstrated. A visual stimulus moved horizontally and disappeared at unpredictable positions. A complex tone without any motion cues was presented continuously from the beginning of the visual motion. As compared with a silent condition, the RM magnitude increased when the sound lasted longer than the visual motion and decreased when it did not last as long. However, the RM was unchanged when a brief complex tone was presented before or after the target disappeared (Experiment 2) or when the onset of the long-lasting sound was not synchronized with that of the visual motion (Experiments 3 and 4). These findings suggest that visual motion representation can be modulated by a sound if the visual motion information is firmly associated with the auditory information.

14.
Observers were adapted to simulated auditory movement produced by dynamically varying the interaural time and intensity differences of tones (500 or 2,000 Hz) presented through headphones. At 10-sec intervals during adaptation, various probe tones were presented for 1 sec (the frequency of the probe was always the same as that of the adaptation stimulus). Observers judged the direction of apparent movement (“left” or “right”) of each probe tone. At 500 Hz, with a 200-deg/sec adaptation velocity, “stationary” probe tones were consistently judged to move in the direction opposite to that of the adaptation stimulus. We call this result an auditory motion aftereffect. In slower velocity adaptation conditions, progressively less aftereffect was demonstrated. In the higher frequency condition (2,000 Hz, 200-deg/sec adaptation velocity), we found no evidence of motion aftereffect. The data are discussed in relation to the well-known visual analog, the “waterfall effect.” Although the auditory aftereffect is weaker than the visual analog, the data suggest that auditory motion perception might be mediated, as is generally believed for the visual system, by direction-specific movement analyzers.
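
The abstract above describes motion simulated over headphones by dynamically varying interaural time and intensity differences. A rough sketch of such a stimulus for a pure tone is given below; the azimuth-to-ITD/ILD mappings, sweep range, and durations are illustrative assumptions, not the authors' actual values.

```python
# Illustrative binaural sweep: a pure tone whose interaural time difference
# (ITD, implemented as an interaural phase difference) and interaural level
# difference (ILD) change over time to simulate left-to-right motion.
import numpy as np

fs = 44100                      # sample rate (Hz)
f = 500.0                       # tone frequency (Hz)
dur = 2.0                       # duration of one sweep (s)
t = np.arange(int(fs * dur)) / fs

# Sweep a nominal azimuth linearly from -90 deg (left) to +90 deg (right).
az = np.linspace(-90.0, 90.0, t.size)

# Very rough cue mappings (not a head model): up to ~0.7 ms ITD, ~10 dB ILD.
itd = 0.0007 * np.sin(np.deg2rad(az))
ild_db = 10.0 * np.sin(np.deg2rad(az))

# For a pure tone, an ITD is equivalent to an interaural phase difference.
phase = 2 * np.pi * f * t
ipd = 2 * np.pi * f * itd
left = 10 ** (-ild_db / 40) * np.sin(phase - ipd / 2)   # quieter, lagging for az > 0
right = 10 ** (+ild_db / 40) * np.sin(phase + ipd / 2)  # louder, leading for az > 0

stereo = np.stack([left, right], axis=1)  # write to a WAV file to listen
```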

15.
The effect of full-field sinusoidal visual roll motion stimuli of various frequencies and peak velocities upon the orientation of subjective visual vertical (SV) was studied. The angle of the optokinetically induced displacement of SV was found to be a linear function of the logarithm of the stimulus oscillation angle. Interindividual slopes of this function varied between 2 and 9. The logarithmic function is independent of stimulus frequency within the range of .02 Hz to .5 Hz and of peak stimulus velocity from 7.5°/sec to 170°/sec. It holds for oscillation angles up to 100°–140°. With larger rotational angles, saturation is reached. With small stimulus angles, a surprisingly high threshold (5°–8°) was observed in our experiments. This may reflect the unphysiological combination of visual roll stimuli without corroborating vestibular and proprioceptive inputs normally present when body sway produces visual stimulation. Under natural conditions, the visual feedback about spontaneous sway stabilizes body posture by integrating rotational velocity over stimulus duration, which is equal to the rotational angle.
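
The "linear function of the logarithm of the stimulus oscillation angle" reported above can be written compactly as follows; k is the individual slope (reported to lie between 2 and 9), c a generic intercept, and the relation is stated only for angles between the observed threshold and the saturation range.

```latex
% Optokinetically induced displacement of the subjective visual vertical (SV)
% as a function of the roll-oscillation angle A of the visual stimulus:
\Delta\mathrm{SV} = k \log A + c, \qquad k \approx 2\text{--}9,
% reported to hold for A from roughly 5\text{--}8^{\circ} (threshold) up to
% about 100\text{--}140^{\circ} (saturation), independent of oscillation
% frequency (0.02\text{--}0.5\ \mathrm{Hz}) and peak velocity
% (7.5\text{--}170^{\circ}/\mathrm{s}).
```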

16.
Perceived movement of a stationary visual stimulus during head motion was measured before and after adaptation intervals during which participants performed voluntary head oscillations while viewing a moving spot. During these intervals, participants viewed the spot stimulus moving alternately in the same direction as the head was moving during either .25- or 2.0-Hz oscillations, and then in the opposite direction as the head at the other of the two frequencies. Postadaptation measures indicated that the visual stimuli were perceived as stationary only if traveling in the same direction as that viewed during adaptation at the same frequency of head motion. Thus, opposite directions of spot motion were perceived as stationary following adaptation depending on head movement frequency. The results provide an example of the ability to establish dual (or “context-specific”) adaptations to altered visual-vestibular feedback.

17.
Unidirectional motion of a uniplanar background induces a codirectional postural sway. It has been shown recently that fixation of a stationary foreground object induces a sway response in the opposite direction (Bronstein & Buckwell, 1997) when the background moves transiently. The present study investigated factors determining this contradirectional postural response. In the experiments presented, center of foot pressure and head displacements were recorded from normal subjects. The subjects faced a visual background of 2 × 3 m, at a distance of 1.5 m, which could be moved parallel to the interaural axis. Results showed that when the visual scene consisted solely of a moving background, the conventional codirectional postural response was elicited. When subjects were asked to fixate an earth-fixed foreground (window frame) placed between them and the moving background, a consistent postural response in the opposite direction to background motion was observed. In addition, we showed that this contradirectional postural response was not transient but was sustained for the 11 sec of background motion. We investigated whether this contradirectional postural response was the consequence of the induced movement of the foreground by background motion. Although induced movement was verbally reported by subjects when viewing an earth-fixed target projected onto the moving background, the contradirectional sway did not occur. These results indicate that foreground-background separation in depth, rather than induced movement, was necessary for the contradirectional postural response to occur. Another experiment showed that, when the fixated foreground was attached to the head of the observer, the contradirectional sway was not observed and was therefore unrelated to vergence. Finally, results showed that the contradirectional postural response was, in the main, monocularly mediated. We conclude that the direction of the postural sway produced by a moving background in a three-dimensional environment is determined primarily by motion parallax.

18.
Motion information available to different sensory modalities can interact at both perceptual and post-perceptual (i.e., decisional) stages of processing. However, to date, researchers have only been able to demonstrate the influence of one of these components at any given time; hence, the relationship between them remains uncertain. We addressed the interplay between the perceptual and post-perceptual components of information processing by assessing their influence on performance within the same experimental paradigm. We used signal detection theory to discriminate changes in perceptual sensitivity (d') from shifts in response criterion (c) in performance on a detection (Experiment 1) and a classification (Experiment 2) task regarding the direction of auditory apparent motion streams presented in noise. In the critical conditions, a visual motion distractor moving either leftward or rightward was presented together with the auditory motion. The results demonstrated a significant decrease in sensitivity to the direction of the auditory targets in the crossmodal conditions as compared to the unimodal baseline conditions that was independent of the relative direction of the visual distractor. In addition, we observed significant shifts in response criterion, which were dependent on the relative direction of the distractor apparent motion. These results support the view that the perceptual and decisional components involved in audiovisual interactions in motion processing can coexist but are largely independent of one another.
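
The signal detection quantities used above, sensitivity d' and criterion c, are computed from hit and false-alarm rates in the standard way; the rates in the sketch below are invented for illustration and are not the study's data.

```python
# Standard signal-detection computations:
# d' = z(H) - z(F),  c = -(z(H) + z(F)) / 2, where z is the inverse-normal CDF.
from statistics import NormalDist

def dprime_and_criterion(hit_rate, fa_rate):
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate), -(z(hit_rate) + z(fa_rate)) / 2

# Invented rates for a unimodal baseline and a crossmodal (visual distractor)
# condition, illustrating lower d' with a criterion shift in the latter.
for label, (h, f) in {"unimodal": (0.85, 0.20), "crossmodal": (0.75, 0.30)}.items():
    d, c = dprime_and_criterion(h, f)
    print(f"{label:>10}: d' = {d:.2f}, c = {c:+.2f}")
```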

19.
Using straight translatory motion of a visual peripheral cue in the frontoparallel plane, and probing target discrimination at different positions along the cue's motion trajectory, we found that target orientation discrimination was slower for targets presented at or near the position of motion onset (4.2° off centre), relative to the onset of a static cue (Experiment 1), and relative to targets presented further along the motion trajectory (Experiments 1 and 2). Target discrimination was equally fast and accurate in the moving cue conditions relative to static cue conditions at positions further along the cue's motion trajectory (Experiment 1). Moreover, target orientation discrimination was not slowed at the same position, once this position was no longer the motion onset position (Experiment 3), and performance in a target colour-discrimination task was not slowed even at motion onset (Experiment 4). Finally, we found that the onset location of the motion cue was perceived as being shifted in the direction of the cue's motion (Experiment 5). These results indicate that attention cannot be as quickly or precisely shifted to the onset of a motion stimulus as to other positions on a stimulus's motion trajectory.
