Similar Articles
20 similar articles found (search time: 31 ms)
1.
Three experiments investigating the basis of induced motion are reported. The first experiment explored the proposition that induced motion is based on visual capture of eye-position information and is therefore a subject-relative, rather than an object-relative, motion. Observers made saccades to an invisible auditory stimulus after fixating a stationary stimulus in which motion was induced. The remaining two experiments explored whether perceived induced motion produces a straight-ahead shift; here the critical eye movement was directed to apparent straight ahead. Because these saccades partially compensated for the apparent displacement of the induction stimulus, whereas saccades to the auditory stimulus did not, we conclude that induced motion is not based on oculomotor visual capture. Rather, it is accompanied by a shift in the judged direction of straight ahead, an instance of the straight-ahead shift. The results support an object-relative theory of induced motion.

2.
Several studies have shown that the direction in which a visual apparent motion stream moves can influence the perceived direction of an auditory apparent motion stream (an effect known as crossmodal dynamic capture). However, little is known about the role that intramodal perceptual grouping processes play in the multisensory integration of motion information. The present study was designed to investigate the time course of any modulation of the crossmodal dynamic capture effect by the nature of the perceptual grouping taking place within vision. Participants were required to judge the direction of an auditory apparent motion stream while trying to ignore visual apparent motion streams presented in a variety of different configurations. Our results demonstrate that the crossmodal dynamic capture effect was influenced more by visual perceptual grouping when the conditions for intramodal perceptual grouping were set up prior to the presentation of the audiovisual apparent motion stimuli. However, no such modulation occurred when the visual perceptual grouping manipulation was established at the same time as or after the presentation of the audiovisual stimuli. These results highlight the importance of the unimodal perceptual organization of sensory information to the manifestation of multisensory integration.

3.
The mappings from grapheme to phoneme are much less consistent in English than they are for most other languages. Therefore, the differences found between English-speaking dyslexics and controls on sensory measures of temporal processing might be related more to the irregularities of English orthography than to a general deficit affecting reading ability in all languages. However, here we show that poor readers of Norwegian, a language with a relatively regular orthography, are less sensitive than controls to dynamic visual and auditory stimuli. Consistent with results from previous studies of English readers, detection thresholds for visual motion and auditory frequency modulation (FM) were significantly higher in 19 poor readers of Norwegian than in 22 control readers of the same age. Over two-thirds (68.4%) of the children identified as poor readers were less sensitive than controls to the visual coherent-motion stimulus, the auditory 2 Hz FM stimulus, or both.

4.
In the present investigation, the effects of spatial separation on the interstimulus onset intervals (ISOIs) that produce auditory and visual apparent motion were compared. In Experiment 1, subjects were tested on auditory apparent motion. They listened to 50-msec broadband noise pulses that were presented through two speakers separated by one of six different values between 0 degrees and 160 degrees. On each trial, the sounds were temporally separated by 1 of 12 ISOIs from 0 to 500 msec. The subjects were instructed to categorize their perception of the sounds as "single," "simultaneous," "continuous motion," "broken motion," or "succession." They also indicated the proper temporal sequence of each sound pair. In Experiments 2 and 3, subjects were tested on visual apparent motion. Experiment 2 included a range of spatial separations from 6 degrees to 80 degrees; Experiment 3 included separations from 0.5 degrees to 10 degrees. The same ISOIs were used as in Experiment 1. When the separations were equal, the ISOIs at which auditory apparent motion was perceived were smaller than the values that produced the same experience in vision. Spatial separation affected only visual apparent motion. For separations less than 2 degrees, the ISOIs that produced visual continuous motion were nearly equal to those that produced auditory continuous motion. For larger separations, the ISOIs that produced visual apparent motion increased.

5.
Visual stimuli are often processed more efficiently than accompanying stimuli in another modality. In line with this “visual dominance”, earlier studies on attentional switching showed a clear benefit for visual stimuli in a bimodal visual–auditory modality-switch paradigm that required spatial stimulus localization in the relevant modality. The present study aimed to examine the generality of this visual dominance effect. The modality appropriateness hypothesis proposes that stimuli in different modalities are processed with differing effectiveness depending on the task dimension, so that processing of visual stimuli is favored in the dimension of space, whereas processing of auditory stimuli is favored in the dimension of time. We examined this proposition by using a temporal duration judgment in a bimodal visual–auditory switching paradigm. Two experiments demonstrated that crossmodal interference (i.e., temporal stimulus congruence) was larger for visual stimuli than for auditory stimuli, suggesting auditory dominance when performing temporal judgment tasks. However, attention switch costs were larger for the auditory modality than for the visual modality, indicating a dissociation of the mechanisms underlying crossmodal competition in stimulus processing and modality-specific biasing of attentional set.

6.
Previous research suggests that there are significant differences in the operation of reference memory for stimuli of different modalities, with visual temporal entries appearing to be more durable than auditory entries (Ogden, Wearden, & Jones, 2008, 2010). Ogden et al. (2008, 2010) demonstrated that when participants were required to store multiple auditory temporal standards over a delay period, there was significant systematic interference with the representation of the standard, characterized by shifts in the location of peak responding. No such performance deterioration was observed when multiple visually presented durations were encoded and maintained. The current article explored whether this apparent modality-based difference in reference memory operation is unique to temporal stimuli or whether similar characteristics are also apparent when nontemporal stimuli are encoded and maintained. The modified temporal generalization method developed in Ogden et al. (2008) was employed; however, standards and comparisons varied by pitch (auditory) and physical line length (visual) rather than duration. Pitch and line-length generalization results indicated that increasing memory load led to more variable responding and reduced recognition of the standard; however, there was no systematic shift in the location of peak responding. Comparison of the results of this study with those of Ogden et al. (2008, 2010) suggests that although performance deterioration as a consequence of increases in memory load is common to auditory temporal and nontemporal stimuli and visual nontemporal stimuli, systematic interference is unique to auditory temporal processing.

7.
Similarities have been observed in the localization of the final position of moving visual and moving auditory stimuli: Perceived endpoints that are judged to be farther in the direction of motion in both modalities likely reflect extrapolation of the trajectory, mediated by predictive mechanisms at higher cognitive levels. However, actual comparisons of the magnitudes of displacement between visual tasks and auditory tasks using the same experimental setup are rare. As such, the purpose of the present free-field study was to investigate the influences of the spatial location of motion offset, stimulus velocity, and motion direction on the localization of the final positions of moving auditory stimuli (Experiments 1 and 2) and moving visual stimuli (Experiment 3). To assess whether auditory performance is affected by dynamically changing binaural cues that are used for the localization of moving auditory stimuli (interaural time differences for low-frequency sounds and interaural intensity differences for high-frequency sounds), two distinct noise bands were employed in Experiments 1 and 2. In all three experiments, less precise encoding of spatial coordinates in paralateral space resulted in larger forward displacements, but this effect was drowned out by the underestimation of target eccentricity in the extreme periphery. Furthermore, our results revealed clear differences between visual and auditory tasks. Displacements in the visual task were dependent on velocity and the spatial location of the final position, but an additional influence of motion direction was observed in the auditory tasks. Together, these findings indicate that the modality-specific processing of motion parameters affects the extrapolation of the trajectory.

8.
The numerosity of any set of discrete elements can be encoded in a genuinely abstract number representation, irrespective of whether the elements are presented in the visual or auditory modality. The accumulator model predicts that no cost should apply for comparing numerosities within and across modalities. However, in behavioral studies, some inconsistencies have been apparent in the performance of number comparisons among different modalities. In this study, we tested whether and how numerical comparisons of visual, auditory, and cross-modal presentations would differ under adequate control of stimulus presentation. We measured the Weber fractions and points of subjective equality of numerical discrimination in visual, auditory, and cross-modal conditions. The results demonstrated differences between performance in the visual and auditory conditions, such that numerical discrimination of an auditory sequence was more precise than that of a visual sequence. Performance on cross-modal trials lay between performance levels in the visual and auditory conditions. Moreover, the number of visual stimuli was overestimated relative to that of auditory stimuli. Our findings imply that the process of approximate numerical representation is complex and involves multiple stages, including accumulation and decision processes.
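For reference only: the following is the standard psychophysical convention for the two measures named in this abstract, not a formulation taken from the paper itself. Here the symbols are illustrative: n_ref denotes the reference numerosity, Δn the just-noticeable change in numerosity, and n_c a comparison numerosity. The Weber fraction expresses discrimination precision as the just-noticeable change relative to the reference, and the point of subjective equality (PSE) is the comparison value judged larger than the reference on half of the trials:

\[
W = \frac{\Delta n}{n_{\mathrm{ref}}}, \qquad \mathrm{PSE} = n_c \ \text{such that}\ P\bigl(\text{respond ``} n_c > n_{\mathrm{ref}} \text{''}\bigr) = 0.5 .
\]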

9.
When auditory stimuli are used in two-dimensional spatial compatibility tasks, where the stimulus and response configurations vary along the horizontal and vertical dimensions simultaneously, a right–left prevalence effect occurs in which horizontal compatibility dominates over vertical compatibility. The right–left prevalence effects obtained with auditory stimuli are typically larger than those obtained with visual stimuli, even though less attention should be demanded from the horizontal dimension in auditory processing. In the present study, we examined whether auditory or visual dominance occurs when the two-dimensional stimuli are audiovisual, as well as whether there is cross-modal facilitation of response selection for the horizontal and vertical dimensions. We also examined whether there is an additional benefit of adding a pitch dimension to the auditory stimulus to facilitate vertical coding through use of the spatial-musical association of response codes (SMARC) effect, where pitch is coded in terms of height in space. In Experiment 1, we found a larger right–left prevalence effect for unimodal auditory than for visual stimuli. Neutral (non-pitch-coded) audiovisual stimuli did not result in cross-modal facilitation, but did show evidence of visual dominance. The right–left prevalence effect was eliminated in the presence of SMARC audiovisual stimuli, but the effect influenced horizontal rather than vertical coding. Experiment 2 showed that the pitch dimension did not influence response selection on a trial-to-trial basis but instead altered the salience of the task environment. Taken together, these findings indicate that in the absence of salient vertical cues, auditory and audiovisual stimuli tend to be coded along the horizontal dimension, and vision tends to dominate audition in this two-dimensional spatial stimulus–response task.

10.
The effect of brief auditory stimuli on visual apparent motion (total citations: 1; self-citations: 0; citations by others: 1)
Getzmann S. Perception, 2007, 36(7): 1089-1103
When two discrete stimuli are presented in rapid succession, observers typically report a movement of the lead stimulus toward the lag stimulus. The object of this study was to investigate crossmodal effects of irrelevant sounds on this illusion of visual apparent motion. Observers were presented with two visual stimuli that were temporally separated by interstimulus onset intervals from 0 to 350 ms. After each trial, observers classified their impression of the stimuli using a categorisation system. The presentation of short sounds intervening between the visual stimuli facilitated the impression of apparent motion relative to baseline (visual stimuli without sounds), whereas sounds presented before the first and after the second visual stimulus, as well as simultaneously presented sounds, reduced the motion impression. The results demonstrate an effect of the temporal structure of irrelevant sounds on visual apparent motion that is discussed in light of a related multisensory phenomenon, 'temporal ventriloquism', on the assumption that sounds can attract lights in the temporal dimension.

11.
Craig JC. Perception, 2006, 35(3): 351-367
Previous studies have demonstrated that visual apparent motion can alter the judgment of auditory apparent motion. We investigated the effect of visual apparent motion on judgments of the direction of tactile apparent motion. When visual motion was presented at the same time as, but in a direction opposite to, tactile motion, accuracy in judging the direction of tactile apparent motion was substantially reduced. This reduction in performance is referred to as 'the congruency effect'. Similar effects were observed when the visual display was placed either near to the tactile display or at some distance from the tactile display (experiment 1). In experiment 2, the relative alignment between the visual and tactile directions of motion was varied. The size of the congruency effect was similar at 0 degrees and 45 degrees alignments but much reduced at a 90 degrees alignment. In experiment 3, subjects made confidence ratings of their judgments of the direction of the tactile motion. The results indicated that the congruency effect was not due to subjects being unsure of the direction of motion and being forced to guess. In experiment 4, static visual stimuli were shown to have no effect on the judgments of direction of the tactile stimuli. The extent to which the congruency effect reflects capture effects and is the result of perceptual versus post-perceptual processes is discussed.

12.
Huddleston WE, DeYoe EA. Perception, 2003, 32(9): 1141-1149
Light energy displaced along the retinal photoreceptor array leads to a perception of visual motion. In audition, displacement of mechanical energy along the cochlear hair cell array is conceptually similar but leads to a perception of 'movement' in frequency space (spectral motion): a rising or falling pitch. In vision there are other types of stimuli that also evoke a percept of motion but do not involve a displacement of energy across the photoreceptors (second-order stimuli). In this study, we used psychophysical methods to determine whether such second-order stimuli also exist in audition and whether the resulting percept would rival that of first-order spectral motion. First-order auditory stimuli consisted of a frequency sweep of sixteen non-harmonic tones between 297 and 12123 Hz. Second-order stimuli consisted of the same tones, but with a random subset turned on at the beginning of a trial. During the trial, each tone in sequence randomly changed state (ON-to-OFF, or OFF-to-ON). Thus, state transitions created a 'sweep' having no net energy displacement correlated with the sweep direction. At relatively slow sweep speeds, subjects readily identified the sweep direction for both first-order and second-order stimuli, though accuracy decreased for second-order stimuli as the sweep speed increased. This latter characteristic is also true of some second-order visual stimuli. These results suggest a stronger parallelism between auditory and visual processing than previously appreciated.

13.
Xiao M, Wong M, Umali M, Pomplun M. Perception, 2007, 36(9): 1391-1395
Perceptual integration of audio-visual stimuli is fundamental to our everyday conscious experience. Eye-movement analysis may be a suitable tool for studying such integration, since eye movements respond to auditory as well as visual input. Previous studies have shown that additional auditory cues in visual-search tasks can guide eye movements more efficiently and reduce their latency. However, these auditory cues were task-relevant since they indicated the target position and onset time. Therefore, the observed effects may have been due to subjects using the cues as additional information to maximize their performance, without perceptually integrating them with the visual displays. Here, we combine a visual-tracking task with a continuous, task-irrelevant sound from a stationary source to demonstrate that audio-visual perceptual integration affects low-level oculomotor mechanisms. Auditory stimuli of constant, increasing, or decreasing pitch were presented. All sound categories induced more smooth-pursuit eye movement than silence, with the greatest effect occurring with stimuli of increasing pitch. A possible explanation is that integration of the visual scene with continuous sound creates the perception of continuous visual motion. Increasing pitch may amplify this effect through its common association with accelerating motion.

14.
Dissociations between a motor response and the subject's verbal report have been reported in various experiments that investigated special experimental effects (e.g., metacontrast or induced motion). To examine whether similar dissociations can also be observed under standard experimental conditions, we compared reaction times (RT) and temporal order judgments (TOJ) to visual and auditory stimuli of three intensity levels. Data were collected from six subjects, each of whom served for nine sessions. The results showed a strong, highly significant modality dissociation: while RTs to auditory stimuli were shorter than RTs to visual stimuli, the TOJ data indicated longer processing times for auditory than for visual stimuli. This pattern was found over the whole range of intensities investigated. Light intensity had similar effects on RT and TOJ, while there was a marginally significant tendency of tone intensity to affect RT more strongly than TOJ. It is concluded that modality dissociation is an example of "direct parameter specification", where the pathway from stimulus to response in the simple RT experiment is (at least partially) separate from the pathway that leads to a conscious, reportable representation. Two variants of this notion and alternatives to it are discussed.

15.
Repeating temporal patterns were presented in the auditory and visual modalities so that: (a) all elements were of equal intensity and were equally spaced in time (uniform presentation); (b) the intensity of one element was increased (accent presentation); or (c) the interval between two elements was increased (pause presentation). Intensity and interval patterning serve to segment the element sequence into repeating patterns.

For uniform presentation, pattern organization was by pattern structure, with auditory identification being faster. For pause presentation, organization was by the pauses; both auditory and visual identification were twice as fast as for uniform presentation. For auditory accent presentation, organization was by pattern structure, and identification was slower than for uniform presentation. In contrast, the organization of visual accent presentation was by accents, and identification was faster than for uniform presentation. These results suggest that complex stimuli, in which elements are patterned along more than one sensory dimension, are perceptually unique and that their identification therefore rests on the nature of each modality.

16.
Four experiments examined transfer of noncorresponding spatial stimulus-response associations to an auditory Simon task for which stimulus location was irrelevant. Experiment 1 established that, for a horizontal auditory Simon task, transfer of spatial associations occurs after 300 trials of practice with an incompatible mapping of auditory stimuli to keypress responses. Experiments 2-4 examined transfer effects within the auditory modality when the stimuli and responses were varied along vertical and horizontal dimensions. Transfer occurred when the stimuli and responses were arrayed along the same dimension in practice and transfer, but not when they were arrayed along orthogonal dimensions. These findings indicate that prior task-defined associations have less influence on the auditory Simon effect than on the visual Simon effect, possibly because of the stronger tendency for an auditory stimulus to activate its corresponding response.

17.
Many studies have demonstrated that infants exhibit robust auditory rhythm discrimination, but research on infants' perception of visual rhythm is limited. In particular, the role of motion in infants' perception of visual rhythm remains unknown, despite the prevalence of motion cues in naturally occurring visual rhythms. In the present study, we examined the role of motion in 7-month-old infants' discrimination of visual rhythms by comparing conditions in which the rhythmic stimuli contained apparent motion with conditions in which they were stationary. Infants succeeded at discriminating visual rhythms only when the visual rhythm occurred with an apparent motion component. These results support the view that motion plays a role in infants' perception of visual temporal information, consistent with the manner in which natural rhythms appear in the visual world.

18.
Pigeons were exposed to multiple second-order schedules of paired and unpaired brief stimuli in which responding on the main key was reinforced according to a fixed-interval thirty-second schedule by a brief stimulus (a tone in the paired schedule) and advancement to the next segment of the second-order schedule. In Experiment 1, a response on the second key during the fourth and final presentation of the tone was required to produce food. Responses during earlier brief stimuli indicated the extent to which the final brief stimulus was discriminated from preceding ones. Responding was comparable during all tones, extending prior findings with visual paired brief stimuli and weakening explanations of subjects' failure to discriminate between brief-stimulus presentations in terms of elicited responding. In Experiment 2, the number of fixed-interval segments comprising the second-order schedules varied from one through eight. Although main-key response rates increased across segments in both experiments, they increased much less sharply with a variable number of segments. These results suggest that the increase in main-key response rates across segments is due primarily to a degree of temporal discrimination not reflected on the second key. Main-key response rates were higher on paired auditory brief-stimulus schedules than on unpaired visual brief-stimulus schedules, especially in Experiment 2, thus further extending findings with visual brief stimuli to second-order schedules with auditory brief stimuli.

19.
In Experiment 1, using a visual and an acoustic sample set that appeared to favour the auditory modality of the monkey subjects, retention gradients generated in closely comparable visual and auditory matching (go/no-go) tasks revealed more durable short-term memory (STM) for the visual modality. In Experiment 2, potentially interfering visual and acoustic stimuli were introduced during the retention intervals of the auditory matching task. Unlike the case of visual STM, delay-interval visual stimulation did not affect auditory STM. On the other hand, delay-interval music decreased auditory STM, confirming that the monkeys maintained an auditory trace during the retention intervals. Surprisingly, monkey vocalizations injected during the retention intervals caused much less interference than music. This finding, which was confirmed by the results of Experiments 3 and 4, may be due to differential processing of “arbitrary” (the acoustic samples) and species-specific (monkey vocalizations) sounds by the subjects. Although less robust than visual STM, auditory STM was nevertheless substantial, even with retention intervals as long as 32 sec.

20.
Synchronization of finger taps with periodically flashing visual stimuli is known to be much more variable than synchronization with an auditory metronome. When one of these rhythms is the synchronization target and the other serves as a distracter at various temporal offsets, strong auditory dominance is observed. However, it has recently been shown that visuomotor synchronization improves substantially with moving stimuli such as a continuously bouncing ball. The present study pitted a bouncing ball against an auditory metronome in a target–distracter synchronization paradigm, with the participants being auditory experts (musicians) and visual experts (video gamers and ball players). Synchronization was still less variable with auditory than with visual target stimuli in both groups. For musicians, auditory stimuli tended to be more distracting than visual stimuli, whereas the opposite was the case for the visual experts. Overall, there was no main effect of distracter modality. Thus, a distracting spatiotemporal visual rhythm can be as effective as a distracting auditory rhythm in its capacity to perturb synchronous movement, but its effectiveness also depends on modality-specific expertise.

