Similar Articles
20 similar articles found.
1.
We employed audiovisual stream/bounce displays, in which two moving objects with crossing trajectories are more likely to be perceived as bouncing off, rather than streaming through, each other when a brief sound is presented at the coincidence of the two objects. However, Kawachi and Gyoba (Perception 35:1289–1294, 2006b) reported that the presence of an additional moving object near the two objects altered the perception of a bouncing event to that of a streaming event. In this study, we extended this finding and examined whether alteration of the event perception could be induced by the visual context, such as by occluded object motion near the stream/bounce display. The results demonstrated that even when the sound was presented, the continuous occluded motion strongly biased observers’ percepts toward the streaming percept during a short occlusion interval (approximately 100 ms). In contrast, when the continuous occluded motion was disrupted by introducing a spatiotemporal gap in the motion trajectory or by removing occlusion cues such as deletion/accretion, the bias toward the streaming percept declined. Thus, we suggest that a representation of object motion generated under a limited occlusion interval interferes with audiovisual event perception.

2.
Kawachi Y, Gyoba J. Perception, 2006, 35(9): 1289-1294
Two identical visual objects moving across each other in a two-dimensional display can be perceived as either streaming through or bouncing off each other. The bouncing event percept is promoted by the presentation of a brief sound at the point of coincidence of the two objects. In this study, we examined the effect of the presence of a moving object near the two objects as well as the brief sound on the stream/bounce event perception. When both the nearby moving object and brief sound were presented, a streaming event, not a bouncing event, was robustly perceived (experiment 1). The percentage of the streaming percept was also systematically affected by the proximity of the nearby object (experiment 2). These results suggest that the processing of intramodal grouping between a nearby moving object and either of the two objects in the stream/bounce display interferes with crossmodal (audiovisual) processing. Moreover, we demonstrated that, depending on the trajectory of the nearby moving object, the processing of intramodal grouping can promote the bouncing percept, just as crossmodal processing does (experiment 3).

3.
In a two-dimensional display, identical visual targets moving toward and across each other with equal, constant speed can be perceived either to reverse their motion directions at the coincidence point (bouncing percept) or to stream through one another (streaming percept). Although there is a strong tendency to perceive the streaming percept, various factors have been reported to induce the bouncing percept, such as a sound or a visual flash at the moment of the visual target coincidence. By changing the duration of the postcoincidence trajectory (PCT), we investigated how long it would take for such bounce-inducing factors to be maximally effective after the visual coincidence. With bounce-inducing factors, the percentage of the bouncing percept did not reach its maximal level immediately after the coincidence but increased as a function of PCT duration up to 150-200 msec. The results clearly reject the cognitive-bias hypothesis of the bounce-inducing effect and suggest rather that the bounce-inducing factors have to interact with the PCT for some period after the coincidence to be maximally effective. (A minimal timing sketch of this display geometry is given after the list of abstracts.)

4.
Two identical visual targets moving across each other can be perceived either to bounce off or to stream through each other. A brief sound at the moment the targets coincide biases perception toward bouncing. We found that this bounce-inducing effect was attenuated when other identical sounds (auditory flankers) were presented 300 ms before and after the simultaneous sound. The attenuation occurred only when the simultaneous sound and auditory flankers had similar acoustic characteristics and the simultaneous sound was not salient. These results suggest that there is an aspect of auditory-grouping (saliency-assigning) processes that is context-sensitive and can be utilized by the visual system for solving ambiguity. Furthermore, control experiments revealed that such auditory context did not affect the perceptual qualities of the simultaneous sound. Because the attenuation effect is not manifest in the perception of acoustic characteristics of individual sound elements, we conclude that it is a genuine cross-modal effect.

5.
Adults who watch an ambiguous visual event consisting of two identical objects moving toward, through, and away from each other and hear a brief sound when the objects overlap report seeing visual bouncing. We conducted three experiments in which we used the habituation/test method to determine whether these illusory effects might emerge early in development. In Experiments 1 and 3 we tested 4-, 6-, and 8-month-old infants’ discrimination between an ambiguous visual display presented together with a sound synchronized with the objects’ spatial coincidence and the identical visual display presented together with a sound no longer synchronized with coincidence. Consistent with illusory perception, the 6- and 8-month-old, but not the 4-month-old, infants responded to these events as different. In Experiment 2 infants were habituated to the ambiguous visual display together with a sound synchronized with the objects’ coincidence and tested with a physically bouncing object accompanied by the sound at the bounce. Consistent with illusory perception again, infants treated these two events as equivalent by not exhibiting response recovery. The developmental emergence of this intersensory illusion at 6 months of age is hypothesized to reflect developmental changes in object knowledge and attentional mechanisms.

6.
Previous research has shown that irrelevant sounds can facilitate the perception of visual apparent motion. Here, the effectiveness of a single sound in facilitating motion perception was investigated in three experiments. Observers were presented with two discrete lights temporally separated by stimulus onset asynchronies from 0 to 350 ms. After each trial, observers classified their impression of the stimuli using a categorisation system. A short sound presented temporally (and spatially) midway between the lights facilitated the impression of motion relative to baseline (lights without sound), whereas a sound presented either before the first or after the second light or simultaneously with the lights did not affect motion impression. The facilitation effect also occurred with sound presented far from the visual display, as well as with a continuous sound that started with the first light and terminated with the second light. No facilitation of visual motion perception occurred if the sound was part of a tone sequence that allowed for intramodal perceptual grouping of the auditory stimuli prior to the critical audiovisual stimuli. Taken together, the findings are consistent with a low-level audiovisual integration approach in which the perceptual system merges temporally proximate sound and light stimuli, thereby provoking the impression of a single multimodal moving object.

7.
In this study, we examined the contribution of the orientation of moving objects to the perception of a streaming/bouncing motion display. In three experiments, participants reported which of the two types of motion, streaming or bouncing, they perceived. The following independent variables were used: orientation differences between Gabor micropatterns (Gabors) and their path of motion (all experiments) and the presence/absence of a transient tone (Experiment 1), a transient visual flash (Experiment 2), or a concurrent secondary task (Experiment 3) at the coincidence of the Gabors. The results showed that the events at coincidence generally biased responses toward the perception of bouncing. On the other hand, alignment of the Gabors with their motion axes significantly reduced the frequency of bounce perception. The results also indicated that an object whose orientation was parallel to its motion path strengthened the spatiotemporal integration of local motion signals along a straight motion path, resulting in the perception of streaming. We suggest that the effect of collinearity between the Gabors and their motion path is relatively free from the effects of attentional distraction.

8.
Grove PM, Kawachi Y, Sakurai K. Perception, 2012, 41(4): 379-388
We generalised the stream/bounce effect to dynamic random-element displays containing luminance- or disparity-defined targets. Previous studies investigating audio-visual interactions in this context have exclusively employed motion sequences with luminance-defined disks or squares and have focused on properties of the accompanying brief stimuli rather than the visual properties of the motion targets. We found that the presence of a brief sound temporally close to coincidence, or of a visual flash at coincidence, significantly promotes bounce perception for motion targets defined by either luminance contrast or disparity contrast. A brief tone significantly promoted bouncing of luminance-defined targets above a no-sound baseline when it was presented from at least 250 ms before coincidence to 100 ms after coincidence. A similar pattern was observed for disparity-defined targets, though the tone promoted bouncing above the no-sound baseline when presented from at least 350 ms before to 300 ms after coincidence. We further explored the temporal properties of audio-visual interactions for these two display types and found that bounce perception saturated at similar durations after coincidence. The stream/bounce illusion manifests itself in dynamic random-element displays and is similar for luminance- and disparity-defined motion targets.

9.
The stream/bounce display represents an ambiguous motion event in which two identical visual objects move toward one another and overlap completely before they pass one another. In our perception, they can be interpreted as either streaming past one another or bouncing off each other. Previous studies have shown that the streaming percept of the display is generic for humans, suggesting the inertial nature of the motion integration process. In this study, chimpanzees took part in behavioral experiments using an object-tracking task to reveal the characteristics of their stream/bounce perception. Chimpanzees did not show a tendency toward a dominant "stream" perception of the stream/bounce stimulus. However, depth cues, such as X-junctions and local motion coherence, did promote the stream percept in chimpanzees. These results suggest both similarities and differences between chimpanzees and humans with respect to motion integration and object individuation processes.

10.
Strybel TZ, Vatakis A. Perception, 2004, 33(9): 1033-1048
Unimodal auditory and visual apparent motion (AM) and bimodal audiovisual AM were investigated to determine the effects of crossmodal integration on motion perception and direction-of-motion discrimination in each modality. To determine the optimal stimulus onset asynchrony (SOA) ranges for motion perception and direction discrimination, we initially measured unimodal visual and auditory AMs using one of four durations (50, 100, 200, or 400 ms) and ten SOAs (40-450 ms). In the bimodal conditions, auditory and visual AM were measured in the presence of temporally synchronous, spatially displaced distractors that were either congruent (moving in the same direction) or conflicting (moving in the opposite direction) with respect to target motion. Participants reported whether continuous motion was perceived and its direction. With unimodal auditory and visual AM, motion perception was affected differently by stimulus duration and SOA in the two modalities, while the opposite was observed for direction of motion. In the bimodal audiovisual AM condition, discriminating the direction of motion was affected only in the case of an auditory target. The perceived direction of auditory but not visual AM was reduced to chance levels when the crossmodal distractor direction was conflicting. Conversely, motion perception was unaffected by the distractor direction and, in some cases, the mere presence of a distractor facilitated movement perception.

11.
Watanabe K, Shimojo S. Perception, 1998, 27(9): 1041-1054
Identical visual targets moving across each other with equal and constant speed can be perceived either to bounce off or to stream through each other. This bistable motion perception has been studied mostly in the context of motion integration. Since the perception of most ambiguous motion is affected by attention, attentional modulation may occur in this case as well. We investigated whether distraction of attention from the moving targets would alter the relative frequency of each percept. During the observation of the streaming/bouncing motion event in the peripheral visual field, visual attention was disrupted by an abrupt presentation of a visual distractor at various timings and locations (experiment 1; exogenous distraction of attention) or by the demand of an additional discrimination task (experiments 2 and 3; endogenous distraction of attention). Both types of attentional distraction increased the frequency of the bouncing percept and decreased that of the streaming percept. These results suggest that attention may facilitate the perception of object motion as continuing in the same direction as in the past.

12.
Auditory and visual processes demonstrably enhance each other based on spatial and temporal coincidence. Our recent results on visual search have shown that auditory signals also enhance visual salience of specific objects based on multimodal experience. For example, we tend to see an object (e.g., a cat) and simultaneously hear its characteristic sound (e.g., “meow”), to name an object when we see it, and to vocalize a word when we read it, but we do not tend to see a word (e.g., cat) and simultaneously hear the characteristic sound (e.g., “meow”) of the named object. If auditory–visual enhancements occur based on this pattern of experiential associations, playing a characteristic sound (e.g., “meow”) should facilitate visual search for the corresponding object (e.g., an image of a cat), hearing a name should facilitate visual search for both the corresponding object and corresponding word, but playing a characteristic sound should not facilitate visual search for the name of the corresponding object. Our present and prior results together confirmed these experiential association predictions. We also recently showed that the underlying object-based auditory–visual interactions occur rapidly (within 220 ms) and guide initial saccades towards target objects. If object-based auditory–visual enhancements are automatic and persistent, an interesting application would be to use characteristic sounds to facilitate visual search when targets are rare, such as during baggage screening. Our participants searched for a gun among other objects when a gun was presented on only 10% of the trials. The search time was speeded when a gun sound was played on every trial (primarily on gun-absent trials); importantly, playing gun sounds facilitated both gun-present and gun-absent responses, suggesting that object-based auditory–visual enhancements persistently increase the detectability of guns rather than simply biasing gun-present responses. Thus, object-based auditory–visual interactions that derive from experiential associations rapidly and persistently increase visual salience of corresponding objects.

13.
A perception of coherent motion can be obtained in an otherwise ambiguous or illusory visual display by directing one's attention to a feature and tracking it. We demonstrate an analogous auditory effect in two separate sets of experiments. The temporal dynamics associated with the attention-dependent auditory motion closely matched those previously reported for attention-based visual motion. Since attention-based motion mechanisms appear to exist in both modalities, we also tested for multimodal (audiovisual) attention-based motion, using stimuli composed of interleaved visual and auditory cues. Although subjects were able to track a trajectory using cues from both modalities, no one spontaneously perceived "multimodal motion" across both visual and auditory cues. Rather, they reported motion perception only within each modality, thereby revealing a spatiotemporal limit on putative cross-modal motion integration. Together, results from these experiments demonstrate the existence of attention-based motion in audition, extending current theories of attention-based mechanisms from visual to auditory systems.

14.
Unconscious or subliminal perception has historically been a thorny issue in psychology. It has been the subject of debate and experimentation since the turn of the century. While psychologists now agree that the phenomenon of visual subliminal stimulation is real, disagreement continues over the effects of such stimulation, as well as over its existence in other sensory modalities, notably the auditory. The present paper provides an overview of unresolved issues in auditory subliminal stimulation, which explains much of the difficulty that has been encountered in experimental work in this area. A context is proposed for considering the effects of auditory subliminal stimulation, and an overview of current investigations in this field is provided.

15.
Can people react to objects in their visual field that they do not consciously perceive? We investigated how visual perception and motor action respond to moving objects whose visibility is reduced, and we found a dissociation between motion processing for perception and for action. We compared motion perception and eye movements evoked by two orthogonally drifting gratings, each presented separately to a different eye. The strength of each monocular grating was manipulated by inducing adaptation to one grating prior to the presentation of both gratings. Reflexive eye movements tracked the vector average of both gratings (pattern motion) even though perceptual responses followed one motion direction exclusively (component motion). Observers almost never perceived pattern motion. This dissociation implies the existence of visual-motion signals that guide eye movements in the absence of a corresponding conscious percept.

16.
Two disks moving from opposite points in space, overlapping, and stopping at one another’s starting point can be seen as either bouncing off one another or streaming through one another. With silent displays, observers report streaming, whereas, if a sound is played when the disks are in the overlap region, observers report bouncing. The change in perception is thought to be modulated by a lack of attention that inhibits the integration of the motion signal when the disks overlap and by the sound that increases the congruence of the display, in comparison with a real elastic bounce. Here, we accompanied the disks’ motion with either a bounce-congruent sound (a billiard ball) or bounce-incongruent sounds (a water drop, a firework). When the sound was switched on 200 msec before the disks’ overlap, (1) all the audiovisual displays induced more bounce responses than did the silent display, but (2) the bounce-congruent sound induced more bounce responses than did the bounce-incongruent sounds. However, when the sound was switched on at the disks’ overlap, only the first result was observed. These results highlight both the role of attention and that of sound congruence.

17.
In this study, we show that the contingent auditory motion aftereffect is strongly influenced by visual motion information. During an induction phase, participants listened to rightward-moving sounds with falling pitch alternated with leftward-moving sounds with rising pitch (or vice versa). Auditory aftereffects (i.e., a shift in the psychometric function for unimodal auditory motion perception) were bigger when a visual stimulus moved in the same direction as the sound than when no visual stimulus was presented. When the visual stimulus moved in the opposite direction, aftereffects were reversed and thus became contingent upon visual motion. When visual motion was combined with a stationary sound, no aftereffect was observed. These findings indicate that there are strong perceptual links between the visual and auditory motion-processing systems.

18.
In representational momentum (RM), the final position of a moving target is mislocalized in the direction of motion. Here, the effect of a concurrent sound on visual RM was demonstrated. A visual stimulus moved horizontally and disappeared at unpredictable positions. A complex tone without any motion cues was presented continuously from the beginning of the visual motion. As compared with a silent condition, the RM magnitude increased when the sound outlasted the visual motion and decreased when the sound ended before the visual motion did. However, the RM was unchanged when a brief complex tone was presented before or after the target disappeared (Experiment 2) or when the onset of the long-lasting sound was not synchronized with that of the visual motion (Experiments 3 and 4). These findings suggest that visual motion representation can be modulated by a sound if the visual motion information is firmly associated with the auditory information.

19.
When you are looking for an object, does hearing its characteristic sound make you find it more quickly? Our recent results supported this possibility by demonstrating that when a cat target, for example, was presented among other objects, a simultaneously presented “meow” sound (containing no spatial information) reduced the manual response time for visual localization of the target. To extend these results, we determined how rapidly an object-specific auditory signal can facilitate target detection in visual search. On each trial, participants fixated a specified target object as quickly as possible. The target’s characteristic sound speeded the saccadic search time within 215–220 msec and also guided the initial saccade toward the target, compared with presentation of a distractor’s sound or with no sound. These results suggest that object-based auditory–visual interactions rapidly increase the target object’s salience in visual search.

20.
Thresholds for auditory motion detectability were measured in a darkened anechoic chamber while subjects were adapted to horizontally moving sound sources of various velocities. All stimuli were 500-Hz lowpass noises presented at a level of 55 dBA. The threshold measure employed was the minimum audible movement angle (MAMA), that is, the minimum angle a horizontally moving sound must traverse to be just discriminable from a stationary sound. In an adaptive, two-interval forced-choice procedure, trials occurred every 2-5 sec (Experiment 1) or every 10-12 sec (Experiment 2). Intertrial time was "filled" with exposure to the adaptor, a stimulus that repeatedly traversed the subject's front hemifield at ear level (distance: 1.7 m) at a constant velocity (-150 degrees/sec to +150 degrees/sec) during a run. Average MAMAs in the control condition, in which the adaptor was stationary (0 degrees/sec), were 2.4 degrees (Experiment 1) and 3.0 degrees (Experiment 2). Three out of 4 subjects in each experiment showed significantly elevated MAMAs (by up to 60%) with some adaptors, relative to the control condition. However, there were large intersubject differences in the shape of the MAMA versus adaptor velocity functions. This loss of sensitivity to motion that most subjects show after exposure to moving signals is probably one component underlying the auditory motion aftereffect (Grantham, 1989), in which judgments of the direction of moving sounds are biased in the direction opposite to that of a previously presented adaptor.
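
Illustrative note on the stream/bounce display. The display geometry that recurs throughout the abstracts above (items 2, 3, 8, 11, and 16, among others) is simple enough to sketch in code. The minimal Python example below is editorial illustration only, not code from any of the cited studies; the frame duration, speed, starting eccentricity, and postcoincidence trajectory (PCT) duration are assumed values chosen for the example. It computes the two target trajectories, marks the coincidence frame at which a bounce-inducing sound would typically be presented, and notes that the same postcoincidence frames are physically compatible with either the streaming or the bouncing interpretation.

    # Minimal, purely illustrative sketch of the stream/bounce display geometry
    # (not code from any of the studies listed above). Two identical targets start
    # at +/-START_X, approach each other at equal constant speed, coincide at x = 0,
    # and then continue for a postcoincidence trajectory (PCT) of a chosen duration.
    # All parameter values are assumptions made for this example.

    FRAME_MS = 10      # assumed frame duration (ms)
    SPEED = 2.0        # assumed speed (arbitrary position units per frame)
    START_X = 50.0     # assumed starting position (arbitrary units)
    PCT_MS = 200       # PCT duration to simulate (ms)

    def positions(frame):
        """Return the positions of the left-starting and right-starting targets."""
        left = -START_X + SPEED * frame    # moves rightward
        right = START_X - SPEED * frame    # moves leftward
        return left, right

    coincidence_frame = int(START_X / SPEED)               # frame at which both targets reach x = 0
    total_frames = coincidence_frame + PCT_MS // FRAME_MS  # continue through the PCT

    for f in range(total_frames + 1):
        left, right = positions(f)
        marker = ""
        if f == coincidence_frame:
            marker = "  <- coincidence: a brief sound played here biases perception toward bouncing"
        print(f"t = {f * FRAME_MS:4d} ms   x1 = {left:7.1f}   x2 = {right:7.1f}{marker}")

    # After coincidence the same physical frames support two percepts:
    #   streaming: each target keeps its pre-coincidence direction of motion;
    #   bouncing:  the targets exchange trajectories, reversing direction at x = 0.

Because the two interpretations are identical frame by frame, any bias toward bouncing must come from factors outside the motion signal itself, such as a coincident sound, a visual flash, or the observer's attentional state, which is exactly what the studies listed above manipulate.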
