Similar Articles
Found 20 similar articles (search time: 31 ms)
1.
This study investigated whether explicit beat induction in the auditory, visual, and audiovisual (bimodal) modalities aided the perception of weakly metrical auditory rhythms, and whether it reinforced attentional entrainment to the beat of these rhythms. The visual beat-inducer was a periodically bouncing point-light figure, which aimed to examine whether an observed rhythmic human movement could induce a beat that would influence auditory rhythm perception. In two tasks, participants listened to three repetitions of an auditory rhythm that were preceded and accompanied by (1) an auditory beat, (2) a bouncing point-light figure, (3) a combination of (1) and (2) synchronously, or (4) a combination of (1) and (2), with the figure moving in anti-phase to the auditory beat. Participants reproduced the auditory rhythm subsequently (Experiment 1), or detected a possible temporal change in the third repetition (Experiment 2). While an explicit beat did not improve rhythm reproduction, possibly due to the syncopated rhythms when a beat was imposed, bimodal beat induction yielded greater sensitivity to a temporal deviant in on-beat than in off-beat positions. Moreover, the beat phase of the figure movement determined where on-beat accents were perceived during bimodal induction. Results are discussed with regard to constrained beat induction in complex auditory rhythms, visual modulation of auditory beat perception, and possible mechanisms underlying the preferred visual beat consisting of rhythmic human motions.

2.
A static bar is perceived to dynamically extend from a peripheral cue (illusory line motion (ILM)) or from a part of another figure presented in the previous frame (transformational apparent motion (TAM)). We examined whether visibility of the cue stimuli affected these transformational motions. Continuous flash suppression, a kind of dynamic interocular masking, was used to reduce the visibility of the cue stimuli. Both ILM and TAM occurred significantly when the d' for the cue stimuli was zero (Experiment 1) and when the cue stimuli were presented at subthreshold levels (Experiment 2). We discuss the possibility that higher-order motion processing underlying TAM and ILM can be weakly but significantly activated by invisible visual information.
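The sensitivity index d' reported in Experiment 1 is the standard signal-detection measure, computed as the difference between the z-transformed hit rate and false-alarm rate; d' = 0 means the cue is detected at chance. A minimal sketch of that computation (the trial counts and the log-linear correction are illustrative assumptions, not values from the study):

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Signal-detection sensitivity: z(hit rate) - z(false-alarm rate)."""
    # Log-linear correction keeps both rates away from 0 and 1,
    # where the z-transform would be infinite.
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

# Equal hit and false-alarm rates give d' = 0 (chance detection),
# the criterion used to classify cues as invisible.
chance = d_prime(10, 10, 10, 10)
detectable = d_prime(18, 2, 2, 18)
```

With equal hit and false-alarm counts the two z-scores cancel, yielding d' of exactly zero, which is the condition under which ILM and TAM were still observed.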

3.
This study examined 4- to 10-month-old infants' perception of audio-visual (A-V) temporal synchrony cues in the presence or absence of rhythmic pattern cues. Experiment 1 established that infants of all ages could successfully discriminate between two different audiovisual rhythmic events. Experiment 2 showed that only 10-month-old infants detected a desynchronization of the auditory and visual components of a rhythmical event. Experiment 3 showed that 4- to 8-month-old infants could detect A-V desynchronization but only when the audiovisual event was nonrhythmic. These results show that initially in development infants attend to the overall temporal structure of rhythmic audiovisual events but that later in development they become capable of perceiving the embedded intersensory temporal synchrony relations as well.

4.
Infants' intermodal perception of two levels of temporal structure uniting the visual and acoustic stimulation from natural, complex events was investigated in four experiments. Films depicting a single object (a single, large marble) and a compound object (a group of smaller marbles) colliding against a surface in an erratic pattern were presented to infants between 3 and months of age using an intermodal preference and search method. These stimulus events portrayed two levels of invariant temporal structure: (a) temporal synchrony united the sights and sounds of object impact, and (b) temporal microstructure, the internal temporal structure of each impact sound and motion, specified the composition of the object (single vs. compound). Experiment 1 demonstrated that by 6 months infants detected a relation between the audible and visible stimulation from these events when both levels of invariant temporal structure guided their intermodal exploration. Experiment 2 revealed that by 6 months infants detected the bimodal temporal microstructure specifying object composition. They looked predominantly to the film whose natural soundtrack was played even though the motions of objects in both films were synchronized with the soundtrack. Experiment 3 assessed infants' sensitivity to temporal synchrony relations. Two films depicting objects of the same composition were presented while the motions of only one of them were synchronized with the appropriate soundtrack. Six-month-olds showed evidence of detecting temporal synchrony relations under some conditions. Experiment 4 examined how temporal synchrony and temporal microstructure interact in directing intermodal exploration. The natural soundtrack to one of the objects was played out of synchrony with the motions of both. In contrast with the results of Experiment 2, infants at 6 months showed no evidence of detecting a relationship between the film and its appropriate soundtrack. This suggests that the temporal asynchrony disrupted their detection of the temporal microstructure specifying object composition. Results of these studies support an invariant-detection view of the development of intermodal perception.

5.
When we move to music we feel the beat, and this feeling can shape the sound we hear. Previous studies have shown that when people listen to a metrically ambiguous rhythm pattern, moving the body on a certain beat--adults, by actively bouncing themselves in synchrony with the experimenter, and babies, by being bounced passively in the experimenter's arms--can bias their auditory metrical representation so that they interpret the pattern in a corresponding metrical form [Phillips-Silver, J., & Trainor, L. J. (2005). Feeling the beat: Movement influences infant rhythm perception. Science, 308, 1430; Phillips-Silver, J., & Trainor, L. J. (2007). Hearing what the body feels: Auditory encoding of rhythmic movement. Cognition, 105, 533-546]. The present studies show that in adults, as well as in infants, metrical encoding of rhythm can be biased by passive motion. Furthermore, because movement of the head alone affected auditory encoding whereas movement of the legs alone did not, we propose that vestibular input may play a key role in the effect of movement on auditory rhythm processing. We discuss possible cortical and subcortical sites for the integration of auditory and vestibular inputs that may underlie the interaction between movement and auditory metrical rhythm perception.

6.
Using straight translatory motion of a visual peripheral cue in the frontoparallel plane, and probing target discrimination at different positions along the cue's motion trajectory, we found that target orientation discrimination was slower for targets presented at or near the position of motion onset (4.2° off centre), relative to the onset of a static cue (Experiment 1), and relative to targets presented further along the motion trajectory (Experiments 1 and 2). Target discrimination was equally fast and accurate in the moving cue conditions relative to static cue conditions at positions further along the cue's motion trajectory (Experiment 1). Moreover, target orientation discrimination was not slowed at the same position, once this position was no longer the motion onset position (Experiment 3), and performance in a target colour-discrimination task was not slowed even at motion onset (Experiment 4). Finally, we found that the onset location of the motion cue was perceived as being shifted in the direction of the cue's motion (Experiment 5). These results indicate that attention cannot be as quickly or precisely shifted to the onset of a motion stimulus as to other positions on a stimulus' motion trajectory.

7.
In two experiments, we investigated how the number of auditory stimuli affected the apparent motion induced by visual stimuli. The multiple visual stimuli that induced the apparent motion on the front parallel plane, or in the depth dimension in terms of the binocular disparity cue, were accompanied by multiple auditory stimuli. Observers reported the number of visual stimuli (Experiments 1 and 2) and the displacement of the apparent motion that was defined by the distance between the first and last visual stimuli (Experiment 2). When the number of auditory stimuli was more/less than that of the visual stimuli, observers tended to perceive more/less visual stimuli and a larger/smaller displacement than when the numbers of the auditory and visual stimuli were the same, regardless of the dimension of motion. These results suggest that auditory stimulation may modify the visual processing of motion by modulating the spatiotemporal resolution and extent of the displacement.

8.
Similarities have been observed in the localization of the final position of moving visual and moving auditory stimuli: Perceived endpoints that are judged to be farther in the direction of motion in both modalities likely reflect extrapolation of the trajectory, mediated by predictive mechanisms at higher cognitive levels. However, actual comparisons of the magnitudes of displacement between visual tasks and auditory tasks using the same experimental setup are rare. As such, the purpose of the present free-field study was to investigate the influences of the spatial location of motion offset, stimulus velocity, and motion direction on the localization of the final positions of moving auditory stimuli (Experiments 1 and 2) and moving visual stimuli (Experiment 3). To assess whether auditory performance is affected by dynamically changing binaural cues that are used for the localization of moving auditory stimuli (interaural time differences for low-frequency sounds and interaural intensity differences for high-frequency sounds), two distinct noise bands were employed in Experiments 1 and 2. In all three experiments, less precise encoding of spatial coordinates in paralateral space resulted in larger forward displacements, but this effect was drowned out by the underestimation of target eccentricity in the extreme periphery. Furthermore, our results revealed clear differences between visual and auditory tasks. Displacements in the visual task were dependent on velocity and the spatial location of the final position, but an additional influence of motion direction was observed in the auditory tasks. Together, these findings indicate that the modality-specific processing of motion parameters affects the extrapolation of the trajectory.

9.
Although the pigeon is a popular model for studying visual perception, relatively little is known about its perception of motion. Three experiments examined the pigeons' ability to capture a moving stimulus. In Experiment 1, the effect of manipulating stimulus speed and the length of the stimulus was examined using a simple rightward linear motion. This revealed a clear effect of length on capture and speed on errors. Errors were mostly anticipatory and there appeared to be two processes contributing to response locations: anticipatory peck bias and lag time. Using the same birds as Experiment 1, Experiment 2 assessed transfer of tracking and capture to novel linear motions. The birds were able to capture other motion directions, but they displayed a strong rightward peck bias, indicating that they had learned to peck to the right of the stimulus in Experiment 1. Experiment 3 used the same task as Experiment 2 but with naïve birds. These birds did not show the rightward bias in pecking and instead pecked more evenly around the stimulus. The combined results indicate that the pigeon can engage in anticipatory tracking and capture of a moving stimulus, and that motion properties and training experience influence capture.

10.
Self-propelled motion is a powerful cue that conveys information that an object is animate. In this case, animate refers to an entity's capacity to initiate motion without an applied external force. Sensitivity to this motion cue is present in infants that are a few months old, but whether this sensitivity is experience-dependent or is already present at birth is unknown. Here, we tested newborns to examine whether predispositions to process self-produced motion cues underlying animacy perception were present soon after birth. We systematically manipulated the onset of motion by self-propulsion (Experiment 1) and the change in trajectory direction in the presence or absence of direct contact with an external object (Experiments 2 and 3) to investigate how these motion cues determine preference in newborns. Overall, data demonstrated that, at least at birth, the self-propelled onset of motion is a crucial visual cue that allowed newborns to differentiate between self- and non-self-propelled objects (Experiment 1) because when this cue was removed, newborns did not manifest any visual preference (Experiment 2), even if they were able to discriminate between the stimuli (Experiment 3). To our knowledge, this is the first study aimed at identifying sensitivity in human newborns to the most basic and rudimentary motion cues that reliably trigger perceptions of animacy in adults. Our findings are compatible with the hypothesis of the existence of inborn predispositions to visual cues of motion that trigger animacy perception in adults.

11.
People often move in synchrony with auditory rhythms (e.g., music), whereas synchronization of movement with purely visual rhythms is rare. In two experiments, this apparent attraction of movement to auditory rhythms was investigated by requiring participants to tap their index finger in synchrony with an isochronous auditory (tone) or visual (flashing light) target sequence while a distractor sequence was presented in the other modality at one of various phase relationships. The obtained asynchronies and their variability showed that auditory distractors strongly attracted participants' taps, whereas visual distractors had much weaker effects, if any. This asymmetry held regardless of the spatial congruence or relative salience of the stimuli in the two modalities. When different irregular timing patterns were imposed on target and distractor sequences, participants' taps tended to track the timing pattern of auditory distractor sequences when they were approximately in phase with visual target sequences, but not the reverse. These results confirm that rhythmic movement is more strongly attracted to auditory than to visual rhythms. To the extent that this is an innate proclivity, it may have been an important factor in the evolution of music.
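Synchronization performance in tapping tasks like this is typically summarized by the mean and variability of the tap-target asynchronies. A minimal sketch under assumed, illustrative tap times (not data from the study):

```python
import statistics

def asynchronies(tap_times, target_times):
    """Signed tap-target asynchronies in seconds (negative = tap leads)."""
    return [tap - target for tap, target in zip(tap_times, target_times)]

# Hypothetical taps against an isochronous 500-ms target sequence.
targets = [0.5, 1.0, 1.5, 2.0]
taps = [0.47, 0.96, 1.48, 1.97]

a = asynchronies(taps, targets)
mean_async = statistics.mean(a)   # central tendency of the asynchronies
sd_async = statistics.stdev(a)    # their variability, the study's second measure
```

The negative mean in this made-up example mirrors the well-documented tendency for taps to slightly precede auditory targets; a distractor that "attracts" the taps would shift this mean toward the distractor's phase.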

12.
This study investigated multisensory interactions in the perception of auditory and visual motion. When auditory and visual apparent motion streams are presented concurrently in opposite directions, participants often fail to discriminate the direction of motion of the auditory stream, whereas perception of the visual stream is unaffected by the direction of auditory motion (Experiment 1). This asymmetry persists even when the perceived quality of apparent motion is equated for the 2 modalities (Experiment 2). Subsequently, it was found that this visual modulation of auditory motion is caused by an illusory reversal in the perceived direction of sounds (Experiment 3). This "dynamic capture" effect occurs over and above ventriloquism among static events (Experiments 4 and 5), and it generalizes to continuous motion displays (Experiment 6). These data are discussed in light of related multisensory phenomena and their support for a "modality appropriateness" interpretation of multisensory integration in motion perception.

13.
Rhythmic auditory stimuli presented before a goal-directed movement have been found to improve temporal and spatial movement outcomes. However, little is known about the mechanisms mediating these benefits. The present experiment used three types of auditory stimuli to probe how improved scaling of movement parameters, temporal preparation, and an external focus of attention may contribute to changes in movement performance. Three auditory stimuli (three metronome beats (RAS), a tone that stayed the same (tone-same), and a tone that increased in pitch (tone-change)) and a no-sound control were each presented for 1200 ms before movement initiation, with and without visual feedback, for a total of eight experimental conditions. The sound was presented before a visual go-signal, and participants were instructed to reach quickly and accurately to one of two targets randomly identified in left and right hemispace. Twenty-two young adults completed 24 trials per blocked condition in a counterbalanced order. Movements were captured with an Optotrak 3D Investigator, and a 4 (sound) by 2 (vision) repeated-measures ANOVA was used to analyze dependent variables. All auditory conditions had shorter reaction times than no sound. The tone-same and tone-change conditions had shorter movement times and higher peak velocities, with no change in trajectory variability or endpoint error. Therefore, rhythmic and non-rhythmic auditory stimuli impacted movement performance differently. Based on the pattern of results, we propose that multiple mechanisms impact movement planning processes when rhythmic auditory stimuli are present.

14.
Computer-driven visual displays (CDVDs), like television and movies, produce stroboscopic rather than continuous physical movement. The success with which the perception of motion is produced depends on factors such as the fineness of the raster and the temporal and spatial relationships of the stimulus points. For a given velocity, the more points there are on the movement trajectory, and the closer their spacing, the better is the perceived movement. Moderately slow retinal velocities (on the order of .4 to .8 deg/sec) produce the highest quality of perceived movement. One can discriminate among possible subclasses of movement detectors by presenting a complex sequence of intensities at two or more points and varying their cross-correlation. Motion between two areas can be perceived even when there is zero correlation between the spatial patterns in each location. Perceived motion can be of rotation, as well as of translation. The two-dimensional shadow of a rotating three-dimensional wire figure is perceived as a rotating, rigid, three-dimensional wire figure (the kinetic depth effect). A three-dimensional "shadow" of a hypothetical four-dimensional wire figure also has been produced; it was not seen as rigid.
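The cross-correlation manipulation mentioned above can be made concrete as a normalized (Pearson) correlation between the intensity sequences presented at two display points. A minimal sketch with made-up intensity values (the sequences are illustrative assumptions, not stimuli from the study):

```python
import math

def pearson(x, y):
    """Normalized cross-correlation (Pearson r) of two equal-length
    intensity sequences; r = 0 corresponds to the zero-correlation case."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(var_x * var_y)

# Hypothetical intensity sequences at two points on the movement trajectory.
point_a = [0.1, 0.9, 0.3, 0.7, 0.5]
point_b = [0.9, 0.3, 0.7, 0.5, 0.1]   # the same pattern shifted by one frame

r_identical = pearson(point_a, point_a)   # fully correlated patterns
r_shifted = pearson(point_a, point_b)     # correlation after a temporal shift
```

Varying this correlation between the two points while keeping their timing fixed is one way to probe which motion-detector subclasses respond to pattern correspondence and which respond only to the spatiotemporal displacement itself.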

15.
Macar, F. (2002). Acta Psychologica, 111(2), 243-262.
Two experiments indicate that prospective judgments of a temporal target are influenced by nontarget temporal features. The basic task was to reproduce a target interval marked by visual events. In addition, visual or auditory interfering events were delivered. Experiment 1 showed that temporal reproduction is shorter when the interfering events occupy a late rather than early position during the target interval, a result explained in terms of expectancy, which causes attention shifts. This study also revealed that similar trends are obtained whether the interfering event involves a specific task or is irrelevant. Experiment 2 confirmed the position effect, and showed that the duration of an irrelevant cue can influence judgment of the target interval, as if it were also timed without appropriate control.

16.
A perception of coherent motion can be obtained in an otherwise ambiguous or illusory visual display by directing one's attention to a feature and tracking it. We demonstrate an analogous auditory effect in two separate sets of experiments. The temporal dynamics associated with the attention-dependent auditory motion closely matched those previously reported for attention-based visual motion. Since attention-based motion mechanisms appear to exist in both modalities, we also tested for multimodal (audiovisual) attention-based motion, using stimuli composed of interleaved visual and auditory cues. Although subjects were able to track a trajectory using cues from both modalities, no one spontaneously perceived "multimodal motion" across both visual and auditory cues. Rather, they reported motion perception only within each modality, thereby revealing a spatiotemporal limit on putative cross-modal motion integration. Together, results from these experiments demonstrate the existence of attention-based motion in audition, extending current theories of attention-based mechanisms from visual to auditory systems.

17.
Although music and dance are often experienced simultaneously, it is unclear what modulates their perceptual integration. This study investigated how two factors related to music–dance correspondences influenced audiovisual binding of their rhythms: the metrical match between the music and dance, and the kinematic familiarity of the dance movement. Participants watched a point-light figure dancing synchronously to a triple-meter rhythm that they heard in parallel, whereby the dance communicated a triple (congruent) or a duple (incongruent) visual meter. The movement was either the participant's own or that of another participant. Participants attended to both streams while detecting a temporal perturbation in the auditory beat. The results showed lower sensitivity to the auditory deviant when the visual dance was metrically congruent to the auditory rhythm and when the movement was the participant's own. This indicated stronger audiovisual binding and a more coherent bimodal rhythm in these conditions, thus making a slight auditory deviant less noticeable. Moreover, binding in the metrically incongruent condition involving self-generated visual stimuli was correlated with self-recognition of the movement, suggesting that action simulation mediates the perceived coherence between one's own movement and a mismatching auditory rhythm. Overall, the mechanisms of rhythm perception and action simulation could inform the perceived compatibility between music and dance, thus modulating the temporal integration of these audiovisual stimuli.

18.
Target velocity effects on manual interception kinematics
Participants generated manual interception movements toward a target cursor that moved across a computer screen. The target reached its peak velocity either during the first third, at the midpoint, or during the last third of the movement. In Experiment 1 the view of the target was available for either the first 316, 633, 950, or 1267 ms, after which it disappeared. Results showed that for all viewing conditions, the timing of the interception velocity was related to the temporal properties of the target's trajectory. In Experiment 2, when the portion of the target trajectory that was viewed was reversed (such that participants did not see the first 316, 633, 950, or 1267 ms of the trajectory, but instead saw only the later portions of the trajectory), there was no clear relationship between the target trajectory and the timing of the aiming trajectory. These results suggest that participants use visual information early in the target's trajectory to form a representation of the target motion that is used to facilitate manual interception.

19.
Phillips-Silver and Trainor (Phillips-Silver, J., Trainor, L.J., (2005). Feeling the beat: movement influences infants' rhythm perception. Science, 308, 1430) demonstrated an early cross-modal interaction between body movement and auditory encoding of musical rhythm in infants. Here we show that the way adults move their bodies to music influences their auditory perception of the rhythm structure. We trained adults, while listening to an ambiguous rhythm with no accented beats, to bounce by bending their knees to interpret the rhythm either as a march or as a waltz. At test, adults identified as similar an auditory version of the rhythm pattern with accented strong beats that matched their previous bouncing experience in comparison with a version whose accents did not match. In subsequent experiments we showed that this effect does not depend on visual information, but that movement of the body is critical. Parallel results from adults and infants suggest that the movement-sound interaction develops early and is fundamental to music processing throughout life.

20.
Behavioral studies of multisensory integration in motion perception have focused on the particular case of visual and auditory signals. Here, we addressed a new case: audition and touch. In Experiment 1, we tested the effects of an apparent motion stream presented in an irrelevant modality (audition or touch) on the perception of apparent motion streams in the other modality (touch or audition, respectively). We found significant congruency effects (lower performance when the direction of motion in the irrelevant modality was incongruent with the direction of the target) for the two possible modality combinations. This congruency effect was asymmetrical, with tactile motion distractors having a stronger influence on auditory motion perception than vice versa. In Experiment 2, we used auditory motion targets and tactile motion distractors while participants adopted one of two possible postures: arms uncrossed or arms crossed. The effects of tactile motion on auditory motion judgments were replicated in the arms-uncrossed posture, but they dissipated in the arms-crossed posture. The implications of these results are discussed in light of current findings regarding the representation of tactile and auditory space.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号