Similar Articles
20 similar articles found (search time: 31 ms)
1.
Temporal expectation is a process by which people use temporally structured sensory information to explicitly or implicitly predict the onset and/or the duration of future events. Because timing plays a critical role in crossmodal interactions, we investigated how temporal expectation influenced auditory–visual interaction, using an auditory–visual crossmodal congruity effect as a measure of crossmodal interaction. For auditory identification, an incongruent visual stimulus produced stronger interference when the crossmodal stimulus was presented with an expected rather than an unexpected timing. In contrast, for visual identification, an incongruent auditory stimulus produced weaker interference when the crossmodal stimulus was presented with an expected rather than an unexpected timing. The fact that temporal expectation made visual distractors more potent and visual targets less susceptible to auditory interference suggests that temporal expectation increases the perceptual weight of visual signals.

2.
A. Au & B. Lovegrove, Perception, 2001, 30(9), 1127-1142
In the present study, the role of rapid visual and auditory temporal processing in reading irregular and nonsense words was investigated with a group of normal readers. One hundred and five undergraduates participated in various visual and auditory temporal-processing tasks. Readers who primarily adopted the phonological route in reading (nonsense-word readers) showed a trend for better auditory temporal resolution, but readers who primarily adopted sight word skills (irregular-word readers) did not exhibit better visual temporal resolution. Both the correlation and stepwise multiple-regression analyses, however, revealed a relationship between visual temporal processing and irregular-word reading as well as a relationship between auditory temporal processing and nonsense-word reading. The results support the involvement of visual and auditory processing in reading irregular and nonsense words respectively, and are discussed with respect to recent findings that only dyslexics with phonological impairment display temporal deficits. Further, the temporal measures were not effective discriminants for the reading groups, suggesting a lack of association between reading ability and the choice of reading strategy.

3.
When the senses deliver conflicting information, vision dominates spatial processing, and audition dominates temporal processing. We asked whether this sensory specialization results in cross-modal encoding of unisensory input into the task-appropriate modality. Specifically, we investigated whether visually portrayed temporal structure receives automatic, obligatory encoding in the auditory domain. In three experiments, observers judged whether the changes in two successive visual sequences followed the same or different rhythms. We assessed temporal representations by measuring the extent to which both task-irrelevant auditory information and task-irrelevant visual information interfered with rhythm discrimination. Incongruent auditory information significantly disrupted task performance, particularly when presented during encoding; by contrast, varying the nature of the rhythm-depicting visual changes had minimal impact on performance. Evidently, the perceptual system automatically and obligatorily abstracts temporal structure from its visual form and represents this structure using an auditory code, resulting in the experience of "hearing visual rhythms."

4.
Three experiments tested the idea that auditory presentation facilitates temporal recall whereas spatial recall is better if the input modality is visual. Lists of words were presented in which the temporal and spatial orders were independent, and instructions to the subjects determined whether recall would be given in a spatial or temporal order. In all three experiments, a significant interaction between the input modality and the type of recall was found, such that visual presentation resulted in superior recall over auditory presentation in the spatial conditions and auditory presentation yielded superior recall to visual in the temporal conditions. The present results contradict an earlier study by Murdock that showed that auditory presentation resulted in better performance than visual presentation in a nominally spatial task. An explanation for the discrepancies between the results of that study and the present one is presented.

5.
People naturally dance to music, and research has shown that rhythmic auditory stimuli facilitate production of precisely timed body movements. If motor mechanisms are closely linked to auditory temporal processing, just as auditory temporal processing facilitates movement production, producing action might reciprocally enhance auditory temporal sensitivity. We tested this novel hypothesis with a standard temporal-bisection paradigm, in which the slope of the temporal-bisection function provides a measure of temporal sensitivity. The bisection slope for auditory time perception was steeper when participants initiated each auditory stimulus sequence via a keypress than when they passively heard each sequence, demonstrating that initiating action enhances auditory temporal sensitivity. This enhancement is specific to the auditory modality, because voluntarily initiating each sequence did not enhance visual temporal sensitivity. A control experiment ruled out the possibility that tactile sensation associated with a keypress increased auditory temporal sensitivity. Taken together, these results demonstrate a unique reciprocal relationship between auditory time perception and motor mechanisms. As auditory perception facilitates precisely timed movements, generating action enhances auditory temporal sensitivity.

6.
Unlike visual and tactile stimuli, auditory signals that allow perception of timbre, pitch and localization are temporal. To process these, the auditory nervous system must either possess specialized neural machinery for analyzing temporal input, or transform the initial responses into patterns that are spatially distributed across its sensory epithelium. The former hypothesis, which postulates the existence of structures that facilitate temporal processing, is most popular. However, I argue that the cochlea transforms sound into spatiotemporal response patterns on the auditory nerve and central auditory stages; and that a unified computational framework exists for central auditory, visual and other sensory processing. Specifically, I explain how four fundamental concepts in visual processing play analogous roles in auditory processing.

7.
Rhythmic auditory stimuli presented before a goal-directed movement have been found to improve temporal and spatial movement outcomes. However, little is known about the mechanisms mediating these benefits. The present experiment used three types of auditory stimuli to probe how improved scaling of movement parameters, temporal preparation and an external focus of attention may contribute to changes in movement performance. Three types of auditory stimuli were presented for 1200 ms before movement initiation: three metronome beats (RAS), a tone that stayed the same (tone-same), and a tone that increased in pitch (tone-change). Together with a no-sound control, these were presented with and without visual feedback for a total of eight experimental conditions. The sound was presented before a visual go-signal, and participants were instructed to reach quickly and accurately to one of two targets randomly identified in left and right hemispace. Twenty-two young adults completed 24 trials per blocked condition in a counterbalanced order. Movements were captured with an Optotrak 3D Investigator, and a 4 (sound) × 2 (vision) repeated measures ANOVA was used to analyze dependent variables. All auditory conditions had shorter reaction times than no sound. Tone-same and tone-change conditions had shorter movement times and higher peak velocities, with no change in trajectory variability or endpoint error. Therefore, rhythmic and non-rhythmic auditory stimuli impacted movement performance differently. Based on the pattern of results we propose multiple mechanisms impact movement planning processes when rhythmic auditory stimuli are present.

8.
K. Königs, J. Knöll & F. Bremmer, Perception, 2007, 36(10), 1507-1512
Previous studies have shown that the perceived location of visual stimuli briefly flashed during smooth pursuit, saccades, or optokinetic nystagmus (OKN) is not veridical. We investigated whether these mislocalisations can also be observed for brief auditory stimuli presented during OKN. Experiments were carried out in a lightproof sound-attenuated chamber. Participants performed eye movements elicited by visual stimuli. An auditory target (white noise) was presented for 5 ms. Our data clearly indicate that auditory targets are mislocalised during reflexive eye movements. OKN induces a shift of perceived location in the direction of the slow eye movement and is modulated in the temporal vicinity of the fast phase. The mislocalisation is stronger for look- as compared to stare-nystagmus. The size and temporal pattern of the observed mislocalisation are different from that found for visual targets. This suggests that different neural mechanisms are at play to integrate oculomotor signals and information on the spatial location of visual as well as auditory stimuli.

9.
Two experiments were conducted to examine the performance of normal adults, normal children, and children diagnosed with central auditory dysfunction presumed to involve the interhemispheric pathways on a dichotic digits test in common clinical use for the diagnosis of central auditory processing disorder (CAPD) and its corresponding visual analog. Results of the first experiment revealed a significant right ear advantage (REA) for the dichotic listening task and a left-visual-field advantage (LVFA) for the corresponding visual analog in normal adults and children. In the second experiment, results revealed a significantly larger REA in the children with CAPD as compared to the normal children. Results also revealed a reversed cerebral asymmetry (RVFA) for the children with CAPD on the visual task. Significant cross-modal correlations suggest that the two tasks may reflect, at least in part, similar interhemispheric processing mechanisms in children. Findings are discussed in relation to differential diagnosis and modality-specificity of CAPD.

10.
In this paper we examine the evidence for human brain areas dedicated to visual or auditory word form processing by comparing cortical activation for auditory word repetition, reading, picture naming, and environmental sound naming. Both reading and auditory word repetition activated left lateralised regions in the frontal operculum (Broca's area), posterior superior temporal gyrus (Wernicke's area), posterior inferior temporal cortex, and a region in the mid superior temporal sulcus relative to baseline conditions that controlled for sensory input and motor output processing. In addition, auditory word repetition increased activation in a lateral region of the left mid superior temporal gyrus, but critically, this area is not specific to auditory word processing: it is also activated in response to environmental sounds. There were no reading-specific activations, even in the areas previously claimed as visual word form areas: activations were either common to reading and auditory word repetition or common to reading and picture naming. We conclude that there is no current evidence for cortical sites dedicated to visual or auditory word form processing.

11.
Synchronization of finger taps with periodically flashing visual stimuli is known to be much more variable than synchronization with an auditory metronome. When one of these rhythms is the synchronization target and the other serves as a distracter at various temporal offsets, strong auditory dominance is observed. However, it has recently been shown that visuomotor synchronization improves substantially with moving stimuli such as a continuously bouncing ball. The present study pitted a bouncing ball against an auditory metronome in a target–distracter synchronization paradigm, with the participants being auditory experts (musicians) and visual experts (video gamers and ball players). Synchronization was still less variable with auditory than with visual target stimuli in both groups. For musicians, auditory stimuli tended to be more distracting than visual stimuli, whereas the opposite was the case for the visual experts. Overall, there was no main effect of distracter modality. Thus, a distracting spatiotemporal visual rhythm can be as effective as a distracting auditory rhythm in its capacity to perturb synchronous movement, but its effectiveness also depends on modality-specific expertise.

12.
Two new, long-lasting phenomena involving modality of stimulus presentation are documented. In one series of experiments we investigated effects of modality of presentation on order judgments. Order judgments for auditory words were more accurate than order judgments for visual words at both the beginning and the end of lists, and the auditory advantage increased with the temporal separation of the successive items. A second series of experiments investigated effects of modality on estimates of presentation frequency. Frequency estimates of repeated auditory words exceeded frequency estimates of repeated visual words. The auditory advantage increased with frequency of presentation, and this advantage was not affected by the retention interval. These various effects were taken as support for a temporal coding assumption, that auditory presentation produces a more accurate encoding of time of presentation than does visual presentation.

13.
In some people, visual stimulation evokes auditory sensations. How prevalent and how perceptually real is this? 22% of our neurotypical adult participants responded 'Yes' when asked whether they heard faint sounds accompanying flash stimuli, and showed significantly better ability to discriminate visual 'Morse-code' sequences. This benefit might arise from an ability to recode visual signals as sounds, thus taking advantage of the superior temporal acuity of audition. In support of this, those who showed better visual relative to auditory sequence discrimination also had poorer auditory detection in the presence of uninformative visual flashes, though this was independent of awareness of visually evoked sounds. Thus, a visually evoked auditory representation may occur subliminally and disrupt detection of real auditory signals. The frequent natural correlation between visual and auditory stimuli might explain the surprising prevalence of this phenomenon. Overall, our results suggest that learned correspondences between strongly correlated modalities may provide a precursor for some synaesthetic abilities.

14.
This study examined the relative involvement of rapid auditory and visual temporal resolution mechanisms in the reading of phonologically regular pseudowords and English irregular words presented both in isolation and in contiguity as a series of six words. Seventy-nine undergraduates participated in a range of reading, visual temporal, and auditory temporal tasks. The correlation analyses suggested a general timing mechanism across modalities. There were more significant correlations between the visual temporal measures and irregular word reading and between the auditory measures and pseudoword reading. Auditory gap detection predicted pseudoword reading accuracies. The low temporal frequency flicker contrast sensitivity measure predicted the accuracies of isolated irregular words and pseudowords presented in contiguity. However, when a combined speed-accuracy score was used, visible persistence at both low and high spatial frequencies and auditory gap detection were active in the reading of pseudowords presented in contiguity. Sensory processing skills in both visual and auditory modalities accounted for some of the variance in the reading performance of normal undergraduates, not just reading-impaired students.

15.
Investigation of the effect that a word recognition task has on concurrent nonverbal tasks showed (a) that auditory verbal messages affected visual tracking performance but not the detection of brief light flashes in the visual periphery, and (b) that impairment of both tracking and light detection was greater when verbal messages were visual rather than auditory. With a kinaesthetic tracking task, errors increased significantly during auditory messages but were even greater during visual messages. There was no interaction between the modality of tracking error feedback (auditory or visual) and the modality of the verbal message. Nor was the decrement from visual messages reduced by changing the presentation format. It is suggested that different temporal characteristics of visual and auditory information affect the attentional demands of verbal messages.

16.
The present study tested the effect of an extended music curriculum (EMC) for two years in secondary school, consisting of musical instrument, auditory perception, and music theory training, on children's visual and auditory memory. We tested 10-year-old children who had just started EMC and children without EMC (T0) in visual and auditory memory and retested the same children two years later (T1) to observe the effects of school music training. Confounding variables, like intelligence, socioeconomic status, extracurricular schooling, motivation to avoid work, and musical aptitude were controlled. Prior to the beginning of the music training no differences in the control variables and the memory variables between children with and without EMC were revealed. Children with EMC improved significantly from T0 to T1 in visual as well as in auditory memory. Such an improvement was not found for children without EMC. We conclude that extended school music training enhances children's visual and auditory memory.

17.
This experiment investigated the effect of modality on temporal discrimination in children aged 5 and 8 years and adults using a bisection task with visual and auditory stimuli ranging from 200 to 800 ms. In the first session, participants were required to compare stimulus durations with standard durations presented in the same modality (within-modality session), and in the second session in different modalities (cross-modal session). Psychophysical functions were orderly in all age groups, with the proportion of long responses (judgement that a duration was more similar to the long than to the short standard) increasing with the stimulus duration, although functions were flatter in the 5-year-olds than in the 8-year-olds and adults. Auditory stimuli were judged to be longer than visual stimuli in all age groups. The statistical results and a theoretical model suggested that this modality effect was due to differences in the pacemaker speed of the internal clock. The 5-year-olds also judged visual stimuli as more variable than auditory ones, indicating that their temporal sensitivity was lower in the visual than in the auditory modality.
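The temporal-bisection analysis this abstract (and abstract 5) relies on can be sketched numerically. The following is a minimal illustration with made-up response proportions, not data from either study: the proportion of "long" responses at each duration between the 200 ms and 800 ms standards is logit-transformed and fit with a line, so the slope indexes temporal sensitivity and the zero crossing gives the bisection point (the duration judged equally similar to both standards).

```python
import numpy as np

# Hypothetical bisection data (illustrative only): proportion of "long"
# responses at each stimulus duration for two groups. A flatter
# psychometric function, as reported for the 5-year-olds, shows up
# as a shallower slope.
durations = np.array([200, 300, 400, 500, 600, 700, 800])  # ms
p_long = {
    "adult":      np.array([0.02, 0.10, 0.30, 0.55, 0.80, 0.93, 0.98]),
    "5-year-old": np.array([0.15, 0.25, 0.38, 0.52, 0.65, 0.75, 0.85]),
}

fits = {}
for group, p in p_long.items():
    # Logit transform linearises the logistic psychometric function.
    logit = np.log(p / (1 - p))
    slope, intercept = np.polyfit(durations, logit, 1)
    # Bisection point: the duration where p("long") = 0.5, i.e. logit = 0.
    bisection_point = -intercept / slope
    fits[group] = (slope, bisection_point)
    print(f"{group}: slope = {slope:.4f}/ms, "
          f"bisection point ~ {bisection_point:.0f} ms")
```

On these illustrative numbers the adult slope comes out steeper than the child slope, which is how the flatter functions of the younger group translate into a quantitative statement of lower temporal sensitivity.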

18.
Male albino rats were trained on an adjusting avoidance schedule in which each lever press accumulated a given amount of shock-free time. Multiple auditory and visual stimuli were programmed for each discrete temporal distance from the shock in an effort to place the avoidance behavior under the control of the shock proximity. The effects of the stimuli were further examined by presenting part of them and then by removing them altogether. With the combined auditory and visual stimuli, the rat spent most of the time relatively close to the shock and usually started to respond only when the shock was near. With the visual stimuli only, the rat kept the shock at intermediate temporal distances and responded more variably. The behavior with the auditory stimuli alone was quite similar to that produced by the combined stimuli, thus indicating that the auditory stimuli exercised the greater control. When all stimuli were removed, the animal usually kept the shock as far away as the procedure permitted. When only a single pre-shock stimulus was presented, the rat remained quite close to the shock and started to respond predominantly in the pre-shock step.

19.
It has previously been shown that adults localize unseen auditory targets more accurately with their eyes open than closed. The interpretation usually proposed to explain this phenomenon is that auditory spatial information is referred or translated to a visual frame of reference. The present experiments show that the presence of an auditory reference point facilitates auditory localization judgements in the same manner as a visual reference point does. Although our results do not support the visual frame of reference hypothesis, they suggest that the auditory and the visual modalities are strongly linked in their localizing processes.

20.
This study investigated whether explicit beat induction in the auditory, visual, and audiovisual (bimodal) modalities aided the perception of weakly metrical auditory rhythms, and whether it reinforced attentional entrainment to the beat of these rhythms. The visual beat-inducer was a periodically bouncing point-light figure, which aimed to examine whether an observed rhythmic human movement could induce a beat that would influence auditory rhythm perception. In two tasks, participants listened to three repetitions of an auditory rhythm that were preceded and accompanied by (1) an auditory beat, (2) a bouncing point-light figure, (3) a combination of (1) and (2) synchronously, or (4) a combination of (1) and (2), with the figure moving in anti-phase to the auditory beat. Participants reproduced the auditory rhythm subsequently (Experiment 1), or detected a possible temporal change in the third repetition (Experiment 2). While an explicit beat did not improve rhythm reproduction, possibly due to the syncopated rhythms when a beat was imposed, bimodal beat induction yielded greater sensitivity to a temporal deviant in on-beat than in off-beat positions. Moreover, the beat phase of the figure movement determined where on-beat accents were perceived during bimodal induction. Results are discussed with regard to constrained beat induction in complex auditory rhythms, visual modulation of auditory beat perception, and possible mechanisms underlying the preferred visual beat consisting of rhythmic human motions.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)