Similar articles
20 similar articles found (search time: 15 ms)
1.
The EKG was recorded while Ss differentially responded to auditory or visual stimuli in a reaction time task. The EKG record was analyzed by dividing each R-R interval encompassing a stimulus presentation into 9 equal phases. Reaction times were determined as a function of the phase encompassing stimulus onset, while movement times were determined for the phase in which the response was initiated. Only reaction time significantly varied with cardiac cycle, with reactions during the second phase being slower than those during later phases.
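The phase-division analysis described above can be sketched in a few lines. This is a minimal illustration, not the authors' code: the function name, time units (seconds), and the half-open binning convention are assumptions.

```python
# Hedged sketch: assign a stimulus onset to one of 9 equal phases of the
# R-R interval containing it, as in the abstract above. Names and the
# binning convention are illustrative assumptions.

def cardiac_phase(t_stim, t_r_prev, t_r_next, n_phases=9):
    """Return the 1-based phase (1..n_phases) of the R-R interval
    [t_r_prev, t_r_next) in which a stimulus at t_stim falls."""
    if not (t_r_prev <= t_stim < t_r_next):
        raise ValueError("stimulus falls outside the given R-R interval")
    rr = t_r_next - t_r_prev
    # Fraction of the cycle elapsed at stimulus onset, mapped to a bin.
    frac = (t_stim - t_r_prev) / rr
    return int(frac * n_phases) + 1
```

For an 800 ms R-R interval, for example, each phase spans roughly 89 ms, so a stimulus 100 ms after the R wave falls in phase 2.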

2.
Mood states and anxiety might alter performance in complex tasks, whereas in simpler tasks such as stimulus-response tasks, high anxiety could bias mechanisms of attention, leading to better performance. We investigated the effects of anxiety, tension, and fatigue induced by the video-recorded Stroop Color-Word Interference Test on either reaction or movement time. 61 subjects performed a visual and an auditory response-time test in Control and Anxiogenic conditions during which heart rate was measured. Tension and anxiety states were assessed using self-evaluation. Analysis showed auditory response time was improved for both reaction and movement times in the Anxiogenic condition. These data suggest that the increased attention underlying anxiety and mood responses could have favored auditory response time by leading subjects to process stimuli more actively. In addition, state-anxiety and tension could have influenced muscular tension, enhancing the movement time in the auditory task.

3.
Rhythmic auditory stimuli presented before a goal-directed movement have been found to improve temporal and spatial movement outcomes. However, little is known about the mechanisms mediating these benefits. The present experiment used three types of auditory stimuli to probe how improved scaling of movement parameters, temporal preparation and an external focus of attention may contribute to changes in movement performance. Auditory stimuli were presented for 1200 ms before movement initiation: three metronome beats (RAS), a tone that stayed the same (tone-same), a tone that increased in pitch (tone-change), and a no-sound control were presented with and without visual feedback for a total of eight experimental conditions. The sound was presented before a visual go-signal, and participants were instructed to reach quickly and accurately to one of two targets randomly identified in left and right hemispace. Twenty-two young adults completed 24 trials per blocked condition in a counterbalanced order. Movements were captured with an Optotrak 3D Investigator, and a 4 (sound) by 2 (vision) repeated measures ANOVA was used to analyze dependent variables. All auditory conditions had shorter reaction times than no sound. Tone-same and tone-change conditions had shorter movement times and higher peak velocities, with no change in trajectory variability or endpoint error. Therefore, rhythmic and non-rhythmic auditory stimuli impacted movement performance differently. Based on the pattern of results we propose multiple mechanisms impact movement planning processes when rhythmic auditory stimuli are present.

4.
The effects of chronic, daily administration of cocaine on auditory and visual reaction times and thresholds were studied in baboons. Single intramuscular injections of cocaine hydrochloride (0.1 to 5.6 mg/kg) were given once daily for periods of 10 to 25 days, and were followed immediately by psychophysical tests designed to assess cocaine's effects on simple reaction times as well as on auditory and visual threshold functions. Consistent reductions in reaction times were frequently observed over the cocaine dose range of 0.32 to 1.0 mg/kg; at higher doses, either decreases or increases in reaction times were observed, depending upon the animal. Lowered reaction times generally occurred immediately following the 1st day's cocaine injection, and continued through all subsequent days during the dose administration period, suggesting little development of tolerance or sensitivity to these reaction-time effects. Reaction-time decreases showed a U-shaped dose-effect function. The greatest decreases in reaction times occurred from 0.32 to 1.0 mg/kg, and produced an average reaction-time decrease of 10 to 12%. Concurrently measured auditory and visual thresholds showed no systematic changes as a function of cocaine dose. Pausing was observed during performance of the psychophysical tasks, with the length of total session pause times being directly related to cocaine dose.

5.
Contrasting the traditional focus on alcohol‐related visual images, this study examined the impact of both alcohol‐related auditory cues and visual stimuli on inhibitory control (IC). Fifty‐eight participants completed a Go/No‐Go Task, with alcohol‐related and neutral visual stimuli presented with or without short or continuous auditory bar cues. Participants performed worse when presented with alcohol‐related images and auditory cues. Problematic alcohol consumption and higher effortful control (EC) were associated with better IC performance for alcohol images. It is postulated that those with higher EC may be better able to ignore alcohol‐related stimuli, while those with problematic alcohol consumption are unconsciously less attuned to these. This runs contrary to current dogma and highlights the importance of examining both auditory and visual stimuli when investigating IC.

6.
For some stimuli, dynamic changes are crucial for identifying just what the stimuli are. For example, spoken words (or any auditory stimuli) require change over time to be recognized. Kallman and Cameron (1989) have proposed that this sort of dynamic change underlies the enhanced recency effect found for auditory stimuli, relative to visual stimuli. The results of three experiments replicate and extend Kallman and Cameron's finding that dynamic visual stimuli (that is, visual stimuli in which movement is necessary to identify the stimuli), relative to static visual stimuli, engender enhanced recency effects. In addition, an analysis based on individual differences is used to demonstrate that the processes underlying enhanced recency effects for auditory and dynamic visual stimuli are substantially similar. These results are discussed in the context of perceptual grouping processes.

7.
Xiao M, Wong M, Umali M, Pomplun M. Perception, 2007, 36(9): 1391-1395
Perceptual integration of audio-visual stimuli is fundamental to our everyday conscious experience. Eye-movement analysis may be a suitable tool for studying such integration, since eye movements respond to auditory as well as visual input. Previous studies have shown that additional auditory cues in visual-search tasks can guide eye movements more efficiently and reduce their latency. However, these auditory cues were task-relevant since they indicated the target position and onset time. Therefore, the observed effects may have been due to subjects using the cues as additional information to maximize their performance, without perceptually integrating them with the visual displays. Here, we combine a visual-tracking task with a continuous, task-irrelevant sound from a stationary source to demonstrate that audio-visual perceptual integration affects low-level oculomotor mechanisms. Auditory stimuli of constant, increasing, or decreasing pitch were presented. All sound categories induced more smooth-pursuit eye movement than silence, with the greatest effect occurring with stimuli of increasing pitch. A possible explanation is that integration of the visual scene with continuous sound creates the perception of continuous visual motion. Increasing pitch may amplify this effect through its common association with accelerating motion.

8.
Many studies have shown that subjects can correct their own errors of movement more quickly than they can react to external stimuli. In the control of movements, three general categories of feedback have been defined as follows: (a) knowledge of results, primarily visually mediated, (b) proprioceptive or kinesthetic, such as from muscle spindles and joint receptors, and (c) corollary discharge or efference copy within the central nervous system. Experiments were conducted on eight normal human subjects to study the effects of these feedbacks on simple RT, choice RT, and error correction time. The movement used was plantarflexion and dorsiflexion of the ankle joint. The feedback loops were modified (a) by inverting the visual display to alter the subject's perception of results and (b) by applying a 100-Hz vibration simultaneously to both flexor and extensor muscles of the ankle joint. Central processing was altered by giving the subjects moderate doses of alcohol (blood-alcohol concentration levels of up to .10%). Vibration and alcohol increased both simple and choice RT but not the error correction time. These data reinforce the concept that there is a central pathway which can mediate error correcting responses.

9.
Four experiments were conducted in order to compare the effects of stimulus redundancy on temporal order judgments (TOJs) and reaction times (RTs). In Experiments 1 and 2, participants were presented in each trial with a tone and either a single visual stimulus or two redundant visual stimuli. They were asked to judge whether the tone or the visual display was presented first. Judgments of the relative onset times of the visual and the auditory stimuli were virtually unaffected by the presentation of redundant, rather than single, visual stimuli. Experiments 3 and 4 used simple RT tasks with the same stimuli, and responses were much faster to redundant than to single visual stimuli. It appears that the traditional speedup of RT associated with redundant visual stimuli arises after the stimulus detection processes to which TOJs are sensitive.

10.
People often move in synchrony with auditory rhythms (e.g., music), whereas synchronization of movement with purely visual rhythms is rare. In two experiments, this apparent attraction of movement to auditory rhythms was investigated by requiring participants to tap their index finger in synchrony with an isochronous auditory (tone) or visual (flashing light) target sequence while a distractor sequence was presented in the other modality at one of various phase relationships. The obtained asynchronies and their variability showed that auditory distractors strongly attracted participants' taps, whereas visual distractors had much weaker effects, if any. This asymmetry held regardless of the spatial congruence or relative salience of the stimuli in the two modalities. When different irregular timing patterns were imposed on target and distractor sequences, participants' taps tended to track the timing pattern of auditory distractor sequences when they were approximately in phase with visual target sequences, but not the reverse. These results confirm that rhythmic movement is more strongly attracted to auditory than to visual rhythms. To the extent that this is an innate proclivity, it may have been an important factor in the evolution of music.

11.
Effectively executing goal-directed behaviours requires both temporal and spatial accuracy. Previous work has shown that providing auditory cues enhances the timing of upper-limb movements. Interestingly, alternate work has shown beneficial effects of multisensory cueing (i.e., combined audiovisual) on temporospatial motor control. As a result, it is not clear whether adding visual to auditory cues can enhance the temporospatial control of sequential upper-limb movements specifically. The present study utilized a sequential pointing task to investigate the effects of auditory, visual, and audiovisual cueing on temporospatial errors. Eighteen participants performed pointing movements to five targets representing short, intermediate, and large movement amplitudes. Five isochronous auditory, visual, or audiovisual priming cues were provided to specify an equal movement duration for all amplitudes prior to movement onset. Movement time errors were then computed as the difference between actual and predicted movement times specified by the sensory cues, yielding delta movement time errors (ΔMTE). It was hypothesized that auditory-based (i.e., auditory and audiovisual) cueing would yield lower movement time errors compared to visual cueing. The results showed that providing auditory relative to visual priming cues alone reduced ΔMTE, particularly for intermediate amplitude movements. The results further highlighted the beneficial impact of unimodal auditory cueing for improving visuomotor control in the absence of significant effects for the multisensory audiovisual condition.
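The ΔMTE measure described above reduces to a signed difference between the actual movement time and the movement time specified by the isochronous cue. A minimal sketch, assuming millisecond units and illustrative names (the paper's exact averaging scheme is not given in the abstract):

```python
# Hedged sketch: delta movement time error (ΔMTE) as described in the
# abstract above: actual minus cue-specified movement time, averaged
# across trials. Units (ms) and names are assumptions.

def delta_mte(actual_mts, cue_interval):
    """Mean signed movement-time error (ms) relative to the duration
    specified by the priming cue."""
    errors = [mt - cue_interval for mt in actual_mts]
    return sum(errors) / len(errors)
```

For example, with a cue specifying 500 ms, movement times of 530 and 510 ms give a mean ΔMTE of +20 ms.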

12.
Numerous studies have focused on the diversity of audiovisual integration between younger and older adults. However, consecutive trends in audiovisual integration throughout life are still unclear. In the present study, to clarify audiovisual integration characteristics in middle-aged adults, we instructed younger and middle-aged adults to conduct an auditory/visual stimuli discrimination experiment. Randomized streams of unimodal auditory (A), unimodal visual (V) or audiovisual stimuli were presented on the left or right hemispace of the central fixation point, and subjects were instructed to respond to the target stimuli rapidly and accurately. Our results demonstrated that the responses of middle-aged adults to all unimodal and bimodal stimuli were significantly slower than those of younger adults (p < 0.05). Audiovisual integration was markedly delayed (onset time 360 ms) and weaker (peak 3.97%) in middle-aged adults than in younger adults (onset time 260 ms, peak 11.86%). The results suggested that audiovisual integration was attenuated in middle-aged adults and further confirmed age-related decline in information processing.
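The onset-time and peak percentages reported above are typically derived from a probability-difference analysis, comparing the bimodal response-time CDF against an independent-race prediction from the two unimodal CDFs. The sketch below shows that family of computation; the assumption that the standard independent-race formula P(A)+P(V)−P(A)P(V) was used is ours, not stated in the abstract.

```python
# Hedged sketch: audiovisual integration as the difference between the
# observed bimodal cumulative response probability and an
# independent-race prediction. Names and the race formula are
# assumptions about this family of analyses, not the paper's code.

def integration_benefit(p_av, p_a, p_v):
    """Bimodal cumulative probability minus the independent-race
    prediction P(A) + P(V) - P(A)P(V), at one time point."""
    race = p_a + p_v - p_a * p_v
    return p_av - race
```

Evaluated across time points, the largest positive value of this difference corresponds to the "peak" integration percentage, and the first time it exceeds zero to the "onset time".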

13.
Here, we investigate how audiovisual context affects perceived event duration with experiments in which observers reported which of two stimuli they perceived as longer. Target events were visual and/or auditory and could be accompanied by nontargets in the other modality. Our results demonstrate that the temporal information conveyed by irrelevant sounds is automatically used when the brain estimates visual durations but that irrelevant visual information does not affect perceived auditory duration (Experiment 1). We further show that auditory influences on subjective visual durations occur only when the temporal characteristics of the stimuli promote perceptual grouping (Experiments 1 and 2). Placed in the context of scalar expectancy theory of time perception, our third and fourth experiments have the implication that audiovisual context can lead both to changes in the rate of an internal clock and to temporal ventriloquism-like effects on perceived on- and offsets. Finally, intramodal grouping of auditory stimuli diminished any crossmodal effects, suggesting a strong preference for intramodal over crossmodal perceptual grouping (Experiment 5).

14.
Dual-process accounts of working memory have suggested distinct encoding processes for verbal and visual information in working memory, but encoding for nonspeech sounds (e.g., tones) is not well understood. This experiment modified the sentence-picture verification task to include nonspeech sounds with a complete factorial examination of all possible stimulus pairings. Participants studied simple stimuli (pictures, sentences, or sounds) and encoded the stimuli verbally, as visual images, or as auditory images. Participants then compared their encoded representations to verification stimuli (again pictures, sentences, or sounds) in a two-choice reaction time task. With some caveats, the encoding strategy appeared to be as important or more important than the external format of the initial stimulus in determining the speed of verification decisions. Findings suggested that: (1) auditory imagery may be distinct from verbal and visuospatial processing in working memory; (2) visual perception but not visual imagery may automatically activate concurrent verbal codes; and (3) the effects of hearing a sound may linger for some time despite recoding in working memory. We discuss the role of auditory imagery in dual-process theories of working memory.

15.
Instrumentation is described which permits study of the effects of different forms of visual feedback display on the patterns of fine movement obtained from the extended human index finger when the subject is attempting to keep his finger at a fixed point in space. The task is a compensatory tracking task in which the only source of input to the system is the subject's own finger movement. The effects of increasing the gain (or amplification) of a proportional error signal on the pattern of finger movement were studied. Gains of 1, 2, 4, 10, 20 and 40 were studied with a group of 24 subjects. Increasing the gain of a proportional error signal resulted in a marked improvement in the ability of subjects to maintain their extended finger at a fixed point in space. As the gain of the error signal was increased, the subject's high-amplitude, low-frequency errors were reduced, and there was a progressive appearance of high-frequency activity of low amplitude, more accurately centred about the reference position in space. A total off-target area measure (integrated absolute error) showed a marked decrease in scores as the amplification of the error signal was increased from 1 through 10. Beyond this gain there was no appreciable additional improvement in motor control; however, no degradation of control characterized the group performance.

Exploratory studies were undertaken to permit comparison of the effects of increasing the gain of a proportional visual display with the effects of increasing the gain of non-proportional visual and auditory displays. An increase in the dominant-energy frequency was noted as the error signal gain was increased, independent of whether a proportional visual, or non-proportional visual or auditory display was used. This observation suggests that common mechanisms mediate the processing of the gain parameters of feedback displays, in some measure independent of the display form or the sensory modality used for presentation.
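The integrated-absolute-error score used in this study (the "total off-target area measure") can be approximated from sampled error values; the rectangle-rule discretization and the names below are our assumptions, shown only to make the measure concrete.

```python
# Hedged sketch: integrated absolute error (total off-target area) for a
# sampled tracking-error signal. The rectangle-rule approximation and
# names are illustrative assumptions.

def integrated_absolute_error(errors, dt):
    """Approximate the integral of |error(t)| dt over the trial, given
    error samples taken every dt seconds."""
    return sum(abs(e) for e in errors) * dt
```

Lower scores indicate finger position held closer to the reference point over the whole trial, which is why the measure falls as display gain rises from 1 through 10.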

16.
Ss were presented two stimuli of equal duration separated in time. The pairs of stimuli were vibrotactile, auditory, or visual. The Ss adjusted the time between the two stimuli to be equal to the duration of the first stimulus. The results show that for stimulus durations ranging from 100 to 1,200 msec, Ss set the time between the two stimuli too long and by a constant amount. For vibrotactile stimuli, the constant was 596 msec; for auditory stimuli, 657 msec; and for visual stimuli, 436 msec. Changing the intensity of the vibrotactile stimuli did not change the size of the constant error. When Ss were presented two tones with a burst of white noise between the tones and adjusted the duration of the white noise to be equal to the duration of the first tone, the white noise was not adjusted too long by a constant amount. The results suggest that there is a constant error in the perception of unfilled relative to filled temporal intervals.
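The constant error reported above is simply the mean signed difference between the reproduced empty interval and the standard duration. A minimal sketch, with illustrative names and millisecond units:

```python
# Hedged sketch: constant error in interval reproduction, i.e. the mean
# signed difference between reproduced intervals and the standard
# duration. Names and units (ms) are illustrative assumptions.

def constant_error(reproduced, standard):
    """Mean signed error (ms) of reproduced intervals relative to the
    standard stimulus duration."""
    return sum(r - standard for r in reproduced) / len(reproduced)
```

With a 100 ms standard, reproductions averaging about 696 ms would yield a constant error near the 596 ms reported for vibrotactile stimuli.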

17.
Manual reaction times to visual, auditory, and tactile stimuli presented simultaneously, or with a delay, were measured to test for multisensory interaction effects in a simple detection task with redundant signals. Responses to trimodal stimulus combinations were faster than those to bimodal combinations, which in turn were faster than reactions to unimodal stimuli. Response enhancement increased with decreasing auditory and tactile stimulus intensity and was a U-shaped function of stimulus onset asynchrony. Distribution inequality tests indicated that the multisensory interaction effects were larger than predicted by separate activation models, including the difference between bimodal and trimodal response facilitation. The results are discussed with respect to previous findings in a focused attention task and are compared with multisensory integration rules observed in bimodal and trimodal superior colliculus neurons in the cat and monkey.
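A common form of the distribution inequality test mentioned above, in the spirit of Miller's race-model inequality, checks whether the redundant-signal RT distribution exceeds the bound implied by separate activation: F_AB(t) ≤ F_A(t) + F_B(t). The sketch below is an illustration of that bound, not the paper's analysis pipeline; function and variable names are assumptions.

```python
# Hedged sketch: race-model (distribution) inequality test for redundant
# signals. A positive return value at some t indicates faster redundant
# responses than separate-activation (race) models allow. Names are
# illustrative assumptions.

def ecdf(sample, t):
    """Empirical CDF of an RT sample evaluated at time t."""
    return sum(rt <= t for rt in sample) / len(sample)

def race_violation(rt_redundant, rt_a, rt_b, t):
    """Observed redundant-signal CDF minus the race-model bound
    min(1, F_A(t) + F_B(t)) at time t."""
    bound = min(1.0, ecdf(rt_a, t) + ecdf(rt_b, t))
    return ecdf(rt_redundant, t) - bound
```

In practice the test is evaluated at several quantiles of the fast end of the RT distribution, where race-model violations appear if they exist.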

18.
This study used a spatial task-switching paradigm and manipulated the salience of visual and auditory stimuli to examine the influence of bottom-up attention on the visual dominance effect. The results showed that the salience of visual and auditory stimuli significantly modulated the visual dominance effect. In Experiment 1, the visual dominance effect was markedly weakened when the auditory stimulus was highly salient. In Experiment 2, when the auditory stimulus was highly salient and the visual stimulus was of low salience, the visual dominance effect was further weakened but still present. The results support the biased competition theory: in crossmodal audiovisual interaction, visual stimuli are more salient and hold a processing advantage during multisensory integration.

19.
Spatially orthogonal stimulus and response sets can produce compatibility effects. To explore whether such effects cross the border of logically independent tasks, we combined a nonspeeded visual task requiring verbal report of a stimulus movement (up vs. down) with an auditory reaction time task that required a unimanual movement to the left or right. Two experiments demonstrated that up stimuli facilitate rightward responses and down stimuli facilitate leftward responses, relative to the opposite combinations, thus producing an orthogonal cross-task compatibility effect. This effect presumably arises from abstract coding with respect to the salient referents of a spatial dimension (i.e., up and right), so that coactivation of structurally similar codes leads to mutual priming even when the codes refer to different tasks. The present evidence for abstract spatial coding extends previously proposed coding principles from single-task settings to dual-task settings.

20.
Two experiments examined the effects of multimodal presentation and stimulus familiarity on auditory and visual processing. In Experiment 1, 10-month-olds were habituated to either an auditory stimulus, a visual stimulus, or an auditory-visual multimodal stimulus. Processing time was assessed during the habituation phase, and discrimination of auditory and visual stimuli was assessed during a subsequent testing phase. In Experiment 2, the familiarity of the auditory or visual stimulus was systematically manipulated by prefamiliarizing infants to either the auditory or visual stimulus prior to the experiment proper. With the exception of the prefamiliarized auditory condition in Experiment 2, infants in the multimodal conditions failed to increase looking when the visual component changed at test. This finding is noteworthy given that infants discriminated the same visual stimuli when presented unimodally, and there was no evidence that multimodal presentation attenuated auditory processing. Possible factors underlying these effects are discussed.
