Similar Literature
20 similar documents retrieved (search time: 15 ms).
1.
We report three experiments designed to investigate the nature of any crossmodal links between audition and touch in sustained endogenous covert spatial attention, using the orthogonal spatial cuing paradigm. Participants discriminated the elevation (up vs. down) of auditory and tactile targets presented to either the left or the right of fixation. In Experiment 1, targets were expected on a particular side in just one modality; the results demonstrated that the participants could spatially shift their attention independently in both audition and touch. Experiment 2 demonstrated that when the participants were informed that targets were more likely to be on one side for both modalities, elevation judgments were faster on that side in both audition and touch. The participants were also able to "split" their auditory and tactile attention, albeit at some cost, when targets in the two modalities were expected on opposite sides. Similar results were also reported in Experiment 3 when participants adopted a crossed-hands posture, thus revealing that crossmodal links in audiotactile attention operate on a representation of space that is updated following posture change. These results are discussed in relation to previous findings regarding crossmodal links in audiovisual and visuotactile covert spatial attentional orienting.

2.
Three experiments investigated cross-modal links between touch, audition, and vision in the control of covert exogenous orienting. In the first two experiments, participants made speeded discrimination responses (continuous vs. pulsed) for tactile targets presented randomly to the index finger of either hand. Targets were preceded at a variable stimulus onset asynchrony (150, 200, or 300 msec) by a spatially uninformative cue that was either auditory (Experiment 1) or visual (Experiment 2) on the same or opposite side as the tactile target. Tactile discriminations were more rapid and accurate when cue and target occurred on the same side, revealing cross-modal covert orienting. In Experiment 3, spatially uninformative tactile cues were presented prior to randomly intermingled auditory and visual targets requiring an elevation discrimination response (up vs. down). Responses were significantly faster for targets in both modalities when presented ipsilateral to the tactile cue. These findings demonstrate that the peripheral presentation of spatially uninformative auditory and visual cues produces cross-modal orienting that affects touch, and that tactile cues can also produce cross-modal covert orienting that affects audition and vision.
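In paradigms like this one, the exogenous cueing effect is typically quantified, separately for each cue-target SOA, as the mean RT on invalid trials (cue and target on opposite sides) minus the mean RT on valid trials (same side). The following minimal Python sketch illustrates that computation on a hypothetical trial table; the column names and example values are invented purely for illustration.

import pandas as pd

# Hypothetical trial-level data: one row per correct trial (invented values).
trials = pd.DataFrame({
    "soa":         [150, 150, 150, 200, 200, 200, 300, 300, 300],   # cue-target SOA (ms)
    "cue_side":    ["L", "R", "L", "R", "L", "R", "L", "R", "R"],
    "target_side": ["L", "R", "R", "R", "L", "L", "L", "R", "L"],
    "rt":          [412, 405, 455, 398, 441, 450, 420, 415, 436],   # reaction time (ms)
})

# A trial is "valid" when cue and target occur on the same side.
trials["valid"] = trials["cue_side"] == trials["target_side"]

# Mean RT per SOA and validity; cueing effect = invalid minus valid RT.
mean_rt = trials.groupby(["soa", "valid"])["rt"].mean().unstack()
cueing_effect = mean_rt[False] - mean_rt[True]
print(cueing_effect)   # positive values indicate same-side facilitation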

3.
We investigated the effect of unseen hand posture on cross-modal, visuo-tactile links in covert spatial attention. In Experiment 1, a spatially nonpredictive visual cue was presented to the left or right hemifield shortly before a tactile target on either hand. To examine the spatial coordinates of any cross-modal cuing, the unseen hands were either uncrossed or crossed so that the left hand lay to the right and vice versa. Tactile up/down (i.e., index finger/thumb) judgments were better on the same side of external space as the visual cue, for both crossed and uncrossed postures. Thus, which hand was advantaged by a visual cue in a particular hemifield reversed across the different unseen postures. In Experiment 2, nonpredictive tactile cues now preceded visual targets. Up/down judgments for the latter were better on the same side of external space as the tactile cue, again for both postures. These results demonstrate cross-modal links between vision and touch in exogenous covert spatial attention that remap across changes in unseen hand posture, suggesting a modulatory role for proprioception.

4.
We investigated the effects of seen and unseen within-hemifield posture changes on crossmodal visual–tactile links in covert spatial attention. In all experiments, a spatially nonpredictive tactile cue was presented to the left or the right hand, with the two hands placed symmetrically across the midline. Shortly after a tactile cue, a visual target appeared at one of two eccentricities within either of the hemifields. For half of the trial blocks, the hands were aligned with the inner visual target locations, and for the remainder, the hands were aligned with the outer target locations. In Experiments 1 and 2, the inner and outer eccentricities were 17.5° and 52.5°, respectively. In Experiment 1, the arms were completely covered, and visual up–down judgments were better when on the same side as the preceding tactile cue. Cueing effects were not significantly affected by hand or target alignment. In Experiment 2, the arms were in view, and now some target responses were affected by cue alignment: Cueing for outer targets was only significant when the hands were aligned with them. In Experiment 3, we tested whether any unseen posture changes could alter the cueing effects, by widely separating the inner and outer target eccentricities (now 10° and 86°). In this case, hand alignment did affect some of the cueing effects: Cueing for outer targets was now only significant when the hands were in the outer position. Although these results confirm that proprioception can, in some cases, influence tactile–visual links in exogenous spatial attention, they also show that spatial precision is severely limited, especially when posture is unseen.

5.
Selective attention to the chemosensory modality
Previous studies have shown that behavioral responses to auditory, visual, and tactile stimuli are modulated by expectancies regarding the likely modality of an upcoming stimulus (see Spence & Driver, 1997). In the present study, we investigated whether people can also selectively attend to the chemosensory modality (involving responses to olfactory, chemical, and painful stimuli). Participants made speeded spatial discrimination responses (left vs. right) to an unpredictable sequence of odor and tactile targets. Odor stimuli were presented to either the left or the right nostril, embedded in a birhinally applied constant airstream. Tactile stimuli were presented to the left or the right hand. On each trial, a symbolic visual cue predicted the likely modality for the upcoming target (the cue was a valid predictor of the target modality on the majority of trials). Response latencies were faster when targets were presented in the expected modality than when they were presented in the unexpected modality, showing for the first time that behavioral responses to chemosensory stimuli can be modulated by selective attention.

6.
Subjects judged the elevation (up vs. down, regardless of laterality) of peripheral auditory or visual targets, following uninformative cues on either side with an intermediate elevation. Judgments were better for targets in either modality when preceded by an uninformative auditory cue on the side of the target. Experiment 2 ruled out nonattentional accounts for these spatial cuing effects. Experiment 3 found that visual cues affected elevation judgments for visual but not auditory targets. Experiment 4 confirmed that the effect on visual targets was attentional. In Experiment 5, visual cues produced spatial cuing when targets were always auditory, but saccades toward the cue may have been responsible. No such visual-to-auditory cuing effects were found in Experiment 6 when saccades were prevented, though they were present when eye movements were not monitored. These results suggest a one-way cross-modal dependence in exogenous covert orienting whereby audition influences vision, but not vice versa. Possible reasons for this asymmetry are discussed in terms of the representation of space within the brain.

7.
Perceptual judgments can be affected by expectancies regarding the likely target modality. This has been taken as evidence for selective attention to particular modalities, but alternative accounts remain possible in terms of response priming, criterion shifts, stimulus repetition, and spatial confounds. We examined whether attention to a sensory modality would still be apparent when these alternatives were ruled out. Subjects made a speeded detection response (Experiment 1), an intensity or color discrimination (Experiment 2), or a spatial discrimination response (Experiments 3 and 4) for auditory and visual targets presented in a random sequence. On each trial, a symbolic visual cue predicted the likely target modality. Responses were always more rapid and accurate for targets presented in the expected versus unexpected modality, implying that people can indeed selectively attend to the auditory or visual modalities. When subjects were cued to both the probable modality of a target and its likely spatial location (Experiment 4), separable modality-cuing and spatial-cuing effects were observed. These studies introduce appropriate methods for distinguishing attention to a modality from the confounding factors that have plagued previous normal and clinical research.

8.
Peripheral cues are thought to facilitate responses to stimuli presented at the same location because they lead to exogenous attention shifts. Facilitation has been observed in numerous studies of visual and auditory attention, but there have been only four demonstrations of tactile facilitation, all in studies with potential confounds. Three studies used a spatial (finger versus thumb) discrimination task, where the cue could have provided a spatial framework that might have assisted the discrimination of subsequent targets presented on the same side as the cue. The final study circumvented this problem by using a non-spatial discrimination; however, the cues were informative and were interspersed with visual cues, which may have affected the attentional effects observed. In the current study, therefore, we used a non-spatial tactile frequency discrimination task following a non-informative tactile white noise cue. When the target was presented 150 ms after the cue, we observed faster discrimination responses to targets presented on the same side as the cue than to targets on the opposite side; by 1000 ms, responses were significantly faster to targets presented on the opposite side to the cue. Thus, we demonstrated that tactile attentional facilitation can be observed in a non-spatial discrimination task, under unimodal conditions and with entirely non-predictive cues. Furthermore, we provide the first demonstration of significant tactile facilitation and tactile inhibition of return within a single experiment.

9.
Load theory suggests that working memory controls the extent to which irrelevant distractors are processed (e.g., Lavie, Hirst, De Fockert, & Viding, 2004). However, so far this proposal has only been tested in vision. Here, we examine the extent to which tactile selective attention also depends on working memory. In Experiment 1, participants focused their attention on continuous target vibrations while attempting to ignore pulsed distractor vibrations. In Experiment 2, targets were always presented to a particular hand, with distractors being presented to the other hand. In both experiments, a high (vs. low) load in a concurrent working memory task led to greater interference by the tactile distractors. These results establish the role of working memory in the control of tactile selective attention, demonstrating for the first time that the principles of load theory also apply to the tactile modality.

10.
Using a spatial task-switching paradigm and manipulating the salience of visual and auditory stimuli, this study examined the influence of bottom-up attention on the visual dominance effect. The results showed that the salience of the visual and auditory stimuli significantly modulated visual dominance: in Experiment 1, the visual dominance effect was markedly weakened when the auditory stimulus was highly salient; in Experiment 2, when the auditory stimulus was highly salient and the visual stimulus had low salience, the visual dominance effect was further reduced but still present. These results support biased competition theory: in crossmodal audiovisual interaction, visual stimuli are the more salient and thus hold a processing advantage during multisensory integration.

11.
In Experiment 1, participants were presented with pairs of stimuli (one visual and the other tactile) from the left and/or right of fixation at varying stimulus onset asynchronies and were required to make unspeeded temporal order judgments (TOJs) regarding which modality was presented first. When the participants adopted an uncrossed-hands posture, just noticeable differences (JNDs) were lower (i.e., multisensory TOJs were more precise) when stimuli were presented from different positions, rather than from the same position. This spatial redundancy benefit was reduced when the participants adopted a crossed-hands posture, suggesting a failure to remap visuotactile space appropriately. In Experiment 2, JNDs were also lower when pairs of auditory and visual stimuli were presented from different positions, rather than from the same position. Taken together, these results demonstrate that people can use redundant spatial cues to facilitate their performance on multisensory TOJ tasks and suggest that previous studies may have systematically overestimated the precision with which people can make such judgments. These results highlight the intimate link between spatial and temporal factors in determining our perception of the multimodal objects and events in the world around us.
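The JND in a TOJ task of this kind is conventionally derived from a psychometric function fitted to the proportion of, say, "vision first" responses across the tested SOAs; with a cumulative Gaussian fit, the JND is often taken as half the 25%-75% interquartile range of the fitted curve. A minimal illustration of that fit, using invented response proportions and SciPy, might look like this:

import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Hypothetical TOJ data: negative SOA = touch led, positive = vision led (ms).
soa = np.array([-200, -90, -55, -30, 0, 30, 55, 90, 200])
p_vision_first = np.array([0.05, 0.15, 0.30, 0.42, 0.52, 0.63, 0.74, 0.88, 0.97])

# Cumulative Gaussian psychometric function: mu = PSS, sigma = spread.
def cum_gauss(x, mu, sigma):
    return norm.cdf(x, loc=mu, scale=sigma)

(mu, sigma), _ = curve_fit(cum_gauss, soa, p_vision_first, p0=[0.0, 50.0])

# JND taken here as half the 25%-75% interquartile range of the fitted curve,
# i.e. sigma * z(0.75) for a Gaussian.
jnd = sigma * norm.ppf(0.75)
print(f"PSS = {mu:.1f} ms, JND = {jnd:.1f} ms")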

12.
This paper seeks to bring together two previously separate research traditions: research on spatial orienting within the visual cueing paradigm and research into social cognition, addressing our tendency to attend in the direction that another person looks. Cueing methodologies from mainstream attention research were adapted to test the automaticity of orienting in the direction of seen gaze. Three studies manipulated the direction of gaze in a computerized face, which appeared centrally in a frontal view during a peripheral letter-discrimination task. Experiments 1 and 2 found faster discrimination of peripheral target letters on the side the computerized face gazed towards, even though the seen gaze did not predict target side, and despite participants being asked to ignore the face. This suggests reflexive covert and/or overt orienting in the direction of seen gaze, arising even when the observer has no motivation to orient in this way. Experiment 3 found faster letter discrimination on the side the computerized face gazed towards even when participants knew that target letters were four times as likely on the opposite side. This suggests that orienting can arise in the direction of seen gaze even when counter to intentions. The experiments illustrate that methods from mainstream attention research can be usefully applied to social cognition, and that studies of spatial attention may profit from considering its social function.

13.
The cost of expecting events in the wrong sensory modality
We examined the effects of modality expectancy on human performance. Participants judged azimuth (left vs. right location) for an unpredictable sequence of auditory, visual, and tactile targets. In some blocks, equal numbers of targets were presented in each modality. In others, the majority (75%) of the targets were presented in just one expected modality. Reaction times (RTs) for targets in an unexpected modality were slower than when that modality was expected or when no expectancy applied. RT costs associated with shifting attention from the tactile modality were greater than those for shifts from either audition or vision. Any RT benefits for the most likely modality were due to priming from an event in the same modality on the previous trial, not to the expectancy per se. These results show that stimulus-driven and expectancy-driven effects must be distinguished in studies of attending to different sensory modalities.
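The analytic step that separates repetition priming from expectancy here is conditioning the RT analysis on the modality of the preceding target, so that expected-modality benefits can be assessed on trials where the modality did not repeat. A hedged sketch of that split, on a hypothetical trial table with invented values, might look like this:

import pandas as pd

# Hypothetical trial sequence from one 75%-expectancy block (invented values).
trials = pd.DataFrame({
    "target_modality":   ["tactile", "tactile", "visual", "tactile", "auditory", "tactile"],
    "expected_modality": ["tactile"] * 6,
    "rt":                [388, 365, 441, 372, 455, 360],
})

# Expected trials: target appeared in the blockwise likely modality.
trials["expected"] = trials["target_modality"] == trials["expected_modality"]
# Repetition trials: same target modality as on the immediately preceding trial.
trials["repeat"] = trials["target_modality"] == trials["target_modality"].shift()

# Crossing expectancy with modality repetition separates priming from expectancy:
# a genuine expectancy benefit should survive on non-repetition trials.
summary = trials.iloc[1:].groupby(["expected", "repeat"])["rt"].mean()
print(summary)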

14.
Six experiments examined whether a single system or separate systems underlie visual and auditory orienting of spatial attention. When auditory targets were used, reaction times were slower on trials in which cued and target locations were at opposite sides of the vertical head-centred meridian than on trials in which cued and target locations were at opposite sides of the vertical visual meridian or were not separated by any meridian. The head-centred meridian effect for auditory stimuli was apparent when targets were cued by either visual (Experiments 2, 3, and 6) or auditory cues (Experiment 5). Also, the head-centred meridian effect was found when targets were delivered either through headphones (Experiments 2, 3, and 5) or external loudspeakers (Experiment 6). Conversely, participants showed a visual meridian effect when they were required to respond to visual targets (Experiment 4). These results strongly suggest that auditory and visual spatial attention systems are indeed separate, as far as endogenous orienting is concerned.

15.
The participants in this study discriminated the position of tactile target stimuli presented at the tip or the base of the forefinger of one of the participants’ hands, while ignoring visual distractor stimuli. The visual distractor stimuli were presented from two circles on a display aligned with the tactile targets in Experiment 1 or orthogonal to them in Experiment 2. Tactile discrimination performance was slower and less accurate when the visual distractor stimuli were presented from incongruent locations relative to the tactile target stimuli (e.g., tactile target at the base of the finger with top visual distractor), highlighting a cross-modal congruency effect. We examined whether the presence and orientation of a simple line drawing of a hand, which was superimposed on the visual distractor stimuli, would modulate the cross-modal congruency effects. When the tactile targets and the visual distractors were spatially aligned, the modulatory effects of the hand picture were small (Experiment 1). However, when they were spatially misaligned, the effects were much larger, and the direction of the cross-modal congruency effects changed in accordance with the orientation of the picture of the hand, as if the hand picture corresponded to the participants’ own stimulated hand (Experiment 2). The results suggest that the two-dimensional picture of a hand can modulate processes maintaining our internal body representation. We also observed that the cross-modal congruency effects were influenced by the postures of the stimulated and the responding hands. These results reveal the complex nature of spatial interactions among vision, touch, and proprioception.

16.
This study addressed the role of proprioceptive and visual cues to body posture during the deployment of tactile spatial attention. Participants made speeded elevation judgments (up vs. down) to vibrotactile targets presented to the finger or thumb of either hand, while attempting to ignore vibrotactile distractors presented to the opposite hand. The first two experiments established the validity of this paradigm and showed that congruency effects were stronger when the target hand was uncertain (Experiment 1) than when it was certain (Experiment 2). Varying the orientation of the hands revealed that these congruency effects were determined by the position of the target and distractor in external space, and not by the particular skin sites stimulated (Experiment 3). Congruency effects increased as the hands were brought closer together in the dark (Experiment 4), demonstrating the role of proprioceptive input in modulating tactile selective attention. This spatial modulation was also demonstrated when a mirror was used to alter the visually perceived separation between the hands (Experiment 5). These results suggest that tactile, spatially selective attention can operate according to an abstract spatial frame of reference, which is significantly modulated by multisensory contributions from both proprioception and vision.

17.
Behavioral studies of multisensory integration in motion perception have focused on the particular case of visual and auditory signals. Here, we addressed a new case: audition and touch. In Experiment 1, we tested the effects of an apparent motion stream presented in an irrelevant modality (audition or touch) on the perception of apparent motion streams in the other modality (touch or audition, respectively). We found significant congruency effects (lower performance when the direction of motion in the irrelevant modality was incongruent with the direction of the target) for the two possible modality combinations. This congruency effect was asymmetrical, with tactile motion distractors having a stronger influence on auditory motion perception than vice versa. In Experiment 2, we used auditory motion targets and tactile motion distractors while participants adopted one of two possible postures: arms uncrossed or arms crossed. The effects of tactile motion on auditory motion judgments were replicated in the arms-uncrossed posture, but they dissipated in the arms-crossed posture. The implications of these results are discussed in light of current findings regarding the representation of tactile and auditory space.

18.
The pupillary light reflex (PLR) was used to track covert shifts of attention to items maintained in visual working memory (VWM). In three experiments, participants performed a change detection task in which rectangles appeared on either side of fixation and, at test, participants indicated whether the cued rectangle had changed its orientation. Prior to presentation or during the delay, participants were cued to the light or dark side of the screen. When cued to the light side, the pupil constricted, and when cued to the dark side, the pupil dilated, suggesting that the PLR tracked covert shifts of attention. Similar covert shifts of attention were seen when the target stimuli remained onscreen and during a blank delay period, suggesting similar effects for attention to perceptual stimuli and attention to stimuli maintained in VWM. Furthermore, similar effects were demonstrated when participants were pre-cued or retro-cued to the prioritized location, suggesting that shifts of covert attention can occur both before and after target presentation. These results are consistent with prior research, suggesting an important role of covert shifts of attention during VWM maintenance and that the PLR can be used to track these covert shifts of attention.
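The PLR-based logic is that baseline-corrected pupil size after the cue should diverge depending on whether the covertly attended side is bright or dark, even though overall retinal stimulation is unchanged. A minimal sketch of such an analysis, using simulated pupil traces purely for illustration (sampling rate, window lengths, and values are assumptions), might look like this:

import numpy as np

# Simulated pupil traces: trials x samples, sampled at 60 Hz around cue onset.
rng = np.random.default_rng(0)
n_trials, n_samples, cue_onset = 40, 180, 60          # 1 s baseline + 2 s post-cue
traces = rng.normal(3.5, 0.05, size=(n_trials, n_samples))   # pupil diameter (mm)
cued_dark = rng.integers(0, 2, size=n_trials).astype(bool)   # cue to dark vs. light side

# Subtractive baseline correction using the 200 ms before cue onset.
baseline = traces[:, cue_onset - 12:cue_onset].mean(axis=1, keepdims=True)
corrected = traces - baseline

# Average post-cue pupil change per condition: the PLR account predicts relative
# dilation when the dark side is attended and constriction for the light side.
dark_mean = corrected[cued_dark, cue_onset:].mean()
light_mean = corrected[~cued_dark, cue_onset:].mean()
print(f"attend dark: {dark_mean:+.3f} mm, attend light: {light_mean:+.3f} mm")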

19.
郑晓丹 (Zheng Xiaodan) & 岳珍珠 (Yue Zhenzhu), 《心理科学》 (Psychological Science), 2022, 45(6): 1329-1336
Using real-world objects, we examined the influence of crossmodal semantic relatedness on visual attention and the time course of crossmodal facilitation. Combining a priming paradigm with a dot-probe paradigm, Experiment 1 found that 600 ms after an auditory prime, participants responded faster to highly related visual stimuli than to weakly related ones, whereas no priming effect was found with visual primes. Experiment 2 found that the crossmodal priming effect disappeared when the prime had been presented for 900 ms. Our study demonstrates that experience-based audiovisual semantic relatedness can facilitate visual selective attention.

20.
Previous research has shown that both expectation and attention facilitate perceptual performance, but how the two jointly shape perception remains controversial; in particular, the role of the object of expectation is unclear. Using a paradigm combining spatial cueing with visual search, four experiments examined how spatial expectation influences spatial attention effects when participants form expectations about the target versus about the distractor. The results showed that (1) when the target was the object of expectation, expectation modulated the attention effect; (2) when the distractor was the object of expectation, expectation and attention operated independently; and (3) when the target was the object of expectation, changes in task difficulty produced by increasing the number of stimuli did not affect the relationship between expectation and attention. These findings indicate that whether spatial expectation modulates spatial attention effects depends on the object of expectation: when the object of expectation is the target, expectation and attention interactively influence perception; when it is the distractor, they influence perception independently. Moreover, the relationship between expectation and attention is not affected by task difficulty.
