Similar Articles (20 results)
1.
We report three experiments designed to investigate the nature of any crossmodal links between audition and touch in sustained endogenous covert spatial attention, using the orthogonal spatial cuing paradigm. Participants discriminated the elevation (up vs. down) of auditory and tactile targets presented to either the left or the right of fixation. In Experiment 1, targets were expected on a particular side in just one modality; the results demonstrated that the participants could spatially shift their attention independently in both audition and touch. Experiment 2 demonstrated that when the participants were informed that targets were more likely to be on one side for both modalities, elevation judgments were faster on that side in both audition and touch. The participants were also able to "split" their auditory and tactile attention, albeit at some cost, when targets in the two modalities were expected on opposite sides. Similar results were also reported in Experiment 3 when participants adopted a crossed-hands posture, thus revealing that crossmodal links in audiotactile attention operate on a representation of space that is updated following posture change. These results are discussed in relation to previous findings regarding crossmodal links in audiovisual and visuotactile covert spatial attentional orienting.

2.
Craig JC. Perception, 2006, 35(3): 351-367
Previous studies have demonstrated that visual apparent motion can alter the judgment of auditory apparent motion. We investigated the effect of visual apparent motion on judgments of the direction of tactile apparent motion. When visual motion was presented at the same time as, but in a direction opposite to, tactile motion, accuracy in judging the direction of tactile apparent motion was substantially reduced. This reduction in performance is referred to as 'the congruency effect'. Similar effects were observed when the visual display was placed either near to the tactile display or at some distance from the tactile display (experiment 1). In experiment 2, the relative alignment between the visual and tactile directions of motion was varied. The size of the congruency effect was similar at 0 degrees and 45 degrees alignments but much reduced at a 90 degrees alignment. In experiment 3, subjects made confidence ratings of their judgments of the direction of the tactile motion. The results indicated that the congruency effect was not due to subjects being unsure of the direction of motion and being forced to guess. In experiment 4, static visual stimuli were shown to have no effect on the judgments of direction of the tactile stimuli. The extent to which the congruency effect reflects capture effects and is the result of perceptual versus post-perceptual processes is discussed.

3.
Strybel TZ, Vatakis A. Perception, 2004, 33(9): 1033-1048
Unimodal auditory and visual apparent motion (AM) and bimodal audiovisual AM were investigated to determine the effects of crossmodal integration on motion perception and direction-of-motion discrimination in each modality. To determine the optimal stimulus onset asynchrony (SOA) ranges for motion perception and direction discrimination, we initially measured unimodal visual and auditory AMs using one of four durations (50, 100, 200, or 400 ms) and ten SOAs (40-450 ms). In the bimodal conditions, auditory and visual AM were measured in the presence of temporally synchronous, spatially displaced distractors that were either congruent (moving in the same direction) or conflicting (moving in the opposite direction) with respect to target motion. Participants reported whether continuous motion was perceived and its direction. With unimodal auditory and visual AM, motion perception was affected differently by stimulus duration and SOA in the two modalities, while the opposite was observed for direction of motion. In the bimodal audiovisual AM condition, discriminating the direction of motion was affected only in the case of an auditory target. The perceived direction of auditory but not visual AM was reduced to chance levels when the crossmodal distractor direction was conflicting. Conversely, motion perception was unaffected by the distractor direction and, in some cases, the mere presence of a distractor facilitated movement perception.
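A terminological note on the timing parameters in the abstract above: SOA is measured onset-to-onset, so the empty gap between the two apparent-motion stimuli (the interstimulus interval, ISI) is the SOA minus the first stimulus's duration. A minimal sketch of that relationship (the function name and example values are illustrative, not taken from the paper):

```python
def interstimulus_interval(soa_ms, duration_ms):
    """ISI in ms: SOA (onset-to-onset) minus the first stimulus's duration.
    A negative value means the two stimuli overlap in time."""
    return soa_ms - duration_ms

# Example with the paper's extreme parameter values:
print(interstimulus_interval(450, 50))   # → 400
print(interstimulus_interval(40, 400))   # → -360
```

Note that combining the longest duration (400 ms) with the shortest SOA (40 ms) yields a negative ISI, i.e., the two stimuli would overlap in time.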

4.
The participants in this study discriminated the position of tactile target stimuli presented at the tip or the base of the forefinger of one of the participants’ hands, while ignoring visual distractor stimuli. The visual distractor stimuli were presented from two circles on a display aligned with the tactile targets in Experiment 1 or orthogonal to them in Experiment 2. Tactile discrimination performance was slower and less accurate when the visual distractor stimuli were presented from incongruent locations relative to the tactile target stimuli (e.g., tactile target at the base of the finger with top visual distractor) highlighting a cross-modal congruency effect. We examined whether the presence and orientation of a simple line drawing of a hand, which was superimposed on the visual distractor stimuli, would modulate the cross-modal congruency effects. When the tactile targets and the visual distractors were spatially aligned, the modulatory effects of the hand picture were small (Experiment 1). However, when they were spatially misaligned, the effects were much larger, and the direction of the cross-modal congruency effects changed in accordance with the orientation of the picture of the hand, as if the hand picture corresponded to the participants’ own stimulated hand (Experiment 2). The results suggest that the two-dimensional picture of a hand can modulate processes maintaining our internal body representation. We also observed that the cross-modal congruency effects were influenced by the postures of the stimulated and the responding hands. These results reveal the complex nature of spatial interactions among vision, touch, and proprioception.
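The cross-modal congruency effect discussed in this entry (and in several others below) is conventionally scored as the difference in mean response time between incongruent and congruent trials. A minimal sketch of that computation (the function name and reaction times are made up for illustration, not data from the paper):

```python
from statistics import mean

def congruency_effect(congruent_rts, incongruent_rts):
    """Cross-modal congruency effect in ms:
    mean incongruent RT minus mean congruent RT.
    Larger positive values indicate stronger distractor interference."""
    return mean(incongruent_rts) - mean(congruent_rts)

# Hypothetical reaction times (ms) for one participant:
congruent = [520, 540, 510, 530]
incongruent = [600, 620, 590, 610]
print(congruency_effect(congruent, incongruent))  # → 80.0
```

The same score can be computed on error rates; studies in this literature typically report both, since interference shows up as slower and less accurate responding.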

5.

It has been suggested that judgments about the temporal–spatial order of successive tactile stimuli depend on the perceived direction of apparent motion between them. Here we manipulated tactile apparent-motion percepts by presenting a brief, task-irrelevant auditory stimulus temporally in-between pairs of tactile stimuli. The tactile stimuli were applied one to each hand, with varying stimulus onset asynchronies (SOAs). Participants reported the location of the first stimulus (temporal order judgments: TOJs) while adopting both crossed and uncrossed hand postures, so we could scrutinize skin-based, anatomical, and external reference frames. With crossed hands, the sound improved TOJ performance at short (≤300 ms) and at long (>300 ms) SOAs. When the hands were uncrossed, the sound induced a decrease in TOJ performance, but only at short SOAs. A second experiment confirmed that the auditory stimulus indeed modulated tactile apparent motion perception under these conditions. Perceived apparent motion directions were more ambiguous with crossed than with uncrossed hands, probably indicating competing spatial codes in the crossed posture. However, irrespective of posture, the additional sound tended to impair potentially anatomically coded motion direction discrimination at a short SOA of 80 ms, but it significantly enhanced externally coded apparent motion perception at a long SOA of 500 ms. Anatomically coded motion signals imply incorrect TOJ responses with crossed hands, but correct responses when the hands are uncrossed; externally coded motion signals always point toward the correct TOJ response. Thus, taken together, these results suggest that apparent-motion signals are likely taken into account when tactile temporal–spatial information is reconstructed.
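The reference-frame logic stated in the final sentences above can be summarized compactly: an externally coded motion signal always points toward the correct temporal-order response, whereas an anatomically coded signal does so only when the hands are uncrossed. A small sketch encoding that mapping (function and value names are illustrative, not from the paper):

```python
def toj_prediction(code, posture):
    """Predicted temporal-order-judgment outcome implied by a motion signal
    in a given reference frame ('external' or 'anatomical'),
    for a given hand posture ('crossed' or 'uncrossed')."""
    if code == "external":
        # External coding always matches the true stimulus order.
        return "correct"
    if code == "anatomical":
        # Anatomical coding misleads when the hands are crossed.
        return "correct" if posture == "uncrossed" else "incorrect"
    raise ValueError(f"unknown reference frame: {code}")

for code in ("external", "anatomical"):
    for posture in ("uncrossed", "crossed"):
        print(code, posture, toj_prediction(code, posture))
```

This mapping is why the sound's effect reverses with posture in the abstract: enhancing externally coded motion helps crossed-hands TOJs, while boosting anatomically coded motion hurts them.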


6.
Three experiments investigated cross-modal links between touch, audition, and vision in the control of covert exogenous orienting. In the first two experiments, participants made speeded discrimination responses (continuous vs. pulsed) for tactile targets presented randomly to the index finger of either hand. Targets were preceded at a variable stimulus onset asynchrony (150, 200, or 300 msec) by a spatially uninformative cue that was either auditory (Experiment 1) or visual (Experiment 2) on the same or opposite side as the tactile target. Tactile discriminations were more rapid and accurate when cue and target occurred on the same side, revealing cross-modal covert orienting. In Experiment 3, spatially uninformative tactile cues were presented prior to randomly intermingled auditory and visual targets requiring an elevation discrimination response (up vs. down). Responses were significantly faster for targets in both modalities when presented ipsilateral to the tactile cue. These findings demonstrate that the peripheral presentation of spatially uninformative auditory and visual cues produces cross-modal orienting that affects touch, and that tactile cues can also produce cross-modal covert orienting that affects audition and vision.

7.
This study investigated multisensory interactions in the perception of auditory and visual motion. When auditory and visual apparent motion streams are presented concurrently in opposite directions, participants often fail to discriminate the direction of motion of the auditory stream, whereas perception of the visual stream is unaffected by the direction of auditory motion (Experiment 1). This asymmetry persists even when the perceived quality of apparent motion is equated for the 2 modalities (Experiment 2). Subsequently, it was found that this visual modulation of auditory motion is caused by an illusory reversal in the perceived direction of sounds (Experiment 3). This "dynamic capture" effect occurs over and above ventriloquism among static events (Experiments 4 and 5), and it generalizes to continuous motion displays (Experiment 6). These data are discussed in light of related multisensory phenomena and their support for a "modality appropriateness" interpretation of multisensory integration in motion perception.

8.
We examined the effect of posture change on the representation of visuotactile space in a split-brain patient using a cross-modal congruency task. Split-brain patient J.W. made speeded elevation discrimination responses (up versus down) to a series of tactile targets presented to the index finger or thumb of his right hand. We report congruency effects elicited by irrelevant visual distractors placed either close to, or far from, the stimulated hand. These cross-modal congruency effects followed the right hand as it moved within the right hemispace, but failed to do so when the hand crossed the midline into left hemispace. These results support recent claims that interhemispheric connections are required to maintain an accurate representation of visuotactile space.

9.
This study addressed the role of proprioceptive and visual cues to body posture during the deployment of tactile spatial attention. Participants made speeded elevation judgments (up vs. down) to vibrotactile targets presented to the finger or thumb of either hand, while attempting to ignore vibrotactile distractors presented to the opposite hand. The first two experiments established the validity of this paradigm and showed that congruency effects were stronger when the target hand was uncertain (Experiment 1) than when it was certain (Experiment 2). Varying the orientation of the hands revealed that these congruency effects were determined by the position of the target and distractor in external space, and not by the particular skin sites stimulated (Experiment 3). Congruency effects increased as the hands were brought closer together in the dark (Experiment 4), demonstrating the role of proprioceptive input in modulating tactile selective attention. This spatial modulation was also demonstrated when a mirror was used to alter the visually perceived separation between the hands (Experiment 5). These results suggest that tactile, spatially selective attention can operate according to an abstract spatial frame of reference, which is significantly modulated by multisensory contributions from both proprioception and vision.

10.
We investigated the extent to which people can discriminate between languages on the basis of their characteristic temporal, rhythmic information, and the extent to which this ability generalizes across sensory modalities. We used rhythmical patterns derived from the alternation of vowels and consonants in English and Japanese, presented in audition, vision, both audition and vision at the same time, or touch. Experiment 1 confirmed that discrimination is possible on the basis of auditory rhythmic patterns, and extended it to the case of vision, using ‘aperture-close’ mouth movements of a schematic face. In Experiment 2, language discrimination was demonstrated using visual and auditory materials that did not resemble spoken articulation. In a combined analysis including data from Experiments 1 and 2, a beneficial effect was also found when the auditory rhythmic information was available to participants. Despite the fact that discrimination could be achieved using vision alone, auditory performance was nevertheless better. In a final experiment, we demonstrate that the rhythm of speech can also be discriminated successfully by means of vibrotactile patterns delivered to the fingertip. The results of the present study therefore demonstrate that discrimination between languages' syllabic rhythmic patterns is possible on the basis of visual and tactile displays.

11.
There is currently a great deal of interest regarding the possible existence of a crossmodal attentional blink (AB) between audition and vision. The majority of evidence now suggests that no such crossmodal deficit exists unless a task switch is introduced. We report two experiments designed to investigate the existence of a crossmodal AB between vision and touch. Two masked targets were presented successively at variable interstimulus intervals. Participants had to respond either to both targets (experimental condition) or to just the second target (control condition). In Experiment 1, the order of target modality was blocked, and an AB was demonstrated when visual targets preceded tactile targets, but not when tactile targets preceded visual targets. In Experiment 2, target modality was mixed randomly, and a significant crossmodal AB was demonstrated in both directions between vision and touch. The contrast between our visuotactile results and those of previous audiovisual studies is discussed, as are the implications for current theories of the AB.

12.
Load theory suggests that working memory controls the extent to which irrelevant distractors are processed (e.g., Lavie, Hirst, De Fockert, & Viding, 2004). However, so far this proposal has only been tested in vision. Here, we examine the extent to which tactile selective attention also depends on working memory. In Experiment 1, participants focused their attention on continuous target vibrations while attempting to ignore pulsed distractor vibrations. In Experiment 2, targets were always presented to a particular hand, with distractors being presented to the other hand. In both experiments, a high (vs. low) load in a concurrent working memory task led to greater interference by the tactile distractors. These results establish the role of working memory in the control of tactile selective attention, demonstrating for the first time that the principles of load theory also apply to the tactile modality.

13.
Previous studies of tactile spatial perception focussed either on a single point of stimulation, on local patterns within a single skin region such as the fingertip, on tactile motion, or on active touch. It remains unclear whether we should speak of a tactile field, analogous to the visual field, and supporting spatial relations between stimulus locations. Here we investigate this question by studying perception of large-scale tactile spatial patterns on the hand, arm and back. Experiment 1 investigated the relation between perception of tactile patterns and the identification of subsets of those patterns. The results suggest that perception of tactile spatial patterns is based on representing the spatial relations between locations of individual stimuli. Experiment 2 investigated the spatial and temporal organising principles underlying these relations. Experiment 3 showed that tactile pattern perception makes reference to structural representations of the body, such as body parts separated by joints. Experiment 4 found that precision of pattern perception is poorer for tactile patterns that extend across the midline, compared to unilateral patterns. Overall, the results suggest that the human sense of touch involves a tactile field, analogous to the visual field. The tactile field supports computation of spatial relations between individual stimulus locations, and thus underlies tactile pattern perception.

14.
Three experiments were conducted examining unimodal and crossmodal effects of attention to motion. Horizontally moving sounds and dot patterns were presented and participants’ task was to discriminate their motion speed or whether they were presented with a brief gap. In Experiments 1 and 2, stimuli of one modality and of one direction were presented with a higher probability (p = .7) than other stimuli. Sounds and dot patterns moving in the expected direction were discriminated faster than stimuli moving in the unexpected direction. In Experiment 3, participants had to respond only to stimuli moving in one direction within the primary modality, but to all stimuli regardless of their direction within the rarer secondary modality. Stimuli of the secondary modality moving in the attended direction were discriminated faster than were oppositely moving stimuli. Results suggest that attending to the direction of motion affects perception within vision and audition, but also across modalities.

15.
An important step in developing a theory of calibration is establishing what it is that participants become calibrated to as a result of feedback. Three experiments used a transfer of calibration paradigm to investigate this issue. In particular, these experiments investigated whether recalibration of perception of length transferred from audition to dynamic (i.e., kinesthetic) touch when objects were grasped at one end (Experiment 1), when objects were grasped at one end and when they were grasped at a different location (i.e., the middle) (Experiment 2), and when false (i.e., inflated) feedback was provided about object length (Experiment 3). In all three experiments, there was a transfer of recalibration of perception of length from audition to dynamic touch when feedback was provided on perception by audition. Such results suggest that calibration is not specific to a particular perceptual modality and are also consistent with previous research that perception of object length by audition and dynamic touch are each constrained by the object's mechanical properties.

16.
Spence C, Walton M. Acta Psychologica, 2005, 118(1-2): 47-70
We investigated the extent to which people can selectively ignore distracting vibrotactile information when performing a visual task. In Experiment 1, participants made speeded elevation discrimination responses (up vs. down) to a series of visual targets presented from one of two eccentricities on either side of central fixation, while simultaneously trying to ignore task-irrelevant vibrotactile distractors presented independently to the finger (up) vs. thumb (down) of either hand. Participants responded significantly more slowly, and somewhat less accurately, when the elevation of the vibrotactile distractor was incongruent with that of the visual target than when they were presented from the same (i.e., congruent) elevation. This crossmodal congruency effect was significantly larger when the visual and tactile stimuli appeared on the same side of space than when they appeared on different sides, although the relative eccentricity of the two stimuli within the hemifield (i.e., same vs. different) had little effect on performance. In Experiment 2, participants who crossed their hands over the midline showed a very different pattern of crossmodal congruency effects to participants who adopted an uncrossed hands posture. Our results suggest that both the relative external location and the initial hemispheric projection of the target and distractor stimuli contribute jointly to determining the magnitude of the crossmodal congruency effect when participants have to respond to vision and ignore touch.

17.
Visual dominance and attention: the Colavita effect revisited
Under many conditions, humans display a robust tendency to rely more on visual information than on other forms of sensory information. Colavita (1974) illustrated this visual dominance effect by showing that naive observers typically fail to respond to clearly suprathreshold tones if these are presented simultaneously with a visual target flash. In the present study, we demonstrate that visual dominance influences performance under more complex stimulation conditions and address the role played by attention in mediating this effect. In Experiment 1, we show the Colavita effect in the simple speeded detection of line drawings and naturalistic sounds, whereas in Experiment 2 we demonstrate visual dominance when the task targets (auditory, visual, or bimodal combinations) are embedded among continuous streams of irrelevant distractors. In Experiments 3-5, we address the consequences of varying the probability of occurrence of targets in each sensory modality. In Experiment 6, we further investigate the role played by attention on visual dominance by manipulating perceptual load in either the visual or the auditory modality. Our results demonstrate that selective attention to a particular sensory modality can modulate--although not completely reverse--visual dominance as illustrated by the Colavita effect.

18.
The role of perceptual load in processing distractor faces
It has been established that successful ignoring of irrelevant distractors depends on the extent to which the current task loads attention. However, the previous load studies have typically employed neutral distractor stimuli (e.g., letters). In the experiments reported here, we examined whether the perception of irrelevant distractor faces would show the same effects. We manipulated attentional load in a relevant task of name search by varying the search set size and found that whereas congruency effects from meaningful nonface distractors were eliminated by higher search load, interference from distractor faces was entirely unaffected by search load. These results support the idea that face processing may be mandatory and generalize the load theory to the processing of meaningful and more complex nonface distractors.

19.
Previous research has demonstrated that the localization of auditory or tactile stimuli can be biased by the simultaneous presentation of a visual stimulus from a different spatial position. We investigated whether auditory localization judgments could also be affected by the presentation of spatially displaced tactile stimuli, using a procedure designed to reveal perceptual interactions across modalities. Participants made left-right discrimination responses regarding the perceived location of sounds, which were presented either in isolation or together with tactile stimulation to the fingertips. The results demonstrate that the apparent location of a sound can be biased toward tactile stimulation when it is synchronous, but not when it is asynchronous, with the auditory event. Directing attention to the tactile modality did not increase the bias of sound localization toward synchronous tactile stimulation. These results provide the first demonstration of the tactile capture of audition.



Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)