Similar Documents
20 similar documents retrieved (search time: 46 ms)
1.
In Experiment 1, participants were presented with pairs of stimuli (one visual and the other tactile) from the left and/or right of fixation at varying stimulus onset asynchronies and were required to make unspeeded temporal order judgments (TOJs) regarding which modality was presented first. When the participants adopted an uncrossed-hands posture, just noticeable differences (JNDs) were lower (i.e., multisensory TOJs were more precise) when stimuli were presented from different positions, rather than from the same position. This spatial redundancy benefit was reduced when the participants adopted a crossed-hands posture, suggesting a failure to remap visuotactile space appropriately. In Experiment 2, JNDs were also lower when pairs of auditory and visual stimuli were presented from different positions, rather than from the same position. Taken together, these results demonstrate that people can use redundant spatial cues to facilitate their performance on multisensory TOJ tasks and suggest that previous studies may have systematically overestimated the precision with which people can make such judgments. These results highlight the intimate link between spatial and temporal factors in determining our perception of the multimodal objects and events in the world around us.
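The JND in studies like this one is typically estimated by fitting a psychometric function to the TOJ data. The following is a minimal Python sketch of that standard analysis, assuming a cumulative-Gaussian model and entirely hypothetical data; it illustrates the measure and is not the authors' actual analysis code.

```python
# Minimal sketch of a standard TOJ analysis: fit a cumulative Gaussian
# to the proportion of "visual first" responses as a function of SOA,
# then derive the JND. SOA sign convention assumed: positive = visual first.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psychometric(soa, pss, sigma):
    """Probability of responding 'visual first' at a given SOA (ms)."""
    return norm.cdf(soa, loc=pss, scale=sigma)

# Hypothetical data (not from the paper): SOAs in ms and response proportions.
soas = np.array([-200.0, -90.0, -55.0, -30.0, 30.0, 55.0, 90.0, 200.0])
p_visual_first = np.array([0.03, 0.14, 0.28, 0.42, 0.61, 0.75, 0.88, 0.97])

(pss, sigma), _ = curve_fit(psychometric, soas, p_visual_first, p0=[0.0, 50.0])

# JND defined here as half the distance between the 25% and 75% points,
# which equals sigma * z(0.75), roughly 0.6745 * sigma, for this model.
jnd = sigma * norm.ppf(0.75)
print(f"PSS = {pss:.1f} ms, JND = {jnd:.1f} ms")
```

Lower JNDs correspond to more precise temporal discrimination, which is the comparison drawn between the same-position and different-position conditions above.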

2.
We investigated the effect of unseen hand posture on cross-modal, visuo-tactile links in covert spatial attention. In Experiment 1, a spatially nonpredictive visual cue was presented to the left or right hemifield shortly before a tactile target on either hand. To examine the spatial coordinates of any cross-modal cuing, the unseen hands were either uncrossed or crossed so that the left hand lay to the right and vice versa. Tactile up/down (i.e., index finger/thumb) judgments were better on the same side of external space as the visual cue, for both crossed and uncrossed postures. Thus, which hand was advantaged by a visual cue in a particular hemifield reversed across the different unseen postures. In Experiment 2, nonpredictive tactile cues now preceded visual targets. Up/down judgments for the latter were better on the same side of external space as the tactile cue, again for both postures. These results demonstrate cross-modal links between vision and touch in exogenous covert spatial attention that remap across changes in unseen hand posture, suggesting a modulatory role for proprioception.

3.
An ability to detect the common location of multisensory stimulation is essential for us to perceive a coherent environment, to represent the interface between the body and the external world, and to act on sensory information. Regarding the tactile environment “at hand”, we need to represent somatosensory stimuli impinging on the skin surface in the same spatial reference frame as distal stimuli, such as those transduced by vision and audition. Across two experiments we investigated whether 6- (n = 14; Experiment 1) and 4-month-old (n = 14; Experiment 2) infants were sensitive to the colocation of tactile and auditory signals delivered to the hands. We recorded infants’ visual preferences for spatially congruent and incongruent auditory-tactile events delivered to their hands. At 6 months, infants looked longer toward incongruent stimuli, whilst at 4 months infants looked longer toward congruent stimuli. Thus, even from 4 months of age, infants are sensitive to the colocation of simultaneously presented auditory and tactile stimuli. We conclude that 4- and 6-month-old infants can represent auditory and tactile stimuli in a common spatial frame of reference. We explain the age-wise shift in infants’ preferences from congruent to incongruent in terms of an increased preference for novel crossmodal spatial relations based on the accumulation of experience. A comparison of looking preferences across the congruent and incongruent conditions with a unisensory control condition indicates that the ability to perceive auditory-tactile colocation is based on a crossmodal rather than a supramodal spatial code by 6 months of age at least.

4.
Spence C, Walton M. Acta Psychologica, 2005, 118(1-2): 47-70.
We investigated the extent to which people can selectively ignore distracting vibrotactile information when performing a visual task. In Experiment 1, participants made speeded elevation discrimination responses (up vs. down) to a series of visual targets presented from one of two eccentricities on either side of central fixation, while simultaneously trying to ignore task-irrelevant vibrotactile distractors presented independently to the finger (up) vs. thumb (down) of either hand. Participants responded significantly more slowly, and somewhat less accurately, when the elevation of the vibrotactile distractor was incongruent with that of the visual target than when they were presented from the same (i.e., congruent) elevation. This crossmodal congruency effect was significantly larger when the visual and tactile stimuli appeared on the same side of space than when they appeared on different sides, although the relative eccentricity of the two stimuli within the hemifield (i.e., same vs. different) had little effect on performance. In Experiment 2, participants who crossed their hands over the midline showed a very different pattern of crossmodal congruency effects to participants who adopted an uncrossed hands posture. Our results suggest that both the relative external location and the initial hemispheric projection of the target and distractor stimuli contribute jointly to determining the magnitude of the crossmodal congruency effect when participants have to respond to vision and ignore touch.

5.
We investigated the effects of seen and unseen within-hemifield posture changes on crossmodal visual–tactile links in covert spatial attention. In all experiments, a spatially nonpredictive tactile cue was presented to the left or the right hand, with the two hands placed symmetrically across the midline. Shortly after a tactile cue, a visual target appeared at one of two eccentricities within either of the hemifields. For half of the trial blocks, the hands were aligned with the inner visual target locations, and for the remainder, the hands were aligned with the outer target locations. In Experiments 1 and 2, the inner and outer eccentricities were 17.5° and 52.5°, respectively. In Experiment 1, the arms were completely covered, and visual up–down judgments were better for targets on the same side as the preceding tactile cue. Cueing effects were not significantly affected by hand or target alignment. In Experiment 2, the arms were in view, and now some target responses were affected by cue alignment: Cueing for outer targets was only significant when the hands were aligned with them. In Experiment 3, we tested whether any unseen posture changes could alter the cueing effects, by widely separating the inner and outer target eccentricities (now 10° and 86°). In this case, hand alignment did affect some of the cueing effects: Cueing for outer targets was now only significant when the hands were in the outer position. Although these results confirm that proprioception can, in some cases, influence tactile–visual links in exogenous spatial attention, they also show that spatial precision is severely limited, especially when posture is unseen.

6.
Adults show a deficit in their ability to localize tactile stimuli to their hands when their arms are in the less familiar, crossed posture. It is thought that this ‘crossed-hands deficit’ arises due to a conflict between the anatomical and external spatial frames of reference within which touches can be encoded. The ability to localize a single tactile stimulus applied to one of the two hands across uncrossed-hands and crossed-hands postures was investigated in typically developing children (aged 4 to 6 years). The effect of posture was also compared across conditions in which children did, or did not, have visual information about current hand posture. All children, including the 4-year-olds, demonstrated the crossed-hands deficit when they did not have sight of hand posture, suggesting that touch is located in an external reference frame by this age. In this youngest age group, when visual information about current hand posture was available, tactile localization performance was impaired specifically when the children's hands were uncrossed. We propose that this may be due to an early difficulty with integrating visual representations of the hand within the body schema.

7.
The participants in this study discriminated the position of tactile target stimuli presented at the tip or the base of the forefinger of one of the participants’ hands, while ignoring visual distractor stimuli. The visual distractor stimuli were presented from two circles on a display aligned with the tactile targets in Experiment 1 or orthogonal to them in Experiment 2. Tactile discrimination performance was slower and less accurate when the visual distractor stimuli were presented from incongruent locations relative to the tactile target stimuli (e.g., tactile target at the base of the finger with top visual distractor) highlighting a cross-modal congruency effect. We examined whether the presence and orientation of a simple line drawing of a hand, which was superimposed on the visual distractor stimuli, would modulate the cross-modal congruency effects. When the tactile targets and the visual distractors were spatially aligned, the modulatory effects of the hand picture were small (Experiment 1). However, when they were spatially misaligned, the effects were much larger, and the direction of the cross-modal congruency effects changed in accordance with the orientation of the picture of the hand, as if the hand picture corresponded to the participants’ own stimulated hand (Experiment 2). The results suggest that the two-dimensional picture of a hand can modulate processes maintaining our internal body representation. We also observed that the cross-modal congruency effects were influenced by the postures of the stimulated and the responding hands. These results reveal the complex nature of spatial interactions among vision, touch, and proprioception.

8.
Pointing, like eye gaze, is a deictic gesture that can be used to orient the attention of another person towards an object or an event. Previous research suggests that infants first begin to follow a pointing gesture between 10 and 13 months of age. We investigated whether sensitivity to pointing could be seen at younger ages employing a technique recently used to show early sensitivity to perceived eye gaze. Three experiments were conducted with 4.5- and 6.5-month-old infants. Our first goal was to examine whether these infants could show a systematic response to pointing by shifting their visual attention in the direction of a pointing gesture when we eliminated the difficulty of disengaging fixation from a pointing hand. The results from Experiments 1 and 2 suggest that a dynamic, but not a static, pointing gesture triggers shifts of visual attention in infants as young as 4.5 months of age. Our second goal was to clarify whether this response was based on sensitivity to the directional posture of the pointing hand, the motion of the pointing hand, or both. The results from Experiment 3 suggest that the direction of motion is necessary but not sufficient to orient infants' attention toward a distal target. Infants shifted their attention in the direction of the pointing finger, but only when the hand was moving in the same direction. These results suggest that infants are prepared to orient to the distal referent of a pointing gesture which likely contributes to their learning the communicative function of pointing.

9.
Shore DI, Gray K, Spry E, Spence C. Perception, 2005, 34(10): 1251-1262.
We report a series of three experiments designed to examine the effect of posture on tactile temporal processing. Observers reported which of two tactile stimuli, presented to the left and right index fingers (experiments 1-3; or thumb, experiment 3), was perceived first while adopting one of two postures--hands-close (adjacent, but not touching) or hands-far (1 m apart)--in the dark. Just-noticeable differences were significantly smaller in the hands-far posture across all three experiments. In the first two experiments we compared hand versus foot responses and found equivalent advantages for the hands-far posture. In the final experiment the stimuli were presented to either the same or different digit on each hand (index finger or thumb) and we found that only when the same digit on each hand was stimulated was there an advantage for the hands-far posture. The finding that temporal precision was better with greater distance contradicts predictions based on attention-switching models of temporal-order judgments, and also contrasts with results from similar experimental manipulations in other modalities (e.g., vision). These results provide support for a rapid and automatic process that transforms the representation of a tactile stimulus from a skin-centred reference frame to a more external (e.g., body-centred or allocentric) one.

10.
Behavioral studies of multisensory integration in motion perception have focused on the particular case of visual and auditory signals. Here, we addressed a new case: audition and touch. In Experiment 1, we tested the effects of an apparent motion stream presented in an irrelevant modality (audition or touch) on the perception of apparent motion streams in the other modality (touch or audition, respectively). We found significant congruency effects (lower performance when the direction of motion in the irrelevant modality was incongruent with the direction of the target) for the two possible modality combinations. This congruency effect was asymmetrical, with tactile motion distractors having a stronger influence on auditory motion perception than vice versa. In Experiment 2, we used auditory motion targets and tactile motion distractors while participants adopted one of two possible postures: arms uncrossed or arms crossed. The effects of tactile motion on auditory motion judgments were replicated in the arms-uncrossed posture, but they dissipated in the arms-crossed posture. The implications of these results are discussed in light of current findings regarding the representation of tactile and auditory space.

11.
Multisensory integration of nonspatial features between vision and touch was investigated by examining the effects of redundant signals of visual and tactile inputs. In the present experiments, visual letter stimuli and/or tactile letter stimuli were presented, which participants were asked to identify as quickly as possible. The results of Experiment 1 demonstrated faster reaction times for bimodal stimuli than for unimodal stimuli (the redundant signals effect, RSE). The RSE was due to coactivation of figural representations from the visual and tactile modalities. This coactivation did not occur for a simple stimulus detection task (Experiment 2) or for bimodal stimuli with the same semantic information but different physical stimulus features (Experiment 3). The findings suggest that the integration process might occur at a relatively early stage of object identification, prior to the decision level.
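The coactivation claim in this abstract is conventionally tested against Miller's (1982) race-model inequality, which any parallel race account must satisfy: F_bimodal(t) ≤ F_visual(t) + F_tactile(t) for all t. Below is a minimal Python sketch of that test on hypothetical reaction-time data; the distributions and sample sizes are assumptions, not the study's data.

```python
# Minimal sketch of a race-model-inequality check (Miller, 1982) on
# hypothetical RT data. Violations (bimodal CDF exceeding the sum of the
# unimodal CDFs) are taken as evidence of coactivation rather than a race.
import numpy as np

def ecdf(rts, t):
    """Empirical CDF of the RT sample evaluated at each time in t."""
    rts = np.sort(np.asarray(rts))
    return np.searchsorted(rts, t, side="right") / len(rts)

rng = np.random.default_rng(0)
rt_visual  = rng.normal(520, 60, 200)   # hypothetical unimodal visual RTs (ms)
rt_tactile = rng.normal(560, 70, 200)   # hypothetical unimodal tactile RTs (ms)
rt_bimodal = rng.normal(470, 55, 200)   # hypothetical bimodal RTs (ms)

t = np.linspace(300, 700, 81)
violation = ecdf(rt_bimodal, t) - (ecdf(rt_visual, t) + ecdf(rt_tactile, t))

# Positive values, typically at the fast end of the RT distribution,
# indicate that the race-model bound is violated.
print("maximum violation of the race-model bound:", violation.max())
```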

12.
Four experiments were conducted, three with tactile stimuli and one with visual stimuli, in which subjects made temporal order judgments (TOJs). The tactile stimuli were patterns that moved laterally across the fingerpads. The subject's task was to judge which finger received the pattern first. Even though the movement was irrelevant to the task, the subjects' TOJs were greatly affected by the direction of movement of the patterns. Accuracy in judging temporal order was enhanced when the patterns moved in a direction that was consistent with the temporal order of presentation--for example, when the movement on each fingerpad was from right to left and the temporally leading site of stimulation was to the right of the temporally trailing site of stimulation. When movement was inconsistent with the temporal order of presentation, accuracy was considerably reduced, often well below chance. The bias in TOJs was unaffected by training or by presenting the stimuli to fingers on opposite hands. In a fourth experiment, subjects judged the temporal order of visual stimuli that, like the tactile stimuli, moved in a direction that was either consistent or inconsistent with the TOJ. The results were similar to those obtained with tactile stimuli. It is suggested that the bias may be affected by attentional mechanisms and by apparent motion generated between the two sites on the skin.

13.
The ability to report the temporal order of 2 tactile stimuli (1 applied to each hand) has been shown to decline when the arms are crossed over compared with when they are uncrossed. However, these effects have only been measured when temporal order was reported by stimulus location. It is unknown whether this spatial manipulation of the body affects all tactile temporal order judgments (TOJs) or only those judgments that are spatially defined. The authors examined the effect of crossing the arms on tactile TOJs when stimuli were identified by either spatial (location) or nonspatial (frequency or duration) attributes. Spatial TOJs were significantly impaired when the arms were in crossed compared with uncrossed postures, but there was no effect of posture when order was judged by nonspatial attributes. Task-dependent modulation of the effects of posture was also evident when response complexity was reduced to go/no-go responses. These results suggest that crossing the arms impairs tactile localization and thus spatial TOJs. However, the data also suggest that localization is not a necessary precursor when temporal order can be computed by nonspatial means.

14.
Vibrotactile mobility systems present spatial information such as the direction of a waypoint through a localized vibration on the torso. Using these systems requires the ability to determine the absolute location of the stimulus. Because data are available only on the ability to determine the relative location of stimuli on the torso, we developed a novel method for measuring absolute localization on the basis of triangulation. For 15 observers, we calculated the subjective location of visual and tactile stimuli on the frontal half of the torso. The size of the 95% confidence intervals around the subjective tactile locations is about the size of the stimuli (1.66 cm) and is slightly larger than that around the subjective visual locations (mean difference, 0.17 cm). The error in tactile judgments over and above that in the visual judgments is present only for locations near the body midline. When the subjective visual and tactile locations are not co-located, the difference can best be described by a shift along the radius from the body midaxis. The same holds for the differences between the veridical and the subjective locations. Therefore, the difference between the veridical and the subjective directions of a stimulus is small. These results suggest that stimulus locations on the torso are coded in polar coordinates, in which the angle is perceptually invariant and the distance is less important, probably because it varies with changes in, among other things, posture and breathing.
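The polar-coordinate account proposed here can be made concrete with a short computation. The sketch below uses made-up torso coordinates (not the authors' data) to show how a localization error that lies along the radius from the body midaxis leaves the perceived angle nearly unchanged while shifting the perceived distance.

```python
# Minimal sketch with hypothetical coordinates: a radial shift of the
# subjective location barely changes its angle about the body midaxis.
import numpy as np

def to_polar(x, y):
    """Cartesian (cm, origin at the body midaxis) -> (radius cm, angle deg)."""
    return np.hypot(x, y), np.degrees(np.arctan2(y, x))

veridical  = (12.0, 5.0)    # hypothetical actual stimulus location
subjective = (10.5, 4.4)    # hypothetical judged location, shifted inward

r_v, a_v = to_polar(*veridical)
r_s, a_s = to_polar(*subjective)
print(f"angular error: {abs(a_v - a_s):.2f} deg")   # small
print(f"radial error:  {abs(r_v - r_s):.2f} cm")    # comparatively large
```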

15.
This paper examines how the covert orienting of spatial attention affects motor responses to visual stimuli. Premotor theories, as well as hemifield-inhibition accounts of visual attention, predict an increase in response times when a target stimulus appears in the opposite direction to a spatial cue. Some models also suggest that this meridional effect should be increased across oblique meridians. Two types of cue (central and peripheral) were used to orient attention towards locations prior to the onset of visual targets. Simple manual (press button) and saccadic responses were measured. No meridional effects were found with peripheral cues, whereas central cueing produced meridional effects across all meridians. Cueing effects did not vary significantly with two-dimensional axis for either manual or saccadic responses. Increases in response time with cue-target distance were found for both response and cue types. For saccades, distance gradients were shallower moving distally rather than proximally from the cued position. However, simple manual responses did not show this asymmetry. Orienting to central cues also modulated the amplitude of saccades. The results are consistent with an effect of attentional cues in oculomotor centres as well as the existence of action-dependent attentional representations. However, it is proposed that, rather than reflecting oculomotor programming, meridional effects arise from a directional organization within spatio-cognitive representations.

16.
Change blindness, the surprising inability of people to detect significant changes between consecutively presented visual displays, has recently been shown to affect tactile perception as well. Visual change blindness has been observed during saccades and eye blinks, conditions under which people’s awareness of visual information is temporarily suppressed. In the present study, we demonstrate change blindness for suprathreshold tactile stimuli resulting from the execution of a secondary task requiring bodily movement. In Experiment 1, the ability of participants to detect changes between two sequentially presented vibrotactile patterns delivered to their arms and legs was compared while they performed a secondary task consisting of either the execution of a movement with the right arm toward a visual target or the verbal identification of the target side. The results demonstrated that a motor response gave rise to the largest drop in perceptual sensitivity (as measured by changes in d′) in detecting changes to the tactile display. In Experiment 2, we replicated these results under conditions in which the participants had to detect tactile changes while turning a steering wheel instead. These findings are discussed in terms of the role played by bodily movements, sensory suppression, and higher-order information processing in modulating people’s awareness of tactile information across the body surface.
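The sensitivity measure d′ mentioned above is the standard signal-detection statistic d′ = z(H) − z(FA). A minimal sketch follows, using hypothetical response counts and a common log-linear correction for extreme rates; neither the counts nor the correction choice come from the paper.

```python
# Minimal sketch: computing d' from hit and false-alarm counts,
# with a log-linear correction so rates of exactly 0 or 1 stay finite.
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    h  = (hits + 0.5) / (hits + misses + 1)
    fa = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    return norm.ppf(h) - norm.ppf(fa)

# Hypothetical counts for one change-detection condition (not the paper's data).
print(f"d' = {d_prime(34, 14, 10, 38):.2f}")
```

A drop in d′ across conditions, as reported for the motor-response condition above, indicates reduced perceptual sensitivity rather than a mere shift in response bias.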

17.
Three experiments investigated cross-modal links between touch, audition, and vision in the control of covert exogenous orienting. In the first two experiments, participants made speeded discrimination responses (continuous vs. pulsed) for tactile targets presented randomly to the index finger of either hand. Targets were preceded at a variable stimulus onset asynchrony (150, 200, or 300 msec) by a spatially uninformative cue that was either auditory (Experiment 1) or visual (Experiment 2) on the same or opposite side as the tactile target. Tactile discriminations were more rapid and accurate when cue and target occurred on the same side, revealing cross-modal covert orienting. In Experiment 3, spatially uninformative tactile cues were presented prior to randomly intermingled auditory and visual targets requiring an elevation discrimination response (up vs. down). Responses were significantly faster for targets in both modalities when presented ipsilateral to the tactile cue. These findings demonstrate that the peripheral presentation of spatially uninformative auditory and visual cues produces cross-modal orienting that affects touch, and that tactile cues can also produce cross-modal covert orienting that affects audition and vision.

18.
Stimulus-response (S-R) compatibility effects between vertically oriented stimuli (above or below fixation) and horizontally oriented responses (left or right switch deflections by a single hand) have been shown to depend both on which hand responds (Bauer & Miller, 1982) and on the location at which the response is made (eccentricity on a frontoparallel line; Michaels, 1989). In the latter study, hand position and hand posture were confounded, so it is unclear which variable determined the compatibility effect. In Experiment 1, the importance of effector position was tested. Vertically oriented stimuli were paired with a horizontal response solicited at different locations but always involving the same hand posture. Compatibility effects emerged, and their direction depended on position. In Experiment 2, the compatibilities were not evident in a simple reaction time paradigm, so the effect was not due to differential ease of responses. In Experiment 3, a change in hand posture (palm up or palm down) at the same location (the body midline) also affected the compatibilities. It was concluded that the S-R compatibility of orthogonally oriented stimuli and responses is influenced by (1) which hand responds, (2) the location of that hand, and (3) its posture. The results imply that both postural and positional states of the action system affect S-R compatibility.

19.
Previous studies of tactile spatial perception focussed either on a single point of stimulation, on local patterns within a single skin region such as the fingertip, on tactile motion, or on active touch. It remains unclear whether we should speak of a tactile field, analogous to the visual field, that supports spatial relations between stimulus locations. Here we investigate this question by studying perception of large-scale tactile spatial patterns on the hand, arm and back. Experiment 1 investigated the relation between perception of tactile patterns and the identification of subsets of those patterns. The results suggest that perception of tactile spatial patterns is based on representing the spatial relations between locations of individual stimuli. Experiment 2 investigated the spatial and temporal organising principles underlying these relations. Experiment 3 showed that tactile pattern perception makes reference to structural representations of the body, such as body parts separated by joints. Experiment 4 found that precision of pattern perception is poorer for tactile patterns that extend across the midline, compared to unilateral patterns. Overall, the results suggest that the human sense of touch involves a tactile field, analogous to the visual field. The tactile field supports computation of spatial relations between individual stimulus locations, and thus underlies tactile pattern perception.

20.

It has been suggested that judgments about the temporal–spatial order of successive tactile stimuli depend on the perceived direction of apparent motion between them. Here we manipulated tactile apparent-motion percepts by presenting a brief, task-irrelevant auditory stimulus temporally in-between pairs of tactile stimuli. The tactile stimuli were applied one to each hand, with varying stimulus onset asynchronies (SOAs). Participants reported the location of the first stimulus (temporal order judgments: TOJs) while adopting both crossed and uncrossed hand postures, so we could scrutinize skin-based, anatomical, and external reference frames. With crossed hands, the sound improved TOJ performance at short (≤300 ms) and at long (>300 ms) SOAs. When the hands were uncrossed, the sound induced a decrease in TOJ performance, but only at short SOAs. A second experiment confirmed that the auditory stimulus indeed modulated tactile apparent motion perception under these conditions. Perceived apparent motion directions were more ambiguous with crossed than with uncrossed hands, probably indicating competing spatial codes in the crossed posture. However, irrespective of posture, the additional sound tended to impair potentially anatomically coded motion direction discrimination at a short SOA of 80 ms, but it significantly enhanced externally coded apparent motion perception at a long SOA of 500 ms. Anatomically coded motion signals imply incorrect TOJ responses with crossed hands, but correct responses when the hands are uncrossed; externally coded motion signals always point toward the correct TOJ response. Thus, taken together, these results suggest that apparent-motion signals are likely taken into account when tactile temporal–spatial information is reconstructed.

