Similar articles
20 similar articles found (search time: 93 ms)
1.
How do we individuate body parts? Here, we investigated the effect of body segmentation between hand and arm in tactile and visual perception. In a first experiment, we showed that two tactile stimuli felt farther away when they were applied across the wrist than when they were applied within a single body part (palm or forearm), indicating a "category boundary effect". In the following experiments, we excluded two hypotheses, which attributed tactile segmentation to other, nontactile factors. In Experiment 2, we showed that the boundary effect does not arise from motor cues. The effect was reduced during a motor task involving flexion and extension movements of the wrist joint. Action brings body parts together into functional units, instead of pulling them apart. In Experiments 3 and 4, we showed that the effect does not arise from perceptual cues of visual discontinuities. We did not find any segmentation effect for the visual percept of the body in Experiment 3, nor for a neutral shape in Experiment 4. We suggest that the mental representation of the body is structured in categorical body parts delineated by joints, and that this categorical representation modulates tactile spatial perception.

3.
The tactile surface forms a continuous sheet covering the body. And yet, the perceived distance between two touches varies across stimulation sites. Perceived tactile distance is larger when stimuli cross over the wrist, compared to when both fall on either the hand or the forearm. This effect could reflect a categorical distortion of tactile space across body-part boundaries (in which stimuli crossing the wrist boundary are perceptually elongated) or may simply reflect a localised increase in acuity surrounding anatomical landmarks (in which stimuli near the wrist are perceptually elongated). We tested these two interpretations across two experiments, by comparing a well-documented bias to perceive mediolateral tactile distances across the forearm/hand as larger than proximodistal ones along the forearm/hand at three different sites (hand, wrist, and forearm). According to the ‘categorical’ interpretation, tactile distances should be elongated selectively in the proximodistal axis thus reducing the anisotropy. According to the ‘localised acuity’ interpretation, distances will be perceptually elongated in the vicinity of the wrist regardless of orientation, leading to increased overall size without affecting anisotropy. Consistent with the categorical account, we found a reduction in the magnitude of anisotropy at the wrist, with no evidence of a corresponding localised increase in precision. These findings demonstrate that we reference touch to a representation of the body that is categorically segmented into discrete parts, which consequently influences the perception of tactile distance.

4.
The perceived distance between touches on a single skin surface is larger on regions of high tactile sensitivity than those with lower acuity, an effect known as Weber's illusion. This illusion suggests that tactile size perception involves a representation of the perceived size of body parts preserving characteristics of the somatosensory homunculus. Here, we investigated how body shape is coded within this representation by comparing tactile distances presented in different orientations on the hand. Participants judged which of two tactile distances on the dorsum of their left hand felt larger. One distance was aligned with the proximodistal axis (along the hand), the other with the mediolateral axis (across the hand). Across distances were consistently perceived as larger than along ones. A second experiment showed that this effect is specific to the hairy skin of the hand dorsum and does not occur on glabrous skin of the palm. A third experiment demonstrated that this bias reflects orientation on the hand surface, rather than an eye- or torso-centered reference frame. These results mirror known orientational anisotropies of both tactile acuity and of tactile receptive fields (RFs) of cortical neurons. We suggest that the dorsum of the hand is implicitly represented as wider than it actually is and that the shape of tactile RFs may partly explain distortions of mental body representations.

5.
Previous research has revealed that anticipating pain at a particular location of the body prioritizes somatosensory input presented there. The present study tested whether the spatial features of bodily threat are limited to the exact location of nociception. Participants judged which one of two tactile stimuli, presented to either hand, had been presented first, while occasionally experiencing a painful stimulus. The distance between the pain and tactile locations was manipulated. In Experiment 1, participants expected pain either proximal to one of the tactile stimuli (on the hand; near condition) or more distant on the same body part (arm; far condition). In Experiment 2, the painful stimulus was expected either proximal to one of the tactile stimuli (hand; near) or on a different body-part at the same body side (leg; far). The results revealed that in the near condition of both experiments, participants became aware of tactile stimuli presented to the “threatened” hand more quickly as compared to the “neutral” hand. Of particular interest, the data in the far conditions showed a similar prioritization effect when pain was expected at a different location of the same body part as well as when pain was expected at a different body part at the same body side. In this study, the encoding of spatial features of bodily threat was not limited to the exact location where pain was anticipated but rather generalized to the entire body part and even to different body parts at the same side of the body.

6.
Tactile signals on a hand that serves as a movement goal are enhanced during movement planning and execution. Here, we examined how spatially specific tactile enhancement is when humans reach to their own static hand. Participants discriminated between two brief, simultaneously presented tactile stimuli: a comparison stimulus on the left thumb or little finger and a reference stimulus on the sternum. Tactile stimuli were presented either during right-hand reaching towards the left thumb or little finger or while holding both hands still (baseline). Consistent with our previous findings, stimuli on the left hand were perceived as stronger during movement than at baseline. However, tactile enhancement was not stronger when the stimuli were presented on the digit that served as reach target; rather, perception across the whole hand was uniformly enhanced. In Experiment 2, we also presented stimuli on the upper arm in half of the trials to reduce the expectation of the stimulus location. Tactile stimuli on the target hand, but not on the upper arm, were generally enhanced, supporting the idea of a spatial gradient of tactile enhancement. Overall, our findings argue for low spatial specificity of tactile enhancement at movement-relevant body parts, here the target hand.

7.
Vision of the body modulates somatosensation, even when entirely non-informative about stimulation. For example, seeing the body increases tactile spatial acuity, but reduces acute pain. While previous results demonstrate that vision of the body modulates somatosensory sensitivity, it is unknown whether vision also affects metric properties of touch, and if so how. This study investigated how non-informative vision of the body modulates tactile size perception. We used the mirror box illusion to induce the illusion that participants were directly seeing their stimulated left hand, though they actually saw their reflected right hand. We manipulated whether participants: (a) had the illusion of directly seeing their stimulated left hand, (b) had the illusion of seeing a non-body object at the same location, or (c) looked directly at their non-stimulated right hand. Participants made verbal estimates of the perceived distance between two tactile stimuli presented simultaneously to the dorsum of the left hand, either 20, 30, or 40 mm apart. Vision of the body significantly reduced the perceived size of touch, compared to vision of the object or of the contralateral hand. In contrast, no apparent changes of perceived hand size were found. These results show that seeing the body distorts tactile size perception.

8.
Shore DI, Gray K, Spry E, Spence C. Perception, 2005, 34(10): 1251-1262
We report a series of three experiments designed to examine the effect of posture on tactile temporal processing. Observers reported which of two tactile stimuli, presented to the left and right index fingers (experiments 1-3; or thumb, experiment 3), was perceived first while adopting one of two postures in the dark: hands-close (adjacent, but not touching) or hands-far (1 m apart). Just-noticeable differences were significantly smaller in the hands-far posture across all three experiments. In the first two experiments we compared hand versus foot responses and found equivalent advantages for the hands-far posture. In the final experiment the stimuli were presented to either the same or different digit on each hand (index finger or thumb) and we found that only when the same digit on each hand was stimulated was there an advantage for the hands-far posture. The finding that temporal precision was better with greater distance contradicts predictions based on attention-switching models of temporal-order judgments, and also contrasts with results from similar experimental manipulations in other modalities (e.g., vision). These results provide support for a rapid and automatic process that transforms the representation of a tactile stimulus from a skin-centred reference frame to a more external (e.g., body-centred or allocentric) one.
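The just-noticeable differences reported above are typically derived by fitting a psychometric function to temporal-order judgments across stimulus onset asynchronies (SOAs). The sketch below is a minimal illustration of that standard analysis, not the authors' own code: it fits a cumulative Gaussian by grid-search maximum likelihood to hypothetical TOJ data (the SOAs, response proportions, trial count, and grid ranges are all invented for the example) and reports the JND as 0.6745 σ, i.e. half the distance between the 25% and 75% points of the fitted curve.

```python
import math

def cum_gauss(x, mu, sigma):
    # Cumulative Gaussian psychometric function: P("right first") at SOA x.
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def fit_toj(soas, props, n_trials, mus, sigmas):
    # Grid-search maximum-likelihood fit of (mu, sigma) under a binomial
    # response model; mu is the point of subjective simultaneity (PSS).
    best = (-math.inf, None, None)
    for mu in mus:
        for sigma in sigmas:
            ll = 0.0
            for x, p in zip(soas, props):
                q = min(max(cum_gauss(x, mu, sigma), 1e-9), 1.0 - 1e-9)
                ll += n_trials * (p * math.log(q) + (1 - p) * math.log(1 - q))
            if ll > best[0]:
                best = (ll, mu, sigma)
    return best[1], best[2]

# Hypothetical data: SOAs in ms (negative = left stimulus first) and the
# proportion of "right first" responses at each SOA (20 trials per SOA).
soas = [-200, -90, -30, 30, 90, 200]
props = [0.05, 0.20, 0.40, 0.60, 0.80, 0.95]

mu, sigma = fit_toj(soas, props, n_trials=20,
                    mus=range(-50, 55, 5), sigmas=range(20, 305, 5))
jnd = 0.6745 * sigma  # half the 25%-75% interval of the fitted curve
print(f"PSS = {mu} ms, JND = {jnd:.1f} ms")
```

The factor 0.6745 is the 75th-percentile z-score of the standard normal, so 0.6745 σ spans half the central 25%-75% interval; in practice a continuous optimizer (e.g., scipy.optimize) would replace the coarse grid search.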

9.
Two motor acts were analyzed at the level of tongue and fingers. These motor acts generated illusions. When subjects voluntarily rotated the tongue by 90 degrees, the perceived orientation of a tactile stimulus applied to the tongue did not covary with the perceived orientation of the tongue itself. Analogously, when subjects voluntarily crossed two adjacent fingers, the perceived position of two tactile stimuli applied to the fingers did not covary with the perceived position of the fingers themselves. Although tongue and fingers were positioned accurately in space, a lack of perceptual constancy occurred for tactile stimuli applied to these body parts. Therefore, whereas position sense was preserved, correct localization of objects was lost. The occurrence of this perceptual dissociation suggests that spatial localization of tactile stimuli may be independent both of knowledge of body part location and motor activity.

10.
Mental body representations underlying tactile perception do not accurately reflect the body’s true morphology. For example, perceived tactile distance is dependent on both the body part being touched and the stimulus orientation, a phenomenon called Weber’s illusion. These findings suggest the presence of size and shape distortions, respectively. However, whereas each morphological feature is typically measured in isolation, a complete morphological characterization requires the concurrent measurement of both size and shape. We did so in three experiments, manipulating both the stimulated body parts (hand; forearm) and stimulus orientation while requiring participants to make tactile distance judgments. We found that the forearm was significantly more distorted than the hand lengthwise but not widthwise. Effects of stimulus orientation are thought to reflect receptive field anisotropies in primary somatosensory cortex. The results of the present study therefore suggest that mental body representations retain homuncular shape distortions that characterize early stages of somatosensory processing.

11.
The concept that distance on the skin is frequently misperceived was first reported over a century ago by Weber. Weber and others have reported that the apparent distance between pressure stimuli fluctuates with both body site and stimulus orientation. The present study confirms these effects and shows that the misperceptions are usually compressive in nature. It further establishes that errors in perceived distance correspond to errors in perceived location, indicating that an interaction exists between the perceptual processes responsible for percepts of tactile location and distance. Perceived location depends on the relationship of a tactile stimulus both to the body frame and to nearby stimuli, and the effect of nearby stimuli is to induce a perceptual affinity between sensations of pressure. These results are discussed in relation to the more frequently examined dynamic illusions of tactile distance (tau phenomenon) and location (the cutaneous rabbit).

12.
Poliakoff E, Miles E, Li X, Blanchette I. Cognition, 2007, 102(3): 405-414
Viewing a threatening stimulus can bias visual attention toward that location. Such effects have typically been investigated only in the visual modality, despite the fact that many threatening stimuli are most dangerous when close to or in contact with the body. Recent multisensory research indicates that a neutral visual stimulus, such as a light flash, can lead to a tactile attention shift towards a nearby body part. Here, we investigated whether the threat value of a visual stimulus modulates its effect on attention to touch. Participants made speeded discrimination responses about tactile stimuli presented to one or other hand, preceded by a picture cue (snake, spider, flower or mushroom) presented close to the same or the opposite hand. Pictures of snakes led to a significantly greater tactile attentional facilitation effect than did non-threatening pictures of flowers and mushrooms. Furthermore, there was a correlation between self-reported fear of snakes and spiders and the magnitude of early facilitation following cues of that type. These findings demonstrate that the attentional bias towards threat extends to the tactile modality and indicate that perceived threat value can modulate the cross-modal effect that a visual cue has on attention to touch.

13.
Previous studies of tactile spatial perception focussed either on a single point of stimulation, on local patterns within a single skin region such as the fingertip, on tactile motion, or on active touch. It remains unclear whether we should speak of a tactile field, analogous to the visual field, and supporting spatial relations between stimulus locations. Here we investigate this question by studying perception of large-scale tactile spatial patterns on the hand, arm and back. Experiment 1 investigated the relation between perception of tactile patterns and the identification of subsets of those patterns. The results suggest that perception of tactile spatial patterns is based on representing the spatial relations between locations of individual stimuli. Experiment 2 investigated the spatial and temporal organising principles underlying these relations. Experiment 3 showed that tactile pattern perception makes reference to structural representations of the body, such as body parts separated by joints. Experiment 4 found that precision of pattern perception is poorer for tactile patterns that extend across the midline, compared to unilateral patterns. Overall, the results suggest that the human sense of touch involves a tactile field, analogous to the visual field. The tactile field supports computation of spatial relations between individual stimulus locations, and thus underlies tactile pattern perception.

14.
Rubber-hand and virtual-hand illusions show that people can perceive body ownership for objects under suitable conditions. Bottom-up approaches assume that perceived ownership emerges from multisensory matching (e.g., between seen object and felt hand movements), whereas top-down approaches claim that novel body parts are integrated only if they resemble some part of a permanent internal body representation. We demonstrate that healthy adults perceive body ownership for a virtual balloon changing in size, and a virtual square changing in size or color, in synchrony with movements of their real hand. This finding is inconsistent with top-down approaches and amounts to an existence proof that non-corporeal events can be perceived as body parts if their changes are systematically related to one’s actions. It also implies that previous studies with passive-stimulation techniques might have underestimated the plasticity of body representations and put too much emphasis on the resemblance between viewed object and real hand.

15.
Tactile stimulus location is automatically transformed from somatotopic into external spatial coordinates, rendering information about the location of touch in three-dimensional space. This process is referred to as tactile remapping. Whereas remapping seems to occur automatically for the hands and feet, the fingers may constitute an exception in that some studies have implied purely somatotopic coding of touch to the fingers. When participants judge the order of two tactile stimuli, they often err when the stimulated body parts (usually the two hands) are crossed, presumably because somatotopic and external coordinates are in conflict in crossed postures. Using this task, we investigated, first, whether the fingers are unlike other limbs with regard to spatial coding, by testing whether crossing effects, indicative of external coding, were observable when stimulating two fingers, either on the same or on different hands. Second, we investigated the interaction of hand and finger posture in tactile localization of finger stimuli. Crossing effects emerged when fingers and hands were crossed, suggesting external coding for all body parts. Crossing effects were larger when both hand and finger were located in the hemifield opposite to their body side, and smaller when only hand or finger lay in the opposite hemifield. We suggest that tactile location is estimated by integrating the external location of all relevant body parts, here of a finger and its belonging hand, and that such integrative coding may represent a general principle for body part processing as well as for tool use.

16.
The participants in this study discriminated the position of tactile target stimuli presented at the tip or the base of the forefinger of one of the participants’ hands, while ignoring visual distractor stimuli. The visual distractor stimuli were presented from two circles on a display aligned with the tactile targets in Experiment 1 or orthogonal to them in Experiment 2. Tactile discrimination performance was slower and less accurate when the visual distractor stimuli were presented from incongruent locations relative to the tactile target stimuli (e.g., tactile target at the base of the finger with top visual distractor) highlighting a cross-modal congruency effect. We examined whether the presence and orientation of a simple line drawing of a hand, which was superimposed on the visual distractor stimuli, would modulate the cross-modal congruency effects. When the tactile targets and the visual distractors were spatially aligned, the modulatory effects of the hand picture were small (Experiment 1). However, when they were spatially misaligned, the effects were much larger, and the direction of the cross-modal congruency effects changed in accordance with the orientation of the picture of the hand, as if the hand picture corresponded to the participants’ own stimulated hand (Experiment 2). The results suggest that the two-dimensional picture of a hand can modulate processes maintaining our internal body representation. We also observed that the cross-modal congruency effects were influenced by the postures of the stimulated and the responding hands. These results reveal the complex nature of spatial interactions among vision, touch, and proprioception.

17.
The phenomenon of change blindness (the surprising inability of people to correctly perceive changes between consecutively presented displays), primarily reported in vision, has recently been shown to occur for positional changes presented in tactile displays as well. Here, we studied people's ability to detect changes in the number of tactile stimuli in successively presented displays composed of one to three stimuli distributed over the body surface. In Experiment 1, a tactile mask consisting of the simultaneous activation of all seven possible tactile stimulators was sometimes presented between the two to-be-discriminated tactile displays. In Experiment 2, a "mudsplash" paradigm was used, with a brief irrelevant tactile distractor presented at the moment of change of the tactile display. Change blindness was demonstrated in both experiments, thus showing that the failure to detect tactile change is not necessarily related to (1) the physical disruption between consecutive events, (2) the effect of masking covering the location of the change, or (3) the erasure or resetting of the information contained within an internal representation of the tactile display. These results are interpreted in terms of a limitation in the number of spatial locations/events that can be consciously accessed at any one time. This limitation appears to constrain change-detection performance, no matter the sensory modality in which the stimuli are presented.

19.
Adults show a deficit in their ability to localize tactile stimuli to their hands when their arms are in the less familiar, crossed posture. It is thought that this ‘crossed‐hands deficit’ arises due to a conflict between the anatomical and external spatial frames of reference within which touches can be encoded. The ability to localize a single tactile stimulus applied to one of the two hands across uncrossed‐hands and crossed‐hands postures was investigated in typically developing children (aged 4 to 6 years). The effect of posture was also compared across conditions in which children did, or did not, have visual information about current hand posture. All children, including the 4‐year‐olds, demonstrated the crossed‐hands deficit when they did not have sight of hand posture, suggesting that touch is located in an external reference frame by this age. In this youngest age group, when visual information about current hand posture was available, tactile localization performance was impaired specifically when the children's hands were uncrossed. We propose that this may be due to an early difficulty with integrating visual representations of the hand within the body schema.

20.
The rubber hand illusion (RHI) and its variant the invisible hand illusion (IHI) are useful for investigating multisensory aspects of bodily self‐consciousness. Here, we explored whether auditory conditioning during an RHI could enhance the trisensory visuo‐tactile‐proprioceptive interaction underlying the IHI. Our paradigm consisted of an IHI session that was followed by an RHI session and another IHI session. The IHI sessions had two parts presented in counterbalanced order. One part was conducted in silence, whereas the other part was conducted on the backdrop of metronome beats that occurred in synchrony with the brush movements used for the induction of the illusion. In a first experiment, the RHI session also involved metronome beats and was aimed at creating an associative memory between the brush stroking of a rubber hand and the sounds. An analysis of IHI sessions showed that the participants’ perceived hand position drifted more towards the body‐midline in the metronome relative to the silent condition without any sound‐related session differences. Thus, the sounds, but not the auditory RHI conditioning, influenced the IHI. In a second experiment, the RHI session was conducted without metronome beats. This confirmed the conditioning‐independent presence of sound‐induced proprioceptive drift in the IHI. Together, these findings show that the influence of visuo‐tactile integration on proprioceptive updating is modifiable by irrelevant auditory cues merely through the temporal correspondence between the visuo‐tactile and auditory events.
