Similar Documents
A total of 20 similar documents were retrieved (search time: 15 ms).
1.
Representation of visuotactile space in the split brain
Recent neurophysiological research in the monkey has revealed bimodal neuronal cells with both tactile receptive fields on the hand and visual receptive fields that follow the hands as they move, suggesting the existence of a bimodal map of visuotactile space. Using a cross-modal congruency task, we examined the representation of visuotactile space in normal people and in a split-brain patient (J.W.) as the right arm assumed different postures. The results showed that the congruency effects from distracting lights followed the hand around in space in normal people, but failed to do so in the split-brain patient when the hand crossed the midline. This suggests that cross-cortical connections are required to remap visual space to the current hand position when the hand crosses the midline.
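For readers unfamiliar with how the cross-modal congruency effect referred to throughout these abstracts is typically quantified, the sketch below shows one common scoring scheme: mean response time (or error rate) on incongruent trials minus the mean on congruent trials, computed separately for each condition of interest (here, hand posture). This is only an illustration, not the analysis code of any cited study; the column names, example values, and pandas-based implementation are assumptions.

```python
# Minimal sketch of scoring a cross-modal congruency effect (CCE):
# CCE = mean RT on incongruent trials - mean RT on congruent trials,
# computed per condition (e.g., hand posture). Data are hypothetical.
import pandas as pd

trials = pd.DataFrame({
    "posture":    ["uncrossed"] * 4 + ["crossed"] * 4,
    "congruency": ["congruent", "congruent", "incongruent", "incongruent"] * 2,
    "rt_ms":      [512, 498, 571, 590, 530, 541, 548, 552],
})

mean_rt = trials.groupby(["posture", "congruency"])["rt_ms"].mean().unstack()
cce = mean_rt["incongruent"] - mean_rt["congruent"]   # larger value = stronger visual interference
print(cce)
```

In a study like the one above, the question of interest is whether this per-posture CCE continues to track the hand when it crosses the midline.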

2.
The participants in this study discriminated the position of tactile target stimuli presented at the tip or the base of the forefinger of one of the participants' hands, while ignoring visual distractor stimuli. The visual distractor stimuli were presented from two circles on a display aligned with the tactile targets in Experiment 1 or orthogonal to them in Experiment 2. Tactile discrimination performance was slower and less accurate when the visual distractor stimuli were presented from incongruent locations relative to the tactile target stimuli (e.g., tactile target at the base of the finger with top visual distractor), highlighting a cross-modal congruency effect. We examined whether the presence and orientation of a simple line drawing of a hand, which was superimposed on the visual distractor stimuli, would modulate the cross-modal congruency effects. When the tactile targets and the visual distractors were spatially aligned, the modulatory effects of the hand picture were small (Experiment 1). However, when they were spatially misaligned, the effects were much larger, and the direction of the cross-modal congruency effects changed in accordance with the orientation of the picture of the hand, as if the hand picture corresponded to the participants' own stimulated hand (Experiment 2). The results suggest that a two-dimensional picture of a hand can modulate processes maintaining our internal body representation. We also observed that the cross-modal congruency effects were influenced by the postures of the stimulated and the responding hands. These results reveal the complex nature of spatial interactions among vision, touch, and proprioception.

3.
Two experiments employing subjects with different experience in tactile discrimination (blind and sighted subjects) were carried out to investigate the effect of the spatial location of stimuli on the information processing activity of the two cerebral hemispheres. An angle discrimination task that yields a right hemisphere superiority was used. In Experiment 1, sighted subjects showed a general superiority of the left hand (right hemisphere) which was more pronounced in the left hemispace with respect to central and right hemispace performance. In Experiment 2, blind subjects showed a significant superiority of the left hand in the central and in the left hemispace and no difference between the two hands in the right hemispace. In both experiments, hemispace differences were due only to changes in left-hand (right-hemisphere) performance. These results suggest that hemispace control by the contralateral hemisphere interacts only with the activity of the hemisphere that is dominant for the information processing.

4.
The aim of this study was to assess whether perceptual representation along the horizontal axis is affected by hemispace position of the stimulus or by orienting attention to one side. Ten control subjects and 10 right brain-damaged patients with left unilateral spatial neglect (USN) were asked to bisect lines of five lengths in three space positions (left, center, right) and under three cueing conditions (no cue, left cue, right cue). Normal controls showed significant displacement of bisection opposite to the side of hemispace presentation and toward the side of cueing. USN patients showed a bisection error toward the right end which increased with lines placed in the left hemispace and decreased with lines placed in the right hemispace and when attention was oriented toward the left side. We conclude that (1) in the absence of cues, normal subjects tend to overestimate the portions of space closer to their body midline; (2) both normal subjects and USN patients tend to overestimate portions of space that they direct their attention to; (3) USN patients' performance without cueing is consistent with an attentional shift toward the right hemispace, implying a gradient of overestimation of the right-most portions of space. A common neural substratum for directing attention and space representation can explain these findings.
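The bisection displacement described here is usually expressed as the signed deviation of the subjective midpoint from the true centre, often normalised by half the line length, with rightward errors positive. A minimal sketch of that standard scoring convention (the function name, units, and percentage convention are assumptions, not details from the cited paper):

```python
# Hypothetical helper for scoring line-bisection performance:
# signed error = marked position - true midpoint, rightward errors positive,
# optionally expressed as a percentage of half the line length.
def bisection_error(marked_mm: float, line_length_mm: float, as_percent: bool = True) -> float:
    true_mid = line_length_mm / 2.0
    error = marked_mm - true_mid          # > 0: rightward error (typical of left neglect)
    if as_percent:
        return 100.0 * error / true_mid   # normalise so different line lengths are comparable
    return error

# Example: a mark 12 mm right of centre on a 200 mm line -> +12 % error.
print(bisection_error(112.0, 200.0))
```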

5.
Across three experiments, participants made speeded elevation discrimination responses to vibrotactile targets presented to the thumb (held in a lower position) or the index finger (upper position) of either hand, while simultaneously trying to ignore visual distractors presented independently from either the same or a different elevation. Performance on the vibrotactile elevation discrimination task was slower and less accurate when the visual distractor was incongruent with the elevation of the vibrotactile target (e.g., a lower light during the presentation of an upper vibrotactile target to the index finger) than when they were congruent, showing that people cannot completely ignore vision when selectively attending to vibrotactile information. We investigated the attentional, temporal, and spatial modulation of these cross-modal congruency effects by manipulating the direction of endogenous tactile spatial attention, the stimulus onset asynchrony between target and distractor, and the spatial separation between the vibrotactile target, any visual distractors, and the participant's two hands within and across hemifields. Our results provide new insights into the spatiotemporal modulation of cross-modal congruency effects and highlight the utility of this paradigm for investigating the contributions of visual, tactile, and proprioceptive inputs to the multisensory representation of peripersonal space.

6.
The present paper describes a performance method for determining hand preference. The task requires participants to reach into different regions of hemispace to perform various actions (point, pick up, toss, sweep, and position) with a dowel located at each position. In accordance with the participants' hand preference as measured by the Waterloo Handedness Questionnaire, the preferred hand was used more frequently on the various performance tasks. The distribution of hand use in working space indicates that preferred hand use was almost exclusive for actions carried out in ipsilateral hemispace, while the preferred hand was used only moderately for actions in contralateral hemispace, revealing that this hand is used throughout a wider range of extrapersonal space than the nonpreferred hand. These trends were observed across all of the performance tasks, suggesting that task complexity did not affect the frequency of preferred hand use either overall or, more specifically, in right hemispace, as was predicted. This finding is inconsistent with empirical work on questionnaires indicating that verbal reports of preferred hand use increase for more complex tasks (e.g., Steenhuis & Bryden, 1988). As well, performance on the preferential reaching task correlated significantly with hand preference as measured on the Waterloo Handedness Questionnaire (Bryden, 1977), unlike the other performance measure examined, indicating that the preferential reaching task is sensitive to differences in the degree of hand preference.
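Preferential-reaching studies such as this one typically score the percentage of reaches made with the preferred hand at each position in working space and then relate that score to a questionnaire measure of handedness. The sketch below illustrates only that scoring logic; the data layout, example numbers, and the use of a Pearson correlation are assumptions rather than details of the cited method.

```python
# Sketch of scoring a preferential-reaching task: percentage of preferred-hand
# reaches per hemispace, plus a correlation with questionnaire handedness scores.
# Column names and example values are hypothetical.
import pandas as pd

reaches = pd.DataFrame({
    "participant": [1] * 4 + [2] * 4 + [3] * 4,
    "hemispace":   ["left", "left", "right", "right"] * 3,
    "used_preferred_hand": [0, 1, 1, 1,   1, 1, 1, 1,   0, 0, 1, 1],
})

pct_preferred = (reaches
                 .groupby(["participant", "hemispace"])["used_preferred_hand"]
                 .mean() * 100)
print(pct_preferred)   # typically higher in ipsilateral (right) hemispace for right-handers

# Relating overall preferred-hand use to a handedness questionnaire total:
whq_scores = pd.Series({1: 24, 2: 31, 3: 18})   # hypothetical questionnaire totals
overall_use = reaches.groupby("participant")["used_preferred_hand"].mean()
print(overall_use.corr(whq_scores))              # Pearson r across participants
```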

7.
We investigated the effect of unseen hand posture on cross-modal, visuo-tactile links in covert spatial attention. In Experiment 1, a spatially nonpredictive visual cue was presented to the left or right hemifield shortly before a tactile target on either hand. To examine the spatial coordinates of any cross-modal cuing, the unseen hands were either uncrossed or crossed so that the left hand lay to the right and vice versa. Tactile up/down (i.e., index finger/thumb) judgments were better on the same side of external space as the visual cue, for both crossed and uncrossed postures. Thus, which hand was advantaged by a visual cue in a particular hemifield reversed across the different unseen postures. In Experiment 2, nonpredictive tactile cues now preceded visual targets. Up/down judgments for the latter were better on the same side of external space as the tactile cue, again for both postures. These results demonstrate cross-modal links between vision and touch in exogenous covert spatial attention that remap across changes in unseen hand posture, suggesting a modulatory role for proprioception.

8.
Do patients with unilateral neglect exhibit direction-specific deficits in the control of movement velocity when performing goal-directed arm movements? Five patients with left-sided neglect performed unrestrained three-dimensional pointing movements to visual targets presented at the body midline and in the left and right hemispaces. A group of healthy adults and a group of patients with right-hemispheric brain damage but no neglect served as controls. Pointing was performed under normal room light or in darkness. Time-position data of the hand were recorded with an opto-electronic camera system. We found that, compared to healthy controls, movement times were longer in both patient groups due to prolonged acceleration and deceleration phases. Tangential peak hand velocity was lower in both patient groups, but not significantly different from that of controls. Single-peaked, bell-shaped velocity profiles of the hand were preserved in all right-hemispheric patients and in three out of five neglect patients. Most important, the velocity profiles of neglect patients for leftward targets did not differ significantly from those for targets in the right hemispace. In summary, we found evidence for general bradykinesia in neglect patients, but not for a direction-specific deficit in the control of hand velocity. We conclude that visual neglect induces characteristic changes in exploratory behavior, but not in the kinematics of goal-directed movements to objects in peripersonal space.
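The tangential velocity and bell-shaped speed profiles analysed here are derived by differentiating the sampled 3-D hand position over time and taking the magnitude of the resulting velocity vector. The sketch below shows that standard computation only; the sampling rate, array layout, and synthetic example are assumptions, and real motion-capture data would normally be low-pass filtered first.

```python
# Sketch of deriving tangential hand velocity from sampled 3-D position data,
# as typically done with opto-electronic motion-capture recordings.
import numpy as np

def tangential_velocity(positions_mm: np.ndarray, fs: float) -> np.ndarray:
    """positions_mm: array of shape (n_samples, 3) with x, y, z in millimetres."""
    velocity = np.gradient(positions_mm, 1.0 / fs, axis=0)   # mm/s per axis
    return np.linalg.norm(velocity, axis=1)                  # speed along the movement path

# Example: a straight 200 mm reach sampled at 100 Hz with a single-peaked,
# bell-shaped speed profile (raised-cosine displacement).
fs = 100.0
t = np.linspace(0.0, 1.0, int(fs) + 1)
x = 200.0 * (t - np.sin(2 * np.pi * t) / (2 * np.pi))
positions = np.column_stack([x, np.zeros_like(x), np.zeros_like(x)])
speed = tangential_velocity(positions, fs)
print(round(speed.max(), 1))   # peak tangential velocity, ~400 mm/s for this example
```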

9.
In mirror reflections, visual stimuli in near peripersonal space (e.g., an object in the hand) can project the retinal image of far, extrapersonal stimuli "beyond" the mirror. We studied the interaction of such visual reflections with tactile stimuli in a cross-modal congruency task. We found that visual distractors produce stronger interference on tactile judgments when placed close to the stimulated hand, but observed indirectly as distant mirror reflections, than when directly observed in equivalently distant far space, even when in contact with a dummy hand or someone else's hand in the far location. The stronger visual-tactile interference for the mirror condition implies that near stimuli seen as distant reflections in a mirror view of one's own hands can activate neural networks coding peripersonal space, because these visual stimuli are coded as having a true source near to the body.

10.
The rubber hand illusion is an illusory experience in which one's own hand is mislocalized when correlated visuotactile stimuli are presented to the real and fake hands. The visuotactile integration process appears to cause this illusion, and the corresponding brain activity has been revealed in many studies. In this study, we investigated the effect of the rubber hand illusion on the cross-modal integration process by measuring EEG. Participants who experienced a less intense illusion showed a greater congruency effect on reaction time (RT), a greater power increase at the parietal midline electrode (Pz), and smaller inter-electrode synchrony of gamma-band activity. On the other hand, participants who experienced a more intense illusion showed greater inter-electrode synchrony. The results suggest that gamma-band activity in the parietal area reflects the visuotactile integration process and that its synchrony determines the intensity of the illusion.
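The "inter-electrode synchrony of gamma-band activity" reported here is usually quantified with some form of phase-synchrony index between band-pass-filtered channels. The sketch below uses the phase-locking value (PLV) as one such index; the choice of PLV, the 30-45 Hz band, the sampling rate, and the SciPy-based implementation are assumptions of this illustration, not details taken from the cited study.

```python
# Sketch of one common inter-electrode synchrony measure for gamma-band EEG:
# band-pass filter two channels, take instantaneous phases via the Hilbert
# transform, and compute the phase-locking value (PLV) between them.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def gamma_plv(x: np.ndarray, y: np.ndarray, fs: float, band=(30.0, 45.0)) -> float:
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    phase_x = np.angle(hilbert(filtfilt(b, a, x)))
    phase_y = np.angle(hilbert(filtfilt(b, a, y)))
    # PLV = magnitude of the mean phase-difference vector; 1 = perfect locking, 0 = none.
    return float(np.abs(np.mean(np.exp(1j * (phase_x - phase_y)))))

# Example with synthetic data: two noisy copies of the same 40 Hz oscillation.
fs = 500.0
t = np.arange(0, 2.0, 1.0 / fs)
sig = np.sin(2 * np.pi * 40 * t)
print(gamma_plv(sig + 0.5 * np.random.randn(t.size),
                sig + 0.5 * np.random.randn(t.size), fs))   # close to 1
```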

11.
The mental representation of numbers along a line oriented left to right affects spatial cognition, facilitating responses in the ipsilateral hemispace (the spatial-numerical association of response codes [SNARC] effect). We investigated whether the number/space association is the result of an attentional shift or of response selection. Previous research has often introduced covert left/right response cues by presenting targets to the left or the right. The present study avoided left/right cues by requiring forced-choice upper/lower luminance discriminations of two mirror-reversed luminance gradients (the grayscales task). The grayscale stimuli were overlaid with strings of (1) low numbers, (2) high numbers, and (3) nonnumerical characters. In Experiment 1, 20 dextrals judged the number's magnitude and then indicated whether the upper or the lower grayscale was darker. Results showed leftward and rightward attentional biases for low and high numbers, respectively. Demands to process numbers along a left/right line were made less explicit in Experiment 2 (N = 8 dextrals), using (1) a parity judgment and (2) arbitrary linguistic labels for top/bottom. Once again, a spatial congruency effect was observed. Because the response (up/down) was orthogonal to the dimension of interest (left/right), the effect of number cannot be attributed to late-stage response congruencies. This study required unspeeded responses to stimuli presented in free vision, whereas other experiments have used speeded responses. Understanding the time course of number-space effects may, therefore, be important to the debate associated with response selection.
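In a grayscales-type task like this one, spatial bias is commonly scored as the proportion of trials on which the participant chooses the stimulus that is darker on a given side; a number-space association then shows up as that proportion shifting with number magnitude. A minimal sketch of one such scoring scheme (the column names, example data, and leftward-bias convention are assumptions, not the cited study's analysis):

```python
# Sketch of scoring leftward attentional bias in a grayscales-type task:
# bias = proportion of trials on which the grayscale that was darker on its
# LEFT side was judged darker overall, computed per number-magnitude condition.
import pandas as pd

trials = pd.DataFrame({
    "number_condition": ["low", "low", "low", "high", "high", "high"],
    "chose_left_dark":  [1, 1, 0, 0, 0, 1],   # 1 = picked the grayscale darker on the left
})

leftward_bias = trials.groupby("number_condition")["chose_left_dark"].mean()
print(leftward_bias)   # a low > high pattern would indicate a SNARC-like attentional shift
```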

12.
Performance-based measures of hand preference have been developed as an objective method of examining handedness. Previous research using this method showed that both skill demands and the position of the object in working space affect preferential hand reaching. Specifically, preferred hand reaches predominated in left hemispace, in spite of the biomechanical inefficiency involved in reaching across the body midline. This was mediated by the skill demands, with a higher frequency of preferred hand reaches for tasks requiring more skill. To further examine this issue, we increased the task skill demands. Twenty-two right-handed adults reached for five tools located in an array of five positions in front of them. Participants were required to pick up the tool, pick up and demonstrate how to use it, or pick up and actually use the tool on the materials provided. The results showed that the frequency of right hand reaches was greatest for the tool use condition. This effect was mediated by the position of the object in hemispace, with more right hand reaches occurring for the Use task in left hemispace than for the other tasks, in support of our previous work.

13.
Response latencies were measured to vibrotactile stimulation delivered to the forefinger of the left or right hand, which was positioned either ipsilaterally or contralaterally (across the midline) in left or right hemispace. While the two hands did not differ in speed of response, either hand performed better when located in right hemispace (Experiment 1). This effect was greatly reduced, though not eliminated, by a 90-degree lateral head turn: performance was then better with stimulation and responding in right-of-head hemispace, but not right-of-body hemispace (Experiment 2). When different hands received stimulation and initiated responses, and were located in either the same or opposite hemispaces, the right-hemispace superiority was found to be motor rather than sensory (Experiment 3). These findings are discussed in the context of the true and the phenomenological midline and the clinical syndrome of hemineglect.

14.
A difference in the perception of extrapersonal space has been shown to exist between dextrals and sinistrals. On the classical line bisection task, this difference is evident in a greater left bias for dextrals compared to sinistrals. Different modalities and regions of space can be affected. However, it has not yet been investigated whether a systematic bias also exists for the perception of personal or body space. We investigated this by using three tasks which assess different aspects of personal space in an implicit and explicit way. These tasks were performed by strongly right-handed (dextrals), strongly left-handed (sinistrals) and mixed-handed participants. First, a task of pointing to three areas of one's own body without the use of visual information showed dextrals to have an asymmetric estimation of their body. In right hemispace, dextrals' pointing was at a greater distance from the midsagittal plane compared to pointing in left hemispace. No such asymmetry was present for sinistrals, while mixed-handers' performance was intermediate to that of strong right- and strong left-handers. Second, a task of recovering circular patches from their body surface whilst blindfolded also showed superior performance of sinistrals compared to dextrals. On these two tasks, there was also a moderate relationship between handedness scores and performance measures. Third, a computer-based task of adjusting scaled body-outline-halves showed no handedness differences. Overall, these findings suggest handedness differences in the implicit but not explicit representation of one's own body space. Possible mechanisms underlying the handedness differences shown for the implicit tasks are a stronger lateralization or a greater activation imbalance for dextrals and/or greater access to right hemispheric functions, such as an "up-to-date body" representation, by sinistrals. In contrast, explicit measures of how body space is represented may not be affected due to their relying on a different processing pathway.

15.
Strong cross-modal interactions exist between visual and auditory processing. The relative contributions of perceptual versus decision-related processes to such interactions are only beginning to be understood. We used methodological and statistical approaches to control for potential decision-related contributions such as response interference, decisional criterion shift, and strategy selection. Participants were presented with rising-, falling-, and constant-amplitude sounds and were asked to detect change (increase or decrease) in sound amplitude while ignoring an irrelevant visual cue of a disk that grew, shrank, or stayed constant in size. Across two experiments, testing context was manipulated by varying the grouping of visual cues during testing, and cross-modal congruency showed independent perceptual and decision-related effects. Whereas a change in testing context greatly affected criterion shifts, cross-modal effects on perceptual sensitivity remained relatively consistent. In general, participants were more sensitive to increases in sound amplitude and less sensitive to sounds paired with dynamic visual cues. As compared with incongruent visual cues, congruent cues enhanced detection of amplitude decreases, but not increases. These findings suggest that the relative contributions of perceptual and decisional processing and the impacts of these processes on cross-modal interactions can vary significantly depending on asymmetries in within-modal processing, as well as consistencies in cross-modal dynamics.
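The distinction drawn here between perceptual sensitivity and decisional criterion comes from signal detection theory, where sensitivity (d′) and criterion (c) are computed from hit and false-alarm rates. A minimal worked sketch of those standard formulas follows; the log-linear correction for extreme rates is one common choice assumed here, not necessarily the correction used in the cited study.

```python
# Standard signal-detection indices for separating perceptual sensitivity (d')
# from decisional criterion (c). The log-linear correction for hit/false-alarm
# rates of 0 or 1 is an illustrative assumption.
from scipy.stats import norm

def dprime_and_criterion(hits: int, misses: int, fas: int, crs: int):
    hit_rate = (hits + 0.5) / (hits + misses + 1)   # log-linear correction
    fa_rate = (fas + 0.5) / (fas + crs + 1)
    z_hit, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
    d_prime = z_hit - z_fa                # sensitivity: how well the signal is detected
    criterion = -0.5 * (z_hit + z_fa)     # response bias: > 0 means a conservative criterion
    return d_prime, criterion

# Example: 40 hits / 10 misses and 15 false alarms / 35 correct rejections.
print(dprime_and_criterion(40, 10, 15, 35))
```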

16.
Seeing one's own body (either directly or indirectly) can influence visuotactile cross-modal interactions. Recently, it has been shown that even viewing a simple line drawing of a hand can also modulate such cross-modal interactions, as if the picture of the hand somehow corresponds to (or primes) the participants' own hand. Alternatively, however, it could be argued that the modulatory effects of viewing the picture of a hand on visuotactile interactions might simply be attributed to cognitive processes, such as semantic referral to the relevant body part or the orientation cues provided by the hand picture. In the present study, we evaluated these different interpretations of the hand picture effect. Participants made speeded discrimination responses to the location of brief vibrotactile targets presented to either the tip or the base of their forefinger, while trying to ignore simultaneously presented visual distractors appearing on either side of central fixation. We compared the modulatory effect of the picture of a hand with that seen when the visual distractors were presented next to words describing the tip and base of the forefinger (Experiment 1), or were superimposed over arrows which provided another kind of directional cue (Experiment 2). Tactile discrimination performance was modulated in the hand picture condition, but not in the word or arrow conditions. These results therefore suggest that visuotactile interactions are specifically modulated by the image of the hand rather than by cognitive cues such as mere semantic referral to the relevant body sites and/or any visual orientation cues provided by the picture of a hand.

17.
It is well established that temporal events are represented on a spatially oriented mental time line from left to right. Depending on the task characteristics, the spatial representation of time may be linked to different types of dimensions, including manual response codes and physical space codes. The aim of the present study was to analyze whether manual response and physical space codes are independent of each other or whether they interact when both types of information are involved in the task. The participants performed a temporal estimation task with two lateralized response buttons in four experiments. In the first experiment, the target stimuli were presented on the left side, at the center, or on the right side of the space, whereas the reference stimuli were always presented centrally. The reverse situation was presented in the second experiment. In the third experiment, both stimuli were presented in opposite spatial positions (e.g., left–right), whereas in the last experiment, both stimuli were presented in the same spatial position (e.g., left–left). In all experiments, perceptual and motor congruency effects were found, but no modulation of the congruency effects was found when both the perceptual and motor components were congruent. The results indicated that physical, spatial, and manual response codes are independent of each other for time–space associations, even when both codes are involved in the task. These results are discussed in terms of the "intermediate-coding" account.

18.
Visual line bisection was investigated in 26 sinistral and 24 dextral subjects as a function of hemispace, hand, and scan direction. An ANOVA revealed significant main effects for hand preference, due to the mean bisection errors of dextral subjects being significantly leftward of those of sinistral subjects; for hand, due to the bisection errors of the left hand being significantly to the left of those of the right hand; and for scan, due to the bisection errors following a left scan being significantly to the left of those following a right scan. One significant interaction was found, that between hand and direction of scan, due to a significant difference between the left and right hands following a scan from the left but not following a scan from the right. For dextral subjects, the leftward bisection errors of the left and right hands following a scan from the left, but not from the right, differed significantly from the midpoint. For sinistral subjects, the leftward bisection errors following a scan from the left and the rightward bisection errors following a scan from the right differed significantly from the midpoint for the left hand but not for the right hand. No significant main effects or interactions involving hemispace were found. This confirms that both sinistral and dextral subjects display pseudoneglect when using their preferred hand and scanning from the left. However, sinistrals, but not dextrals, will display reversed pseudoneglect when using their preferred hand and adopting a scan direction from the right. These results are discussed in terms of the interaction between three factors whose influence can jointly and severally produce misbisections: hemispheric specialisation for visuospatial function, hemispheric activation for a manual response, and the allocation of visual attention.

19.
The present study describes a developmental performance measure of hand preference that considers task complexity and position in hemispace. Eighty right-handed children and adults (ages 3-4, 6-7, 9-10, and 18-24) were observed for hand selection responses to two unimanual tasks (simple vs. complex) across positions in hemispace. Results revealed an age-related trend in the tendency to use the preferred hand in right and left hemispace. While the adults' and 3- to 4-year-olds' preferred hand use decreased as they moved into left hemispace, children between the ages of 6 and 10 years tended to use their preferred hands consistently throughout both regions of hemispace. The relationship between hand preference and skilled, cost-efficient performance throughout development is discussed.

20.
The metaphoric mapping theory suggests that abstract concepts, like time, are represented in terms of concrete dimensions such as space. This theory receives support from several lines of research ranging from psychophysics to linguistics and cultural studies; especially strong support comes from recent response time studies. These studies have reported congruency effects between the dimensions of time and space, indicating that time evokes spatial representations that may facilitate or impede responses to words with a temporal connotation. The present paper reports the results of three linguistic experiments that examined this congruency effect when participants processed past- and future-related sentences. Response time was shorter when past-related sentences required a left-hand response and future-related sentences a right-hand response than when this mapping of time onto response hand was reversed (Experiment 1). This result suggests that participants can form time–space associations during the processing of sentences and is thus consistent with the view that time is mentally represented from left to right. The activation of these time–space associations, however, appears to be non-automatic, as shown by the results of Experiments 2 and 3, in which participants were asked to perform a non-temporal meaning discrimination task.
