Similar Articles
 Found 20 similar articles (search time: 19 ms)
1.
A significant challenge in developing spatial representations for the control of action is one of multisensory integration. Specifically, we require an ability to efficiently integrate sensory information arriving from multiple modalities pertaining to the relationships between the acting limbs and the nearby external world (i.e. peripersonal space), across changes in body posture and limb position. Evidence concerning the early development of such spatial representations points towards the independent emergence of two distinct mechanisms of multisensory integration. The earlier-developing mechanism achieves spatial correspondence by representing body parts in their typical or default locations, and the later-developing mechanism does so by dynamically remapping the representation of the position of the limbs with respect to external space in response to changes in postural information arriving from proprioception and vision.

2.
The present study investigated how multisensory integration in peripersonal space is modulated by limb posture (i.e. whether the limbs are crossed or uncrossed) and limb congruency (i.e. whether the observed body part matches the actual position of one’s limb). This was done separately for the upper limbs (Experiment 1) and the lower limbs (Experiment 2). The crossmodal congruency task was used to measure peripersonal space integration for the hands and the feet. It was found that the peripersonal space representation for the hands but not for the feet is dynamically updated based on both limb posture and limb congruency. Together these findings show how dynamic cues from vision, proprioception, and touch are integrated in peripersonal limb space and highlight fundamental differences in the way in which peripersonal space is represented for the upper and lower extremity.
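The dependent measure in crossmodal congruency tasks like the one above is conventionally the crossmodal congruency effect (CCE): mean reaction time on incongruent trials minus mean reaction time on congruent trials, with larger values taken to index stronger visuotactile integration near the stimulated limb. A minimal sketch of that computation, with invented trial data for illustration only:

```python
# Hypothetical illustration: computing a crossmodal congruency effect (CCE)
# from per-trial reaction times. CCE = mean RT(incongruent) - mean RT(congruent).
# The trial values below are invented, not data from the study.

def crossmodal_congruency_effect(trials):
    """trials: list of (condition, rt_ms) pairs, condition is 'congruent' or 'incongruent'."""
    congruent = [rt for cond, rt in trials if cond == "congruent"]
    incongruent = [rt for cond, rt in trials if cond == "incongruent"]
    return sum(incongruent) / len(incongruent) - sum(congruent) / len(congruent)

example_trials = [
    ("congruent", 520), ("congruent", 540),
    ("incongruent", 600), ("incongruent", 620),
]
cce = crossmodal_congruency_effect(example_trials)  # 610 - 530 = 80.0 ms
```

Comparing CCEs across posture or congruency conditions (e.g. crossed vs. uncrossed limbs) is then a matter of computing this difference separately per condition.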

3.
de Vignemont  Frédérique 《Synthese》2018,198(17):4027-4044

Philosophy of perception is guilty of focusing on the perception of far space, neglecting the possibility that the perception of the space immediately surrounding the body, which is known as peripersonal space, displays different properties. Peripersonal space is the space in which the world is literally at hand for interaction. It is also the space in which the world can become threatening and dangerous, requiring protective behaviours. Recent research in cognitive neuroscience has yielded a vast array of discoveries on the multisensory and sensorimotor specificities of the processing of peripersonal space. Yet very little has been done on their philosophical implications. Here I will raise the following question: in what manner does the visual experience of a big rock close to my foot differ from the visual experience of the moon in the sky?


4.
How do people learn multisensory, or amodal, representations, and what consequences do these representations have for perceptual performance? We address this question by performing a rational analysis of the problem of learning multisensory representations. This analysis makes use of a Bayesian nonparametric model that acquires latent multisensory features that optimally explain the unisensory features arising in individual sensory modalities. The model qualitatively accounts for several important aspects of multisensory perception: (a) it integrates information from multiple sensory sources in such a way that it leads to superior performances in, for example, categorization tasks; (b) its performances suggest that multisensory training leads to better learning than unisensory training, even when testing is conducted in unisensory conditions; (c) its multisensory representations are modality invariant; and (d) it predicts “missing” sensory representations in modalities when the input to those modalities is absent. Our rational analysis indicates that all of these aspects emerge as part of the optimal solution to the problem of learning to represent complex multisensory environments.
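The abstract does not specify the paper's nonparametric model, but claim (a), that integrating modalities yields superior estimates, is usually illustrated with the standard reliability-weighted Bayesian cue-combination rule: each cue is treated as a Gaussian estimate, cues are weighted by inverse variance, and the fused estimate is at least as reliable as the best single cue. A minimal sketch of that simpler, textbook rule (not the paper's model):

```python
# Sketch of standard reliability-weighted Bayesian cue combination,
# NOT the Bayesian nonparametric model described in the abstract.
# Each cue is a (mean, variance) Gaussian estimate of the same quantity.

def fuse_cues(cues):
    """cues: list of (mean, variance) pairs; returns the fused (mean, variance)."""
    precisions = [1.0 / var for _, var in cues]
    total_precision = sum(precisions)
    fused_mean = sum(m / var for m, var in cues) / total_precision
    return fused_mean, 1.0 / total_precision

visual = (10.0, 1.0)   # e.g. a visual position estimate (assumed values)
haptic = (12.0, 4.0)   # a less reliable haptic estimate
mean, var = fuse_cues([visual, haptic])
# the fused mean (10.4) lies nearer the more reliable visual cue,
# and the fused variance (0.8) is below either single-cue variance
```

The reduction in variance after fusion is what produces the "superior performance" of multisensory over unisensory estimates in this family of models.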

5.
According to embodied cognition, bodily interactions with our environment shape the perception and representation of our body and the surrounding space, that is, peripersonal space. To investigate the adaptive nature of these spatial representations, we introduced a multisensory conflict between vision and proprioception in an immersive virtual reality. During individual bimanual interaction trials, we gradually shifted the visual hand representation. As a result, participants unknowingly shifted their actual hands to compensate for the visual shift. We then measured the adaptation to the invoked multisensory conflict by means of a self-localization and an external localization task. While effects of the conflict were observed in both tasks, the effects systematically interacted with the type of localization task and the available visual information while performing the localization task (i.e., the visibility of the virtual hands). The results imply that the localization of one’s own hands is based on a multisensory integration process, which is modulated by the saliency of the currently most relevant sensory modality and the involved frame of reference. Moreover, the results suggest that our brain strives for consistency between its body and spatial estimates, thereby adapting multiple, related frames of reference, and the spatial estimates within, due to a sensory conflict in one of them.

6.
Berti  Anna 《Cognitive processing》2021,22(1):121-126

Years ago, it was demonstrated (e.g., Rizzolatti et al. in Handbook of neuropsychology, Elsevier Science, Amsterdam, 2000) that the brain does not encode the space around us in a homogeneous way, but through neural circuits that map space relative to the distance that objects of interest have from the body. In monkeys, relatively discrete neural systems, characterized by neurons with specific neurophysiological responses, seem to be dedicated either to representing the space that can be reached by the hand (near/peripersonal space) or to representing distant space (far/extrapersonal space). It was also shown that the encoding of spaces has dynamic aspects, because these spaces can be remapped by the use of tools that trigger different actions (e.g., Iriki et al. 1998). In this latter case, the effect of the tool depends on the modulation of personal space, that is, the space of our body. In this paper, I review and discuss selected research which demonstrated that in humans as well: (1) spaces are encoded in a dynamic way; (2) encoding can be modulated by the use of tools that the system comes to consider as parts of one's own body; (3) body representations are not fixed, but are fragile and subject to change, to the point that we can incorporate not only the tools necessary for action but even limbs belonging to other people. What the embodiment of tools and of alien limbs tells us about body representations is then briefly discussed.


7.
Neurophysiological data indicate that the reachable peripersonal space and the unreachable extrapersonal space are represented in segregated parietofrontal circuits and that when the unreachable space becomes reachable because of tool use, it is automatically coded by the network selective for peripersonal space. Here we directly tested the role of action's consequences in space coding. Thirty-eight participants bisected lines at either a reachable distance (60 cm) or unreachable distance (120 cm) using either a laser pointer or laser cutter. The laser cutter but not the laser pointer had an action consequence; the line broke into two pieces. The results showed that distance moderated the effect of action. At an unreachable distance, the mean bisection point was closer to the centre when participants used the laser cutter compared to when they used the laser pointer. There were no differences at a reachable distance (60 cm). This result suggests that the space in which the individual may determine a physical consequence is categorized as peripersonal space, independently from its actual distance from the individual's body.
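The measure behind findings like this (and the pseudoneglect results in item 15 below) is the signed deviation of the marked bisection point from the true line midpoint, with negative values conventionally denoting a leftward bias. A minimal sketch of that computation; all numbers are invented for illustration:

```python
# Hypothetical illustration of the standard line-bisection measure:
# signed deviation of the marked point from the veridical midpoint.
# Negative = leftward bias (pseudoneglect); values near zero = accurate bisection.
# The example values are invented, not data from the study.

def bisection_error(line_length_mm, marked_point_mm):
    """Signed deviation (mm) from the true midpoint; negative means leftward."""
    return marked_point_mm - line_length_mm / 2.0

# a 200 mm line bisected 3 mm to the left of centre
err = bisection_error(200.0, 97.0)  # -3.0
```

Group comparisons (laser cutter vs. laser pointer, reachable vs. unreachable distance) then reduce to comparing mean signed errors across conditions.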

8.
The aim of this study was to explore whether the attentional system, as far as endogenous orienting is concerned, allocates resources along the sagittal plane, and whether such a process is affected by, and is likely to be based on, different functional representations of 3D space in the brain. Several models make a main action-based distinction between representations of peripersonal space and those of extrapersonal space. Accordingly, if attention has to move from one representation to another, it should be possible to observe a decrease in performance during such a transition. To test this hypothesis, three experiments were run in which participants performed a cued detection task. Cue stimuli were informative and were centrally located around the fixation point. Target stimuli were displayed at four different depth planes. In the first experiment, assuming that the border between the peripersonal space and the extrapersonal space was at 1 m from the observer, half the target stimuli were located in the peripersonal space and half in the extrapersonal space. The fixation point was located at 1 m from the observer. In the second experiment, the fixation point was moved to 2 m from the observer in order to rule out the possible effects of ocular motor programming. In the third experiment, in order to rule out effects related to the spatial layout of target stimuli (i.e., centre of mass effect), two target stimuli were located in the peripersonal space and six in the extrapersonal space. In all the experiments, besides a validity effect, we observed greater reaction times when the attention shift was across spatial representations than when it was within the same representation. The implications for action-oriented models of attention are discussed.

9.
Previous studies have suggested that the perception of peripersonal space can hardly be achieved using only scene-based visual cues and requires combining visual information with motor representations. Motor representation can be viewed as a component of a predictive system, which includes a neural process that simulates through motor imagery the dynamic behaviour of the body in relation to the environment. In this study, we analysed whether modifying the force required to reach a visual target influences the perception of what is reachable. In a visuomotor task, the experimental group (n = 10) adapted to a 1.5 kg weight attached to the right wrist while performing a series of pointing movements. The control group (n = 10) performed the motor task without inertial perturbation. A perceptual judgement task of what is reachable was performed before and after the motor task. Results showed that the inertial perturbation initially produced an undershoot of the target, suggesting a lack of motor force to overcome the perturbation, but spatial errors receded progressively through movement rehearsal. Perceptual estimates of what is reachable slightly overestimated action capacities but were not affected by motor adaptation. Thus, modifying the motor force required to compensate for an inertial perturbation had no direct effect on the perception of peripersonal space. When interpreted in regard to previous experimental work, this result suggests that motor representations may provide information about the sensory or spatial consequences of action rather than the sense of effort associated with motor production.

10.
Research on visuospatial memory has shown that egocentric (subject-to-object) and allocentric (object-to-object) reference frames are connected to categorical (non-metric) and coordinate (metric) spatial relations, and that motor resources are recruited especially when processing spatial information in peripersonal (within arm-reaching) rather than extrapersonal (beyond arm-reaching) space. In order to perform our daily-life activities, these spatial components cooperate along a continuum from recognition-related (e.g., recognizing stimuli) to action-related (e.g., reaching stimuli) purposes. Therefore, it is possible that some types of spatial representations rely more on action/motor processes than others. Here, we explored the role of motor resources in the combinations of these visuospatial memory components. A motor interference paradigm was adopted in which participants had their arms bent behind their back or free during a spatial memory task. This task consisted of memorizing triads of objects and then verbally judging which object was: (1) closest to/farthest from the participant (egocentric coordinate); (2) to the right/left of the participant (egocentric categorical); (3) closest to/farthest from a target object (allocentric coordinate); and (4) on the right/left of a target object (allocentric categorical). The triads appeared in participants' peripersonal (Experiment 1) or extrapersonal (Experiment 2) space. The results of Experiment 1 showed that motor interference selectively damaged egocentric-coordinate judgements but not the other spatial combinations. The results of Experiment 2 showed that the interference effect disappeared when the objects were in the extrapersonal space. A third follow-up study using a within-subject design confirmed the overall pattern of results. Our findings provide evidence that motor resources play an important role in the combination of coordinate spatial relations and egocentric representations in peripersonal space.

11.
The aim of this study was to explore the role of motor resources in peripersonal space encoding: are they intrinsic to spatial processes or due to the action potentiality of objects? To answer this question, we disentangled the effects of motor resources on object manipulability and spatial processing in peripersonal and extrapersonal spaces. Participants had to localize manipulable and non-manipulable 3-D stimuli presented within peripersonal or extrapersonal spaces of an immersive virtual reality scenario. To assess the contribution of motor resources to the spatial task, a motor interference paradigm was used. In Experiment 1, localization judgments were provided with the left hand while the right dominant arm could be free or blocked. Results showed that participants were faster and more accurate in localizing both manipulable and non-manipulable stimuli in peripersonal space with their arms free. On the other hand, in extrapersonal space there was no significant effect of motor interference. Experiment 2 replicated these results by using both hands alternately to give the response and controlling for the possible effect of the orientation of object handles. Overall, the pattern of results suggests that the encoding of peripersonal space involves motor processes per se, and not because of the presence of manipulable stimuli. It is argued that this motor grounding reflects the adaptive need of anticipating what may happen near the body and preparing to react in time.

12.
《Acta psychologica》2013,142(2):177-183
A key tool for investigating body ownership is the rubber hand illusion, in which synchronous multisensory feedback can induce feelings of ownership over a fake hand. Much research in the field aims to tease apart the mechanisms that underlie this phenomenon. Currently there is conflicting evidence as to whether increasing the distance between the real and fake hands (within reaching space) can reduce the illusion. The current study examines this further by modulating not only the absolute distance between the real and fake hands but also their relative distance from the body midline. It is found that the strength of the illusion is reduced only when the fake hand is both far from the real hand and far from the trunk; illusion scores for a fake hand in the same position can then be increased by moving the real hand nearer. This is related to the peripersonal space surrounding the trunk and the hand. Measures of subjective disownership of the real hand and of proprioceptive drift were also taken; these may be driven by different mechanisms.
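The proprioceptive drift mentioned above is typically quantified as the shift in the judged position of the real hand toward the fake hand, i.e. the mean post-induction localization judgement minus the mean pre-induction judgement. A minimal sketch of that computation; the numbers are invented for illustration:

```python
# Hypothetical sketch of the proprioceptive-drift measure from rubber hand
# illusion studies: post-induction minus pre-induction judged hand position.
# Positive drift (toward the fake hand) is taken as an implicit illusion index.
# All values below are invented, not data from the study.

def proprioceptive_drift(pre_judgements_cm, post_judgements_cm):
    """Mean judged hand position after induction minus before (cm)."""
    pre = sum(pre_judgements_cm) / len(pre_judgements_cm)
    post = sum(post_judgements_cm) / len(post_judgements_cm)
    return post - pre

drift = proprioceptive_drift([0.2, -0.1, 0.5], [1.8, 2.2, 2.0])  # about 1.8 cm
```

Comparing drift against subjective questionnaire scores across hand positions is one way such studies test whether the two measures dissociate.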

13.
In order to code visual peripersonal space, human and non-human primates need an integrated system that controls both visual and tactile inputs within peripersonal space around the face and the hand, based on visual experience of body parts. The existence of such a system in humans has been demonstrated, and there is evidence showing that visual peripersonal space relating to the hand has important dynamic properties, for example, it can be expanded and contracted depending on tool use. There is also evidence for a high degree of functional similarity between the characteristics of the visual peripersonal space in humans and in monkeys.

14.
Spinal cord injury can cause cognitive impairments even when no cerebral lesion is appreciable. As patients are forced to explore the environment in a non-canonical position (i.e., seated in a wheelchair), a modified relation with space can explain motor-related cognitive differences compared to non-injured individuals. Peripersonal space is encoded in motor terms, that is, in relation to the representation of action abilities, and is strictly related to the affordance of reachability. In turn, affordances, the action possibilities suggested by relevant properties of the environment, are related to the perceiver's peripersonal space and motor abilities. One might suppose that these motor-related cognitive abilities are compromised when an individual loses the ability to move. We shed light on this issue in 10 patients with paraplegia and 20 matched controls. All were administered an affordance-related reachability judgement task adapted from Costantini, Ambrosini, Tieri, Sinigaglia, and Committeri (2010, Experimental Brain Research, 207, 95) and neuropsychological tests. Our findings demonstrate that patients and controls show the same level of accuracy in estimating the location of their peripersonal space boundaries, but only controls show the typical overestimation of reaching range. Secondly, patients show higher variability in their judgements than controls. Importantly, this finding is related to the patients’ ability to perform everyday tasks. Finally, patients are not faster in making their judgements on reachability in peripersonal space, while controls are. Our results suggest that not moving freely, or as usual, in the environment impacts the decoding of action-related properties even when the upper limbs are not compromised.

15.
Experiment 1 investigated whether tool use can expand the peripersonal space into the very far extrapersonal space. Healthy participants performed line bisection in peripersonal and extrapersonal space using wooden sticks up to a maximum of 240 cm. Participants misbisected to the left of the true midpoint, both for lines presented in peripersonal and for those presented in extrapersonal space, confirming a peripersonal space expansion up to a distance of 240 cm. Experiment 2 investigated whether arm position could influence the perception of peripersonal and extrapersonal space during tool use. Participants performed line bisection in the peripersonal and in the extrapersonal space (up to a maximum of 120 cm) using wooden sticks in two different conditions: either with the arm bent or with the arm stretched. Results showed stronger pseudoneglect in the stretched arm condition.

16.
Perception of an object's affordances is enhanced not only when the object is located in one’s own peripersonal space, as compared to when it is located within extrapersonal space, but also when the object is located in another person’s peripersonal space [as measured by a spatial alignment effect (SAE)]. It has been suggested that this reflects the existence of an interpersonal body representation (IBR) that allows us to represent the perceptual states and action possibilities of others. Here, we address the question of whether IBR can be modulated by higher level/reflective social cognition, such as judgments about one’s own social status. Participants responded with either the right or the left hand as soon as a go signal appeared. The go signal screen contained a task-irrelevant stimulus consisting of a 3D scene in which a mug with a left- or right-facing handle was positioned on a table. The mug was positioned either inside or outside the reaching space of the participants. In a third of the trials, the mug was positioned within the reaching space of an avatar seated at the table. Prior to this task we induced an experience of social ostracism in half of the participants by means of a standardized social exclusion condition. The results were that the SAE that normally occurs when the mug is in the avatar’s reaching space is extinguished by the induced social exclusion. This indicates that judgments about one’s own social status modulate the effect of IBR.

17.
It has been proposed that one means of understanding a person's current behaviour and predicting future actions is by simulating their actions. That is, when another person's actions are observed, similar motor processes are activated in the observer. For example, after observing a reach over an obstacle, a person's subsequent reach trajectory is more curved, reflecting motor priming. Importantly, such motor states are only activated if the observed action is in near (peripersonal) space. However, we demonstrate that when individuals share action environments, simulation of another person's obstacle avoiding reach path takes place even when the action is in far (extrapersonal) space. We propose that action simulation is influenced by factors such as ownership. When an "owned" object is a potential future obstacle, even when it is viewed beyond current action space, simulations are evoked, and these leave a more stable memory capable of influencing future behaviour.

18.
It has been proposed that one means of understanding a person's current behaviour and predicting future actions is by simulating their actions. That is, when another person's actions are observed, similar motor processes are activated in the observer. For example, after observing a reach over an obstacle, a person's subsequent reach trajectory is more curved, reflecting motor priming. Importantly, such motor states are only activated if the observed action is in near (peripersonal) space. However, we demonstrate that when individuals share action environments, simulation of another person's obstacle avoiding reach path takes place even when the action is in far (extrapersonal) space. We propose that action simulation is influenced by factors such as ownership. When an “owned” object is a potential future obstacle, even when it is viewed beyond current action space, simulations are evoked, and these leave a more stable memory capable of influencing future behaviour.

19.
Embodiment, spatial categorisation and action   Cited by: 1 (self-citations: 0, other citations: 1)
Despite the subjective experience of a continuous and coherent external world, we will argue that the perception and categorisation of visual space is constrained not only by the spatial resolution of the sensory systems but also, and above all, by pre-reflective representations of the body in action. Recent empirical data in cognitive neuroscience will be presented that suggest that multidimensional categorisation of perceptual space depends on body representations at both an experiential and a functional level. Results will also be summarized that show that representations of the body in action are pre-reflective in nature, as only some aspects of the pre-reflective states can be consciously experienced. Finally, a neuro-cognitive model based on the integration of afferent and efferent information will be described, which suggests that action simulation and associated predicted sensory consequences may represent the underlying principle that enables pre-reflective representations of the body for space categorisation and selection for action.

20.
Audiotactile multisensory interactions in human information processing   Cited by: 1 (self-citations: 0, other citations: 1)
The last few years have seen a very rapid growth of interest in how signals from different sensory modalities are integrated in the brain to form the unified percepts that fill our daily lives. Research on multisensory interactions between vision, touch, and proprioception has revealed the existence of multisensory spatial representations that code the location of external events relative to our own bodies. In this review, we highlight recent converging evidence from both human and animal studies that has revealed that spatially modulated multisensory interactions also occur between hearing and touch, especially in the space immediately surrounding the head. These spatial audiotactile interactions for stimuli presented close to the head can affect not only the spatial aspects of perception, but also various other non-spatial aspects of audiotactile information processing. Finally, we highlight some of the most important questions for future research in this area.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司)  京ICP备09084417号