Similar Articles
1.
Research on visuospatial memory has shown that egocentric (subject-to-object) and allocentric (object-to-object) reference frames are connected to categorical (non-metric) and coordinate (metric) spatial relations, and that motor resources are recruited more when processing spatial information in peripersonal (within arm's reach) than in extrapersonal (beyond arm's reach) space. In order to perform our daily-life activities, these spatial components cooperate along a continuum from recognition-related (e.g., recognizing stimuli) to action-related (e.g., reaching for stimuli) purposes. Therefore, it is possible that some types of spatial representations rely more on action/motor processes than others. Here, we explored the role of motor resources in the combinations of these visuospatial memory components. A motor interference paradigm was adopted in which participants had their arms bent behind their back or free during a spatial memory task. The task consisted of memorizing triads of objects and then verbally judging which object was: (1) closest to/farthest from the participant (egocentric coordinate); (2) to the right/left of the participant (egocentric categorical); (3) closest to/farthest from a target object (allocentric coordinate); or (4) to the right/left of a target object (allocentric categorical). The triads appeared in participants' peripersonal (Experiment 1) or extrapersonal (Experiment 2) space. The results of Experiment 1 showed that motor interference selectively impaired egocentric-coordinate judgements but not the other spatial combinations. The results of Experiment 2 showed that the interference effect disappeared when the objects were in extrapersonal space. A third, follow-up study using a within-subject design confirmed the overall pattern of results. Our findings provide evidence that motor resources play an important role in the combination of coordinate spatial relations and egocentric representations in peripersonal space.

2.
In mirror reflections, visual stimuli in near peripersonal space (e.g., an object in the hand) can project the retinal image of far, extrapersonal stimuli "beyond" the mirror. We studied the interaction of such visual reflections with tactile stimuli in a cross-modal congruency task. We found that visual distractors produce stronger interference on tactile judgments when placed close to the stimulated hand, but observed indirectly as distant mirror reflections, than when directly observed in equivalently distant far space, even when in contact with a dummy hand or someone else's hand in the far location. The stronger visual-tactile interference for the mirror condition implies that near stimuli seen as distant reflections in a mirror view of one's own hands can activate neural networks coding peripersonal space, because these visual stimuli are coded as having a true source near to the body.

3.
Embodied theories of object representation propose that the same neural networks are involved in encoding and retrieving object knowledge. In the present study, we investigated whether motor programs play a causal role in the retrieval of object names. Participants performed an object-naming task while squeezing a sponge with either their right or left hand. The objects were artifacts (e.g. hammer) or animals (e.g. giraffe) and were presented in an orientation that favored a grasp or not. We hypothesized that, if activation of motor programs is necessary to retrieve object knowledge, then concurrent motor activity would interfere with naming manipulable artifacts but not non-manipulable animals. In Experiment 1, we observed naming interference for all objects oriented towards the occupied hand. In Experiment 2, we presented the objects in more ‘canonical orientations’. Participants named all objects more quickly when they were oriented towards the occupied hand. Together, these interference/facilitation effects suggest that concurrent motor activity affects naming for both categories. These results also suggest that picture-plane orientation interacts with an attentional bias that is elicited by the objects and their relationship to the occupied hand. These results may be more parsimoniously accounted for by a domain-general attentional effect, constraining the embodied theory of object representations. We suggest that researchers should scrutinize attentional accounts of other embodied cognitive effects.

4.
Experiment 1 investigated whether tool use can expand the peripersonal space into the very far extrapersonal space. Healthy participants performed line bisection in peripersonal and extrapersonal space using wooden sticks up to a maximum of 240 cm. Participants misbisected to the left of the true midpoint, both for lines presented in peripersonal and for those presented in extrapersonal space, confirming a peripersonal space expansion up to a distance of 240 cm. Experiment 2 investigated whether arm position could influence the perception of peripersonal and extrapersonal space during tool use. Participants performed line bisection in the peripersonal and in the extrapersonal space (up to a maximum of 120 cm) using wooden sticks in two different conditions: either with the arm bent or with the arm stretched. Results showed stronger pseudoneglect in the stretched arm condition.

5.
The aim of this study was to explore whether the attentional system, as far as endogenous orienting is concerned, allocates resources along the sagittal plane and whether such a process is affected by, and is likely to be based on, different functional representations of 3D space in the brain. Several models make a main action-based distinction between representations of peripersonal space and those of extrapersonal space. Accordingly, if attention has to move from one representation to another, it should be possible to observe a decrease in performance during such a transition. To test this hypothesis, three experiments were run in which participants performed a cued detection task. Cue stimuli were informative and were centrally located around the fixation point. Target stimuli were displayed at four different depth planes. In the first experiment, assuming that the border between peripersonal and extrapersonal space lay at 1 m from the observer, half the target stimuli were located in peripersonal space and half in extrapersonal space. The fixation point was located at 1 m from the observer. In the second experiment, the fixation point was moved to 2 m from the observer in order to rule out possible effects of ocular motor programming. In the third experiment, in order to rule out effects related to the spatial layout of target stimuli (i.e., a centre-of-mass effect), two target stimuli were located in peripersonal space and six in extrapersonal space. In all the experiments, besides a validity effect, we observed greater reaction times when the attention shift was across spatial representations than when it was within the same representation. The implications for action-oriented models of attention are discussed.

6.
Studies of object similarity have focused on the relationship between different physical objects and their mental representations or between instances of the same physical object and its mental representation. The present study is the first to investigate the structure of within-category psychological space. We provided evidence that large objects and frequently mentioned objects are perceived as less similar to each other compared to small objects or less frequently mentioned objects. Further, similarity judgments were higher for manipulable objects compared to non-manipulable objects. The relevance of these data to the isomorphism between physical and psychological spaces is also discussed.

7.
Theories of motor control suggest that the parameters specified during the planning of goal-directed hand movements to a visual target are spatial ones, such as direction and amplitude. Recent findings in the visual attention literature, however, widely argue for early object-based selection processes. The present experiments were designed to examine the contributions of object-based and space-based selection processes to the preparation time of goal-directed pointing movements. To this end, a cue was presented at a specific location. The question addressed was whether the initiation of responses to uncued target stimuli could benefit from the targets being either within the same object (object based) or presented in the same direction (space based). Experiment 1 replicated earlier findings of object-based benefits for non-goal-directed responses. Experiment 2 confirmed earlier findings of space-based benefits for goal-directed hand pointing movements. In Experiments 3 and 4, space-based and object-based manipulations were combined while requiring goal-directed hand pointing movements. The results clearly favour the notion that the selection processes for goal-directed pointing movements are primarily object based. Implications for theories of selective attention and action planning are discussed.

8.
Perception of an object's affordances is enhanced not only when the object is located in one's own peripersonal space, as compared to when it is located in extrapersonal space, but also when the object is located in another person's peripersonal space [as measured by a spatial alignment effect (SAE)]. It has been suggested that this reflects the existence of an interpersonal body representation (IBR) that allows us to represent the perceptual states and action possibilities of others. Here, we address the question of whether IBR can be modulated by higher-level/reflective social cognition, such as judgments about one's own social status. Participants responded with either the right or the left hand as soon as a go signal appeared. The go-signal screen contained a task-irrelevant stimulus consisting of a 3D scene in which a mug with a left- or right-facing handle was positioned on a table. The mug was positioned either inside or outside the reaching space of the participants. In a third of the trials, the mug was positioned within the reaching space of an avatar seated at the table. Prior to this task, we induced an experience of social ostracism in half of the participants by means of a standardized social exclusion condition. The result was that the SAE that normally occurs when the mug is in the avatar's reaching space was extinguished by the induced social exclusion. This indicates that judgments about one's own social status modulate the effect of IBR.

9.
It has been proposed that one means of understanding a person's current behaviour and predicting future actions is by simulating their actions. That is, when another person's actions are observed, similar motor processes are activated in the observer. For example, after observing a reach over an obstacle, a person's subsequent reach trajectory is more curved, reflecting motor priming. Importantly, such motor states are only activated if the observed action is in near (peripersonal) space. However, we demonstrate that when individuals share action environments, simulation of another person's obstacle-avoiding reach path takes place even when the action is in far (extrapersonal) space. We propose that action simulation is influenced by factors such as ownership. When an "owned" object is a potential future obstacle, even when it is viewed beyond current action space, simulations are evoked, and these leave a more stable memory capable of influencing future behaviour.

11.
Berti, Anna. Cognitive Processing, 2021, 22(1): 121–126

Years ago, it was demonstrated (e.g., Rizzolatti et al., in Handbook of Neuropsychology, Elsevier Science, Amsterdam, 2000) that the brain does not encode the space around us in a homogeneous way, but through neural circuits that map space relative to the distance of objects of interest from the body. In monkeys, relatively discrete neural systems, characterized by neurons with specific neurophysiological responses, seem to be dedicated either to representing the space that can be reached by the hand (near/peripersonal space) or to distant space (far/extrapersonal space). It was also shown that the encoding of spaces has dynamic aspects, because they can be remapped by the use of tools that trigger different actions (e.g., Iriki et al. 1998). In this latter case, the effect of the tool depends on the modulation of personal space, that is, the space of our body. In this paper, I review and discuss selected research which demonstrated that, also in humans: (1) spaces are encoded in a dynamic way; (2) this encoding can be modulated by the use of tools that the system comes to consider as parts of one's own body; and (3) body representations are not fixed, but are fragile and subject to change, to the point that we can incorporate not only the tools necessary for action but even limbs belonging to other people. What the embodiment of tools and of alien limbs tells us about body representations is then briefly discussed.

12.
Previous research investigating the influence of object manipulability (the properties of objects that make them appropriate for manual action) on object identification has not tightly controlled for the effects of both object familiarity and age of acquisition. The current research carefully controlled these two variables in a balanced set of 120 photographs and showed significant effects of object manipulability during object categorization (Experiment 1) and object naming (Experiment 2). Critically, the direction of the manipulability effect reversed across tasks, with faster categorization of non-manipulable objects but faster naming of manipulable objects, suggesting that the task moderates the direction of the effect. Exposure duration (the amount of time the object was visible to participants) was also investigated, but no interactions between exposure duration and manipulability were found. These results indicate not only that manipulability can influence object identification, but that the way in which it does so depends on the task.

13.
Primate data suggest that near (peripersonal) and far (extrapersonal) space are coded within distinct representations. Support for this claim has been gained from human studies of line bisection, many of which have focused on neuropsychological, rather than normative, samples. One important aspect of these bisection studies has been to control for the changes in angular extent of stimuli that normally accompany changes in viewing distance. The control of angular information, however, requires alterations in the linear dimensions (actual stimulus size) of stimuli. We report two experiments in which normal subjects made manual bisection judgements on stimuli positioned in near or far space, and which were oriented in either the left-right (Experiment 1) or radial plane (Experiment 2). Both experiments were designed to enable the separable effects of linear and angular extent to be disentangled. Viewing distance effects were obtained when angular information was controlled, but many of these were dependent on changes in linear extent, and were only apparent at the individual subject level. Our data confirm that genuine near/far effects may be observed in normative bisection, but that many previous studies which appeared to support a near/far distinction in both normal and brain-damaged bisection behaviour may reflect a failure to control for changes in stimulus size.

14.
Two experiments investigated infants' ability to localize tactile sensations in peripersonal space. Infants aged 10 months (Experiment 1) and 6.5 months (Experiment 2) were presented with vibrotactile stimuli unpredictably to either hand while they adopted either a crossed- or uncrossed-hands posture. At 6.5 months, infants' responses were predominantly manual, whereas at 10 months, visual orienting behavior was more evident. Analyses of the direction of the responses indicated that (a) both age groups were able to locate tactile stimuli, (b) the ability to remap visual and manual responses to tactile stimuli across postural changes develops between 6.5 and 10 months of age, and (c) the 6.5-month-olds were biased to respond manually in the direction appropriate to the more familiar uncrossed-hands posture across both postures. The authors argue that there is an early visual influence on tactile spatial perception and suggest that the ability to remap visual and manual directional responses across changes in posture develops between 6.5 and 10 months, most likely because of the experience of crossing the midline gained during this period.

15.
Theories of embodied object representation predict a tight association between sensorimotor processes and visual processing of manipulable objects. Previous research has shown that object handles can ‘potentiate’ a manual response (i.e., button press) to a congruent location. This potentiation effect is taken as evidence that objects automatically evoke sensorimotor simulations in response to the visual presentation of manipulable objects. In the present series of experiments, we investigated a critical prediction of the theory of embodied object representations: that potentiation effects should be observed with manipulable artifacts but not non-manipulable animals. In four experiments we show that (a) potentiation effects are observed with both animals and artifacts; (b) potentiation effects depend on the absolute size of the objects; and (c) task context influences the presence/absence of potentiation effects. We conclude that potentiation effects do not provide evidence for embodied object representations, but are suggestive of a more general stimulus–response compatibility effect that may depend on the distribution of attention to different object features.

16.
The present study investigates the influence of depth on pseudoneglect in healthy young participants (n = 18) within three-dimensional virtual space, by presenting a variation of the greyscales task and a landmark task, which were specifically matched for stimulus–response compatibility as well as for perceptual factors within and across the tasks. Tasks were presented at different depth locations (peripersonal, extrapersonal) and in different orientations (horizontal, vertical) within three-dimensional virtual space, using virtual reality techniques. A horizontal leftward bias (pseudoneglect) was found for both tasks, which was stronger in peripersonal than in extrapersonal space. For the vertical condition, an upward bias was observed in the greyscales task, but not in the landmark task. These results support the hypothesis of right-hemispheric dominance for visual spatial attention, and our study is the first to examine horizontal and vertical orienting biases with the greyscales task in peri- and extrapersonal space. Furthermore, the differences in attentional asymmetries with respect to depth suggest dissociable neural mechanisms for visual attentional processing in near and far space, and the lack of significant correlations implies independence of horizontal and vertical stimulus processing.

17.
Studies on affordances typically focus on single objects. We investigated whether affordances are modulated by the context, defined by the relation between two objects and a hand. Participants were presented with pictures displaying two manipulable objects linked by a functional relation (knife–butter), a spatial relation (knife–coffee mug), or no relation. They responded by pressing a key to indicate whether the objects were related or not. To determine whether observing others' actions and understanding their goals would facilitate judgments, a hand was (a) displayed near the objects, (b) shown grasping an object to use it, or (c) shown grasping an object to manipulate/move it; in a fourth condition, (d) no hand was displayed. RTs were faster when objects were functionally rather than spatially related. Manipulation postures were the slowest in the functional context, and functional postures were inhibited in the spatial context, probably due to a mismatch between the inferred goal and the context. The absence of this interaction with foot responses instead of hand responses in Experiment 2 suggests that the effects are due to motor simulation rather than to associations between context and hand postures.

18.
We used repetition blindness to investigate the nature of the representations underlying identification of manipulable objects. Observers named objects presented in rapid serial visual presentation streams containing either manipulable or nonmanipulable objects. In half the streams, one object was repeated. Overall accuracy was lower when streams contained two different manipulable objects than when they contained only nonmanipulable objects or a single manipulable object. In addition, nonmanipulable objects induced repetition blindness, whereas manipulable objects were associated with a repetition advantage. These findings suggest that motor information plays a direct role in object identification. Manipulable objects are vulnerable to interference from other objects associated with conflicting motor programs, but they show better individuation of repeated objects associated with the same action.

19.
In contrast to the leftward inattention caused by right parietal damage, normal brain function shows a subtle neglect of the right and left sides in peripersonal and extrapersonal space, respectively. This study explored how these attentional biases cause healthy individuals to collide with objects on the right. In Experiment 1, participants navigated manual and electric wheelchairs through a narrow doorway. More rightward collisions were observed for the electric, but not the manual, wheelchair. Experiment 2 demonstrated that the rightward deviation for electric wheelchairs increased for wider doorways. Experiment 3 established that the rightward deviation is not the result of task-related vestibular input, using a remote control device to operate the wheelchair. The rightward deviation persisted in Experiment 4 when the doorway was removed, suggesting that the bias is the result of a mis-bisection of space. In Experiment 5, the rightward bias was replicated using an electric scooter, which is steered using handlebars. Finally, Experiment 6 required participants to point to the middle of the doorway, using a laser, before moving the scooter. Rightward mis-bisection was observed in both conditions. Rightward mis-bisection of lines in extrapersonal space provides the most parsimonious explanation of the rightward collisions and deviations.

20.
The nature of hand-action representations evoked during language comprehension was investigated using a variant of the visual–world paradigm in which eye fixations were monitored while subjects viewed a screen displaying four hand postures and listened to sentences describing an actor using or lifting a manipulable object. Displayed postures were related to either a functional (using) or volumetric (lifting) interaction with an object that matched or did not match the object mentioned in the sentence. Subjects were instructed to select the hand posture that matched the action described in the sentence. Even before the manipulable object was mentioned in the sentence, some sentence contexts allowed subjects to infer the object's identity and the type of action performed with it, and eye fixations immediately favored the corresponding hand posture. This effect was assumed to be the result of ongoing motor or perceptual imagery in which the action described in the sentence was mentally simulated. In addition, the hand posture related to the manipulable object mentioned in a sentence, but not related to the described action (e.g., a writing posture in the context of a sentence that describes lifting, but not using, a pencil), was favored over other hand postures not related to the object. This effect was attributed to motor resonance arising from conceptual processing of the manipulable object, without regard to the remainder of the sentence context.
