Similar documents
Found 20 similar documents (search time: 109 ms)
1.
Subjects viewed the Müller-Lyer illusion, making either saccadic or smooth tracking eye movements between the apexes of the arrowheads. The decrement in the magnitude of the illusion was significantly greater for subjects in the saccadic viewing condition. Saccadic and smooth tracking eye movements are separately controlled, and information about eye position is more readily available from the efferent signals issued to control a saccadic eye movement. The experimental findings were consistent with the hypothesis that subjects in the saccadic condition learned a new afferent-efferent association. The results support a theory that visual perception is determined by efferent readiness activated by visual afferent stimulation.

2.
It is not known why people move their eyes when engaged in non-visual cognition. The current study tested the hypothesis that differences in saccadic eye movement rate (EMR) during non-visual cognitive tasks reflect different requirements for searching long-term memory. Participants performed non-visual tasks requiring relatively low or high long-term memory retrieval while eye movements were recorded. In three experiments, EMR was substantially lower for low-retrieval than for high-retrieval tasks, including in an eyes-closed condition in Experiment 3. Neither visual imagery nor between-task difficulty was related to EMR, although there was some evidence for a minor effect of within-task difficulty. Comparison of task-related EMRs to EMR during a no-task waiting period suggests that eye movements may be suppressed or activated depending on task requirements. We discuss a number of possible interpretations of saccadic eye movements during non-visual cognition and propose an evolutionary model that links these eye movements to memory search through an elaboration of circuitry involved in visual perception.

3.
Reaching to targets in space requires the coordination of eye and hand movements. In two experiments, we recorded eye and hand kinematics to examine the role of gaze position at target onset on eye-hand coordination and reaching performance. Experiment 1 showed that with eyes and hand aligned on the same peripheral start location, time lags between eye and hand onsets were small and initiation times were substantially correlated, suggesting simultaneous control and tight eye-hand coupling. With eyes and hand departing from different start locations (gaze aligned with the center of the range of possible target positions), time lags between eye and hand onsets were large and initiation times were largely uncorrelated, suggesting independent control and decoupling of eye and hand movements. Furthermore, initial gaze position strongly mediated manual reaching performance indexed by increments in movement time as a function of target distance. Experiment 2 confirmed the impact of target foveation in modulating the effect of target distance on movement time. Our findings reveal the operation of an overarching, flexible neural control system that tunes the operation and cooperation of saccadic and manual control systems depending on where the eyes look at target onset.

4.
Coordinated control of eye and hand movements in dynamic reaching
In the present study, we integrated two recent, at first sight contradictory, findings regarding the question of whether saccadic eye movements can be generated to a newly presented target during an ongoing hand movement. Saccades were measured during so-called adaptive and sustained pointing conditions. In the adaptive pointing condition, subjects had to direct both their gaze and arm movements to a displaced target location. The results showed that the eyes could fixate the new target during pointing. In addition, a temporal coupling of these corrective saccades was found with changes in arm movement trajectories when reaching to the new target. In the sustained pointing condition, however, the same subjects had to point to the initial target while trying to shift their gaze to a new target that appeared during pointing. It was found that the eyes could not fixate the new target before the hand reached the initial target location. Together, the results indicate that ocular gaze is always forced to follow the target intended by a manual arm movement. A neural mechanism is proposed that couples ocular gaze to the target of an arm movement. Specifically, the mechanism includes a reach neuron layer besides the well-known saccadic layer in the primate superior colliculus. Such a tight, sub-cortical coupling of ocular gaze to the target of a reaching movement can explain the contrasting behavior of the eyes depending on whether the eye and hand share the same target position or attempt to move to different locations.

5.
In a number of studies, we have demonstrated that the spatial-temporal coupling of eye and hand movements is optimal for the pickup of visual information about the position of the hand and the target late in the hand's trajectory. Several experiments designed to examine temporal coupling have shown that the eyes arrive at the target area concurrently with the hand achieving peak acceleration. Between the time the hand reached peak velocity and the end of the movement, increased variability in the position of the shoulder and the elbow was accompanied by a decreased spatial variability in the hand. Presumably, this reduction in variability was due to the use of retinal and extra-retinal information about the relative positions of the eye, hand and target. However, the hand does not appear to be a slave to the eye. For example, we have been able to decouple eye movements and hand movements using Müller-Lyer configurations as targets. Predictable bias, found in primary and corrective saccadic eye movements, was not found for hand movements, if on-line visual information about the target was available during aiming. That is, the hand remained accurate even when the eye had a tendency to undershoot or overshoot the target position. However, biases of the hand were evident, at least in the initial portion of an aiming movement, when vision of the target was removed and vision of the hand remained. These findings accent the versatility of human motor control and have implications for current models of visual processing and limb control.

6.
The relationship between saccadic eye movements and covert orienting of visual spatial attention was investigated in two experiments. In the first experiment, subjects were required to make a saccade to a specified location while also detecting a visual target presented just prior to the eye movement. Detection accuracy was highest when the location of the target coincided with the location of the saccade, suggesting that subjects use spatial attention in the programming and/or execution of saccadic eye movements. In the second experiment, subjects were explicitly directed to attend to a particular location and to make a saccade to the same location or to a different one. Superior target detection occurred at the saccade location regardless of attention instructions. This finding shows that subjects cannot move their eyes to one location and attend to a different one. The results of these experiments suggest that visuospatial attention is an important mechanism in generating voluntary saccadic eye movements.

7.
When observers localize the vanishing point of a moving target, localizations are reliably displaced beyond the final position, in the direction the stimulus was travelling just prior to its offset. We examined modulations of this phenomenon through eye movements and action control over the vanishing point. In Experiment 1, with pursuit eye movements, localization errors were in the movement direction, but less pronounced when the vanishing point was self-determined by a key press of the observer. In contrast, in Experiment 2, with a fixation instruction, localization errors were opposite to the movement direction and independent of action control. This pattern of results points to the role of eye movements, which were recorded in Experiment 3. That experiment showed that the eyes lagged behind the target at the point in time when it vanished from the screen, but that the eyes continued to drift along the target's virtual trajectory. It is suggested that the perceived target position resulted from the spatial lag of the eyes and from the persisting retinal image during the drift.

8.
There is considerable evidence that covert visual attention precedes voluntary eye movements to an intended location. What happens to covert attention when an involuntary saccadic eye movement is made? In agreement with other researchers, we found that attention and voluntary eye movements are tightly coupled in such a way that attention always shifts to the intended location before the eyes begin to move. However, we found that when an involuntary eye movement is made, attention first precedes the eyes to the unintended location and then switches to the intended location, with the eyes following this pattern a short time later. These results support the notion that attention and saccade programming are tightly coupled.

9.
Two experiments were performed to evaluate the influence of movement frequency and predictability on visual tracking of the actively and the passively moved hand. Four measures of tracking precision were employed: (a) saccades/cycle, (b) percent of pursuit movement, (c) eye amplitude/arm amplitude, and (d) asynchrony of eye and hand at reversal. Active and passive limb movements were tracked with nearly identical accuracy and were always vastly superior to tracking an external visual target undergoing comparable motion. Proprioceptive information appears to provide velocity and position information about target location. Its presence permits the development of central eye-movement programmes that move the eyes in patterns that approximate, but do not exactly match, temporally or spatially, the motion of the hand.

10.
The question investigated was whether or not eye movements accompanied by abnormal retinal image movements (movements at a different rate than the eye movement, in a different direction, or both) predictably lead to perceived movement. Observers (Os) reported whether or not they saw a visual target move when the movement of the target was either dependent on and simultaneous with their eye movements or independent of their eye movements. In the main experiment, observations were made when the ratio between eye and target movement (em/tm) was 2/5, 1/5, 1/10, 1/20, and 0. All these ratios were tested when the direction of the target movement was the same as (H+), opposite to (H−), or at right angles to (V+, V−) the movement of the eyes. Eye movements, target movements, and reports of target movement were recorded. Results indicate that a discrepancy between eye and target movement greater than 20% predictably leads to perceived target movement, whereas a discrepancy of 5% or less rarely leads to perceived movement. The results are interpreted as support for the operation of a compensatory mechanism during eye movements.
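The thresholds reported in this abstract can be made concrete with a minimal sketch (not from the study itself; the function name and the exact boundary handling are illustrative assumptions). It encodes the reported rule that a discrepancy above 20% predictably produces perceived motion while one of 5% or less rarely does, with ratios in between left unclassified:

```python
# Illustrative only: encodes the perceptual thresholds reported above.
# em_tm is the target's movement expressed as a fraction of the eye's
# movement (the em/tm ratios tested were 2/5, 1/5, 1/10, 1/20, and 0).

def predicted_perception(em_tm: float) -> str:
    """Rough prediction of whether target motion is perceived."""
    discrepancy = abs(em_tm) * 100  # discrepancy as a percentage
    if discrepancy > 20:
        return "predictably perceived"
    if discrepancy <= 5:
        return "rarely perceived"
    return "intermediate"

for ratio in (2 / 5, 1 / 5, 1 / 10, 1 / 20, 0):
    print(f"em/tm = {ratio:.2f} -> {predicted_perception(ratio)}")
```

Under this reading, only the 2/5 ratio falls clearly above the 20% threshold, while 1/20 and 0 fall in the "rarely perceived" range.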

11.
Subjects who make repetitive saccadic eye movements before a memory test subsequently exhibit superior retrieval in comparison with subjects who do not move their eyes. It has been proposed that eye movements enhance retrieval by increasing interaction of the left and right cerebral hemispheres. To test this, we compared the effect of eye movements on subsequent recall (Experiment 1) and recognition (Experiment 2) in two groups thought to differ in baseline degree of hemispheric interaction: individuals who are strongly right-handed (SR) and individuals who are not (nSR). For SR subjects, who naturally may experience less hemispheric interaction than nSR subjects, eye movements enhanced retrieval. In contrast, depending on the measure, eye movements were either inconsequential or even detrimental for nSR subjects. These results partially support the hemispheric interaction account, but demand an amendment to explain the harmful effects of eye movements for nSR individuals.

12.
In the present study, we examined whether eye movements facilitate retention of visuo-spatial information in working memory. In two experiments, participants memorised the sequence of the spatial locations of six digits across a retention interval. In some conditions, participants were free to move their eyes during the retention interval, but in others they either were required to remain fixated or were instructed to move their eyes exclusively to a selection of the memorised locations. Memory performance was no better when participants were free to move their eyes during the memory interval than when they fixated a single location. Furthermore, the results demonstrated a primacy effect in the eye movement behaviour that corresponded with the memory performance. We conclude that overt eye movements do not provide a benefit over covert attention for rehearsing visuo-spatial information in working memory.

13.
Many studies have shown that covert visual attention precedes saccadic eye movements to locations in space. The present research investigated whether the allocation of attention is similarly affected by eye blinks. Subjects completed a partial-report task under blink and no-blink conditions. Experiment 1 showed that blinking facilitated report of the bottom row of the stimulus array: Accuracy for the bottom row increased and mislocation errors decreased under blink, as compared with no-blink, conditions, indicating that blinking influenced the allocation of visual attention. Experiment 2 showed that this was true even when subjects were biased to attend elsewhere. These results indicate that attention moves downward before a blink in an involuntary fashion. The eyes also move downward during blinks, so attention may precede blink-induced eye movements just as it precedes saccades and other types of eye movements.

14.
Eye movement desensitization and reprocessing can reduce ratings of the vividness and emotionality of unpleasant memories; hence it is commonly used to treat posttraumatic stress disorder. The present experiments compared three accounts of how eye movements produce these benefits. Participants rated unpleasant autobiographical memories before and after eye movements or an eyes-stationary control condition. In Experiment 1, eye movements produced benefits only when memories were held in mind during the movements, and eye movements increased arousal, contrary to an investigatory-reflex account. In Experiment 2, horizontal and vertical eye movements produced equivalent benefits, contrary to an interhemispheric-communication account. In Experiment 3, two other distractor tasks (auditory shadowing, drawing) produced benefits that were negatively correlated with working-memory capacity. These findings support a working-memory account of the eye movement benefits in which the central executive is taxed when a person performs a distractor task while attempting to hold a memory in mind.

15.
Orienting to a target by looking and pointing is examined for parallels between the control of the two systems and interactions due to movement of the eyes and limb to the same target. Parallels appear early in orienting and may be due to common processing of spatial information for the ocular and manual systems. The eyes and limb both have shorter response latency to central visual and peripheral auditory targets. Each movement also has shorter latency and duration when the target presentation is short enough (200 msec) that no analysis of feedback of the target position is possible during the movement. Interactions appear at many stages of information processing for movement. Latency of ocular movement is much longer when the subject also points, and the eye and limb movement latencies are highly correlated for orienting to auditory targets. Final positions of the eyes and limb are significantly correlated only when target duration is short (200 msec). This illustrates that sensory information obtained before the movement begins is an important, but not the only, source of input about target position. Additional information that assists orienting may be passed from one system to another, since visual information gained by looking aided pointing to lights, and proprioceptive information from the pointing hand seemed to assist the eyes in looking to sounds. Thus the production of this simple set of movements may be partly described by a cascade-type process of parallel analysis of spatial information for eye and hand control, but is also, later in the movement, assisted by cross-system interaction.

16.
Two experiments are reported that address the issue of coordination of the eyes, head, and hand during reaching and pointing. Movement initiation of the eyes, head, and hand were monitored in order to make inferences about the type of movement control used. In the first experiment, when subjects pointed with the finger to predictable or unpredictable locations marked by the appearance of a light, no differences between head and eye movement initiation were found. In the second experiment, when subjects pointed very fast with the finger, the head started to move before the eyes did. Conversely, when subjects pointed accurately, and thus more slowly, with the finger, the eyes started to move first, followed by the head and finger. When subjects were instructed to point to the same visual target only with their eyes and head, both fast and accurately, however, eye movement always started before head movement, regardless of speed-accuracy instructions. These results indicate that the behavior of the eye and head system can be altered by introducing arm movements. This, along with the variable movement initiation patterns, contradicts the idea that the eye, head, and hand system is controlled by a single motor program. The time of movement termination was also monitored, and across both experiments, the eyes always reached the target first, followed by the finger, and then the head. This finding suggests that movement termination patterns may be a fundamental control variable.

17.
18.
Recent research has shown superior memory retrieval when participants make a series of horizontal saccadic eye movements between the memory encoding phase and the retrieval phase compared to participants who do not move their eyes or move their eyes vertically. It has been hypothesized that the rapidly alternating activation of the two hemispheres that is associated with the series of left-right eye movements is critical in causing the enhanced retrieval. This hypothesis predicts a beneficial effect on retrieval of alternating left-right stimulation not only of the visuomotor system, but also of the somatosensory system, both of which have a strict contralateral organization. In contrast, this hypothesis does not predict an effect, or predicts a weaker effect, on retrieval of alternating left-right stimulation of the auditory system, which has a much less lateralized organization. Consistent with these predictions, we replicated the horizontal saccade-induced retrieval enhancement (Experiment 1) and showed that a similar retrieval enhancement occurs after alternating left-right tactile stimulation (Experiment 2). Furthermore, retrieval was not enhanced after alternating left-right auditory stimulation compared to simultaneous bilateral auditory stimulation (Experiment 3). We discuss the possibility that alternating bilateral activation of the left and right hemispheres exerts its effects on memory by increasing the functional connectivity between the two hemispheres. We also discuss the findings in the context of clinical practice, in which bilateral eye movements (EMDR) and auditory stimulation are used in the treatment of post-traumatic stress disorder.

19.
Tatler BW, Wade NJ. Perception, 2003, 32(2): 167-184
Investigations of the ways in which the eyes move came to prominence in the 19th century, but techniques for measuring them more precisely emerged in the 20th century. When scanning a scene or text the eyes engage in periods of relative stability (fixations) interspersed with ballistic rotations (saccades). The saccade-and-fixate strategy, associated with voluntary eye movements, was first uncovered in the context of involuntary eye movements following body rotation. This pattern of eye movements is now referred to as nystagmus, and involves periods of slow eye movements, during which objects are visible, and rapid returns, when they are not; it is based on a vestibular reflex which attempts to achieve image stabilisation. Post-rotational nystagmus was reported in the late 18th century (by Wells), with afterimages used as a means of retinal stabilisation to distinguish between movement of the eyes and of the environment. Nystagmus was linked to vestibular stimulation in the 19th century, and Mach, Breuer, and Crum Brown all described its fast and slow phases. Wells and Breuer proposed that there was no visual awareness during the ballistic phase (saccadic suppression). The saccade-and-fixate strategy highlighted by studies of nystagmus was shown to apply to tasks like reading by Dodge, who used more sophisticated photographic techniques to examine oculomotor kinematics. The relationship between eye movements and perception, following earlier intuitions by Wells and Breuer, was explored by Dodge, and has been of fundamental importance in the direction of vision research over the last century.

20.
This experiment tested whether the perceived stability of the environment is altered when there is a combination of eye and visually open-loop hand movements toward a target displaced during the eye movements, i.e., during saccadic suppression. Visual-target eccentricity randomly decreased or increased during eye movements, and subjects reported whether they perceived a target displacement or not, and if so, the direction of the displacement. Three experimental conditions, involving different combinations of eye and arm movements, were tested: (a) eye movements only; (b) simultaneous eye and rapid arm movements toward the target; and (c) simultaneous eye and arm movements with a restraint blocking the arm as soon as the hand left the starting position. The perceptual threshold of target displacements resulting in an increased target eccentricity was greater when subjects combined eye and arm movements toward the target object, especially for the no-restraint condition. Subjects corrected most of their arm trajectory toward the displaced target despite the short movement times (average MT = 189 ms). After the movements, the null error feedback of the hand's final position presumably overlapped the retino-oculomotor error signal and could be responsible for the deficient perception of target displacements. Thus, subjects interpreted the terminal hand positions as being within the range of the endpoint variability associated with the production of rapid arm movements rather than as a change of the environment. These results suggest that a natural strategy adopted for processing spatial information, especially in a competing situation, could favour a constancy tendency, avoiding the systematic perception of environmental change in response to noise or variability at the central or peripheral levels.
