Similar Articles
 20 similar articles found (search time: 31 ms)
1.
When a rigid object moves toward the eye, it is usually perceived as being rigid. However, in the case of motion away from the eye, the motion and structure of the object are perceived nonveridically, with the percept tending to reflect the nonrigid transformations that are present in the retinal image. This difference in response to motion to and from the observer was quantified in an experiment using wire-frame computer-generated boxes which moved toward and away from the eye. Two theoretical systems are developed by which uniform three-dimensional velocity can be recovered from an expansion pattern of nonuniform velocity vectors. It is proposed that the human visual system uses two similar systems for processing motion in depth. The mechanism used for motion away from the eye produces perceptual errors because it is not suited to objects with a depth component.
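As a hedged geometric aside (a generic perspective-projection sketch, not the two systems developed in the paper above): a uniform approach velocity produces a nonuniform retinal expansion field, yet a single 3D quantity such as time to contact can be read out from any point of that field. The focal length, depth, and velocity values below are arbitrary assumptions.

```python
import numpy as np

# Perspective projection: a point at (X, Y, Z) maps to (f*X/Z, f*Y/Z) on the retina.
# For an object approaching at constant speed (Z decreasing, X fixed), each image
# point moves radially with speed proportional to its eccentricity:
#   x_dot = -x * Z_dot / Z,  so  x / x_dot = -Z / Z_dot = time to contact.
f = 1.0                              # focal length (arbitrary units, assumed)
Z, Z_dot = 2.0, -0.5                 # depth and approach velocity (assumed values)
X = np.array([0.1, 0.2, 0.4, 0.8])   # lateral positions of points on the object

x = f * X / Z                        # retinal positions
x_dot = -f * X * Z_dot / Z**2        # nonuniform retinal velocities (expansion field)

# A uniform 3D quantity is recoverable from every point of the nonuniform field:
tau = x / x_dot                      # time to contact, identical for all points
print(tau)                           # -> [4. 4. 4. 4.], i.e. Z / (-Z_dot)
```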

2.
The perceived position of an object is determined not only by the retinal location of the object but also by gaze direction, eye movements, and the motion of the object itself. Recent evidence further suggests that the motion of one object can alter the perceived positions of stationary objects in remote regions of visual space (Whitney & Cavanagh, 2000). This indicates that there is an influence of motion on perceived position, and that this influence can extend over large areas of the visual field. Yet, it remains unclear whether the motion of one object shifts the perceived positions of other moving stimuli. To test this we measured two well-known visual illusions, the Fröhlich effect and representational momentum, in the presence of extraneous surrounding motion. We found that the magnitude of these mislocalizations was altered depending on the direction and speed of the surrounding motion. The results indicate that the positions assigned to stationary and moving objects are affected by motion signals over large areas of space and that both types of stimuli may be assigned positions by a common mechanism.

3.
Many fatal accidents that involve pedestrians occur at road crossings, and are attributed to a breakdown of communication between pedestrians and drivers. It is therefore important to investigate how forms of communication in traffic, such as eye contact, influence crossing decisions. Thus far, there is little information about the effect of drivers’ eye contact on pedestrians’ perceived safety to cross the road. Existing studies treat eye contact as immutable, i.e., it is either present or absent in the whole interaction, an approach that overlooks the effect of the timing of eye contact. We present an online crowdsourced study that addresses this research gap. A total of 1835 participants viewed 13 videos of an approaching car twice, in random order, and held a key whenever they felt safe to cross. The videos differed in terms of whether the car yielded or not, whether the driver made eye contact or not, and the times at which the driver made eye contact. Participants also answered questions about the perceived intuitiveness of the driver’s eye contact behavior. The results showed that eye contact made people feel considerably safer to cross than no eye contact (keypress percentage increased from 31% to 50%). In addition, the initiation and termination of eye contact affected perceived safety to cross more strongly than continuous eye contact and a lack of it, respectively. The car’s motion, however, was a more dominant factor. Additionally, the driver’s eye contact when the car braked was considered intuitive, and when it drove off, counterintuitive. In summary, this study demonstrates for the first time how drivers’ eye contact affects pedestrians’ perceived safety as a function of time in a dynamic scenario, and it questions the notion in recent literature that eye contact in road interactions is dispensable. These findings may be of interest in the development of automated vehicles (AVs), where the driver of the AV might not always be paying attention to the environment.

4.
Perceived stereomotion trajectory was measured before and after adaptation to lateral motion in the dominant or nondominant eye to assess the relative contributions of 2 cues: changing disparity and interocular velocity difference. Perceived speed for monocular lateral motion and perceived binocular visual direction (BVD) were also assessed. Unlike stereomotion trajectory perception, the BVD of static targets showed an ocular dominance bias, even without adaptation. Adaptation caused equivalent biases in perceived trajectory and monocular motion speed, without significantly affecting perceived BVD. Predictions from monocular motion data closely match trajectory perception data, unlike those from BVD sources. The results suggest that interocular velocity differences make a significant contribution to stereomotion trajectory perception.
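For readers wanting the small-angle geometry behind the interocular-velocity-difference cue, here is a hedged sketch (a textbook approximation, not taken from the paper above); the interocular separation I and viewing distance D are assumed values, and the formula applies to objects near the midline with unequal monocular velocities.

```python
import numpy as np

# For an object near the midline at distance D, the left and right retinal
# velocities v_left and v_right (rad/s) jointly specify the trajectory azimuth
# beta (0 deg = straight toward the head) via the small-angle approximation
#   tan(beta) = (I / (2 * D)) * (v_left + v_right) / (v_right - v_left).

def trajectory_azimuth(v_left, v_right, I=0.065, D=0.57):
    """Trajectory azimuth in degrees; I and D in metres (assumed values)."""
    return np.degrees(np.arctan((I / (2 * D)) * (v_left + v_right) / (v_right - v_left)))

# Scaling one eye's velocity signal (as motion adaptation might) shifts the prediction:
print(trajectory_azimuth(v_left=-0.010, v_right=0.010))   # balanced signals -> 0 deg (head-on)
print(trajectory_azimuth(v_left=-0.006, v_right=0.010))   # reduced left-eye signal -> biased azimuth
```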

5.
The apparent relative motion of physically stationary objects that frequently occurs as the head is moved in a frontoparallel plane is almost always in the direction expected from the projection into the distal world of the relative motion of the images on the eye. It is hypothesized that this is the result of the perceptual underestimation of the depth between the objects. If a perceptual overestimation of the depth were produced, it was predicted that the apparent relative motion would be in a direction opposite to that expected from the projection of the retinal motions. This prediction was tested using the binocular disparity cue to produce perceptual overestimation of the slant (depth) of a luminous line. In this case, perceived slant was the indicator of perceived depth, and perceived rotation concomitant with the motion of the head was the indicator of perceived relative motion. The results support the prediction and also provide some support for a theoretically derived equation specifying the relation between these two perceptual variables.
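A minimal sketch of the underestimation/overestimation logic, using a simple head-movement compensation model that is my assumption rather than the paper's theoretically derived equation; the numbers are arbitrary and only the sign of the result matters.

```python
# During a lateral head translation T, a point at depth offset dz from fixation
# (viewing distance D) produces retinal parallax of roughly T * dz / D**2. If the
# visual system compensates using the *perceived* depth offset dz_p, the residual
# attributed to object motion is proportional to (dz - dz_p): underestimating depth
# (dz_p < dz) leaves motion in the direction of the retinal projection, while
# overestimating it (dz_p > dz) reverses the apparent motion, as the paper predicts.

def apparent_motion(T, dz_physical, dz_perceived, D):
    retinal_parallax = T * dz_physical / D**2    # relative image motion during the head movement
    compensation     = T * dz_perceived / D**2   # the part "explained" by perceived depth
    return retinal_parallax - compensation       # residual experienced as object motion

print(apparent_motion(T=0.10, dz_physical=0.05, dz_perceived=0.02, D=1.0))  # > 0: motion "with" the retinal projection
print(apparent_motion(T=0.10, dz_physical=0.05, dz_perceived=0.09, D=1.0))  # < 0: motion in the opposite direction
```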

6.
Ito H. Perception, 2003, 32(3): 367-375.
The Pulfrich effect yields perceived depth for horizontally moving objects but not for vertically moving ones. In this study, the Pulfrich effect was measured by translating oblique lines seen through a circular window, which made their motion direction ambiguous. Overlaid random dots that moved horizontally, vertically, or diagonally controlled the perceived motion direction of the lines. In Experiment 1, the effect was strongest when the lines were seen to move horizontally, even though the physical motion of the lines was identical across conditions. Experiment 2 tested the same conditions again while excluding any Pulfrich effect of the dots on the depth of the lines: the overlaid dots were presented to one eye only. The results showed that the Pulfrich effect for the lines remained strong despite the perceptual changes in motion direction. Experiment 3 also showed that the Pulfrich depth was independent of the perceived horizontal speed in a plaid display. The Pulfrich effect was thus determined by the horizontal disparity component, independently of the perceived motion direction. These results demonstrate that the aperture problems in motion and in stereopsis are solved independently in the Pulfrich effect.
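A hedged illustration of the delay-to-disparity geometry behind the horizontal-disparity account (standard Pulfrich geometry, not the stimuli of this study); the interocular delay dt, viewing distance D, and interocular separation I below are assumed values.

```python
import numpy as np

# A filter delays one eye's signal by dt, so a target moving horizontally at
# v_x deg/s is seen by that eye where it was dt seconds earlier. The effective
# horizontal disparity is therefore delta = v_x * dt; purely vertical motion
# produces none. Depth from disparity (small angles): dZ ~ delta_rad * D**2 / I.

def pulfrich_depth(speed_deg_per_s, direction_deg, dt=0.010, D=0.57, I=0.065):
    """Apparent depth (m) from the horizontal motion component only (assumed dt, D, I)."""
    v_x = speed_deg_per_s * np.cos(np.radians(direction_deg))  # horizontal component
    disparity = np.radians(v_x * dt)                           # interocular delay as disparity
    return disparity * D**2 / I

print(pulfrich_depth(10, direction_deg=0))    # horizontal motion -> nonzero apparent depth
print(pulfrich_depth(10, direction_deg=90))   # vertical motion   -> ~0 apparent depth
```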

7.
A horizontally moving target was followed by rotation of the eyes alone or by a lateral movement of the head. These movements resulted in the retinal displacement of a vertically moving target from its perceived path, the amplitude of which was determined by the phase and amplitude of the object motion and of the eye or head movements. In two experiments, we tested the prediction from our model of spatial motion (Swanston, Wade, & Day, 1987) that perceived distance interacts with compensation for head movements, but not with compensation for eye movements with respect to a stationary head. In both experiments, when the vertically moving target was seen at a distance different from its physical distance, its perceived path was displaced relative to that seen when there was no error in perceived distance, or when it was pursued by eye movements alone. In a third experiment, simultaneous measurements of eye and head position during lateral head movements showed that errors in fixation were not sufficient to require modification of the retinal paths determined by the geometry of the observation conditions in Experiments 1 and 2.

9.
This study investigated multisensory interactions in the perception of auditory and visual motion. When auditory and visual apparent motion streams are presented concurrently in opposite directions, participants often fail to discriminate the direction of motion of the auditory stream, whereas perception of the visual stream is unaffected by the direction of auditory motion (Experiment 1). This asymmetry persists even when the perceived quality of apparent motion is equated for the 2 modalities (Experiment 2). Subsequently, it was found that this visual modulation of auditory motion is caused by an illusory reversal in the perceived direction of sounds (Experiment 3). This "dynamic capture" effect occurs over and above ventriloquism among static events (Experiments 4 and 5), and it generalizes to continuous motion displays (Experiment 6). These data are discussed in light of related multisensory phenomena and their support for a "modality appropriateness" interpretation of multisensory integration in motion perception.

10.
It has been hypothesized that an evolutionarily ancient mechanism underlies the ability of human infants to detect and act upon the direction of eye gaze of another human face. However, the evidence from behavioral studies with infants is also consistent with a more domain-general system responsive to the lateral motion of stimuli regardless of whether or not eyes are involved. To address this issue, three experiments with 4-month-old infants are reported that utilize a standard face-cueing paradigm. In the first experiment, an inverted face was used to investigate whether the motion of the pupils elicits the cueing effect regardless of the surrounding face context. In the second experiment, pupil motion and eye gaze direction were opposed, allowing us to assess their relative importance. In a third experiment, a more complex gaze shift sequence allowed us to analyse the importance of beginning with a period of mutual gaze. Overall, the results were consistent with the importance of the perceived direction of motion of pupils. However, to be effective in cueing spatial locations this motion needs to be preceded by a period of direct mutual gaze (eye contact). We suggest that evolution results in information-processing biases that shape and constrain the outcome of individual development to eventually result in adult adaptive specializations.

11.
In the model of motion perception proposed by Swanston, Wade, and Day (1987, Perception, 16, 143-159) it was suggested that retinocentric motion and eye movement information are combined independently for each eye, to give left and right orbitocentric representations of movement. The weighted orbitocentric values are then added, to give a single egocentric representation. It is shown that, for a physical motion observed without pursuit eye movements, this formulation predicts a reduction in the perceived extent of motion with monocular as opposed to binocular viewing. This prediction was tested, and shown to be incorrect. Accordingly, a modification of the model is proposed, in which the left and right retinocentric signals are weighted according to the presence or absence of stimulation, and combined to give a binocular retinocentric representation. In a similar way, left-eye and right-eye position signals are combined to give a single binocular eye movement signal for version. This is then added to the binocular retinocentric signal to give the egocentric representation. This modification provides a unified account of both static visual direction and movement perception.
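A schematic sketch of the modified combination rule described above, with illustrative weights and numbers that are my assumptions; it is meant only to show why monocular viewing no longer predicts a reduced extent of perceived motion once the retinocentric signals are weighted by the presence of stimulation.

```python
# Left/right retinocentric motion signals are weighted by whether each eye is
# stimulated and combined into one binocular retinocentric signal; left/right
# eye-position (pursuit) signals are combined into one version signal; their
# sum is taken as the egocentric representation.

def egocentric_motion(ret_left, ret_right, eye_left, eye_right,
                      left_stimulated=True, right_stimulated=True):
    w_l = 1.0 if left_stimulated else 0.0
    w_r = 1.0 if right_stimulated else 0.0
    retinocentric = (w_l * ret_left + w_r * ret_right) / (w_l + w_r)  # binocular retinocentric signal
    version = 0.5 * (eye_left + eye_right)                            # single eye-movement signal
    return retinocentric + version                                    # egocentric representation

# Same physical motion, no pursuit, viewed binocularly vs monocularly:
print(egocentric_motion(2.0, 2.0, 0.0, 0.0))                          # binocular -> 2.0
print(egocentric_motion(2.0, 0.0, 0.0, 0.0, right_stimulated=False))  # monocular -> 2.0 (no reduction)
```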

12.
When a figure moves behind a narrow aperture in an opaque surface, if it is perceived as a figure, its shape will often appear distorted. Under such anorthoscopic conditions, the speed or direction of the object's motion is ambiguous. However, when the observer simultaneously tracks a moving target, a figure is always perceived, and its precise shape is a function of the speed or direction of tracking. The figure is seen as moving with the speed or in the direction of the target. Thus, it is argued that eye movement serves as a cue to the figure's motion, which, in turn, determines its perceived length or orientation.

13.
To investigate the effect of smooth pursuit effort against optokinetic nystagmus (OKN) on the magnitude of induced motion, we measured the magnitude of induced motion and eye movements of karate athletes and novices. In Experiment 1, participants were required to pursue a horizontally moving fixation stimulus against a vertically moving inducing stimulus and to point at the most distorted position of the perceived pathway of the fixation stimulus. In Experiments 2 and 3, participants were presented with the inducing stimulus with or without a static fixation stimulus. Experiments 1 and 2 showed a larger magnitude of induced motion and more stable fixation for the athletes than for the novices. Experiment 3 showed no difference in eye movements between the two groups. These results suggest that the magnitude of induced motion reflects fixation stability that may have been strengthened in karate athletes through their experience and training.

14.
Freeman TC, Sumnall JH. Perception, 2002, 31(5): 603-615.
Observers can recover motion with respect to the head during an eye movement by comparing signals encoding retinal motion and the velocity of pursuit. Evidently there is a mismatch between these signals, because perceived head-centred motion is not always veridical. One example is the Filehne illusion, in which a stationary object appears to move in the opposite direction to pursuit. Like the motion aftereffect, the phenomenal experience of the Filehne illusion is one in which the stimulus moves but does not seem to go anywhere. This raises problems when measuring the illusion by motion nulling, because the more traditional technique confounds perceived motion with changes in perceived position. We devised a new nulling technique using global-motion stimuli that degraded familiar position cues but preserved cues to motion. Stimuli consisted of random-dot patterns comprising signal and noise dots that moved at the same retinal 'base' speed. Noise dots moved in random directions. In an eye-stationary speed-matching experiment we found that noise slowed perceived retinal speed as 'coherence strength' (i.e., percentage of signal) was reduced. The effect occurred over the two-octave range of base speeds studied and well above direction threshold. When the same stimuli were combined with pursuit, observers were able to null the Filehne illusion by adjusting coherence. A power law relating coherence to retinal base speed fitted the data well, with a negative exponent. Eye-movement recordings showed that pursuit was quite accurate. We then tested the hypothesis that the stimuli found at the null points appeared to move at the same retinal speed. Two observers supported the hypothesis, a third did so partially, and a fourth showed a small linear trend. In addition, the retinal speed found by the traditional Filehne technique was similar to the matches obtained with the global-motion stimuli. The results provide support for the idea that speed is the critical cue in head-centred motion perception.
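To make the reported power-law relation concrete, here is a hedged sketch of fitting coherence = a * speed^b in log-log coordinates; the null-point numbers below are invented for illustration and are not the paper's data.

```python
import numpy as np

# Fit the reported form  coherence = a * speed**b  (with b < 0) to hypothetical
# null-point settings. The values below are made up purely for illustration.
speed = np.array([1.0, 2.0, 4.0, 8.0])          # retinal base speed (deg/s)
coherence = np.array([0.80, 0.55, 0.38, 0.27])  # nulling coherence (proportion of signal dots)

b, log_a = np.polyfit(np.log(speed), np.log(coherence), 1)  # straight line in log-log space
a = np.exp(log_a)
print(f"coherence ~ {a:.2f} * speed^{b:.2f}")   # the exponent b comes out negative, as reported
```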

15.
Five experiments were conducted to examine how perceived direction of motion is influenced by aspects of the shape of a moving object, such as symmetry and elongation. Random polygons moving obliquely were presented on a computer screen and perceived direction of motion was measured. Experiments 1 and 2 showed that a symmetric object moving off the axis of symmetry caused motion to be perceived as more aligned with the axis than it actually was. However, Experiment 3 showed that motion did not influence the perceived orientation of the symmetry axis. Experiment 4 revealed that symmetric shapes resulted in faster judgments of direction of motion than asymmetric shapes only when the motion was along the axis. Experiment 5 showed that elongation causes a bias in perceived direction of motion similar to the effects of symmetry. The existence of such biases is consistent with the hypothesis that, in the course of evolution, the visual system has been adapted to regularities of motion in the animate world.

16.
Three experiments investigating the basis of induced motion are reported. The proposition that induced motion is based on the visual capture of eye-position information, and is therefore a subject-relative rather than object-relative motion, was explored in the first experiment. Observers made saccades to an invisible auditory stimulus following fixation on a stationary stimulus in which motion was induced. In the remaining two experiments, the question of whether perceived induced motion produces a straight-ahead shift was explored. The critical eye movement was directed to apparent straight ahead. Because these saccades partially compensated for the apparent displacement of the induction stimulus, and saccades to the auditory stimulus did not, we conclude that induced motion is not based on oculomotor visual capture. Rather, it is accompanied by a shift in the judged direction of straight ahead, an instance of the straight-ahead shift. The results support an object-relative theory of induced motion.

17.
Perceptual constancy of visual motion is usually described as the degree of correspondence between physical and perceived characteristics of motion in the external world. To study it, one has to assess the relationship between physical motion, its retinal image, and its perception. We describe a quantitative estimation procedure for a measure K denoting the degree of perceptual constancy of background target motions noncollinear to the eye movements during ocular pursuit. The calculation of K is based on three vectors describing the target motion (1) as it is physically, (2) as it is mapped to the retina, and (3) as it is perceived, but only the direction of the perceptual motion vector has to be determined experimentally. K allows for quantitative comparison between experiments with a variety of parameters in visual motion displays.
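The abstract does not reproduce the formula for K, so the following is only one plausible direction-based constancy index, labeled as an assumption throughout: it equals 1 when the perceived direction matches the physical direction (full constancy) and 0 when it matches the retinal direction (no constancy).

```python
import numpy as np

# An assumed, illustrative index -- not necessarily the authors' K. It uses only
# the directions of the three vectors (physical, retinal, perceived), in the
# spirit of "only the direction of the perceptual motion vector" being measured.

def angle_between(u, v):
    u, v = np.asarray(u, float), np.asarray(v, float)
    cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

def constancy_index(physical, retinal, perceived):
    retinal_vs_physical = angle_between(retinal, physical)      # distortion introduced by pursuit
    perceived_vs_physical = angle_between(perceived, physical)  # residual perceptual error
    return 1.0 - perceived_vs_physical / retinal_vs_physical    # 1 = veridical, 0 = image-bound

# Target moves straight down while the eyes pursue rightward, tilting the retinal path:
physical = (0.0, -1.0)
retinal  = (-1.0, -1.0)
print(constancy_index(physical, retinal, perceived=(-0.3, -1.0)))  # partial constancy, 0 < K < 1
```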

18.
Lighted points that moved as if located on the rim of a rolling wheel were displayed to subjects whose task was to describe the pattern they perceived. The perceived patterns could be classified into one of four categories ranging from cycloidal to circular motion. Pursuit eye movements were controlled by having subjects track a fixation point that moved in the direction of the rolling wheel on a path just above the wheel’s rim. With respect to the translatory velocity of the rolling wheel, the velocity of the fixation point was 100%, 67%, 33%, or 0% (i.e., stationary). The patterns traced out by the points on the wheel were perceived to become increasingly circular as pursuit eye movements more closely matched the translatory speed of the rolling wheel. This is taken to support Stoper’s hypothesis that pursuit eye movements can establish a frame of reference for motion analysis.
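A hedged geometric sketch of why the rim point's path becomes circular as pursuit approaches the wheel's translatory velocity: subtracting the tracked translation from the cycloid leaves only the rotational component. The wheel radius, rotation rate, and sampling below are assumed values.

```python
import numpy as np

# A point on the rim of a wheel of radius r rolling at translational speed
# v = r * omega traces a cycloid. Subtracting a fraction p of the translation
# (as when the eye tracks the moving fixation point) gives the path in
# eye-centred coordinates; at p = 1 only the circular component remains.
r, omega = 1.0, 2 * np.pi          # wheel radius and rotation rate (assumed)
v = r * omega                      # rolling without slipping
t = np.linspace(0, 1, 200)         # one full revolution

def eye_centred_path(p):
    x = v * t - r * np.sin(omega * t)      # cycloid, horizontal coordinate
    y = r - r * np.cos(omega * t)          # cycloid, vertical coordinate
    return x - p * v * t, y                # remove the pursued translation

for p in (0.0, 0.33, 0.67, 1.0):           # the pursuit gains used in the study
    x, y = eye_centred_path(p)
    print(f"p={p:.2f}: horizontal extent {x.max() - x.min():.2f}, vertical extent {y.max() - y.min():.2f}")
# At p = 1.00 both extents equal the wheel diameter: the eye-centred path is a circle.
```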

19.
Motion lines (MLs) are a pictorial technique used to represent object movement in a still picture. This study explored how MLs contribute to motion perception. In Experiment 1, we demonstrated a motion illusion created by MLs: random displacements of objects with MLs on each frame were perceived as unidirectional global motion along the pictorial motion direction implied by the MLs. In Experiment 2, we showed that the illusory global motion in the peripheral visual field captured the perceived motion direction of randomly displaced objects without MLs in the central visual field, confirming that the results of Experiment 1 did not stem simply from response bias but resulted from perceptual processing. In Experiment 3, we showed that the spatial arrangement of orientation information, rather than ML length, is important for the illusory global motion. Our results indicate that the ML effect is based on perceptual processing rather than response bias, and that comparison of neighboring orientation components may underlie the determination of pictorial motion direction with MLs.

20.
Can people react to objects in their visual field that they do not consciously perceive? We investigated how visual perception and motor action respond to moving objects whose visibility is reduced, and we found a dissociation between motion processing for perception and for action. We compared motion perception and eye movements evoked by two orthogonally drifting gratings, each presented separately to a different eye. The strength of each monocular grating was manipulated by inducing adaptation to one grating prior to the presentation of both gratings. Reflexive eye movements tracked the vector average of both gratings (pattern motion) even though perceptual responses followed one motion direction exclusively (component motion). Observers almost never perceived pattern motion. This dissociation implies the existence of visual-motion signals that guide eye movements in the absence of a corresponding conscious percept.
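A small sketch of the vector-average (pattern) prediction versus a single component, with the two drift directions assumed to be rightward and upward at equal speeds (the actual grating parameters are not given above).

```python
import numpy as np

# Two orthogonally drifting gratings, one per eye; directions and speeds are assumed.
v_one_eye   = np.array([1.0, 0.0])   # rightward drift (deg/s) shown to one eye
v_other_eye = np.array([0.0, 1.0])   # upward drift shown to the other eye

pattern = 0.5 * (v_one_eye + v_other_eye)   # vector average of the component velocities

# Reflexive eye movements reportedly track the pattern direction (45 deg here),
# whereas perception follows one component direction (0 deg or 90 deg) at a time.
print(np.degrees(np.arctan2(pattern[1], pattern[0])))        # 45.0
print(np.degrees(np.arctan2(v_one_eye[1], v_one_eye[0])))    # 0.0
```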
