Similar Articles
20 similar articles retrieved.
1.
Previous studies have demonstrated large errors (over 30 degrees) in visually perceived exocentric directions (the direction between two objects that are both displaced from the observer's location; e.g., Philbeck et al. [Philbeck, J. W., Sargent, J., Arthur, J. C., & Dopkins, S. (2008). Large manual pointing errors, but accurate verbal reports, for indications of target azimuth. Perception, 37, 511-534]). Here, we investigated whether a similar pattern occurs in auditory space. Blindfolded participants either attempted to aim a pointer at auditory targets (an exocentric task) or gave a verbal estimate of the egocentric target azimuth. Targets were located at 20-160 degrees azimuth in the right hemispace. For comparison, we also collected pointing and verbal judgments for visual targets. We found that exocentric pointing responses exhibited sizeable undershooting errors, for both auditory and visual targets, that tended to become more strongly negative as azimuth increased (up to -19 degrees for visual targets at 160 degrees). Verbal estimates of the auditory and visual target azimuths, however, showed a dramatically different pattern, with relatively small overestimations of azimuths in the rear hemispace. At least some of the differences between verbal and pointing responses appear to be due to the frames of reference underlying the responses; when participants used the pointer to reproduce the egocentric target azimuth rather than the exocentric target direction relative to the pointer, the pattern of pointing errors more closely resembled that seen in verbal reports. These results show that there are similar distortions in perceiving exocentric directions in visual and auditory space.
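To make the egocentric/exocentric distinction concrete, the following minimal Python sketch (not from the paper; the 2-D coordinates and the azimuth convention are my own assumptions) computes an egocentric azimuth (observer to target) and an exocentric direction (pointer to target) for the same target, showing why the two response types need not agree.

```python
import math

def azimuth(from_xy, to_xy):
    """Clockwise azimuth in degrees of `to_xy` as seen from `from_xy`,
    measured from straight ahead (the +y axis)."""
    dx = to_xy[0] - from_xy[0]
    dy = to_xy[1] - from_xy[1]
    return math.degrees(math.atan2(dx, dy)) % 360.0

observer = (0.0, 0.0)   # hypothetical observer at the origin, facing +y
pointer  = (0.3, 0.0)   # hypothetical pointer 0.3 m to the observer's right
target   = (1.0, 1.0)   # hypothetical target, 45 degrees right of straight ahead

egocentric = azimuth(observer, target)  # observer-to-target direction
exocentric = azimuth(pointer, target)   # pointer-to-target direction

print(f"egocentric azimuth:   {egocentric:.1f} deg")  # 45.0
print(f"exocentric direction: {exocentric:.1f} deg")  # ~35.0
```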

2.
Doyle MC, Snowden RJ. Perception, 2001, 30(7): 795-810
Can auditory signals influence the processing of visual information? The present study examined the effects of simple auditory signals (clicks and noise bursts) whose onset was simultaneous with that of the visual target, but which provided no information about the target. It was found that such a signal enhances performance in the visual task: the accessory sound reduced response times for target identification with no cost to accuracy. The spatial location of the sound (whether central to the display or at the target location) did not modify this facilitation. Furthermore, the same pattern of facilitation was evident whether the observer fixated centrally or moved their eyes to the target. The results were not altered by changes in the contrast (and therefore visibility) of the visual stimulus or by the perceived utility of the spatial location of the sound. We speculate that the auditory signal may promote attentional 'disengagement' and that, as a result, observers are able to process the visual target sooner when sound accompanies the display relative to when visual information is presented alone.

3.
A brief visual cue that attracts attention repels the perceived location of a subsequent visual stimulus away from the focus of attention (attentional repulsion). In the first experiment reported here, we presented a visual cue after a visual target and found that the perceived location of the target stimulus shifted toward the location of the cue (attentional attraction). The subsequent experiments ruled out nonattentional hypotheses and indicated that the mislocalization effect is attributable to the attentional shift. The results of this study suggest that preceding and succeeding contexts differentially modulate the perceived location of a briefly presented stimulus. Our findings also underscore the importance of retrospective processes in visual attention.

4.
The aim of the experiment was to find out whether saccadic eye movements have any effect on perceived visual directions. The method was to alter the parameters of the oculomotor system so that the eye movement made in response to a peripheral target was inappropriate to the retinal locus of its image. It was found that this procedure had no effect on the perceived location of the peripheral target; and it was concluded that a specific retinal locus is more or less rigidly associated with a corresponding visual direction, but not with a particular magnitude of ocular rotation.

5.
Two-handed performance of a rhythmical Fitts task by individuals and dyads.
The authors investigated 4 variants of a reciprocal Fitts task in which the pointer was moved to a stationary target, the target was moved to a stationary pointer, or both the pointer and the target were moved to each other bimanually; the bimanual task was carried out either by a single person or by a dyad. Fitts's law held in all 4 conditions, with only minor parametric changes. The kinematic organization varied with task difficulty but remained invariant in task space (i.e., in the mutual frame of reference of the pointer-target system) whatever the pointing condition. In the bimanual conditions, the 2 effectors were coordinated in antiphase with compensatory variability. The authors suggest that the observed chronometric and kinematic patterns emerge from an interplay between simple harmonic motion and the stabilizing influence of the informational flow generated by the closing of the gap between the pointer and the target.
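For reference, Fitts's law relates movement time to an index of difficulty. The snippet below is a generic illustration of the law, not the model fitted in this study; the coefficients a and b are made-up placeholders.

```python
import math

def fitts_movement_time(distance, width, a=0.1, b=0.15):
    """Fitts's law: MT = a + b * log2(2D / W).
    `a` and `b` (seconds) are illustrative placeholders,
    not parameters estimated in the study above."""
    index_of_difficulty = math.log2(2 * distance / width)  # in bits
    return a + b * index_of_difficulty

# A harder task (smaller target, same distance) yields a longer predicted movement time.
print(fitts_movement_time(distance=0.20, width=0.04))  # ID ~ 3.32 bits
print(fitts_movement_time(distance=0.20, width=0.01))  # ID ~ 5.32 bits
```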

6.
Sometimes a goal-directed arm movement has to be modified en route due to an unforeseen perturbation such as a target displacement or a hand displacement by an external force. In this paper several aspects of that modification process are addressed. Subjects had to perform a point-to-point movement task on a computer screen using a mouse-coupled pointer as the representation of the hand position. Trajectory modifications were imposed by unexpectedly changing the position of the target or by changing the relation between mouse and screen pointer. In the first series of experiments, we examined how often a trajectory is updated. Here, trajectory modifications were imposed by unexpectedly changing the normal relation between mouse and pointer to a shear-like relation, where a percentage of the forward/backward position of the hand was added to the pointer position in the left/right direction. Withdrawal of visual feedback during the movement revealed that trajectories were updated at interval times shorter than 200 ms. From the similarity with experiments where the original relation between mouse and pointer was restored during the movements, we conclude that motor plans are updated on-line to move the hand from its current perceived position to the target. In a second series of experiments, we studied whether a continuous change in target position yields similar trajectory modifications as a continuous hand displacement. To mimic the latter perturbation, we used the above-mentioned distortion of the mouse-pointer relation. We found that the resulting hand paths did not differ for the two visual perturbations and conclude that the perturbed, goal-directed movements are modified in a consistent way, irrespective of whether the position of the target or hand was perturbed. Simulations of the experimental data with a kinematic reaching model support this conclusion.
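The shear-like mouse-to-pointer mapping can be expressed in a couple of lines; the sketch below is my own illustration, and the 20% shear coefficient is an assumed example rather than the value used in the experiments.

```python
def pointer_position(mouse_x, mouse_y, shear=0.20):
    """Shear-like mouse-to-pointer mapping: a fraction of the hand's
    forward/backward position (y) is added to the pointer's left/right
    position (x). The 20% shear factor is an illustrative assumption."""
    return mouse_x + shear * mouse_y, mouse_y

# Moving the hand straight forward now drags the pointer sideways,
# so the subject must modify the trajectory en route to reach the target.
print(pointer_position(0.0, 10.0))  # (2.0, 10.0)
```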

7.
Two experiments investigated the ability of subjects to identify a moving, tactile stimulus. In both experiments, the subjects were presented with a target to their left index fingerpad and a nontarget (also moving) to their left middle fingerpad. Subjects were instructed to attend only to the target location and to respond “1” if the stimulus moved either to the left or up the finger, and to respond “2” if the stimulus moved either right or down the finger. The results showed that accuracy was better and reaction times were faster when the target and nontarget moved in the same direction than when they moved in different directions. When the target and nontarget moved in different directions, accuracy was significantly better and reaction times were significantly faster when the two stimuli had the same assigned response than when they had different responses. The results provide support for the conclusion that movement information is processed across adjacent fingers to the level of incipient response activation, even when subjects attempt to focus their attention on one location on the skin.

8.
An analysis is presented of ways in which the total duration of perception of transient visual stimuli may be determined by means of psychophysical judgments of the simultaneity (or relative precedence) of two sensory events. This analysis yields a new method for measuring the duration of perception that only requires judgments of the simultaneity of the offset of one visual target with the onset of another (“offset-onset” judgments), and is thus free of differential biases between onset-onset and offset-onset judgments of simultaneity which could be involved in previous measurements. When three or more perceived durations need to be determined, the new method is more efficient than earlier methods; it requires measurement of only one PSE in order to evaluate one response duration as compared to two PSEs per response duration for previous methods. We also describe ways of determining the presence of some kinds of biases and quantitatively evaluating the magnitude of bias in the new method, as well as bias in onset-onset or offset-offset judgments of simultaneity alone; such evaluations of differential bias were not possible for the earlier methods. An experimental example of a bias analysis is described. No significant biasing effects were detected in the measures of perceived duration that were extracted as either retinal location or background luminance was changed, although background luminance itself markedly influenced the values of perceived duration.
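One way to make the single-PSE logic explicit (my own reconstruction, not the authors' notation, and it assumes the onset latencies of the two targets are equal or separately assessable): let target B's onset be presented at stimulus onset asynchrony SOA relative to target A's onset, and let $\mathrm{SOA}^{*}$ be the PSE at which A's offset is judged simultaneous with B's onset. Then

$$ D_A \;=\; \mathrm{SOA}^{*} + \big(\ell^{\mathrm{on}}_{B} - \ell^{\mathrm{on}}_{A}\big) \;\approx\; \mathrm{SOA}^{*}, $$

where $D_A$ is the perceived (response) duration of A and $\ell^{\mathrm{on}}_{A}$, $\ell^{\mathrm{on}}_{B}$ are the onset latencies of the two targets; under this reading, a single PSE measurement yields one duration estimate.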

9.
Many tasks have been used to probe human directional knowledge, but relatively little is known about the comparative merits of different means of indicating target azimuth. Few studies have compared action-based versus non-action-based judgments for targets encircling the observer. This comparison promises to illuminate not only the perception of azimuths in the front and rear hemispaces, but also the frames of reference underlying various azimuth judgments, and ultimately their neural underpinnings. We compared a response in which participants aimed a pointer at a nearby target, with verbal azimuth estimates. Target locations were distributed between 20 degrees and 340 degrees. Non-visual pointing responses exhibited large constant errors (up to -32 degrees) that tended to increase with target eccentricity. Pointing with eyes open also showed large errors (up to -21 degrees). In striking contrast, verbal reports were highly accurate, with constant errors rarely exceeding +/-5 degrees. Under our testing conditions, these results are not likely to stem from differences in perception-based versus action-based responses, but instead reflect the frames of reference underlying the pointing and verbal responses. When participants used the pointer to match the egocentric target azimuth rather than the exocentric target azimuth relative to the pointer, errors were reduced.

10.
Two experiments investigated the ability of subjects to identify a moving, tactile stimulus. In both experiments, the subjects were presented with a target to their left index fingerpad and a nontarget (also moving) to their left middle fingerpad. Subjects were instructed to attend only to the target location and to respond "1" if the stimulus moved either to the left or up the finger, and to respond "2" if the stimulus moved either right or down the finger. The results showed that accuracy was better and reaction times were faster when the target and nontarget moved in the same direction than when they moved in different directions. When the target and nontarget moved in different directions, accuracy was significantly better and reaction times were significantly faster when the two stimuli had the same assigned response than when they had different responses. The results provide support for the conclusion that movement information is processed across adjacent fingers to the level of incipient response activation, even when subjects attempt to focus their attention on one location on the skin.

11.
Visual discrimination and detection responses to a single stimulus presented simultaneously with noise stimuli are slower and less accurate than are responses to a single stimulus presented alone. This occurs even though the location of the relevant stimulus (target) is known or visually indicated at stimulus onset. Results showed that noise elements delay focal attending and processing of a target. Furthermore, precuing the target location reduces, and can eliminate, target processing delays. Processing delays were not due to response competition or to random attentional capture by noise. It is suggested that simultaneous stimuli are perceived initially as a single object, and delays in processing a single stimulus are due to difficulties in perceptually segregating this stimulus from noise. Precuing is assumed to facilitate this segregation process.

12.
We examined the influence of context on exocentric pointing. In a virtual three-dimensional set-up, we asked our subjects to aim a pointer toward a target in two conditions. The target and the pointer were visible alone, or they were visible with planes through each of them. The planes consisted of a regular grid of horizontal and vertical lines. The presence of the planes had a significant influence on the indicated direction. These changes in indicated direction depended systematically on the orientation of the planes relative to the subject and on the angle between the planes. When the orientation of the (perpendicular) planes varied from asymmetrical to symmetrical to the frontoparallel plane, the indicated direction varied over a range of 15 degrees--from a slightly larger slant to a smaller slant--as compared with the condition without the contextual planes. When the dihedral angle between the two planes varied from 90 degrees to 40 degrees, the indicated direction varied over a range of less than 5 degrees: A smaller angle led to a slightly larger slant. The standard deviations in the indicated directions (about 3 degrees) did not change systematically. The additional structure provided by the planes did not lead to more consistent pointing. The systematic changes in the indicated direction contradict all theories that assume that the perceived distance between any two given points is independent of whatever else is present in the visual field--that is, they contradict all theories of visual space that assume that its geometry is independent of its contents (e.g., Gilinsky, 1951; Luneburg, 1947; Wagner, 1985).

13.
Spatial attention can be biased to locations near the hand. Some studies have found facilitated processing of targets appearing within hand-grasping space. In this study, we investigated how changing top-down task priorities alters hand bias during visual processing. In Experiment 1, we used a covert orienting paradigm with nonpredictive cues and emphasized the location of the hand relative to the target. Hands or visual anchors (boards) were placed next to potential target locations, and responses were made with the contralateral hand. Results indicated a hand-specific processing bias: Hand location, but not board location, speeded responses to targets near the hand. This pattern of results replicated previous studies using covert orienting paradigms with highly predictive cues. In Experiment 2, we used the same basic paradigm but emphasized the location of the response hand. Results now showed speeded responses to targets near response locations. Together these experiments demonstrated that top-down instructional sets (i.e., what is considered to be most relevant to task performance) can change the processing priority of hand location by influencing the strength of top-down, as compared with bottom-up, inputs competing for attention resources.

14.
In the present study, by using a briefly masked prime display paradigm, we investigated whether the pointing relation (same or different) between two unconsciously perceived arrows in the prime could be processed. Since only motor response priming can reflect unconscious processing of the two arrows' pointing-direction relation (i.e., a relational integration), we could distinguish motor response priming from visual priming in this study, which were not separated in other studies. We also manipulated the prime-to-target stimulus onset asynchrony (SOA), using a 70 ms and a 180 ms SOA. In this experiment, two masked arrow signs pointing in the same or different directions (> > or > <) were simultaneously presented in the prime, followed by two arrow symbols also pointing in the same or different directions in the target. The participants were asked to decide whether the two arrows in the target were pointing in the same or different directions. The results did not show any visual priming effect, but did show that the unconsciously perceived pointing relation in the prime elicited a positive motor response priming effect in RT under the 70 ms SOA condition, and a negative motor response priming effect in accuracy under the 180 ms SOA condition. The results are discussed in terms of self-motor-inhibition (or mask-triggered inhibition) and attention mechanisms. Overall, this study indicated that the pointing relation between the two subliminal arrows in the prime could influence the subsequent responses to the target and suggested that people can integrate unconsciously perceived information.

15.
Using a pointing test, perceived location of a target seen in induced motion was evaluated under two display conditions. In one, a fixated, horizontally stationary spot was surrounded by a frame moving back and forth. As the frame moved to each side, its center shifted correspondingly with respect to the subject's objective median plane. In the second display, the surround was constructed so that as it moved back and forth, its center remained in virtual alignment with the objective median plane. Although both conditions produced a substantial induced-motion effect, only the former produced significant shifts in the target's perceived location. Furthermore, similar shifts were also obtained with a stationary, off-center frame (Experiment 2). This suggests that the changes in perceived location obtained with the first induced-motion display were not derived from the induced motion per se, but, rather, from a frame effect produced when the surround moved to an off-center position. Implications for the relationship between perceived motion and position, as well as for two theories of induced motion, are discussed.

16.
Abnormal balance in individuals suffering from traumatic brain injury (TBI) has been documented in numerous recent studies. However, specific mechanisms causing balance deficits have not been systematically examined. This paper demonstrated the destabilizing effect of visual field motion, induced by virtual reality graphics, in concussed individuals but not in normal controls. Fifty-five student-athletes at risk for concussion participated in this study prior to injury, and 10 of these subjects who suffered mild TBI (MTBI) were tested again on day 3, day 10, and day 30 after the incident. Postural responses to visual field motion were recorded using a virtual reality (VR) environment in conjunction with balance (AMTI force plate) and motion tracking (Flock of Birds) technologies. Two experimental conditions were introduced in which subjects passively viewed VR scenes or actively manipulated the visual field motion. Long-lasting destabilizing effects of visual field motion were revealed, although subjects were asymptomatic when standard balance tests were introduced. The findings demonstrate that advanced VR technology may detect residual symptoms of concussion at least 30 days post-injury.

17.
Five experiments are reported in which subjects judged the movement or spatial location of a visible object that underwent a combination of real and induced (illusory) motion. When subjects attempted to reproduce the distance that the object moved by moving their unseen hands, they were more affected by the illusion than when they pointed to the object's perceived final location. Furthermore, pointing to the final location was more affected by the illusion when the hand movement began from the same position as that at which the object initially appeared, as compared with responses that began from other positions. The results suggest that people may separately encode two distinct types of spatial information: (1) information about the distance moved by an object and (2) information about the absolute spatial location of the object. Information about distance is more susceptible to the influence of an induced motion illusion, and people appear to rely differentially on the different types of spatial information, depending on features of the pointing response. The results have important implications for the mechanisms that underlie spatially oriented behavior in general.

18.
It has been argued that two distinct maps of visual space are formed: a cognitive map that is susceptible to illusions, and a motor map that represents the physical world veridically. In the present study, subjects responded to a nonspatial attribute of a visual target stimulus by pressing a left or right key, while an illusory horizontal displacement of the target was induced. A Simon-type effect was obtained to the induced target motion or position shift—that is, responses were faster when the illusory target motion or location corresponded to the response position. Further experiments indicated that the observed effects cannot be accounted for by attentional shifts. These results suggest that the content of the cognitive map does not only influence perceptual judgments but is also responsible for the automatic activation of response codes. In other words, perception and action seem to be fed by a common, cognitively penetrable, spatial representation.

19.
Getzmann S, Lewald J, Guski R. Perception, 2004, 33(5): 591-599
The final position of a moving visual object usually appears to be displaced in the direction of motion. We investigated this phenomenon, termed representational momentum, in the auditory modality. In a dark anechoic environment, an acoustic target (continuous noise or noise pulses) moved from left to right or from right to left along the frontal horizontal plane. Listeners judged the final position of the target using a hand pointer. Target velocity was 8 degrees/s or 16 degrees/s. Generally, the final target positions were localised as displaced in the direction of motion. With presentation of continuous noise, target velocity had a strong influence on mean displacement: displacements were stronger with lower velocity. No influence of sound velocity on displacement was found with motion of pulsed noise. Although these findings suggest that the underlying mechanisms may be different in the auditory and visual modality, the occurrence of displacements indicates that representational-momentum-like effects are not restricted to the visual modality, but may reflect a general phenomenon with judgments of dynamic events.

20.
Three experiments investigated auditory distance perception under natural listening conditions in a large open field. Targets varied in egocentric distance from 3 to 16 m. By presenting visual targets at these same locations on other trials, we were able to compare visual and auditory distance perception under similar circumstances. In some experimental conditions, observers made verbal reports of target distance. In others, observers viewed or listened to the target and then, without further perceptual information about the target, attempted to face the target, walk directly to it, or walk along a two-segment indirect path to it. The primary results were these. First, the verbal and walking responses were largely concordant, with the walking responses exhibiting less between-observer variability. Second, different motoric responses provided consistent estimates of the perceived target locations and, therefore, of the initially perceived distances. Third, under circumstances for which visual targets were perceived more or less correctly in distance using the more precise walking response, auditory targets were generally perceived with considerable systematic error. In particular, the perceived locations of the auditory targets varied only about half as much in distance as did the physical targets; in addition, there was a tendency to underestimate target distance, except for the closest targets.
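One way to visualize the reported compression is a toy linear model; the sketch below is my own illustration with made-up coefficients (only the ~0.5 slope is motivated by the abstract's "about half as much" variation, and the 2 m intercept is arbitrary, not a fitted value).

```python
def perceived_auditory_distance(physical_m, gain=0.5, intercept=2.0):
    """Toy linear compression of perceived auditory distance.
    The ~0.5 gain echoes the abstract's 'about half as much' variation;
    the 2 m intercept is an arbitrary illustrative value, not a fitted one."""
    return intercept + gain * physical_m

for d in (3, 8, 16):  # target range used in the experiments (3-16 m)
    print(d, "m physical ->", perceived_auditory_distance(d), "m perceived")
# 3 m  -> 3.5 m  (near targets roughly accurate or slightly overestimated)
# 16 m -> 10.0 m (far targets substantially underestimated)
```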
