Similar Articles
1.
When sighted persons try to identify one of two speech utterances coming from different directions, they display both a frontal position advantage, i.e., better recognition of inputs from the front than of those from the rear, and a right-side advantage, i.e., better recognition of inputs from the right than of those from the left. The present study demonstrates a dissociation of the two effects in blind subjects (N = 10), who showed no frontal position advantage together with a right-side advantage superior to that of sighted control subjects (N = 16). There was no systematic difference between congenitally and noncongenitally blind subjects. The absence of a frontal position advantage in the blind is consistent with the notion that this effect originates in the habit of sighted listeners of orienting toward the source of heard speech. The occurrence of an at least normal right-side advantage in the blind does not support recent suggestions of reduced lateralization of language functions in such subjects.

2.
Ss who are exposed to a sound coming from straight ahead, but who turn their eyes 20 deg to the side toward a visible speaker during the exposure period and expect to hear the sound coming from the visible source, show a shift in localization of the sound up to a maximum of about 9 deg. Ss who only turn their eyes 20 deg to the side during the exposure period show a smaller but significant shift in sound localization, while Ss who do not turn their eyes, but are led to expect that the sound will appear to come from a visible loudspeaker 20 deg to the side, show no significant shift. Comparison of test results before and after the exposure period, with eyes directed straight ahead and no visible speaker present, shows the presence of a localization aftereffect for those experimental groups that showed a significant localization shift during the exposure period. Sounds are localized a few degrees to the side of their physical location in the same direction as the shift in localization during the exposure period. Further experiments show that part, but not all, of the shift in localization during the exposure period can be understood in terms of a shift in perceived head direction. The localization aftereffects are shown not to be due to change in physical or perceived eye or head position.

3.
Previous studies have found that the effect of a tilted frame on egocentric rod adjustments is greater when an overhead display in a horizontal plane is viewed from a supine body position than when a vertical display is viewed from an erect body position. The present studies were designed to see whether this phenomenon could be attributed to an intravisual orientation contrast effect or to the effects of visually induced eye torsion. No significant erect-supine differences were found on measures of either effect. Errors in the direction of frame tilt were significantly greater in the supine position when observers were asked to align a visible rod or an unseen hand-held disk with the head, but no effect of body position was found in matching the orientation of the disk with the rod. The data suggest that erect-supine differences in frame effects are not attributable simply to intravisual factors. The results are discussed in terms used by Harris (1974) to describe "straight-ahead shifts" in judging spatial directions with respect to the median plane.

4.
胡中华 赵光 刘强 李红 《心理学报》2012, 44(4): 435-445
Previous studies have found that in visual search tasks direct gaze is detected faster and more accurately than averted gaze, a phenomenon termed the "stare-in-the-crowd effect". Most researchers have attributed this effect to direct gaze capturing more attention. However, easier matching of the search items under direct-gaze conditions could also make detection of direct gaze faster than detection of averted gaze. In addition, previous studies have found that head orientation affects the detection of gaze direction, but the cause of this influence lacks experimental verification. The present study addressed these two questions using a visual search paradigm with eye tracking, dividing the visual search process of gaze detection into a preparation stage, a search stage, and a response stage. The results showed that the detection advantage for direct gaze appeared mainly in the search and response stages; in the search stage this advantage benefited from shorter scan paths, fewer fixated distractors, and shorter mean fixation durations on distractors; and head orientation influenced gaze detection only in the search stage. These results indicate that easier matching of the search items during direct-gaze detection than during averted-gaze detection is also one cause of the "stare-in-the-crowd effect", and that head orientation affects only the search for gaze direction, not its verification.

5.
How well do we maintain heading direction during walking while we look at objects beside our path by rotating our eyes, head, or trunk? Common experience indicates that it may be fairly hazardous not to look where you are going. In the present study, 12 young adults walked on a treadmill while they followed a moving dot along a horizontal line with their gaze by rotating primarily either their eyes, head, or trunk for amplitudes of up to 25 degrees. During walking, the movement of the center of pressure (COP) was monitored using force transducers under the treadmill. Under normal light conditions, the participants showed little lateral deviation of the COP from the heading direction when they performed the eye or head movement task during walking, even when optic flow information was limited. In contrast, trunk rotations led to a doubling of the COP deviation in the mediolateral direction. Some of this deviation was attributed to foot rotation. Participants tended to point their feet in the gaze direction when making trunk turns. The tendency of the feet to be aligned with the trunk is likely to be due to a preference to have feet and body in the same orientation. Such alignment is weaker for the feet with respect to head position and it is absent with respect to eye position. It is argued that feet and trunk orientation are normally tightly coupled during gait and that it requires special abilities to move both segments independently when walking.

6.
This study proposed and verified a new hypothesis on the relationship between gaze direction and visual attention: attentional bias by default gaze direction based on eye–head coordination. We conducted a target identification task in which visual stimuli appeared briefly to the left and right of a fixation cross. In Experiment 1, the direction of the participant's head (aligned with the body) was manipulated to the left, front, or right relative to a central fixation point. In Experiment 2, head direction was manipulated to the left, front, or right relative to the body direction. This manipulation was based on results showing that bias of eye position distribution was highly correlated with head direction. In both experiments, accuracy was greater when the target appeared at a position where the eyes would potentially be directed. Consequently, eye–head coordination influences visual attention. That is, attention can be automatically biased toward the location where the eyes tend to be directed.

7.
Eight participants were presented with auditory or visual targets and then indicated the targets' remembered positions relative to their head eight seconds after actively moving their eyes, head, or body, so as to pull apart head, retinal, body, and external-space reference frames. Remembered target position was indicated by repositioning sounds or lights. Localization errors were related to head-on-body position but not to eye-in-head or body-in-space position, for both auditory targets (0.023 dB/deg in the direction of head displacement) and visual targets (0.068 deg/deg in the direction opposite to head displacement). The results indicate that both auditory and visual localization use head-on-body information, suggesting a common coding into body coordinates--the only conversion that requires this information.

8.
Tests were carried out on 17 subjects to determine the accuracy of monaural sound localization when the head is not free to turn toward the sound source. Maximum accuracy of localization for a constant-volume sound source coincided with the position for maximum perceived intensity of the sound in the front quadrant. There was a tendency for sounds to be perceived more often as coming from a position directly toward the ear. That is, for sounds in the front quadrant, errors of localization tended to be predominantly clockwise (i.e., biased toward a line directly facing the ear). Errors for sounds occurring in the rear quadrant tended to be anticlockwise. The pinna's differential effect on sound intensity between front and rear quadrants would assist in identifying the direction of movement of objects, for example an insect, passing the ear.

9.
We investigated whether the relative position of objects and the body would influence haptic recognition. People felt objects on the right or left side of their body midline, using their right hand. Their head was turned towards or away from the object, and they could not see their hands or the object. People were better at naming 2-D raised line drawings and 3-D small-scale models of objects and also real, everyday objects when they looked towards them. However, this head-towards benefit was reliable only when their right hand crossed their body midline to feel objects on their left side. Thus, haptic object recognition was influenced by people's head position, although vision of their hand and the object was blocked. This benefit of turning the head towards the object being explored suggests that proprioceptive and haptic inputs are remapped into an external coordinate system and that this remapping is harder when the body is in an unusual position (with the hand crossing the body midline and the head turned away from the hand). The results indicate that haptic processes align sensory inputs from the hand and head even though either hand-centered or object-centered coordinate systems should suffice for haptic object recognition.

10.
In a series of 4 experiments, the authors examined involuntary rotations of a steering device (handlebar or wheel) that were associated with periodic head rotations and eccentric head positions. Periodic head rotations resulted in isodirectional involuntary rotations of a horizontally arranged steering device of very small amplitude. When the orientation of a steering wheel was changed to vertical and to a backward tilt, the involuntary rotations were in the opposite direction. That pattern of results is consistent with the assumption that small movements of the shoulder girdle, which are associated with head turns and which cannot be prevented by mechanical immobilization of the shoulder, are propagated to the wheel, but is not consistent with previous suggestions that involuntary rotations of a steering device can result from the action of the tonic neck reflex. Effects that correspond to the pattern of the tonic neck reflex were found only when a spring-centered handlebar was held in an eccentric position; maintenance of the eccentric position was facilitated when the participant's head was turned in the opposite direction. The findings strongly suggest that head movements can result in involuntary movements of a steering device via different mechanisms.

11.
Several past studies have considered how perceived head orientation may be combined with perceived gaze direction in judging where someone else is attending. In three experiments we tested the impact of different sources of information by examining the role of head orientation in gaze-direction judgements when presenting: (a) the whole face; (b) the face with the nose masked; (c) just the eye region, removing all other head-orientation cues apart from some visible part of the nose; or (d) just the eyes, with all parts of the nose masked and no head orientation cues present other than those within the eyes themselves. We also varied time pressure on gaze direction judgements. The results showed that gaze judgements were not solely driven by the eye region. Gaze perception can also be affected by parts of the head and face, but in a manner that depends on the time constraints for gaze direction judgements. While "positive" congruency effects were found with time pressure (i.e., faster left/right judgements of seen gaze when the seen head deviated towards the same side as that gaze), the opposite applied without time pressure.

12.
Groups of 2-, 3-, and 4-month-olds were tested for dichotic ear differences in memory-based phonetic and music timbre discriminations. A right-ear advantage for speech and a left-ear advantage (LEA) for music were found in the 3- and 4-month-olds. However, the 2-month-olds showed only the music LEA, with no reliable evidence of memory-based speech discrimination by either hemisphere. Thus, the responses of all groups to speech contrasts were different from those to music contrasts, but the pattern of the response dichotomy in the youngest group deviated from that found in the older infants. It is suggested that the quality or use of left-hemisphere phonetic memory may change between 2 and 3 months, and that the engagement of right-hemisphere specialized memory for musical timbre may precede that of left-hemisphere phonetic memory. Several directions for future research are suggested to determine whether infant short-term memory asymmetries for speech and music are attributable to acoustic factors, to different modes or strategies in perception, or to structural and dynamic properties of natural sound sources.

13.
Self-motion perception and vestibulo-ocular reflex (VOR) were studied during whole body yaw rotation in the dark at different static head positions. Rotations consisted of four cycles of symmetric sinusoidal and asymmetric oscillations. Self-motion perception was evaluated by measuring the ability of subjects to manually track a static remembered target. VOR was recorded separately and the slow phase eye position (SPEP) was computed. Three different head static yaw deviations (active and passive) relative to the trunk (0°, 45° to right and 45° to left) were examined. Active head deviations had a significant effect during asymmetric oscillation: the movement perception was enhanced when the head was kept turned toward the side of body rotation and decreased in the opposite direction. Conversely, passive head deviations had no effect on movement perception. Further, vibration (100 Hz) of the neck muscles splenius capitis and sternocleidomastoideus remarkably influenced perceived rotation during asymmetric oscillation. On the other hand, SPEP of VOR was modulated by active head deviation, but was not influenced by neck muscle vibration. Through its effects on motion perception and reflex gain, head position improved gaze stability and enhanced self-motion perception in the direction of the head deviation.

14.
Dark vergence depends on the vertical direction of gaze; it decreases with raised gaze and increases with lowered gaze. The vertical direction of gaze can be varied by means of raising or lowering the eyes or by tilting the head forward or backward. The effects of the two manipulations on dark vergence are different. According to Heuer (1988), the effects of head tilt and eye inclination on dark vergence are almost, but not exactly, additive. In Exp. 1 the hypothesis of additive effects of gaze direction and eye inclination was tested and could not be rejected. The two additive hypotheses (head tilt and eye inclination vs. gaze inclination and eye inclination) result in different predictions for dark vergence with "compensatory" head and eye inclinations, which leave the direction of gaze in space invariant. In Exp. 2 it was shown that predictions from both hypotheses deviated from the observed values of dark vergence. Thus, neither of the two additive hypotheses provides exact predictions of dark vergence for all possible combinations of head tilt and eye inclination. For practical purposes the approximation might be sufficient. In particular, although mean dark vergence cannot be predicted exactly, individual differences can be predicted quite accurately.

15.
The effects of gaze direction on memory for faces were studied in children from three different age groups (6-7, 8-9, and 10-11 years old) using a computerized version of a task devised by Hood, Macrae, Cole-Davies and Dias (2003). Participants were presented with a sequence of faces in an encoding phase, and were then required to judge which faces they had previously encountered in a surprise two-alternative forced-choice recognition test. In one condition, stimulus eye gaze was either direct or deviated at the viewing phase, and eyes were closed at the test phase. In another condition, stimulus eyes were closed at the viewing phase, with either direct or deviated gaze at the test phase. Modulation of gaze direction affected hit rates, with participants demonstrating greater accuracy for direct gaze targets compared to deviated gaze targets in both conditions. Reaction times (RT) to correctly recognized stimuli were faster for direct gaze stimuli at the viewing phase, but not at the test phase. The age group of participants differentially affected these measures: there was a greater hit rate advantage for direct gaze stimuli in older children, although RTs were less affected by age. These findings suggest that while the facilitation of face recognition by gaze direction is robust across encoding and recognition stages, the efficiency of the process is affected by the stage at which gaze is modulated.

16.
A novel population of cells is described, located in the anterior part of the superior temporal sulcus (STSa, sometimes called STPa) of the temporal lobe in the macaque monkey. These cells respond selectively to the sight of reaching but only when the agent performing the action is seen to be attending to the target position of the reaching. We describe how such conditional selectivity can be generated from the properties of distinct cell populations within STSa. One cell population responds selectively to faces, eye gaze, and body posture, and we argue that subsets of these cells code for the direction of attention of others. A second cell population is selectively responsive to limb movement in certain directions (e.g., responding to an arm movement to the left but not to an equivalent leg movement or vice versa). The responses of a subset of cells sensitive to limb movement are modulated by the direction of attention (indicated by head and body posture of the agent performing the action). We conclude that this combined analysis of direction of attention and body movements supports the detection of intentional actions.

17.
Previous evidence suggests that children's mastery of prosodic modulations to signal the informational status of discourse referents emerges quite late in development. In the present study, we investigate children's use of head gestures, as compared with prosodic cues, to signal a referent as being contrastive relative to a set of possible alternatives. A group of French-speaking pre-schoolers were audio-visually recorded while playing in a semi-spontaneous but controlled production task, to elicit target words in the context of broad focus, contrastive focus, or corrective focus utterances. We analysed the acoustic features of the target words (syllable duration and word-level pitch range), as well as the head gesture features accompanying these target words (head gesture type, alignment patterns with speech). We found that children's production of head gestures, but not their use of either syllable duration or word-level pitch range, was affected by focus condition. Children mostly aligned head gestures with relevant speech units, especially when the target word was in phrase-final position. Moreover, the presence of a head gesture was linked to greater syllable duration patterns in all focus conditions. Our results show that (a) 4- and 5-year-old French-speaking children use head gestures rather than prosodic cues to mark the informational status of discourse referents, (b) the use of head gestures may gradually entrain the production of adult-like prosodic features, and (c) head gestures with no referential relation with speech may serve a linguistic structuring function in communication, at least during language development.

18.
Children and adults were tested on a forced-choice face recognition task in which the direction of eye gaze was manipulated over the course of the initial presentation and subsequent test phase of the experiment. To establish the effects of gaze direction on the encoding process, participants were presented with to-be-studied faces displaying either direct or deviated gaze (i.e. encoding manipulation). At test, all the faces depicted persons with their eyes closed. To investigate the effects of gaze direction on the efficiency of the retrieval process, a second condition (i.e. retrieval manipulation) was run in which target faces were presented initially with eyes closed and tested with either direct or deviated gaze. The results revealed that the encoding advantage enjoyed by faces with direct gaze was present for both children and adults. Faces with direct gaze were also recognized better than faces with deviated gaze at retrieval, although this effect was most pronounced for adults. Finally, the advantage for direct gaze over deviated gaze at encoding was greater than the advantage for direct gaze over deviated gaze at retrieval. We consider the theoretical implications of these findings.
