Similar articles
20 similar articles retrieved (search time: 912 ms)
1.
The present study examined the effect of perceived motion-in-depth on temporal interval perception. We required subjects to estimate the length of a short empty interval starting from the offset of a first marker and ending with the onset of a second marker. The size of the markers was manipulated so that the subjects perceived a visual object as approaching or receding. We demonstrated that the empty interval between the markers was perceived as shorter when the object was perceived as approaching than when it was perceived as receding. We found, in addition, that the motion-in-depth effect disappeared when the shape continuity between the first and second markers was broken, or when the object approached but missed the face. We conclude that the anticipated collision of an approaching object altered the perception of an empty interval.

2.
Arao H, Suetomi D, Nakajima Y. Perception, 2000, 29(7): 819-830
The duration of a short empty time interval (typically shorter than 300 ms) is often underestimated when it is immediately preceded by a shorter time interval. This illusory underestimation, known as time-shrinking, had been studied only with auditory temporal patterns. In the present study, we examined whether similar underestimation would take place with visual temporal patterns. It turned out that underestimation of the same kind also takes place in the visual modality. However, a considerable difference between the auditory and the visual modalities appeared. In the auditory modality, it had been shown that the amount of underestimation decreased for preceding time intervals longer than 200 ms. In the present study, the underestimation increased when the preceding time interval varied from 160 to 400 ms. Furthermore, the differences between the two neighbouring intervals which could cause this underestimation had always been in a fixed range in the auditory modality. In the visual modality, the range was broader when the intervals were longer. These results were interpreted in terms of an assimilation process, in light of the processing-time hypothesis proposed by Nakajima (1987, Perception, 16, 485-520) to explain an aspect of empty-duration perception.

3.
Buchan JN, Munhall KG. Perception, 2011, 40(10): 1164-1182
Conflicting visual speech information can influence the perception of acoustic speech, causing an illusory percept of a sound not present in the actual acoustic speech (the McGurk effect). We examined whether participants can voluntarily selectively attend to either the auditory or visual modality by instructing participants to pay attention to the information in one modality and to ignore competing information from the other modality. We also examined how performance under these instructions was affected by weakening the influence of the visual information by manipulating the temporal offset between the audio and video channels (experiment 1), and the spatial frequency information present in the video (experiment 2). Gaze behaviour was also monitored to examine whether attentional instructions influenced the gathering of visual information. While task instructions did have an influence on the observed integration of auditory and visual speech information, participants were unable to completely ignore conflicting information, particularly information from the visual stream. Manipulating temporal offset had a more pronounced interaction with task instructions than manipulating the amount of visual information. Participants' gaze behaviour suggests that the attended modality influences the gathering of visual information in audiovisual speech perception.

4.
Whereas the visual modality tends to dominate over the auditory modality in bimodal spatial perception, the auditory modality tends to dominate over the visual modality in bimodal temporal perception. Recent results suggest that the visual modality dominates bimodal spatial perception because spatial discriminability is typically greater for the visual than for the auditory modality; accordingly, visual dominance is eliminated or reversed when visual-spatial discriminability is reduced by degrading visual stimuli to be equivalent or inferior to auditory spatial discriminability. Thus, for spatial perception, the modality that provides greater discriminability dominates. Here, we ask whether auditory dominance in duration perception is similarly explained by factors that influence the relative quality of auditory and visual signals. In contrast to the spatial results, the auditory modality dominated over the visual modality in bimodal duration perception even when the auditory signal was clearly weaker, when the auditory signal was ignored (i.e., the visual signal was selectively attended), and when the temporal discriminability was equivalent for the auditory and visual signals. Thus, unlike spatial perception, where the modality carrying more discriminable signals dominates, duration perception seems to be mandatorily linked to auditory processing under most circumstances.

5.
Two experiments were performed under visual-only and visual-auditory discrepancy conditions (dubs) to assess observers’ abilities to read speech information on a face. In the first experiment, identification and multiple-choice testing were used. In addition, the relation between visual and auditory phonetic information was manipulated and related to perceptual bias. In the second experiment, the “compellingness” of the visual-auditory discrepancy as a single speech event was manipulated. Subjects also rated the confidence they had that their perception of the lipped word was accurate. Results indicated that competing visual information exerted little effect on auditory speech recognition, but visual speech recognition was substantially interfered with when discrepant auditory information was present. The extent of auditory bias was found to be related to the abilities of observers to read speech under nondiscrepancy conditions, the magnitude of the visual-auditory discrepancy, and the compellingness of the visual-auditory discrepancy as a single event. Auditory bias during speech was found to be a moderately compelling conscious experience, and not simply a case of confused responding or guessing. Results were discussed in terms of current models of perceptual dominance and related to results from modality discordance during space perception.

6.
Acta Psychologica, 2013, 143(1): 20-34
Both vision and touch yield comparable results in terms of roughness estimation of familiar textures, as was shown in earlier studies. To our knowledge, no research has been conducted on the effect of sensory familiarity with the stimulus material on roughness estimation of unfamiliar textures. The influence of sensory modality and familiarity on roughness perception of dot-pattern textures was investigated in a series of five experiments. Participants estimated the roughness of textures varying in mean center-to-center dot spacing in experimental conditions providing visual, haptic, and combined visual-haptic information. The findings indicate that roughness perception of unfamiliar dot-pattern textures is well described by a bi-exponential function of inter-dot spacing, regardless of the sensory modality used. However, sensory modality appears to affect the maximum of the psychophysical roughness function, with visually perceived roughness peaking at a smaller inter-dot spacing than haptic roughness. We propose that this might be due to the better spatial acuity of the visual modality. Individuals appeared to use different visual roughness-estimation strategies depending on their first sensory experience (visual vs. haptic) with the stimulus material, primarily in an experimental context which required the combination of visual and haptic information in a single bimodal roughness estimate. Furthermore, the similarity of findings in experimental settings using real and virtual visual textures indicates the suitability of the experimental setup for neuroimaging studies, creating a more direct link between behavioral and neuroimaging results.
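The bi-exponential shape described in this abstract can be illustrated with a small sketch. This is not the authors' fitted model: the specific functional form below (a peaked difference of exponentials) and every parameter value are assumptions, chosen only to show how smaller time constants shift the roughness maximum toward smaller inter-dot spacings, as reported for vision.

```python
import numpy as np

def biexp_roughness(spacing, amplitude, tau_rise, tau_fall):
    """Hypothetical peaked bi-exponential roughness function of inter-dot
    spacing: rises on the scale of tau_rise and decays on the scale of
    tau_fall (requires tau_fall > tau_rise)."""
    return amplitude * (np.exp(-spacing / tau_fall) - np.exp(-spacing / tau_rise))

def peak_spacing(tau_rise, tau_fall):
    """Spacing where dR/ds = 0, i.e. where perceived roughness peaks."""
    return (tau_rise * tau_fall / (tau_fall - tau_rise)) * np.log(tau_fall / tau_rise)

# With smaller (hypothetical) constants for vision than for touch, the visual
# roughness maximum falls at a smaller inter-dot spacing, as in the abstract.
visual_peak = peak_spacing(tau_rise=0.5, tau_fall=2.0)   # ~0.92
haptic_peak = peak_spacing(tau_rise=1.0, tau_fall=4.0)   # ~1.85
```

The closed-form peak location follows from setting the derivative of the difference of exponentials to zero; any parameterization with the same ordering of time constants reproduces the qualitative visual-before-haptic peak.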

7.
Perceptual learning was used to study potential transfer effects in a duration discrimination task. Subjects were trained to discriminate between two empty temporal intervals marked with auditory beeps, using a two-alternative forced-choice paradigm. The major goal was to examine whether perceptual learning would generalize to empty intervals that have the same duration but are marked by visual flashes. The experiment also included longer intervals marked with auditory beeps and filled auditory intervals of the same duration as the trained interval, in order to examine whether perceptual learning would generalize to these conditions within the same sensory modality. In contrast to previous findings showing a transfer from the haptic to the auditory modality, the present results do not indicate a transfer from the auditory to the visual modality; but they do show transfers within the auditory modality.

8.
Gamache PL, Grondin S. Perception, 2010, 39(11): 1431-1451
To further explore how memory influences time judgments, we conducted two experiments on the lifespan of temporal representations in memory. Penney et al (2000, Journal of Experimental Psychology: Human Perception and Performance, 26, 1770-1787) reported that the perceived duration of auditorily and visually marked intervals differs only when both marker-type intervals are compared directly. This finding can be explained by a 'memory-mixing' process, whereby the memory trace of previous intervals influences the perception of upcoming ones, which are then added to the memory content. In the experiments discussed here, we manipulated the mixing mode of auditory/visual signal presentations. In experiment 1, signals from the same modality were either grouped by blocks or randomised within blocks. The results showed that the auditory/visual difference decreased but remained present when modalities were grouped by blocks. In experiment 2, we used a line-segmentation task. The results showed that, after a training block was performed in one modality, the perceived duration of signals from the other modality was distorted for at least 30 trials and that the magnitude of the difference decreased as the block went on. The results of both experiments highlight the influence of memory on time judgments, providing empirical support to, and quantitative portrayal of, the memory-mixing process.
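The memory-mixing process can be sketched as a toy running-average model. This is an illustration, not Penney et al's or the authors' actual model; the mixing weight `w`, the update rate `alpha`, and the 600 ms / 400 ms durations are all hypothetical values chosen to reproduce the qualitative pattern of a carry-over that decays over trials.

```python
def memory_mixing(durations, w=0.3, alpha=0.1, memory=None):
    """Toy memory-mixing model of duration judgments: each percept is a
    weighted mix of the physical duration and a running memory of earlier
    percepts, and the memory is then updated with the new percept.
    All parameters are hypothetical illustration values."""
    perceived = []
    for d in durations:
        if memory is None:
            memory = d                               # first trial: nothing to mix with
        p = (1.0 - w) * d + w * memory               # percept pulled toward memory
        memory = (1.0 - alpha) * memory + alpha * p  # memory absorbs the new percept
        perceived.append(p)
    return perceived

# Carry-over from a training block of 600 ms signals into a block of 400 ms
# signals: early percepts are biased toward 600 ms, and the bias shrinks
# trial by trial, as in the second experiment's qualitative pattern.
transfer = memory_mixing([400.0] * 40, memory=600.0)
```

With these settings the first percept is 0.7·400 + 0.3·600 = 460 ms, and the series decays monotonically toward the physical 400 ms as the memory trace is overwritten.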

9.
Accuracy of temporal coding: Auditory-visual comparisons
Three experiments were designed to decide whether temporal information is coded more accurately for intervals defined by auditory events or for those defined by visual events. In the first experiment, the irregular-list technique was used, in which a short list of items was presented, the items all separated by different interstimulus intervals. Following presentation, the subject was given three items from the list, in their correct serial order, and was asked to judge the relative interstimulus intervals. Performance was indistinguishable whether the items were presented auditorily or visually. In the second experiment, two unfilled intervals were defined by three nonverbal signals in either the auditory or the visual modality. After delays of 0, 9, or 18 sec (the latter two filled with distractor activity), the subjects were directed to make a verbal estimate of the length of one of the two intervals, which ranged from 1 to 4 sec and from 10 to 13 sec. Again, performance was not dependent on the modality of the time markers. The results of Experiment 3, which was procedurally similar to Experiment 2 but with filled rather than empty intervals, showed significant modality differences in one measure only. Within the range of intervals employed in the present study, our results provide, at best, only modest support for theories that predict more accurate temporal coding in memory for auditory, rather than visual, stimulus presentation.

10.
It is not clear what role visual information plays in the development of space perception. It has previously been shown that, in the absence of vision, both the ability to judge orientation in the haptic modality and the ability to bisect intervals in the auditory modality are severely compromised (Gori, Sandini, Martinoli & Burr, 2010; Gori, Sandini, Martinoli & Burr, 2014). Here we report, for the first time, a strong deficit in proprioceptive reproduction and auditory distance evaluation in early-blind children and adults as well. Interestingly, the deficit is not present in a small group of adults with acquired visual disability. Our results support the idea that, in the absence of vision, the auditory and proprioceptive spatial representations may be delayed or drastically weakened owing to the lack of visual calibration of the auditory and haptic modalities during the critical period of development.

11.
Six experienced orienteers were given a VO2max treadmill test two days prior to undertaking two tests of visual perception. One test was conducted while the subjects were in a rested state, while the other was conducted while they were in a state of fatigue. Fatigue was defined as a state in which the subjects were working at or above their anaerobic threshold, which had been determined previously from their VO2max test. The tests in both the fatigue and rest conditions were of a similar nature: the subjects were presented with slides of orienteering checkpoints at regular intervals, followed by a slide showing a set of questions which the subjects had to answer verbally. Two sets of slides were employed, and these were approximately counterbalanced between both subjects and conditions. Points were awarded for correct answers, and the two conditions were then compared. The Wilcoxon test for two correlated samples showed a significant difference between the fatigue and rest scores at p < 0.05. The data suggest that, under the influence of fatigue, an orienteer's ability to perceive visual information is greatly impaired.

12.
Long-term memory of haptic, visual, and cross-modality information was investigated. In Experiment 1, subjects briefly explored 40 commonplace objects visually or haptically and then received a recognition test with categorically similar foils in the same or the alternative modality both immediately and after 1 week. Recognition was best for visual input and test, with haptic memory still apparent after a week's delay. Recognition was poorest in the cross-modality conditions, with performance on the haptic-visual and visual-haptic cross-modal conditions being nearly identical. Visual and haptic information decayed at similar rates across a week delay. In Experiment 2, subjects simultaneously viewed and handled the same objects, and transfer was tested in a successive cue-modality paradigm. Performance with the visual modality again exceeded that with the haptic modality. Furthermore, initial errors on the haptic test were often corrected when followed by the visual presentation, both immediately and after 1 week. However, visual test errors were corrected by haptic cuing on the immediate test only. These results are discussed in terms of shared information between the haptic and visual modalities, and the ease of transfer between these modalities immediately and after a substantial delay.

13.
Anorthoscopic perception refers to the phenomenon in which an observer forms a complete percept of a stimulus pattern even though only part of the pattern is visible at any given moment. The key to anorthoscopic perception is the integration of information across a temporal sequence. According to the dynamic interaction model of visual perception and visual working memory, this integration requires the involvement of visual working memory. Previous research, however, has focused on the perceptual-processing characteristics of the phenomenon and has not addressed the role of visual working memory in it. Using a dual-task paradigm, we asked participants to perform anorthoscopic perception of regular or irregular stimulus patterns while manipulating working-memory load, in order to examine the effect of memory load on anorthoscopic perception. The results showed that high memory load significantly reduced anorthoscopic perception performance for irregular stimuli (Experiment 1), and that this effect was not attributable to the influence of memory load on the maintenance stage of the anorthoscopic stimuli (Experiment 2). These results indicate that anorthoscopic perception, at least of irregular figures, requires the involvement of visual working memory.

14.
The present experiment assessed intersensory differences in temporal judgments, that is, whether auditory stimuli are perceived as longer than physically equivalent visual stimuli. The results confirmed the intersensory difference. Auditorily defined intervals were experienced as longer than visually defined intervals, and auditory boundaries were perceived as longer than visual ones. An interaction of boundary modality and interval modality was obtained, which suggested that auditorily defined intervals provided more temporal information about events occurring in close temporal proximity than visually defined intervals. It was hypothesized that cognitive factors, specifically stimulus complexity, would affect the auditory and visual systems differentially. This hypothesis was not substantiated, although highly complex stimuli were experienced as longer than those of low complexity.

15.
Various studies have demonstrated an advantage of auditory over visual text modality when learning with texts and pictures. To explain this modality effect, two complementary assumptions are proposed by cognitive theories of multimedia learning: first, the visuospatial load hypothesis, which explains the modality effect in terms of visuospatial working memory overload in the visual text condition; and second, the temporal contiguity assumption, according to which the modality effect occurs because solely auditory texts and pictures can be attended to simultaneously. The latter explanation applies only to simultaneous presentation, the former to both simultaneous and sequential presentation. This paper introduces a third explanation, according to which parts of the modality effect are due to early, sensory processes. This account predicts that, for texts longer than one sentence, the modality effect with sequential presentation is restricted to the information presented most recently. Two multimedia experiments tested the influence of text modality across three different conditions: simultaneous presentation of texts and pictures versus sequential presentation versus presentation of text only. Text comprehension and picture recognition served as dependent variables. An advantage for auditory texts was restricted to the most recent text information and occurred under all presentation conditions. With picture recognition, the modality effect was restricted to the simultaneous condition. These findings clearly support the idea that the modality effect can be attributed to early processes in perception and sensory memory rather than to a working memory bottleneck.

16.
Garvill, J. & Molander, B. Effects of standard modality, comparison modality and retention interval on matching of form. Scand. J. Psychol., 1973, 14, 203–206. Intra-modal and cross-modal matching of nonsense forms was studied in a 2 (standard modality: visual vs. tactual) by 2 (comparison modality: visual vs. tactual) by 3 (interval between standard and comparison: 1, 10, and 30 sec) factorial experiment. The errors were divided into false negatives and false positives. Significant effects of standard modality and of comparison modality were found for false negatives. For false positives, the most prominent effect was an interaction between the standard modality and the comparison modality. Retention interval had no effect in any of the modality conditions. The effects are discussed in terms of differential information-processing capacity for the visual and the tactual modalities.

17.
Experiments 1 and 2 compared, with a single-stimulus procedure, the discrimination of filled and empty intervals in both auditory and visual modalities. In Experiment 1, in which intervals were about 250 msec, the discrimination was superior with empty intervals in both modalities. In Experiment 2, with intervals lasting about 50 msec, empty intervals showed superior performance with visual signals only. In Experiment 3, for the auditory modality at 250 msec, the discrimination was easier with empty intervals than with filled intervals with both the forced-choice (FC) and the single-stimulus (SS) modes of presentation, and the discrimination was easier with the FC than with the SS method. Experiment 4, however, showed that at 50 and 250 msec, with an FC-adaptive procedure, there were no differences between filled and empty intervals in the auditory mode; the differences observed with the visual mode in Experiments 1 and 2 remained significant. Finally, Experiment 5 compared differential thresholds for four marker-type conditions, filled and empty intervals in the auditory and visual modes, for durations ranging from .125 to 4 sec. The results showed (1) that the differential threshold differences among marker types are important for short durations but decrease with longer durations, and (2) that a generalized Weber’s law generally holds for these conditions. The results as a whole are discussed in terms of timing mechanisms.
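A generalized Weber's law of the kind referred to in this abstract is commonly written with a duration-independent noise term added to the classic proportional term. The sketch below uses that standard form as an illustration only; the constants `k` and `c` are hypothetical, not the thresholds measured in the experiments.

```python
import math

def difference_threshold(t, k, c):
    """Generalized Weber's law: threshold = sqrt((k*t)**2 + c**2), a Weber
    fraction k plus a duration-independent component c (sec); here c stands
    in for marker-type noise. Hypothetical illustration constants."""
    return math.sqrt((k * t) ** 2 + c ** 2)

def weber_fraction(t, k, c):
    """Relative threshold; approaches the constant k as t grows."""
    return difference_threshold(t, k, c) / t

# A noisier marker type (larger c) differs strongly from a cleaner one at
# 0.125 s, but the two Weber fractions nearly converge by 4 s, matching the
# pattern that marker-type differences matter mainly at short durations.
short_gap = weber_fraction(0.125, 0.05, 0.02) - weber_fraction(0.125, 0.05, 0.005)
long_gap = weber_fraction(4.0, 0.05, 0.02) - weber_fraction(4.0, 0.05, 0.005)
```

Because the constant term is divided by t in the Weber fraction, its influence vanishes at long durations, which is why the marker-type differences shrink there under this form of the law.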

18.
This article provides a selective review of time perception research, mainly focusing on the authors' research. Aspects of psychological time include simultaneity, successiveness, temporal order, and duration judgments. In contrast to findings at interstimulus intervals or durations less than 3.0–5.0 s, there is little evidence for an “across-senses” effect of perceptual modality (visual vs. auditory) at longer intervals or durations. In addition, the flow of time (events) is a pervasive perceptual illusion, and we review evidence on that. Some temporal information is encoded relatively automatically into memory: People can judge time-related attributes such as recency, frequency, temporal order, and duration of events. Duration judgments in prospective and retrospective paradigms reveal differences between them, as well as variables that moderate the processes involved. An attentional-gate model is needed to account for prospective judgments, and a contextual-change model is needed to account for retrospective judgments.

19.
The effect of practice on the organization and the rate of identification of temporal patterns was investigated. The patterns involved a set of eight dichotomous left-right elements, repeated without interruption until the subject was able to identify the patterns. The patterns were presented in the auditory, tactual, or visual modalities, or in pairs of these modalities. In some conditions, all the pattern elements were presented in one modality; in other conditions, four elements were presented in one modality and the remaining four elements were presented in the second modality.

The results demonstrated that when all pattern elements were presented in one modality, naive subjects organized the sequence into a well-structured pattern; practiced subjects organized the sequence into a pattern beginning at the starting element. These patterns may be poorly structured. When four pattern elements were presented in each of two modalities, naive subjects organized the elements in each modality separately; practiced subjects disregarded the modality structure and organized the sequence into a well-structured pattern.

These changes in organization suggest a hierarchy of perceptual modes: perception by modality (i.e., by sensations) is least complex, perception by pattern structure is intermediate, and perception by start point is most complex. Changes in the rate of pattern identification confirm this hierarchy. Furthermore, the changes in organization and identification found for highly practiced patterns were also found for novel patterns.

20.
There is evidence that for both auditory and visual speech perception, familiarity with the talker facilitates speech recognition. Explanations of these effects have concentrated on the retention of talker information specific to each of these modalities. It could be, however, that some amodal, talker-specific articulatory-style information facilitates speech perception in both modalities. If this is true, then experience with a talker in one modality should facilitate perception of speech from that talker in the other modality. In a test of this prediction, subjects were given about 1 hr of experience lipreading a talker and were then asked to recover speech in noise from either this same talker or a different talker. Results revealed that subjects who lip-read and heard speech from the same talker performed better on the speech-in-noise task than did subjects who lip-read from one talker and then heard speech from a different talker.


Copyright©北京勤云科技发展有限公司  京ICP备09084417号