Similar Articles
 20 similar articles found (search time: 31 ms)
1.
Action can affect visual perception if the action's expected sensory effects resemble a concurrent unstable or deviant event. To determine whether action can also change auditory perception, participants were required to play pairs of octave-ambiguous tones by pressing successive keys on a piano or computer keyboard and to judge whether each pitch interval was rising or falling. Both pianists and nonpianist musicians gave significantly more “rising” responses when the order of key presses was left-to-right than when it was right-to-left, in accord with the pitch mapping of the piano. However, the effect was much larger in pianists. Pianists showed a similarly large effect when they passively observed the experimenter pressing keys on a piano keyboard, as long as the keyboard faced the participant. The results suggest that acquired action–effect associations can affect auditory perceptual judgement.

2.
Serial order is fundamental to perception, cognition and behavioral action. Three experiments investigated infants' perception, learning and discrimination of serial order. Four- and 8-month-old infants were habituated to three sequentially moving objects making visible and audible impacts and then were tested on separate test trials for their ability to detect auditory, visual or auditory-visual changes in their ordering. The 4-month-old infants did not respond to any order changes and instead appeared to attend to the 'local' audio-visual synchrony part of the event. When this local part of the event was blocked from view, the 4-month-olds did perceive the serial order feature of the event but only when it was specified multimodally. In contrast, the 8-month-old infants perceived all three kinds of order changes regardless of whether the synchrony part of the event was visible or not. The findings show that perception of spatiotemporal serial order emerges early in infancy and that its perception is initially facilitated by multimodal specification.

3.
Time duration is perceived to be longer when accompanied by dynamic sensory stimulation than when accompanied by static stimulation. This distortion of time perception is thought to be due to the acceleration of an internal pacemaker that has been assumed to be the main component of temporal judgments. In order to investigate whether the function of the internal pacemaker is modality dependent or independent, we examined the correlation of visual flicker and auditory flutter effects on a temporal production task. While seeing a 10-Hz visual flicker or hearing a 10-Hz auditory flutter, participants estimated a duration of 2500 ms as accurately as possible by pressing a button. The results showed a significant within-individual correlation between the time distortion due to visual flicker and that due to auditory flutter. Additionally, we found that time distortion due to auditory flutter tended to be larger in female participants than in male participants. These results suggest that the mechanisms underlying subjective time dilation are similar between vision and audition within individuals, but that they vary across individuals.
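The pacemaker-accumulator account invoked here can be sketched as a toy simulation: if dynamic stimulation multiplies the pacemaker rate, a participant who produces a target duration by accumulating a fixed pulse count will release the button earlier, which is what subjective time dilation predicts for a production task. The rates and the 10% speed-up below are illustrative assumptions, not estimates from the study.

```python
# Toy pacemaker-accumulator model for a temporal production task.
# All parameter values are illustrative assumptions.

BASE_RATE = 10.0  # pacemaker pulses per second at baseline (assumed)

def produced_duration(target_ms, rate_hz):
    """Time (ms) needed to accumulate the baseline pulse count for target_ms
    when the pacemaker runs at rate_hz."""
    target_pulses = BASE_RATE * (target_ms / 1000.0)  # count learned at baseline
    return 1000.0 * target_pulses / rate_hz

baseline = produced_duration(2500, BASE_RATE)            # matches the target
with_flicker = produced_duration(2500, BASE_RATE * 1.1)  # pacemaker sped up 10%

# A faster pacemaker fills the pulse count sooner, so the produced interval
# is shorter than 2500 ms: objective time is under-produced, i.e. dilated.
print(baseline, with_flicker)
```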

4.
Contrary to the predictions of established theory, Schutz and Lipscomb (2007) have shown that visual information can influence the perceived duration of concurrent sounds. In the present study, we deconstruct the visual component of their illusion, showing that (1) cross-modal influence depends on visible cues signaling an impact event (namely, a sudden change of direction concurrent with tone onset) and (2) the illusion is controlled primarily by the duration of post-impact motion. Other aspects of the post-impact motion—distance traveled, velocity, acceleration, and the rate of its change (i.e., its derivative, jerk)—play a minor role, if any. Together, these results demonstrate that visual event duration can influence the perception of auditory event duration, but only when stimulus cues are sufficient to give rise to the perception of a causal cross-modal relationship. This refined understanding of the illusion’s visual aspects is helpful in comprehending why it contrasts so markedly with previous research on cross-modal integration, demonstrating that vision does not appreciably influence auditory judgments of event duration (Walker & Scott, 1981).

5.
In musical performance, bodily gestures play an important role in communicating expressive intentions to audiences. Although previous studies have demonstrated that visual information can have an effect on the perceived expressivity of musical performances, the investigation of audiovisual interactions has been held back by the technical difficulties associated with the generation of controlled, mismatching stimuli. With the present study, we aimed to address this issue by utilizing a novel method in order to generate controlled, balanced stimuli that comprised both matching and mismatching bimodal combinations of different expressive intentions. The aim of Experiment 1 was to investigate the relative contributions of auditory and visual kinematic cues in the perceived expressivity of piano performances, and in Experiment 2 we explored possible crossmodal interactions in the perception of auditory and visual expressivity. The results revealed that although both auditory and visual kinematic cues contribute significantly to the perception of overall expressivity, the effect of visual kinematic cues appears to be somewhat stronger. These results also provide preliminary evidence of crossmodal interactions in the perception of auditory and visual expressivity. In certain performance conditions, visual cues had an effect on the ratings of auditory expressivity, and auditory cues had a small effect on the ratings of visual expressivity.

6.
Here, we investigate how audiovisual context affects perceived event duration with experiments in which observers reported which of two stimuli they perceived as longer. Target events were visual and/or auditory and could be accompanied by nontargets in the other modality. Our results demonstrate that the temporal information conveyed by irrelevant sounds is automatically used when the brain estimates visual durations but that irrelevant visual information does not affect perceived auditory duration (Experiment 1). We further show that auditory influences on subjective visual durations occur only when the temporal characteristics of the stimuli promote perceptual grouping (Experiments 1 and 2). Placed in the context of scalar expectancy theory of time perception, our third and fourth experiments have the implication that audiovisual context can lead both to changes in the rate of an internal clock and to temporal ventriloquism-like effects on perceived on- and offsets. Finally, intramodal grouping of auditory stimuli diminished any crossmodal effects, suggesting a strong preference for intramodal over crossmodal perceptual grouping (Experiment 5).

7.
Studies of time estimation have provided evidence that human time perception is determined by an internal clock containing a temporal oscillator and have also provided estimates of the frequency of this oscillator (Treisman, Faulkner, Naish, & Brogan, 1992; Treisman & Brogan, 1992). These estimates were based on the observation that when the intervals to be estimated are accompanied by auditory clicks that recur at certain critical rates, perturbations in time estimation occur. To test the hypothesis that the mechanisms that underlie the perception of time and those that control the timing of motor performance are similar, analogous experiments were performed on motor timing, with the object of seeing whether evidence for a clock would be obtained and if so whether its properties resemble those of the time perception clock. The prediction was made that perturbations in motor timing would be seen at the same or similar critical auditory click rates. The experiments examined choice reaction time and typing. The results support the hypothesis that a temporal oscillator paces motor performance and that this oscillator is similar to the oscillator underlying time perception. They also provide an estimate of the characteristic frequency of the oscillator.

8.
Ecological Psychology, 2013, 25(1): 53-56
Two questions have priority for a perception psychologist: What is perceived, and what is the information for it? What we perceive are the affordances of the world. Because perception is prospective and goes on over time, the information for affordances is in events, both external and within the perceiver. Hence, we must study perception of events if we would understand how affordances are perceived.

9.
Witt JK, Proffitt DR, Epstein W. Perception, 2004, 33(5): 577-590
Perceiving egocentric distance is not only a function of the optical variables to which it relates, but also a function of people's current physiological potential to perform intended actions. In a set of experiments, we showed that, as the effort associated with walking increases, perceived distance increases if the perceiver intends to walk the extent, but not if the perceiver intends to throw. Conversely, as the effort associated with throwing increases, perceived distance increases if people intend to throw to the target, but not if they intend to walk. Perceiving distance combines the geometry of the world with our behavior goals and the potential of our body to achieve these goals.

10.
The features of perceived objects are processed in distinct neural pathways, which call for mechanisms that integrate the distributed information into coherent representations (the binding problem). Recent studies of sequential effects have demonstrated feature binding not only in perception, but also across (visual) perception and action planning. We investigated whether comparable effects can be obtained in and across auditory perception and action. The results from two experiments revealed effects indicative of spontaneous integration of auditory features (pitch and loudness, pitch and location), as well as evidence for audio-manual stimulus-response integration. Even though integration takes place spontaneously, features related to task-relevant stimulus or response dimensions are more likely to be integrated. Moreover, integration seems to follow a temporal overlap principle, with features coded close in time being more likely to be bound together. Taken together, the findings are consistent with the idea of episodic event files integrating perception and action plans.

11.
After adaptation to a fixed temporal delay between actions and their sensory consequences, stimuli delivered during the delay are perceived to occur prior to actions. Temporal judgments are also influenced by the sensation of agency (experience of causing our own actions and their sensory consequences). Sensory consequences of voluntary actions are perceived to occur earlier in time than those of involuntary actions. However, it is unclear whether temporal order illusions influence the sensation of agency. Thus, we tested how the illusionary reversal of motor actions and sound events affects the sensation of agency. We observed an absence of the sensation of agency in the auditory modality in a condition in which sounds were falsely perceived as preceding motor acts relative to the perceived temporal order in the control condition. This finding suggests a strong association between the sensation of agency and the temporal order perception of actions and their consequences.

12.
Attended stimuli are perceived as occurring earlier than unattended stimuli. This phenomenon of prior entry is usually identified by a shift in the point of subjective simultaneity (PSS) in temporal order judgements (TOJs). According to its traditional psychophysical interpretation, the PSS coincides with the perception of simultaneity. This assumption is, however, questionable. Technically, the PSS represents the temporal interval between two stimuli at which the two alternative TOJs are equally likely. Thus it also seems possible that observers perceive not simultaneity, but uncertainty of temporal order. This possibility is supported by prior-entry studies, which find that perception of simultaneity is not very likely at the PSS. The present study tested the percept at the PSS in prior entry, using peripheral cues to orient attention. We found that manipulating attention caused varying temporal perceptions around the PSS. On some occasions observers perceived the two stimuli as simultaneous, but on others they were simply uncertain about the order in which they had been presented. This finding contradicts the implicit assumption of most models of temporal order perception, that perception of simultaneity inevitably results if temporal order cannot be discriminated.
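Operationally, the PSS described above is the point where a psychometric function fitted to TOJ data crosses 50%. A minimal sketch of that estimation step, using a logistic fit: the SOAs and response proportions are made-up illustrative numbers, not data from any of the studies listed here.

```python
# Estimating the PSS (and a JND) from temporal order judgment data by
# fitting a logistic psychometric function. Data below are invented
# for illustration only.
import numpy as np
from scipy.optimize import curve_fit

def logistic(soa, pss, slope):
    """P('visual first') as a function of SOA; the PSS is the 50% point."""
    return 1.0 / (1.0 + np.exp(-(soa - pss) / slope))

# SOA = auditory onset minus visual onset, in ms (positive: audio lags)
soas = np.array([-240, -120, -60, 0, 60, 120, 240], dtype=float)
p_visual_first = np.array([0.05, 0.20, 0.35, 0.55, 0.75, 0.90, 0.98])

(pss, slope), _ = curve_fit(logistic, soas, p_visual_first, p0=(0.0, 50.0))
jnd = slope * np.log(3)  # distance from the 50% to the 75% point

print(f"PSS = {pss:.1f} ms, JND = {jnd:.1f} ms")
```

Note that this fit only locates the point where the two responses are equally likely; as the abstract argues, it does not by itself tell us whether observers perceive simultaneity or mere order uncertainty there.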

13.
Vatakis, A. and Spence, C. (in press) [Crossmodal binding: Evaluating the 'unity assumption' using audiovisual speech stimuli. Perception & Psychophysics] recently demonstrated that when two briefly presented speech signals (one auditory and the other visual) refer to the same audiovisual speech event, people find it harder to judge their temporal order than when they refer to different speech events. Vatakis and Spence argued that the 'unity assumption' facilitated crossmodal binding on the former (matching) trials by means of a process of temporal ventriloquism. In the present study, we investigated whether the 'unity assumption' would also affect the binding of non-speech stimuli (video clips of object action or musical notes). The auditory and visual stimuli were presented at a range of stimulus onset asynchronies (SOAs) using the method of constant stimuli. Participants made unspeeded temporal order judgments (TOJs) regarding which modality stream had been presented first. The auditory and visual musical and object action stimuli were either matched (e.g., the sight of a note being played on a piano together with the corresponding sound) or else mismatched (e.g., the sight of a note being played on a piano together with the sound of a guitar string being plucked). However, in contrast to the results of Vatakis and Spence's recent speech study, no significant difference in the accuracy of temporal discrimination performance for the matched versus mismatched video clips was observed. Reasons for this discrepancy are discussed.

14.
The effects of viewing the face of the talker (visual speech) on the processing of clearly presented intact auditory stimuli were investigated using two measures likely to be sensitive to the articulatory motor actions produced in speaking. The aim of these experiments was to highlight the need for accounts of the effects of audio-visual (AV) speech that explicitly consider the properties of articulated action. The first experiment employed a syllable-monitoring task in which participants were required to monitor for target syllables within foreign carrier phrases. An AV effect was found in that seeing a talker's moving face (moving face condition) assisted in more accurate recognition (hits and correct rejections) of spoken syllables than of auditory-only still face (still face condition) presentations. The second experiment examined processing of spoken phrases by investigating whether an AV effect would be found for estimates of phrase duration. Two effects of seeing the moving face of the talker were found. First, the moving face condition had significantly longer duration estimates than the still face auditory-only condition. Second, estimates of auditory duration made in the moving face condition reliably correlated with the actual durations whereas those made in the still face auditory condition did not. The third experiment was carried out to determine whether the stronger correlation between estimated and actual duration in the moving face condition might have been due to generic properties of AV presentation. Experiment 3 employed the procedures of the second experiment but used stimuli that were not perceived as speech although they possessed the same timing cues as those of the speech stimuli of Experiment 2. It was found that simply presenting both auditory and visual timing information did not result in more reliable duration estimates. Further, when released from the speech context (used in Experiment 2), duration estimates for the auditory-only stimuli were significantly correlated with actual durations. In all, these results demonstrate that visual speech can assist in the analysis of clearly presented auditory stimuli in tasks concerned with information provided by viewing the production of an utterance. We suggest that these findings are consistent with there being a processing link between perception and action such that viewing a talker speaking will activate speech motor schemas in the perceiver.

15.
Speech perception, especially in noise, may be maximized if the perceiver observes the naturally occurring visual-plus-auditory cues inherent in the production of spoken language. Evidence is conflicting, however, about which aspects of visual information mediate enhanced speech perception in noise. For this reason, we investigated the relative contributions of audibility and the type of visual cue in three experiments in young adults with normal hearing and vision. Relative to static visual cues, access to the talker's phonetic gestures in speech production, especially in noise, was associated with (a) faster response times and greater sensitivity for speech understanding in noise, and (b) shorter latencies and reduced amplitudes of auditory N1 event-related potentials. Dynamic chewing facial motion also decreased the N1 latency, but only meaningful linguistic motions reduced the N1 amplitude. The hypothesis that auditory-visual facilitation is distinct to properties of natural, dynamic speech gestures was partially supported.

16.
We present a new auditory illusion, the gap transfer illusion, supported by phenomenological and psychophysical data. In a typical situation, an ascending frequency glide of 2,500 msec with a temporal gap of 100 msec in the middle and a continuously descending frequency glide of 500 msec cross each other at their central positions. These glides move at the same constant speed in logarithmic frequency in opposite directions. The temporal gap in the long glide is perceived as if it were in the short glide. The same kind of subjective transfer of a temporal gap can take place also when the stimulus pattern is reversed in time. This phenomenon suggests that onsets and terminations of glide components behave as if they were independent perceptual elements. We also find that when two long frequency glides are presented successively with a short temporal overlap, a long glide tone covering the whole duration of the pattern and a short tone around the temporal middle can be perceived. To account for these results, we propose an event construction model, in which perceptual onsets and terminations are coupled to construct auditory events and the proximity principle connects these elements.
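The stimulus geometry described above can be sketched numerically: a 2500-ms ascending glide with a 100-ms gap at its centre, crossed at that centre by a 500-ms descending glide moving at the same log-frequency speed. The start/end frequencies and sample rate below are illustrative assumptions (only the durations, the crossing, and the equal log-speed come from the abstract).

```python
# Sketch of a gap-transfer stimulus: two log-frequency glides that cross
# at their temporal midpoints. Frequencies and sample rate are assumed.
import numpy as np

SR = 44100  # samples per second (assumed)

def log_glide(f_start, f_end, dur_s, sr=SR):
    """Sine glide moving at constant speed in log frequency."""
    t = np.arange(int(dur_s * sr)) / sr
    freq = f_start * (f_end / f_start) ** (t / dur_s)  # geometric sweep
    phase = 2 * np.pi * np.cumsum(freq) / sr
    return np.sin(phase)

# Long ascending glide: 2 octaves in 2.5 s = 0.8 oct/s; midpoint is 800 Hz.
long_glide = log_glide(400.0, 1600.0, 2.5)
long_glide[int(1.2 * SR):int(1.3 * SR)] = 0.0  # 100-ms gap at the 1.25-s centre

# Short descending glide at the same 0.8 oct/s, also passing 800 Hz at
# its own midpoint, so the two glides cross in time-frequency space.
short_glide = log_glide(800.0 * 2 ** 0.2, 800.0 / 2 ** 0.2, 0.5)

mix = long_glide.copy()
start = int(1.0 * SR)  # centre the short glide on the long glide's midpoint
mix[start:start + short_glide.size] += short_glide
```

Writing `mix` to a sound file (e.g. with the standard-library `wave` module) would give an informal demonstration; the abstract reports that listeners hear the gap in the short glide rather than the long one.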

17.
The perception of transparency is highly dependent on luminance and perceived depth. An image region is seen as transparent if it is of intermediate luminance relative to adjacent image regions, and if it is perceived in front of another region and has a boundary which provides information that an object is visible through this region. Yet, transparency is not just the passive end-product of these required conditions. If perceived transparency is triggered, a number of seemingly more elemental perceptual primitives such as color, contour, and depth can be radically altered. Thus, with the perception of transparency, neon color spreading becomes apparent, depth changes, stereoscopic depth capture can be eliminated, and otherwise robust subjective contours can be abolished. In addition, we show that transparency is not coupled strongly to real-world chromatic constraints since combinations of luminance and color which would be unlikely to arise in real-world scenes still give rise to the perception of transparency. Rather than seeing transparency as a perceptual end-point, determined by seemingly more primitive processes, we interpret perceived transparency as much a 'cause', as an 'effect'. We speculate that the anatomical substrate for such mutual interaction may lie in cortical feed-forward connections which maintain modular segregation and cortical feedback connections which do not.

18.
Recently, Nakajima, ten Hoopen, and van der Wilk (1991) and Nakajima, ten Hoopen, Hilkhuysen, and Sasaki (1992) described what they believed to be a new illusion of auditory time perception. They reported that the perceived duration of a temporal interval was influenced by an immediately preceding or succeeding temporal interval. The influence of a neighboring temporal interval on perceived duration is not a new illusion, however, but is another demonstration of the time-order error.

19.
Vatakis A, Spence C. Perception, 2008, 37(1): 143-160
Research has shown that inversion is more detrimental to the perception of faces than to the perception of other types of visual stimuli. Inverting a face results in an impairment of configural information processing that leads to slowed early face processing and reduced accuracy when performance is tested in face recognition tasks. We investigated the effects of inverting speech and non-speech stimuli on audiovisual temporal perception. Upright and inverted audiovisual video clips of a person uttering syllables (experiments 1 and 2), playing musical notes on a piano (experiment 3), or a rhesus monkey producing vocalisations (experiment 4) were presented. Participants made unspeeded temporal-order judgments regarding which modality stream (auditory or visual) appeared to have been presented first. Inverting the visual stream did not have any effect on the sensitivity of temporal discrimination responses in any of the four experiments, thus implying that audiovisual temporal integration is resilient to the effects of orientation in the picture plane. By contrast, the point of subjective simultaneity differed significantly as a function of orientation only for the audiovisual speech stimuli but not for the non-speech stimuli or monkey calls. That is, smaller auditory leads were required for the inverted than for the upright-visual speech stimuli. These results are consistent with the longer processing latencies reported previously when human faces are inverted and demonstrate that the temporal perception of dynamic audiovisual speech can be modulated by changes in the physical properties of the visual speech (i.e., by changes in orientation).

20.
In a series of five experiments, we showed that the perception of temporal distance to a future event is shaped by the effort one must invest to realize the event. Studies 1a and 1b showed that when actors are faced with realizing an event by a certain deadline, more effortful events are perceived as closer in time, regardless of the objective temporal distance to the deadline. This negative relationship was reversed, however, when deadlines were absent (Study 2). Finally, priming high effort reduced perceived temporal distance to an event, whereas priming low effort increased perceived temporal distance to the event (Studies 3 and 4). The implications of these findings for models of temporal distance are discussed.
