Similar Articles
A total of 20 similar articles were found.
1.
王沛  苏洁 《心理科学》2007,30(6):1497-1499
Music perception involves complex neural activity underlying acoustic analysis, auditory memory, auditory scene analysis, and the processing of musical syntax and semantics. It also potentially affects emotion, the autonomic nervous system, and the hormonal and immune systems, and activates (pre)motor representations. Over the past few years, research on music processing and its neural correlates has progressed rapidly, most notably in the areas of neurocognitive models of music perception, the processing of musical syntax and semantics, and bodily responses to music.

2.
Towards a neural basis of music perception
Music perception involves complex brain functions underlying acoustic analysis, auditory memory, auditory scene analysis, and processing of musical syntax and semantics. Moreover, music perception potentially affects emotion, influences the autonomic nervous system, the hormonal and immune systems, and activates (pre)motor representations. During the past few years, research activities on different aspects of music processing and their neural correlates have rapidly progressed. This article provides an overview of recent developments and a framework for the perceptual side of music processing. This framework lays out a model of the cognitive modules involved in music perception, and incorporates information about the time course of activity of some of these modules, as well as research findings about where in the brain these modules might be located.

3.
We present a case that differs in many respects from other documented cases of auditory agnosia, including the mechanism of injury, the age of the individual, and the location of the neurological insult. The clinical presentation is one of disturbance in the perception of spoken language, music, pitch, emotional prosody, and temporal auditory processing in the absence of significant deficits in the comprehension of written language, expressive language production, or peripheral auditory function. Furthermore, the patient demonstrates relatively preserved function in other aspects of audition such as sound localization, voice recognition, and perception of animal noises and environmental sounds. This case study demonstrates that auditory agnosia is possible following traumatic brain injury in a child, and illustrates the necessity of assessment with a wide variety of auditory stimuli to fully characterize auditory agnosia in a single individual.

4.
People naturally dance to music, and research has shown that rhythmic auditory stimuli facilitate production of precisely timed body movements. If motor mechanisms are closely linked to auditory temporal processing, just as auditory temporal processing facilitates movement production, producing action might reciprocally enhance auditory temporal sensitivity. We tested this novel hypothesis with a standard temporal-bisection paradigm, in which the slope of the temporal-bisection function provides a measure of temporal sensitivity. The bisection slope for auditory time perception was steeper when participants initiated each auditory stimulus sequence via a keypress than when they passively heard each sequence, demonstrating that initiating action enhances auditory temporal sensitivity. This enhancement is specific to the auditory modality, because voluntarily initiating each sequence did not enhance visual temporal sensitivity. A control experiment ruled out the possibility that tactile sensation associated with a keypress increased auditory temporal sensitivity. Taken together, these results demonstrate a unique reciprocal relationship between auditory time perception and motor mechanisms. As auditory perception facilitates precisely timed movements, generating action enhances auditory temporal sensitivity.
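In a temporal-bisection task of this kind, sensitivity is typically quantified by fitting a psychometric function to the proportion of "long" responses across probe durations; a steeper function indicates finer temporal sensitivity. The Python sketch below illustrates one way such slopes might be estimated; the durations, response proportions, and condition labels are hypothetical and not data from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(d, pse, slope):
    """Probability of a 'long' response as a function of probe duration d (ms)."""
    return 1.0 / (1.0 + np.exp(-slope * (d - pse)))

# Hypothetical bisection data: probe durations and proportions of 'long' responses
durations = np.array([200, 280, 360, 440, 520, 600])
p_long_active = np.array([0.03, 0.10, 0.35, 0.70, 0.92, 0.98])   # self-initiated sequences
p_long_passive = np.array([0.08, 0.20, 0.38, 0.62, 0.82, 0.93])  # passively heard sequences

for label, p in [("active", p_long_active), ("passive", p_long_passive)]:
    (pse, slope), _ = curve_fit(logistic, durations, p, p0=[400, 0.01])
    print(f"{label}: PSE = {pse:.0f} ms, slope = {slope:.4f}")
# A steeper bisection slope corresponds to finer auditory temporal sensitivity.
```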

5.
Salient auditory stimuli (e.g., music or sound effects) are commonly used in advertising to elicit attention. However, issues related to the effectiveness of such stimuli are not well understood. This research examines the ability of a salient auditory stimulus, in the form of a contrast interval (CI), to enhance recall of message-related information. Researchers have argued that the effectiveness of the CI is a function of the temporal duration between the onset and offset of the change in the background stimulus and the nature of this stimulus. Three experiments investigate these propositions and indicate that recall is enhanced provided the CI is 3 s or less. Information highlighted with silence is recalled better than information highlighted with music.

6.
The links between music and human movement have been shown to provide insight into crucial aspects of human perception, cognition, and sensorimotor systems. In this study, we examined the influence of music on movement during standstill, aiming to further characterize the correspondences between movement, music, and perception by analyzing head-sway fractality. Eighty-seven participants were asked to stand as still as possible for 500 seconds while being presented with alternating silence and audio stimuli. The audio stimuli were all rhythmic in nature, ranging from a metronome track to complex electronic dance music. The head position of each participant was captured with an optical motion capture system. Long-range correlations of head movement were estimated by detrended fluctuation analysis (DFA). Results agree with previous work on the movement-inducing effect of music, showing significantly greater head sway and lower head-sway fractality during the music stimuli. In addition, patterns across stimuli suggest a two-way adaptation process, with musical stimuli influencing head sway while, at the same time, fractality modulated movement responses. Results indicate that fluctuations in head movement in both conditions exhibit long-range correlations, suggesting that the effects of music on head movement depended not only on the most recently measured intervals, but also on the values of those intervals at distant times.
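Detrended fluctuation analysis estimates long-range correlations by integrating the mean-centred signal, removing a local linear trend within windows of increasing size, and measuring how the residual root-mean-square fluctuation grows with window size; the slope of that relation on log-log axes is the scaling exponent. A minimal Python sketch of the procedure follows, assuming a one-dimensional head-position trace; the exact implementation and parameters used in the study are not given in the abstract.

```python
import numpy as np

def dfa(signal, scales):
    """Detrended fluctuation analysis (first order); returns the scaling exponent.

    Illustrative implementation for a 1-D position trace; not the study's own code.
    """
    profile = np.cumsum(signal - np.mean(signal))  # integrated, mean-centred profile
    flucts = []
    for s in scales:
        n_windows = len(profile) // s
        rms = []
        for i in range(n_windows):
            segment = profile[i * s:(i + 1) * s]
            x = np.arange(s)
            trend = np.polyval(np.polyfit(x, segment, 1), x)  # local linear detrend
            rms.append(np.sqrt(np.mean((segment - trend) ** 2)))
        flucts.append(np.mean(rms))
    alpha, _ = np.polyfit(np.log(scales), np.log(flucts), 1)  # slope on log-log axes
    return alpha

# Example: alpha near 0.5 for white noise, near 1.5 for Brownian motion
rng = np.random.default_rng(0)
print(dfa(rng.standard_normal(5000), scales=[16, 32, 64, 128, 256]))
```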

7.
Phillips-Silver and Trainor (Phillips-Silver, J., Trainor, L.J., (2005). Feeling the beat: movement influences infants' rhythm perception. Science, 308, 1430) demonstrated an early cross-modal interaction between body movement and auditory encoding of musical rhythm in infants. Here we show that the way adults move their bodies to music influences their auditory perception of the rhythm structure. We trained adults, while listening to an ambiguous rhythm with no accented beats, to bounce by bending their knees to interpret the rhythm either as a march or as a waltz. At test, adults identified as similar an auditory version of the rhythm pattern with accented strong beats that matched their previous bouncing experience in comparison with a version whose accents did not match. In subsequent experiments we showed that this effect does not depend on visual information, but that movement of the body is critical. Parallel results from adults and infants suggest that the movement-sound interaction develops early and is fundamental to music processing throughout life.

8.
Human perception of visual stimuli near the sensory threshold is not always consistent. To investigate this perceptual variability and its neural mechanisms, some researchers have focused on how spontaneous prestimulus alpha oscillations (8-13 Hz) influence visual perception. Recent studies have found that a decrease in prestimulus alpha power increases participants' detection hit rate but does not improve perceptual precision, whereas the phase of prestimulus alpha oscillations predicts whether a stimulus will be successfully detected. Prestimulus alpha power is thought to regulate the baseline activity level of visual cortex: lower alpha power reflects enhanced baseline cortical activity, which in turn raises the detection rate for weak stimuli. Prestimulus alpha phase, by contrast, is thought to regulate the timing of cortical excitation and inhibition: the state of the brain (excited vs. inhibited) at the moment the stimulus appears determines the final percept.
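Prestimulus alpha power and phase of the kind described above are commonly obtained by band-pass filtering the EEG around 8-13 Hz and taking the Hilbert envelope and phase in a window before stimulus onset. The Python sketch below illustrates that generic approach on a synthetic single-channel trace; it is not the pipeline of the cited studies, and the sampling rate, window length, and filter order are assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def prestimulus_alpha(eeg, fs, stim_onset_s, window_s=0.5):
    """Estimate prestimulus alpha (8-13 Hz) power and phase at stimulus onset.

    Minimal single-channel sketch; parameter choices are illustrative assumptions.
    """
    b, a = butter(4, [8, 13], btype="bandpass", fs=fs)
    alpha = filtfilt(b, a, eeg)          # zero-phase band-pass around the alpha band
    analytic = hilbert(alpha)            # analytic signal: envelope and instantaneous phase
    onset = int(stim_onset_s * fs)
    start = onset - int(window_s * fs)
    power = np.mean(np.abs(analytic[start:onset]) ** 2)  # prestimulus alpha power
    phase_at_onset = np.angle(analytic[onset])           # alpha phase at stimulus onset
    return power, phase_at_onset

fs = 500
t = np.arange(0, 2, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.default_rng(1).standard_normal(t.size)
print(prestimulus_alpha(eeg, fs, stim_onset_s=1.5))
```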

9.
An attempt is made to study the impact of visual information on the perception of music by employing (rock) music videos as stimuli. Forty music videos were presented to judges, who either saw the video or only heard the respective pieces of music. They had to judge the emotions conveyed via the piece of music/video on scales and their overall impression on semantic differential scales. Results indicate that visual information as presented in music videos has considerable effects on impressions: When pieces of music are presented as music videos, more positive emotions are attributed, while presentation of the pieces of music alone resulted in more negative emotion attributions. Thus, music videos seem to "euphorize" the recipient. Furthermore, video presentation compared to presentation of music alone evoked more intense "complexity/interest" as well as "activity" judgments, while "evaluation" judgments were not influenced by the medium of presentation. In addition, a number of presentation factors (like the speed of the music or the number of cuts in the videos) do influence impressions. This leads to the conclusion that researchers should pay more attention to such microcharacteristics of stimuli. In general, the effects found to be due to the medium of presentation are independent of the effects on judgments due to content (i.e., presentation factors).

10.
McCotter MV, Jordan TR. Perception, 2003, 32(8): 921-936
We conducted four experiments to investigate the role of colour and luminance information in visual and audiovisual speech perception. In experiments 1a (stimuli presented in quiet conditions) and 1b (stimuli presented in auditory noise), face display types comprised naturalistic colour (NC), grey-scale (GS), and luminance inverted (LI) faces. In experiments 2a (quiet) and 2b (noise), face display types comprised NC, colour inverted (CI), LI, and colour and luminance inverted (CLI) faces. Six syllables and twenty-two words were used to produce auditory and visual speech stimuli. Auditory and visual signals were combined to produce congruent and incongruent audiovisual speech stimuli. Experiments 1a and 1b showed that perception of visual speech, and its influence on identifying the auditory components of congruent and incongruent audiovisual speech, was less for LI than for either NC or GS faces, which produced identical results. Experiments 2a and 2b showed that perception of visual speech, and influences on perception of incongruent auditory speech, was less for LI and CLI faces than for NC and CI faces (which produced identical patterns of performance). Our findings for NC and CI faces suggest that colour is not critical for perception of visual and audiovisual speech. The effect of luminance inversion on performance accuracy was relatively small (5%), which suggests that the luminance information preserved in LI faces is important for the processing of visual and audiovisual speech.

11.
Research has shown that auditory speech recognition is influenced by the appearance of a talker's face, but the actual nature of this visual information has yet to be established. Here, we report three experiments that investigated visual and audiovisual speech recognition using color, gray-scale, and point-light talking faces (which allowed comparison with the influence of isolated kinematic information). Auditory and visual forms of the syllables /ba/, /bi/, /ga/, /gi/, /va/, and /vi/ were used to produce auditory, visual, congruent, and incongruent audiovisual speech stimuli. Visual speech identification and visual influences on identifying the auditory components of congruent and incongruent audiovisual speech were identical for color and gray-scale faces and were much greater than for point-light faces. These results indicate that luminance, rather than color, underlies visual and audiovisual speech perception and that this information is more than the kinematic information provided by point-light faces. Implications for processing visual and audiovisual speech are discussed.

12.
Auditory imagery has attracted growing attention in recent years, with research covering three categories: imagery for speech sounds, for musical sounds, and for environmental sounds. This paper reviews cognitive neuroscience studies of the brain regions activated by these three types of auditory imagery, compares the similarities and differences between the brain regions involved in auditory imagery and in auditory perception, and outlines directions for future research on auditory imagery.

13.
Here, we investigate how audiovisual context affects perceived event duration with experiments in which observers reported which of two stimuli they perceived as longer. Target events were visual and/or auditory and could be accompanied by nontargets in the other modality. Our results demonstrate that the temporal information conveyed by irrelevant sounds is automatically used when the brain estimates visual durations but that irrelevant visual information does not affect perceived auditory duration (Experiment 1). We further show that auditory influences on subjective visual durations occur only when the temporal characteristics of the stimuli promote perceptual grouping (Experiments 1 and 2). Placed in the context of scalar expectancy theory of time perception, our third and fourth experiments have the implication that audiovisual context can lead both to changes in the rate of an internal clock and to temporal ventriloquism-like effects on perceived on- and offsets. Finally, intramodal grouping of auditory stimuli diminished any crossmodal effects, suggesting a strong preference for intramodal over crossmodal perceptual grouping (Experiment 5).
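Under scalar expectancy theory, subjective duration is read out from a pacemaker-accumulator: ticks accumulate at some clock rate while a switch is closed, so a faster clock makes the same physical interval feel longer. The toy Python sketch below shows that arithmetic; the rate and latency values are purely illustrative and not parameters from the experiments.

```python
def perceived_ticks(duration_s, clock_rate_hz, switch_latency_s=0.05):
    """Pacemaker-accumulator readout: ticks = rate * (duration - switch latency)."""
    return clock_rate_hz * max(duration_s - switch_latency_s, 0.0)

baseline = perceived_ticks(1.0, clock_rate_hz=10.0)    # visual interval alone
with_sound = perceived_ticks(1.0, clock_rate_hz=11.0)  # auditory context speeds the clock
print(with_sound / baseline)  # ratio > 1: the same interval accumulates more ticks and feels longer
```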

14.
Listeners perceive speech sounds relative to context. Contextual influences might differ over hemispheres if different types of auditory processing are lateralized. Hemispheric differences in contextual influences on vowel perception were investigated by presenting speech targets and both speech and non-speech contexts to listeners’ right or left ears (contexts and targets either to the same or to opposite ears). Listeners performed a discrimination task. Vowel perception was influenced by acoustic properties of the context signals. The strength of this influence depended on laterality of target presentation, and on the speech/non-speech status of the context signal. We conclude that contrastive contextual influences on vowel perception are stronger when targets are processed predominantly by the right hemisphere. In the left hemisphere, contrastive effects are smaller and largely restricted to speech contexts.

15.
The effect of audiovisual interactions on size perception has yet to be examined, despite its fundamental importance in daily life. Previous studies have reported that object length can be estimated solely on the basis of the sounds produced when an object is dropped. Moreover, it has been shown that people typically and easily perceive the correspondence between object sizes and sound intensities. It is therefore possible that auditory stimuli may act as cues for object size, thereby altering the visual perception of size. Thus, in the present study we examined the effects of auditory stimuli on the visual perception of size. Specifically, we investigated the effects of the sound intensity of auditory stimuli, the temporal window of audiovisual interactions, and the effects of the retinal eccentricity of visual stimuli. The results indicated that high-intensity auditory stimuli increased visually perceived object size, and that this effect was especially strong in the peripheral visual field. Further analysis indicated that this effect on the visual perception of size arises when the cue reliability is relatively higher for the auditory than for the visual stimuli. In addition, we further suggest that the cue reliabilities of visual and auditory stimuli relate to retinal eccentricity and sound intensity, respectively.
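The cue-reliability account sketched above is usually formalized as reliability-weighted (maximum-likelihood) cue combination: each cue is weighted by its inverse variance, so when visual reliability falls, as in the peripheral visual field, the auditory estimate pulls the combined percept more strongly. A minimal Python sketch follows; the numbers are illustrative and not taken from the experiments.

```python
import numpy as np

def combine_cues(mu_v, sigma_v, mu_a, sigma_a):
    """Reliability-weighted (maximum-likelihood) fusion of visual and auditory size estimates."""
    w_v = (1 / sigma_v**2) / (1 / sigma_v**2 + 1 / sigma_a**2)  # visual weight = relative reliability
    mu_c = w_v * mu_v + (1 - w_v) * mu_a
    sigma_c = np.sqrt(1 / (1 / sigma_v**2 + 1 / sigma_a**2))    # fused estimate is never less reliable
    return mu_c, sigma_c

# Foveal viewing: reliable vision, so a loud sound barely shifts perceived size
print(combine_cues(mu_v=10.0, sigma_v=0.5, mu_a=12.0, sigma_a=2.0))
# Peripheral viewing: visual reliability drops, so the auditory cue pulls the estimate upward
print(combine_cues(mu_v=10.0, sigma_v=2.0, mu_a=12.0, sigma_a=1.0))
```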

16.
In two experiments, we investigated whether simultaneous speech reading can influence the detection of speech in envelope-matched noise. Subjects attempted to detect the presence of a disyllabic utterance in noise while watching a speaker articulate a matching or a non-matching utterance. Speech detection was not facilitated by an audio-visual match, which suggests that listeners relied on low-level auditory cues whose perception was immune to cross-modal top-down influences. However, when the stimuli were words (Experiment 1), there was a (predicted) relative shift in bias, suggesting that the masking noise itself was perceived as more speechlike when its envelope corresponded to the visual information. This bias shift was absent, however, with non-word materials (Experiment 2). These results, which resemble earlier findings obtained with orthographic visual input, indicate that the mapping from sight to sound is lexically mediated even when, as in the case of the articulatory-phonetic correspondence, the cross-modal relationship is non-arbitrary.

17.
Although music and dance are often experienced simultaneously, it is unclear what modulates their perceptual integration. This study investigated how two factors related to music–dance correspondences influenced audiovisual binding of their rhythms: the metrical match between the music and dance, and the kinematic familiarity of the dance movement. Participants watched a point-light figure dancing synchronously to a triple-meter rhythm that they heard in parallel, whereby the dance communicated a triple (congruent) or a duple (incongruent) visual meter. The movement was either the participant’s own or that of another participant. Participants attended to both streams while detecting a temporal perturbation in the auditory beat. The results showed lower sensitivity to the auditory deviant when the visual dance was metrically congruent to the auditory rhythm and when the movement was the participant’s own. This indicated stronger audiovisual binding and a more coherent bimodal rhythm in these conditions, thus making a slight auditory deviant less noticeable. Moreover, binding in the metrically incongruent condition involving self-generated visual stimuli was correlated with self-recognition of the movement, suggesting that action simulation mediates the perceived coherence between one’s own movement and a mismatching auditory rhythm. Overall, the mechanisms of rhythm perception and action simulation could inform the perceived compatibility between music and dance, thus modulating the temporal integration of these audiovisual stimuli.

18.
Recently, findings on a wide range of auditory abnormalities among individuals with autism have been reported. To date, functional distinctions among these varied findings are poorly established. Such distinctions should be of interest to clinicians and researchers alike given their potential therapeutic and experimental applications. This review suggests three general trends among these findings as a starting point for future analyses. First, studies of auditory perception of linguistic and social auditory stimuli among individuals with autism generally have found impaired perception versus normal controls. Such findings may correlate with impaired language and communication skills and social isolation observed among individuals with autism. Second, studies of auditory perception of pitch and music among individuals with autism generally have found enhanced perception versus normal controls. These findings may correlate with the restrictive and highly focused behaviors observed among individuals with autism. Third, findings on the auditory perception of non-linguistic, non-musical stimuli among autism patients resist any generalized conclusions. Ultimately, as some researchers have already suggested, the distinction between impaired global processing and enhanced local processing may prove useful in making sense of apparently discordant findings on auditory abnormalities among individuals with autism.

19.
The effects of listening to music on cycling behaviour were evaluated. Twenty-five participants completed a track on a bicycle while listening to music with two standard earbuds, with one earbud, and with two in-earbuds. Conditions with high-tempo music and loud volume were also included in the experiment, as were two mobile-phone conditions, one in which participants operated the phone handheld and one handsfree. Cycling speed was not affected by listening to music, but was reduced in the telephone conditions. In general, responses to auditory signals worsened when participants listened to music; in particular, when listening with in-earbuds, loud auditory stop signals were missed in 68% of cases. However, when listening with only one standard earbud, performance was not affected. In the high-volume and high-tempo music conditions, the auditory stop signal was also heard in significantly fewer cases. Completing a task on the mobile phone, whether handheld or handsfree, increased response times to an auditory stop signal and also reduced overall auditory perception. Furthermore, handsfree operation had only minor advantages over handheld operation, with only the response time to an auditory stop signal being faster; this is likely because both hands remained available for braking. It is concluded that listening to music worsens auditory perception, particularly if in-earbuds are used. Furthermore, both handheld and handsfree operation of mobile phones has a negative effect on perception, potentially posing a threat to cyclists' traffic safety.

20.
Priming is a useful tool for ascertaining the circumstances under which previous experiences influence behavior. Previously, using hierarchical stimuli, we demonstrated (Justus & List, 2005) that selectively attending to one temporal scale of an auditory stimulus improved subsequent attention to a repeated (vs. changed) temporal scale; that is, we demonstrated intertrial auditory temporal level priming. Here, we have extended those results to address whether level priming relied on absolute or relative temporal information. Both relative and absolute temporal information are important in auditory perception: Speech and music can be recognized over various temporal scales but become uninterpretable to a listener when presented too quickly or slowly. We first confirmed that temporal level priming generalized over new temporal scales. Second, in the context of multiple temporal scales, we found that temporal level priming operates predominantly on the basis of relative, rather than absolute, temporal information. These findings are discussed in the context of expectancies and relational invariance in audition.
