Similar Documents
 20 similar documents found (search time: 15 ms)
1.
Microsaccades keep the eyes' balance during fixation
During fixation of a stationary target, small involuntary eye movements exhibit an erratic trajectory, a random walk. Two types of these fixational eye movements are drift and microsaccades (small-amplitude saccades). We investigated fixational eye movements and binocular coordination using a statistical analysis that had previously been applied to human posture control. This random-walk analysis uncovered two different time scales in fixational eye movements and identified specific functions for microsaccades. On a short time scale, microsaccades enhanced perception by increasing fixation errors. On a long time scale, microsaccades reduced fixation errors and binocular disparity (relative to pure drift movements). Thus, our findings clarify the role of oculomotor processes during fixation.
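As a rough illustration of the kind of random-walk (displacement) analysis described above, the sketch below computes the mean squared displacement of a gaze trace over increasing time lags and fits scaling exponents on short and long lags. The 500 Hz sampling rate, the simulated trace, and the lag windows are assumptions for illustration, not the authors' actual pipeline.

```python
import numpy as np

def mean_squared_displacement(xy, max_lag):
    """D(lag) = <|xy(t + lag) - xy(t)|^2>, averaged over t."""
    msd = np.empty(max_lag)
    for lag in range(1, max_lag + 1):
        diffs = xy[lag:] - xy[:-lag]
        msd[lag - 1] = np.mean(np.sum(diffs ** 2, axis=1))
    return msd

fs = 500.0                                    # assumed sampling rate (Hz)
rng = np.random.default_rng(0)
xy = np.cumsum(rng.normal(scale=0.1, size=(5000, 2)), axis=0)  # toy 2-D drift trace

msd = mean_squared_displacement(xy, max_lag=200)
lags = np.arange(1, 201) / fs                 # lag in seconds

# MSD ~ lag^(2H): H > 0.5 indicates persistent, error-enhancing behavior,
# H < 0.5 indicates antipersistent, error-correcting behavior.
short, long_ = slice(0, 10), slice(50, 200)   # assumed short/long lag windows
H_short = np.polyfit(np.log(lags[short]), np.log(msd[short]), 1)[0] / 2
H_long = np.polyfit(np.log(lags[long_]), np.log(msd[long_]), 1)[0] / 2
print(f"H(short) = {H_short:.2f}, H(long) = {H_long:.2f}")
```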

2.
In three experiments, we investigated the behavioral consequences of being absorbed in music on performance in a concurrent task. We tested two competing hypotheses: Based on a cognitive load account, captivation of attention by the music and state absorption might slow down reactions in the decisional task. Alternatively, music could induce spontaneous motor activity, and being absorbed in music might result in a more autonomous, flow-driven behavior with quicker motor reactions. Participants performed a simple, visual, two-alternative forced-choice task while listening to popular musical excerpts. Subsequently, they rated their subjective experience using a short questionnaire. We presented music in four tempo categories (between 80 and 140 BPM) to account for a potential effect of tempo and an interaction between tempo and absorption. In Experiment 1, absorption was related to decreased reaction times (RTs) in the visual task. This effect was small, as expected in this setting, but replicable in Experiment 2. There was no effect of the music's tempo on RTs, but there was a tendency for mind wandering to relate to task performance. After slightly changing the study setting in Experiment 3, flow predicted decreased RTs, but absorption alone (as part of the flow construct) did not predict RTs. To sum up, we demonstrated that being absorbed in music can have the behavioral consequence of speeded manual reactions in specific task contexts, and people seem to integrate the music into an active, flow-driven and therefore enhanced performance. However, the relations shown here depend on task settings, and a systematic study of context is necessary to understand how induced states and their measurement contribute to the findings.

3.
Involuntary microsaccades and voluntary saccades reflect human brain activities during attention and cognitive tasks. Our eye movements can also betray our emotional state. However, the effects of attention to emotion on microsaccadic activity remain unknown. The present study was conducted in healthy volunteers to investigate the effects of devoting attention to exogenous emotional stimuli on microsaccadic response, with change in pupil size as an index of sympathetic nervous system activity. Event-related responses to unpleasant images significantly inhibited the rate of microsaccade appearance and altered pupil size (Experiment 1). Additionally, microsaccadic responses during covert orienting of attention to emotional stimuli were significantly biased in the direction opposite the target (anti-direction), accompanied by fast reaction times (Experiment 2). Therefore, we concluded that attentional shifts induced by exogenous emotional stimuli can modulate microsaccadic activities. Future studies of the interaction between miniature eye movements and emotion may be beneficial in the assessment of pathophysiological responses in mental disorders.
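The abstract above reports microsaccade rates but does not describe the detection algorithm used, so purely for illustration, here is a minimal sketch of a common median-based velocity-threshold detector (in the spirit of Engbert & Kliegl, 2003). The sampling rate, threshold multiplier, minimum duration, and toy data are assumed values.

```python
import numpy as np

def detect_microsaccades(xy, fs=500.0, lam=6.0, min_dur_ms=6.0):
    """Return (start, end) sample indices of candidate microsaccades in a 2-D gaze trace."""
    # Velocity over a 5-sample window to suppress sample-to-sample noise.
    vel = (xy[4:] - xy[:-4]) * fs / 4.0                       # deg/s, shape (n-4, 2)
    # Median-based estimate of the velocity SD, per axis.
    sigma = np.sqrt(np.median(vel ** 2, axis=0) - np.median(vel, axis=0) ** 2)
    radius = lam * sigma                                      # elliptic threshold
    above = np.sum((vel / radius) ** 2, axis=1) > 1.0
    events, start = [], None
    for i, flag in enumerate(above):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if (i - start) * 1000.0 / fs >= min_dur_ms:
                events.append((start, i - 1))
            start = None
    if start is not None and (len(above) - start) * 1000.0 / fs >= min_dur_ms:
        events.append((start, len(above) - 1))
    return events

# Toy usage: rate = number of detected events divided by trial duration (5 s here).
rng = np.random.default_rng(0)
xy = np.cumsum(rng.normal(scale=0.01, size=(2500, 2)), axis=0)  # 5 s of simulated drift
print(len(detect_microsaccades(xy)) / 5.0, "microsaccades per second")
```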

4.
Fixational eye movements are not an index of covert attention
The debate about the nature of fixational eye movements has revived recently with the claim that microsaccades reflect the direction of attentional shifts. A number of studies have shown an association between the direction of attentional cues and the direction of microsaccades. We sought to determine whether microsaccades in attentional tasks are causally related to behavior. Is reaction time (RT) faster when microsaccades point toward the target than when they point in the opposite direction? We used a dual-Purkinje-image eyetracker to measure gaze position while 3 observers (2 of the authors, 1 naive observer) performed an attentional cuing task under three different response conditions: saccadic localization, manual localization, and manual detection. Critical trials were those on which microsaccades moved away from the cue. On these trials, RTs were slower when microsaccades were oriented toward the target than when they were oriented away from the target. We obtained similar results for direction of drift. Cues, not fixational eye movements, predicted behavior.

5.
There is strong evidence of shared acoustic profiles common to the expression of emotions in music and speech, yet relatively limited understanding of the specific psychoacoustic features involved. This study combined a controlled experiment and computational modelling to investigate the perceptual codes associated with the expression of emotion in the acoustic domain. The empirical stage of the study provided continuous human ratings of emotions perceived in excerpts of film music and natural speech samples. The computational stage created a computer model that retrieves the relevant information from the acoustic stimuli and makes predictions about the emotional expressiveness of speech and music that closely match the responses of human subjects. We show that a significant part of the listeners' second-by-second reported emotions to music and speech prosody can be predicted from a set of seven psychoacoustic features: loudness, tempo/speech rate, melody/prosody contour, spectral centroid, spectral flux, sharpness, and roughness. The implications of these results are discussed in the context of cross-modal similarities in the communication of emotion in the acoustic domain.
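For intuition only, the sketch below fits a plain least-squares mapping from the seven psychoacoustic features named above to simulated second-by-second arousal ratings. The original study built its own computational model; the synthetic data and simple linear fit here are assumptions, not that model.

```python
import numpy as np

feature_names = ["loudness", "tempo", "contour", "spectral_centroid",
                 "spectral_flux", "sharpness", "roughness"]

rng = np.random.default_rng(1)
n_seconds = 300
X = rng.normal(size=(n_seconds, len(feature_names)))           # one feature vector per second
true_w = rng.normal(size=len(feature_names))
arousal = X @ true_w + rng.normal(scale=0.5, size=n_seconds)    # simulated continuous ratings

# Least-squares weights and explained variance (R^2) of the feature-based prediction.
w, *_ = np.linalg.lstsq(X, arousal, rcond=None)
pred = X @ w
r2 = 1 - np.var(arousal - pred) / np.var(arousal)
print(dict(zip(feature_names, np.round(w, 2))), round(r2, 2))
```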

6.
This article describes three experiments on the possibility of translating scent or sound into visual forms made by design engineering students. The hypothesis tested was that people are able to pick up patterns in the energy flow that the students transposed from one perceptual sense to another. In Exp. 1, subjects were given different scents and were asked to choose a sculpture designed according to these scents. In Exp. 2, subjects were given different musical pieces and asked to match them with portable cassette players designed according to this music. Exp. 3 was identical to Exp. 2, but different music selections, similar to the ones in Exp. 2, were used. In all three experiments subjects were indeed able to perform the tasks above chance level. Results are discussed within the framework of Gibson's theory of direct perception.

7.
Recent research suggests that the different components of eye movements (fixations, saccades) are not strictly separate but are interdependent processes. This argument rests on observations that gaze-step sizes yield unimodal distributions and exhibit power-law scaling, indicative of interdependent processes coordinated across timescales. The studies that produced these findings, however, employed complex tasks (visual search, scene perception). Thus, the question is whether the observed interdependence is a fundamental property of eye movements or emerges in the interplay between cognitive processes and complex visual stimuli. In this study, we used a simple eye movement task where participants moved their eyes in a prescribed sequence at several different paces. We outlined diverging predictions for this task for independence versus interdependence of fixational and saccadic fluctuations and tested these predictions by assessing the spectral properties of eye movements. We found no clear peak in the power spectrum attributable exclusively to saccadic fluctuations. Furthermore, changing the pace of the eye movement sequence yielded a global shift in scaling relations evident in the power spectrum, not just a localized shift for saccadic fluctuations. These results support the conclusion that fixations and saccades are interdependent processes.
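As a sketch of the spectral approach described above (not the authors' exact analysis), the code below estimates the power spectrum of a gaze-position series with Welch's method and fits a scaling exponent to the log-log spectrum; a distinct saccade-related peak versus a featureless power law is the diagnostic contrast at issue. The simulated trace, sampling rate, and fit range are assumptions.

```python
import numpy as np
from scipy.signal import welch

fs = 250.0                                           # assumed sampling rate (Hz)
rng = np.random.default_rng(2)
gaze_x = np.cumsum(rng.normal(size=60 * int(fs)))    # toy 1/f^2-like position trace (60 s)

freqs, power = welch(gaze_x, fs=fs, nperseg=4096)
keep = (freqs > 0.1) & (freqs < 30.0)                # fit range (assumption)
slope, intercept = np.polyfit(np.log(freqs[keep]), np.log(power[keep]), 1)
print(f"spectral scaling exponent beta ~= {-slope:.2f}")

# A clear peak in `power` at saccade-related frequencies would argue for
# independent fixational and saccadic processes; a featureless power-law
# spectrum is consistent with interdependence across timescales.
```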

8.
Schmauder (1991), studying eye movements during reading, cross-modal naming, and cross-modal lexical decision (CMLD) tasks, failed to find evidence of verb argument structure complexity as Shapiro, Zurif, and Grimshaw (1987) had reported for the CMLD task. Shapiro, Brookins, Gordon, and Nagel (1991) suggested that Schmauder did not detect the effect in the CMLD task because the monosyllabic secondary lexical decision (LD) probes she used did not produce enough processing load to detect an effect of argument structure complexity. The present experiment compared the LD probes used by Schmauder with the LD probes used by Shapiro et al. (1987) and failed to find any evidence for the argument structure complexity effect for either type of probe.

9.
Covert shifts of attention are usually reflected in RT differences between responses to valid and invalid cues in the Posner spatial attention task. Such inferences about covert shifts of attention do not control for microsaccades in the cue-target interval. We analyzed the effects of microsaccade orientation on RTs in four conditions, crossing peripheral visual and auditory cues with peripheral visual and auditory discrimination targets. Reaction time was generally faster on trials without microsaccades in the cue-target interval. If microsaccades occurred, the target-location congruency of the last microsaccade in the cue-target interval interacted in a complex way with cue validity. For valid visual cues, irrespective of whether the discrimination target was visual or auditory, target-congruent microsaccades delayed RT. For invalid cues, target-incongruent microsaccades facilitated RTs for visual target discrimination but delayed RT for auditory target discrimination. No reliable effects on RT were associated with auditory cues or with the first microsaccade in the cue-target interval. We discuss theoretical implications for the relation between spatial attention and oculomotor processes.

10.
Cross-modal priming occurs when a prime presented in one sensory modality influences responses to a target in a different sensory modality. Currently, demonstrations of cross-modal evaluative priming have been sparse and limited. In the present study, we seek to partially rectify this state of affairs by examining cross-modal evaluative priming from auditory primes to visual targets. Significant cross-modal priming effects were found, but only for negative primes. Results are discussed in terms of the negativity bias, and several suggestions are provided for using cross-modal evaluative priming to address theoretically important questions about emotion and cognition.

11.
Zohar Eitan & Renee Timmers. Cognition, 2010, 114(3): 405-422
Though auditory pitch is customarily mapped in Western cultures onto spatial verticality (high–low), both anthropological reports and cognitive studies suggest that pitch may be mapped onto a wide variety of other domains. We collected a total of 35 pitch mappings and investigated in four experiments how these mappings are used and structured. In particular, we inquired (1) how Western subjects apply Western and non-Western metaphors to “high” and “low” pitches, (2) whether mappings applied in an abstract conceptual task are similarly applied by listeners to actual music, (3) how mappings of spatial height relate to these pitch mappings, and (4) how mappings of “high” and “low” pitch associate with other dimensions, in particular quantity, size, intensity and valence. The results show strong agreement among Western participants in applying familiar and unfamiliar metaphors for pitch, in both an abstract, conceptual task (Exp. 1) and a music listening task (Exp. 2), indicating that diverse cross-domain mappings for pitch exist latently besides the common verticality metaphor. Furthermore, limited overlap between mappings of spatial height and pitch height was found, suggesting that, the ubiquity of the verticality metaphor in Western usage notwithstanding, cross-domain pitch mappings are largely independent of that metaphor and seem to be based upon other underlying dimensions. Part of the discrepancy between spatial height and pitch height is that, for pitch, “up” is not necessarily “more,” nor is it necessarily “good.” High pitch is “more” only for height, intensity and brightness; it is “less” for mass, size and quantity. We discuss implications of these findings for music and speech prosody, and their relevance to notions of embodied cognition and of cross-domain magnitude representation.

12.
Research concerning cross-modal influences on perception has neglected auditory influences on perceptions of non-auditory objects, although a small number of studies indicate that auditory stimuli can influence perceptions of the freshness of foodstuffs. Consistent with this, the results reported here indicate that independent groups' ratings of the taste of the wine reflected the emotional connotations of the background music played while they drank it. These results indicate that the symbolic function of auditory stimuli (in this case music) may influence perception in other modalities (in this case gustation); they are discussed in terms of possible future research that might investigate which aspects of music induce such effects in a particular manner, and how such effects might be influenced by participants' pre-existing knowledge and expertise with regard to the target object in question.

13.
The current study examined the influence of interruption, background speech and music on reading, using an eye movement paradigm. Participants either read paragraphs while being exposed to background speech or music or read the texts in silence. On half of the trials, participants were interrupted by a 60-second audio story before resuming reading the paragraph. Interruptions increased overall reading time, but the reading of text following the interruption was quicker compared with baseline. Background speech and music did not modulate the interruption effects, but the background speech slowed down the reading rate compared with reading in the presence of music or reading in silence. The increase in reading time was primarily due to an increase in the time spent rereading previously read words. We argue that the observed interruption effects are in line with a theory of long-term working memory, and we present practical implications for the reported background speech effects.

14.
Two experiments were performed to examine musicians' and nonmusicians' electroencephalographic (EEG) responses to changes in major dimensions (tempo, melody, and key) of classical music. In Exp. 1, 12 nonmusicians' and 12 musicians' EEGs during melody and tempo changes in classical music showed more alpha desynchronization in the left hemisphere (F3) for changes in tempo than in the right. For melody, the nonmusicians were more right-sided (F4) than left in activation, and musicians showed no left-right differences. In Exp. 2, 18 musicians' and 18 nonmusicians' EEGs after a key change in classical music showed that distant key changes elicited more right frontal (F4) alpha desynchronization than left. Musicians showed more reaction to key changes than nonmusicians, and instructions to attend to key changes had no significant effect. Classical music, given its well-defined structure, offers a unique set of stimuli to study the brain. Results support the concept of hierarchical modularity in music processing that may be automatic.
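To make the alpha-desynchronization measure concrete, here is a minimal sketch of how event-related desynchronization (ERD) in the 8-12 Hz band can be computed for a single channel: band-pass the signal, square it, and express post-event power as a percentage change from a pre-event baseline. The sampling rate, window placement, and simulated signal are assumptions, not the study's recording setup.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 256.0                                            # assumed sampling rate (Hz)
rng = np.random.default_rng(3)
eeg_f3 = rng.normal(size=int(10 * fs))                # toy 10 s single-channel EEG

# Band-pass to the alpha range (8-12 Hz) and take instantaneous power.
b, a = butter(4, [8.0, 12.0], btype="bandpass", fs=fs)
alpha = filtfilt(b, a, eeg_f3)
power = alpha ** 2

baseline = power[: int(2 * fs)].mean()                # 2 s before the tempo/melody change
post = power[int(5 * fs): int(7 * fs)].mean()         # 2 s after the change
erd_percent = (post - baseline) / baseline * 100.0    # negative = alpha desynchronization
print(round(erd_percent, 1))
```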

15.
We investigated the coupling between a speaker's and a listener's eye movements. Some participants talked extemporaneously about a television show whose cast members they were viewing on a screen in front of them. Later, other participants listened to these monologues while viewing the same screen. Eye movements were recorded for all speakers and listeners. According to cross-recurrence analysis, a listener's eye movements most closely matched a speaker's eye movements at a delay of 2 sec. Indeed, the more closely a listener's eye movements were coupled with a speaker's, the better the listener did on a comprehension test. In a second experiment, low-level visual cues were used to manipulate the listeners' eye movements, and these, in turn, influenced their latencies to comprehension questions. Just as eye movements reflect the mental state of an individual, the coupling between a speaker's and a listener's eye movements reflects the success of their communication.
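A minimal sketch of the categorical cross-recurrence idea described above: code each time step by the screen region (e.g., cast member) being fixated, then compute, for each lag, the proportion of time points at which speaker and listener fixate the same region. The six-region coding and the synthetic sequences are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
speaker = rng.integers(0, 6, size=2000)               # region fixated per time step
# Listener roughly follows the speaker with a 2-step delay, plus random fixations.
listener = np.roll(speaker, 2)
noise = rng.random(2000) < 0.3
listener[noise] = rng.integers(0, 6, size=noise.sum())

def cross_recurrence(a, b, lag):
    """Proportion of time points where b at t+lag matches a at t."""
    if lag >= 0:
        return np.mean(a[: len(a) - lag] == b[lag:])
    return np.mean(a[-lag:] == b[: len(b) + lag])

lags = range(-10, 11)
profile = [cross_recurrence(speaker, listener, l) for l in lags]
best = max(zip(lags, profile), key=lambda t: t[1])
print(best)   # expected peak near lag = +2 (listener trails speaker)
```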

16.
Chapados C & Levitin DJ. Cognition, 2008, 108(3): 639-651
This experiment was conducted to investigate cross-modal interactions in the emotional experience of music listeners. Previous research showed that visual information present in a musical performance is rich in expressive content and moderates the subjective emotional experience of a participant listening to and/or observing musical stimuli [Vines, B. W., Krumhansl, C. L., Wanderley, M. M., & Levitin, D. J. (2006). Cross-modal interactions in the perception of musical performance. Cognition, 101, 80-113]. The goal of this follow-up experiment was to replicate this cross-modal interaction by investigating the objective, physiological aspect of emotional response to music through measurement of electrodermal activity. The scaled average of electrodermal amplitude for visual-auditory presentation was found to be significantly higher than the sum of the reactions when the music was presented in visual only (VO) and auditory only (AO) conditions, suggesting the presence of an emergent property created by bimodal interaction. Functional data analysis revealed that electrodermal activity generally followed the same contour across modalities of presentation, except during rests (silent parts of the performance), when the visual information took on particular salience. Finally, electrodermal activity and subjective tension judgments were found to be more highly correlated in the audio-visual (AV) condition than in the unimodal conditions. The present study provides converging evidence for the importance of seeing musical performances, and preliminary evidence for the utility of electrodermal activity as an objective measure in studies of continuous music-elicited emotions.
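For illustration only, the sketch below frames the "emergent property" comparison as a paired test of each participant's audio-visual response against the sum of their unimodal responses. The participant count, simulated amplitudes, and the paired t-test are assumptions rather than the original functional-data analysis.

```python
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(5)
n = 25                                          # hypothetical participants
vo = rng.normal(1.0, 0.3, n)                    # scaled EDA amplitude, visual only
ao = rng.normal(1.2, 0.3, n)                    # auditory only
av = vo + ao + rng.normal(0.4, 0.2, n)          # audio-visual, with a superadditive boost

t, p = ttest_rel(av, vo + ao)
print(f"AV vs. VO+AO: t = {t:.2f}, p = {p:.4f}")  # superadditivity if AV > VO + AO
```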

17.
Seductive details generally affect learning and cognitive load negatively. However, background music in particular, as a seductive detail, may also influence the learner's arousal, whose optimal level depends on the learner's extraversion. Therefore, the effects of extraversion and background music on learning outcomes, cognitive load, and arousal were investigated. We tested 167 high school students and found better transfer outcomes for the group with background music. They also reported higher germane load, but no impact of background music on extraneous cognitive load or arousal was found. In the group without background music, learners with higher extraversion reached better recall scores, an effect not found in the group with background music. These results may cautiously be interpreted as indicating a beneficial impact of background music that compensates for the disadvantages of less extraverted learners and that cannot be explained by arousal.

18.
Two experiments were carried out to examine the relationship between eyeblinks and eye movements in a visual search task. Exp. I showed that vertical eye movements brought about slightly more eyeblinks than horizontal ones. In Exp. II, vertical eye movements were accompanied by significantly more frequent eyeblinks than horizontal ones. Upward saccadic eye movements in particular were associated with more frequent eyeblinks than downward ones. These results suggested a possible relationship between eyeblinks and Bell's phenomenon. However, the comparison of eyeblink rates between the eye-movement and no-eye-movement conditions in Exp. II indicated that eyeblinks were significantly more frequent in the latter condition than in the former. Some psychological factors were suggested as likely important determinants of the frequency of eyeblinks.

19.
Phillips-Silver and Trainor (Phillips-Silver, J., & Trainor, L. J. (2005). Feeling the beat: Movement influences infants' rhythm perception. Science, 308, 1430) demonstrated an early cross-modal interaction between body movement and auditory encoding of musical rhythm in infants. Here we show that the way adults move their bodies to music influences their auditory perception of the rhythm structure. We trained adults, while listening to an ambiguous rhythm with no accented beats, to bounce by bending their knees to interpret the rhythm either as a march or as a waltz. At test, adults identified as similar an auditory version of the rhythm pattern with accented strong beats that matched their previous bouncing experience, in comparison with a version whose accents did not match. In subsequent experiments we showed that this effect does not depend on visual information, but that movement of the body is critical. Parallel results from adults and infants suggest that the movement-sound interaction develops early and is fundamental to music processing throughout life.

20.
This study investigated the mental representation of music notation. Notational audiation is the ability to internally "hear" the music one is reading before physically hearing it performed on an instrument. In earlier studies, the authors claimed that this process engages music imagery contingent on subvocal silent singing. This study refines the previously developed embedded melody task and further explores the phonatory nature of notational audiation with throat-audio and larynx-electromyography measurement. Experiment 1 corroborates previous findings and confirms that notational audiation is a process engaging kinesthetic-like covert excitation of the vocal folds linked to phonatory resources. Experiment 2 explores whether covert rehearsal with the mind's voice also involves actual motor processing systems and suggests that the mental representation of music notation cues manual motor imagery. Experiment 3 verifies findings of both Experiments 1 and 2 with a sample of professional drummers. The study points to the profound reliance on phonatory and manual motor processing (a dual-route stratagem) used during music reading. Further implications concern the integration of auditory and motor imagery in the brain and cross-modal encoding of a unisensory input.

