Similar Literature
20 matching records found
1.
Selective attention requires the ability to focus on relevant information and to ignore irrelevant information. The ability to inhibit irrelevant information has been proposed to be the main source of age-related cognitive change (e.g., Hasher & Zacks, 1988). Although age-related distraction by irrelevant information has been extensively demonstrated in the visual modality, studies involving auditory and cross-modal paradigms have revealed a mixed pattern of results. A comparative evaluation of these paradigms according to sensory modality suggests a twofold trend: Age-related distraction is more likely (a) in unimodal than in cross-modal paradigms and (b) when irrelevant information is presented in the visual modality, rather than in the auditory modality. This distinct pattern of age-related changes in selective attention may be linked to the reliance of the visual and auditory modalities on different filtering mechanisms. Distractors presented through the auditory modality can be filtered at both central and peripheral neurocognitive levels. In contrast, distractors presented through the visual modality are primarily suppressed at more central levels of processing, which may be more vulnerable to aging. We propose the hypothesis that age-related distractibility is modality dependent, a notion that might need to be incorporated in current theories of cognitive aging. Ultimately, this might lead to a more accurate account of the mixed pattern of impaired and preserved selective attention found in advancing age.

2.
Closing the eyes helps memory. We investigated the mechanisms underlying the eyeclosure effect by exposing 80 eyewitnesses to different types of distraction during the witness interview: blank screen (control), eyes closed, visual distraction, and auditory distraction. We examined the cognitive load hypothesis by comparing any type of distraction (visual or auditory) with minimal distraction (blank screen or eyes closed). We found recall to be significantly better when distraction was minimal, providing evidence that eyeclosure reduces cognitive load. We examined the modality-specific interference hypothesis by comparing the effects of visual and auditory distraction on recall of visual and auditory information. Visual and auditory distraction selectively impaired memory for information presented in the same modality, supporting the role of visualisation in the eyeclosure effect. Analysis of recall in terms of grain size revealed that recall of basic information about the event was robust, whereas recall of specific details was prone to both general and modality-specific disruptions.

3.
Age-related deficits in selective attention have often been demonstrated in the visual modality and, to a lesser extent, in the auditory modality. In contrast, a mounting body of evidence has suggested that cross-modal selective attention is intact in aging, especially in visual tasks that require ignoring the auditory modality. Our goal in this study was to investigate age-related differences in the ability to ignore cross-modal auditory and visual distraction, and to assess the role of cognitive control demands in this ability. In a set of two experiments, 30 young (mean age = 23.3 years) and 30 older adults (mean age = 67.7 years) performed a visual and an auditory n-back task (0 ≤ n ≤ 2), with and without cross-modal distraction. The results show an asymmetry in cross-modal distraction as a function of sensory modality and age: Whereas auditory distraction did not disrupt performance on the visual task in either age group, visual distraction disrupted performance on the auditory task in both age groups. Most important, however, visual distraction was disproportionately larger in older adults. These results suggest that age-related distraction is modality dependent, such that suppression of cross-modal auditory distraction is preserved and suppression of cross-modal visual distraction is impaired in aging.

4.
Acta Psychologica, 2013, 142(2), 184-194
Older adults are known to have reduced inhibitory control and therefore to be more distractible than young adults. Recently, we have proposed that sensory modality plays a crucial role in age-related distractibility. In this study, we examined age differences in vulnerability to unimodal and cross-modal visual and auditory distraction. A group of 24 younger (mean age = 21.7 years) and 22 older adults (mean age = 65.4 years) performed visual and auditory n-back tasks while ignoring visual and auditory distraction. Whereas reaction time data indicated that both young and older adults are particularly affected by unimodal distraction, accuracy data revealed that older adults, but not younger adults, are vulnerable to cross-modal visual distraction. These results support the notion that age-related distractibility is modality dependent.

5.
Buchan JN, Munhall KG. Perception, 2011, 40(10), 1164-1182
Conflicting visual speech information can influence the perception of acoustic speech, causing an illusory percept of a sound not present in the actual acoustic speech (the McGurk effect). We examined whether participants can voluntarily selectively attend to either the auditory or visual modality by instructing participants to pay attention to the information in one modality and to ignore competing information from the other modality. We also examined how performance under these instructions was affected by weakening the influence of the visual information by manipulating the temporal offset between the audio and video channels (experiment 1), and the spatial frequency information present in the video (experiment 2). Gaze behaviour was also monitored to examine whether attentional instructions influenced the gathering of visual information. While task instructions did have an influence on the observed integration of auditory and visual speech information, participants were unable to completely ignore conflicting information, particularly information from the visual stream. Manipulating temporal offset had a more pronounced interaction with task instructions than manipulating the amount of visual information. Participants' gaze behaviour suggests that the attended modality influences the gathering of visual information in audiovisual speech perception.

6.
A driving simulator was used to examine the effects on driving performance of auditory cues in an in-vehicle information search task. Drivers' distraction by the search tasks was measured on a peripheral detection task. The difficulty of the search task was systematically varied to test the distraction caused by a quantified visual load. 58 participants completed the task. Performance on both search tasks and peripheral detection tasks was measured by mean response time and percent error. Analyses indicated that in-vehicle information search performance can be severely degraded when a target is located within a group of diverse distractors. Inclusion of an auditory cue in the visual search increased the mean response time as a result of a change in modality from auditory to visual. Inclusion of such an auditory cue seemed to influence distraction as measured by performance on the peripheral detection task; accuracy was lower when auditory cues were provided, and responses were slower when no auditory cues were provided. Distraction by the auditory cue varied according to the difficulty of the search task.

7.
In the tripartite model of working memory (WM) it is postulated that a unique part system, the visuo-spatial sketchpad (VSSP), processes non-verbal content. Due to behavioral and neurophysiological findings, the VSSP was later subdivided into visual object and visual spatial processing, the former representing objects' appearance and the latter spatial information. This distinction is well supported. However, a challenge to this model is the question how spatial information from non-visual sensory modalities, for example the auditory one, is processed. Only a few studies so far have directly compared visual and auditory spatial WM. They suggest that the distinction of two processing domains—one for object and one for spatial information—also holds true for auditory WM, but that only a part of the processes is modality specific. We propose that processing in the object domain (the item's appearance) is modality specific, while spatial WM as well as object-location binding relies on modality general processes.

8.
An experiment was conducted to test a functionalistic interpretation of the modality effect, that is, the superior recall of auditorily presented verbal information compared with visually presented information. A total of 60 subjects were presented with mixed-mode (auditory-visual words), mixed-language (Swedish-English words), or mixed-category (category-unrelated words) lists, and were asked to recall the words of each list in any preferred order. The degree of organization according to modality, language, or category and the recall performance were measured. Organization by modality was significantly higher than organization by language or category, as predicted by the functionalistic view proposed. The recall performance obtained for auditory and visual words also differed in the way predicted by the functionalistic view.

9.
Objectives: Distracted walking is a major cause of pedestrian road traffic injuries, but little is known about how distraction affects pedestrian safety. The study was designed to explore how visual and auditory distraction might influence pedestrian safety. Methods: Three experiments were conducted to explore causal mechanisms from two theoretical perspectives, increased cognitive load from the distraction task and resource competition in the same sensory modality. Pedestrians' behavior patterns and cortex oxyhemoglobin changes were recorded while they performed a series of dual tasks. Results: Four primary results emerged: (a) participants responded more slowly to both visual and auditory stimuli in traffic, as well as walked more slowly, while talking on the phone or text messaging compared to when undistracted or listening to music; (b) when participants completed pedestrian response tasks while distracted with a high cognitive load, their response was significantly slower and poorer than when they carried out a lower cognitive load distraction task; (c) participants had higher levels of oxy-Hb change in cortices related to visual processing and executive function while distracted with a higher cognitive load; and (d) participants' responses to traffic lights were slower and resulted in higher activation in prefrontal cortex and occipital areas when distracted by a visual distraction task compared to when distracted with an auditory task; similarly, brain activation increased significantly in temporal areas when participants responded to an auditory car horn task compared to when they responded to visual traffic lights. Conclusions: Both distracting cognitive load demands and the type of distraction task significantly affect young adult pedestrian performance and threaten pedestrian safety. Pedestrian injury prevention efforts should consider the effects of the type of distracting task and its cognitive demands on pedestrian safety.

10.
Unexpected events often distract us. In the laboratory, novel auditory stimuli have been shown to capture attention away from a focal visual task and yield specific electrophysiological responses as well as a behavioral cost to performance. Distraction is thought to follow ineluctably from the sound's low probability of occurrence or, put more simply, its unexpected occurrence. Our study challenges this view with respect to behavioral distraction and argues that past research failed to identify the informational value of sound as a mediator of novelty distraction. We report an experiment showing that (1) behavioral novelty distraction is only observed when the sound announces the occurrence and timing of an upcoming visual target (as is the case in all past research); (2) that no such distraction is observed for deviant sounds conveying no such information; and that (3) deviant sounds can actually facilitate performance when these, but not the standards, convey information. We conclude that behavioral novelty distraction, as observed in oddball tasks, is observed in the presence of novel sounds but only when the cognitive system can take advantage of the auditory distracters to optimize performance.

11.
Past research has demonstrated that the occurrence of unexpected task-irrelevant changes in the auditory or visual sensory channels captured attention in an obligatory fashion, hindering behavioral performance in ongoing auditory or visual categorization tasks and generating orientation and re-orientation electrophysiological responses. We report the first experiment extending the behavioral study of cross-modal distraction to tactile novelty. Using a vibrotactile-visual cross-modal oddball task and a bespoke hand-arm vibration device, we found that participants were significantly slower at categorizing the parity of visually presented digits following a rare and unexpected change in vibrotactile stimulation (novelty distraction), and that this effect extended to the subsequent trial (postnovelty distraction). These results are in line with past research on auditory and visual novelty and fit the proposition of common and amodal cognitive mechanisms for the involuntary detection of change.

12.
Temporal preparation often has been assumed to influence motor stages of information processing. Recent studies, however, challenge this notion and provide evidence for a facilitation of visual processing. The present study was designed to investigate whether perceptual processing in the auditory domain also benefits from temporal preparation. To this end, we employed a pitch discrimination task. In Experiment 1, discrimination performance was clearly improved when participants were temporally prepared. This finding was confirmed in Experiment 2, which ruled out possible influences of short-term memory. The results support the notion that temporal preparation enhances perceptual processing not only in the visual, but also in the auditory, modality.

14.
Eyewitnesses instructed to close their eyes during retrieval remember more correct, and fewer incorrect, visual and auditory details. These effects are assumed to arise because eye-closure reduces distraction from the retrieval environment, and so increased environmental distraction should have the reverse effects. To test this idea, 48 participants witnessed a video clip before verbally answering questions about visual and auditory details in the presence of irrelevant visual distraction varying in amount and predictability. More distraction led to fewer correct and more incorrect visual and auditory details being recalled, but the predictability of the distraction had no effect. These findings suggest that environmental distraction impacts upon memory quality rather than quantity, a pattern that may be hard for interviewers to detect. Copyright © 2011 John Wiley & Sons, Ltd.

15.
Pedestrian casualties caused by mobile phone use while crossing the street continue to rise. Using a phone while crossing affects pedestrians' information processing, behavior, and safety. The evidence shows that, in the scene-perception stage, phone-using pedestrians have a narrower attentional scope and markedly reduced attention to the peripheral visual field; in the crossing-decision stage, they are more likely to miss crossing opportunities or to make riskier decisions; and in the motor-control stage, phone use alters gait and reduces movement stability. These effects are further moderated by how the phone is used, but overall, phone distraction raises pedestrians' crossing risk and makes crossing accidents more likely. Finally, we propose a conceptual model of how phone distraction affects information processing and behavior during street crossing, and suggest that future research should assess how phone distraction affects pedestrians' acquisition of auditory information, as well as the subprocesses of gap-acceptance decisions, to provide a theoretical basis for targeted interventions.

16.
潘禄, 钱秀莹. Advances in Psychological Science (心理科学进展), 2015, 23(11), 1910-1919
Rhythm perception is a cognitive phenomenon unique to humans. Audition holds an advantage in rhythm processing, and auditory rhythms synchronize better with body movement; nevertheless, the visual and tactile channels have their own characteristics in rhythm perception and interact with audition extensively. Visual rhythm perception is relatively weak and, when a visual rhythm is presented together with an auditory one, its perceived timing is pulled toward the auditory stream; it can, however, be strengthened by adding motion information or through acquired experience. Rhythmic stimuli can modulate the allocation of attention over time and synchronize it, and this modulation can occur within a single modality or across modalities. Touch is closely linked to audition, and people can carry out higher-level rhythm processing by integrating the auditory and tactile channels.

17.
康冠兰, 罗霄骁. Journal of Psychological Science (心理科学), 2020, (5), 1072-1078
Cross-modal information interaction refers to the set of processes by which information from one sensory channel interacts with, and influences, information from another sensory channel. It mainly involves two aspects: how inputs from different sensory channels are integrated, and how conflicts between cross-modal signals are controlled. This paper reviews the behavioral and neural mechanisms of audiovisual cross-modal integration and conflict control, and discusses how attention affects both. Future work should investigate the brain-network mechanisms of audiovisual cross-modal processing and examine cross-modal integration and conflict control in special populations, to help reveal the mechanisms underlying their cognitive and social dysfunction.

18.
Advancing age is associated with decrements in selective attention. It was recently hypothesized that age-related differences in selective attention depend on sensory modality. The goal of the present study was to investigate the role of sensory modality in age-related vulnerability to distraction, using a response interference task. To this end, 16 younger (mean age = 23.1 years) and 24 older (mean age = 65.3 years) adults performed four response interference tasks, involving all combinations of visual and auditory targets and distractors. The results showed that response interference effects differ across sensory modalities, but not across age groups. These results indicate that sensory modality plays an important role in vulnerability to distraction, but not in age-related distractibility by irrelevant spatial information.

19.
Processing multiple complex features to create cohesive representations of objects is an essential aspect of both the visual and auditory systems. It is currently unclear whether these processes are entirely modality specific or whether there are amodal processes that contribute to complex object processing in both vision and audition. We investigated this using a dual-stream target detection task in which two concurrent streams of novel visual or auditory stimuli were presented. We manipulated the degree to which each stream taxed processing conjunctions of complex features. In two experiments, we found that concurrent visual tasks that both taxed conjunctive processing strongly interfered with each other but that concurrent auditory and visual tasks that both taxed conjunctive processing did not. These results suggest that resources for processing conjunctions of complex features within vision and audition are modality specific.

20.
This paper reports a study which examined an interaction between action planning and processing of perceptual information in two different sensory modalities. In line with the idea that action planning consists in representing the action's sensory outcomes, it was assumed that different types of actions should be coupled with different modalities. A visual and auditory oddball paradigm was combined with two types of actions: pointing and knocking (unrelated to the perceptual task). Results showed an interactive effect between the action type and the sensory modality of the oddballs, with impaired detection of auditory oddballs for the knocking (congruent) action, as compared to the pointing (incongruent) action. These findings reveal that action planning can interact with modality-specific perceptual processing and that preparing an action presumably binds the respective perceptual features with an action plan, thereby making these features less available for other tasks.
