Similar References
20 similar references found.
1.
Previous studies showed that the identification of a left- or right-pointing arrowhead is impaired when it appears while planning and executing a spatially compatible left or right keypress (Müsseler & Hommel, 1997a). We attribute this effect to stimulus processing and action control operating on the same feature codes, so that once a code is integrated into an action plan, it is less available for perceptual processing. In three pairs of experiments we tested the generality of this account by using stimulus–response combinations other than arrows and manual keypresses. Planning manual left–right keypressing actions impaired the identification of spatially corresponding arrows but not of words with congruent meaning. Conversely, planning to say “left” or “right” impaired the identification of corresponding spatial words but not of congruent arrows. Thus, as the feature-integration approach suggests, stimulus identification is impaired only when stimulus and response overlap in perceptual or perceptually derived features; mere semantic congruence is insufficient.

2.
Previous studies reported impairments in a perceptual task performed during the selection and execution of an action. These findings, however, always raise the question of whether the impairment actually reflects a reduction in perceptual sensitivity or whether it results only from a nonspecific reduction in attentiveness to the perceptual task. Recent studies by the authors indicate that actions can also have a specific impact on perception in a dual-task situation: the identification of a left or right arrow is impaired when it appears during the execution of a compatible left or right keypress. In three experiments, Signal Detection Theory was applied to test whether this impairment is also found in the sensitivity measure d' or whether it originates only from a response tendency. The results revealed a generally lower d' for the identification of arrows that were compatible with simultaneously executed keypresses than for arrows that were incompatible. The bias measure c was small and/or did not differ between conditions. Additional analyses revealed that the impairment is due to a greater mean perceptual degradation of stimuli in the compatible condition and that it is restricted to the point in time when the central movement command is generated. Thus, actions actually seem able to affect perceptual processing.
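To make the sensitivity/bias distinction concrete, the sketch below computes the standard Signal Detection Theory measures d' (sensitivity) and c (criterion) from raw trial counts. The function name, the log-linear correction, and the example counts are illustrative assumptions, not values or procedures taken from the study.

```python
from statistics import NormalDist

def sdt_measures(hits, misses, false_alarms, correct_rejections):
    """Return (d', c) from raw counts, using the standard SDT formulas
    d' = z(H) - z(FA) and c = -0.5 * (z(H) + z(FA)).

    A log-linear correction (add 0.5 to each cell) keeps hit and
    false-alarm rates away from 0 and 1, where z would be infinite.
    """
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF

    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)

    d_prime = z(hit_rate) - z(fa_rate)             # perceptual sensitivity
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))  # response bias

    return d_prime, criterion

# Hypothetical counts: a lower d' in the compatible condition combined with a
# similar c would indicate a genuine loss of sensitivity rather than a shift
# in response tendency.
print(sdt_measures(hits=70, misses=30, false_alarms=20, correct_rejections=80))  # "compatible"
print(sdt_measures(hits=85, misses=15, false_alarms=18, correct_rejections=82))  # "incompatible"
```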

3.
Participants are worse at identifying spatial symbols (arrowheads) while performing spatially compatible manual key presses. The present experiments investigated the generality of this blindness effect to response-compatible stimuli. In Experiment 1, a left key press impaired the identification of left-pointing arrows, and a right key press impaired the perception of right-pointing arrows, independent of the hands used to press the keys. Thus the blindness effect is based on codes of the distal response location rather than on the body-intrinsic anatomical connection of the hands. Experiment 2 extended the blindness effect to verbal responses and written position words (left, right, up, down). Vocalizing a position word induced blindness to directly compatible position words (e.g., left-left), but not to orthogonally compatible position words (e.g., left-down). This result suggests that the use of identical stimulus-response codes, and not the use of salience-matched but distinct codes, suffices to produce blindness effects. Finally, Experiment 3 extended the blindness phenomenon beyond the spatial domain by demonstrating blindness between saying color words and perceiving color patches. Altogether, the experiments revealed action-induced blindness to be a phenomenon of broad empirical validity, occurring whenever action and perception afford simultaneous access to the same conceptual codes.

4.
The spatial Stroop effect (slower left/right responses to left/right-pointing arrows when they appear at spatially incongruent rather than congruent locations) has often been used to examine the processing of irrelevant spatial information. We present data from two experiments in which the magnitude of such location-based interference is drastically reduced when the location of the arrow is precued by a spatially noninformative cue. The main aim of the present study was to clarify whether such modulation takes place at perceptual or at response-related stages of processing. First, we manipulated the spatial compatibility between the direction of the arrow and the location of the response, so that each subject responded with spatially compatible key presses in one half of the experiment and with incompatible key presses in the other half. This manipulation had no effect on the standard reduction of the congruency effect by peripheral cues. In a second experiment, subjects made left/right key presses to arrows pointing down/up, which could appear equally often at left, right, bottom, or top locations. In cued trials, we found a reduction of the congruency effect on the vertical axis (stimulus location-direction congruency), whereas congruency was unaffected by cueing for targets presented on the horizontal axis (stimulus-response location congruency). From these results, we conclude that spatially noninformative cues modulate spatial Stroop interference by reducing the conflict between stimulus dimensions at perceptual rather than motor-related stages of processing.

5.
This study investigated the conditions under which processing in a speeded response task interferes with concurrent processing in a visual encoding task. Three experiments used a dual-task paradigm in which a speeded left or right response to a tone was combined with the identification of a masked left- or right-pointing arrow following the tone at a variable SOA. Two additional experiments tested the impact of the presentation of a pure tone on visual encoding. There were four major findings. First, a nonspecific decrease in identification accuracy was observed with decreasing SOA. Second, a blindness to response-compatible stimuli was observed with speeded responses. Third, a specific interference was found between low- and high-pitched tones and left- or right-pointing arrows. Fourth, the specific tone-arrow interference modulated the specific response-arrow interference when the task allowed both to occur simultaneously. The present findings, which suggest both procedural and structural interference between response preparation and stimulus encoding, are discussed in terms of a two-stage model of action planning.

6.
Experimental designs that require the simultaneous perception and reproduction of a stimulus sequence could help to clarify the relationship between perception and action. This contribution examines a specific stimulus-response compatibility effect in the reproduction of simple stimulus sequences. In the procedure, a response just prepared, or one still to be prepared, is confronted with a new incoming stimulus that is compatible or incompatible with that response. Interference is predicted from a framework in which stimulus perception and action control are assumed to share common codes. Five arrows were successively presented at 1-s intervals. The arrows pointed either to the left or to the right with equal probability. One of the five arrows was accompanied by a randomly presented go signal. Subjects then had to reproduce the sequence by pressing corresponding left or right keys while the stimulus presentation continued. Reaction-time latencies and reaction intervals within a sequence were analyzed in six experiments. Results showed increasing reaction-time latencies the later the go signal was presented — that is, the longer the sequence to be reproduced was. In contrast to previous findings, this effect interacted with the compatibility between the arrow displayed together with the go signal and the first reaction. It is argued that the go signal initiates the transfer of a cognitive action plan into a peripheral motor program, and that this process is subject to interference the more the current stimulus is at odds with one of the first parameter specifications.

7.
Previous work indicates that action-control processes influence perceptual processes: the identification probability of a left- or right-pointing arrow is reduced when it appears during the execution of a compatible left or right keypress (Müsseler & Hommel, in press). The present study addresses the question of whether this effect would also be observed in a detection task—that is, with judgments that do not require discriminating between left- and right-pointing arrows. Indeed, we found comparable effects in both the identification task and the detection task. This outcome is interpreted within a common-coding framework, which holds that stimulus processing and action control operate on the same codes.

8.
The present study compared the processing of direction for up and down arrows and for left and right arrows in visual displays. Experiment 1 demonstrated that it is more difficult to deal with left and right than with up and down when the two directions must be discriminated but not when they must simply be oriented to. Experiments 2 and 3 showed that telling left from right is harder regardless of whether the responses are manual or verbal. Experiment 4 showed that left-right discriminations take longer than up-down discriminations for judgments of position as well as direction. In Experiment 5 it was found that position information can intrude on direction judgments both within a dimension (e.g., a left arrow to the left of fixation is judged faster than a left arrow to the right of fixation) and across dimensions (e.g., judging vertically positioned left and right arrows is more difficult than judging horizontally positioned left and right arrows). There was indirect evidence in these experiments that although the spatial codes for up and down are symmetrical, the codes for left and right may be less so; this in turn could account for the greater difficulty of discriminating left from right.

9.
Using a lexical decision task, the authors investigated whether brain asymmetries in the detection of emotionally negative semantic associations arise only at a perceptually discriminative stage at which lexical analysis is accurate or can already be found at crude and incomplete levels of perceptual representation at which word-nonword discrimination is based solely on guessing. Emotionally negative and neutral items were presented near perceptual threshold in the left and right visual hemifields. Word-nonword discrimination performance as well as the bias to classify a stimulus as a "word" (whether or not it actually is a word) were assessed for a normal, horizontal stimulus presentation format (Experiment 1) and for an unusual, vertical presentation format (Experiment 2). Results show that while the two hemispheres are equally able to detect affective semantic associations at a prelexical processing stage (both experiments), the right hemisphere is superior at a postlexical, perceptually discriminative stage (Experiment 2). Moreover, the findings suggest that only an unusual, nonoverlearned stimulus presentation format allows adequate assessment of the right hemisphere's lexical-semantic skills.

10.
Strong associations between target stimuli and responses usually facilitate fast and effortless reactions. The present study investigated whether long-term associations between distractor stimuli and responses modulate behavior. In particular, distractor stimuli can affect behavior via distractor-based stimulus-response retrieval, a phenomenon called distractor-response binding: an ignored stimulus becomes temporarily associated with a response and retrieves it when the stimulus repeats. In a flanker task, participants ignored left- and right-pointing arrows and responded to a target letter either with left and right (strongly associated) responses or with upper and lower (weakly associated) responses. Binding effects were modulated by the strength of the long-term association between distractors and responses. If the association was strong (left- and right-pointing arrows with left and right responses), binding effects emerged, but only for compatible responses. If the long-term association between distractors and responses was weak (left- and right-pointing arrows with upper and lower responses), binding was weaker and not modulated by compatibility. In contrast, sequential compatibility effects were not modulated by the association strength between distractor and response. The results indicate that existing long-term associations between stimuli and responses may modulate the impact of an ignored stimulus on action control.

11.
The process-dissociation procedure was used to estimate the influence of spatial and form-based processing in the Simon task. Subjects made manual (left/right) responses to the direction of arrows (> or <) presented to the left or right of fixation. Manipulating the proportion of incongruent trials (e.g., a right-pointing arrow presented to the left of fixation) affected both the size and the direction of the Simon effect. To account for this pattern of data, we compared process estimates based on three possible relationships between spatial and form-based processing: independence, redundancy, and exclusivity. The independence model provided the best account of the data. Most telling was that independent form-based estimates were superior at predicting observed performance on arrows presented at fixation and did so consistently across conditions (rs > .80). The results provide evidence that the form ("what") and spatial location ("where") of a single stimulus can have functionally independent effects on performance. They also indicate the existence of two kinds of automaticity—an associative ("implicit learning") component that reflects prior S-R mappings and a nonassociative component that reflects the correspondence between stimulus and response codes.
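As a rough illustration of what an independence assumption between the two processes implies, the sketch below uses one common process-dissociation formulation (in the spirit of Lindsay and Jacoby's Stroop analysis): on congruent trials either process yields a correct response, while on incongruent trials the automatic spatial process must not drive the response. The equations, variable names, and accuracy values are assumptions for illustration and need not match the exact independence, redundancy, and exclusivity models compared in this study.

```python
def independence_estimates(p_congruent, p_incongruent):
    """Estimate a controlled form-based process C and an automatic
    spatial process A from accuracies, assuming independence:

        P(correct | congruent)   = C + A * (1 - C)   # either process suffices
        P(correct | incongruent) = C * (1 - A)       # A must not win against C

    Subtracting the two equations gives A; C then follows by division.
    """
    a = p_congruent - p_incongruent        # automatic (spatial) contribution
    c = p_incongruent / (1.0 - a) if a < 1.0 else float("nan")
    return c, a

# Hypothetical accuracies: 95% correct on congruent, 80% on incongruent trials.
controlled, automatic = independence_estimates(0.95, 0.80)
print(f"C (form-based) = {controlled:.3f}, A (spatial) = {automatic:.3f}")
```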

12.
Previous research on dual tasks has shown that, under some circumstances, actions impair the perception of action-consistent stimuli, whereas, under other conditions, actions facilitate the perception of action-consistent stimuli. We propose a new model to reconcile these contrasting findings. The planning and control model (PCM) of motorvisual priming proposes that action planning binds categorical representations of action features, so that their availability for perceptual processing is inhibited. Thus, the perception of categorically action-consistent stimuli is impaired during action planning. Movement control processes, on the other hand, integrate multisensory spatial information about the movement and therefore facilitate perceptual processing of spatially movement-consistent stimuli. We show that the PCM is consistent with a wider range of empirical data than previous models of motorvisual priming. Furthermore, the model yields previously untested empirical predictions. We also discuss how the PCM relates to motorvisual research paradigms other than dual tasks.

13.
Neurophysiological observations suggest that attending to a particular perceptual dimension, such as location or shape, engages dimension-related action networks, such as those for reaching and prehension. Here we reversed the perspective and hypothesized that activating action systems may prime the processing of stimuli defined on perceptual dimensions related to these actions. Subjects prepared for a reaching or grasping action and, before carrying it out, were presented with location- or size-defined stimulus events. As predicted, performance on the stimulus event varied with action preparation: planning a reaching action facilitated detecting deviants in location sequences, whereas planning a grasping action facilitated detecting deviants in size sequences. These findings support the theory of event coding, which claims that perceptual codes and action plans share a common representational medium, which presumably involves the human premotor cortex.

14.
Preparation provided by visual location cues is known to speed up behavior. However, the role of concurrent saccades in response to visual cues remains unclear. In this study, participants performed a spatial precueing task by pressing one of four response keys with one of four fingers (two of each hand) while eye movements were monitored. Prior to the stimulus, we presented a neutral cue (baseline), a hand cue (corresponding to left vs. right positions), or a finger cue (corresponding to inner vs. outer positions). Participants either remained fixated on a central fixation point or moved their eyes freely. The results demonstrated that saccades during the cueing interval altered the pattern of cueing effects. Finger cueing trials in which saccades were spatially incompatible (vs. compatible) with the subsequently required manual response exhibited slower manual RTs. We propose that interference between saccades and manual responses affects manual motor preparation.

15.
It is assumed that there are hemispheric differences in the type of information available for the processing of word meanings, e.g., categorical or associative information. In the present experiment, we used a semantic priming paradigm to examine whether perceptual or conceptual properties of word meanings would be associated with the left or the right hemisphere. The experiment also examined the time course of activation of these properties across the hemispheres, using short and long stimulus onset asynchronies. The results indicated that perceptual information is available only in the right hemisphere, at an early rather than a late stage of target processing, while conceptual information is available in both hemispheres at both early and later stages of target processing. It is suggested that the imagery system in the right hemisphere may contribute to the perceptual priming observed in this hemisphere.

16.
Previous research suggests that past and future temporal concepts are spatially represented from left to right along a mental line, and that these concepts can both prime motor responses to left or right space and direct visual spatial attention. The present study investigated the nature of this space-time conceptual metaphor in different auditory tasks. In the first experiment, subjects categorized time-related words (past or future) that were presented binaurally. In the second experiment, subjects detected left-ear or right-ear targets following time-related words. Similar space-time compatibility effects were found in both experiments. Our results demonstrate that the activation of temporal concepts can both prime motor responses to left or right space and influence the orienting of auditory spatial attention, suggesting that the modality of the stimulus input is unimportant for the left-right mapping of time. These results are explained by the "intermediate coding" account.

17.
The assumptions tested were that the relative contribution of each hemisphere to reading alters with experience and that experience increases suppression of the simultaneous use of identical strategies by the non-dominant hemisphere. Males who were reading disabled and phonologically impaired, reading disabled and phonologically normal, or without reading disability were presented with familiar words, orthographically correct pseudowords, and orthographically incorrect non-words for lexical decision. Accuracy and response times in all groups showed a shift from no asymmetry in processing non-words to a stable left-hemisphere advantage and clear suppression of the right hemisphere in processing words. In the pseudoword condition, accuracy scores were higher when both hemispheres were free to engage, especially in those with a reading disability, and responses slowed in the phonologically impaired group, but not in the phonologically normal groups, when the right hemisphere was disengaged. As familiar words typically invoke lexical processing by both hemispheres, while pseudowords invoke lexical processing by the right hemisphere and non-lexical processing by the left hemisphere, and as non-lexical processing is weak in the phonologically impaired, the results support the assumptions that were tested.

18.
Research on the lateralisation of brain functions for emotion has yielded different results as a function of whether it is the experience, expression, or perceptual processing of emotion that is examined. Further, for the perception of emotion there appear to be differences between the processing of verbal and nonverbal stimuli. The present research examined the hemispheric asymmetry in the processing of verbal stimuli varying in emotional valence. Participants performed a lexical decision task for words varying in affective valence (but equated in terms of arousal) that were presented briefly to the right or left visual field. Participants were significantly faster at recognising positive words presented to the right visual field/left hemisphere. This pattern did not occur for negative words (and was reversed for high arousal negative words). These results suggest that the processing of verbal stimuli varying in emotional valence tends to parallel hemispheric asymmetry in the experience of emotion.

19.
Research on the lateralisation of brain functions for emotion has yielded different results as a function of whether it is the experience, expression, or perceptual processing of emotion that is examined. Further, for the perception of emotion there appear to be differences between the processing of verbal and nonverbal stimuli. The present research examined the hemispheric asymmetry in the processing of verbal stimuli varying in emotional valence. Participants performed a lexical decision task for words varying in affective valence (but equated in terms of arousal) that were presented briefly to the right or left visual field. Participants were significantly faster at recognising positive words presented to the right visual field/left hemisphere. This pattern did not occur for negative words (and was reversed for high arousal negative words). These results suggest that the processing of verbal stimuli varying in emotional valence tends to parallel hemispheric asymmetry in the experience of emotion.

20.
Perceptual fluency is the subjective experience of ease with which an incoming stimulus is processed. Although perceptual fluency is assessed by speed of processing, it remains unclear how objective speed is related to subjective experiences of fluency. We present evidence that speed at different stages of the perceptual process contributes to perceptual fluency. In an experiment, figure-ground contrast influenced the detection of briefly presented words, but not their identification at longer exposure durations. Conversely, the font in which a word was written influenced identification, but not detection. Both contrast and font influenced subjective fluency. These findings suggest that speed of processing at different stages is condensed into a unified subjective experience of perceptual fluency.
