Similar Articles
20 similar articles found
1.
To investigate how tactile and proprioceptive information are used in haptic object discrimination, we conducted a haptic search task in which participants had to search for either a cylinder, a bar or a rotated cube within a grid of aligned cubes. Tactile information from one finger is enough to detect a cylinder amongst the cubes. For detecting a bar or a rotated cube amongst cubes, touch alone is not enough. For the rotated cube this is evident because its shape is identical to that of the non-targets, so proprioception must provide information about the orientation of the fingers and hand when touching it. For the bar, one either needs proprioceptive information about the distance and direction of a single finger’s movements along the surfaces, or proprioceptive information from several fingers when they touch it simultaneously. When using only one finger, search times for the bar were much longer than those for the other two targets. When the whole hand or both hands were used, the search times were similar for all shapes. Most errors were made when searching for the rotated cube, probably due to systematic posture-related biases in judging orientation on the basis of proprioception. The results suggest that tactile and proprioceptive information are readily combined for shape discrimination.

2.
We experience the shape of objects in our world largely by way of our vision and touch, but the availability and integration of information between the senses remains an open question. The research presented in this article examines the effect of stimulus complexity on visual, haptic and crossmodal discrimination. Using sculpted three-dimensional objects whose features vary systematically, we perform a series of three experiments to determine perceptual equivalence as a function of complexity. Two unimodal experiments (vision-only and touch-only) and one crossmodal experiment investigating the availability of information across the senses were performed. We find that, for the class of stimuli used, subjects were able to visually discriminate them reliably across the entire range of complexity, while the experiments involving haptic information show a marked decrease in performance as the objects become more complex. Performance in the crossmodal condition appears to be constrained by the limits of the subjects’ haptic representation, but the combination of the two sources of information is of some benefit over vision alone when comparing the simpler, low-frequency stimuli. This result shows that there is crossmodal transfer, and therefore perceptual equivalency, but that this transfer is limited by the object’s complexity.

3.
The purpose of this study was to examine the subjective dimensionality of tactile surface texture perception. Seventeen tactile stimuli, such as wood, sandpaper, and velvet, were moved across the index finger of the subject, who sorted them into categories on the basis of perceived similarity. Multidimensional scaling (MDS) techniques were then used to position the stimuli in a perceptual space on the basis of combined data of 20 subjects. A three-dimensional space was judged to give a satisfactory representation of the data. Subjects’ ratings of each stimulus on five scales representing putative dimensions of perceived surface texture were then fitted by regression analysis into the MDS space. Roughness-smoothness and hardness-softness were found to be robust and orthogonal dimensions; the third dimension did not correspond closely with any of the rating scales used, but post hoc inspection of the data suggested that it may reflect the compressional elasticity (“springiness”) of the surface.
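The analysis pipeline this abstract describes (similarity sorting, non-metric MDS, then regressing rating scales into the recovered space) can be sketched as follows. This is an illustrative reconstruction with fabricated data using scikit-learn, not the authors' original software or stimuli; the dissimilarity here is simply a hypothetical proportion of subjects who sorted a pair of surfaces into different categories.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
n_stimuli = 17  # as in the study: 17 tactile surfaces

# Hypothetical dissimilarities: proportion of subjects who placed each
# pair of surfaces in different categories (symmetric, zero diagonal).
diss = rng.random((n_stimuli, n_stimuli))
diss = (diss + diss.T) / 2
np.fill_diagonal(diss, 0.0)

# Non-metric MDS into a three-dimensional perceptual space.
mds = MDS(n_components=3, metric=False, dissimilarity="precomputed",
          random_state=0)
coords = mds.fit_transform(diss)

# Fit one putative rating scale (e.g., roughness-smoothness) into the
# space; the normalized regression weights give that scale's direction.
roughness_ratings = rng.random(n_stimuli)  # hypothetical mean ratings
fit = LinearRegression().fit(coords, roughness_ratings)
direction = fit.coef_ / np.linalg.norm(fit.coef_)
print("stress:", round(mds.stress_, 3), "roughness direction:", direction)
```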

4.
Peripheral cues are thought to facilitate responses to stimuli presented at the same location because they lead to exogenous attention shifts. Facilitation has been observed in numerous studies of visual and auditory attention, but there have been only four demonstrations of tactile facilitation, all in studies with potential confounds. Three studies used a spatial (finger versus thumb) discrimination task, where the cue could have provided a spatial framework that might have assisted the discrimination of subsequent targets presented on the same side as the cue. The final study circumvented this problem by using a non-spatial discrimination; however, the cues were informative and interspersed with visual cues which may have affected the attentional effects observed. In the current study, therefore, we used a non-spatial tactile frequency discrimination task following a non-informative tactile white noise cue. When the target was presented 150 ms after the cue, we observed faster discrimination responses to targets presented on the same side compared to the opposite side as the cue; by 1000 ms, responses were significantly faster to targets presented on the opposite side to the cue. Thus, we demonstrated that tactile attentional facilitation can be observed in a non-spatial discrimination task, under unimodal conditions and with entirely non-predictive cues. Furthermore, we provide the first demonstration of significant tactile facilitation and tactile inhibition of return within a single experiment.

5.
Seeing one's own body (either directly or indirectly) can influence visuotactile crossmodal interactions. Recently, it has been shown that even viewing a simple line drawing of a hand can also modulate such crossmodal interactions, as if the picture of the hand somehow corresponds to (or primes) the participants' own hand. Alternatively, however, it could be argued that the modulatory effects of viewing the picture of a hand on visuotactile interactions might simply be attributed to cognitive processes such as the semantic referral to the relevant body part or to the orientation cues provided by the hand picture instead. In the present study, we evaluated these different interpretations of the hand picture effect. Participants made speeded discrimination responses to the location of brief vibrotactile targets presented to either the tip or base of their forefinger, while trying to ignore simultaneously-presented visual distractors presented to either side of central fixation. We compared the modulatory effect of the picture of a hand with that seen when the visual distractors were presented next to words describing the tip and base of the forefinger (Experiment 1), or were superimposed over arrows which provided another kind of directional cue (Experiment 2). Tactile discrimination performance was modulated in the hand picture condition, but not in the word or arrow conditions. These results therefore suggest that visuotactile interactions are specifically modulated by the image of the hand rather than by cognitive cues such as semantic referral to the relevant body sites and/or any visual orientation cues provided by the picture of a hand.

6.
The aim of this study was to examine the occurrence of a so-called time-shrinking illusion in the tactile modality, an illusion that had so far been tested mainly with auditory and visual stimuli. We examined whether the perception of an empty time interval marked by two brief tactile stimuli, S (240 ms), would be influenced by the presence of a preceding time interval, P (160, 240, or 320 ms). Results showed that S was underestimated when P was shorter than S. This underestimation resembled a perceptual assimilation between P and S, but S was not overestimated when P was longer, so the underestimation was instead interpreted as a manifestation of the time-shrinking illusion.

7.
Individuals often describe objects in their world in terms of perceptual dimensions that span a variety of modalities: the visual (e.g., brightness: dark–bright), the auditory (e.g., loudness: quiet–loud), the gustatory (e.g., taste: sour–sweet), the tactile (e.g., hardness: soft–hard) and the kinaesthetic (e.g., speed: slow–fast). We ask whether individuals use perceptual dimensions to differentiate emotions from one another. Participants in two studies (one where respondents reported on abstract emotion concepts and a second where they reported on specific emotion episodes) rated the extent to which features anchoring 29 perceptual dimensions (e.g., temperature, texture and taste) are associated with 8 emotions (anger, fear, sadness, guilt, contentment, gratitude, pride and excitement). Results revealed that in both studies perceptual dimensions differentiate positive from negative emotions and high arousal from low arousal emotions. They also differentiate among emotions that are similar in arousal and valence (e.g., high arousal negative emotions such as anger and fear). Specific features that anchor particular perceptual dimensions (e.g., hot vs. cold) are also differentially associated with emotions.

8.
Previous research has shown that subjects appear unable to restrict processing to a single finger and ignore a stimulus presented to an adjacent finger. Furthermore, the evidence suggests that, at least for moving stimuli, an adjacent nontarget is fully processed to the level of incipient response activation. The present study replicated and expanded upon these original findings. The results of Experiment 1 showed that an equally large response-competition effect occurred when the nontarget was presented to adjacent and nonadjacent fingers (on the same hand). The results of Experiment 2 showed that the effects observed in Experiment 1 (and in previous studies) were also obtained with stationary stimuli. Although small, there was some indication in the results of Experiment 2 that interference may dissipate more rapidly with distance with stationary stimuli. An additional finding was that interference effects were observed in both experiments with temporal separations between the target and nontarget of up to 100 msec. In Experiment 3, target and nontarget stimuli were presented to opposite hands. Although reduced, interference was still evident with target and nontarget stimuli presented to opposite hands. Varying the physical distance between hands did not produce any change in the amount of interference. The results suggest that the focus of attention on the skin extends nearly undiminished across the fingers of one hand and is not dependent upon the physical distance between sites of stimulation.

9.
Crossing 2 adjacent fingers produces a distorted perception when 2 tactile stimuli are touched to the crossed fingertips. This is because a given pair of fingers has a functional range of action within which spatial perception is correct (uncrossed fingers) and beyond which perceived location of tactile stimuli is wrong (crossed fingers). The present study tested the possibility that this range of action could be modified following a long-lasting crossing. Therefore, the functional range of action of the 2nd and 3rd fingers was tested every month in 6 volunteers who maintained the finger crossing for as long as 6 months. The results show that after a variable period of time the range of action was enlarged such that spatial perception was correct with crossed fingers as well. The perceptual organization of the human hand therefore depends on experience and is not rigidly determined on a genetic basis.

10.
Previous research has shown that subjects appear unable to restrict processing to a single finger and ignore a stimulus presented to an adjacent finger. Furthermore, the evidence suggests that, at least for moving stimuli, an adjacent nontarget is fully processed to the level of incipient response activation. The present study replicated and expanded upon these original findings. The results of Experiment 1 showed that an equally large response-competition effect occurred when the nontarget was presented to adjacent and nonadjacent fingers (on the same hand). The results of Experiment 2 showed that the effects observed in Experiment 1 (and in previous studies) were also obtained with stationary stimuli. Although small, there was some indication in the results of Experiment 2 that interference may dissipate more rapidly with distance with stationary stimuli. An additional finding was that interference effects were observed in both experiments with temporal separations between the target and nontarget of up to 100 msec. In Experiment 3, target and nontarget stimuli were presented to opposite hands. Although reduced, interference was still evident with target and nontarget stimuli presented to opposite hands. Varying the physical distance between hands did not produce any change in the amount of interference. The results suggest that the focus of attention on the skin extends nearly undiminished across the fingers of one hand and is not dependent upon the physical distance between sites of stimulation.

11.
Phenomenology and the Cognitive Sciences - Perceptual constancy, often defined as the perception of stable features under changing conditions, goes hand in hand with variation in how things look. A...

12.
Haptic perception of parallelity in the midsagittal plane.
Previous studies [Perception 28 (1999) 1001; Perception 28 (1999) 781] on the haptic perception of parallelity on a horizontal plane showed that what subjects haptically perceive as being parallel deviates considerably from what is physically parallel. The deviations could be described with a subject-dependent orientation gradient in the left-right direction. The gradients found in the bimanual conditions were significantly larger (about 70%) than those in the unimanual conditions. The questions to be answered in the present study are the following: (1) Does the haptic perception of parallelity in the midsagittal plane also show systematic deviations from veridicality? (2) Are the unimanual and bimanual performances again quantitatively but not qualitatively different? The set-up consisted of a plate positioned in the midsagittal plane of the subject. The subject touched the right side of the plate with his/her right hand and the left side with the left hand. The results show again large systematic deviations. The major part of the deviations can be described by means of a subject-dependent orientation gradient in the vertical direction. The quantitative (but not qualitative) difference between the unimanual and the bimanual conditions is much larger in the midsagittal plane than in the horizontal plane.
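The "orientation gradient" used to summarize these deviations can be illustrated with a small fit: the signed deviation of the test bar from veridically parallel is modelled as a linear function of the distance between reference and test locations, and the slope of that line is the gradient. The numbers below are invented for illustration; this is a hedged sketch of the idea, not the authors' exact fitting procedure.

```python
import numpy as np

# Hypothetical distances between reference and test bar (cm; negative =
# test bar on the other side of the reference) and the signed deviation
# from physically parallel (deg) measured for one subject.
distance = np.array([-60.0, -40.0, -20.0, 20.0, 40.0, 60.0])
deviation = np.array([-22.0, -15.0, -8.0, 7.0, 16.0, 23.0])

# Least-squares line: the slope is the subject's orientation gradient.
gradient, offset = np.polyfit(distance, deviation, 1)
print(f"orientation gradient ≈ {gradient:.2f} deg/cm, offset ≈ {offset:.1f} deg")
```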

13.
Kappers AM. Acta Psychologica, 2004, 117(3): 333-340.
The influence of egocentric and allocentric reference frames on performance in haptic spatial tasks was tested in three conditions. Blindfolded subjects had to make two bars haptically parallel, perpendicular or mirrored in the midsagittal plane. The hypothesis is that the contributions of egocentric and allocentric reference frames are combined, resulting in settings that lie in between the allo-representation and the ego-representation. This leads to different predictions for the outcome of different conditions. All findings were consistent with the hypothesis. In addition, for subjects with large deviations a reversal of the oblique effect was found once again, which provides extra support for the hypothesis.

14.
Vatakis, A. and Spence, C. (in press) [Crossmodal binding: Evaluating the 'unity assumption' using audiovisual speech stimuli. Perception & Psychophysics] recently demonstrated that when two briefly presented speech signals (one auditory and the other visual) refer to the same audiovisual speech event, people find it harder to judge their temporal order than when they refer to different speech events. Vatakis and Spence argued that the 'unity assumption' facilitated crossmodal binding on the former (matching) trials by means of a process of temporal ventriloquism. In the present study, we investigated whether the 'unity assumption' would also affect the binding of non-speech stimuli (video clips of object action or musical notes). The auditory and visual stimuli were presented at a range of stimulus onset asynchronies (SOAs) using the method of constant stimuli. Participants made unspeeded temporal order judgments (TOJs) regarding which modality stream had been presented first. The auditory and visual musical and object action stimuli were either matched (e.g., the sight of a note being played on a piano together with the corresponding sound) or else mismatched (e.g., the sight of a note being played on a piano together with the sound of a guitar string being plucked). However, in contrast to the results of Vatakis and Spence's recent speech study, no significant difference in the accuracy of temporal discrimination performance for the matched versus mismatched video clips was observed. Reasons for this discrepancy are discussed.
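For readers unfamiliar with the method of constant stimuli used here, the sketch below shows the standard way such temporal order judgments are summarized: a cumulative Gaussian is fitted to the proportion of "visual stream first" responses across SOAs, yielding the point of subjective simultaneity (PSS) and a just noticeable difference (JND). The data are fabricated for illustration; this is not the study's analysis code.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Hypothetical SOAs (ms; negative = auditory stream led) and the
# proportion of "vision first" responses at each SOA.
soa = np.array([-300, -200, -100, -50, 0, 50, 100, 200, 300], dtype=float)
p_vision_first = np.array([0.05, 0.10, 0.25, 0.40, 0.55, 0.70, 0.85, 0.95, 0.98])

def cum_gauss(x, pss, sigma):
    """Cumulative Gaussian psychometric function."""
    return norm.cdf(x, loc=pss, scale=sigma)

(pss, sigma), _ = curve_fit(cum_gauss, soa, p_vision_first, p0=(0.0, 100.0))
jnd = sigma * norm.ppf(0.75)  # half the 25%-75% interval, a common JND convention
print(f"PSS = {pss:.1f} ms, JND = {jnd:.1f} ms")
```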

15.
This work investigated the accuracy of the perception of the main orientations (i.e., vertical and horizontal orientations) with the kinesthetic modality--a modality not previously used in this field of research. To further dissociate the influence of the postural and physical verticals, two body positions were explored (supine and upright). Twenty-two blindfolded participants were asked to set, as accurately as possible, a rod to both physical orientations while assuming one of the two body positions. The horizontal was perceived more accurately than the vertical orientation in the upright position but not in the supine position. Essentially, there were no differences in the supine position because the adjustments to the physical vertical were much more accurate than they were in the upright position. The lower accuracy in the estimation of the vertical orientation observed in the upright position might be linked to the dynamics associated with the maintenance of posture.

16.
Kappers AM. Acta Psychologica, 2003, 114(2): 131-145.
Previous studies showed that what subjects haptically perceive as parallel deviates substantially from what is actually physically parallel [Perception 28 (1999) 1001; Acta Psychol. 109 (2002) 25; Perception 28 (1999) 781]. It also turned out that the deviations were strongly subject-dependent. It was hypothesized that what is haptically parallel is decided in a frame of reference intermediate to an allocentric and an egocentric one. The purposes of the present study were to collect more evidence for this hypothesis and to investigate the factor(s) that determines the specific weighting between the two reference frames. We found a highly significant reversal of a haptic oblique effect (in context: larger systematic deviations for oblique orientations) for subjects with large deviations. This reversal provides convincing evidence that an intermediate frame of reference is used for the decision of haptic parallelity. Contrary to common expectation, several factors that might have been of influence on the weighting of the two frames of reference, such as arm length, arm span, and shoulder width, turned out to be irrelevant. Surprisingly, the only factors that seem to be of influence are gender and job experience or education.

17.
In this paper, results of a free sorting task of 124 different material samples are analysed using multidimensional scaling. The relevant number of dimensions for haptic perception of materials is estimated to be 4. In addition, the haptic material space is calibrated by means of physical measurements of compressibility and roughness. The relation between objective and perceived compressibility and that between objective and perceived roughness could be described by an exponential function.
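The exponential relation mentioned at the end of this abstract can be estimated, for example, with an ordinary least-squares curve fit. The sketch below uses made-up compressibility values and ratings, not the study's data, and is only one plausible way to perform such a calibration.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical physical compressibility values and mean perceived ratings.
physical = np.array([0.1, 0.2, 0.4, 0.8, 1.6, 3.2])
perceived = np.array([1.1, 1.2, 1.4, 1.8, 3.3, 11.0])

def exponential(x, a, b):
    """Exponential relation between a physical and a perceived quantity."""
    return a * np.exp(b * x)

(a, b), _ = curve_fit(exponential, physical, perceived, p0=(1.0, 1.0))
print(f"perceived ≈ {a:.2f} * exp({b:.2f} * physical)")
```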

18.
These three experiments employed rectangles in stimulus identification tasks. Consistent with the stimulus set used by Weintraub, the rectangles were generated by modifying a square. Across experiments, the number of stimulus/response alternatives was varied (two-, three-, and four-choice tasks). In the two-choice task, redundancy gain for the positively correlated set was just as large as for the negatively correlated set. In contrast, reaction time was faster for the negatively correlated set than for the positively correlated set in the three-choice task (after extended practice) and in the four-choice task. Considered in the context of previous research, the data support two conclusions. First, the initial perceptual processing of rectangles is accomplished by separate dimensional analyzers operating in parallel. Second, observers adopt a different decision strategy for the negatively correlated set than for the positively correlated and the single dimension sets when the number of stimulus/response alternatives is increased.

19.
While an increasing number of behavioral studies examining spatial cognition use experimental paradigms involving disorientation, the process by which one becomes disoriented is not well explored. The current study examined this process using a paradigm in which participants were blindfolded and underwent a succession of 70° or 200° passive, whole body rotations around a fixed vertical axis. After each rotation, participants used a pointer to indicate either their heading at the start of the most recent turn or their heading at the start of the current series of turns. Analyses showed that in both cases, mean pointing errors increased gradually over successive turns. In addition to the gradual loss of orientation indicated by this increase, analysis of the pointing errors also showed evidence of occasional, abrupt loss of orientation. Results indicate multiple routes from an oriented to a disoriented state, and shed light on the process of becoming disoriented.

20.
In this study, we are interested in the following two questions: (1) how does perceived roughness correlate with physical roughness, and (2) how do visually and haptically perceived roughness compare? We used 96 samples of everyday materials, such as wood, paper, glass, sandpaper, ceramics, foams, textiles, etc. The samples were characterized by various different physical roughness measures, all determined from accurately measured roughness profiles. These measures consisted of spectral densities measured at different spatial scales and industrial roughness standards (Ra, Rq and Rz). In separate haptic and visual conditions, 12 naïve subjects were instructed to order the 96 samples according to perceived roughness. The rank orders of both conditions were correlated with the various physical roughness measures. With most physical roughness measures, haptic and visual correspondence with the physical ordering was about equal. With others, haptic correspondence was slightly better. It turned out that different subjects ordered the samples using different criteria; for some subjects the correlation was better with roughness measures that were based on higher spatial frequencies, while others seemed to be paying more attention to the lower spatial frequencies. Also, physical roughness was not found to be the same as perceived roughness.
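The industrial roughness standards named in this abstract (Ra, Rq, Rz) are simple functions of a measured height profile, and a Spearman rank correlation is a natural way to compare a physical ordering with a perceived rank order. The sketch below uses a synthetic profile and invented ratings, with Rz simplified to the mean peak-to-valley height over five profile segments (one of several conventions), so it is illustrative rather than a reproduction of the study's measurements.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
profile = rng.normal(scale=5.0, size=1000)  # hypothetical height profile (um)
z = profile - profile.mean()                # deviations from the mean line

r_a = np.mean(np.abs(z))                    # Ra: mean absolute deviation
r_q = np.sqrt(np.mean(z ** 2))              # Rq: root-mean-square deviation
r_z = np.mean([seg.max() - seg.min()        # Rz: mean peak-to-valley height
               for seg in np.array_split(z, 5)])

# Hypothetical physical Ra values and a perceived-roughness rank order
# for a handful of samples.
physical_ra = np.array([0.8, 2.5, 1.1, 4.0, 3.2])
perceived_rank = np.array([1, 3, 2, 5, 4])
rho, p = spearmanr(physical_ra, perceived_rank)
print(f"Ra={r_a:.2f}, Rq={r_q:.2f}, Rz={r_z:.2f}, Spearman rho={rho:.2f}")
```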
