Full-text access type
Paid full text | 373 articles |
Free | 37 articles |
Free (domestic) | 17 articles |
Subject classification
427 articles |
Publication year
2025 | 1 article |
2024 | 1 article |
2023 | 3 articles |
2022 | 4 articles |
2021 | 17 articles |
2020 | 20 articles |
2019 | 14 articles |
2018 | 9 articles |
2017 | 19 articles |
2016 | 18 articles |
2015 | 16 articles |
2014 | 29 articles |
2013 | 60 articles |
2012 | 29 articles |
2011 | 37 articles |
2010 | 12 articles |
2009 | 28 articles |
2008 | 22 articles |
2007 | 15 articles |
2006 | 13 articles |
2005 | 12 articles |
2004 | 14 articles |
2003 | 8 articles |
2002 | 9 articles |
2001 | 6 articles |
2000 | 3 articles |
1998 | 2 articles |
1997 | 3 articles |
1996 | 1 article |
1993 | 1 article |
1976 | 1 article |
Sort order: 427 results found, search time 0 ms
141.
Combinations of different sensory words produce mismatch expressions such as "smooth color" and "red touch", in contrast to normal expressions such as "red color" and "smooth touch". Three experiments on these sensory mismatch expressions are reported. Experiment 1 revealed that (i) mismatch expressions were less comprehensible than normal expressions, and that (ii) mismatch expressions fell into two patterns: high-comprehensible mismatch expressions (HighCME, e.g., "smooth color") and low-comprehensible mismatch expressions (LowCME, e.g., "red touch"). Experiment 2 revealed that the mismatch expressions produced a significantly greater N400 amplitude than the normal expressions. Experiment 3 suggested that the difference between HighCME and LowCME was reflected in a later latency band or in a topographical difference of the N400, although the statistical significance was marginal. It is argued that the processes that integrate linguistic elements (e.g., combining adjectives and nouns) are not homogeneous.
142.
When we move to music we feel the beat, and this feeling can shape the sound we hear. Previous studies have shown that when people listen to a metrically ambiguous rhythm pattern, moving the body on a certain beat (adults by actively bouncing in synchrony with the experimenter, and infants by being bounced passively in the experimenter's arms) can bias their auditory metrical representation so that they interpret the pattern in a corresponding metrical form [Phillips-Silver, J., & Trainor, L. J. (2005). Feeling the beat: Movement influences infant rhythm perception. Science, 308, 1430; Phillips-Silver, J., & Trainor, L. J. (2007). Hearing what the body feels: Auditory encoding of rhythmic movement. Cognition, 105, 533-546]. The present studies show that in adults, as in infants, metrical encoding of rhythm can be biased by passive motion. Furthermore, because movement of the head alone affected auditory encoding whereas movement of the legs alone did not, we propose that vestibular input may play a key role in the effect of movement on auditory rhythm processing. We discuss possible cortical and subcortical sites for the integration of auditory and vestibular inputs that may underlie the interaction between movement and auditory metrical rhythm perception.
143.
Saying What You Don't Mean  Cited by: 1 (self-citations: 0, other citations: 1)
144.
It is not unusual to find it stated as fact that the left hemisphere is specialized for processing rapid, or temporal, aspects of sound, and that the dominance of the left hemisphere in the perception of speech may be a consequence of this specialization. In this review we trace the history of this claim and weigh the evidence behind it. We demonstrate that, rather than a supposed sensitivity of the left temporal lobe to the acoustic properties of speech, it is the right temporal lobe that shows a marked preference for certain properties of sounds, such as longer durations or variations in pitch. We finish by outlining alternative factors that contribute to the left lateralization of speech perception.
145.
Although, in everyday life, patients with attention deficit hyperactivity disorder (ADHD) are frequently distracted by goal-irrelevant affective stimuli, little is known about the neural and behavioral substrates underlying this emotional distractibility. Because some of the most important brain responses associated with the sudden onset of an emotional distracter are characterized by their early latency onset and short duration, we addressed this issue by using a temporally agile neural signal capable of detecting and distinguishing them. Specifically, scalp event-related potentials (ERPs) were recorded while 20 boys with ADHD combined type and 20 healthy comparison subjects performed a digit categorization task during the presentation of three types of irrelevant, distracting stimuli: arousing negative (A−), neutral (N), and arousing positive (A+). Behavioral data showed that emotional distracters (both A− and A+) were associated with longer reaction times than neutral ones in the ADHD group, whereas no differences were found in the control group. ERP data revealed that, compared with control subjects, boys with ADHD showed larger anterior N2 amplitudes for emotional than for neutral distracters. Furthermore, regression analyses between ERP data and subjects' emotional ratings of the distracting stimuli showed that only in the ADHD group was emotional arousal (ranging from calming to arousing) associated with the anterior N2: its amplitude increased as the arousal content of the visual distracter increased. These results suggest that boys with ADHD are more vulnerable than control subjects to the distracting effects of irrelevant emotional stimuli. The present study provides the first data on the neural substrates underlying emotional distractibility in ADHD.
146.
Vernat, J.-P. & Gordon, M. S. (2010). Indirect interception actions by blind and visually impaired perceivers: Echolocation for interceptive actions. Scandinavian Journal of Psychology, 51, 75–83.
This research examined the acoustic information used to support interceptive actions by the blind. Congenitally blind and severely visually impaired participants (all wearing an opaque, black eye-mask) were asked to listen to a target ball rolling down a track. In response, participants rolled their own ball along a perpendicular path to intercept the target. To better understand what information was used, the echoic conditions and rolling dynamics of the target were varied across test sessions. In addition, the rolling speed of the target and the distance of the participant from the target were varied across trials. Results demonstrated that participants tended to perform most accurately at moderate speeds and distances, overestimating the target's arrival at the fastest speed and underestimating it at the slowest speed. However, changes to the target's dynamics, that is, the amount of deceleration it underwent on approach, did not strongly influence performance. Echoic conditions were found to affect performance, as participants were slightly more accurate in conditions with faster, higher-intensity echoes. Based on these results, the blind individuals in this research appeared to be using spatial and temporal cues to coordinate their interceptive actions.
147.
Batty, M., Meaux, E., Wittemeyer, K., Rogé, B., & Taylor, M. J. (2011). Journal of Experimental Child Psychology, 109(4), 430-444.
Social deficits are one of the most striking manifestations of autism spectrum disorders (ASDs). Among these social deficits, the recognition and understanding of emotional facial expressions has been widely reported to be affected in ASDs. We investigated emotional face processing in children with and without autism using event-related potentials (ERPs). High-functioning children with autism (n = 15, mean age = 10.5 ± 3.3 years) completed an implicit emotional task while visual ERPs were recorded. Two groups of typically developing children (chronological age-matched and verbal equivalent age-matched [both ns = 15, mean age = 7.7 ± 3.8 years]) also participated in this study. The early ERP responses to faces (P1 and N170) were delayed, and the P1 was smaller in children with autism than in typically developing children of the same chronological age, revealing that the first stages of emotional face processing are affected in autism. However, when matched by verbal equivalent age, only P1 amplitude remained affected in autism. Our results suggest that the emotional and facial processing difficulties in autism could start from atypicalities in visual perceptual processes involving rapid feedback to primary visual areas and subsequent holistic processing.
148.
Previous studies have suggested that auditory hallucination is closely related to thought insertion. In this study, we investigated the relationship between the external misattribution of thought and auditory hallucination-like experiences. We used the AHES-17, which measures auditory hallucination-like experiences in healthy people, and the Deese–Roediger–McDermott paradigm, in which false alarms to critical lures are regarded as spontaneous external misattribution of thought. We found that critical lures elicited more false alarms as AHES-17 scores increased, and that AHES-17 scores predicted the rate of false memory for critical lures. Furthermore, we found that the relationship between AHES-17 scores and the rates of false alarms to critical lures was strictly linear. Therefore, individual differences in auditory hallucination-like experiences appear to be highly related to the external misattribution of thought. We discuss these results from the perspective of the sense of agency over thought.
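A strictly linear relationship of this kind is typically checked with a simple least-squares regression of false-alarm rate on questionnaire score. The sketch below illustrates that analysis with invented data; the participant numbers, scores, and rates are not from the study:

```python
# Hypothetical illustration: ordinary least-squares fit of critical-lure
# false-alarm rate on AHES-17 score. All data points are invented for
# demonstration purposes and do not come from the study.

def ols_fit(xs, ys):
    """Return slope, intercept, and R^2 for a simple linear regression."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    r2 = 1 - ss_res / ss_tot
    return slope, intercept, r2

# Invented AHES-17 scores and critical-lure false-alarm rates, one pair
# per hypothetical participant.
ahes = [10, 14, 18, 22, 26, 30, 34, 38, 42, 46]
false_alarm_rate = [0.12, 0.15, 0.19, 0.22, 0.27, 0.30, 0.33, 0.38, 0.41, 0.45]

slope, intercept, r2 = ols_fit(ahes, false_alarm_rate)
print(f"slope={slope:.4f}, intercept={intercept:.3f}, R^2={r2:.3f}")
```

With a strictly linear relationship, R^2 approaches 1 and the slope is reliably positive, which is the pattern the abstract describes.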
149.
To examine whether anticipatory attention, or expectancy, is a cognitive process that is automatic or one that requires conscious control, we employed a paired-stimulus event-related potential (ERP) paradigm during the transition to sleep. The slow negative ERP wave observed between two successive stimuli, the Contingent Negative Variation (CNV), reflects attention and expectancy directed to the second stimulus. Thirteen good sleepers were instructed to respond to the second stimulus in a pair during waking sessions. In a non-response paradigm modified for sleep, participants then fell asleep while tones played. As expected, N1 decreased and P2 increased in amplitude systematically with the loss of consciousness at sleep onset; the CNV became increasingly positive. Sleep onset latency was correlated with the amplitude of the CNV. The systematic attenuation of the CNV waveform at sleep onset and its absence in sleep indicate that anticipatory attention requires endogenous conscious control.
150.
The perception of time is heavily influenced by attention and memory, both of which change over the lifespan. In the current study, children (8 yrs), young adults (18–25 yrs), and older adults (60–75 yrs) were tested on a duration bisection procedure using 3- and 6-s auditory and visual signals as anchor durations. During test, participants were exposed to a range of intermediate durations, and the task was to indicate whether test durations were closer to the "short" or "long" anchor. All groups reproduced the classic finding that "sounds are judged longer than lights". This effect was greater for older adults and children than for young adults, but for different reasons. Replicating previous results, older adults made similar auditory judgments to young adults, but underestimated the duration of visual test stimuli. Children showed the opposite pattern, with similar visual judgments to young adults but overestimation of auditory stimuli. Psychometric functions were analyzed using the Sample Known Exactly-Mixed Memory quantitative model of the Scalar Timing Theory of interval timing. Results indicate that children show an auditory-specific deficit in reference memory for the anchors, rather than a general bias to overestimate time, and that older adults show an exaggerated tendency to judge visual stimuli as "short" due to a reduction in the availability of controlled attention.
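In a bisection task like this, the psychometric function plots the proportion of "long" responses against test duration, and its 50% crossing (the bisection point) summarizes how a group partitions the anchor range. The study itself fit the Sample Known Exactly-Mixed Memory model; the sketch below is only a minimal stand-in using linear interpolation on invented response proportions:

```python
# Illustrative duration-bisection analysis with invented data.
# Anchors are 3 s ("short") and 6 s ("long"); test durations lie between.
# The bisection point is the duration at which "long" responses reach 50%.
# (This linear interpolation is a simplification; the study fit the
# Sample Known Exactly-Mixed Memory model to the psychometric functions.)

def bisection_point(durations, p_long):
    """Interpolate the duration where the proportion of 'long' responses crosses 0.5."""
    pairs = list(zip(durations, p_long))
    for (d0, p0), (d1, p1) in zip(pairs, pairs[1:]):
        if p0 <= 0.5 <= p1:
            return d0 + (0.5 - p0) * (d1 - d0) / (p1 - p0)
    raise ValueError("response curve never crosses 0.5")

durations = [3.0, 3.75, 4.5, 5.25, 6.0]           # test durations in seconds
p_long_auditory = [0.05, 0.25, 0.60, 0.85, 0.97]  # invented proportions of "long" responses

bp = bisection_point(durations, p_long_auditory)
print(f"estimated bisection point: {bp:.2f} s")  # between 3.75 s and 4.5 s
```

Bisection points near the geometric mean of the anchors (here sqrt(3 × 6) ≈ 4.24 s) are the commonly reported pattern in this literature; shifts of the curve left or right then correspond to over- or underestimation of the test durations.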