21.
The Ss were presented pairs of stimuli, /bae/s, /ae/s, or isolated transitions from /bae/s, which differed in the initial 60 msec of the signals by 0, 7.5, or 9 dB. In the syllable context, the intensity differences were discriminated essentially at chance; in both the vowel and isolated-transition conditions, the intensity differences were discriminated essentially perfectly. This outcome suggests that after the acoustic features of a stop-consonant/vowel syllable have been recoded into a phonetic representation, the acoustic information is relatively inaccessible for recall from auditory short-term memory.
22.
An electrophysiological correlate of the discrimination of stop consonants drawn from within and across phonetic categories was investigated with an auditory evoked response (AER) technique. Ss were presented a string of stimuli from the phonetic category [ba] (the standard stimulus) and were asked to detect the occurrence of a stimulus from the same phonetic category (within-category shift) or the occurrence of a stimulus from a different phonetic category [pa] (across-category shift). Both the across- and within-category shift stimuli differed equally from the standard stimulus in the time of onset of the first formant and in the amount of aspiration in the second and third formants. The N1-P2 response of the AER was larger to the across-category shift than to the within-category shift. The within-category shift did not differ from a no-shift control. These findings suggest (1) that the AER can reflect the relative discriminability of stop consonants drawn from the same or different phonetic categories in a manner similar to other behavioral measures, and (2) that the detailed acoustic representation of stop consonants is transformed into a categorized phonetic representation within 200 msec after stimulus onset.
23.
Informational masking is broadly defined as a degradation of auditory detection or discrimination of a signal embedded in a context of other similar sounds; it is not related to energetic masking caused by physical interactions between signal and masker. In this paper, we report a systematic release from informational masking of a target tone in a nine-tone rapid auditory sequence as the target is increasingly isolated in frequency or intensity from the remaining sequence components. Improved target-tone frequency difference limens as isolation increases are interpreted as a reflection of increasingly focused auditory attention. The change from diffuse to highly focused attention is gradual over the frequency and intensity ranges examined, with each 1-dB increment in target intensity relative to the remaining components producing performance improvements equivalent to those produced by a 2% increase in frequency isolation. The results are modeled as bands of attention in the frequency and intensity domains. For attention directed by frequency isolation, there is a strong correspondence with auditory filters predicted by the power spectrum model of masking. These data also support the existence of an attention band for intensity, with a bandwidth of about 5–7 dB at the moderate levels used in this experiment.
24.
In this report we review the vowel and consonant recognition ability of patients who use a multichannel cochlear implant and who achieve relatively good word identification scores. The results suggest that vowel recognition is accomplished by good resolution of the frequency of the first formant (F1) combined with poor resolution of the frequency of the second formant (F2). The results also suggest that consonant recognition is accomplished (1) by using information from the amplitude envelope, including periodicity/aperiodicity, as cues to manner and voicing, (2) by using F1 as an aid to the identification of manner and voicing, and (3) by using information from cochlear place of stimulation to provide a very crude indication of the shape of the frequency spectrum above 1 kHz.
25.
Thirty-one adolescents with cerebral palsy were administered measures of verbal production, speech perception, nonverbal auditory perception, visuospatial perception, and verbal intelligence, as well as measures of reading recognition and reading comprehension. Nonverbal auditory perception and verbal intelligence were most highly correlated with both reading measures, despite the fact that most subjects were most severely impaired in visuospatial perception.
26.
27.
28.
Mary Cover Jones has played many roles during her career as a psychologist: researcher, professor, wife of the eminent psychologist Harold E. Jones, and friend to some of the great names in the field, such as Erik Erikson and Nevitt Sanford. The paper discusses three of her primary areas of research: the case study of Peter, which provided a preview of behavior modification; evidence from longitudinal studies regarding the problems of early and late maturing; and work on personality antecedents in problem drinkers. In addition, her part in the establishment of the major longitudinal studies at the University of California is reported. Finally, her successful application of traditionally feminine strengths to these many professional undertakings is discussed.
29.
Mental health differences due to sex, sex-role identification, and sex-role attitudes were investigated using 109 undergraduate students. Females reported higher levels of depression and anxiety. Both males and females with more liberal scores on the Attitudes Toward Women Scale scored higher on the Well-Being Scale of the California Psychological Inventory. No differences due to androgyny were found.
Copyright©北京勤云科技发展有限公司  京ICP备09084417号