71.
Interpreting and responding appropriately to facial expressions of emotion are important aspects of social skill. Some children, adolescents, and adults with various psychological and psychiatric disorders recognize facial expressions less proficiently than their peers in the general population. We wished to determine whether such deficits existed in a group of 133 children and adolescents with emotional and behavioral disorders (EBD). The subjects were receiving in-patient psychiatric services for at least one of the following: substance-related disorders, adjustment disorders, anxiety disorders, mood disorders, or disruptive behavior disorders. After being read stories describing various emotional reactions, all subjects were tested for their ability to recognize the six basic facial expressions of emotion depicted in Ekman and Friesen's (1976) normed photographs. Overall, they performed well on this task, at levels comparable to those found in the general population. Accuracy increased with age, irrespective of gender, ethnicity, or clinical diagnosis. After adjusting for age effects, the subjects diagnosed with adjustment disorders, mood disorders, or disruptive behavior disorders were significantly more accurate at identifying anger than those without these diagnoses. In addition, subjects with mood disorders identified sadness significantly more accurately than those without this diagnosis, although the effect was greatest among younger children.
73.
This study explored whether subjects high, as compared to low, in social fear react with a more negative emotional response, measured as facial electromyographic (EMG) activity, when exposed to social stimuli (pictures of angry and happy facial expressions). Subjects who rated themselves as relatively high in public-speaking fear gave larger negative facial EMG responses (Corrugator supercilii muscle activity) to angry faces than did the low-fear subjects. Low-fear subjects, on the other hand, gave larger positive facial EMG responses (Zygomatic major muscle activity) to happy faces than did the high-fear subjects. It was further found that happy stimuli were rated as more hostile, and as less friendly and happy, by the high-fear group. Consistent with earlier findings, it was concluded that the facial EMG technique is sensitive enough to detect differing reactions among subjects relatively high and low in social fear.
76.
Participants rated the attractiveness and racial typicality of male faces varying in facial features from Afrocentric to Eurocentric and in skin tone from dark to light in two experiments. Experiment 1 provided evidence that facial features and skin tone have an interactive effect on perceptions of attractiveness and that mixed-race faces are perceived as more attractive than single-race faces. Experiment 2 further confirmed that faces with medium levels of skin tone and facial features are perceived as more attractive than faces with extreme levels of these factors. Black phenotypes (combinations of dark skin tone and Afrocentric facial features) were rated as more attractive than White phenotypes (combinations of light skin tone and Eurocentric facial features); ambiguous faces (combinations of Afrocentric and Eurocentric physiognomy) with medium levels of skin tone were rated the most attractive in Experiment 2. Perceptions of attractiveness were relatively independent of racial categorization in both experiments.
77.
Women tend to be more accurate than men at decoding facial expressions. We hypothesized that women's better performance in decoding facial expressions extends to distinguishing between authentic and nonauthentic smiles. We showed participants portrait photos of persons who smiled either because they saw a pleasant picture (authentic smile) or because they were instructed to smile by the experimenter (nonauthentic smile) and asked them to identify the smiles. Participants judged single photos of persons depicting either an authentic or a nonauthentic smile, and they judged adjacent photos of the same person depicting an authentic smile and a nonauthentic smile. Women outperformed men in identifying the smiles when judging the adjacent photos. We discuss implications for judging smile authenticity in real life and limitations of the observed sex difference.
78.
Individuals vary in perceptual accuracy when categorising facial expressions, yet it is unclear how these individual differences in the non-clinical population relate to the cognitive processing stages of facial information acquisition and interpretation. We tested 104 healthy adults in a facial expression categorisation task and correlated their categorisation accuracy with face-viewing gaze allocation and with personal traits assessed by the Autism Quotient, an anxiety inventory, and the Self-Monitoring Scale. Gaze allocation had a limited but emotion-specific impact on categorising expressions. Specifically, longer gaze at the eye and nose regions was coupled with more accurate categorisation of disgust and sad expressions, respectively. Regarding the trait measurements, a higher autistic score was coupled with better recognition of sad but worse recognition of angry expressions, and contributed to a categorisation bias towards sad expressions, whereas a higher anxiety level was associated with greater categorisation accuracy across all expressions and with an increased tendency to gaze at the nose region. It seems that both anxiety and autistic-like traits are associated with individual variation in expression categorisation, but this association is not necessarily mediated by variation in gaze allocation at expression-specific local facial regions. The results suggest that both facial information acquisition and interpretation capabilities contribute to individual differences in expression categorisation within non-clinical populations.
79.
Processing Faces and Facial Expressions
This paper reviews the processing of facial identity and facial expressions. The question of whether these two tasks rely on independent systems has been addressed from different approaches over the past 25 years. More recently, neuroimaging techniques have provided researchers with new tools to investigate how facial information is processed in the brain. First, findings from traditional approaches to identity and expression processing are summarized. The review then covers findings from neuroimaging studies on face perception, recognition, and encoding. Processing of the basic facial expressions is detailed in light of behavioral and neuroimaging data. Whereas data from experimental and neuropsychological studies support the existence of two systems, the neuroimaging literature yields a less clear picture because it shows considerable overlap in activation patterns across the different face-processing tasks. Further, activation patterns in response to facial expressions support the notion that distinct neural substrates are involved in processing different facial expressions.
80.
The hypotheses of this investigation were based on attachment theory and Bowlby's conception of "internal working models", which are supposed to consist of one mainly emotional structure (model-of-self) and one more consciously cognitive structure (model-of-others), assumed to operate at different temporal stages of information processing. Facial muscle reactions of individuals with positive versus negative internal working models were compared at different stages of information processing. The Relationship Scales Questionnaire (RSQ) was used to categorize subjects into positive or negative model-of-self and model-of-others, and the State-Trait Anxiety Inventory (STAI-T) was used to measure trait anxiety. Pictures of happy and angry faces followed by backward-masking stimuli were shown to 61 subjects at three different exposure times (17 ms, 56 ms, and 2,350 ms) in order to elicit reactions first at an automatic level and then, consecutively, at more cognitively elaborated levels. Facial muscle reactions were recorded by electromyography (EMG), with higher corrugator activity representing more negative emotion and higher zygomatic activity more positive emotion. In line with the hypothesis, subjects with a negative model-of-self scored significantly higher on the STAI-T than subjects with a positive model-of-self. They also showed overall stronger corrugator than zygomatic activity, giving further evidence of a negative tonic affective state. At the longest exposure time (2,350 ms), representing emotionally regulated responses, negative model-of-self subjects showed a significantly stronger corrugator response and reported more negative feelings than subjects with a positive model-of-self. These results supported the hypothesis that subjects with a negative model-of-self would show difficulties in the self-regulation of negative affect. In line with expectations, model-of-others, assumed to represent mainly knowledge structures, did not interact with the physiological emotional measures employed: facial muscle reactions or tonic affective state.