Sort order: 1785 results in total; search time 15 ms
1.
Cross-cultural facial-expression recognition has recently become a research hotspot, and a standardised set of facial-expression materials helps researchers compare and replicate results across studies. We developed a facial-expression database of Chinese Han, Hui and Tibetan ethnicities. In this study, six basic human facial expressions (and one neutral expression) were collected from 200 Han, 220 Hui and 210 Tibetan participants who lived in these regions. Four experts on each ethnicity evaluated the facial-expression images according to the expressions, and only those achieving inter-rater agreement were retained. Subsequently, 240 raters evaluated these images according to the seven emotions and rated the intensity of the expressions. Consequently, 2980 images were included in the database: 930 images of Han individuals, 962 images of Hui individuals and 1088 images of Tibetan individuals. In conclusion, the facial-expression database of Chinese Han, Hui and Tibetan people was representative and reliable, with a recognition rate of over 60%, making it well suited for cross-cultural research on emotions.
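The selection logic described above (retain an image only when raters sufficiently agree on the intended emotion, and report a per-image recognition rate) can be sketched as follows. The data layout and the 60% threshold applied per image are illustrative assumptions, not the authors' actual pipeline:

```python
# Illustrative sketch: filter expression images by rater agreement
# and compute a per-image recognition rate (hypothetical data layout).

def recognition_rate(ratings, intended):
    """Fraction of raters whose label matches the intended emotion."""
    return sum(1 for r in ratings if r == intended) / len(ratings)

def filter_images(images, threshold=0.6):
    """Keep images whose recognition rate meets the threshold."""
    return [img for img in images
            if recognition_rate(img["ratings"], img["intended"]) >= threshold]

images = [
    {"id": "han_001", "intended": "happy",
     "ratings": ["happy", "happy", "happy", "neutral"]},    # 0.75 -> kept
    {"id": "hui_002", "intended": "fear",
     "ratings": ["fear", "surprise", "surprise", "anger"]},  # 0.25 -> dropped
]
kept = filter_images(images)
print([img["id"] for img in kept])  # ['han_001']
```

The same per-image rate, averaged over the retained set, would give the database-level recognition rate the abstract reports as exceeding 60%.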
2.
The present study attempts to locate brain regions that are related to vividness control, a hypothesized mechanism that reduces the vividness of negative imagery by controlling memory retrieval and emotion processing. The results showed that BOLD response in the left posterior cingulate gyrus in the negative imagery condition, in which activation of vividness control mechanisms was considered to be strong, was greater than that in the positive imagery condition, in which the activation of control mechanisms was considered to be weak. Moreover, the activation of this region negatively correlated with the subjective vividness of negative imagery. These results support the idea that the posterior cingulate gyrus may be involved in the suppression of imagery generation. Several previous studies have suggested that the posterior cingulate cortex is involved in both memory and emotion processing. Therefore, the current results indicate that the posterior cingulate gyrus may function as the vividness control mechanism.
3.
We investigated the relationship between ambient temperature and prosocial behaviour in real-life settings. The investigation was guided by two mechanisms with opposite predictions: (1) higher temperatures decrease prosociality by harming well-being, and (2) higher temperatures increase prosociality by promoting the embodied cognition of social warmth. In Study 1, U.S. state-level time-series data (2002–2015) supported the first mechanism, with higher temperatures predicting lower volunteer rates through lower well-being. Study 2 extended the investigation by probing the relationship between neighbourhood temperature and the civic engagement of 2268 U.S. citizens. The data partially supported the well-being mechanism and yielded findings that contradicted the social-embodiment mechanism: higher temperatures predicted lower interpersonal trust and subsequently lower civic engagement. This unexpected finding hinted at a cognitive effect of heat and a compensatory mechanism in social thermoregulation. We discuss the methodological strengths and weaknesses of the findings, with caution urged regarding ecological fallacies and alternative models.
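The indirect pathway reported in Study 2 (temperature predicts trust, which in turn predicts civic engagement) follows the classic product-of-coefficients mediation logic. A minimal sketch of that logic, using simple bivariate regressions and hypothetical data rather than the study's dataset or its full covariate-adjusted models:

```python
# Minimal sketch of product-of-coefficients mediation
# (temperature -> interpersonal trust -> civic engagement).
# All variable values below are hypothetical illustrations.

def slope(x, y):
    """OLS slope of y on x (simple regression, no covariates)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x)
    return cov / var

temperature = [10, 15, 20, 25, 30, 35]
trust       = [0.9, 0.8, 0.7, 0.6, 0.5, 0.4]      # a-path: falls with heat
engagement  = [0.85, 0.8, 0.68, 0.62, 0.5, 0.45]  # b-path: tracks trust

a = slope(temperature, trust)   # effect of temperature on the mediator
b = slope(trust, engagement)    # effect of the mediator on the outcome
indirect = a * b                # indirect (mediated) effect; negative here
print(f"a={a:.3f}, b={b:.3f}, indirect={indirect:.3f}")
```

A negative `a` and a positive `b` combine into a negative indirect effect, matching the abstract's pattern of higher temperatures predicting lower engagement through lower trust. In practice the indirect effect would be tested with bootstrapped confidence intervals rather than read off the point estimate.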
4.
Supportive parent emotion socialization has been associated with greater child emotion understanding and expression and lower levels of externalizing behavior problems, but parent emotion socialization in toddlerhood remains poorly understood. The current study examined the developmental trajectory of emotion socialization via emotion talk in mothers of toddlers from a predominantly Latine sample. Participants were 101 mother-toddler dyads assessed at three time points between ages 12 and 25 months. Overall, maternal emotion talk remained relatively stable over time, although there was a significant decrease between the first and second assessments before a return to initial rates at the third assessment. Maternal emotion talk did not predict child externalizing behavior over time. Interestingly, however, greater toddler externalizing behavior problems were associated with an increase in maternal emotion talk over time. These findings suggest maternal emotion talk is relatively stable for parents of children who are low in externalizing behaviors and may fluctuate (i.e., slowly increase) for mothers of children who are high in externalizing behaviors. Understanding these mechanisms further could help inform how we implement and personalize parenting interventions.
5.
Introduction. The way we interact with our environment depends on our spontaneous tendency to approach or avoid emotional experiences triggered by that environment. This dimension of emotional experience is called the need for affect, that is, the tendency of individuals to adopt approach or avoidance behaviour with regard to emotional stimuli. Methods. The Need for Affect (NFA) Scale has been the subject of numerous studies since the validation of the original version (Maio & Esses, 2001) and its short version (Appel et al., 2012). However, the latter scale has not been validated in French. We propose a French version of the short NFA scale, tested on a student sample and a sample from the general population. Results. We recovered the structure of the original scale in the French translation (of the English version). In addition, invariance tests showed that this structure remained the same across both samples. Conclusion. We recommend the use of this version of the short NFA scale for studies conducted on French-speaking samples.
6.
This study explored whether males and females differ in facial muscle activity when exposed to tone stimuli of different intensities. Males and females were repeatedly exposed to 95 dB and 75 dB 1000 Hz tones while their facial electromyographic (EMG) activity from the corrugator and zygomatic muscle regions was measured. Skin conductance responses were also measured. It was found that 95 dB but not 75 dB tones evoked increased corrugator activity. This effect differed significantly between males and females: only females reacted with a significantly increased corrugator response to the high-intensity tone. While facial responses differed between the sexes, the skin conductance response patterns did not. Consistent with previous research, it is concluded that females are more facially expressive than males.
7.
The purpose of this study is to explore whether subjects exposed to stimuli of facial expressions respond with facial electromyographic (EMG) reactions consistent with the hypothesis that facial expressions are contagious. This study further examines whether males and females differ in facial EMG intensity. Two experiments demonstrated that subjects responded with facial EMG activity over the corrugator supercilii, zygomatic major, lateral frontalis, depressor supercilii, and levator labii muscle regions to stimuli of sad, angry, fearful, surprised, disgusted and happy faces that, to a large extent, were consistent with the hypothesis that facial expressions are contagious. Aspects of gender differences reported in earlier studies were found, indicating a tendency for females to respond with more pronounced facial EMG intensity.
8.
Aristotle's illustrations of the fallacy of Figure of Speech (or Form of Expression) are none too convincing. They are tied to Aristotle's theory of categories and to peculiarities of Greek grammar that fail to hold appeal for a contemporary readership. Yet, upon closer inspection, Figure of Speech shows many points of contact with views and problems that inhabit 20th-century analytical philosophy. In the paper, some Aristotelian examples will be analyzed to gain a better understanding of this fallacy. The case of the Third Man argument and some modern cases lend plausibility to the claim that Figure of Speech is of more interest as a type of fallacy than has generally been assumed. Finally, a case is made for the view that Figure of Speech, though listed among the fallacies dependent upon language, is not properly classified as a fallacy of ambiguity. More likely, it should be looked upon as a type of non sequitur. This has important consequences for the profile of dialogue associated with this fallacy.
9.
10.
An immense body of research demonstrates that emotional facial expressions can be processed unconsciously. However, it has been assumed that such processing takes place solely on a global valence-based level, allowing individuals to disentangle positive from negative emotions but not the specific emotion. In three studies, we investigated the specificity of emotion processing under conditions of limited awareness using a modified variant of an affective priming task. Faces with happy, angry, sad, fearful, and neutral expressions were presented as masked primes for 33 ms (Study 1) or 14 ms (Studies 2 and 3) followed by emotional target faces (Studies 1 and 2) or emotional adjectives (Study 3). Participants’ task was to categorise the target emotion. In all three studies, discrimination of targets was significantly affected by the emotional primes beyond a simple positive versus negative distinction. Results indicate that specific aspects of emotions might be automatically disentangled in addition to valence, even under conditions of subjective unawareness.
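Prime durations such as 33 ms and 14 ms are typically dictated by the display hardware: a stimulus can only change on a frame boundary, so the achievable durations are whole multiples of the monitor's refresh interval. A small sketch of that constraint, assuming common 60 Hz and 72 Hz refresh rates (the abstract does not specify the monitors used):

```python
# Sketch: why masked-prime durations come in values like 33 ms or 14 ms.
# A stimulus is shown for a whole number of refresh frames, so the
# achievable duration is frames / refresh_rate.

def frames_for(duration_ms, refresh_hz):
    """Whole number of frames closest to the requested duration."""
    return max(1, round(duration_ms * refresh_hz / 1000))

def actual_duration_ms(duration_ms, refresh_hz):
    """Duration actually displayed once rounded to whole frames."""
    return frames_for(duration_ms, refresh_hz) / refresh_hz * 1000

print(frames_for(33, 60), round(actual_duration_ms(33, 60), 1))  # 2 33.3
print(frames_for(14, 72), round(actual_duration_ms(14, 72), 1))  # 1 13.9
```

Under these assumed refresh rates, 33 ms corresponds to two frames at 60 Hz and 14 ms to a single frame at 72 Hz, which is why masked-priming studies report these characteristic durations.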

Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号