Similar Articles
20 similar articles found.
1.
This study examined the relationships among nonverbal behaviors, dimensions of source credibility, and speaker persuasiveness in a public speaking context. Relevant nonverbal literature was organized according to a Brunswikian lens model. Nonverbal behavioral composites, grouped according to their likely proximal percepts, were hypothesized to significantly affect both credibility and persuasiveness. A sample of 60 speakers gave videotaped speeches that were judged on credibility and persuasiveness by classmates. Pairs of trained raters coded 22 vocalic, kinesic, and proxemic nonverbal behaviors evidenced in the tapes. Results confirmed numerous associations between nonverbal behaviors and attributions of credibility and persuasiveness. Greater perceived competence and composure were associated with greater vocal and facial pleasantness, with greater facial expressiveness contributing to competence perceptions. Greater sociability was associated with more kinesic/proxemic immediacy, dominance, and relaxation and with vocal pleasantness. Most of these same cues also enhanced character judgments. No cues were related to dynamism judgments. Greater perceived persuasiveness correlated with greater vocal pleasantness (especially fluency and pitch variety), kinesic/proxemic immediacy, facial expressiveness, and kinesic relaxation (especially high random movement but little tension). All five dimensions of credibility related to persuasiveness. Advantages of analyzing nonverbal cues according to proximal percepts are discussed.

2.
The effects of focal brain lesions on the decoding of emotional concepts in facial expressions were investigated. Facial emotions are hierarchically organized patterns comprising (1) structural surface features, (2) discrete (primary) emotional categories and (3) secondary dimensions, such as valence and arousal. Categorical decoding was measured using (1) selection of category labels and selection of the named emotion category; (2) matching one facial expression with two choice expressions. Dimensional decoding was assessed by matching one face with two different expressions with regard to valence or arousal. 70 patients with well documented cerebral lesions and 15 matched hospital controls participated in the study. 27 had left brain damage (LBD; 10 frontal, 10 temporal, 7 parietal); 37 had right brain damage (RBD; 15 frontal, 11 temporal, 11 parietal). Six additional patients had lesions involving both frontal lobes. Right temporal and parietal lesioned patients were markedly impaired in the decoding of primary emotions. The same patients also showed reduced arousal decoding. In contrast to several patients with frontal and left hemisphere lesions, emotional conceptualization and face discrimination were not independent in these groups. No group differences were observed in valence decoding. However, right frontal lesions appeared to interfere with the discrimination of negative valence. Moreover, a distraction by structural features was noted in RBD when facial identities were varied across stimulus and response pictures in matching tasks with differing conceptual load. Our results suggest that focal brain lesions differentially affect the comprehension of emotional meaning in faces depending on the level of conceptual load and interference of structural surface features.

3.
Conversational involvement refers to the degree to which participants in a communicative exchange are cognitively and behaviorally engaged in the topic, relationship, and/or situation. It is argued that involvement should be viewed from a functional perspective and conceptualized as entailing five dimensions: immediacy, expressiveness, interaction management, altercentrism, and social anxiety. Specific nonverbal behaviors that are actually encoded to express involvement along these five dimensions are examined within an interview context. Unacquainted dyads (N=52) engaged in baseline interviews followed by a second interview in which one participant was asked to increase or decrease involvement significantly. Twenty-one kinesic, proxemic, and vocalic behaviors were rated during five intervals. Change scores from baseline to manipulations showed numerous differences between high and low involvement, as did correlations between magnitude of involvement and nonverbal behaviors. The behaviors that most strongly discriminated high from low involvement were general kinesic/proxemic attentiveness, forward lean, relaxed laughter, coordinated speech, fewer silences and latencies, and fewer object manipulations. Behaviors most predictive of magnitude of involvement change were facial animation, vocal warmth/interest, deeper pitch, less random movement, and more vocal attentiveness.

4.
5.
It is often assumed that intimacy and familiarity will lead to better and more effective emotional communication between two individuals. However, research has failed to unequivocally support this claim. The present study proposes that close dyads show an advantage in the decoding of subdued facial cues rather than of highly intense expressions. A total of 43 close friend dyads and 49 casual acquaintance dyads (all women) were compared on their recognition of their partner's and a stranger's subdued facial expressions. Dyadic analyses indicate that close friends were more accurate and also improved more rapidly than casual acquaintances in decoding one another's subdued expressions of sadness, anger, and happiness, especially the two negative emotions, but not in detecting the stranger's subdued expressions. The results strongly suggest that intimacy fosters more accurate decoding of subdued facial expressions.

6.
With over 560 citations reported on Google Scholar by April 2018, a publication by Juslin and Gabrielsson (1996) presented evidence supporting performers' abilities to communicate, with high accuracy, their intended emotional expressions in music to listeners. Though there have been related studies published on this topic, there has yet to be a direct replication of this paper. A replication is warranted given the paper's influence in the field and the implications of its results. The present experiment joins the recent replication effort by producing a five-lab replication using the original methodology. Expressive performances of seven emotions (e.g. happy, sad, angry) by professional musicians were recorded using the same three melodies from the original study. Participants (N = 319) were presented with recordings and rated on a 0–10 scale how well each emotion matched the emotional quality of the recording. The same instruments from the original study (i.e. violin, voice, and flute) were used, with the addition of piano. In an effort to increase the accessibility of the experiment and allow for a more ecologically-valid environment, the recordings were presented using an internet-based survey platform. As an extension to the original study, this experiment investigated how musicality, emotional intelligence, and emotional contagion might explain individual differences in the decoding process. Results showed overall high decoding accuracy (57%) when using emotion ratings aggregated for the sample of participants, similar to the method of analysis from the original study. However, when decoding accuracy was scored for each participant individually the average accuracy was much lower (31%). Unlike in the original study, the voice was found to be the most expressive instrument. Generalised Linear Mixed Effects Regression modelling revealed that musical training and emotional engagement with music positively influence emotion decoding accuracy.
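The gap between the two accuracy figures above (57% aggregated vs. 31% individual) comes down to how decoding accuracy is scored. The sketch below is a hypothetical illustration of that difference, not the authors' analysis code; the array layout, sizes, and variable names are assumptions made for the example.

```python
# Hypothetical sketch of the two scoring approaches contrasted in the abstract:
# accuracy from ratings aggregated across the sample vs. accuracy scored per participant.
# Assumed layout: ratings[participant, recording, emotion] on a 0-10 scale;
# intended[recording] holds the index of the emotion the performer tried to convey.
import numpy as np

rng = np.random.default_rng(0)
n_participants, n_recordings, n_emotions = 319, 21, 7   # illustrative sizes only
ratings = rng.integers(0, 11, size=(n_participants, n_recordings, n_emotions)).astype(float)
intended = rng.integers(0, n_emotions, size=n_recordings)

# Aggregated scoring: average ratings over participants, then count a recording as
# correctly decoded if the intended emotion receives the highest mean rating.
mean_ratings = ratings.mean(axis=0)                        # shape: (recordings, emotions)
aggregated_accuracy = np.mean(mean_ratings.argmax(axis=1) == intended)

# Individual scoring: apply the same "highest rating wins" rule within each
# participant, then average the per-participant accuracies (the stricter criterion).
hits = ratings.argmax(axis=2) == intended                  # shape: (participants, recordings)
individual_accuracy = hits.mean(axis=1).mean()

print(f"aggregated: {aggregated_accuracy:.2f}  individual: {individual_accuracy:.2f}")
```

With real data the aggregated criterion is more forgiving because averaging over hundreds of raters smooths out individual disagreement, which is consistent with the pattern the replication reports.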

7.
An interaction between receiver ability to decode vocalic cues and speaker vocalic patterns in obtaining compliance was investigated in this study. Expectancy theory was offered as an explanation for this interaction. Because changes in vocalic patterns can violate expectations, receivers make consistent interpretations of these vocalic cues, and evaluations of these interpretations may be affected by decoder predispositions toward communication that, in turn, produce differential perceptions of source reward. Respondents were interviewed by trained encoders who used neutral, pleasant, and hostile vocal patterns. Compliance was assessed by asking for a donation of time to communication research. Follow-up surveys measured perceived relational messages, interviewer credibility, vocal pleasantness, and the degree to which the vocalic pattern was expected. The predicted disordinal interaction between decoding ability and voice condition was found. Decoding ability did not correlate with predispositions, nullifying source reward as a factor in the evaluation of vocalic violations. It was suggested that preferences for vocalic patterns influenced evaluations: Good decoders may have preferred affiliative cues and thus complied more with pleasant voices, whereas poor decoders may have preferred assertive patterns and complied more with hostile voices.

8.
Several studies have provided evidence of women's better accuracy in interpreting emotional states. Although this difference is generally ascribed to the primary role of women in the affective relationship with their offspring, to date, little information is available regarding gender differences in the ability to interpret infant facial expressions. In the present study, we examined the roles of gender and expertise in interpreting infant expressions in 34 men and women who differed in their experience with infants. Women showed a significantly higher level of decoding accuracy compared to men. Expertise positively affected facial expression decoding among women only. Our results suggest that in judging emotional facial expressions of infants, there is an interaction of biological (i.e., gender) and cultural factors that is independent of a woman's socioeconomic status.

9.
This study investigated how target sex, target age, and expressive ambiguity influence emotion perception. Undergraduate participants (N = 192) watched morphed video clips of eight child and eight adult facial expressions shifting from neutral to either sadness or anger. Participants were asked to stop the video clip when they first saw an emotion appear (perceptual sensitivity) and were asked to identify the emotion that they saw (accuracy). Results indicate that female participants identified sad expressions sooner in female targets than in male targets. Participants were also more accurate identifying angry facial expressions by male children than by female children. Findings are discussed in terms of the effects of ambiguity, gender, and age on the perception of emotional expressions.

10.
Perinatal psychological problems such as post-natal depression are associated with poor mother–baby interaction, but the reason for this is not clear. One explanation is that mothers with negative mood have biased processing of infant emotion. This review aimed to synthesise research on processing of infant emotion by pregnant or post-natal women with anxiety, depression or post-traumatic stress disorder (PTSD). Systematic searches were carried out on 11 electronic databases using terms related to negative affect, childbirth and perception of emotion. Fourteen studies were identified which looked at the effect of depression, anxiety and PTSD on interpretation of infant emotional expressions (k = 10), or reaction times when asked to ignore emotional expressions (k = 4). Results suggest mothers with depression and anxiety are more likely to identify negative emotions (i.e., sadness) and less accurate at identifying positive emotions (i.e., happiness) in infant faces. Additionally, women with depression may disengage faster from positive and negative infant emotional expressions. Very few studies examined PTSD (k = 2), but results suggest biases towards specific infant emotions may be influenced by characteristics of the traumatic event. The implications of this research for mother–infant interaction are explored.

11.
Eleven infant–mother dyads in Crete were videoed during spontaneous interactions at home, from the second to the sixth month of life. Micro‐analysis was used to investigate ‘coordination' and ‘non‐matching' of facial expressions of emotion. ‘Emotional coordination' was evaluated with four measures: matching of facial expressions, completion when one responded to the other with ‘pleasure' or ‘interest', synchrony by matching frequency of change or rhythm of emotional expressions, and attunement when shifts of emotional intensity of the two partners were in the same direction. ‘Emotional non‐matching' was coded when neither the infant nor the mother showed interest in interacting with the other. In emotional coordination or non‐matching between mother and infant, who performed first was also recorded. We obtained evidence of emotional matching, synchrony, and attunement. Importantly, the probability of emotional non‐matching by the infant was higher than the probability of emotional matching and completion, indicating a tendency for thoughtful attention or playful rivalry in the responses of infants, who also initiated emotional matching, completion, and non‐matching more frequently than mothers. The probability of expression of emotional matching, completion, and non‐matching changed with age. Both mothers and infants act to obtain sympathetic complementarity of feelings and co‐operative inter‐synchrony of actions. Copyright © 2016 John Wiley & Sons, Ltd.

12.
Attentional shifting may represent a means of regulating the stress response. Previously, automatic processing of emotional information was predictive of subsequent cortisol levels during a repeated loss stressor (Ellenbogen, Schwartzman, Stewart, & Walker, 2006). The stress induction did not, however, elicit a substantive cortisol increase. Thus, we sought to replicate this finding using the Trier Social Stress Test (TSST), a validated psychosocial stress induction. Seventy-nine students performed a modified spatial cuing task with supraliminal and masked pictorial stimuli during the TSST (n = 36) and a control condition (n = 43). The TSST elicited a greater cortisol response than did the control condition [F(1,76) = 4.6, p < .05]. Attentional shifting during trials with masked angry faces predicted cortisol change during the TSST (β = .76; t = 2.1, p < .05), but not during the control condition. These data suggest that early automatic emotional information processing is important in the regulation of the cortisol stress response, although the direction of effect is not known.

13.
This study examined the hypothesis that rejected children's inability to interact successfully with their peers stems from misperception of nonverbal communication cues, whereas neglected children have the necessary perceptual skills and their inability arises because they are unable to use them. Comparisons were made among 5- and 9-year-old neglected, rejected, and control children (six groups, N = 15 per group) on four tasks: affective empathy, cognitive empathy, decoding of facial expressions of emotion, and decoding of emotional situations. The results, which were consistent with the hypothesis, are interpreted in a social-skills model based on the work of Argyle and Powers.

14.
Previous research has found that individuals vary greatly in emotion differentiation, that is, the extent to which they distinguish between different emotions when reporting on their own feelings. Building on previous work that has shown that emotion differentiation is associated with individual differences in intrapersonal functions, the current study asks whether emotion differentiation is also related to interpersonal skills. Specifically, we examined whether individuals who are high in emotion differentiation would be more accurate in recognising others' emotional expressions. We report two studies in which we used an established paradigm tapping negative emotion differentiation and several emotion recognition tasks. In Study 1 (N = 363), we found that individuals high in emotion differentiation were more accurate in recognising others' emotional facial expressions. Study 2 (N = 217) replicated this finding using emotion recognition tasks with varying amounts of emotional information. These findings suggest that the knowledge we use to understand our own emotional experience also helps us understand the emotions of others.
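The abstract describes the differentiation measure only as "an established paradigm". In this literature, differentiation is commonly indexed by how weakly a person's ratings of different negative emotions correlate across repeated occasions; the sketch below uses that common operationalization purely as an illustration, and the function name, data shape, and rating scale are assumptions rather than details taken from the paper.

```python
# Illustrative proxy for negative emotion differentiation (not necessarily the
# authors' exact index): 1 minus the average pairwise correlation between a
# person's ratings of different negative emotions across occasions. Lower average
# correlation = the person treats the emotions as more distinct = higher score.
import numpy as np

def differentiation_index(ratings: np.ndarray) -> float:
    """ratings: (occasions, emotions) array of one person's negative-emotion ratings."""
    corr = np.corrcoef(ratings, rowvar=False)           # (emotions, emotions) correlations
    pairs = corr[np.triu_indices_from(corr, k=1)]       # unique emotion pairs only
    pairs = np.clip(pairs, -0.999, 0.999)               # keep the Fisher-z transform finite
    mean_r = np.tanh(np.arctanh(pairs).mean())          # average correlations in Fisher-z space
    return 1.0 - mean_r                                  # higher value = more differentiation

# Illustrative use: 40 rating occasions, 5 negative emotions rated on a 1-7 scale.
rng = np.random.default_rng(1)
person_ratings = rng.integers(1, 8, size=(40, 5)).astype(float)
print(round(differentiation_index(person_ratings), 2))
```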

15.
Most previous studies investigating children’s ability to recognize facial expressions used only intense exemplars. Here we compared the sensitivity of 5-, 7-, and 10-year-olds with that of adults (n = 24 per age group) for less intense expressions of happiness, sadness, and fear. The developmental patterns differed across expressions. For happiness, by 5 years of age, children were as sensitive as adults even to low intensities. For sadness, by 5 years of age, children were as accurate as adults in judging that the face was expressive (i.e., not neutral), but even at 10 years of age, children were more likely to misjudge it as fearful. For fear, children’s thresholds were not adult-like until 10 years of age, and children often confused it with sadness at 5 years of age. For all expressions, including even happy expressions, 5- and 7-year-olds were less accurate than adults in judging which of two expressions was more intense. Together, the results indicate that there is slow development of accurate decoding of subtle facial expressions.

16.
The main aim of this study was to examine the relationships between stage of change for stress management behavior, as conceptualized in the transtheoretical model, and perceived stress and coping. First, we developed the Korean version of the Rhode Island Stress and Coping Inventory (RISCI). Second, we related stage of change for stress management behavior to perceived stress and coping. Based on two surveys that we conducted (n = 530 for survey 1 and n = 299 for survey 2), we developed the Korean version of the RISCI with acceptable internal consistency and criterion‐related validity against depression levels measured using the Korean Beck Depression Inventory II. The stress score of the Korean version of the RISCI was significantly lower in maintenance than in the other stages, while the coping score was significantly higher in action and maintenance than in the first three stages (n = 804), irrespective of sex. These results provided further empirical evidence to validate stage classification in the field of stress management behavior.

17.
Many models of decision making neglect emotional states that could affect individuals' cognitive processes. The present work explores the effect of emotional stress on people's cognitive processes when making probabilistic inferences. Two contrasting hypotheses are tested against one another: the uncertainty‐reduction and attention‐narrowing hypotheses of how emotional stress affects decision making. In the experimental study, emotional stress was induced through the use of highly aversive pictures immediately before each decision. Emotional state was assessed by both subjective (state anxiety, arousal, and pleasantness ratings) and objective (skin conductance) measures. The results show that emotional stress impacts decision making; in particular, emotionally aroused participants seem to have focused on the most important information and selected simpler decision strategies relative to participants in a control condition. The results are in line with the attention‐narrowing hypothesis and suggest that emotional stress can impact decision making through limited predecisional information search and the selection of simpler decision strategies. Copyright © 2015 John Wiley & Sons, Ltd.

18.
The purpose of the present investigation was to assess whether interpersonal closeness facilitates earlier emotion detection as the emotional expression unfolds. Female undergraduate participants were paired with either a close friend or an acquaintance (n = 92 pairs). Participants viewed morphed movies of their partner and a stranger gradually shifting from a neutral to either a sad, angry, or happy expression. As predicted, findings indicate a closeness advantage. Close friends detected the onset of their partners’ angry and sad expressions earlier than acquaintances. Additionally, close friends were more accurate than acquaintances in identifying angry and sad expressions at the onset, particularly in non-vignette conditions when these expressions were devoid of context. These findings suggest that closeness does indeed facilitate emotional perception, particularly in ambiguous situations for negative emotions.

19.
The development of children's ability to identify facial emotional expressions has long been suggested to be experience dependent, with parental caregiving as an important influencing factor. This study attempts to further this knowledge by examining disorganization of the attachment system as a potential psychological mechanism behind aberrant caregiving experiences and deviations in the ability to identify facial emotional expressions. Typically developing children (N = 105, 49.5% boys) aged 6–7 years (M = 6 years 8 months, SD = 1.8 months) completed an attachment representation task and an emotion identification task, and parents rated children's negative emotionality. The results showed a generally diminished ability in disorganized children to identify facial emotional expressions, but no response biases. Disorganized attachment was also related to higher levels of negative emotionality, but discrimination of emotional expressions did not moderate or mediate this relation. Our novel findings relate disorganized attachment to deviations in emotion identification, and therefore suggest that disorganization of the attachment system may constitute a psychological mechanism linking aberrant caregiving experiences to deviations in children's ability to identify facial emotional expressions. Our findings further suggest that deviations in emotion identification in disorganized children, in the absence of maltreatment, may manifest in a generally diminished ability to identify emotional expressions, rather than in specific response biases.

20.
Despite reports documenting adverse effects of stress on police marriages, few empirical studies focus on actual emotional behaviors of officers and spouses. In this preliminary investigation, 17 male police officers and their nonpolice wives completed daily stress diaries for 1 week and then participated in a laboratory‐based discussion about their respective days. Conversations were video‐recorded and coded for specific emotional behaviors reflecting hostility and affection, which are strong predictors of marital outcomes. We examined associations between officers' job stress (per diaries and the Police Stress Survey) and couples' emotional behavior (mean levels and behavioral synchrony) using a dyadic repeated measures design capitalizing on the large number of observations available for each couple (1020 observations). When officers reported more job stress, they showed less hostility, less synchrony with their wives' hostility, and more synchrony with their wives' affection; their wives showed greater synchrony with officers' hostility and less synchrony with officers' affection. Therefore, for officers, greater job stress was associated with less behavioral negativity, potentially less attunement to wives' negativity, but potentially greater attunement to wives' affection—perhaps a compensatory strategy or attempt to buffer their marriage from stress. These attempts may be less effective, however, if, as our synchrony findings may suggest, wives are focusing on officers' hostility rather than affection. Although it will be important to replicate these results given the small sample, our findings reveal that patterns of behavioral synchrony may be a key means to better understand how job stress exacts a toll on police marriages.
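The synchrony findings above come from a dyadic repeated-measures model; as a rough, hypothetical illustration of the underlying idea, a couple's behavioral synchrony can be approximated by correlating the two partners' coded behavior time series. The variable names and data below are invented for the example and are not from the study.

```python
# Simplified stand-in for "behavioral synchrony" within one couple: the correlation
# between the two partners' time series of coded emotional behavior (e.g., hostility
# intensity per observation window). The study itself used a dyadic repeated-measures
# model across all couples; this per-couple correlation only sketches the concept.
import numpy as np

def synchrony(officer: np.ndarray, spouse: np.ndarray) -> float:
    """Pearson correlation of two equally long behavior time series for one couple."""
    return float(np.corrcoef(officer, spouse)[0, 1])

# Illustrative data: 60 observation windows of coded hostility for one couple.
rng = np.random.default_rng(2)
officer_hostility = rng.normal(size=60)
spouse_hostility = 0.4 * officer_hostility + rng.normal(scale=0.9, size=60)
print(f"hostility synchrony: {synchrony(officer_hostility, spouse_hostility):.2f}")
```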
