Similar Articles (20 results)
1.
We examined the effects of joint attention on object learning in 5‐ and 7‐month‐old infants. Infants interacted with an adult social partner who taught them about a novel toy in two conditions. In the Joint Attention condition, the adult spoke about the toy while alternating gaze between the infant and the toy; in the Object Only condition, the adult looked at the toy and at a spot on the ceiling, but never at the infant. In the test trials following each social interaction, we presented infants with the ‘familiarization’ toy and a novel toy, and monitored looking times to each object. We found that 7‐month‐olds looked significantly longer at the novel toy following the Joint Attention condition relative to the Object Only condition, while 5‐month‐old infants did not show a significant difference across conditions. We interpret these results as suggesting that joint attention facilitated 7‐month‐old infants' encoding of information about the familiarization object. Implications for the ontogeny of infant learning in joint attention contexts are discussed. Copyright © 2007 John Wiley & Sons, Ltd.

2.
This study examined whether 4‐month‐olds (N = 40) could perceptually categorize happy and angry faces, and show appropriate behavior in response to these faces. During the habituation phase, infants were shown the same type of facial expressions (happy or angry) posed by three models, and their behavior in response to those faces was observed. During the test phase immediately after the habituation phase, infants saw a novel emotional expression and a familiar expression posed by a new model, and their looking times were measured. The results indicated that, although 4‐month‐olds could perceptually categorize happy and angry faces accurately, they responded positively to both expression types. These findings suggest that, although infants can perceptually categorize facial expressions at 4 months of age, they require further time to learn the affective meanings of the facial expressions.

3.
The role of facial expression in determining infants' reactions to the sudden still‐face of a social partner was investigated. In a within‐subject design, 2‐, 4‐ and 6‐month‐old infants were tested in periods of normal interaction interspersed with prolonged still‐face episodes in which the female adult social partner adopted either a happy, neutral, or sad static facial expression while maintaining eye contact with the infant. The proportions of infant smiling and gazing at the social partner, used as indices of reaction to the various still‐face episodes, reveal that, in comparison with same‐age control groups, 4‐ and 6‐month‐old infants did not respond differentially to the happy, neutral, and sad still‐face expressions. In contrast, 2‐month‐olds showed some evidence of a reduced still‐face effect in the happy still‐face condition. These results point to early developmental changes in the mechanisms underlying the still‐face phenomenon. We propose that by 4 months, and not before, the reaction to still‐face episodes is essentially based on the detection of social contingencies. Copyright © 2002 John Wiley & Sons, Ltd.

4.
In two experiments, we examined whether 14‐month‐olds understand the subjective nature of gaze. In the first experiment, infants first observed an experimenter express happiness as she looked inside a container that either contained a toy (reliable looker condition) or was empty (unreliable looker condition). Then, infants had to follow the same experimenter's gaze to a target object located either behind or in front of a barrier. Infants in the reliable looker condition followed the experimenter's gaze behind the barrier more often than infants in the unreliable looker condition, whereas both groups followed the experimenter's gaze to the target object located in front of the barrier equally often. In the second experiment, infants did not generalize their knowledge about the unreliability of a looker to a second ‘naïve’ looker. These findings suggest that 14‐month‐old infants adapt their gaze following as a function of their past experience with the looker.

5.
We used a novel intermodal association task to examine whether infants associate own‐ and other‐race faces with music of different emotional valences. Three‐ to 9‐month‐olds saw a series of neutral own‐ or other‐race faces paired with happy or sad musical excerpts. Three‐ to 6‐month‐olds did not show any specific association between face race and music. At 9 months, however, infants looked longer at own‐race faces paired with happy music than at own‐race faces paired with sad music. Nine‐month‐olds also looked longer at other‐race faces paired with sad music than at other‐race faces paired with happy music. These results indicate that infants with nearly exclusive own‐race face experience develop associations between face race and music emotional valence in the first year of life. The potential implications of such associations for developing racial biases in early childhood are discussed.

6.
Body movements, as well as faces, communicate emotions. Research in adults has shown that the perception of action kinematics plays a crucial role in understanding others' emotional experiences. Still, little is known about infants' sensitivity to bodily emotional expressions, since most infancy research has focused on faces. While there is some initial evidence that infants can recognize emotions conveyed in whole‐body postures, it remains an open question whether they can extract emotional information from action kinematics. We measured electromyographic (EMG) activity over the muscles involved in happy (zygomaticus major, ZM), angry (corrugator supercilii, CS) and fearful (frontalis, F) facial expressions, while 11‐month‐old infants observed the same action performed with either happy or angry kinematics. Results demonstrate that infants responded to angry and happy kinematics with matching facial reactions. In particular, ZM activity increased and CS activity decreased in response to happy kinematics, and vice versa for angry kinematics. Our results show for the first time that infants can rely on kinematic information to pick up on the emotional content of an action. Thus, from very early in life, action kinematics represent a fundamental and powerful source of information for revealing others' emotional states.

7.
Infants' ability to discriminate emotional facial expressions and tones of voice is well established, yet little is known about infant discrimination of emotional body movements. Here, we asked whether 10–20‐month‐old infants rely on high‐level emotional cues or low‐level motion‐related cues when discriminating between emotional point‐light displays (PLDs). In Study 1, infants viewed 18 pairs of angry, happy, sad, or neutral PLDs. Infants looked longer at angry than at neutral, at happy than at neutral, and at neutral than at sad PLDs. Motion analyses revealed that infants preferred the PLD with more total body movement in each pairing. Study 2, in which infants viewed inverted versions of the same pairings, yielded similar findings except for the sad–neutral pairing. Study 3 directly paired all three emotional stimuli in both orientations. The angry and happy stimuli did not differ significantly in total motion, but both had more motion than the sad stimuli. Infants looked longer at angry than at sad, longer at happy than at sad, and about equally at angry and happy in both orientations. Again, therefore, infants preferred the PLDs with more total body movement. Overall, the results indicate that a low‐level motion preference may drive infants' discrimination of emotional human walking motions.
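The "total body movement" quantity behind these motion analyses can be made concrete. Below is a minimal sketch, assuming PLD marker coordinates are available as a frames × points × 2 array (an illustrative layout; the 13-marker walker and the specific metric are assumptions, not the study's actual pipeline), that computes total motion as summed frame-to-frame displacement:

```python
# A sketch of one way to quantify "total body movement" in a point-light
# display: the summed frame-to-frame displacement of all points. The array
# layout (frames x points x 2D coordinates) and the 13-marker walker are
# illustrative assumptions, not the study's actual motion analysis.
import numpy as np

def total_motion(pld: np.ndarray) -> float:
    """pld: array of shape (n_frames, n_points, 2) holding x/y coordinates."""
    # Euclidean distance each point travels between consecutive frames
    step_lengths = np.linalg.norm(np.diff(pld, axis=0), axis=2)
    return float(step_lengths.sum())

# Hypothetical displays: random walks with larger steps stand in for the
# higher-motion (e.g. angry) stimuli, smaller steps for the sad stimuli
rng = np.random.default_rng(0)
high_motion = rng.normal(scale=2.0, size=(100, 13, 2)).cumsum(axis=0)
low_motion = rng.normal(scale=0.5, size=(100, 13, 2)).cumsum(axis=0)
print(total_motion(high_motion) > total_motion(low_motion))  # expect True
```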

8.
In social situations, skillful regulation of emotion and behavior depends on efficiently discerning others' emotions. Identifying factors that promote timely and accurate discernment of facial expressions can therefore advance understanding of social emotion regulation and behavior. The present research examined whether trait mindfulness predicts neural and behavioral markers of early top‐down attention to, and efficient discrimination of, socioemotional stimuli. Attention‐based event‐related potentials (ERPs) and behavioral responses were recorded while participants (N = 62; White; 67% female; mean age = 19.09 years, SD = 2.14 years) completed an emotional go/no‐go task involving happy, neutral, and fearful facial expressions. Mindfulness predicted larger (more negative) N100 and N200 ERP amplitudes to both go and no‐go stimuli. Mindfulness also predicted faster response times that were not attributable to a speed–accuracy trade‐off. These relations remained significant after accounting for attentional control or social anxiety. This study adds neurophysiological support for foundational accounts holding that mindfulness entails moment‐to‐moment attention with a lower tendency toward habitual patterns of responding. Mindfulness may enhance the quality of social behavior in socioemotional contexts by promoting efficient top‐down attention to, and discrimination of, others' emotions, alongside greater monitoring and inhibition of automatic response tendencies.
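As a purely illustrative rendering of the ERP measures named above, the sketch below extracts mean amplitude in assumed N100 and N200 time windows from baseline-corrected single-electrode epochs. The window bounds, sampling rate, and array layout are assumptions, not this study's analysis pipeline:

```python
# A sketch of mean-amplitude ERP quantification at one electrode. Window
# bounds (N100: 80-120 ms; N200: 180-250 ms), the 500 Hz sampling rate, and
# the (n_trials, n_samples) layout are illustrative assumptions.
import numpy as np

def mean_amplitude(epochs: np.ndarray, times: np.ndarray,
                   window: tuple[float, float]) -> np.ndarray:
    """epochs: baseline-corrected voltages, shape (n_trials, n_samples);
    times: sample times in seconds; window: (start, end) in seconds."""
    mask = (times >= window[0]) & (times <= window[1])
    return epochs[:, mask].mean(axis=1)  # one mean amplitude per trial

# Hypothetical epochs: 100 trials sampled at 500 Hz from -0.2 to 0.8 s
times = np.arange(-0.2, 0.8, 1 / 500)
epochs = np.random.default_rng(1).normal(size=(100, times.size))

n100 = mean_amplitude(epochs, times, (0.08, 0.12)).mean()  # more negative = larger N100
n200 = mean_amplitude(epochs, times, (0.18, 0.25)).mean()  # more negative = larger N200
print(f"N100 mean: {n100:.3f} uV, N200 mean: {n200:.3f} uV")
```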

9.
Two groups of mothers and their infants (24 infants, mean age = 3.5 months, and 24 infants, mean age = 5.5 months) were video‐ and audio‐taped in their homes while playing with a Jack‐in‐the‐box. The mean fundamental frequency of mothers' spontaneous surprise exclamations when opening the toy was analysed, and infant and maternal facial expressions of surprise were coded in three regions of the face. A t‐test established that significantly more of the older children than the younger children showed surprise (t = −2.96, df = 46, p < 0.005, two‐tailed): 29% of the younger infants, compared with 67% of the older children, showed facial expressions of surprise. A t‐test of maternal pitch height (Hz) indicated that mothers exclaimed in surprise with a higher pitch when the child did not show a surprise facial expression (mean = 415.61 Hz) than when the child showed surprise (mean = 358.97 Hz; t = 2.9, df = 46, p = 0.006, two‐tailed). A multiple regression established that the infant's expression was a stronger predictor of maternal vocal pitch than the infant's age. These results are discussed in terms of maternal use of emotional expressions as ‘social signals’. Copyright © 2002 John Wiley & Sons, Ltd.
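The two reported analyses can be illustrated in form, though not in data. In the sketch below, only the group means (415.61 Hz and 358.97 Hz) and df = 46 (i.e. 48 infants in total) come from the abstract; the equal 24/24 split by expression, the standard deviation, and all simulated values are assumptions:

```python
# Illustrative re-creation of the two analyses on simulated data: an
# independent-samples t-test on maternal pitch by infant expression, and a
# multiple regression predicting pitch from infant expression and age.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Group means are from the study; SD = 60 Hz and n = 24 per group are assumed
pitch_no_surprise = rng.normal(415.61, 60, size=24)
pitch_surprise = rng.normal(358.97, 60, size=24)

t, p = stats.ttest_ind(pitch_no_surprise, pitch_surprise)
df = len(pitch_no_surprise) + len(pitch_surprise) - 2
print(f"t({df}) = {t:.2f}, p = {p:.4f}")

# Regression: pitch ~ infant expression (0 = no surprise, 1 = surprise) + age
surprise = np.r_[np.zeros(24), np.ones(24)]
age = rng.choice([3.5, 5.5], size=48)              # the study's two age groups
X = np.column_stack([np.ones(48), surprise, age])  # design matrix with intercept
y = np.r_[pitch_no_surprise, pitch_surprise]
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("coefficients (intercept, expression, age):", np.round(beta, 2))
```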

10.
Recognition of emotional facial expressions is a crucial skill for adaptive behavior. Past research suggests that at 5 to 7 months of age, infants look longer at an unfamiliar dynamic angry/happy face that emotionally matches a vocal expression, suggesting that they can match stimulations from distinct modalities on their emotional content. In the present study, olfaction–vision matching abilities were assessed across different age groups (3, 5 and 7 months) using dynamic expressive faces (happy vs. disgusted) and distinct hedonic odor contexts (pleasant, unpleasant and control) in a visual‐preference paradigm. At all ages the infants were biased toward the disgust faces. This visual bias reversed into a bias for smiling faces in the pleasant odor context in 3‐month‐old infants. In infants aged 5 and 7 months, no effect of the odor context appeared under the present conditions. This study highlights the role of the olfactory context in modulating visual behavior toward expressive faces in infants. The influence of olfaction took the form of a contingency effect in 3‐month‐old infants, but later vanished or took another form that could not be detected in the present study.

11.
The use of an adult as a resource for help and instruction in a problem‐solving situation was examined in 9‐, 14‐, and 18‐month‐old infants. Infants were placed in various situations ranging from a simple means‐end task, in which a toy was placed beyond infants' prehensile space on a mat, to instances in which an attractive toy was placed inside closed transparent boxes that were more or less difficult for the child to open. The experimenter gave hints and modelled the solution each time the infant made a request (pointing, reaching, or showing a box to the experimenter), or if the infant was unable to solve the problem. Infants' success on the problems, sensitivity to the experimenter's modelling, and communicative gestures (requests, co‐occurrence of looking behaviour and requests) were analysed. Results show that older infants had greater success in solving the problems, although they exhibited more difficulty than the younger infants in solving the simple means‐end task. Moreover, 14‐ and 18‐month‐olds were sensitive to the experimenter's modelling and used her demonstration cues to solve problems. By contrast, 9‐month‐olds did not show such sensitivity. Finally, 9‐month‐old infants displayed significantly fewer communicative gestures toward the adult than the other age groups, although in general all infants tended to increase their frequency of requests as a function of problem difficulty. These observations support the idea that during the first half of the second year infants develop a new collaborative stance toward others. This stance is interpreted as foundational to teaching and instruction, two mechanisms of social learning that are sometimes considered specifically human. Copyright © 2006 John Wiley & Sons, Ltd.

12.
Face‐to‐face interaction between infants and their caregivers is a mainstay of developmental research. However, common laboratory paradigms for studying dyadic interaction oversimplify the act of looking at the partner's face by seating infants and caregivers face to face in stationary positions. In less constrained conditions when both partners are freely mobile, infants and caregivers must move their heads and bodies to look at each other. We hypothesized that face looking and mutual gaze for each member of the dyad would decrease with increased motor costs of looking. To test this hypothesis, 12‐month‐old crawling and walking infants and their parents wore head‐mounted eye trackers to record eye movements of each member of the dyad during locomotor free play in a large toy‐filled playroom. Findings revealed that increased motor costs decreased face looking and mutual gaze: Each partner looked less at the other's face when their own posture or the other's posture required more motor effort to gain visual access to the other's face. Caregivers mirrored infants' posture by spending more time down on the ground when infants were prone, perhaps to facilitate face looking. Infants looked more at toys than at their caregiver's face, but caregivers looked at their infant's face and at toys in equal amounts. Furthermore, infants looked less at toys and faces compared to studies that used stationary tasks, suggesting that the attentional demands differ in an unconstrained locomotor task. Taken together, findings indicate that ever‐changing motor constraints affect real‐life social looking.

13.
Three studies investigated infants’ understanding that gaze involves a relation between a person and the object of his or her gaze. Infants were habituated to an event in which an actor turned and looked at one of two toys. Then, infants saw test events in which (1) the actor turned to the same side as during habituation to look at a different toy, or (2) the actor turned to the other side to look at the same toy as during habituation. The first of these involved a change in the relation between actor and object. The second involved a new physical motion on the part of the actor but no change in the relation between actor and object. Seven‐ and 9‐month‐old infants did not respond to the change in relation between actor and object, although infants at both ages followed the actor's gaze to the toys. In contrast, 12‐month‐old infants responded to the change in the actor–object relation. Control conditions verified that the paradigm was a sensitive index of the younger infants’ representations of action: 7‐ and 9‐month‐olds responded to a change in the actor–object relation when the actor's gaze was accompanied by a grasp. Taken together, these findings indicate that gaze‐following does not initially go hand in hand with understanding the relation between a person who looks and the object of his or her gaze, and that infants begin to understand this relation between 9 and 12 months.

14.
In the current study, 24‐ to 27‐month‐old children (N = 37) used pointing gestures in a cooperative object choice task with either peer or adult partners. When indicating the location of a hidden toy, children pointed equally accurately for adult and peer partners but more often for adult partners. When choosing from one of three hiding places, children used adults’ pointing to find a hidden toy significantly more often than they used peers’. In interaction with peers, children's choice behavior was at chance level. These results suggest that toddlers ascribe informative value to adults’ but not peers’ pointing gestures, and highlight the role of children's social expectations in their communicative development.

15.
A preference for static face patterns is observed in newborns and disappears around 3 months after birth. A previous study demonstrated that 5‐month‐old infants prefer schematic faces only when the internal features are moving, suggesting that face‐specific movement enhances infants' preference. The present study investigates the facilitative effect of the movement of internal facial features on infants' preference. To examine infants' preference, we used animated face patterns consisting of a head‐shaped contour and three disk blobs. The inner blobs expanded and contracted to represent the opening and closing of the eyes and mouth, and were constrained to open and close only in a biologically possible vertical direction resembling the facial muscle structure. We compared infants' preferential looking time for this vertically moving (VM) face pattern with their looking time for a horizontally moving (HM) face pattern in which the blobs transformed at the same speed in a biologically impossible, horizontal direction. In Experiment 1, 7‐ to 8‐month‐olds preferred the VM pattern to the HM pattern, but 5‐ to 6‐month‐olds did not. However, the preference was diminished in both cases when the moving face patterns were presented without a contour (Experiment 2). Our results suggest that internal facial features with vertical movements promote face preference in 7‐ to 8‐month‐olds. Copyright © 2011 John Wiley & Sons, Ltd.

16.
The present study explored the influence of facial emotional expressions on preschoolers' identity recognition, analyzed using a two‐alternative forced‐choice matching task. A decrement was observed in children's performance with emotional faces compared with neutral faces, both when a happy emotional expression remained unchanged between the target face and the test faces and when the expression changed from happy to neutral or from neutral to happy between the target and the test faces (Experiment 1). Negative emotional expressions (i.e. fear and anger) also interfered with children's identity recognition (Experiment 2). The evidence obtained suggests that in preschool‐age children, facial emotional expressions are processed in interaction with, rather than independently of, the encoding of facial identity information. The results are discussed in relation to relevant research conducted with adults and children.

17.

This paper describes a method for measuring an individual's sensitivity to different facial expressions. It shows that individual participants are more sensitive to happy than to fearful expressions, and that the differences are statistically significant under a model‐comparison approach. Sensitivity is measured by asking participants to discriminate between an emotional facial expression and a neutral expression of the same face. The expression was diluted to different degrees by combining it in different proportions with the neutral expression using morphing software. Sensitivity is defined as the proportion of neutral expression in a stimulus at which participants discriminate the emotional expression on 75% of presentations. Individuals could reliably discriminate happy expressions diluted with a greater proportion of the neutral expression than was required for discrimination of fearful expressions, indicating that individual participants are more sensitive to happy than to fearful expressions. Sensitivity is equivalent when measured in two different testing sessions, and the greater sensitivity to happy expressions is maintained with short stimulus durations and with stimuli generated using different morphing software. The increased sensitivity to happy compared with fearful expressions was affected at smaller image sizes for some participants. Applications of the approach to clinical populations, and to understanding the relative contributions of perceptual and affective processing to facial expression recognition, are discussed.
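The 75%-of-presentations definition lends itself to a psychometric-function illustration. Below is a minimal sketch that fits a logistic function to hypothetical discrimination data and reads off the 75%-correct morph proportion; the data, parameterization, and least-squares fit are illustrative assumptions (the paper itself reports a model-comparison approach):

```python
# A sketch of estimating a 75%-correct discrimination threshold by fitting a
# logistic psychometric function to morph-proportion data. All values and the
# fitting routine are illustrative, not the paper's method.
import numpy as np
from scipy.optimize import curve_fit

def psychometric(x, x50, slope):
    # Proportion correct rises from chance (0.5) toward 1.0 as the proportion
    # of the emotional expression in the morph (x) increases
    return 0.5 + 0.5 / (1.0 + np.exp(-slope * (x - x50)))

# Hypothetical data: morph proportion of the emotional expression (x) and
# observed proportion of correct discriminations at each level
x = np.array([0.05, 0.10, 0.20, 0.40, 0.60, 0.80])
p_correct = np.array([0.52, 0.55, 0.68, 0.85, 0.96, 0.99])

(x50, slope), _ = curve_fit(psychometric, x, p_correct, p0=[0.3, 10.0])

# With this parameterization the midpoint x50 is exactly the 75%-correct
# point; a smaller x50 (less emotional content needed) means higher sensitivity.
print(f"75%-correct threshold (morph proportion): {x50:.2f}")
```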

18.
This study investigated vocal and facial expression matching in 24 10‐month‐old infants. Half of the mothers had reported depressive symptoms [i.e., elevated scores on the Center for Epidemiological Studies–Depression Index (CES‐D)] during the previous week. Infants were tested using a two‐screen preference procedure in which they were presented with side‐by‐side videos of different facial expressions modeled by one female reciting a children's story. A centrally located speaker presented a vocal expression soundtrack that matched one of the facial expressions. Separate analyses of variance (ANOVAs) were conducted on the proportion of total matching and the proportion of total looking to the happy and sad expressions. Infants of mothers who reported depressive symptoms matched the happy facial and vocal expressions less accurately, and looked more at sad facial expressions, than infants of mothers who had not reported depressive symptoms above the normal range. Infants' performance on the expression‐matching task thus appears to be related to their primary caregivers' reports of depressive symptoms during the previous week. However, other factors that may be related to the group differences also need to be considered; for example, maternal reports of depressive symptoms may be a marker for other underlying factors that affected the infants' performance. © 1997 Michigan Association for Infant Mental Health

19.
Infants' understanding of how their actions affect the visibility of hidden objects may be a crucial aspect of the development of search behaviour. To investigate this possibility, 7‐month‐old infants took part in a two‐day training study. At the start of the first session, and at the end of the second, all infants performed a search task with a hiding‐well. On both days, infants had an additional training experience. The ‘Agency group’ learnt to spin a turntable to reveal a hidden toy, whilst the ‘Means‐End’ group learnt the same means‐end motor action, but the toy was always visible. The Agency group showed greater improvement on the hiding‐well search task following their training experience. We suggest that the Agency group's turntable experience was effective because it provided the experience of bringing objects back into visibility by one's actions. Further, the performance of the Agency group demonstrates generalized transfer of learning across situations with both different motor actions and stimuli in infants as young as 7 months.

20.
An ability to detect the common location of multisensory stimulation is essential for us to perceive a coherent environment, to represent the interface between the body and the external world, and to act on sensory information. Regarding the tactile environment “at hand”, we need to represent somatosensory stimuli impinging on the skin surface in the same spatial reference frame as distal stimuli, such as those transduced by vision and audition. Across two experiments we investigated whether 6‐ (n = 14; Experiment 1) and 4‐month‐old (n = 14; Experiment 2) infants were sensitive to the colocation of tactile and auditory signals delivered to the hands. We recorded infants’ visual preferences for spatially congruent and incongruent auditory‐tactile events delivered to their hands. At 6 months, infants looked longer toward incongruent stimuli, whilst at 4 months infants looked longer toward congruent stimuli. Thus, even from 4 months of age, infants are sensitive to the colocation of simultaneously presented auditory and tactile stimuli. We conclude that 4‐ and 6‐month‐old infants can represent auditory and tactile stimuli in a common spatial frame of reference. We explain the age‐wise shift in infants’ preferences from congruent to incongruent in terms of an increased preference for novel crossmodal spatial relations based on the accumulation of experience. A comparison of looking preferences across the congruent and incongruent conditions with a unisensory control condition indicates that the ability to perceive auditory‐tactile colocation is based on a crossmodal rather than a supramodal spatial code by 6 months of age at least.
