Journal of Experimental Child Psychology 126 (2014) 68–79


The development of intermodal emotion perception from bodies and voices

Nicole Zieber, Ashley Kangas, Alyson Hock, Ramesh S. Bhatt

Department of Psychology, University of Kentucky, Lexington, KY 40506, USA


Article history: Received 18 January 2014; Revised 9 March 2014

Keywords: Emotion recognition; Intermodal emotion perception; Body emotion; Body knowledge development; Vocal emotion; Infant emotion perception

Abstract

Even in the absence of facial information, adults are able to efficiently extract emotions from bodies and voices. Although prior research indicates that 6.5-month-old infants match emotional body movements to vocalizations, the developmental origins of this function are unknown. Moreover, it is not clear whether infants perceive emotion conveyed in static body postures and match them to vocalizations. In the current experiments, 6.5-month-olds matched happy and angry static body postures to corresponding vocalizations in upright images but not in inverted images. However, 3.5-month-olds failed to match. The younger infants also failed to match when tested with videos of emotional body movements that older infants had previously matched. Thus, whereas 6.5-month-olds process emotional cues from body images and match them to emotional vocalizations, 3.5-month-olds do not exhibit such emotion knowledge. These results indicate developmental changes that lead to sophisticated emotion processing from bodies and voices early in life.

Introduction

The ability to understand the emotions of another individual, whether they are conveyed through facial expressions, vocal cues, or body language, is vital for effective social interactions. A great deal of research has examined how adults process emotions in faces and voices and, to a lesser extent, from body postures or movement. Bodies are a significant source of emotion information, and under some circumstances (such as when a person is at a distance or when action is required), bodies may reveal information that cannot be gleaned from faces alone (Camurri, Lagerlöf, & Volpe, 2003; de Gelder, 2009). Recent studies suggest that adults' accuracy in identifying emotions conveyed in body postures and movements is comparable to their accuracy in perceiving emotions from faces (Atkinson, 2013; Atkinson, Dittrich, Gemmell, & Young, 2004; Atkinson, Tunstall, & Dittrich, 2007; Coulson, 2004). In addition, children as young as 4 or 5 years have been shown to accurately derive emotion information from bodies (Boone & Cunningham, 1998). However, although many studies have documented the development of emotion perception from facial and vocal expressions early in life, little research has explored infants' sensitivity to emotion portrayed in bodies.

One recent study found that 6.5-month-old infants are able to extract emotions from dynamic body movements and match them to emotional vocalizations (Zieber, Kangas, Hock, & Bhatt, 2014). However, no study has examined the developmental origins of this capacity. Moreover, it is unknown whether infants are able to extract emotion from bodies when motion information is not available (i.e., from static body postures). The current study addressed both of these issues by examining whether 3.5- and 6.5-month-old infants match happy and angry body postures to their corresponding emotional vocalizations.

The ability to perceive emotion information in a variety of modalities develops early in life, presumably due to the importance of accurate emotion perception for effective social functioning. A number of studies have shown that infants perceive emotion from faces and voices during their first year of life, and some studies even suggest that newborns discriminate between some basic facial expressions, although this discrimination may rely on a perceptual bias to attend to certain basic cues such as toothy smiles (Caron, Caron, & Myers, 1985; Farroni, Menon, Rigato, & Johnson, 2007; Field, Woodson, Greenberg, & Cohen, 1982; Leppänen & Nelson, 2006; Nelson, 1987; Oster, 1981).

In the infant emotion perception literature, a distinction has been made between the discrimination and recognition of emotional expressions (Walker-Andrews, 1997). Discrimination requires that infants perceive the difference between two emotions, whereas recognition suggests that infants understand the underlying meaning of the expressions at least to some extent (Walker-Andrews, 1997). By 3 or 4 months of age, it is clear that infants discriminate among many basic emotions (happiness, sadness, anger, fear, and surprise) in images of static faces (e.g., Barrera & Maurer, 1981; Flom, Whipple, & Hyde, 2009; LaBarbera, Izard, Vietze, & Parisi, 1976; Schwartz, Izard, & Ansul, 1985; Young-Browne, Rosenfeld, & Horowitz, 1978). In addition, although fewer studies have examined vocal expressions of emotion, several have found that 5-month-olds discriminate among happy, sad, and angry vocal expressions, at least when presented in the context of a face (Flom & Bahrick, 2007; Walker-Andrews, 1997; Walker-Andrews & Grolnick, 1983; Walker-Andrews & Lennon, 1991).

One way in which emotion recognition has been assessed is via intermodal matching tasks.
In these tasks, infants simultaneously view two videos of emotional expressions (presented side by side) while hearing an auditory recording that matches one of the emotional expressions (Soken & Pick, 1992; Walker, 1982; Walker-Andrews, 1986; Walker-Andrews, 1988). If infants look longer at the emotionally congruent stimulus, then this is considered evidence that infants detect some common affective information because, in order to match emotional expressions in physically different modalities (e.g., in faces, in voices), infants must recognize the correlations between affective information portrayed in the two modalities. Thus, many previous studies have used intermodal matching as an index of affective knowledge. It should be noted, however, that the use of the term recognition to characterize matching of affective information in different modalities (Walker-Andrews, 1997) does not imply that infants' emotion processing is adult-like and involves complete understanding of the meanings and functions of emotions. Rather, the suggestion is that intermodal matching indicates a level of knowledge about affect that goes beyond simple discrimination of modality-specific features and indicates sensitivity to common affective information in expressions from different modalities such as faces, voices, and bodies.

Studies that have used the intermodal matching procedure have found that infants match facial and vocal expressions of emotion sometime between 5 and 7 months of age (Soken & Pick, 1992; Walker, 1982; Walker-Andrews, 1986; Walker-Andrews, 1988). This age range for emotion recognition is consistent with other measures of recognition, including categorization (Bornstein & Arterberry, 2003; Ludemann, 1991; Ludemann & Nelson, 1988; Nelson & Dolgin, 1985; Nelson, Morse, & Leavitt, 1979; Quinn et al., 2011). However, Kahana-Kalman and Walker-Andrews (2001) found that infants match their mothers' faces to voices even at 3.5 months of age (see also Walker-Andrews, Krogh-Jespersen, Mayhew, & Coffield, 2011). Thus, at least by 3.5 months of age, infants exhibit emotion recognition as indexed by intermodal matching, but it is restricted to their mothers at this age.

One recent study suggested that recognition of emotions in bodies is evident at around 6 months of age. Zieber and colleagues (2014) tested 6.5-month-olds using an intermodal preference procedure to determine whether infants match happy and angry body movements to their corresponding vocalizations. In that study, infants preferred to view emotional body movements congruent with the happy or angry nonverbal emotional vocalizations heard during the task. Infants did not exhibit preferences and failed to match sounds to actions when the exact same stimuli were inverted, indicating that matching in the upright conditions was not merely attributable to some general stimulus property unrelated to affect. In addition, the actors in the study wore gray bodysuits that obscured all facial information, demonstrating that infants are sensitive to emotions exclusively portrayed through nonfacial body movements.

However, Zieber and colleagues (2014) did not address the developmental origins of body emotion knowledge. It is possible that body knowledge development parallels face knowledge development, such that infants' matching of body emotions to vocalizations will be evident only sometime around 5 to 7 months of age and not earlier. On the other hand, bodies are larger than faces, and emotional body movements generally involve more overall movement than facial emotions. Thus, body information may be more readily available to young infants. Moreover, in some circumstances, body cues are more reliable signals of emotions than facial cues. For example, Aviezer, Trope, and Todorov (2012) found that adults perceive peak emotions better from body cues than from facial information. Thus, it is conceivable that even infants younger than 5 months would match body information to vocalizations. This possibility is also supported by the previously discussed finding by Kahana-Kalman and Walker-Andrews (2001) that 3.5-month-olds match their mothers' facial emotions to voices, suggesting some basic level of facial emotion recognition by this age. Thus, the current study examined the development of emotion recognition by comparing the performance of 3.5- and 6.5-month-olds in an intermodal body-vocalization matching task.

In addition, we examined whether infants are sensitive to emotions conveyed in static body postures. Adults readily detect emotions in static body postures (e.g., Atkinson et al., 2004; Aviezer et al., 2012; Coulson, 2004). Coulson (2004), for example, tested adults' discrimination of emotion from static body postures and concluded that "static body posture offers a reliable source of information concerning emotion" (p. 137). Aviezer and colleagues (2012) found that static body posture can be more informative than even facial cues when adults detect peak emotions. Thus, it is clear that adults are adept at processing emotions from static bodies. Moreover, in general, most of what is known about emotion processing in adults comes from static face and body images.
Therefore, it is important to examine the developmental origins of perception of emotions from static cues. However, to date the only study that has examined infants’ sensitivity to body emotions (i.e., Zieber et al., 2014) used videos of body movements, so it is unknown whether infants will respond to emotions in static body postures. We examined this issue in the current experiments.

Experiment 1

In Experiment 1, we asked whether 6.5-month-old infants recognize emotions conveyed in static body postures. Our task used a procedure similar to that of Zieber and colleagues (2014), except that infants were tested on static body postures rather than on body movements. Specifically, infants heard either a happy or angry nonverbal vocalization (e.g., laughing or grunting) while viewing two emotional static body postures (happy and angry) side by side on a computer display. The dependent measure was infants' preference for the congruent body emotion. In addition, half of the infants were tested with inverted images in a control condition. This condition was included to determine whether infants in the upright condition were matching based on recognition of the emotional information conveyed by the upright canonical body or whether matching might be based on some feature present in both the visual and auditory modalities that is unrelated to affect. For example, the angry vocalization might have qualities (e.g., a sharp staccato sound) that have an equivalent presence in the visual perception of the bodies (e.g., sharp angles created by the position of the limbs). These qualities are present both when the bodies are upright and when they are inverted, so matching based on them could be performed regardless of body orientation. If, however, infants match to the congruent vocalization based on affect knowledge, we would expect matching to be constrained to the more frequently experienced, familiar (upright) body posture. Thus, if infants match an emotional vocalization to an appropriate body posture in the upright condition but not in the inverted condition, then it would suggest that infants derive at least some level of affective information from body postures (Zieber et al., 2010; Zieber et al., 2014).

Method

Participants
Thirty-two 6.5-month-old infants (18 boys and 14 girls; mean age = 198.87 days, SD = 10.66) from predominantly middle-class Caucasian families participated in Experiment 1. They were recruited from a university hospital and from newspaper birth reports. Data from an additional three infants were excluded due to side bias (i.e., >95% looking to one side).

Visual stimuli
The body images used were static displays of happy and angry body postures obtained from Atkinson and colleagues (2004) (see Fig. 1). These postures were stills of emotion peaks taken from videos of actors who were asked to enact five emotions (happy, angry, sad, disgust, and fear) while wearing a suit that covered their faces and bodies. The actors were five men and five women between 18 and 22 years of age. They were told to portray the emotions in any way they saw fit. Adult participants (18-33 years of age) classified these stimuli as belonging to one of five categories of emotion while viewing the stimuli one at a time. Their accuracy levels were high (>85%, with chance being 20%). Four different happy/angry actor pairs (two male pairs and two female pairs) were chosen from this set. Inverted stimuli were the same images rotated 180 degrees. From the infants' viewpoint, the body images subtended approximately 11.42 × 7.63 degrees.
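The reported image size can be related to the viewing distance given in the Apparatus and procedure section below (approximately 45 cm) via the standard visual-angle formula. The following is a minimal Python sketch of that relation; the recovered on-screen dimensions are our own back-calculation, not values reported by the authors:

```python
import math

DISTANCE_CM = 45.0  # approximate viewing distance reported in the Method

def visual_angle_deg(size_cm, distance_cm=DISTANCE_CM):
    """Visual angle (degrees) subtended by a stimulus of a given physical size."""
    return 2 * math.degrees(math.atan(size_cm / (2 * distance_cm)))

def size_for_angle_cm(angle_deg, distance_cm=DISTANCE_CM):
    """Inverse relation: physical size needed to subtend a given visual angle."""
    return 2 * distance_cm * math.tan(math.radians(angle_deg / 2))

# Back-calculate the approximate on-screen image size from the reported
# 11.42 x 7.63 degree extent: roughly 9.0 cm tall by 6.0 cm wide.
print(round(size_for_angle_cm(11.42), 1), round(size_for_angle_cm(7.63), 1))
```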

[Figure 1: two example body postures, labeled Happy and Angry]

Fig. 1. Examples of the stimuli used in Experiments 1 and 2 depicting happy and angry body postures. Infants simultaneously viewed happy and angry body postures while hearing either a happy or angry vocalization. Infants in the upright condition viewed stimuli upright, whereas those in the inverted condition saw the same stimuli rotated 180 degrees.


Auditory stimuli
The happy and angry nonverbal vocalizations were adapted from Sauter, Eisner, Calder, and Scott (2010a). In that and a subsequent study (Sauter, Eisner, Ekman, & Scott, 2010b), adults from different cultures demonstrated recognition of the emotions depicted in these vocalizations at levels greater than would be expected by chance. The auditory tokens used in the current study were each less than 3 s in length (average duration = 1.83 s) and were repeated every 3 s for a total of five repetitions in one test trial. Four vocalizations (two happy and two angry) were chosen while matching the gender of the body pairs (two male and two female). These auditory stimuli were the same as those that infants matched to emotional body movements in Zieber and colleagues' (2014) study. Each pair of happy/angry body postures was equally often accompanied by a happy or an angry sound, so that each body posture was equally often the matching and nonmatching stimulus.

Apparatus and procedure
Infants were seated approximately 45 cm from a 50-cm computer monitor in a darkened chamber. A video camera and a DVD recorder recorded infants' looks. Infants first saw a red flashing star located centrally on the computer monitor, and each trial began when infants fixated the center and the experimenter pressed a key. Then, a pair of images appeared side by side on the screen for 15 s. Each infant was tested on one of the four actor pairs. Half of the infants were tested on upright stimuli, whereas the others were tested on inverted stimuli. Within each condition, the initial left-right position of the congruent body was counterbalanced across infants, and this position was switched on the second trial. For a given pair of visual happy-angry stimuli, half of the infants heard a happy sound, whereas the others heard an angry sound. Thus, each happy and angry body posture was equally often a matching or nonmatching stimulus. The vocalizations were presented via two speakers that were centrally located on top of the monitor.

The dependent measure was the percentage preference for the congruent body posture across the two trials. Coding of infants' performance was conducted offline by a naive coder unaware of the left-right location of the stimulus patterns and with the DVD player slowed to 25% of normal speed. A second coder verified coding reliability for 25% of the infants (Pearson's r = .91).

Results and discussion
Infants matched body postures to emotional vocalizations in the upright condition but not in the inverted condition (Table 1). Infants in the upright condition displayed a preference for the congruent posture that was significantly different from the chance level of 50%, t(15) = 2.54, p = .02, d = 1.31. In contrast, infants in the inverted condition failed to demonstrate a significant preference, t(15) = 1.46, p = .16, suggesting that the preference in the upright condition was not due to low-level image features. Of the 16 infants in the upright condition, 12 had matching preference scores greater than 50% (binomial probability < .04), whereas only 7 of the 16 infants in the inverted condition had such scores (binomial probability > .70). To directly compare performance in the upright and inverted conditions, and to examine whether performance differed as a function of the matching emotion (i.e., when the vocalization was happy vs. when it was angry), we conducted an Orientation (upright or inverted) × Emotion (happy or angry) analysis of variance (ANOVA).
Only the orientation main effect was significant, F(1, 28) = 6.65, p = .016, ηp² = .19. Neither the emotion main effect, F(1, 28) = 0.62, p = .81, nor the interaction, F(1, 28) = 1.16, p = .29, was significant. Thus, infants' matching performance was superior in the upright condition compared with the inverted condition and did not differ as a function of the matching emotion (Table 1). [Note that, although we have compared performance on happy versus angry vocalization trials (and found no difference), any differences in scores must be viewed with caution because they may be due to disparities in infants' preference for one kind of visual stimulus over another rather than to different levels of matching on the two kinds of emotions. For instance, if infants prefer to view happy postures over angry ones, this would result in higher scores on the happy vocalization trials and poorer scores on the angry vocalization trials independent of the degree of matching of visual stimuli to vocalizations. Thus, overall performance across the two kinds of trials (with counterbalanced sound and posture pairings) is more meaningful in indicating matching performance than performance on one kind of trial.]
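To make the dependent measure and the chance-level tests concrete, the sketch below reproduces the analysis pipeline described above: a percentage-preference score per infant (looking time to the congruent stimulus as a proportion of total looking time, pooled over the two trials), a two-tailed one-sample t test against the 50% chance level, and a binomial (sign) test on the number of infants scoring above 50%. The looking-time values and the helper name percent_preference are illustrative assumptions, not the study's data or code:

```python
import numpy as np
from scipy import stats

def percent_preference(congruent_s, noncongruent_s):
    """Percentage preference for the congruent stimulus, pooled over the two
    15-s test trials (looking times in seconds)."""
    return 100.0 * sum(congruent_s) / (sum(congruent_s) + sum(noncongruent_s))

# One hypothetical infant: looks to the congruent vs. noncongruent posture
# on trials 1 and 2.
print(percent_preference([8.2, 7.1], [5.9, 6.3]))  # ~55.6%

# Hypothetical preference scores for a group of n = 16 infants (percentages).
prefs = np.array([55.2, 48.9, 61.3, 52.4, 57.0, 49.5, 63.1, 54.8,
                  51.2, 58.4, 46.7, 60.2, 53.9, 56.1, 49.0, 59.3])

t_res = stats.ttest_1samp(prefs, popmean=50.0)   # test against 50% chance
n_above = int((prefs > 50).sum())                # infants scoring above chance
b_res = stats.binomtest(n_above, n=prefs.size, p=0.5)
print(t_res.statistic, t_res.pvalue, n_above, b_res.pvalue)
```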


Table 1
6.5-month-old infants' mean look durations (in seconds) to the matching and nonmatching body postures and percentage preferences for the matching stimuli in Experiment 1.

Orientation    n     Matching M (SE)   Nonmatching M (SE)   Percentage preference M (SE)   t      p
Upright
  Combined     16    12.56 (0.72)      10.39 (0.87)         55.25 (2.06)                   2.54   .02*
  Happy         8    12.04 (1.09)       9.38 (1.26)         56.83 (2.91)                   2.34   .05*
  Angry         8    13.07 (0.99)      11.39 (1.17)         53.66 (3.01)                   1.22   .26#
Inverted
  Combined     16    11.06 (1.03)      12.95 (0.83)         45.34 (3.18)                   1.46   .16#
  Happy         8    10.59 (1.36)      14.14 (1.42)         42.80 (4.23)                   1.70   .13#
  Angry         8    16.71 (1.63)      14.85 (0.72)         47.89 (4.86)                   0.43   .68#

Note. Standard errors are presented in parentheses. The t and p columns compare the mean percentage preference score against the chance level of 50%: *p < .05 (two-tailed); #p > .10, ns (two-tailed). Note that, for the reasons discussed in the text, caution should be exercised in interpreting data on each kind of trial (happy or angry) separately.

These results indicate that 6.5-month-olds are sensitive to emotions exclusively portrayed through static body postures. They discriminated between two emotions (happy and angry) conveyed by body postures and matched them to the appropriate emotional sounds. Infants did not exhibit preferences and failed to match sounds to postures when the exact same stimuli were inverted, indicating that the preference in the upright condition was not driven by some general stimulus property unrelated to affect. This result is in line with previous findings that infants at this age are sensitive to facial and vocal expressions of emotion and prefer to view emotionally congruent displays when presented with affective cues from different modalities (Soken & Pick, 1992; Walker, 1982; Walker-Andrews, 1986).

Experiment 2

Along with the previous findings from Zieber and colleagues (2014), the results of Experiment 1 demonstrate that infants at 6.5 months of age recognize emotions conveyed by body postures and actions. However, to date no study has examined whether even younger infants possess a similar ability to match body emotions to vocalizations. As reviewed previously, most research has found that infants categorize facial and vocal emotional expressions sometime between 5 and 7 months of age and match facial emotions to vocalizations at around 7 months (for reviews, see Grossmann, 2010; Quinn et al., 2011; Walker-Andrews, 1997; Walker-Andrews, 2008). As such, it seems likely that infants younger than 5 months will not match body emotions to vocalizations, but in order to examine this issue, younger infants needed to be tested under the same conditions as the older infants. To this end, in Experiment 2, 3.5-month-old infants were tested using the same stimuli and procedure as those used with the 6.5-month-olds in Experiment 1.

Method

Participants
Sixteen 3.5-month-old infants (nine boys and seven girls; mean age = 106.44 days, SD = 8.20) from predominantly middle-class Caucasian families participated in Experiment 2. Participants were recruited in the same manner as in Experiment 1. Data from one additional infant were excluded due to side bias.

Stimuli, apparatus, and procedure
The stimuli, apparatus, and procedure were the same as in Experiment 1, with the exception that infants were assigned only to the upright condition. We chose to include only an upright condition because if infants failed to match even in that condition, then performance on inverted stimuli would not be informative.


Table 2
3.5-month-old infants' mean look durations (in seconds) to the matching and nonmatching stimuli and percentage preferences for the matching stimuli in Experiments 2 and 3.

Stimulus       n     Matching M (SE)   Nonmatching M (SE)   Percentage preference M (SE)   t      p
Static (Experiment 2)
  Combined     16     9.69 (1.32)      13.26 (1.71)         41.87 (5.21)                   1.56   .14#
  Happy         8     9.40 (2.16)      12.23 (2.65)         45.01 (8.20)                   0.56   .56#
  Angry         8     9.99 (1.67)      16.30 (2.08)         38.73 (6.81)                   1.64   .15#
Dynamic (Experiment 3)
  Combined     16    14.08 (2.26)      14.66 (2.16)         48.58 (7.68)                   0.18   .86#
  Happy         8    17.13 (3.93)      11.61 (3.65)         58.64 (13.33)                  0.65   .53#
  Angry         8    11.04 (1.95)      17.72 (2.02)         38.52 (6.82)                   1.68   .14#

Note. Standard errors are presented in parentheses. The t and p columns compare the mean percentage preference score against the chance level of 50%: #p > .10, ns (two-tailed). Note that, for the reasons discussed in the text, caution should be exercised in interpreting data on each kind of trial (happy or angry) separately.

The dependent measure was the percentage preference for the congruent body emotion across the two trials, and it was coded and calculated in the same manner as in Experiment 1. A second coder verified coding reliability for 25% of the infants (Pearson's r = .98).

Results and discussion
Infants in this experiment failed to match emotional vocalizations to the corresponding body postures (see Table 2). The mean preference for the congruent body emotion did not differ significantly from the chance level of 50%, t(15) = 1.56, p = .14. Moreover, a t test failed to reveal a significant difference in performance on the happy versus angry test trials, t(14) = 0.59, p = .57 (see Table 2). In addition, only 7 of the 16 3.5-month-olds in the current experiment had scores greater than 50% (binomial probability > .70). Thus, unlike the 6.5-month-olds in Experiment 1, the 3.5-month-olds in Experiment 2 failed to match emotional body postures to the appropriate vocalizations.

Experiment 3

The results of Experiments 1 and 2 indicate that a developmental change occurs between 3.5 and 6.5 months of age that allows older infants to match emotional vocalizations to emotional body postures. There could be several explanations for the younger infants' failure to match. One possibility is that the static nature of the bodies rendered them uninformative to the infants. In infants' experience, it is likely that bodies are typically seen in motion rather than completely still, and this may be especially true of emotional bodies. Movement may play an even greater role in the perception of emotion from bodies than from faces; facial movements, although still very informative, are smaller and require more fine motor control than body movements, which tend to be on a larger scale with a greater range of motion. As such, it seemed that the additional information provided by dynamic movements during the expression of an emotion might enable even younger infants to match an emotional vocalization to a corresponding dynamic emotional body expression. Support for this possibility comes from prior research suggesting that infants exhibit superior performance on dynamic stimuli compared with static stimuli in a variety of tasks (for reviews, see Kellman & Arterberry, 1998; Quinn et al., 2011; Walker-Andrews, 1997; Walker-Andrews, 2008). Experiment 3 tested a group of 3.5-month-old infants to see whether dynamic stimuli facilitate intermodal body-voice emotion matching.

Method

Participants
Sixteen 3.5-month-old infants (seven boys and nine girls; mean age = 100.50 days, SD = 7.11) from predominantly middle-class Caucasian families participated in Experiment 3. Participants were recruited in the same manner as in Experiments 1 and 2.


Data from one additional infant were excluded due to fussiness.

Stimuli
The stimuli were the same as those used in Zieber and colleagues' (2014) study, in which 6.5-month-olds matched emotional vocalizations to congruent emotional body movements. Four videos (two male and two female) of angry and happy expressions from Atkinson and colleagues (2004) comprised the visual stimuli. The videos were 3-s clips of dynamic body expressions by actors with covered faces. Each video was repeated five times during each of the 15-s test trials. The vocalizations were the same as those used in Experiments 1 and 2 of the current study and in Experiment 2 of Zieber and colleagues' (2014) study. Stimulus presentation was the same as in Experiments 1 and 2, such that on each trial happy and angry videos played side by side while either a happy or an angry sound was played.

Apparatus and procedure
The apparatus, procedure, and stimulus counterbalancing were the same as in Experiments 1 and 2 of the current study and Experiment 2 of Zieber and colleagues (2014). Coding of the infants' performance was conducted offline as in the previous experiments. A second coder verified coding reliability for 25% of the infants (Pearson's r = .98).

Results and discussion
Infants failed to match body movements to vocalizations (Table 2). Their mean preference for the congruent body emotion was not significantly different from 50%, t(15) = 0.18, p = .86. Of the 16 infants, 10 exhibited matching scores greater than 50% (binomial probability > .20). Moreover, performance did not differ significantly as a function of the matching emotion, t(14) = 1.34, p = .20 (Table 2). Thus, even when body emotions were depicted through dynamic movements, 3.5-month-olds failed to match the corresponding vocalization to the appropriate body emotion.

Using 6.5-month-olds' matching performance on dynamic stimuli from Experiment 2 of Zieber and colleagues' (2014) study (mean matching score = 57.95%, SE = 2.07) and the data from Experiments 1 to 3 of the current study (Tables 1 and 2), we conducted a Stimulus (dynamic or static) × Age (3.5 or 6.5 months) ANOVA to examine whether the static versus dynamic nature of the stimuli affected infants' performance and/or interacted with the age effect documented in this research. This ANOVA revealed only a main effect of age, F(1, 58) = 5.15, p = .027, ηp² = .08. Neither the stimulus main effect, F(1, 58) = 0.88, p = .35, nor the interaction, F(1, 58) = 0.16, p = .69, was significant. This analysis indicates a developmental change in infants' matching of emotional vocalizations to body movements from 3.5 to 6.5 months of age. Moreover, there was no evidence to suggest that infants' performance differed as a function of whether the visual stimuli were static or dynamic. Overall, the results indicate that 6.5-month-olds match body emotions conveyed in static and dynamic displays to affective vocalizations, whereas 3.5-month-olds fail to match with either kind of stimulus.
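For one-degree-of-freedom effects such as those reported here, the partial eta-squared effect size follows directly from the F ratio and its error degrees of freedom. The short sketch below is our own illustration of that relation, not the authors' analysis code; it recovers the reported effect sizes:

```python
def partial_eta_squared(f_ratio, df_effect, df_error):
    """Partial eta squared recovered from an F ratio:
    eta_p^2 = (F * df_effect) / (F * df_effect + df_error)."""
    return (f_ratio * df_effect) / (f_ratio * df_effect + df_error)

# Age main effect from the combined Stimulus x Age ANOVA, F(1, 58) = 5.15:
print(round(partial_eta_squared(5.15, 1, 58), 2))  # 0.08, matching the reported value

# The same relation applied to Experiment 1's orientation effect, F(1, 28) = 6.65,
# gives ~0.19, consistent with the effect size reported there.
print(round(partial_eta_squared(6.65, 1, 28), 2))  # 0.19
```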
General discussion

The current experiments found a developmental change during the first half year of life in infants' matching of emotions between vocalizations and body information. They also extended previous findings by demonstrating that 6.5-month-old infants are sensitive to body emotions even when no dynamic information is presented. When hearing a happy or angry emotional vocalization, 6.5-month-olds preferred to view an emotionally congruent static body posture when the images were presented upright but not when the images were inverted. These results reveal a remarkable ability to perceive emotion from bodily displays during the first year of life. However, under the same experimental conditions, 3.5-month-olds failed to match emotional vocalizations to static emotional body postures (Experiment 2) and dynamic emotional body movements (Experiment 3). To our knowledge, this is the first study to demonstrate a developmental change in the intermodal perception of emotions from bodies and vocalizations.


The developmental change from 3.5 to 6.5 months of age revealed in the current experiments is consistent with prior reports of emotion knowledge development in other modalities. Infants categorize facial emotions at around 5 to 7 months of age (Quinn et al., 2011). It is around 5 months of age that infants generally recognize emotions from vocalizations (Walker-Andrews, 1997; Walker-Andrews & Grolnick, 1983; Walker-Andrews & Lennon, 1991; but see Kahana-Kalman & Walker-Andrews, 2001; Walker-Andrews et al., 2011). Face-voice intermodal matching of emotion has also been demonstrated at 5 to 7 months of age (Soken & Pick, 1992; Walker, 1982; Walker-Andrews, 1986; but see Kahana-Kalman & Walker-Andrews, 2001). Thus, our finding of a developmental change in body-voice intermodal emotion perception from 3.5 to 6.5 months of age is further evidence of a significant period of emotion knowledge development from before 5 months to around 7 months of age.

A key issue that future research needs to address concerns the nature of the mechanism that underlies the developmental change documented in the current study. Young infants' failure to match body information to vocalizations may be due to a failure to process body emotion. Research by Slaughter and colleagues (Slaughter & Heron, 2004; Slaughter, Heron-Delaney, & Christie, 2012) suggests that body knowledge is slow to develop compared with face knowledge, and it is possible that young infants even fail to process emotions from body postures and movements. If so, then clearly they will not be able to match body emotion information to vocalizations. Alternatively, or in addition, it is possible that infants fail to recognize emotion information in vocalizations. In general, clear evidence of emotion recognition from voices is seen only at around 5 months of age (Walker-Andrews, 1997; Walker-Andrews & Grolnick, 1983; Walker-Andrews & Lennon, 1991; but see Kahana-Kalman & Walker-Andrews, 2001; Walker-Andrews et al., 2011). Thus, it is possible that 3.5-month-olds failed to match vocalizations to body emotions in Experiments 2 and 3 because they did not discern the emotions in the vocalizations. Yet another possibility is that young infants are able to process emotional information from bodies and from vocalizations separately but are unable to match the information across modalities. General relational information processing has been shown to be poorer at 3.5 months than at older ages in some instances (Bhatt & Quinn, 2011; Cashon & Cohen, 2004; Cohen, 2010), and it is possible that this deficit underlies young infants' failure to match emotions across modalities.

In addition, the developmental change in emotion processing demonstrated in the current study raises the question of the mechanisms that underlie the general development of emotion knowledge. Several theorists have proposed that experience with emotional situations and people facilitates the development of emotion knowledge (Lewis, 2008; Walker-Andrews, 1997). Thus, it is possible that greater experience with emotional bodies and vocalizations and the correspondence between them drives the developmental change from 3.5 to 6.5 months of age. This experience could include embodied knowledge (i.e., knowledge from one's own body and actions) and observational learning (i.e., knowledge from observing other people's behavior). A challenge for future research is to document the exact nature of this experience and the manner in which it facilitates development.
The fact that 6.5-month-olds in Experiment 1 matched static body poses to emotional vocalizations is noteworthy in indicating the robust nature of infants' knowledge of emotions derived from body information. Not only do adults perceive emotions in static bodies (e.g., Atkinson et al., 2004), but in some instances static body information can be more informative than facial information (Aviezer et al., 2012). Thus, emotion processing from static body information appears to be a significant component of our emotional lives, and this significance is reflected in the fact that infants as young as 6.5 months are able to perceive emotions from such minimal body stimuli.

A challenge for future research is to understand the information that older infants use to match body information to vocalizations. This information is unlikely to be low-level stimulus information because inversion disrupted matching even though low-level information would not be affected by inversion. It is more likely that affective knowledge is the bridge connecting body emotions to vocalizations. The exact nature of this affective information, however, needs to be specified. This is particularly important given that there is uncertainty about the nature of emotion knowledge, especially during development (Mondloch, Horner, & Mian, 2013). Some researchers argue that infants, and even children, process only general emotional valence and arousal and are not tuned into the discrete categories of emotions to which adults are thought to be sensitive (Barrett, 2006; Widen & Russell, 2008). Others argue, however, that sensitivity to different emotion categories is likely to be evident even early in life (Izard, 2007).


One way to address the debate concerning the nature of emotion knowledge in early infancy is to ascertain whether sensitivity to discrete emotion categories is evident during infancy. Categorical perception procedures have been thought to provide evidence of discrete categories within broad classes of stimuli. For example, color has been thought to be perceived categorically during infancy (Bornstein, Kessen, & Weiskopf, 1976; Clifford, Franklin, Davies, & Holmes, 2009) because infants respond differently to equivalent wavelength contrasts depending on whether these contrasts are within or across category boundaries. Etcoff and Magee (1992; see also Sauter, LeGuen, & Haun, 2011), using morphed stimuli, found such categorical boundaries in adults' processing of facial emotions. Thus, examination of the prevalence, or lack thereof, of such categorical boundaries in emotion stimuli during infancy would address the nature of early emotion perception. To our knowledge, Kotsoni, de Haan, and Johnson (2001) is the only previous report of categorical perception of emotions during infancy. However, that study examined the categorical nature of the contrast between happiness and fear in faces. Because these two emotions come from the general positive and negative global classes, respectively, this finding does not answer the question of whether infants exhibit categorical perception within each of the broad affective classes, which is the subject of debate in the emotion literature. Thus, future research will need to examine categorical perception within the broad positive and negative affective classes in order to address the debate about the nature of early emotion knowledge.

A limitation of the current research is that the number of participants in individual conditions (n = 16) was quite low; this warrants caution in the interpretation of the results. Note, however, that the analyses reported in the Results and discussion section of Experiment 3 combined data from the static and dynamic conditions, resulting in 32 participants at each age. This analysis revealed results that are consistent with the conclusion that there are systematic developmental changes from 3.5 to 6.5 months of age. Another issue is that the number of participants tested on each of the emotions (happy or angry) was limited to only eight within each experimental condition. This was because the overall intent of the research was not to compare matching performance across different emotions. To do that, the intensity of the different emotions in both the visual and auditory modalities would need to be matched. Otherwise, comparisons would amount to apples and oranges, and it would not be clear whether performance differences were due to different categories of emotion or to intensity differences. Moreover, as noted earlier, the binary nature of the matching procedure (matching to one emotion or the other) makes it difficult to interpret performance on one emotion independent of the other. For example, higher scores on one kind of emotion trial than on another could be due to a preference for the video or static image of the first emotion rather than to matching per se. Thus, even if the number of participants had been larger and had revealed differences in performance across emotions, the results would have been difficult to interpret and would not have led to strong conclusions about knowledge of different emotions.
In conclusion, the current experiments clearly show that there are developmental changes during the first half year of life in infants’ intermodal matching of emotions from bodies and vocalizations. At the same time, these experiments demonstrate robust emotion processing in 6.5-month-olds to the extent that they are able to match static emotional body postures to emotional vocalizations. These findings indicate that rapid developmental changes lead to a high degree of emotion knowledge quite early in life.

Acknowledgments

This research was supported by a grant from the United States National Science Foundation (BCS-1121096) to R.S.B. The authors thank the infants and parents who participated in this study and thank A. P. Atkinson and D. A. Sauter for providing the images, videos, and audio stimuli that were used in this study.

References

Atkinson, A. P. (2013). Bodily expressions of emotion: Visual cues and neural mechanisms. In J. Armony & P. Vuilleumier (Eds.), The Cambridge handbook of human affective neuroscience (pp. 198-222). New York: Cambridge University Press.


Atkinson, A. P., Dittrich, W. H., Gemmell, A. J., & Young, A. W. (2004). Emotion perception from dynamic and static body expressions in point-light and full-light displays. Perception, 33, 717-746.
Atkinson, A. P., Tunstall, M. L., & Dittrich, W. H. (2007). Evidence for distinct contributions of form and motion information to the recognition of emotions from body gestures. Cognition, 104, 59-72.
Aviezer, H., Trope, Y., & Todorov, A. (2012). Body cues, not facial expressions, discriminate between intense positive and negative emotions. Science, 338, 1225-1229.
Barrera, M. E., & Maurer, D. (1981). The perception of facial expressions by the three-month-old. Child Development, 52, 203-206.
Barrett, L. F. (2006). Are emotions natural kinds? Perspectives on Psychological Science, 1, 28-58.
Bhatt, R. S., & Quinn, P. C. (2011). How does learning impact development in infancy? The case of perceptual organization. Infancy, 16, 2-38.
Boone, R. T., & Cunningham, J. G. (1998). Children's decoding of emotion in expressive body movement: The development of cue attunement. Developmental Psychology, 34, 1007-1016.
Bornstein, M. H., & Arterberry, M. E. (2003). Recognition, discrimination, and categorization of smiling by 5-month-old infants. Developmental Science, 6, 585-599.
Bornstein, M., Kessen, W., & Weiskopf, S. (1976). Color vision and hue categorization in young human infants. Journal of Experimental Psychology: Human Perception and Performance, 2, 115-129.
Camurri, A., Lagerlöf, I., & Volpe, G. (2003). Recognizing emotion from dance movement: Comparison of spectator recognition and automated techniques. International Journal of Human-Computer Studies, 59, 213-225.
Caron, R. E., Caron, A. J., & Myers, R. S. (1985). Do infants see emotional expressions in static faces? Child Development, 56, 1552-1560.
Cashon, C. H., & Cohen, L. B. (2004). Beyond U-shaped development in infants' processing of faces: An information-processing account. Journal of Cognition and Development, 5, 59-80.
Clifford, A., Franklin, A., Davies, I. R. L., & Holmes, A. (2009). Electrophysiological markers of categorical perception of color in 7-month-old infants. Brain and Cognition, 71, 165-172.
Cohen, L. B. (2010). A bottom-up approach to infant perception and cognition: A summary of evidence and discussion of issues. In S. P. Johnson (Ed.), Neoconstructivism: The new science of cognitive development (pp. 335-346). New York: Oxford University Press.
Coulson, M. (2004). Attributing emotion to static body postures: Recognition accuracy, confusions, and viewpoint dependence. Journal of Nonverbal Behavior, 28, 117-139.
de Gelder, B. (2009). Twelve reasons for including bodily expressions in affective neuroscience. Philosophical Transactions of the Royal Society B, 364, 3475-3484.
Etcoff, N., & Magee, J. (1992). Categorical perception of facial expressions. Cognition, 44, 227-240.
Farroni, T., Menon, E., Rigato, S., & Johnson, M. H. (2007). The perception of facial expressions in newborns. European Journal of Developmental Psychology, 4, 2-13.
Field, T. M., Woodson, R. W., Greenberg, R., & Cohen, C. (1982). Discrimination and imitation of facial expressions by neonates. Science, 218, 179-181.
Flom, R., & Bahrick, L. (2007). The development of infant discrimination of affect in multimodal and unimodal stimulation: The role of intersensory redundancy. Developmental Psychology, 43, 238-252.
Flom, R., Whipple, H., & Hyde, D. (2009). Infants' intermodal perception of canine (Canis familiaris) facial expressions and vocalizations. Developmental Psychology, 45, 1143-1151.
Grossmann, T. (2010). The development of emotion perception in face and voice during infancy. Restorative Neurology and Neuroscience, 28, 219-236.
Izard, C. E. (2007). Basic emotions, natural kinds, emotion schemas, and a new paradigm. Perspectives on Psychological Science, 2, 260-280.
Kahana-Kalman, R., & Walker-Andrews, A. S. (2001). The role of person familiarity in young infants' perception of emotional expression. Child Development, 72, 352-369.
Kellman, P. J., & Arterberry, M. E. (1998). The cradle of knowledge: Development of perception in infancy. Cambridge, MA: MIT Press.
Kotsoni, E., de Haan, M., & Johnson, M. H. (2001). Categorical perception of facial expressions by 7-month-old infants. Perception, 30, 1115-1125.
LaBarbera, J. D., Izard, C. E., Vietze, P., & Parisi, S. A. (1976). Four- and six-month-old infants' visual responses to joy, anger, and neutral expressions. Child Development, 47, 533-538.
Leppänen, J. M., & Nelson, C. A. (2006). The development and neural bases of facial emotion recognition. In R. V. Kail (Ed.), Advances in child development and behavior (Vol. 34, pp. 207-246). San Diego: Academic Press.
Lewis, M. (2008). The emergence of human emotions. In M. Lewis, J. M. Haviland-Jones, & L. S. Barrett (Eds.), Handbook of emotions (3rd ed., pp. 304-319). New York: Guilford.
Ludemann, P. M. (1991). Generalized discrimination of positive facial expressions by 7- to 10-month-old infants. Child Development, 62, 55-67.
Ludemann, P. M., & Nelson, C. A. (1988). The categorical representation of facial expressions by 7-month-old infants. Developmental Psychology, 24, 492-501.
Mondloch, C. J., Horner, M., & Mian, J. (2013). Wide eyes and drooping arms: Adult-like congruency effects emerge early in development of sensitivity to emotional faces and body posture. Journal of Experimental Child Psychology, 114, 203-216.
Nelson, C. A. (1987). The recognition of facial expressions in the first two years of life: Mechanisms of development. Child Development, 58, 889-909.
Nelson, C. A., & Dolgin, K. (1985). The generalized discrimination of facial expressions by 7-month-old infants. Child Development, 56, 58-61.
Nelson, C. A., Morse, P. A., & Leavitt, L. A. (1979). Recognition of facial expression by 7-month-old infants. Child Development, 50, 1239-1242.


Oster, H. (1981). "Recognition" of emotional expression in infancy. In M. E. Lamb & L. R. Sherrod (Eds.), Infant social cognition: Empirical and theoretical considerations (pp. 85-125). Hillsdale, NJ: Lawrence Erlbaum.
Quinn, P. C., Anzures, G., Izard, C. E., Lee, K., Pascalis, O., Slater, A. M., et al. (2011). Looking across domains to understand infant representation of emotion. Emotion Review, 3, 197-206.
Sauter, D. A., Eisner, F., Calder, A. J., & Scott, S. K. (2010a). Perceptual cues in nonverbal vocal expressions of emotion. Quarterly Journal of Experimental Psychology, 63, 2251-2272.
Sauter, D. A., Eisner, F., Ekman, P., & Scott, S. K. (2010b). Cross-cultural recognition of basic emotions through nonverbal emotional vocalizations. Proceedings of the National Academy of Sciences of the United States of America, 107, 2408-2412.
Sauter, D. A., LeGuen, O., & Haun, D. (2011). Categorical perception of emotional facial expressions does not require lexical categories. Emotion, 11, 1479-1484.
Schwartz, G. M., Izard, C. E., & Ansul, S. E. (1985). The 5-month-old's ability to discriminate facial expressions of emotion. Infant Behavior and Development, 8, 65-77.
Slaughter, V., & Heron, M. (2004). Origins and early development of human body knowledge. Monographs of the Society for Research in Child Development, 69(2, Serial No. 276).
Slaughter, V., Heron-Delaney, M., & Christie, T. (2012). Developing expertise in human body perception. In V. Slaughter & C. A. Brownell (Eds.), Early development of body representations (pp. 81-100). Cambridge, UK: Cambridge University Press.
Soken, N. H., & Pick, A. D. (1992). Intermodal perception of happy and angry expressive behaviors by 7-month-old infants. Child Development, 63, 787-795.
Walker, A. S. (1982). Intermodal perception of expressive behaviors by human infants. Journal of Experimental Child Psychology, 33, 514-535.
Walker-Andrews, A. S. (1986). Intermodal perception of expressive behaviors: Relation of eye and voice? Developmental Psychology, 22, 373-377.
Walker-Andrews, A. S. (1988). Infants' perception of the affordances of expressive behaviors. In C. K. Rovee-Collier (Ed.), Advances in infancy research (pp. 173-221). Norwood, NJ: Ablex.
Walker-Andrews, A. S. (1997). Infants' perception of expressive behaviors: Differentiation of multimodal information. Psychological Bulletin, 121, 437-456.
Walker-Andrews, A. S. (2008). Intermodal emotion processes in infancy. In M. Lewis, J. M. Haviland-Jones, & L. S. Barrett (Eds.), Handbook of emotions (3rd ed., pp. 364-375). New York: Guilford.
Walker-Andrews, A. S., & Grolnick, W. (1983). Discrimination of vocal expression by young infants. Infant Behavior and Development, 6, 491-498.
Walker-Andrews, A. S., Krogh-Jespersen, S., Mayhew, E. M. Y., & Coffield, C. N. (2011). Young infants' generalization of emotional expressions: Effects of familiarity. Emotion, 11, 842-851.
Walker-Andrews, A. S., & Lennon, E. (1991). Infants' discrimination of vocal expressions: Contribution of auditory and visual information. Infant Behavior and Development, 14, 131-142.
Widen, S. C., & Russell, J. A. (2008). Young children's understanding of others' emotions. In M. Lewis, J. M. Haviland-Jones, & L. S. Barrett (Eds.), Handbook of emotions (3rd ed., pp. 348-363). New York: Guilford.
Young-Browne, G., Rosenfeld, H. M., & Horowitz, F. D. (1978). Infant discrimination of facial expression. Child Development, 48, 555-562.
Zieber, N., Bhatt, R. S., Hayden, A., Kangas, A., Collins, R., & Bada, H. (2010). Body representation in the first year of life. Infancy, 15, 534-544.
Zieber, N., Kangas, A., Hock, A., & Bhatt, R. S. (2014). Infants' perception of emotions from body movements. Child Development, 85, 675-684.
