
2010; Kret et al., 2013b). Although I did not explicitly test this, it is plausible that this also holds for apes’ recognition of emotion. For instance, bonobos show a bared-teeth display when scared and nervous, but also during sex (De Waal, 1988). Providing contextual information by showing a scene rather than an isolated facial expression may thus facilitate the processing of its emotional content. Nevertheless, our stimuli were static (Chapters 2-4) and contained only social information from the visual modality (Chapters 2-4, 6). Although human facial expressions of emotion are highly ritualized and therefore salient (Kret et al., 2020), expressions of emotion are often multi-modal, consisting of vocalizations, gestures, and facial and bodily expressions. The perception of emotional expressions may therefore be enhanced when emotional information comes from multiple channels (Paulmann & Pell, 2011). Indeed, human studies have shown that emotional information across different modalities is integrated, creating holistic, enhanced emotion recognition (De Gelder et al., 1999; De Gelder & Vroomen, 2000; Schirmer & Adolphs, 2017).
Similarly, there is some work suggesting that great apes use social information
coming from different modalities to categorize expressions. For instance, Parr (2004)
found that when chimpanzees categorized faces, they preferentially categorized
pant-hoots and play faces based on their auditory components, and scream faces
based on their visual components. From an evolutionary perspective, the results
suggest that the auditory modality is more informative for pant-hoots and play faces
because pant-hoots are used for long-distance communication, and play faces are
often concealed during rough, close-contact play. Thus, vocalizations during play
may be more salient than facial expressions for indicating playful intentions (Parr,
2004). Nevertheless, despite numerous studies in primates investigating expressions
in a single domain (e.g., vocalizing, gesturing, and facial expressions), multi-modal
signaling is virtually unexplored (Fröhlich et al., 2019b; Liebal et al., 2014). The call for a multi-modal approach is in line with recent work stressing the effects of different
natural ecologies of animals (including humans) on the evolution and development
of behavior and cognition (Bräuer et al., 2020). Moving forward, comparative
studies could take a multi-componential approach, and for instance investigate
how dynamic emotional scenes that also include auditory cues are viewed, what
behaviors these dynamic scenes elicit, and how different modalities contribute to
emotion perception.
In Chapters 5 and 6 I described our studies investigating yawn and self-scratch contagion in orangutans. Yawn contagion was virtually unexplored in this species, and only a limited amount of work has previously looked into self-scratch