
which the PIAT (Chapter 7) could be an option. I discuss this in more detail in the final paragraphs of the next part.
Methodological considerations and future directions
Studying the emotional modulation of attention and spontaneous mimicry in bonobos, orangutans, and humans has revealed intriguing commonalities and differences between the species, and applying similar methods that make interspecies comparisons possible has proved to be a worthwhile approach. Future studies should consider improving the employed methods in several ways.
Firstly, although increasing effort is being put into studying the perception of emotions in great apes and other animals, scientists are still only scratching the surface (see Kret et al. (2020) for a review). We based our stimulus selection in Chapters 2-4 on previous work that investigated how great apes perceive emotional expressions (e.g., De Waal, 1988; Kret et al., 2016; Parr et al., 1998, 2008), but several important candidates have yet to be studied. For instance, disgust is associated with a distinct, universal facial expression in humans that may have evolved as a response to harmful foods or other substances (Curtis et al., 2011). Moreover, disgust can also be used as an intentional signal to express strong disapproval of, for instance, immoral behavior (Chapman et al., 2009). There is some evidence that great apes show some features of the prototypical disgust expression, i.e., nose wrinkling and tongue protrusion (Case et al., 2020), but we still know surprisingly little about disgust in great apes and other primates. Similarly, anger is a core emotion in humans (Ekman, 1999), and the bulging-lip face that, for instance, chimpanzees and bonobos produce may be a homologue of human anger (De Waal, 1988; Parr et al., 2007). In our studies, we lacked stimuli depicting anger, aggression, or disgust, as such stimuli were very hard to come by, and we could therefore not measure to what extent these emotional cues modulate attention. Thus, future studies could include a wider range of emotional categories, including more negatively-valenced emotional states such as anger and disgust. Moreover, our sample size did not allow us to zoom in on specific emotion categories. In humans, however, there are mixed findings indicating attentional biases either towards or away from certain emotional categories (e.g., Pool et al., 2016; van Rooijen et al., 2017; Zvielli et al., 2014). Future studies could therefore focus on investigating how valence may impact attentional biases.
Secondly, we made use of emotional scenes that contain more contextual information than isolated facial expressions do. Previous findings have shown that providing this context facilitates the recognition of emotions (De Gelder et al.,