Emotions through the eyes of our closest living relatives: Exploring attentional and behavioral mechanisms
towards outgroups, but would also allow us to examine how animals judge emotional categories themselves.
Currently, we design experiments based on our notion of what constitutes an emotionally salient cue to animals. These notions are of course grounded in the knowledge that we have gathered on emotions in animals, but it remains impossible for us to truly know what emotions mean to animals, because they cannot use language to convey this information to us. In Chapter 7, we make a first step towards developing a way to probe all kinds of implicit associations in animals that are capable of categorizing images and working on a touchscreen. Here, we validate a pictorial adaptation of the Implicit Association Test (IAT) in human adults and children, with the hope that this pictorial version may one day be tested in animals. In essence, the IAT is a matching-to-sample task in which an image has to be matched to its appropriate category. Great apes have previously been shown to be capable of performing matching-to-sample tasks in which they categorized bodies in different configurations (Gao & Tomonaga, 2020), sexes (De Waal & Pokorny, 2011), genital regions (Kret & Tomonaga, 2016), familiar and unfamiliar faces (Parr et al., 2000; Pokorny & De Waal, 2009; Talbot et al., 2015; Vonk & Hamilton, 2014), facial expressions (Parr et al., 2008), and emotions (Parr, 2001). It is therefore plausible that great apes could also perform a pictorial IAT.
Dissertation outline
This dissertation is based on six empirical research articles focusing on the unconscious and automatic cognitive and behavioral markers of emotion perception, as these markers offer a strong basis from which to study emotions across species. Specifically, Chapter 2 investigates the role of implicit, immediate attention in perceiving the emotions of familiar and unfamiliar conspecifics (i.e., other individuals of the same species) as well as heterospecifics (i.e., individuals of another species) in bonobos and humans (Figure 1i). Central to this chapter are i) replicating earlier findings on an emotion bias in bonobos and humans (Figure 1a), and ii) examining how familiarity modulates attention to emotions, which has not yet been investigated (Figure 1b).
Chapter 3 zooms in on emotion-biased attention in humans across all age categories, using emotional scenes as cues rather than isolated facial expressions, as they provide more contextual information to the observer (Figure 1a, c). Moreover,