recognition of sadness in static faces compared to point-light displays has been found to be improved only in individuals with low but not with high autistic traits (Actis-Grosso et al., 2015). Moreover, while dynamic information (i.e., videos) generally improved emotion recognition for both autistic and neurotypical individuals, individuals on the autism spectrum recognized dynamic sad expressions less accurately than static ones (Enticott et al., 2014). Information that facilitates the recognition of sadness in neurotypical individuals might therefore not serve individuals on the autism spectrum in the same way. Why this applies specifically to sadness should be investigated in future studies.

Taking our findings related to autistic traits together, feedback from multiple sources might not be integrated beneficially in emotion recognition. On the one hand, confidence in emotion recognition does not seem to be scaled to actual performance. Internal feedback, in other words the “feeling” of how well one performed, might not be informative of actual performance in autism and thus cannot assist successful learning. On the other hand, our findings suggest that a simulation of the observed expression might not be as informative for emotion processing in ASD as it is for a neurotypical population. This claim is supported by research showing reduced access to bodily signals (i.e., interoceptive accuracy) alongside a heightened sensitivity to those signals in autism (Garfinkel et al., 2016), which seems to be driven by comorbid alexithymia (Ketelaars et al., 2016; Shah et al., 2016). Consequently, while interventions targeting metacognitive abilities could help close the gap between actual performance and subjective judgments in individuals on the autism spectrum, a training focusing on the integration of information from the bodily component of an emotional experience could indirectly benefit emotion recognition and other social skills.

In addition to the results specific to the trait dimensions, our findings also add to the current discussion on the general role of facial mimicry in emotion recognition. Recent meta-analyses have reported no robust relationship between facial mimicry and emotion recognition (Holland et al., 2020) or broader affective judgments (Coles et al., 2019). Our study, in contrast, revealed a link between facial mimicry responses to happy and sad expressions and the associated recognition accuracy. More specifically, stronger activation of the zygomaticus and relaxation of the corrugator predicted better recognition of happiness, and stronger activation of the corrugator predicted better recognition