Increased sensorimotor activity during categorisation of emotionally
ambiguous faces
Abstract
Actions are rarely devoid of emotional content. Thus, a more complete
picture of the neural mechanisms underlying mental simulation of
observed actions requires research that incorporates emotional information. The
present study used high-density electroencephalography to investigate
mental simulation associated with facial emotion categorisation.
Alpha-mu rhythm modulation was measured at each frequency from 8 to 13 Hz
to infer the degree of sensorimotor simulation. Results suggest that
sensorimotor activity is sensitive to emotional information, because (1)
categorising static images of neutral faces as happy or sad
was associated with stronger suppression in the central region than
categorising clearly happy faces, (2) there was preliminary evidence
indicating that the strongest suppression in the central region was in
response to neutral faces, followed by sad and then happy faces, and (3)
in the control task, which required categorising images with the head
oriented right, left, or forward as right or left, differences between
conditions showed a pattern more indicative of task difficulty
than sensorimotor engagement. Dissociable processing of emotional
information in facial expressions and of directional information in head
orientation was further captured in beta-band activity (14-20 Hz).
Stronger mu suppression to neutral faces indicates that sensorimotor
simulation extends beyond crude motor mimicry. We propose that mu rhythm
responses to facial expressions may serve as a biomarker for empathy
circuit activation. Future research should investigate whether atypical
or inconsistent mu rhythm responses to facial expressions indicate
difficulties in understanding or sharing emotions.