Statistical, Spectral and Graph Representations for Video-Based Facial Expression Recognition in Children

Accepted version
Peer-reviewed

Abstract

Child facial expression recognition is a relatively under-investigated area within affective computing. Children’s facial expressions differ significantly from those of adults; it is therefore necessary to develop emotion recognition frameworks that are more objective, descriptive and specific to this target user group. In this paper, we propose the first approach that (i) constructs a video-level heterogeneous graph representation for facial expression recognition in children, and (ii) predicts children’s facial expressions using automatically detected Action Units (AUs). To this aim, we construct three separate length-independent video-level representations, namely statistical, spectral and graph, for detailed multi-level facial behaviour decoding (AU activation status, AU temporal dynamics and spatio-temporal AU activation patterns, respectively). Our experimental results on the LIRIS Children Spontaneous Facial Expression Video Database demonstrate that combining these three feature representations yields the highest accuracy for expression recognition in children.
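To illustrate the idea of a length-independent, video-level statistical representation described in the abstract, the following is a minimal sketch (not the authors' implementation): it assumes per-frame AU activations are given as a `(num_frames, num_aus)` matrix and summarises each AU channel with its mean and standard deviation, so that videos of any length map to a vector of the same size.

```python
# Hypothetical sketch of a video-level statistical AU representation.
# Assumes a (num_frames x num_aus) matrix of per-frame AU activations;
# the function name and feature choice (mean, std) are illustrative.
import numpy as np

def statistical_representation(au_matrix: np.ndarray) -> np.ndarray:
    """Collapse a variable-length AU sequence into a fixed-length
    vector (per-AU mean and standard deviation), so the output size
    does not depend on the number of frames."""
    means = au_matrix.mean(axis=0)  # average activation per AU
    stds = au_matrix.std(axis=0)    # temporal variability per AU
    return np.concatenate([means, stds])

# Two videos of different lengths map to vectors of identical size.
short_video = np.random.rand(30, 17)   # 30 frames, 17 AUs
long_video = np.random.rand(300, 17)   # 300 frames, same AUs
assert statistical_representation(short_video).shape == (34,)
assert statistical_representation(long_video).shape == (34,)
```

The spectral and graph representations in the paper encode AU temporal dynamics and spatio-temporal co-activation patterns, respectively; the same fixed-length property would apply to those as well.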

Description

Journal Title

ICASSP 2022 - 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)

Conference Name

ICASSP 2022 - 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)

Journal ISSN

1520-6149

Volume Title

2022-May

Publisher

Institute of Electrical and Electronics Engineers (IEEE)

Rights and licensing

Except where otherwise noted, this item's license is described as All Rights Reserved.

Sponsorship

Engineering and Physical Sciences Research Council (EP/R030782/1)
European Union's Horizon 2020 Research and Innovation programme, Societal Challenges (grant agreement No. 826232)
W. D. Armstrong Trust Studentship and the Cambridge Trusts