Automatic Recognition of Emotions and Membership in Group Videos
IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops
Mou, W., Gunes, H., & Patras, I. (2016). Automatic Recognition of Emotions and Membership in Group Videos. IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, 1478-1486. https://doi.org/10.1109/CVPRW.2016.185
Automatic affect analysis and understanding has become a well-established research area over the last two decades. However, little attention has been paid to the analysis of affect expressed in group settings, either in the form of affect expressed by the whole group collectively or affect expressed by each individual member of the group. This paper presents a framework that, in group settings, automatically classifies the affect expressed by each individual group member along both the arousal and valence dimensions. We first introduce a novel Volume Quantised Local Zernike Moments Fisher Vectors (vQLZM-FV) descriptor to represent the facial behaviours of individuals in the spatio-temporal domain, and then propose a method to recognize the group membership of each individual (i.e., which group the individual in question is part of) using their face and body behavioural cues. We conduct a set of experiments on a newly collected dataset that contains fourteen recordings of four groups, each consisting of four people watching affective movie stimuli. Our experimental results show that (1) the proposed vQLZM-FV outperforms the other feature representations in affect recognition, and (2) group membership can be recognized using the non-verbal face and body features, indicating that individuals influence each other's behaviours within a group setting.
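The descriptor named in the abstract combines local features with Fisher Vector encoding against a Gaussian mixture model. As a rough illustration only, not the authors' implementation, the sketch below computes a standard first-order Fisher Vector (gradients with respect to the GMM means) over a set of local descriptors; the descriptor dimensions, component count, and GMM parameters are random placeholders, where in practice the local features would be the quantised local Zernike moments pooled over spatio-temporal volumes and the GMM would be trained on real data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder setup: T local descriptors of dimension D, encoded against a
# K-component diagonal-covariance GMM. All parameters here are synthetic.
T, D, K = 200, 16, 4
X = rng.normal(size=(T, D))            # local descriptors
w = np.full(K, 1.0 / K)                # mixture weights
mu = rng.normal(size=(K, D))           # component means
sigma = np.ones((K, D))                # component std-devs (diagonal)

def fisher_vector(X, w, mu, sigma):
    """First-order Fisher Vector: gradients w.r.t. the GMM means."""
    T, D = X.shape
    # Normalised differences and Gaussian log-densities, shape (T, K, ...)
    diff = (X[:, None, :] - mu[None, :, :]) / sigma[None, :, :]
    log_prob = (-0.5 * (diff ** 2).sum(-1)
                - np.log(sigma).sum(-1)[None, :]
                - 0.5 * D * np.log(2 * np.pi))
    # Soft-assignment posteriors gamma_t(k), computed in log space
    log_post = np.log(w)[None, :] + log_prob
    log_post -= log_post.max(axis=1, keepdims=True)
    gamma = np.exp(log_post)
    gamma /= gamma.sum(axis=1, keepdims=True)
    # Gradient w.r.t. means, normalised by T and sqrt(w_k)
    fv = (gamma[:, :, None] * diff).sum(0) / (T * np.sqrt(w)[:, None])
    return fv.ravel()                   # length K * D

fv = fisher_vector(X, w, mu, sigma)
# Common post-processing: power-normalisation followed by L2-normalisation
fv = np.sign(fv) * np.sqrt(np.abs(fv))
fv /= np.linalg.norm(fv)
```

The resulting vector has length K·D and, after the normalisation steps, unit L2 norm, which is the usual form fed to a linear classifier.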
EPSRC (via University of Exeter) (EP/L00416X/1)
External DOI: https://doi.org/10.1109/CVPRW.2016.185
This record's URL: https://www.repository.cam.ac.uk/handle/1810/268998