Multimodal Human-Human-Robot Interactions (MHHRI) Dataset for Studying Personality and Engagement
Journal Title
IEEE Transactions on Affective Computing
ISSN
1949-3045
Publisher
IEEE
Volume
PP
Issue
99
Language
English
Type
Article
This Version
AM
Citation
Celiktutan, O., Skordos, S., & Gunes, H. (2017). Multimodal Human-Human-Robot Interactions (MHHRI) Dataset for Studying Personality and Engagement. IEEE Transactions on Affective Computing, PP(99). https://doi.org/10.1109/TAFFC.2017.2737019
Abstract
In this paper we introduce a novel dataset, the Multimodal Human-Human-Robot Interactions (MHHRI) dataset, with the aim of studying personality simultaneously in human-human interactions (HHI) and human-robot interactions (HRI), and of investigating its relationship with engagement. Multimodal data were collected during a controlled interaction study comprising dyadic interactions between two human participants and triadic interactions between two human participants and a robot, in which interactants asked each other a set of personal questions. Interactions were recorded using two static and two dynamic cameras as well as two biosensors, and metadata were collected by having participants fill in two types of questionnaires: one assessing their own personality traits and their perceived engagement with their partners (self labels), and one assessing the personality traits of the other participants in the study (acquaintance labels). As a proof of concept, we present baseline results for personality and engagement classification. Our results show that (i) trends in personality classification performance remain the same with respect to the self and the acquaintance labels across the HHI and HRI settings; (ii) for extroversion, the acquaintance labels yield better results than the self labels; and (iii) in general, using multiple modalities yields better performance for the classification of personality traits.
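As a rough illustration of the kind of baseline experiment described above, the sketch below shows how a binary personality classification baseline could be set up with a linear SVM and feature-level fusion across modalities. It is a minimal sketch only: the feature dimensions, the median split of questionnaire scores into binary labels, and the use of scikit-learn are assumptions made for illustration, not the authors' actual pipeline, and random placeholder features stand in for the dataset's visual, audio and physiological descriptors.

    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)

    # Placeholder features standing in for per-clip descriptors from each modality
    # (visual, audio, physiological); the sizes here are hypothetical.
    n_clips = 120
    visual = rng.normal(size=(n_clips, 30))
    audio = rng.normal(size=(n_clips, 20))
    physio = rng.normal(size=(n_clips, 10))

    # Binarise a self-reported trait score (e.g. extroversion) at the median,
    # a common way of turning questionnaire scores into classification labels.
    trait_scores = rng.uniform(1, 7, size=n_clips)
    labels = (trait_scores > np.median(trait_scores)).astype(int)

    def evaluate(features, name):
        """Cross-validated accuracy of a linear SVM on one feature set."""
        clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
        acc = cross_val_score(clf, features, labels, cv=5).mean()
        print(f"{name:>12s}: {acc:.2f}")

    # Single modalities versus early (feature-level) fusion.
    evaluate(visual, "visual")
    evaluate(audio, "audio")
    evaluate(physio, "physio")
    evaluate(np.hstack([visual, audio, physio]), "fused")

Comparing the single-modality scores against the fused score mirrors the paper's observation that combining modalities tends to help, although with the actual MHHRI features and labels the absolute numbers would of course differ.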
Sponsorship
This work was funded by the EPSRC under its IDEAS Factory Sandpits call on Digital Personhood (Grant Ref: EP/L00416X/1).
Funder references
EPSRC (via University of Exeter) (EP/L00416X/1)
Identifiers
External DOI: https://doi.org/10.1109/TAFFC.2017.2737019
This record's URL: https://www.repository.cam.ac.uk/handle/1810/267481
Rights
Licence: http://www.rioxx.net/licenses/all-rights-reserved