Multimodal Human-Human-Robot Interactions (MHHRI) Dataset for Studying Personality and Engagement

Accepted version
Peer-reviewed

Type

Article

Change log

Authors

Celiktutan, O 
Skordos, S 

Abstract

In this paper we introduce a novel dataset, the Multimodal Human-Human-Robot Interactions (MHHRI) dataset, with the aim of studying personality simultaneously in human-human interactions (HHI) and human-robot interactions (HRI), and its relationship with engagement. Multimodal data were collected during a controlled interaction study comprising dyadic interactions between two human participants and triadic interactions between two human participants and a robot, in which interactants asked each other a set of personal questions. Interactions were recorded using two static and two dynamic cameras as well as two biosensors, and metadata were collected by having participants fill in two types of questionnaires: one assessing their own personality traits and their perceived engagement with their partners (self labels), and one assessing the personality traits of the other participants in the study (acquaintance labels). As a proof of concept, we present baseline results for personality and engagement classification. Our results show that (i) trends in personality classification performance remain the same with respect to the self and the acquaintance labels across the HHI and HRI settings; (ii) for extroversion, the acquaintance labels yield better results than the self labels; and (iii) in general, multimodality yields better performance for the classification of personality traits.
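The multimodal baseline described in the abstract can be illustrated with a minimal early-fusion sketch: per-recording feature vectors from two modalities are concatenated before classification. All names, dimensions, and the toy nearest-centroid classifier below are illustrative assumptions, not the MHHRI release format or the paper's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-recording descriptors (dimensions are illustrative):
# e.g. visual features from the cameras and physiological features
# from the biosensors mentioned in the dataset description.
n_train, n_test = 40, 10
visual = rng.normal(size=(n_train + n_test, 8))
physio = rng.normal(size=(n_train + n_test, 4))
# Synthetic binary labels standing in for high/low ratings of a
# personality trait (self or acquaintance labels).
labels = (visual[:, 0] + physio[:, 0] > 0).astype(int)

def nearest_centroid_fit(X, y):
    """Return one class centroid per label (a minimal stand-in classifier)."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def nearest_centroid_predict(centroids, X):
    """Assign each row of X to the class with the closest centroid."""
    classes = sorted(centroids)
    dists = np.stack([np.linalg.norm(X - centroids[c], axis=1) for c in classes])
    return np.array(classes)[dists.argmin(axis=0)]

# Early (feature-level) fusion: concatenate modalities, then classify.
fused = np.hstack([visual, physio])
model = nearest_centroid_fit(fused[:n_train], labels[:n_train])
pred = nearest_centroid_predict(model, fused[n_train:])
accuracy = float((pred == labels[n_train:]).mean())
print(f"fused-modality accuracy: {accuracy:.2f}")
```

In this scheme, comparing the fused accuracy against single-modality runs (fitting on `visual` or `physio` alone) is one simple way to benchmark whether combining modalities helps, which is the kind of comparison the baseline results summarize.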

Description

Keywords

Databases, Cameras, Robots, Microphones, Sensors, Magnetic heads, Human-robot interaction, Multimodal interaction dataset, human-human interaction, personality analysis, engagement classification, benchmarking

Journal Title

IEEE Transactions on Affective Computing

Conference Name

Journal ISSN

1949-3045

Volume Title

PP

Publisher

IEEE

Sponsorship

Engineering and Physical Sciences Research Council (EP/L00416X/1)
This work was funded by the EPSRC under its IDEAS Factory Sandpits call on Digital Personhood (Grant Ref: EP/L00416X/1).