Pantomimic Gestures for Human-Robot Interaction


Authors

Lasenby, J 

Abstract

This work introduces a pantomimic gesture interface, which classifies human hand gestures using unmanned aerial vehicle (UAV) behaviour recordings as training data. We argue that pantomimic gestures are more intuitive than iconic gestures and show that a pantomimic gesture recognition strategy using micro UAV behaviour recordings can be more robust than one trained directly on hand gestures. Hand gestures are isolated by applying a maximum information criterion, with features extracted using principal component analysis (PCA) and compared using a nearest neighbour classifier. These features are biased in that they are better suited to classifying some behaviours than others. We show how a Bayesian update step that accounts for the geometry of the training features compensates for this bias, yielding fairer classification, and introduce a weighted voting system to aid sequence labelling.
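As a rough illustration of the pipeline the abstract describes (PCA feature extraction, nearest neighbour matching, and weighted voting over a gesture sequence), the sketch below uses synthetic data, invented array sizes, and an assumed inverse-distance vote weighting; it omits the gesture isolation and Bayesian update steps and is not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fake training set: 30 behaviour recordings, each flattened to 60 samples,
# with one of 3 behaviour labels (all values invented for illustration).
X_train = rng.normal(size=(30, 60))
y_train = rng.integers(0, 3, size=30)

# PCA via SVD of the mean-centred training matrix; keep k components.
k = 5
mean = X_train.mean(axis=0)
_, _, Vt = np.linalg.svd(X_train - mean, full_matrices=False)
W = Vt[:k].T                      # projection onto the first k principal axes
F_train = (X_train - mean) @ W    # training features

def classify(x):
    """Return the nearest-neighbour label and distance for one observation."""
    f = (x - mean) @ W
    d = np.linalg.norm(F_train - f, axis=1)
    i = int(np.argmin(d))
    return int(y_train[i]), float(d[i])

# Weighted vote over a sequence of observations: closer matches count more.
# (Inverse-distance weighting is an assumption, not the paper's scheme.)
sequence = rng.normal(size=(10, 60))
votes = np.zeros(3)
for x in sequence:
    label, dist = classify(x)
    votes[label] += 1.0 / (1.0 + dist)

print("sequence label:", int(np.argmax(votes)))
```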

Description

Keywords

Gesture recognition, human-robot interaction, pantomimic, principal component analysis (PCA), time series classification

Journal Title

IEEE Transactions on Robotics

Conference Name

Journal ISSN

1552-3098 (print)
1941-0468 (electronic)

Volume

31

Publisher

Institute of Electrical and Electronics Engineers (IEEE)