Pantomimic Gestures for Human-Robot Interaction
IEEE Transactions on Robotics
Burke, M., & Lasenby, J. (2015). Pantomimic Gestures for Human-Robot Interaction. IEEE Transactions on Robotics, 31, 1225–1237. https://doi.org/10.1109/TRO.2015.2475956
This work introduces a pantomimic gesture interface, which classifies human hand gestures using unmanned aerial vehicle (UAV) behaviour recordings as training data. We argue that pantomimic gestures are more intuitive than iconic gestures and show that a pantomimic gesture recognition strategy using micro UAV behaviour recordings can be more robust than one trained directly using hand gestures. Hand gestures are isolated by applying a maximum information criterion, with features extracted using principal component analysis (PCA) and compared using a nearest neighbour classifier. These features are biased in that they are better suited to classifying certain behaviours. We show how a Bayesian update step accounting for the geometry of training features compensates for this, resulting in fairer classification results, and introduce a weighted voting system to aid in sequence labelling.
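The classification pipeline described in the abstract (PCA features compared with a nearest-neighbour classifier) can be sketched as follows. This is a minimal illustration of the general technique, not the authors' implementation: the toy data, dimensions, and function names are all assumptions, and the maximum-information gesture isolation, Bayesian update, and weighted voting stages are omitted.

```python
import numpy as np

def pca_fit(X, k):
    """Fit PCA on samples X (n_samples x n_features); return mean and top-k axes."""
    mean = X.mean(axis=0)
    # SVD of the centred data: rows of Vt are the principal axes
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, Vt[:k]

def pca_project(X, mean, components):
    """Project samples into the k-dimensional PCA feature space."""
    return (X - mean) @ components.T

def nearest_neighbour(train_feats, train_labels, query):
    """Label a query feature vector with the label of its closest training feature."""
    dists = np.linalg.norm(train_feats - query, axis=1)
    return train_labels[int(np.argmin(dists))]

# Toy stand-in for behaviour recordings: six flattened trajectories, two classes
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, (3, 8)),   # class 0 clustered near 0
               rng.normal(1.0, 0.1, (3, 8))])  # class 1 clustered near 1
y = np.array([0, 0, 0, 1, 1, 1])

mean, comps = pca_fit(X, k=2)
feats = pca_project(X, mean, comps)

# A query drawn from the class-1 distribution should match a class-1 neighbour
query = pca_project(rng.normal(1.0, 0.1, (1, 8)), mean, comps)[0]
print(nearest_neighbour(feats, y, query))
```

In the paper's setting the training samples would be micro-UAV behaviour recordings rather than synthetic vectors, and the raw nearest-neighbour distances would then feed the Bayesian update and weighted voting stages described above.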
Keywords: pantomimic, gesture recognition, human-robot interaction, principal component analysis, time series classification
External DOI: https://doi.org/10.1109/TRO.2015.2475956
This record's URL: https://www.repository.cam.ac.uk/handle/1810/250385