Generative Model-Enhanced Human Motion Prediction
Publication Date
2022-03-23
Journal Title
Applied AI Letters
ISSN
2689-5595
Publisher
Wiley
Language
en
Type
Other
This Version
AO (Author's Original)
VoR (Version of Record)
Metadata
Citation
Bourached, A., Griffiths, R., Gray, R., Jha, A., & Nachev, P. (2022). Generative Model-Enhanced Human Motion Prediction. [Other]. https://doi.org/10.1002/ail2.63
Description
Funder: UCLH Biomedical Research Centre; Id: http://dx.doi.org/10.13039/501100012621
Funder: UK Research and Innovation; Id: http://dx.doi.org/10.13039/100014013
Funder: Biomedical Research Centre
Funder: Wellcome Trust; Id: http://dx.doi.org/10.13039/100010269
Abstract
The task of predicting human motion is complicated by the natural
heterogeneity and compositionality of actions, requiring robustness to
distributional shift up to and including out-of-distribution (OoD) inputs. Here we formulate a
new OoD benchmark based on the Human3.6M and CMU motion capture datasets, and
introduce a hybrid framework for hardening discriminative architectures to OoD
failure by augmenting them with a generative model. When applied to current
state-of-the-art discriminative models, we show that the proposed approach
improves OoD robustness without sacrificing in-distribution performance, and
can theoretically facilitate model interpretability. We suggest human motion
predictors ought to be constructed with OoD challenges in mind, and provide an
extensible general framework for hardening diverse discriminative architectures
to extreme distributional shift. The code is available at
https://github.com/bouracha/OoDMotion.
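The abstract describes hardening a discriminative motion predictor by augmenting it with a generative model, and the keywords indicate the generative component is a variational autoencoder. As a hypothetical illustration only (the function names, tensor shapes, and the weighting term `lambda_vae` are assumptions, not the authors' implementation; see the linked repository for the actual code), such a hybrid objective can be sketched as a discriminative prediction loss regularized by a VAE evidence lower bound:

```python
import numpy as np

rng = np.random.default_rng(0)

def prediction_loss(pred, target):
    # Discriminative term: mean-squared error between predicted
    # and ground-truth future poses.
    return np.mean((pred - target) ** 2)

def vae_elbo_terms(x, x_recon, mu, log_var):
    # Generative term: negative ELBO, i.e. reconstruction error
    # plus KL(q(z|x) || N(0, I)) in closed form for a Gaussian posterior.
    recon = np.mean((x_recon - x) ** 2)
    kl = -0.5 * np.mean(1.0 + log_var - mu ** 2 - np.exp(log_var))
    return recon, kl

# Toy arrays standing in for pose sequences and latent statistics
# (8 sequences, 66 joint coordinates, 16 latent dimensions).
pred = rng.normal(size=(8, 66))       # predicted future poses
target = rng.normal(size=(8, 66))     # ground-truth future poses
x = rng.normal(size=(8, 66))          # observed past poses
x_recon = x + 0.1 * rng.normal(size=(8, 66))
mu = rng.normal(scale=0.1, size=(8, 16))
log_var = rng.normal(scale=0.1, size=(8, 16))

lambda_vae = 0.1  # assumed trade-off weight between the two objectives
recon, kl = vae_elbo_terms(x, x_recon, mu, log_var)
total_loss = prediction_loss(pred, target) + lambda_vae * (recon + kl)
print(total_loss)
```

The intuition is that the generative regularizer constrains the learned representation to model the data distribution itself, which is what provides robustness when inputs drift out of distribution.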
Keywords
deep learning, generative models, human motion prediction, variational autoencoders
Identifiers
ail263
External DOI: https://doi.org/10.1002/ail2.63
This record's DOI: https://doi.org/10.17863/CAM.82810
Rights
Licence: http://creativecommons.org/licenses/by/4.0/