Recurrent Gaussian processes
Publication Date
2016-01-01
Journal Title
4th International Conference on Learning Representations, ICLR 2016 - Conference Track Proceedings
Type
Conference Object
This Version
AM (Accepted Manuscript)
Citation
Mattos, C., Dai, Z., Damianou, A., Forth, J., Barreto, G., & Lawrence, N. (2016). Recurrent Gaussian processes. 4th International Conference on Learning Representations, ICLR 2016 - Conference Track Proceedings https://doi.org/10.17863/CAM.47964
Abstract
© ICLR 2016: San Juan, Puerto Rico. All Rights Reserved. We define Recurrent Gaussian Processes (RGP) models, a general family of Bayesian nonparametric models with recurrent GP priors that are able to learn dynamical patterns from sequential data. Similar to Recurrent Neural Networks (RNNs), RGPs admit different formulations for their internal states and distinct inference methods, and can be extended with deep structures. In this context, we propose a novel deep RGP model whose autoregressive states are latent, thereby performing representation and dynamical learning simultaneously. To fully exploit the Bayesian nature of the RGP model, we develop the Recurrent Variational Bayes (REVARB) framework, which enables efficient inference and strong regularization through coherent propagation of uncertainty across the RGP layers and states. We also introduce an RGP extension whose variational parameters are greatly reduced by being reparametrized through RNN-based sequential recognition models. We apply our model to the tasks of nonlinear system identification and human motion modeling. The promising results indicate that our RGP model retains its high flexibility while avoiding overfitting and remaining applicable even when large datasets are unavailable.
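To make the autoregressive-GP idea in the abstract concrete, here is a minimal, self-contained sketch of a plain GP-NARX model: a Gaussian process regression from lagged observations [y(t-1), ..., y(t-L)] to y(t). This is only an illustration of the general setting, not the paper's RGP/REVARB method (no latent states, no deep structure, no variational inference); the kernel choice, lag order, and noise level below are assumptions made for the example.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def make_lagged(y, lag):
    """Inputs [y[t-1], ..., y[t-lag]] with targets y[t]."""
    X = np.column_stack([y[lag - 1 - k: len(y) - 1 - k] for k in range(lag)])
    return X, y[lag:]

def gp_fit(X, t, noise=1e-2):
    """Exact GP regression: precompute alpha = (K + noise*I)^{-1} t."""
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    return np.linalg.solve(K, t)

def gp_predict(Xstar, X, alpha):
    """Posterior mean of the GP at the test inputs Xstar."""
    return rbf_kernel(Xstar, X) @ alpha

# One-step-ahead prediction on a noiseless sinusoid (toy data).
y = np.sin(0.25 * np.arange(100))
Xtr, ttr = make_lagged(y[:80], lag=2)
Xte, tte = make_lagged(y[78:], lag=2)  # test window starts where training ends
alpha = gp_fit(Xtr, ttr)
pred = gp_predict(Xte, Xtr, alpha)
rmse = np.sqrt(np.mean((pred - tte) ** 2))
```

The RGP model in the paper goes beyond this sketch by making the autoregressive states latent rather than observed, stacking such GP layers, and propagating uncertainty through them with the REVARB variational framework.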
Identifiers
This record's DOI: https://doi.org/10.17863/CAM.47964
This record's URL: https://www.repository.cam.ac.uk/handle/1810/300889
Rights
All rights reserved