Variational auto-encoded deep Gaussian processes
Publication Date
2016-01-01
Journal Title
4th International Conference on Learning Representations, ICLR 2016 - Conference Track Proceedings
Type
Conference Object
This Version
VoR
Citation
Dai, Z., Damianou, A., González, J., & Lawrence, N. (2016). Variational auto-encoded deep Gaussian processes. 4th International Conference on Learning Representations, ICLR 2016 - Conference Track Proceedings https://doi.org/10.17863/CAM.47957
Abstract
© ICLR 2016: San Juan, Puerto Rico. All Rights Reserved. We develop a scalable deep non-parametric generative model by augmenting deep Gaussian processes with a recognition model. Inference is performed in a novel scalable variational framework where the variational posterior distributions are reparametrized through a multilayer perceptron. The key aspect of this reformulation is that it prevents the proliferation of variational parameters, which would otherwise grow linearly with the sample size. We derive a new formulation of the variational lower bound that allows us to distribute most of the computation in a way that enables us to handle datasets on the scale of mainstream deep learning tasks. We show the efficacy of the method on a variety of challenges, including deep unsupervised learning and deep Bayesian optimization.
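The central idea in the abstract, replacing free-form per-data-point variational parameters with an MLP that predicts them, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the network shape, dimensions, and function names here are illustrative assumptions. It shows only the parameter-count argument: a recognition model's parameter count is fixed by the network architecture, while free-form variational parameters grow linearly with the number of data points.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_recognition(X, W1, b1, W2, b2):
    """Hypothetical recognition model: a one-hidden-layer MLP maps each data
    point to the mean and log-variance of its variational posterior, so the
    variational parameters are the network weights, not per-point values."""
    H = np.tanh(X @ W1 + b1)        # hidden layer
    out = H @ W2 + b2               # per point: [mean | log-variance]
    D = out.shape[1] // 2
    return out[:, :D], out[:, D:]   # mu, log_var, each of shape (N, D)

# Toy dimensions (illustrative only).
N, D_in, D_h, D_lat = 1000, 5, 16, 2
X = rng.standard_normal((N, D_in))
W1 = 0.1 * rng.standard_normal((D_in, D_h)); b1 = np.zeros(D_h)
W2 = 0.1 * rng.standard_normal((D_h, 2 * D_lat)); b2 = np.zeros(2 * D_lat)

mu, log_var = mlp_recognition(X, W1, b1, W2, b2)

# Free-form variational inference stores a mean and variance per point:
n_free_form = N * 2 * D_lat                            # grows with N
# The amortized version stores only the MLP weights:
n_amortized = W1.size + b1.size + W2.size + b2.size    # independent of N
```

With these toy numbers, the free-form parametrization needs 4000 values while the MLP needs 164, and doubling N doubles the former but leaves the latter unchanged, which is the scalability property the abstract highlights.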
Identifiers
This record's DOI: https://doi.org/10.17863/CAM.47957
This record's URL: https://www.repository.cam.ac.uk/handle/1810/300882
Rights
All rights reserved