Variational auto-encoded deep Gaussian processes

Published version
Peer-reviewed

Type

Conference Object

Authors

Dai, Z 
Damianou, A 
González, J 
Lawrence, Neil David (ORCID: https://orcid.org/0000-0001-9258-1030)

Abstract

We develop a scalable deep non-parametric generative model by augmenting deep Gaussian processes with a recognition model. Inference is performed in a novel scalable variational framework where the variational posterior distributions are reparametrized through a multilayer perceptron. The key aspect of this reformulation is that it prevents the proliferation of variational parameters, which otherwise grow linearly with the sample size. We derive a new formulation of the variational lower bound that allows us to distribute most of the computation in a way that enables us to handle datasets of the size of mainstream deep learning tasks. We show the efficacy of the method on a variety of challenges including deep unsupervised learning and deep Bayesian optimization.
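The amortization idea in the abstract can be illustrated with a minimal sketch: instead of storing a separate variational mean and variance for every data point, a recognition network maps each observation to its posterior parameters, so the total parameter count is fixed by the network weights. This is a plain NumPy toy with made-up dimensions, not the paper's actual recognition model or deep GP layers.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_recognition(Y, W1, b1, W2, b2):
    """Hypothetical recognition model: map each observation y_n to the
    parameters (mean, log-variance) of its variational posterior q(x_n).
    The variational parameters are the MLP weights, so their number does
    not grow with the sample size N."""
    H = np.tanh(Y @ W1 + b1)       # hidden layer activations, shape (N, Hdim)
    out = H @ W2 + b2              # per-point [mu | log_var], shape (N, 2Q)
    Q = out.shape[1] // 2
    return out[:, :Q], out[:, Q:]

# Illustrative dimensions (not from the paper): N observations of
# dimension D, hidden width Hdim, latent dimension Q.
N, D, Hdim, Q = 1000, 5, 8, 2
W1 = rng.normal(size=(D, Hdim)) * 0.1
b1 = np.zeros(Hdim)
W2 = rng.normal(size=(Hdim, 2 * Q)) * 0.1
b2 = np.zeros(2 * Q)

Y = rng.normal(size=(N, D))
mu, log_var = mlp_recognition(Y, W1, b1, W2, b2)

# Parameter count is independent of N (compare N * 2Q = 4000 for a
# free-form per-point parametrization).
n_params = W1.size + b1.size + W2.size + b2.size
print(mu.shape, log_var.shape, n_params)
```

Doubling N here leaves `n_params` unchanged, which is the property the abstract highlights: the recognition model prevents variational parameters from proliferating with dataset size.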

Description

Keywords

cs.LG, stat.ML

Journal Title

4th International Conference on Learning Representations, ICLR 2016 - Conference Track Proceedings

Conference Name

4th International Conference on Learning Representations

Journal ISSN

Volume Title

Publisher

Rights

All rights reserved