Variational auto-encoded deep Gaussian processes

Published version
Peer-reviewed


Abstract

© ICLR 2016: San Juan, Puerto Rico. All Rights Reserved. We develop a scalable deep non-parametric generative model by augmenting deep Gaussian processes with a recognition model. Inference is performed in a novel scalable variational framework where the variational posterior distributions are reparametrized through a multilayer perceptron. The key aspect of this reformulation is that it prevents the proliferation of variational parameters, which would otherwise grow linearly with the sample size. We derive a new formulation of the variational lower bound that allows us to distribute most of the computation in a way that enables us to handle datasets of the size of mainstream deep learning tasks. We show the efficacy of the method on a variety of challenges, including deep unsupervised learning and deep Bayesian optimization.
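The core idea of the abstract, amortizing variational parameters through a recognition network, can be illustrated with a minimal sketch. This is not the authors' implementation; the network sizes, weight initialization, and variable names below are illustrative assumptions. The point is that an MLP maps each observation to its variational parameters, so the parameter count is fixed by the network weights rather than growing linearly with the number of data points.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_recognition(Y, W1, b1, W2, b2):
    """Map each observation y_n to variational parameters (mu_n, log_var_n).

    Amortization: the parameter count is fixed by the MLP weights and
    does not grow with the number of data points N.
    """
    H = np.tanh(Y @ W1 + b1)   # hidden layer, shape (N, Hdim)
    out = H @ W2 + b2          # shape (N, 2*Q): means and log-variances
    Q = out.shape[1] // 2
    return out[:, :Q], out[:, Q:]

# Toy sizes (illustrative, not from the paper): N observations of
# dimension D, latent dimension Q, hidden width Hdim.
N, D, Q, Hdim = 500, 5, 2, 16
Y = rng.standard_normal((N, D))
W1 = 0.1 * rng.standard_normal((D, Hdim))
b1 = np.zeros(Hdim)
W2 = 0.1 * rng.standard_normal((Hdim, 2 * Q))
b2 = np.zeros(2 * Q)

mu, log_var = mlp_recognition(Y, W1, b1, W2, b2)

# A per-point parameterization would store N * 2Q = 2000 numbers;
# the recognition network's size is independent of N.
n_mlp_params = W1.size + b1.size + W2.size + b2.size
print(mu.shape, log_var.shape, n_mlp_params)
```

Doubling N leaves `n_mlp_params` unchanged, which is the property that lets the variational framework scale to datasets of mainstream deep-learning size.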

Description

Keywords

Journal Title

4th International Conference on Learning Representations, ICLR 2016 - Conference Track Proceedings

Conference Name

4th International Conference on Learning Representations

Journal ISSN

Volume Title

Publisher

Rights and licensing

Except where otherwise noted, this item's license is described as All rights reserved