Convolutional Conditional Neural Processes

Accepted version
Peer-reviewed

Type

Conference Object

Authors

Gordon, Jonathan 
Bruinsma, Wessel P 
Foong, Andrew YK 
Requeima, James 
Dubois, Yann 

Abstract

We introduce the Convolutional Conditional Neural Process (ConvCNP), a new member of the Neural Process family that models translation equivariance in the data. Translation equivariance is an important inductive bias for many learning problems, including time series, spatial data, and images. The model embeds data sets into an infinite-dimensional function space, as opposed to a finite-dimensional vector space. To formalize this notion, we extend the theory of neural representations of sets to include functional representations, and demonstrate that any translation-equivariant embedding can be represented using a convolutional deep set. We evaluate ConvCNPs in several settings, demonstrating that they achieve state-of-the-art performance compared to existing NPs. We further show that building in translation equivariance enables zero-shot generalization to challenging, out-of-domain tasks.
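
At its core, the ConvCNP replaces the fixed-length vector encoding used by earlier Neural Processes with a functional representation of the context set, discretized on a grid and then processed by a CNN. The following minimal NumPy sketch of such a convolutional deep set embedding is illustrative only, not the authors' implementation: the choice of RBF kernel, the length scale, and all function and variable names are our own assumptions, and in the full model the gridded representation below would be fed to a CNN.

    import numpy as np

    def conv_deep_set_embedding(x_ctx, y_ctx, x_grid, length_scale=0.1):
        # Sketch of E(Z)(x) = sum_i phi(y_i) psi(x - x_i), with phi(y) = (1, y):
        # a density channel plus a data channel, and psi an RBF kernel (assumed).
        diffs = x_grid[:, None] - x_ctx[None, :]          # (grid, n_ctx)
        psi = np.exp(-0.5 * (diffs / length_scale) ** 2)  # kernel weights
        density = psi.sum(axis=1)                         # sum_i psi(x - x_i)
        signal = psi @ y_ctx                              # sum_i y_i psi(x - x_i)
        signal = signal / np.clip(density, 1e-8, None)    # normalize data channel
        return np.stack([density, signal], axis=-1)       # (grid, 2) channels

    # Usage: embed 10 context points from a sine curve on a 64-point grid.
    rng = np.random.default_rng(0)
    x_ctx = rng.uniform(0.0, 1.0, size=10)
    y_ctx = np.sin(2 * np.pi * x_ctx)
    x_grid = np.linspace(0.0, 1.0, 64)
    rep = conv_deep_set_embedding(x_ctx, y_ctx, x_grid)

Because the embedding depends on inputs only through differences x - x_i, shifting all context inputs shifts the representation by the same amount, which is what makes the subsequent CNN stage translation equivariant.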

Keywords

stat.ML, cs.LG

Conference Name

International Conference on Learning Representations (ICLR) 2020

Rights

All rights reserved