Meta-Learning Probabilistic Inference For Prediction

Accepted version
Peer-reviewed

Type

Conference Object

Authors

Gordon, Jonathan 
Bauer, Matthias 
Nowozin, Sebastian 
Turner, Richard E 

Abstract

This paper introduces a new framework for data-efficient and versatile learning. Specifically: 1) We develop ML-PIP, a general framework for Meta-Learning approximate Probabilistic Inference for Prediction. ML-PIP extends existing probabilistic interpretations of meta-learning to cover a broad class of methods. 2) We introduce VERSA, an instance of the framework employing a flexible and versatile amortization network that takes few-shot learning datasets as inputs, with arbitrary numbers of shots, and outputs a distribution over task-specific parameters in a single forward pass. VERSA replaces test-time optimization with forward passes through inference networks, amortizing the cost of inference and removing the need for second derivatives during training. 3) We evaluate VERSA on benchmark datasets where the method sets new state-of-the-art results, handles arbitrary numbers of shots, and, for classification, arbitrary numbers of classes at train and test time. The power of the approach is then demonstrated on a challenging few-shot ShapeNet view reconstruction task.
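The central idea in the abstract, an amortization network that maps a few-shot support set to a distribution over task-specific parameters in a single forward pass, can be illustrated with a minimal sketch. This is not the authors' implementation: the network shape, pooling choice, and Gaussian parameterization below are illustrative assumptions, showing only how pooling a support set and emitting a mean and log-variance avoids any test-time optimization loop.

```python
import numpy as np

rng = np.random.default_rng(0)

def amortization_network(support_features, w1, b1, w2, b2):
    """Map a support set to a Gaussian over class-specific weights.

    Mean-pooling over the shot axis makes the network permutation-
    invariant and lets it accept an arbitrary number of shots; the
    output is produced in one forward pass (no inner-loop optimization,
    hence no second derivatives during meta-training).
    """
    pooled = support_features.mean(axis=0)   # (d,) pooled representation
    hidden = np.tanh(pooled @ w1 + b1)       # (h,) hidden layer
    stats = hidden @ w2 + b2                 # (2d,) Gaussian statistics
    mu, log_var = np.split(stats, 2)
    return mu, log_var

# Hypothetical dimensions and randomly initialized network weights.
d, h = 8, 16
w1 = rng.normal(scale=0.1, size=(d, h)); b1 = np.zeros(h)
w2 = rng.normal(scale=0.1, size=(h, 2 * d)); b2 = np.zeros(2 * d)

# A hypothetical 5-shot support set for one class (5 examples, d features).
support = rng.normal(size=(5, d))
mu, log_var = amortization_network(support, w1, b1, w2, b2)

# Sample task-specific weights and score a query example with them.
w_c = mu + np.exp(0.5 * log_var) * rng.normal(size=d)
query = rng.normal(size=d)
logit = query @ w_c
```

Because the pooling step collapses the shot axis, the same network handles 1-shot and 5-shot support sets unchanged; in a classification setting one such weight distribution would be produced per class.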

Keywords

stat.ML, cs.LG

Journal Title

International Conference on Learning Representations (2019)

Rights

All rights reserved