Fast and Flexible Multi-Task Classification Using Conditional Neural Adaptive Processes

Accepted version
Peer-reviewed

Type

Conference Object

Change log

Authors

Requeima, James 
Gordon, Jonathan 
Nowozin, Sebastian 
Turner, Richard E 

Abstract

The goal of this paper is to design image classification systems that, after an initial multi-task training phase, can automatically adapt to new tasks encountered at test time. We introduce a conditional neural process-based approach to the multi-task classification setting for this purpose, and establish connections to the meta-learning and few-shot learning literature. The resulting approach, called CNAPs, comprises a classifier whose parameters are modulated by an adaptation network that takes the current task's dataset as input. We demonstrate that CNAPs achieves state-of-the-art results on the challenging Meta-Dataset benchmark, indicating high-quality transfer learning. We show that the approach is robust, avoiding both over-fitting in low-shot regimes and under-fitting in high-shot regimes. Timing experiments reveal that CNAPs is computationally efficient at test time, as it does not involve gradient-based adaptation. Finally, we show that trained models are immediately deployable to continual learning and active learning, where they can outperform existing approaches that do not leverage transfer learning.
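The abstract's description of the architecture — a classifier whose parameters are modulated by an adaptation network conditioned on the current task's dataset, with no gradient-based adaptation at test time — can be illustrated with a minimal sketch. The code below is not the authors' implementation: the FiLM-style modulation, the stand-in linear feature encoder, the prototype-style classification head, and all names (AdaptationNetwork, CNAPsSketch) are illustrative assumptions made for brevity.

```python
# Minimal sketch of a CNAPs-style adapted classifier (illustrative, not the paper's code).
import torch
import torch.nn as nn


class AdaptationNetwork(nn.Module):
    """Maps a pooled task representation to per-channel modulation parameters."""

    def __init__(self, feat_dim: int):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(feat_dim, feat_dim), nn.ReLU(),
                                 nn.Linear(feat_dim, 2 * feat_dim))

    def forward(self, task_repr: torch.Tensor):
        gamma, beta = self.net(task_repr).chunk(2, dim=-1)
        return gamma, beta


class CNAPsSketch(nn.Module):
    def __init__(self, feat_dim: int = 64):
        super().__init__()
        self.encoder = nn.Linear(3 * 32 * 32, feat_dim)  # stand-in feature extractor
        self.adapt = AdaptationNetwork(feat_dim)

    def forward(self, context_x, context_y, target_x):
        # Encode the task's context set and pool it into a single task embedding.
        ctx_feats = self.encoder(context_x.flatten(1))
        task_repr = ctx_feats.mean(dim=0)

        # The adaptation network emits modulation parameters in a single forward
        # pass; there are no gradient steps at test time.
        gamma, beta = self.adapt(task_repr)
        tgt_feats = gamma * self.encoder(target_x.flatten(1)) + beta

        # Class "prototypes" built from the modulated context features stand in
        # for the task-specific classifier head.
        proto = torch.stack([(gamma * ctx_feats + beta)[context_y == c].mean(0)
                             for c in context_y.unique()])
        return -torch.cdist(tgt_feats, proto)  # logits = negative distances


# Usage: a toy 5-way task with 25 context and 10 target images.
model = CNAPsSketch()
cx = torch.randn(25, 3, 32, 32)
cy = torch.arange(5).repeat_interleave(5)
logits = model(cx, cy, torch.randn(10, 3, 32, 32))
print(logits.shape)  # torch.Size([10, 5])
```

The key property the sketch mirrors is that adaptation is amortized: the adaptation network produces task-specific parameters from the context set in one forward pass, which is what makes test-time adaptation fast compared with gradient-based fine-tuning.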

Description

Keywords

stat.ML, cs.LG

Journal Title

Advances in Neural Information Processing Systems 32 (NeurIPS 2019)

Conference Name

33rd Conference on Neural Information Processing Systems (NeurIPS 2019)

Journal ISSN

1049-5258

Volume Title

Publisher

Neural Information Processing Systems Foundation

Rights

All rights reserved