Variational continual learning

Accepted version
Peer-reviewed

Type

Conference Object

Authors

Turner, RE 
Bui, Thang 
Li, Yingzhen 
Nguyen, Cuong

Abstract

This paper develops variational continual learning (VCL), a simple but general framework for continual learning that fuses online variational inference (VI) and recent advances in Monte Carlo VI for neural networks. The framework can successfully train both deep discriminative models and deep generative models in complex continual learning settings where existing tasks evolve over time and entirely new tasks emerge. Experimental results show that VCL outperforms state-of-the-art continual learning methods on a variety of tasks, avoiding catastrophic forgetting in a fully automatic way.
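As a sketch of the online VI recursion the abstract refers to (standard notation, not taken from this record): after observing the data set D_t for task t, the approximate posterior is updated by projecting the product of the previous approximation and the new likelihood back into the variational family, with the prior as the starting point.

```latex
% q_0 is initialised to the prior over parameters \theta
q_0(\theta) = p(\theta)

% Online VI update after task t's data D_t, for t = 1, 2, \dots:
q_t(\theta) = \operatorname*{arg\,min}_{q \in \mathcal{Q}}
  \mathrm{KL}\!\left( q(\theta) \,\middle\|\,
  \tfrac{1}{Z_t}\, q_{t-1}(\theta)\, p(D_t \mid \theta) \right)
```

Here Z_t is the normalising constant of the (intractable) updated posterior; the KL projection is what makes the previous approximation act as the prior for the next task, which is the mechanism behind the automatic protection against catastrophic forgetting.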

Conference Name

International Conference on Learning Representations

Sponsorship
Engineering and Physical Sciences Research Council (EP/L000776/1)
Engineering and Physical Sciences Research Council (EP/M026957/1)