Variational continual learning
This paper develops variational continual learning (VCL), a simple but general framework for continual learning that fuses online variational inference (VI) and recent advances in Monte Carlo VI for neural networks. The framework can successfully train both deep discriminative models and deep generative models in complex continual learning settings where existing tasks evolve over time and entirely new tasks emerge. Experimental results show that VCL outperforms state-of-the-art continual learning methods on a variety of tasks, avoiding catastrophic forgetting in a fully automatic way.
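The online-VI recursion underlying VCL — the posterior after one task serves as the prior for the next — can be illustrated with a toy conjugate-Gaussian sketch (this is an illustrative example only, not the paper's neural-network setting; all names and values are hypothetical). In the conjugate case the sequential update recovers the batch posterior exactly:

```python
import numpy as np

def gaussian_posterior(prior_mean, prior_var, data, noise_var):
    """Conjugate update for a scalar theta with data ~ N(theta, noise_var)."""
    post_prec = 1.0 / prior_var + len(data) / noise_var
    post_mean = (prior_mean / prior_var + np.sum(data) / noise_var) / post_prec
    return post_mean, 1.0 / post_prec

rng = np.random.default_rng(0)
task1 = rng.normal(1.0, 0.5, size=50)  # data for task 1 (illustrative)
task2 = rng.normal(1.0, 0.5, size=50)  # data for task 2 (illustrative)

# Sequential updating: the task-1 posterior becomes the prior for task 2.
m1, v1 = gaussian_posterior(0.0, 10.0, task1, 0.25)
m_seq, v_seq = gaussian_posterior(m1, v1, task2, 0.25)

# Batch updating: all data at once under the original prior.
m_batch, v_batch = gaussian_posterior(
    0.0, 10.0, np.concatenate([task1, task2]), 0.25
)

print(np.isclose(m_seq, m_batch), np.isclose(v_seq, v_batch))  # True True
```

In VCL the same recursion is applied approximately: the exact posterior is intractable for neural networks, so each task's posterior is a variational approximation fitted with the previous task's approximate posterior playing the role of the prior.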
Engineering and Physical Sciences Research Council (EP/M026957/1)