Self-Distribution Distillation: Efficient Uncertainty Estimation

Accepted version
Peer-reviewed

Type

Conference Object

Authors

Fathullah, Y 
Gales, MJF 

Abstract

Deep learning is increasingly being applied in safety-critical domains. For these scenarios it is important to know the level of uncertainty in a model's prediction to ensure that appropriate decisions are made by the system. Deep ensembles are the de facto standard approach to obtaining various measures of uncertainty. However, ensembles often significantly increase the resources required in the training and/or deployment phases. Approaches have been developed that typically address the costs in one of these phases. In this work we propose a novel training approach, self-distribution distillation (S2D), which is able to efficiently train a single model that can estimate uncertainties. Furthermore, it is possible to build ensembles of these models and apply hierarchical ensemble distillation approaches. Experiments on CIFAR-100 showed that S2D models outperformed standard models and Monte-Carlo dropout. Additional out-of-distribution detection experiments on LSUN, Tiny ImageNet, and SVHN showed that even a standard deep ensemble can be outperformed using S2D-based ensembles and novel distilled models.
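To make the idea concrete, below is a minimal sketch of how an S2D-style model could be set up, assuming a PyTorch implementation. This is not the authors' code: the dropout-based stochastic teacher branch, the softplus parameterisation of the Dirichlet concentrations, and the names S2DModel and s2d_loss are all illustrative assumptions. The point it shows is the core mechanism the abstract describes: within a single training run, a shared backbone feeds a stochastic teacher head whose sampled categorical predictions are distilled into a student head that parameterises a Dirichlet distribution over predictions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class S2DModel(nn.Module):
    """Hypothetical S2D-style network: one backbone, two heads."""

    def __init__(self, num_classes=100, hidden=128, n_teacher_samples=5):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Flatten(),
            nn.Linear(3 * 32 * 32, hidden),  # toy backbone for CIFAR-sized inputs
            nn.ReLU(),
        )
        # Teacher branch: dropout makes each forward pass stochastic,
        # yielding a set of categorical predictions per input.
        self.teacher_head = nn.Sequential(nn.Dropout(p=0.3), nn.Linear(hidden, num_classes))
        # Student branch: predicts Dirichlet concentration parameters.
        self.student_head = nn.Linear(hidden, num_classes)
        self.n_samples = n_teacher_samples

    def forward(self, x):
        h = self.backbone(x)
        teacher_probs = torch.stack(
            [F.softmax(self.teacher_head(h), dim=-1) for _ in range(self.n_samples)],
            dim=1,
        )  # (batch, n_samples, classes)
        alphas = F.softplus(self.student_head(h)) + 1e-3  # strictly positive concentrations
        return teacher_probs, alphas


def s2d_loss(teacher_probs, alphas, targets):
    # Cross-entropy on the averaged teacher prediction trains the backbone.
    mean_probs = teacher_probs.mean(dim=1)
    ce = F.nll_loss(torch.log(mean_probs + 1e-8), targets)
    # Dirichlet negative log-likelihood distills the *spread* of the stochastic
    # teacher samples into the student head; the samples are detached so they
    # act as fixed targets for the distribution distillation.
    dirichlet = torch.distributions.Dirichlet(alphas)
    samples = teacher_probs.detach().transpose(0, 1)  # (n_samples, batch, classes)
    nll = -dirichlet.log_prob(samples).mean()
    return ce + nll


# Toy usage on random CIFAR-sized data.
model = S2DModel()
x, y = torch.randn(8, 3, 32, 32), torch.randint(0, 100, (8,))
teacher_probs, alphas = model(x)
loss = s2d_loss(teacher_probs, alphas, y)
loss.backward()
```

At test time only a single pass through the student head is needed: the Dirichlet mean alphas / alphas.sum(-1) gives the predictive distribution, while the concentration of the Dirichlet indicates how much the stochastic predictions would disagree, separating data uncertainty from knowledge uncertainty. This single-pass property is what makes the approach cheaper at deployment than a deep ensemble or Monte-Carlo dropout, and the distributional uncertainty is what the out-of-distribution detection experiments exploit.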

Journal Title

Proceedings of the 38th Conference on Uncertainty in Artificial Intelligence, UAI 2022

Conference Name

Conference on Uncertainty in Artificial Intelligence
