Self-Distribution Distillation: Efficient Uncertainty Estimation
Authors
Fathullah, Y
Gales, MJF
Journal Title
Proceedings of the 38th Conference on Uncertainty in Artificial Intelligence, UAI 2022
Conference Name
Conference on Uncertainty in Artificial Intelligence
ISBN
9781713863298
Type
Conference Object
This Version
AM (Accepted Manuscript)
Citation
Fathullah, Y., & Gales, M. J. F. (2022). Self-Distribution Distillation: Efficient Uncertainty Estimation. Proceedings of the 38th Conference on Uncertainty in Artificial Intelligence, UAI 2022. https://doi.org/10.17863/CAM.85540
Abstract
Deep learning is increasingly being applied in safety-critical domains. In these scenarios it is important to know the level of uncertainty in a model’s prediction to ensure that appropriate decisions are made by the system. Deep ensembles are the de facto standard approach to obtaining various measures of uncertainty. However, ensembles often significantly increase the resources required in the training and/or deployment phases. Approaches have been developed that typically address the costs of only one of these phases. In this work we propose a novel training approach, self-distribution distillation (S2D), which is able to efficiently train a single model that can estimate uncertainties. Furthermore, it is possible to build ensembles of these models and apply hierarchical ensemble distillation approaches. Experiments on CIFAR-100 showed that S2D models outperformed standard models and Monte-Carlo dropout. Additional out-of-distribution detection experiments on LSUN, Tiny ImageNet and SVHN showed that even a standard deep ensemble can be outperformed using S2D-based ensembles and novel distilled models.
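The abstract refers to "various measures of uncertainty" obtained from ensembles or distilled single models without spelling out the computation. A common way such methods decompose uncertainty is to treat each ensemble member's (or each sampled) prediction as a categorical distribution and split the predictive entropy into expected data uncertainty plus mutual information (knowledge uncertainty). The sketch below illustrates that standard decomposition in numpy; it is not taken from the paper, and the function name is illustrative only.

```python
import numpy as np

def uncertainty_decomposition(probs, eps=1e-12):
    """Decompose uncertainty for one input.

    probs: array of shape (S, K) -- S categorical distributions over
    K classes (e.g. from ensemble members or sampled predictions).
    Returns (total, data, knowledge) uncertainty in nats.
    """
    mean = probs.mean(axis=0)                        # predictive distribution
    total = -np.sum(mean * np.log(mean + eps))       # entropy of the mean
    data = -np.sum(probs * np.log(probs + eps), axis=1).mean()  # mean entropy
    knowledge = total - data                         # mutual information
    return total, data, knowledge

# Two members that disagree strongly: low data uncertainty per member,
# but high knowledge uncertainty overall.
probs = np.array([[0.9, 0.1],
                  [0.1, 0.9]])
total, data, knowledge = uncertainty_decomposition(probs)
```

High knowledge uncertainty (members disagree) is the signal typically used for out-of-distribution detection experiments such as those on LSUN, Tiny ImageNet and SVHN mentioned above.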
Embargo Lift Date
2023-06-15
Identifiers
External DOI: https://doi.org/10.17863/CAM.85540
This record's URL: https://www.repository.cam.ac.uk/handle/1810/338131