Depth uncertainty in neural networks


Type
Conference Object
Authors
Antorán, J 
Allingham, JU 
Hernández-Lobato, JM 
Abstract

Existing methods for estimating uncertainty in deep learning tend to require multiple forward passes, making them unsuitable for applications where computational resources are limited. To solve this, we perform probabilistic reasoning over the depth of neural networks. Different depths correspond to subnetworks which share weights and whose predictions are combined via marginalisation, yielding model uncertainty. By exploiting the sequential structure of feed-forward networks, we are able to both evaluate our training objective and make predictions with a single forward pass. We validate our approach on real-world regression and image classification tasks. Our approach provides uncertainty calibration, robustness to dataset shift, and accuracies competitive with more computationally expensive baselines.
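The abstract describes combining the predictions of weight-sharing subnetworks of different depths in a single forward pass. Below is a minimal, hypothetical PyTorch sketch of that idea, not the authors' released implementation: the class name DepthUncertaintyNet, the shared output head, and the learned depth_logits parameter are illustrative assumptions, kept only to show how per-depth predictions can be produced sequentially and marginalised under a categorical distribution over depth.

```python
# Minimal sketch (assumed names, not the paper's official code) of marginalising
# predictions over network depth within one forward pass.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DepthUncertaintyNet(nn.Module):
    """Feed-forward net whose hidden blocks each feed a shared output head.

    The activations after block d give the prediction of the depth-d subnetwork;
    all depths are evaluated in one pass and combined with a learned categorical
    distribution over depth.
    """

    def __init__(self, in_dim, hidden_dim, out_dim, max_depth):
        super().__init__()
        self.input_layer = nn.Linear(in_dim, hidden_dim)
        self.blocks = nn.ModuleList(
            [nn.Linear(hidden_dim, hidden_dim) for _ in range(max_depth)]
        )
        self.head = nn.Linear(hidden_dim, out_dim)  # shared output head
        # Unnormalised log-probabilities of the distribution over depth.
        self.depth_logits = nn.Parameter(torch.zeros(max_depth))

    def forward(self, x):
        h = F.relu(self.input_layer(x))
        per_depth_logits = []
        for block in self.blocks:  # sequential structure: a single pass
            h = F.relu(block(h))
            per_depth_logits.append(self.head(h))
        return torch.stack(per_depth_logits, dim=0)  # (depth, batch, classes)

    def predict(self, x):
        """Marginalise over depth: p(y|x) = sum_d q(d) p(y|x, d)."""
        logits = self.forward(x)                           # (D, B, C)
        depth_probs = F.softmax(self.depth_logits, dim=0)  # q(d)
        probs = F.softmax(logits, dim=-1)                  # p(y | x, d)
        return (depth_probs[:, None, None] * probs).sum(dim=0)


# Usage: a 5-block classifier on dummy data.
model = DepthUncertaintyNet(in_dim=16, hidden_dim=32, out_dim=3, max_depth=5)
x = torch.randn(8, 16)
print(model.predict(x).shape)  # torch.Size([8, 3]); each row sums to 1
```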

Journal Title
Advances in Neural Information Processing Systems
Conference Name
34th Conference on Neural Information Processing Systems (NeurIPS 2020)
Journal ISSN
1049-5258
Volume Title
2020-December
Rights
All rights reserved