Show simple item record

dc.contributor.author  Fathullah, Y
dc.contributor.author  Gales, MJF
dc.date.accessioned  2022-06-15T23:30:06Z
dc.date.available  2022-06-15T23:30:06Z
dc.identifier.isbn  9781713863298
dc.identifier.uri  https://www.repository.cam.ac.uk/handle/1810/338131
dc.description.abstract  Deep learning is increasingly being applied in safety-critical domains. For these scenarios it is important to know the level of uncertainty in a model’s prediction to ensure that the system makes appropriate decisions. Deep ensembles are the de facto standard approach to obtaining various measures of uncertainty; however, ensembles often significantly increase the resources required in the training and/or deployment phases, and existing approaches typically address the cost of only one of these phases. In this work we propose a novel training approach, self-distribution distillation (S2D), which can efficiently train a single model that estimates uncertainties. Furthermore, it is possible to build ensembles of these models and apply hierarchical ensemble distillation approaches. Experiments on CIFAR-100 showed that S2D models outperformed standard models and Monte Carlo dropout. Additional out-of-distribution detection experiments on LSUN, Tiny ImageNet and SVHN showed that even a standard deep ensemble can be outperformed using S2D-based ensembles and novel distilled models.
dc.rights  All Rights Reserved
dc.rights.uri  http://www.rioxx.net/licenses/all-rights-reserved
dc.title  Self-Distribution Distillation: Efficient Uncertainty Estimation
dc.type  Conference Object
dc.publisher.department  Department of Engineering
dc.date.updated  2022-05-18T05:02:49Z
prism.publicationName  Proceedings of the 38th Conference on Uncertainty in Artificial Intelligence, UAI 2022
dc.identifier.doi  10.17863/CAM.85540
dcterms.dateAccepted  2022-05-15
rioxxterms.versionofrecord  10.17863/CAM.85540
rioxxterms.version  AM
pubs.conference-name  Conference on Uncertainty in Artificial Intelligence
pubs.conference-start-date  2022-08-01
cam.orpheus.counter  23*
cam.depositDate  2022-05-18
pubs.conference-finish-date  2022-08-05
pubs.licence-identifier  apollo-deposit-licence-2-1
pubs.licence-display-name  Apollo Repository Deposit Licence Agreement
rioxxterms.freetoread.startdate  2023-06-15
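
Background note: the abstract above refers to obtaining "various measures of uncertainty" from deep ensembles and Monte Carlo dropout. The sketch below shows the standard entropy-based decomposition of a set of ensemble-member predictions into total, expected data, and knowledge (mutual-information) uncertainty. It is general background only, not the paper's S2D implementation; the function name and example values are illustrative assumptions.

import numpy as np

def uncertainty_decomposition(member_probs, eps=1e-12):
    # member_probs: array of shape (n_members, n_classes); each row is the
    # predictive distribution from one ensemble member (or one MC-dropout sample).
    p = np.asarray(member_probs, dtype=np.float64)
    mean_p = p.mean(axis=0)                                       # ensemble predictive distribution
    total = -np.sum(mean_p * np.log(mean_p + eps))                # total uncertainty: H[E[p]]
    expected_data = -np.sum(p * np.log(p + eps), axis=1).mean()   # expected data uncertainty: E[H[p]]
    knowledge = total - expected_data                             # knowledge uncertainty (mutual information)
    return total, expected_data, knowledge

# Example: three members that disagree on a 3-class problem yield high knowledge uncertainty.
members = [[0.8, 0.1, 0.1],
           [0.2, 0.7, 0.1],
           [0.3, 0.3, 0.4]]
print(uncertainty_decomposition(members))

In decompositions of this kind, the knowledge-uncertainty term is the quantity commonly thresholded for out-of-distribution detection in the ensemble and distillation literature, the task evaluated on LSUN, Tiny ImageNet and SVHN in the abstract above.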

