Variational Bayesian dropout: pitfalls and fixes

cam.issuedOnline: 2018-07-16
dc.contributor.author: Matthews, Alexander G de G
dc.contributor.author: Ghahramani, Zoubin
dc.contributor.orcid: Matthews, Alexander [0000-0002-8552-3526]
dc.date.accessioned: 2018-09-24T08:47:32Z
dc.date.available: 2018-09-24T08:47:32Z
dc.date.issued: 2018
dc.description.abstract: Dropout, a stochastic regularisation technique for training of neural networks, has recently been reinterpreted as a specific type of approximate inference algorithm for Bayesian neural networks. The main contribution of the reinterpretation is in providing a theoretical framework useful for analysing and extending the algorithm. We show that the proposed framework suffers from several issues; from undefined or pathological behaviour of the true posterior related to use of improper priors, to an ill-defined variational objective due to singularity of the approximating distribution relative to the true posterior. Our analysis of the improper log uniform prior used in variational Gaussian dropout suggests the pathologies are generally irredeemable, and that the algorithm still works only because the variational formulation annuls some of the pathologies. To address the singularity issue, we proffer Quasi-KL (QKL) divergence, a new approximate inference objective for approximation of high-dimensional distributions. We show that motivations for variational Bernoulli dropout based on discretisation and noise have QKL as a limit. Properties of QKL are studied both theoretically and on a simple practical example which shows that the QKL-optimal approximation of a full rank Gaussian with a degenerate one naturally leads to the Principal Component Analysis solution.
dc.description.sponsorship: Jiri Hron holds a Nokia CASE Studentship. Alexander Matthews and Zoubin Ghahramani acknowledge the support of EPSRC Grant EP/N014162/1 and EPSRC Grant EP/N510129/1 (The Alan Turing Institute).
dc.identifier.doi: 10.17863/CAM.28034
dc.identifier.issn: 2640-3498
dc.identifier.uri: https://www.repository.cam.ac.uk/handle/1810/280670
dc.language.iso: eng
dc.publisher: Proceedings of Machine Learning Research
dc.publisher.url: http://proceedings.mlr.press/v80/hron18a.html
dc.subject: stat.ML
dc.subject: cs.LG
dc.title: Variational Bayesian dropout: pitfalls and fixes
dc.type: Conference Object
dcterms.dateAccepted: 2018-05-11
prism.endingPage: 2033
prism.publicationName: Proceedings of Machine Learning Research
prism.startingPage: 2024
prism.volume: 80
pubs.conference-finish-date: 2018-07-15
pubs.conference-name: ICML 2018: 35th International Conference on Machine Learning
pubs.conference-start-date: 2018-07-10
pubs.funder-project-id: EPSRC (via University of Sheffield) (143103)
pubs.funder-project-id: Alan Turing Institute (EP/N510129/1)
rioxxterms.licenseref.startdate: 2018-05-11
rioxxterms.licenseref.uri: http://www.rioxx.net/licenses/all-rights-reserved
rioxxterms.type: Conference Paper/Proceeding/Abstract
rioxxterms.version: VoR
rioxxterms.versionofrecord: 10.17863/CAM.28034
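The abstract's closing claim is that the QKL-optimal approximation of a full-rank Gaussian by a degenerate one recovers the Principal Component Analysis solution. As a loose numerical illustration only (this is a hypothetical sketch using the classical Eckart-Young best rank-k approximation in Frobenius norm, not the paper's QKL objective, which is not reproduced in this record), one can build a degenerate rank-k covariance from the top principal components of a full-rank one:

```python
import numpy as np

rng = np.random.default_rng(0)

# A full-rank (positive definite) covariance matrix; hypothetical example data.
A = rng.standard_normal((5, 5))
cov = A @ A.T + np.eye(5)

# Eigendecomposition: np.linalg.eigh returns eigenvalues in ascending order,
# so the top-k eigenpairs (the principal components) sit at the end.
vals, vecs = np.linalg.eigh(cov)
k = 2
top = vecs[:, -k:] * np.sqrt(vals[-k:])  # rank-k factor, columns scaled by sqrt(eigenvalue)
cov_k = top @ top.T                      # degenerate (rank-k) covariance

# By the Eckart-Young theorem, cov_k is the best rank-k approximation of cov
# in Frobenius norm; the residual norm equals sqrt of the sum of squared
# discarded eigenvalues.
err = np.linalg.norm(cov - cov_k)
```

Here the degenerate Gaussian N(0, cov_k) is supported only on the span of the top-k eigenvectors, which is exactly the PCA subspace.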

Files

Original bundle
Name: paper.pdf
Size: 2.83 MB
Format: Adobe Portable Document Format
Description: Published version
Licence: http://www.rioxx.net/licenses/all-rights-reserved
License bundle
Name: DepositLicenceAgreement.pdf
Size: 417.78 KB
Format: Adobe Portable Document Format