Gradient-Based Markov Chain Monte Carlo for Bayesian Inference With Non-differentiable Priors
Accepted version
Peer-reviewed
Abstract
The use of nondifferentiable priors in Bayesian statistics has become increasingly popular, in particular in Bayesian imaging analysis. Current state-of-the-art methods are approximate in the sense that they replace the posterior with a smooth approximation via Moreau-Yosida envelopes and apply gradient-based discretized diffusions to sample from the resulting distribution. We characterize the error of the Moreau-Yosida approximation and propose a novel implementation using underdamped Langevin dynamics. In mission-critical cases, however, replacing the posterior with an approximation may not be a viable option. Instead, we show that piecewise-deterministic Markov processes (PDMPs) can be used for exact posterior inference from distributions that are almost everywhere differentiable. Furthermore, in contrast with diffusion-based methods, the suggested PDMP-based samplers impose no assumptions on the shape of the prior and require no access to a computationally cheap proximal operator, and consequently have a much broader scope of application. Through detailed numerical examples, including a nondifferentiable circular distribution and a nonconvex genomics model, we elucidate the relative strengths of these sampling methods on problems of moderate to high dimension, underlining the benefits of PDMP-based methods when accurate sampling is decisive. Supplementary materials for this article are available online.
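The Moreau-Yosida smoothing idea described above can be sketched in a few lines. The following is a minimal, illustrative implementation (not the authors' code) of an unadjusted Langevin sampler applied to a Moreau-Yosida envelope, often referred to as MYULA in the literature. It assumes a Gaussian likelihood and a Laplace (L1) prior, whose proximal operator has the closed-form soft-thresholding expression; the function names and parameter values are hypothetical choices for the toy example.

```python
import numpy as np

def prox_l1(x, t):
    # Proximal operator of t * ||x||_1: soft-thresholding, in closed form.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def myula(grad_f, n_steps, step, lam, x0, rng):
    """Unadjusted Langevin on the Moreau-Yosida-smoothed posterior.

    Approximately targets exp(-f(x) - ||x||_1): the nondifferentiable
    L1 term is replaced by its Moreau-Yosida envelope with parameter
    lam, whose gradient is (x - prox_l1(x, lam)) / lam.
    """
    x = x0.copy()
    samples = []
    for _ in range(n_steps):
        grad_g = (x - prox_l1(x, lam)) / lam        # gradient of the envelope
        drift = -(grad_f(x) + grad_g)               # smoothed log-posterior gradient
        x = x + step * drift + np.sqrt(2.0 * step) * rng.standard_normal(x.shape)
        samples.append(x.copy())
    return np.array(samples)

# Toy example: Gaussian likelihood centered at y, Laplace prior
# (a Bayesian lasso-like posterior in 3 dimensions).
rng = np.random.default_rng(0)
y = np.array([2.0, -1.0, 0.5])
grad_f = lambda x: x - y                            # f(x) = 0.5 * ||x - y||^2
chain = myula(grad_f, n_steps=5000, step=0.01, lam=0.1, x0=np.zeros(3), rng=rng)
```

The posterior mean of the chain shrinks each coordinate of `y` toward zero, reflecting the L1 prior; a smaller `lam` tightens the approximation at the cost of a stiffer drift, which is exactly the bias/stability trade-off the approximation-error analysis concerns.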
Journal ISSN
1537-274X