Gradient-Based Markov Chain Monte Carlo for Bayesian Inference With Non-differentiable Priors

Accepted version
Peer-reviewed

Type

Article

Change log

Authors

Goldman, JV 
Singh, SS 

Abstract

The use of nondifferentiable priors in Bayesian statistics has become increasingly popular, in particular in Bayesian imaging analysis. Current state-of-the-art methods are approximate in the sense that they replace the posterior with a smooth approximation via Moreau-Yosida envelopes, and apply gradient-based discretized diffusions to sample from the resulting distribution. We characterize the error of the Moreau-Yosida approximation and propose a novel implementation using underdamped Langevin dynamics. In mission-critical cases, however, replacing the posterior with an approximation may not be a viable option. Instead, we show that piecewise-deterministic Markov processes (PDMPs) can be used for exact posterior inference from distributions satisfying almost everywhere differentiability. Furthermore, in contrast with diffusion-based methods, the suggested PDMP-based samplers place no assumptions on the shape of the prior, nor do they require access to a computationally cheap proximal operator, and consequently have a much broader scope of application. Through detailed numerical examples, including a nondifferentiable circular distribution and a nonconvex genomics model, we elucidate the relative strengths of these sampling methods on problems of moderate to high dimension, underlining the benefits of PDMP-based methods when accurate sampling is decisive. Supplementary materials for this article are available online.
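The "gradient-based discretized diffusions" applied to a Moreau-Yosida smoothed posterior can be sketched as a MYULA-style update. The toy target below (Gaussian likelihood with a non-differentiable Laplace prior), the step sizes, and all function names are illustrative assumptions, not the paper's actual models or implementation:

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t*|x| (soft-thresholding); the Laplace prior is
    # non-differentiable at 0 but its prox is cheap and closed-form.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def myula_sample(y, a=1.0, lam=0.1, gamma=0.05, n_iter=5000, seed=0):
    # Unadjusted Langevin on the Moreau-Yosida smoothed posterior for the
    # illustrative model y ~ N(x, I), p(x) ∝ exp(-a * ||x||_1).
    # lam is the envelope parameter, gamma the discretization step
    # (both chosen here for illustration only).
    rng = np.random.default_rng(seed)
    x = np.zeros_like(y)
    samples = np.empty((n_iter,) + y.shape)
    for k in range(n_iter):
        grad_f = x - y                     # gradient of the smooth likelihood term
        prox = soft_threshold(x, lam * a)  # prox step replaces the missing prior gradient
        x = (x - gamma * grad_f
               - (gamma / lam) * (x - prox)
               + np.sqrt(2.0 * gamma) * rng.standard_normal(y.shape))
        samples[k] = x
    return samples

samples = myula_sample(np.array([2.0, -1.0, 0.0]))
print(samples.shape)  # (5000, 3)
```

Note the bias the abstract refers to: the chain targets the smoothed surrogate, not the true posterior, which is why the paper turns to PDMP samplers when exactness matters.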

Description

Keywords

Bayesian imaging, Markov chain Monte Carlo, Markov processes, Proximal operators, Piecewise-deterministic

Journal Title

Journal of the American Statistical Association

Conference Name

Journal ISSN

0162-1459
1537-274X

Volume Title

Publisher

Informa UK Limited

Rights

All rights reserved

Sponsorship

Alan Turing Institute (unknown)