
Bayesian pseudocoresets

Accepted version
Peer-reviewed

Type

Conference Object

Authors

Manousakas, Dionysios (ORCID: https://orcid.org/0000-0002-3751-8781)
Xu, Z 
Campbell, T 

Abstract

Standard Bayesian inference algorithms are prohibitively expensive in the regime of modern large-scale data. Recent work has found that a small, weighted subset of data (a coreset) may be used in place of the full dataset during inference, taking advantage of data redundancy to reduce computational cost. However, this approach has limitations in the increasingly common setting of sensitive, high-dimensional data. Indeed, we prove that there are situations in which the Kullback-Leibler (KL) divergence between the optimal coreset and the true posterior grows with data dimension; and as coresets include a subset of the original data, they cannot be constructed in a manner that preserves individual privacy. We address both of these issues with a single unified solution, Bayesian pseudocoresets—a small weighted collection of synthetic “pseudodata”—along with a variational optimization method to select both pseudodata and weights. The use of pseudodata (as opposed to the original datapoints) enables both the summarization of high-dimensional data and the differentially private summarization of sensitive data. Real and synthetic experiments on high-dimensional data demonstrate that Bayesian pseudocoresets achieve significant improvements in posterior approximation error compared to traditional coresets, and that pseudocoresets provide privacy without a significant loss in approximation quality.
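To illustrate the construction described in the abstract, the sketch below shows the pseudocoreset idea on a toy conjugate Gaussian model, where both the full posterior and the pseudocoreset posterior are Gaussian and the KL divergence is available in closed form. This is not the authors' implementation: in general models the KL objective is intractable and the paper relies on a variational optimization method; here the model, the plain gradient-descent loop, and all names (sigma2, n_pseudo, learning rate) are illustrative assumptions only.

```python
# Minimal sketch (not the authors' code): a Bayesian pseudocoreset for a
# conjugate Gaussian model. Pseudopoints u and weights w are optimized by
# gradient descent to minimize KL(pseudocoreset posterior || full posterior).
import jax
import jax.numpy as jnp

d, N, n_pseudo, sigma2 = 20, 5000, 10, 1.0     # dimension, data size, pseudopoints, noise variance

key = jax.random.PRNGKey(0)
X = jax.random.normal(key, (N, d)) + 2.0       # synthetic "large" dataset

def posterior(points, weights):
    """Gaussian posterior N(mu, v*I) for prior N(0, I) and likelihood N(theta, sigma2*I),
    with each point counted `weight` times (weights of 1 recover the exact posterior)."""
    v = 1.0 / (1.0 + weights.sum() / sigma2)   # isotropic posterior variance
    mu = v * (weights @ points) / sigma2       # posterior mean
    return mu, v

def kl_gauss(mu_q, v_q, mu_p, v_p):
    """KL( N(mu_q, v_q*I) || N(mu_p, v_p*I) ) for isotropic Gaussians."""
    return 0.5 * (d * (v_q / v_p + jnp.log(v_p / v_q) - 1.0)
                  + jnp.sum((mu_q - mu_p) ** 2) / v_p)

mu_full, v_full = posterior(X, jnp.ones(N))    # exact posterior from all N points

def loss(params):
    u, log_w = params                          # pseudopoints and log-weights
    mu_u, v_u = posterior(u, jnp.exp(log_w))
    return kl_gauss(mu_u, v_u, mu_full, v_full)

# Initialise pseudopoints at random data points and weights at N / n_pseudo.
params = (X[:n_pseudo], jnp.full(n_pseudo, jnp.log(N / n_pseudo)))
grad_fn = jax.jit(jax.grad(loss))

for step in range(300):                        # plain gradient descent on (u, log w)
    g = grad_fn(params)
    params = jax.tree_util.tree_map(lambda p, gp: p - 1e-3 * gp, params, g)

print("KL to full posterior:", loss(params))   # approaches ~0 as the pseudocoreset fits
```

Because the pseudopoints are synthetic rather than a subset of the original records, the same objective can, as the abstract notes, be optimized under differential-privacy constraints; this toy sketch omits that mechanism.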

Description

Keywords

Journal Title

Advances in Neural Information Processing Systems

Conference Name

Thirty-fourth Conference on Neural Information Processing Systems

Journal ISSN

1049-5258

Volume Title

2020-December

Publisher

Rights

All rights reserved

Sponsorship

NSERC Discovery Grant, NSERC Discovery Launch Supplement, Nokia Bell Labs, Lundgren Fund, Darwin College Cambridge.