
Active Slices for Sliced Stein Discrepancy

Accepted version
Peer-reviewed

Type

Conference Object

Change log

Authors

Gong, W 
Zhang, K 
Li, Y 
Hernández-Lobato, JM 

Abstract

Sliced Stein discrepancy (SSD) and its kernelized variants have demonstrated promising success in goodness-of-fit tests and model learning in high dimensions. Despite their theoretical elegance, their empirical performance depends crucially on the search for optimal slicing directions to discriminate between two distributions. Unfortunately, previous gradient-based optimisation approaches for this task return sub-optimal results: they are computationally expensive, sensitive to initialization, and lack theoretical guarantees of convergence. We address these issues in two steps. First, we provide theoretical results showing that the requirement of using optimal slicing directions in the kernelized version of SSD can be relaxed, validating the resulting discrepancy with finitely many random slicing directions. Second, since good slicing directions remain crucial for practical performance, we propose a fast algorithm for finding them based on ideas from active sub-space construction and spectral decomposition. Experiments on goodness-of-fit tests and model learning show that our approach achieves both improved performance and faster convergence. In particular, we demonstrate a 14-80x speed-up in goodness-of-fit tests compared with gradient-based alternatives.
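The active-subspace idea mentioned in the abstract can be illustrated with a minimal sketch: take the top eigenvectors of the second-moment matrix of score vectors (gradients of the log density) as candidate slicing directions. This is a simplified reading of the approach, not the authors' implementation; the function name `active_slices` and the toy Gaussian example are hypothetical.

```python
import numpy as np

def active_slices(score_fn, samples, n_slices):
    """Hypothetical sketch of spectral slicing-direction search:
    return the top eigenvectors of the averaged outer product of
    score vectors, assuming score_fn(samples) returns one row of
    grad log p(x) per sample."""
    grads = score_fn(samples)                  # (n, d) score vectors
    gram = grads.T @ grads / len(samples)      # (d, d) second-moment matrix
    _, eigvecs = np.linalg.eigh(gram)          # eigenvalues in ascending order
    # Keep the n_slices leading eigenvectors, largest eigenvalue first.
    return eigvecs[:, -n_slices:][:, ::-1].T   # (n_slices, d)

# Toy usage: diagonal Gaussian; its score is -x / var, so the score
# variance is largest along the dimension with the smallest var.
rng = np.random.default_rng(0)
var = np.array([10.0, 1.0, 0.1])
x = rng.normal(size=(1000, 3)) * np.sqrt(var)
dirs = active_slices(lambda s: -s / var, x, n_slices=1)
```

Under this construction, the leading direction concentrates on the coordinate where the score varies most, which is the informal motivation for using spectral decomposition rather than gradient-based search over directions.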

Description

Keywords

Journal Title

Proceedings of Machine Learning Research

Conference Name

Thirty-eighth International Conference on Machine Learning (ICML 2021)

Journal ISSN

2640-3498

Volume Title

139

Publisher

Rights

All rights reserved