Turing: A language for flexible probabilistic inference

Published version
Peer-reviewed

Type

Conference Object

Authors

Ge, H 
Xu, K 
Ghahramani, Z 

Abstract

Probabilistic programming is becoming an attractive approach to probabilistic machine learning. By relieving researchers of the tedious burden of hand-deriving inference algorithms, it not only enables the development of more accurate and interpretable models but also encourages reproducible research. However, successful probabilistic programming systems require flexible, generic and efficient inference engines. In this work, we present a system called Turing for flexible, composable probabilistic programming inference. Turing has an intuitive modelling syntax and supports a wide range of sampling-based inference algorithms. Most importantly, inference in Turing is composable: it combines Markov chain sampling operations on subsets of model variables, e.g. using a combination of a Hamiltonian Monte Carlo (HMC) engine and a particle Gibbs (PG) engine. This composable inference engine allows the user to easily switch between black-box style inference methods such as HMC and customized inference methods. Our aim is to present Turing and its composable inference engines to the community and to encourage other researchers to build on this system to help advance the field of probabilistic machine learning.
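
For readers unfamiliar with the system, the following is a minimal sketch of the composable-inference pattern the abstract describes, written against the Turing.jl Julia package. The toy model, the data, and all sampler parameters (step size, leapfrog steps, particle count, iteration count) are illustrative placeholders rather than values from the paper, and exact constructor signatures have varied across Turing.jl releases, so the current documentation should be consulted before running it.

    using Turing

    # Toy model: unknown variance s² and mean m for Gaussian observations x.
    @model function gdemo(x)
        s² ~ InverseGamma(2, 3)
        m ~ Normal(0, sqrt(s²))
        for i in eachindex(x)
            x[i] ~ Normal(m, sqrt(s²))
        end
    end

    # Composable inference in the style the abstract describes: a Gibbs
    # sampler that updates m with an HMC engine and s² with a particle
    # Gibbs (PG) engine. All numeric arguments are placeholders.
    chain = sample(gdemo([1.5, 2.0]), Gibbs(HMC(0.05, 10, :m), PG(20, :s²)), 1_000)

The point of the sketch is the Gibbs composition: each component sampler is responsible for a subset of the model variables, so a gradient-based method can handle the continuous parts while a particle-based method handles variables that gradients do not suit.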

Description

Keywords

Journal Title

Proceedings of Machine Learning Research

Conference Name

21st International Conference on Artificial Intelligence and Statistics (AISTATS 2018)

Journal ISSN

2640-3498

Volume Title

84

Publisher

Proceedings of Machine Learning Research

Rights

All rights reserved

Sponsorship

HG and ZG acknowledge support from the Alan Turing Institute (EPSRC Grant EP/N510129/1) and EPSRC Grant EP/N014162/1, and donations from Google and Microsoft Research.