Probabilistic Machine Learning for Circular Statistics: Models and inference using the Multivariate Generalised von Mises distribution
Authors
Wu Navarro, A. K.
Advisors
Turner, Richard
Date
2019-04-27
Awarding Institution
University of Cambridge
Author Affiliation
Engineering Department
Qualification
Doctor of Philosophy (PhD)
Language
English
Type
Thesis
Metadata
Citation
Wu Navarro, A. K. (2019). Probabilistic Machine Learning for Circular Statistics: Models and inference using the Multivariate Generalised von Mises distribution (Doctoral thesis). https://doi.org/10.17863/CAM.26449
Abstract
Probabilistic machine learning and circular statistics—the branch of statistics
concerned with data as angles and directions—are two research communities
that have grown mostly in isolation from one another. On the one hand, the probabilistic
machine learning community has developed powerful frameworks for
problems whose data live in Euclidean spaces, such as Gaussian Processes, but
has generally neglected other topologies studied by circular statistics. On the
other hand, the approximate inference frameworks from probabilistic machine
learning have only recently started to reach the circular statistics landscape. This
thesis aims to bridge the gap between these two fields by contributing to
both fields with models and approximate inference algorithms. In particular, we
introduce the multivariate Generalised von Mises distribution (mGvM), which
allows the use of kernels in circular statistics akin to Gaussian Processes, and an
augmented representation. These models support a wide range of applications,
comprising both latent variable modelling and regression of circular data.
Then, we propose methods to conduct approximate inference on these models.
In particular, we investigate the use of Variational Inference, Expectation Propagation
and Markov chain Monte Carlo methods. The variational inference route
is a mean-field approach that leverages the mGvM's tractable
conditionals and provides a baseline for comparison with the other methods. Then,
an Expectation Propagation approach is presented, drawing on the Expectation
Consistent framework for Ising models and connecting the resulting approximations
to the augmented model. In the final MCMC chapter, efficient Gibbs
and Hamiltonian Monte Carlo samplers are derived for the mGvM and the augmented
model.
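For readers unfamiliar with circular statistics, the following minimal sketch (illustrative only, not code from the thesis) shows the univariate von Mises distribution, the circular analogue of the Gaussian that the mGvM generalises, using `scipy.stats.vonmises`. The parameter values are arbitrary assumptions for the example.

```python
import numpy as np
from scipy.stats import vonmises

# von Mises density: p(theta | mu, kappa) is proportional to
# exp(kappa * cos(theta - mu)); mu is the circular mean and
# kappa the concentration (larger kappa = tighter around mu).
mu, kappa = 0.5, 2.0  # arbitrary illustrative values

# Draw angular samples centred at mu.
samples = vonmises.rvs(kappa, loc=mu, size=2000, random_state=0)

# Circular mean: the angle of the mean resultant vector, which
# for moderate kappa should recover mu closely. Averaging the
# raw angles directly would be wrong because angles wrap around.
circ_mean = np.angle(np.mean(np.exp(1j * samples)))
```

The complex-exponential trick in the last line is the standard way to average directions: it embeds each angle on the unit circle before taking the mean, avoiding the wrap-around problem that makes Euclidean statistics unsuitable for circular data.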
Keywords
Machine Learning, Circular Statistics, von Mises distribution, Gaussian Processes, Probabilistic models, Approximate Inference
Sponsorship
Cambridge Trusts
CAPES - Coordenação de Aperfeiçoamento de Pessoal de Nível Superior
Identifiers
This record's DOI: https://doi.org/10.17863/CAM.26449
Rights
Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0)
Licence URL: https://creativecommons.org/licenses/by-nc-nd/4.0/