Probabilistic Machine Learning for Circular Statistics: Models and inference using the Multivariate Generalised von Mises distribution

cam.restriction: thesis_access_open
cam.supervisor: Turner, Richard
cam.thesis.funding: false
dc.contributor.author: Wu Navarro, Alexandre Khae
dc.contributor.orcid: Wu Navarro, Alexandre Khae [0000-0003-1452-8773]
dc.date.accessioned: 2018-09-04T08:19:43Z
dc.date.available: 2018-09-04T08:19:43Z
dc.date.issued: 2019-04-27
dc.date.submitted: 2018-09-03
dc.date.updated: 2018-09-03T22:12:53Z
dc.description.abstract: Probabilistic machine learning and circular statistics (the branch of statistics concerned with data in the form of angles and directions) are two research communities that have grown mostly in isolation from one another. On the one hand, the probabilistic machine learning community has developed powerful frameworks for data that live on Euclidean spaces, such as Gaussian Processes, but has generally neglected the other topologies studied by circular statistics. On the other hand, the approximate inference frameworks from probabilistic machine learning have only recently begun to permeate the circular statistics landscape. This thesis aims to bridge the gap between these two fields by contributing models and approximate inference algorithms to both. In particular, we introduce the multivariate Generalised von Mises distribution (mGvM), which allows the use of kernels in circular statistics akin to Gaussian Processes, as well as an augmented representation. These models cover a vast number of applications comprising both latent variable modelling and regression of circular data. We then propose methods to conduct approximate inference on these models, investigating Variational Inference, Expectation Propagation and Markov chain Monte Carlo methods. The variational inference route takes a mean-field approach to efficiently leverage the tractable conditionals of the mGvM and to create a baseline for comparison with other methods. An Expectation Propagation approach is then presented, drawing on the Expectation Consistent framework for Ising models and connecting the approximations used to the augmented model. In the final MCMC chapter, efficient Gibbs and Hamiltonian Monte Carlo samplers are derived for the mGvM and the augmented model.
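As a toy illustration of the kind of Gibbs scheme the abstract refers to (exploiting tractable univariate conditionals of a coupled angular model), the sketch below samples two coupled angles. This is not the thesis's actual mGvM sampler: plain von Mises conditionals stand in for the Generalised von Mises conditionals of the mGvM, and all parameters (`mu`, `kappa`, `k12`) are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy model: two angles with individual cosine potentials
# kappa_i * cos(t_i - mu_i) plus a pairwise coupling k12 * cos(t_1 - t_2).
mu = np.array([0.5, -1.0])    # preferred directions (illustrative values)
kappa = np.array([2.0, 3.0])  # concentrations (illustrative values)
k12 = 1.5                     # coupling strength (illustrative value)

def conditional_params(theta_other, i):
    """Mean and concentration of the (assumed) von Mises full conditional
    for angle i: a sum of two cosine potentials is again a cosine potential,
    so the conditional stays in the von Mises family in this toy model."""
    a = kappa[i] * np.cos(mu[i]) + k12 * np.cos(theta_other)
    b = kappa[i] * np.sin(mu[i]) + k12 * np.sin(theta_other)
    return np.arctan2(b, a), np.hypot(a, b)

# Gibbs sweep: resample each angle from its full conditional in turn.
theta = np.zeros(2)
samples = []
for _ in range(2000):
    for i in (0, 1):
        m, k = conditional_params(theta[1 - i], i)
        theta[i] = rng.vonmises(m, k)
    samples.append(theta.copy())
samples = np.array(samples)
print(samples.shape)  # (2000, 2), each angle in [-pi, pi]
```

In the actual mGvM the conditionals carry both first- and second-harmonic cosine terms (Generalised von Mises), so sampling them requires more care than `rng.vonmises` provides; the structure of the sweep, however, is the same.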
dc.description.sponsorship: Cambridge Trusts; CAPES - Coordenação de Aperfeiçoamento de Pessoal de Nível Superior
dc.identifier.doi: 10.17863/CAM.26449
dc.identifier.uri: https://www.repository.cam.ac.uk/handle/1810/279067
dc.language.iso: en
dc.publisher.college: Magdalene College
dc.publisher.department: Engineering Department
dc.publisher.institution: University of Cambridge
dc.rights: Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0)
dc.rights.uri: https://creativecommons.org/licenses/by-nc-nd/4.0/
dc.subject: Machine Learning
dc.subject: Circular Statistics
dc.subject: von Mises distribution
dc.subject: Gaussian Processes
dc.subject: Probabilistic models
dc.subject: Approximate Inference
dc.title: Probabilistic Machine Learning for Circular Statistics: Models and inference using the Multivariate Generalised von Mises distribution
dc.type: Thesis
dc.type.qualificationlevel: Doctoral
dc.type.qualificationname: Doctor of Philosophy (PhD)
dc.type.qualificationtitle: PhD in Information Engineering

Files

Original bundle
Now showing 1 - 1 of 1
Name:
navarro_thesis_final.pdf
Size:
6.53 MB
Format:
Adobe Portable Document Format
Description:
Thesis
Licence
https://creativecommons.org/licenses/by-nc-nd/4.0/
License bundle
Now showing 1 - 1 of 1
Name:
license.txt
Size:
3.8 KB
Format:
Item-specific license agreed to upon submission