On large lag smoothing for hidden Markov models
Authors
Houssineau, J
Jasra, A
Singh, Sumeetpal
Publication Date
2019-01-01
Journal Title
SIAM Journal on Numerical Analysis
ISSN
0036-1429
Publisher
Society for Industrial and Applied Mathematics
Volume
57
Issue
6
Pages
2812-2828
Type
Article
This Version
AM
Citation
Houssineau, J., Jasra, A., & Singh, S. (2019). On large lag smoothing for hidden Markov models. SIAM Journal on Numerical Analysis, 57(6), 2812-2828. https://doi.org/10.1137/18M1198004
Abstract
In this article we consider the smoothing problem for hidden Markov models (HMMs).
Given a hidden Markov chain $\{X_n\}_{n\geq 0}$ and observations $\{Y_n\}_{n\geq 0}$, our
objective is to compute $\mathbb{E}[\varphi(X_0,\dots,X_k)|y_{0},\dots,y_n]$ for some real-valued, integrable
functional $\varphi$ and $k$ fixed, $k \ll n$ and for some realisation $(y_0,\dots,y_n)$ of $(Y_0,\dots,Y_n)$. We introduce a novel application of the multilevel Monte Carlo (MLMC) method with a coupling
based on the Knothe-Rosenblatt rearrangement. We prove that this method can approximate the aforementioned quantity with a mean square error (MSE) of $\mathcal{O}(\epsilon^2)$,
for arbitrary $\epsilon>0$, at a cost of $\mathcal{O}(\epsilon^{-2})$. This contrasts with a direct Monte Carlo method, which requires a cost of $\mathcal{O}(n\epsilon^{-2})$ for the same MSE.
The approach we suggest is not, in general, possible to implement exactly, so the optimal transport methodology of \cite{span, parno} is used, which directly
approximates our strategy. We show that our theoretical improvements are achieved, even under approximation, in several numerical examples.
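To give a flavour of the estimator structure the abstract describes, the following is a minimal toy sketch (not the paper's HMM smoother): a multilevel Monte Carlo telescoping sum in which samples at consecutive levels are coupled by reusing the same uniform variate through the inverse CDF, which is the one-dimensional case of the Knothe-Rosenblatt rearrangement. The target, levels, and sample-size schedule are all illustrative assumptions.

```python
import numpy as np

# Toy target: E[X] = 1 for X ~ Exp(1), with inverse CDF F^{-1}(u) = -log(1 - u).
# Level-l "approximation": quantise u to the midpoint of a 2^l uniform grid cell,
# so the approximation converges to the exact sampler as l grows.

rng = np.random.default_rng(0)

def sample_level(u, l):
    """Level-l approximation of F^{-1}(u) via midpoint quantisation of u."""
    cells = 2 ** l
    u_l = (np.floor(u * cells) + 0.5) / cells
    return -np.log1p(-u_l)  # -log(1 - u_l)

L = 8  # finest level
estimate = 0.0
for l in range(L + 1):
    n = 2 ** (14 - l)        # fewer samples at finer (more expensive) levels
    u = rng.random(n)        # shared uniforms: the 1D Knothe-Rosenblatt coupling
    if l == 0:
        estimate += sample_level(u, 0).mean()
    else:
        # Telescoping correction: both terms use the SAME u, so the difference
        # has small variance and needs comparatively few samples.
        diff = sample_level(u, l) - sample_level(u, l - 1)
        estimate += diff.mean()

mlmc_estimate = estimate
print(f"MLMC estimate of E[X]: {mlmc_estimate:.4f} (exact value: 1.0)")
```

The cost saving comes from the coupling: because consecutive levels share the same uniform, the correction terms shrink rapidly with the level, so most of the sampling budget sits at the cheap coarse levels.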
Sponsorship
Alan Turing Institute
Identifiers
External DOI: https://doi.org/10.1137/18M1198004
This record's URL: https://www.repository.cam.ac.uk/handle/1810/296419
Rights
All rights reserved