A reversible infinite HMM using normalised random measures

David Knowles, Zoubin Ghahramani, Konstantina Palla
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(2):1998-2006, 2014.

Abstract

We present a nonparametric prior over reversible Markov chains. We use completely random measures, specifically gamma processes, to construct a countably infinite graph with weighted edges. By enforcing symmetry to make the edges undirected we define a prior over random walks on graphs that results in a reversible Markov chain. The resulting prior over infinite transition matrices is closely related to the hierarchical Dirichlet process but enforces reversibility. A reinforcement scheme has recently been proposed with similar properties, but its de Finetti measure is not well characterised. We take the alternative approach of explicitly constructing the mixing measure, which allows more straightforward and efficient inference at the cost of no longer having a closed-form predictive distribution. We use our process to construct a reversible infinite HMM, which we apply to two real datasets, one from epigenomics and one from an ion channel recording.
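The core construction in the abstract (symmetric gamma-distributed edge weights, row-normalised into a transition matrix) can be illustrated with a minimal finite-dimensional sketch. This is not the paper's infinite-dimensional process or its inference scheme: the truncation level K and the gamma hyperparameters below are illustrative assumptions. It does, however, show the key property: a random walk on an undirected weighted graph satisfies detailed balance, so the resulting chain is reversible.

```python
import numpy as np

rng = np.random.default_rng(0)
K = 5  # truncation level: finite stand-in for the countably infinite graph

# Gamma-distributed edge weights, symmetrised so the graph is undirected
# (shape/scale values are illustrative, not the paper's hyperparameters)
W = rng.gamma(shape=1.0, scale=1.0, size=(K, K))
W = (W + W.T) / 2.0

s = W.sum(axis=1)        # total weight incident at each node
P = W / s[:, None]       # row-normalise: random-walk transition matrix
pi = s / s.sum()         # stationary distribution of the walk

# Reversibility = detailed balance: pi_i * P_ij == pi_j * P_ji for all i, j,
# which follows because pi_i * P_ij = W_ij / sum(W) is symmetric in (i, j).
flow = pi[:, None] * P
assert np.allclose(flow, flow.T)
```

The assertion holds for any symmetric non-negative W, which is exactly why enforcing symmetry on the edge weights yields a reversible chain; the paper's contribution is extending this picture to countably infinitely many states via normalised random measures.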

Cite this Paper


BibTeX
@InProceedings{pmlr-v32-knowles14,
  title     = {A reversible infinite HMM using normalised random measures},
  author    = {Knowles, David and Ghahramani, Zoubin and Palla, Konstantina},
  booktitle = {Proceedings of the 31st International Conference on Machine Learning},
  pages     = {1998--2006},
  year      = {2014},
  editor    = {Xing, Eric P. and Jebara, Tony},
  volume    = {32},
  number    = {2},
  series    = {Proceedings of Machine Learning Research},
  address   = {Beijing, China},
  month     = {22--24 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v32/knowles14.pdf},
  url       = {https://proceedings.mlr.press/v32/knowles14.html},
  abstract  = {We present a nonparametric prior over reversible Markov chains. We use completely random measures, specifically gamma processes, to construct a countably infinite graph with weighted edges. By enforcing symmetry to make the edges undirected we define a prior over random walks on graphs that results in a reversible Markov chain. The resulting prior over infinite transition matrices is closely related to the hierarchical Dirichlet process but enforces reversibility. A reinforcement scheme has recently been proposed with similar properties, but its de Finetti measure is not well characterised. We take the alternative approach of explicitly constructing the mixing measure, which allows more straightforward and efficient inference at the cost of no longer having a closed-form predictive distribution. We use our process to construct a reversible infinite HMM, which we apply to two real datasets, one from epigenomics and one from an ion channel recording.}
}
Endnote
%0 Conference Paper
%T A reversible infinite HMM using normalised random measures
%A David Knowles
%A Zoubin Ghahramani
%A Konstantina Palla
%B Proceedings of the 31st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2014
%E Eric P. Xing
%E Tony Jebara
%F pmlr-v32-knowles14
%I PMLR
%P 1998--2006
%U https://proceedings.mlr.press/v32/knowles14.html
%V 32
%N 2
%X We present a nonparametric prior over reversible Markov chains. We use completely random measures, specifically gamma processes, to construct a countably infinite graph with weighted edges. By enforcing symmetry to make the edges undirected we define a prior over random walks on graphs that results in a reversible Markov chain. The resulting prior over infinite transition matrices is closely related to the hierarchical Dirichlet process but enforces reversibility. A reinforcement scheme has recently been proposed with similar properties, but its de Finetti measure is not well characterised. We take the alternative approach of explicitly constructing the mixing measure, which allows more straightforward and efficient inference at the cost of no longer having a closed-form predictive distribution. We use our process to construct a reversible infinite HMM, which we apply to two real datasets, one from epigenomics and one from an ion channel recording.
RIS
TY - CPAPER
TI - A reversible infinite HMM using normalised random measures
AU - David Knowles
AU - Zoubin Ghahramani
AU - Konstantina Palla
BT - Proceedings of the 31st International Conference on Machine Learning
DA - 2014/06/18
ED - Eric P. Xing
ED - Tony Jebara
ID - pmlr-v32-knowles14
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 32
IS - 2
SP - 1998
EP - 2006
L1 - http://proceedings.mlr.press/v32/knowles14.pdf
UR - https://proceedings.mlr.press/v32/knowles14.html
AB - We present a nonparametric prior over reversible Markov chains. We use completely random measures, specifically gamma processes, to construct a countably infinite graph with weighted edges. By enforcing symmetry to make the edges undirected we define a prior over random walks on graphs that results in a reversible Markov chain. The resulting prior over infinite transition matrices is closely related to the hierarchical Dirichlet process but enforces reversibility. A reinforcement scheme has recently been proposed with similar properties, but its de Finetti measure is not well characterised. We take the alternative approach of explicitly constructing the mixing measure, which allows more straightforward and efficient inference at the cost of no longer having a closed-form predictive distribution. We use our process to construct a reversible infinite HMM, which we apply to two real datasets, one from epigenomics and one from an ion channel recording.
ER -
APA
Knowles, D., Ghahramani, Z. & Palla, K. (2014). A reversible infinite HMM using normalised random measures. Proceedings of the 31st International Conference on Machine Learning, in Proceedings of Machine Learning Research 32(2):1998-2006. Available from https://proceedings.mlr.press/v32/knowles14.html.
