On learning parametric-output HMMs

Aryeh Kontorovich, Boaz Nadler, Roi Weiss
Proceedings of the 30th International Conference on Machine Learning, PMLR 28(3):702-710, 2013.

Abstract

We present a novel approach to learning an HMM whose outputs are distributed according to a parametric family. This is done by decoupling the learning task into two steps: first estimating the output parameters, and then estimating the hidden states transition probabilities. The first step is accomplished by fitting a mixture model to the output stationary distribution. Given the parameters of this mixture model, the second step is formulated as the solution of an easily solvable convex quadratic program. We provide an error analysis for the estimated transition probabilities and show they are robust to small perturbations in the estimates of the mixture parameters. Finally, we support our analysis with some encouraging empirical results.
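The two-step scheme described above can be illustrated with a small numerical sketch (not the authors' implementation; all parameter values are made up for the demo). Step 1 fits a Gaussian mixture to the pooled outputs by EM, exploiting the fact that the stationary output distribution is a mixture. Step 2 here uses a simple row-normalized plug-in estimate from consecutive-pair responsibilities, where the paper instead solves a convex quadratic program:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a 2-state HMM with Gaussian outputs (illustration only).
A_true = np.array([[0.9, 0.1],
                   [0.2, 0.8]])                  # hidden-state transition matrix
mus_true, sigma = np.array([-3.0, 3.0]), 1.0     # well-separated output means
T = 50_000
states = np.empty(T, dtype=int)
states[0] = 0
for t in range(1, T):
    states[t] = rng.choice(2, p=A_true[states[t - 1]])
x = rng.normal(mus_true[states], sigma)

# Step 1: fit a 2-component Gaussian mixture (shared variance) to the
# pooled outputs by EM, ignoring temporal order.
mu, w, s = np.array([-1.0, 1.0]), np.array([0.5, 0.5]), 1.0
for _ in range(200):
    logp = -0.5 * ((x[:, None] - mu) / s) ** 2 + np.log(w)   # up to a constant
    r = np.exp(logp - logp.max(axis=1, keepdims=True))
    r /= r.sum(axis=1, keepdims=True)                        # responsibilities
    w = r.mean(axis=0)
    mu = (r * x[:, None]).sum(axis=0) / r.sum(axis=0)
    s = np.sqrt((r * (x[:, None] - mu) ** 2).sum() / T)      # shared std. dev.

# Step 2: estimate transitions from consecutive-pair responsibilities.
# joint[i, j] approximates P(s_t = i, s_{t+1} = j); row-normalizing gives
# a crude plug-in estimate of A (the paper's convex QP replaces this step).
joint = r[:-1].T @ r[1:]
A_hat = joint / joint.sum(axis=1, keepdims=True)
```

With well-separated output components the plug-in estimate lands close to `A_true`; the QP formulation in the paper additionally enforces that the estimate is a valid stochastic matrix and comes with the perturbation analysis mentioned in the abstract.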

Cite this Paper


BibTeX
@InProceedings{pmlr-v28-kontorovich13,
  title =     {On learning parametric-output HMMs},
  author =    {Kontorovich, Aryeh and Nadler, Boaz and Weiss, Roi},
  booktitle = {Proceedings of the 30th International Conference on Machine Learning},
  pages =     {702--710},
  year =      {2013},
  editor =    {Dasgupta, Sanjoy and McAllester, David},
  volume =    {28},
  number =    {3},
  series =    {Proceedings of Machine Learning Research},
  address =   {Atlanta, Georgia, USA},
  month =     {17--19 Jun},
  publisher = {PMLR},
  pdf =       {http://proceedings.mlr.press/v28/kontorovich13.pdf},
  url =       {https://proceedings.mlr.press/v28/kontorovich13.html},
  abstract =  {We present a novel approach to learning an HMM whose outputs are distributed according to a parametric family. This is done by \emph{decoupling} the learning task into two steps: first estimating the output parameters, and then estimating the hidden states transition probabilities. The first step is accomplished by fitting a mixture model to the output stationary distribution. Given the parameters of this mixture model, the second step is formulated as the solution of an easily solvable convex quadratic program. We provide an error analysis for the estimated transition probabilities and show they are robust to small perturbations in the estimates of the mixture parameters. Finally, we support our analysis with some encouraging empirical results.}
}
Endnote
%0 Conference Paper
%T On learning parametric-output HMMs
%A Aryeh Kontorovich
%A Boaz Nadler
%A Roi Weiss
%B Proceedings of the 30th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2013
%E Sanjoy Dasgupta
%E David McAllester
%F pmlr-v28-kontorovich13
%I PMLR
%P 702--710
%U https://proceedings.mlr.press/v28/kontorovich13.html
%V 28
%N 3
%X We present a novel approach to learning an HMM whose outputs are distributed according to a parametric family. This is done by decoupling the learning task into two steps: first estimating the output parameters, and then estimating the hidden states transition probabilities. The first step is accomplished by fitting a mixture model to the output stationary distribution. Given the parameters of this mixture model, the second step is formulated as the solution of an easily solvable convex quadratic program. We provide an error analysis for the estimated transition probabilities and show they are robust to small perturbations in the estimates of the mixture parameters. Finally, we support our analysis with some encouraging empirical results.
RIS
TY - CPAPER
TI - On learning parametric-output HMMs
AU - Aryeh Kontorovich
AU - Boaz Nadler
AU - Roi Weiss
BT - Proceedings of the 30th International Conference on Machine Learning
DA - 2013/05/26
ED - Sanjoy Dasgupta
ED - David McAllester
ID - pmlr-v28-kontorovich13
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 28
IS - 3
SP - 702
EP - 710
L1 - http://proceedings.mlr.press/v28/kontorovich13.pdf
UR - https://proceedings.mlr.press/v28/kontorovich13.html
AB - We present a novel approach to learning an HMM whose outputs are distributed according to a parametric family. This is done by decoupling the learning task into two steps: first estimating the output parameters, and then estimating the hidden states transition probabilities. The first step is accomplished by fitting a mixture model to the output stationary distribution. Given the parameters of this mixture model, the second step is formulated as the solution of an easily solvable convex quadratic program. We provide an error analysis for the estimated transition probabilities and show they are robust to small perturbations in the estimates of the mixture parameters. Finally, we support our analysis with some encouraging empirical results.
ER -
APA
Kontorovich, A., Nadler, B., & Weiss, R. (2013). On learning parametric-output HMMs. Proceedings of the 30th International Conference on Machine Learning, in Proceedings of Machine Learning Research 28(3):702-710. Available from https://proceedings.mlr.press/v28/kontorovich13.html.