The Extended Parameter Filter

Yusuf Bugra Erol, Lei Li, Bharath Ramsundar, Stuart Russell
Proceedings of the 30th International Conference on Machine Learning, PMLR 28(3):1103-1111, 2013.

Abstract

The parameters of temporal models, such as dynamic Bayesian networks, may be modelled in a Bayesian context as static or atemporal variables that influence transition probabilities at every time step. Particle filters fail for models that include such variables, while methods that use Gibbs sampling of parameter variables may incur a per-sample cost that grows linearly with the length of the observation sequence. Storvik devised a method for incremental computation of exact sufficient statistics that, for some cases, reduces the per-sample cost to a constant. In this paper, we demonstrate a connection between Storvik’s filter and a Kalman filter in parameter space and establish more general conditions under which Storvik’s filter works. Drawing on an analogy to the extended Kalman filter, we develop and analyze, both theoretically and experimentally, a Taylor approximation to the parameter posterior that allows Storvik’s method to be applied to a broader class of models. Our experiments on both synthetic examples and real applications show improvement over existing methods.
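To make the underlying idea concrete, the following is a minimal, illustrative sketch (not the authors' code) of the Storvik-style sufficient-statistics filter that the paper builds on, applied to a toy scalar model x_t = theta * x_{t-1} + v_t, y_t = x_t + w_t with Gaussian noise and a Gaussian prior on theta. In this conjugate case the parameter posterior has fixed-dimensional sufficient statistics, so each particle can carry them and resample theta at constant per-step cost; the paper's extended parameter filter generalizes this, via a Taylor approximation, to dynamics that are nonlinear in the parameters. The variable names, noise variances, and particle counts below are illustrative assumptions.

# Minimal sketch of Storvik-style parameter filtering on a toy linear-Gaussian
# model. With known noise variances and a Gaussian prior on theta, the
# posterior p(theta | x_{0:t}) is Gaussian with sufficient statistics
# sum(x_{t-1}^2) and sum(x_{t-1} * x_t), stored per particle.
import numpy as np

rng = np.random.default_rng(0)

# Toy data generated from the assumed model (theta_true is illustrative).
T, theta_true, q, r = 200, 0.8, 0.1, 0.5          # steps, true param, process/obs noise variances
x = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    x[t] = theta_true * x[t - 1] + rng.normal(0, np.sqrt(q))
    y[t] = x[t] + rng.normal(0, np.sqrt(r))

N = 500                                           # number of particles
xs = np.zeros(N)                                  # particle states
sxx = np.zeros(N)                                 # sufficient stat: sum of x_{t-1}^2
sxy = np.zeros(N)                                 # sufficient stat: sum of x_{t-1} * x_t
prior_mean, prior_var = 0.0, 1.0                  # Gaussian prior on theta

for t in range(1, T):
    # Sample theta for each particle from its conjugate Gaussian posterior.
    post_var = 1.0 / (1.0 / prior_var + sxx / q)
    post_mean = post_var * (prior_mean / prior_var + sxy / q)
    theta = rng.normal(post_mean, np.sqrt(post_var))

    # Propagate states and weight by the observation likelihood.
    x_new = theta * xs + rng.normal(0, np.sqrt(q), N)
    logw = -0.5 * (y[t] - x_new) ** 2 / r
    w = np.exp(logw - logw.max())
    w /= w.sum()

    # Resample particles together with their sufficient statistics,
    # then incorporate the new transition (x_{t-1}, x_t) into the statistics.
    idx = rng.choice(N, size=N, p=w)
    xs_old = xs[idx]
    xs = x_new[idx]
    sxx = sxx[idx] + xs_old ** 2
    sxy = sxy[idx] + xs_old * xs

post_var = 1.0 / (1.0 / prior_var + sxx / q)
post_mean = post_var * (prior_mean / prior_var + sxy / q)
print("estimated theta (mean over particles):", post_mean.mean())

Because the sufficient statistics have fixed dimension, the per-step cost does not grow with t; the difficulty the paper addresses is that such exact statistics exist only for special (e.g. conjugate, linear-in-parameter) models, which motivates the Taylor-approximation construction in the extended parameter filter.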

Cite this Paper


BibTeX
@InProceedings{pmlr-v28-bugraerol13,
  title     = {The Extended Parameter Filter},
  author    = {Bugra Erol, Yusuf and Li, Lei and Ramsundar, Bharath and Russell, Stuart},
  booktitle = {Proceedings of the 30th International Conference on Machine Learning},
  pages     = {1103--1111},
  year      = {2013},
  editor    = {Dasgupta, Sanjoy and McAllester, David},
  volume    = {28},
  number    = {3},
  series    = {Proceedings of Machine Learning Research},
  address   = {Atlanta, Georgia, USA},
  month     = {17--19 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v28/bugraerol13.pdf},
  url       = {https://proceedings.mlr.press/v28/bugraerol13.html}
}
APA
Bugra Erol, Y., Li, L., Ramsundar, B. & Russell, S. (2013). The Extended Parameter Filter. Proceedings of the 30th International Conference on Machine Learning, in Proceedings of Machine Learning Research 28(3):1103-1111. Available from https://proceedings.mlr.press/v28/bugraerol13.html.
