Particle Belief Propagation

Alexander Ihler, David McAllester
Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, PMLR 5:256-263, 2009.

Abstract

The popularity of particle filtering for inference in Markov chain models defined over random variables with very large or continuous domains makes it natural to consider sample-based versions of belief propagation (BP) for more general (tree-structured or loopy) graphs. Several such algorithms have already been proposed in the literature. However, many questions remain open about the behavior of particle-based BP algorithms, and little theory has been developed to analyze their performance. In this paper, we describe a generic particle belief propagation (PBP) algorithm which is closely related to previously proposed methods. We prove that this algorithm is consistent, approaching the true BP messages as the number of samples grows large. We then use concentration bounds to analyze the finite-sample behavior and give a convergence rate for the algorithm on tree-structured graphs. Our convergence rate is $O(1/\sqrt{n})$, where $n$ is the number of samples, independent of the domain size of the variables.
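For readers new to PBP, the sketch below illustrates the kind of update the paper analyzes: the integral in a BP message from node $u$ to node $v$ is replaced by an importance-weighted average over particles drawn from a proposal distribution at $u$. This is a minimal Python illustration under assumed conventions, not the authors' code; the names psi_uv, psi_u, w_u and the vectorized call signatures are hypothetical.

# Illustrative sketch (not from the paper): one particle-BP message update
# on a pairwise Markov random field, estimating m_{u->v} at the particles
# of v by importance sampling over the particles of u.
import numpy as np

def pbp_message(x_u, w_u, x_v, psi_uv, psi_u, incoming):
    """Estimate the BP message m_{u->v} evaluated at the particles x_v.

    x_u      : (n,) particles for node u, drawn from a proposal W_u
    w_u      : (n,) proposal densities W_u(x_u[i]) (importance correction)
    x_v      : (m,) particles for node v at which the message is needed
    psi_uv   : pairwise potential, assumed vectorized as psi_uv(x_u, xv)
    psi_u    : local potential at u, vectorized over x_u
    incoming : list of (n,) arrays m_{w->u}(x_u) from u's neighbors w != v
               (empty list if u is a leaf)
    """
    # "Pre-message" at each particle of u: local potential times the product
    # of incoming messages, divided by the proposal density.
    pre = psi_u(x_u) * np.prod(incoming, axis=0) / w_u        # shape (n,)
    # Each entry is a Monte Carlo estimate of the BP integral over x_u,
    # with error decaying as O(1/sqrt(n)) regardless of the domain size.
    msg = np.array([np.mean(pre * psi_uv(x_u, xv)) for xv in x_v])
    return msg / msg.sum()  # normalize for numerical stability

On a tree, scheduling these updates from the leaves to the root and back yields particle estimates of every node's belief; the $O(1/\sqrt{n})$ rate quoted in the abstract corresponds to this per-message Monte Carlo error.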

Cite this Paper

BibTeX
@InProceedings{pmlr-v5-ihler09a,
  title     = {Particle Belief Propagation},
  author    = {Ihler, Alexander and McAllester, David},
  booktitle = {Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics},
  pages     = {256--263},
  year      = {2009},
  editor    = {van Dyk, David and Welling, Max},
  volume    = {5},
  series    = {Proceedings of Machine Learning Research},
  address   = {Hilton Clearwater Beach Resort, Clearwater Beach, Florida USA},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v5/ihler09a/ihler09a.pdf},
  url       = {https://proceedings.mlr.press/v5/ihler09a.html}
}
Endnote
%0 Conference Paper
%T Particle Belief Propagation
%A Alexander Ihler
%A David McAllester
%B Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2009
%E David van Dyk
%E Max Welling
%F pmlr-v5-ihler09a
%I PMLR
%P 256--263
%U https://proceedings.mlr.press/v5/ihler09a.html
%V 5
RIS
TY  - CPAPER
TI  - Particle Belief Propagation
AU  - Alexander Ihler
AU  - David McAllester
BT  - Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
DA  - 2009/04/15
ED  - David van Dyk
ED  - Max Welling
ID  - pmlr-v5-ihler09a
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 5
SP  - 256
EP  - 263
L1  - http://proceedings.mlr.press/v5/ihler09a/ihler09a.pdf
UR  - https://proceedings.mlr.press/v5/ihler09a.html
ER  -
APA
Ihler, A. & McAllester, D. (2009). Particle Belief Propagation. Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 5:256-263. Available from https://proceedings.mlr.press/v5/ihler09a.html.
