Part & Clamp: Efficient Structured Output Learning

Patrick Pletscher, Cheng Soon Ong
Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, PMLR 22:877-885, 2012.

Abstract

Discriminative training for general graphical models is a challenging task, due to the intractability of the partition function. We propose a computationally efficient approach to estimate the partition sum in a structured learning problem. The key idea is a lower bound of the partition sum that can be evaluated in a fixed number of message passing iterations. The bound makes use of a subset of the variables, a feedback vertex set, which allows us to decompose the graph into tractable parts. Furthermore, a tightening strategy for the bound is presented, which finds the states of the feedback vertex set that maximally increase the bound, and clamps them. Based on this lower bound we derive batch and online learning algorithms and demonstrate their effectiveness on a computer vision problem.
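
The key idea lends itself to a small numerical illustration. Below is a minimal Python sketch (not the authors' code; the model, potentials, and names are assumptions made for illustration): on a single cycle of four binary variables, clamping one variable (a feedback vertex set of size one) leaves a chain, so each clamped subproblem is tractable. Because every clamped partition sum is nonnegative, any partial sum over clamped states lower-bounds the full partition sum Z, and adding the clamped states with the largest contributions first tightens the bound fastest.

import itertools
import math

# Toy pairwise model: a single cycle over 4 binary variables. Clamping x0
# (a feedback vertex set of size one) breaks the cycle into a chain, so each
# clamped subproblem is tractable. Since Z = sum_s Z(x0 = s) and every term
# is nonnegative, any partial sum over clamped states is a lower bound on Z.

edges = [(0, 1), (1, 2), (2, 3), (3, 0)]

def log_potential(i, j, xi, xj):
    # Attractive pairwise coupling (values chosen arbitrarily for illustration).
    return 0.5 if xi == xj else -0.5

def score(x):
    # Unnormalized log-probability: unary terms plus pairwise terms.
    s = sum(0.3 * xi for xi in x)
    s += sum(log_potential(i, j, x[i], x[j]) for i, j in edges)
    return s

# Exact partition sum by enumeration (feasible only for a toy model).
Z = sum(math.exp(score(x)) for x in itertools.product([0, 1], repeat=4))

def Z_clamped(s):
    # Partition sum of the cycle-free subproblem with x0 clamped to state s.
    # Here by enumeration; in general it would be computed exactly by message
    # passing on the remaining forest in a fixed number of iterations.
    return sum(math.exp(score((s,) + x)) for x in itertools.product([0, 1], repeat=3))

# Greedy tightening: add the clamped states with the largest contributions first.
bound = 0.0
for s in sorted([0, 1], key=Z_clamped, reverse=True):
    bound += Z_clamped(s)
    print(f"clamped x0={s}: lower bound {bound:.4f} <= Z = {Z:.4f}")

In the paper's setting the clamped subproblems are solved by exact message passing on the tractable parts rather than by enumeration, and the resulting lower bound stands in for the intractable partition sum in the batch and online learning objectives.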

Cite this Paper


BibTeX
@InProceedings{pmlr-v22-pletscher12a,
  title     = {Part \& Clamp: Efficient Structured Output Learning},
  author    = {Pletscher, Patrick and Ong, Cheng Soon},
  booktitle = {Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics},
  pages     = {877--885},
  year      = {2012},
  editor    = {Lawrence, Neil D. and Girolami, Mark},
  volume    = {22},
  series    = {Proceedings of Machine Learning Research},
  address   = {La Palma, Canary Islands},
  month     = {21--23 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v22/pletscher12a/pletscher12a.pdf},
  url       = {https://proceedings.mlr.press/v22/pletscher12a.html},
  abstract  = {Discriminative training for general graphical models is a challenging task, due to the intractability of the partition function. We propose a computationally efficient approach to estimate the partition sum in a structured learning problem. The key idea is a lower bound of the partition sum that can be evaluated in a fixed number of message passing iterations. The bound makes use of a subset of the variables, a feedback vertex set, which allows us to decompose the graph into tractable parts. Furthermore, a tightening strategy for the bound is presented, which finds the states of the feedback vertex set that maximally increase the bound, and clamps them. Based on this lower bound we derive batch and online learning algorithms and demonstrate their effectiveness on a computer vision problem.}
}
Endnote
%0 Conference Paper
%T Part & Clamp: Efficient Structured Output Learning
%A Patrick Pletscher
%A Cheng Soon Ong
%B Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2012
%E Neil D. Lawrence
%E Mark Girolami
%F pmlr-v22-pletscher12a
%I PMLR
%P 877--885
%U https://proceedings.mlr.press/v22/pletscher12a.html
%V 22
%X Discriminative training for general graphical models is a challenging task, due to the intractability of the partition function. We propose a computationally efficient approach to estimate the partition sum in a structured learning problem. The key idea is a lower bound of the partition sum that can be evaluated in a fixed number of message passing iterations. The bound makes use of a subset of the variables, a feedback vertex set, which allows us to decompose the graph into tractable parts. Furthermore, a tightening strategy for the bound is presented, which finds the states of the feedback vertex set that maximally increase the bound, and clamps them. Based on this lower bound we derive batch and online learning algorithms and demonstrate their effectiveness on a computer vision problem.
RIS
TY  - CPAPER
TI  - Part & Clamp: Efficient Structured Output Learning
AU  - Patrick Pletscher
AU  - Cheng Soon Ong
BT  - Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics
DA  - 2012/03/21
ED  - Neil D. Lawrence
ED  - Mark Girolami
ID  - pmlr-v22-pletscher12a
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 22
SP  - 877
EP  - 885
L1  - http://proceedings.mlr.press/v22/pletscher12a/pletscher12a.pdf
UR  - https://proceedings.mlr.press/v22/pletscher12a.html
AB  - Discriminative training for general graphical models is a challenging task, due to the intractability of the partition function. We propose a computationally efficient approach to estimate the partition sum in a structured learning problem. The key idea is a lower bound of the partition sum that can be evaluated in a fixed number of message passing iterations. The bound makes use of a subset of the variables, a feedback vertex set, which allows us to decompose the graph into tractable parts. Furthermore, a tightening strategy for the bound is presented, which finds the states of the feedback vertex set that maximally increase the bound, and clamps them. Based on this lower bound we derive batch and online learning algorithms and demonstrate their effectiveness on a computer vision problem.
ER  -
APA
Pletscher, P. & Ong, C.S. (2012). Part & Clamp: Efficient Structured Output Learning. Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 22:877-885. Available from https://proceedings.mlr.press/v22/pletscher12a.html.
