Learning Low-order Models for Enforcing High-order Statistics

Patrick Pletscher, Pushmeet Kohli
Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, PMLR 22:886-894, 2012.

Abstract

Models such as pairwise conditional random fields (CRFs) are extremely popular in computer vision and various other machine learning disciplines. However, they have limited expressive power and often cannot represent the posterior distribution correctly. When learning the parameters of such models with insufficient expressivity, researchers use loss functions to penalize certain misrepresentations of the solution space. Until now, researchers have used only simplistic loss functions, such as the Hamming loss, to enable efficient inference. This paper shows how sophisticated and useful higher-order loss functions can be incorporated into the learning process. These loss functions ensure that the MAP solution does not deviate much from the ground truth in terms of certain higher-order statistics. We propose a learning algorithm which uses the recently proposed lower-envelope representation of higher-order functions to transform them into pairwise functions, which allow efficient inference. We test the efficacy of our method on the problem of foreground-background image segmentation. Experimental results show that incorporating higher-order loss functions in the learning formulation using our method leads to much better results than those obtained with the traditional Hamming loss.
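To make the construction described in the abstract concrete, the following is a minimal sketch, not taken from the paper itself, of how a higher-order term defined on a label statistic can be handled via the lower-envelope representation; the symbols E, f, a_q, b_{q,i} and the choice of foreground area as the statistic are illustrative assumptions. A higher-order function over binary labels \(y \in \{0,1\}^n\) written as a lower envelope of \(Q\) linear functions,

\[
f(y) \;=\; \min_{q \in \{1,\dots,Q\}} \Big( a_q + \sum_{i} b_{q,i}\, y_i \Big),
\]

can be minimized jointly with a pairwise energy \(E(y)\) by introducing a selector variable \(z \in \{1,\dots,Q\}\):

\[
\min_{y} \big( E(y) + f(y) \big) \;=\; \min_{y,\, z} \Big( E(y) + a_z + \sum_{i} b_{z,i}\, y_i \Big),
\]

where the right-hand side contains only unary terms on \(z\) and pairwise terms between \(z\) and each \(y_i\), so standard pairwise inference applies. A loss that penalizes the deviation of a higher-order statistic, such as the predicted foreground area \(\sum_i y_i\), from its ground-truth value is, for a concave piecewise-linear penalty, exactly such a lower envelope, which is what allows it to be folded into the loss-augmented inference step during learning.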

Cite this Paper


BibTeX
@InProceedings{pmlr-v22-pletscher12b, title = {Learning Low-order Models for Enforcing High-order Statistics}, author = {Pletscher, Patrick and Kohli, Pushmeet}, booktitle = {Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics}, pages = {886--894}, year = {2012}, editor = {Lawrence, Neil D. and Girolami, Mark}, volume = {22}, series = {Proceedings of Machine Learning Research}, address = {La Palma, Canary Islands}, month = {21--23 Apr}, publisher = {PMLR}, pdf = {http://proceedings.mlr.press/v22/pletscher12b/pletscher12b.pdf}, url = {https://proceedings.mlr.press/v22/pletscher12b.html} }
Endnote
%0 Conference Paper %T Learning Low-order Models for Enforcing High-order Statistics %A Patrick Pletscher %A Pushmeet Kohli %B Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics %C Proceedings of Machine Learning Research %D 2012 %E Neil D. Lawrence %E Mark Girolami %F pmlr-v22-pletscher12b %I PMLR %P 886--894 %U https://proceedings.mlr.press/v22/pletscher12b.html %V 22
RIS
TY - CPAPER TI - Learning Low-order Models for Enforcing High-order Statistics AU - Patrick Pletscher AU - Pushmeet Kohli BT - Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics DA - 2012/03/21 ED - Neil D. Lawrence ED - Mark Girolami ID - pmlr-v22-pletscher12b PB - PMLR DP - Proceedings of Machine Learning Research VL - 22 SP - 886 EP - 894 L1 - http://proceedings.mlr.press/v22/pletscher12b/pletscher12b.pdf UR - https://proceedings.mlr.press/v22/pletscher12b.html ER -
APA
Pletscher, P. & Kohli, P. (2012). Learning Low-order Models for Enforcing High-order Statistics. Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 22:886-894. Available from https://proceedings.mlr.press/v22/pletscher12b.html.
