Robust Forward Algorithms via PAC-Bayes and Laplace Distributions

Asaf Noy, Koby Crammer
Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics, PMLR 33:678-686, 2014.

Abstract

Laplace random variables are commonly used to model extreme noise in many fields, and systems trained to handle such noise are often characterized by robustness properties. We introduce new learning algorithms that minimize objectives derived directly from PAC-Bayes bounds incorporating Laplace distributions. The resulting algorithms are regularized by the Huber loss function and are robust to noise, since the Laplace distribution integrates over large deviations of the parameters. We analyze the convexity properties of the objective and propose several bounds that are fully convex, two of which are jointly convex in the mean and standard deviation under certain conditions. We derive new forward algorithms analogous to recent boosting algorithms, providing novel relations between boosting and PAC-Bayes analysis. Experiments show that our algorithms outperform AdaBoost, L1-LogBoost, and RobustBoost across a wide range of input noise.
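For reference, the standard forms of the objects named in the abstract are given below in LaTeX. These are the textbook definitions (including a McAllester-style PAC-Bayes bound in Maurer's tightened form), not necessarily the exact bound or parameterization derived in the paper:

% Laplace density with location \mu and scale b > 0; its heavy (exponential)
% tails are what make it a common model for extreme noise:
f(x \mid \mu, b) = \frac{1}{2b} \exp\!\left( -\frac{|x - \mu|}{b} \right)

% Huber loss with threshold \gamma > 0 (often written with \delta, renamed here
% to avoid clashing with the PAC-Bayes confidence parameter below): quadratic
% for small residuals, linear for large ones, hence robust to outliers:
\ell_\gamma(r) =
\begin{cases}
  \tfrac{1}{2} r^2, & |r| \le \gamma, \\[2pt]
  \gamma \bigl( |r| - \tfrac{\gamma}{2} \bigr), & |r| > \gamma.
\end{cases}

% A standard PAC-Bayes bound: for any prior P over hypotheses, with probability
% at least 1 - \delta over an i.i.d. sample of size m, simultaneously for all
% posteriors Q, the expected true risk L is bounded via the empirical risk \hat{L}:
\mathbb{E}_{h \sim Q}[L(h)]
  \le \mathbb{E}_{h \sim Q}[\hat{L}(h)]
  + \sqrt{ \frac{ \mathrm{KL}(Q \,\|\, P) + \ln \frac{2\sqrt{m}}{\delta} }{ 2m } }

The paper's approach is to instantiate such a bound with Laplace prior and posterior distributions and minimize the resulting objective directly; the Huber-like behavior arises from integrating the loss against the Laplace posterior.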

Cite this Paper


BibTeX
@InProceedings{pmlr-v33-noy14,
  title     = {{Robust Forward Algorithms via PAC-Bayes and Laplace Distributions}},
  author    = {Noy, Asaf and Crammer, Koby},
  booktitle = {Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics},
  pages     = {678--686},
  year      = {2014},
  editor    = {Kaski, Samuel and Corander, Jukka},
  volume    = {33},
  series    = {Proceedings of Machine Learning Research},
  address   = {Reykjavik, Iceland},
  month     = {22--25 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v33/noy14.pdf},
  url       = {https://proceedings.mlr.press/v33/noy14.html}
}
APA
Noy, A. & Crammer, K. (2014). Robust Forward Algorithms via PAC-Bayes and Laplace Distributions. Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 33:678-686. Available from https://proceedings.mlr.press/v33/noy14.html.
