Bayesian Max-margin Multi-Task Learning with Data Augmentation

Chengtao Li, Jun Zhu, Jianfei Chen
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(2):415-423, 2014.

Abstract

Both max-margin and Bayesian methods have been extensively studied in multi-task learning, but have rarely been considered together. We present Bayesian max-margin multi-task learning, which conjoins the two schools of methods, thus allowing the discriminative max-margin methods to enjoy the great flexibility of Bayesian methods in incorporating rich prior information as well as performing nonparametric Bayesian feature learning with the latent dimensionality resolved from data. We develop Gibbs sampling algorithms by exploring data augmentation to deal with the non-smooth hinge loss. For nonparametric models, our algorithms do not need to make mean-field assumptions or truncation approximations. Empirical results demonstrate superior performance over competitors in both multi-task classification and regression.
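The data-augmentation idea the abstract refers to is, in the single-task linear-SVM case, the scale-mixture-of-Gaussians representation of the hinge pseudo-likelihood due to Polson and Scott (2011): exp(-2 max(0, 1 - y xᵀw)) is an integral over a latent λ whose conditionals are closed-form, so Gibbs sampling alternates an inverse-Gaussian draw for 1/λᵢ and a Gaussian draw for w. The sketch below illustrates that mechanism on toy data; the function name, toy data, and prior settings are illustrative assumptions, not the authors' multi-task code.

```python
import numpy as np

def gibbs_svm(X, y, prior_var=1.0, n_iter=200, seed=0):
    """Gibbs sampler for a Bayesian linear SVM via the Polson-Scott
    hinge-loss augmentation (illustrative single-task sketch)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    samples = []
    for _ in range(n_iter):
        # 1/lambda_i | w  ~  InverseGaussian(mean = 1/|1 - y_i x_i^T w|, shape = 1)
        margin = 1.0 - y * (X @ w)
        inv_lam = rng.wald(1.0 / np.maximum(np.abs(margin), 1e-8), 1.0)
        # w | lambda  ~  N(B b, B),  B^{-1} = I/v + X^T diag(1/lambda) X,
        # b = sum_i y_i x_i (1 + 1/lambda_i)
        prec = np.eye(d) / prior_var + (X * inv_lam[:, None]).T @ X
        b = X.T @ (y * (1.0 + inv_lam))
        B = np.linalg.inv(prec)
        w = rng.multivariate_normal(B @ b, B)
        samples.append(w)
    return np.array(samples)

# toy two-class data (hypothetical, for illustration only)
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(2, 1, (50, 2)), rng.normal(-2, 1, (50, 2))])
y = np.concatenate([np.ones(50), -np.ones(50)])
ws = gibbs_svm(X, y)
w_mean = ws[100:].mean(axis=0)   # posterior mean after burn-in
acc = np.mean(np.sign(X @ w_mean) == y)
```

Because both conditionals are exact, no mean-field factorization is needed; the paper extends this style of sampler to shared latent features across tasks.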

Cite this Paper


BibTeX
@InProceedings{pmlr-v32-lic14,
  title = {Bayesian Max-margin Multi-Task Learning with Data Augmentation},
  author = {Li, Chengtao and Zhu, Jun and Chen, Jianfei},
  booktitle = {Proceedings of the 31st International Conference on Machine Learning},
  pages = {415--423},
  year = {2014},
  editor = {Xing, Eric P. and Jebara, Tony},
  volume = {32},
  number = {2},
  series = {Proceedings of Machine Learning Research},
  address = {Beijing, China},
  month = {22--24 Jun},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v32/lic14.pdf},
  url = {https://proceedings.mlr.press/v32/lic14.html},
  abstract = {Both max-margin and Bayesian methods have been extensively studied in multi-task learning, but have rarely been considered together. We present Bayesian max-margin multi-task learning, which conjoins the two schools of methods, thus allowing the discriminative max-margin methods to enjoy the great flexibility of Bayesian methods on incorporating rich prior information as well as performing nonparametric Bayesian feature learning with the latent dimensionality resolved from data. We develop Gibbs sampling algorithms by exploring data augmentation to deal with the non-smooth hinge loss. For nonparametric models, our algorithms do not need to make mean-field assumptions or truncated approximation. Empirical results demonstrate superior performance than competitors in both multi-task classification and regression.}
}
Endnote
%0 Conference Paper
%T Bayesian Max-margin Multi-Task Learning with Data Augmentation
%A Chengtao Li
%A Jun Zhu
%A Jianfei Chen
%B Proceedings of the 31st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2014
%E Eric P. Xing
%E Tony Jebara
%F pmlr-v32-lic14
%I PMLR
%P 415--423
%U https://proceedings.mlr.press/v32/lic14.html
%V 32
%N 2
%X Both max-margin and Bayesian methods have been extensively studied in multi-task learning, but have rarely been considered together. We present Bayesian max-margin multi-task learning, which conjoins the two schools of methods, thus allowing the discriminative max-margin methods to enjoy the great flexibility of Bayesian methods on incorporating rich prior information as well as performing nonparametric Bayesian feature learning with the latent dimensionality resolved from data. We develop Gibbs sampling algorithms by exploring data augmentation to deal with the non-smooth hinge loss. For nonparametric models, our algorithms do not need to make mean-field assumptions or truncated approximation. Empirical results demonstrate superior performance than competitors in both multi-task classification and regression.
RIS
TY - CPAPER
TI - Bayesian Max-margin Multi-Task Learning with Data Augmentation
AU - Chengtao Li
AU - Jun Zhu
AU - Jianfei Chen
BT - Proceedings of the 31st International Conference on Machine Learning
DA - 2014/06/18
ED - Eric P. Xing
ED - Tony Jebara
ID - pmlr-v32-lic14
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 32
IS - 2
SP - 415
EP - 423
L1 - http://proceedings.mlr.press/v32/lic14.pdf
UR - https://proceedings.mlr.press/v32/lic14.html
AB - Both max-margin and Bayesian methods have been extensively studied in multi-task learning, but have rarely been considered together. We present Bayesian max-margin multi-task learning, which conjoins the two schools of methods, thus allowing the discriminative max-margin methods to enjoy the great flexibility of Bayesian methods on incorporating rich prior information as well as performing nonparametric Bayesian feature learning with the latent dimensionality resolved from data. We develop Gibbs sampling algorithms by exploring data augmentation to deal with the non-smooth hinge loss. For nonparametric models, our algorithms do not need to make mean-field assumptions or truncated approximation. Empirical results demonstrate superior performance than competitors in both multi-task classification and regression.
ER -
APA
Li, C., Zhu, J. & Chen, J. (2014). Bayesian Max-margin Multi-Task Learning with Data Augmentation. Proceedings of the 31st International Conference on Machine Learning, in Proceedings of Machine Learning Research 32(2):415-423. Available from https://proceedings.mlr.press/v32/lic14.html.