A PAC-Bayesian Approach for Domain Adaptation with Specialization to Linear Classifiers

Pascal Germain, Amaury Habrard, François Laviolette, Emilie Morvant
Proceedings of the 30th International Conference on Machine Learning, PMLR 28(3):738-746, 2013.

Abstract

We provide a first PAC-Bayesian analysis for domain adaptation (DA) which arises when the learning and test distributions differ. It relies on a novel distribution pseudodistance based on a disagreement averaging. Using this measure, we derive a PAC-Bayesian DA bound for the stochastic Gibbs classifier. This bound has the advantage of being directly optimizable for any hypothesis space. We specialize it to linear classifiers, and design a learning algorithm which shows interesting results on a synthetic problem and on a popular sentiment annotation task. This opens the door to tackling DA tasks by making use of all the PAC-Bayesian tools.
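The pseudodistance the abstract mentions averages the disagreement between pairs of classifiers drawn from the posterior, and compares that average between the source and target distributions. The sketch below is an illustrative Monte Carlo estimate of such a disagreement averaging for linear classifiers, not the paper's exact algorithm; the Gaussian posterior over weight vectors, the sample sizes, and the data distributions are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def gibbs_disagreement(ws, X):
    """Estimate E_{h,h'~rho} E_{x~D} [h(x) != h'(x)] for linear
    classifiers h(x) = sign(w . x), using sampled weights ws (k, d)
    as draws from the posterior rho, and points X (n, d) from D."""
    preds = np.sign(X @ ws.T)          # (n, k) predictions in {-1, +1}
    p_pos = (preds > 0).mean(axis=1)   # P(h(x) = +1) under rho, per point
    # For independent h, h' ~ rho: P(h(x) != h'(x)) = 2 p (1 - p)
    return float(np.mean(2.0 * p_pos * (1.0 - p_pos)))

# Toy source sample and a mean-shifted target sample (the DA setting)
Xs = rng.normal(size=(500, 2))
Xt = rng.normal(loc=[1.5, 0.0], size=(500, 2))

# Illustrative posterior: Gaussian around a fixed weight vector
ws = rng.normal(loc=[1.0, 1.0], scale=0.5, size=(200, 2))

dis_source = gibbs_disagreement(ws, Xs)
dis_target = gibbs_disagreement(ws, Xt)
# Domain disagreement: the gap between the two averaged disagreements
domain_disagreement = abs(dis_source - dis_target)
```

A small `domain_disagreement` suggests the posterior behaves similarly on both domains, which is the quantity this kind of DA bound penalizes; in the paper this term is part of an optimizable bound on the target risk of the Gibbs classifier.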

Cite this Paper


BibTeX
@InProceedings{pmlr-v28-germain13,
  title     = {A PAC-Bayesian Approach for Domain Adaptation with Specialization to Linear Classifiers},
  author    = {Germain, Pascal and Habrard, Amaury and Laviolette, Fran\c{c}ois and Morvant, Emilie},
  booktitle = {Proceedings of the 30th International Conference on Machine Learning},
  pages     = {738--746},
  year      = {2013},
  editor    = {Dasgupta, Sanjoy and McAllester, David},
  volume    = {28},
  number    = {3},
  series    = {Proceedings of Machine Learning Research},
  address   = {Atlanta, Georgia, USA},
  month     = {17--19 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v28/germain13.pdf},
  url       = {https://proceedings.mlr.press/v28/germain13.html},
  abstract  = {We provide a first PAC-Bayesian analysis for domain adaptation (DA) which arises when the learning and test distributions differ. It relies on a novel distribution pseudodistance based on a disagreement averaging. Using this measure, we derive a PAC-Bayesian DA bound for the stochastic Gibbs classifier. This bound has the advantage of being directly optimizable for any hypothesis space. We specialize it to linear classifiers, and design a learning algorithm which shows interesting results on a synthetic problem and on a popular sentiment annotation task. This opens the door to tackling DA tasks by making use of all the PAC-Bayesian tools.}
}
Endnote
%0 Conference Paper
%T A PAC-Bayesian Approach for Domain Adaptation with Specialization to Linear Classifiers
%A Pascal Germain
%A Amaury Habrard
%A François Laviolette
%A Emilie Morvant
%B Proceedings of the 30th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2013
%E Sanjoy Dasgupta
%E David McAllester
%F pmlr-v28-germain13
%I PMLR
%P 738--746
%U https://proceedings.mlr.press/v28/germain13.html
%V 28
%N 3
%X We provide a first PAC-Bayesian analysis for domain adaptation (DA) which arises when the learning and test distributions differ. It relies on a novel distribution pseudodistance based on a disagreement averaging. Using this measure, we derive a PAC-Bayesian DA bound for the stochastic Gibbs classifier. This bound has the advantage of being directly optimizable for any hypothesis space. We specialize it to linear classifiers, and design a learning algorithm which shows interesting results on a synthetic problem and on a popular sentiment annotation task. This opens the door to tackling DA tasks by making use of all the PAC-Bayesian tools.
RIS
TY  - CPAPER
TI  - A PAC-Bayesian Approach for Domain Adaptation with Specialization to Linear Classifiers
AU  - Pascal Germain
AU  - Amaury Habrard
AU  - François Laviolette
AU  - Emilie Morvant
BT  - Proceedings of the 30th International Conference on Machine Learning
DA  - 2013/05/26
ED  - Sanjoy Dasgupta
ED  - David McAllester
ID  - pmlr-v28-germain13
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 28
IS  - 3
SP  - 738
EP  - 746
L1  - http://proceedings.mlr.press/v28/germain13.pdf
UR  - https://proceedings.mlr.press/v28/germain13.html
AB  - We provide a first PAC-Bayesian analysis for domain adaptation (DA) which arises when the learning and test distributions differ. It relies on a novel distribution pseudodistance based on a disagreement averaging. Using this measure, we derive a PAC-Bayesian DA bound for the stochastic Gibbs classifier. This bound has the advantage of being directly optimizable for any hypothesis space. We specialize it to linear classifiers, and design a learning algorithm which shows interesting results on a synthetic problem and on a popular sentiment annotation task. This opens the door to tackling DA tasks by making use of all the PAC-Bayesian tools.
ER  -
APA
Germain, P., Habrard, A., Laviolette, F., & Morvant, E. (2013). A PAC-Bayesian Approach for Domain Adaptation with Specialization to Linear Classifiers. Proceedings of the 30th International Conference on Machine Learning, in Proceedings of Machine Learning Research 28(3):738-746. Available from https://proceedings.mlr.press/v28/germain13.html.