Sparse coding for multitask and transfer learning

Andreas Maurer, Massi Pontil, Bernardino Romera-Paredes
Proceedings of the 30th International Conference on Machine Learning, PMLR 28(2):343-351, 2013.

Abstract

We investigate the use of sparse coding and dictionary learning in the context of multitask and transfer learning. The central assumption of our learning method is that the task parameters are well approximated by sparse linear combinations of the atoms of a dictionary in a high- or infinite-dimensional space. This assumption, together with the large quantity of data available in the multitask and transfer learning settings, allows a principled choice of the dictionary. We provide bounds on the generalization error of this approach for both settings. Numerical experiments on one synthetic and two real datasets show the advantage of our method over single-task learning, a previous method based on orthogonal and dense representations of the tasks, and a related method that learns task groupings.
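To make the central assumption concrete: each task's weight vector w_t is modeled as approximately D g_t, where D is a dictionary with unit-norm atoms and g_t is a sparse code vector. The sketch below is an illustrative alternating-minimization scheme in NumPy (ISTA for the code step, least squares plus renormalization for the dictionary step); it is not the authors' algorithm, and all function names, hyperparameters, and optimization details are assumptions made for illustration.

import numpy as np

def soft_threshold(x, t):
    # Proximal operator of the l1 norm.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def sparse_code(D, w, lam, n_iter=200):
    # ISTA for: min_g 0.5 * ||w - D g||^2 + lam * ||g||_1.
    L = np.linalg.norm(D, 2) ** 2  # Lipschitz constant of the smooth part
    g = np.zeros(D.shape[1])
    for _ in range(n_iter):
        g = soft_threshold(g - (D.T @ (D @ g - w)) / L, lam / L)
    return g

def learn_dictionary(W, K, lam, n_outer=50, seed=0):
    # Illustrative sketch (not the paper's method): W is d x T, one
    # column per task weight vector; K is the number of atoms.
    rng = np.random.default_rng(seed)
    d, T = W.shape
    D = rng.standard_normal((d, K))
    D /= np.linalg.norm(D, axis=0)  # unit-norm atoms
    for _ in range(n_outer):
        # Code step: a sparse code for every task.
        G = np.column_stack([sparse_code(D, W[:, t], lam) for t in range(T)])
        # Dictionary step: least squares, then renormalize atoms
        # (a common heuristic to keep atoms on the unit sphere).
        D = W @ np.linalg.pinv(G)
        D /= np.maximum(np.linalg.norm(D, axis=0), 1e-12)
    return D, G

# Usage example on synthetic data: 50 task vectors in R^20 built from
# 5 shared atoms; all sizes here are arbitrary choices for the demo.
rng = np.random.default_rng(1)
D_true = rng.standard_normal((20, 5))
D_true /= np.linalg.norm(D_true, axis=0)
codes = rng.standard_normal((5, 50)) * (rng.random((5, 50)) < 0.4)
W = D_true @ codes
D_hat, G_hat = learn_dictionary(W, K=5, lam=0.05)

In the transfer setting, the learned dictionary D_hat is kept fixed and only a new sparse code is fit for an incoming task, which is what makes a principled, data-driven choice of dictionary valuable.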

Cite this Paper


BibTeX
@InProceedings{pmlr-v28-maurer13,
  title     = {Sparse coding for multitask and transfer learning},
  author    = {Maurer, Andreas and Pontil, Massi and Romera-Paredes, Bernardino},
  booktitle = {Proceedings of the 30th International Conference on Machine Learning},
  pages     = {343--351},
  year      = {2013},
  editor    = {Dasgupta, Sanjoy and McAllester, David},
  volume    = {28},
  number    = {2},
  series    = {Proceedings of Machine Learning Research},
  address   = {Atlanta, Georgia, USA},
  month     = {17--19 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v28/maurer13.pdf},
  url       = {https://proceedings.mlr.press/v28/maurer13.html}
}
Endnote
%0 Conference Paper
%T Sparse coding for multitask and transfer learning
%A Andreas Maurer
%A Massi Pontil
%A Bernardino Romera-Paredes
%B Proceedings of the 30th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2013
%E Sanjoy Dasgupta
%E David McAllester
%F pmlr-v28-maurer13
%I PMLR
%P 343--351
%U https://proceedings.mlr.press/v28/maurer13.html
%V 28
%N 2
RIS
TY - CPAPER
TI - Sparse coding for multitask and transfer learning
AU - Andreas Maurer
AU - Massi Pontil
AU - Bernardino Romera-Paredes
BT - Proceedings of the 30th International Conference on Machine Learning
DA - 2013/05/13
ED - Sanjoy Dasgupta
ED - David McAllester
ID - pmlr-v28-maurer13
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 28
IS - 2
SP - 343
EP - 351
L1 - http://proceedings.mlr.press/v28/maurer13.pdf
UR - https://proceedings.mlr.press/v28/maurer13.html
ER -
APA
Maurer, A., Pontil, M. & Romera-Paredes, B. (2013). Sparse coding for multitask and transfer learning. Proceedings of the 30th International Conference on Machine Learning, in Proceedings of Machine Learning Research 28(2):343-351. Available from https://proceedings.mlr.press/v28/maurer13.html.