A Generalized Kernel Approach to Structured Output Learning

Hachem Kadri, Mohammad Ghavamzadeh, Philippe Preux
Proceedings of the 30th International Conference on Machine Learning, PMLR 28(1):471-479, 2013.

Abstract

We study the problem of structured output learning from a regression perspective. We first provide a general formulation of the kernel dependency estimation (KDE) approach to this problem using operator-valued kernels. Our formulation overcomes the two main limitations of the original KDE approach, namely the decoupling between outputs in the image space and the inability to use a joint feature space. We then propose a covariance-based operator-valued kernel that allows us to take into account the structure of the kernel feature space. This kernel operates on the output space and only encodes the interactions between the outputs without any reference to the input space. To address this issue, we introduce a variant of our KDE method based on the conditional covariance operator that, in addition to the correlation between the outputs, takes into account the effects of the input variables. Finally, we evaluate the performance of our KDE approach using both covariance and conditional covariance kernels on three structured output problems, and compare it to state-of-the-art kernel-based structured output regression methods.
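The covariance-based construction described above can be illustrated with a minimal numerical sketch: a scalar kernel on the inputs is combined with an empirical (centered) covariance of the output feature map, computed entirely from Gram matrices. This is an illustrative toy example, not the authors' implementation; the RBF kernel choice, the toy data, and all variable names are assumptions.

```python
import numpy as np

def rbf_gram(X, gamma=1.0):
    """RBF Gram matrix from pairwise squared distances."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-gamma * d2)

# Toy data (hypothetical): n inputs, structured outputs encoded as vectors.
rng = np.random.default_rng(0)
n = 20
X = rng.normal(size=(n, 3))
Y = rng.normal(size=(n, 4))

Kx = rbf_gram(X)   # scalar kernel on the input space
Ky = rbf_gram(Y)   # scalar kernel on the output space

# Empirical covariance of the output feature map, expressed in Gram form
# via the centering matrix H.
H = np.eye(n) - np.ones((n, n)) / n
T = H @ Ky @ H / n

# On the training sample, an operator-valued kernel of the separable form
# K(x_i, x_j) = k_X(x_i, x_j) * T reduces to a Kronecker product,
# so kernel ridge regression becomes one regularized linear solve.
lam = 1e-2
K_big = np.kron(Kx, T)                       # (n*n, n*n) block kernel matrix
rhs = Ky.reshape(-1)                         # vectorized output-feature targets
alpha = np.linalg.solve(K_big + lam * np.eye(n * n), rhs)
```

A pre-image step (mapping the predicted output feature back to a structured output) would still be needed at test time, as in any KDE-style method; it is omitted here.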

Cite this Paper


BibTeX
@InProceedings{pmlr-v28-kadri13,
  title = {A Generalized Kernel Approach to Structured Output Learning},
  author = {Kadri, Hachem and Ghavamzadeh, Mohammad and Preux, Philippe},
  booktitle = {Proceedings of the 30th International Conference on Machine Learning},
  pages = {471--479},
  year = {2013},
  editor = {Dasgupta, Sanjoy and McAllester, David},
  volume = {28},
  number = {1},
  series = {Proceedings of Machine Learning Research},
  address = {Atlanta, Georgia, USA},
  month = {17--19 Jun},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v28/kadri13.pdf},
  url = {https://proceedings.mlr.press/v28/kadri13.html},
  abstract = {We study the problem of structured output learning from a regression perspective. We first provide a general formulation of the kernel dependency estimation (KDE) approach to this problem using operator-valued kernels. Our formulation overcomes the two main limitations of the original KDE approach, namely the decoupling between outputs in the image space and the inability to use a joint feature space. We then propose a covariance-based operator-valued kernel that allows us to take into account the structure of the kernel feature space. This kernel operates on the output space and only encodes the interactions between the outputs without any reference to the input space. To address this issue, we introduce a variant of our KDE method based on the conditional covariance operator that, in addition to the correlation between the outputs, takes into account the effects of the input variables. Finally, we evaluate the performance of our KDE approach using both covariance and conditional covariance kernels on three structured output problems, and compare it to state-of-the-art kernel-based structured output regression methods.}
}
Endnote
%0 Conference Paper
%T A Generalized Kernel Approach to Structured Output Learning
%A Hachem Kadri
%A Mohammad Ghavamzadeh
%A Philippe Preux
%B Proceedings of the 30th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2013
%E Sanjoy Dasgupta
%E David McAllester
%F pmlr-v28-kadri13
%I PMLR
%P 471--479
%U https://proceedings.mlr.press/v28/kadri13.html
%V 28
%N 1
%X We study the problem of structured output learning from a regression perspective. We first provide a general formulation of the kernel dependency estimation (KDE) approach to this problem using operator-valued kernels. Our formulation overcomes the two main limitations of the original KDE approach, namely the decoupling between outputs in the image space and the inability to use a joint feature space. We then propose a covariance-based operator-valued kernel that allows us to take into account the structure of the kernel feature space. This kernel operates on the output space and only encodes the interactions between the outputs without any reference to the input space. To address this issue, we introduce a variant of our KDE method based on the conditional covariance operator that, in addition to the correlation between the outputs, takes into account the effects of the input variables. Finally, we evaluate the performance of our KDE approach using both covariance and conditional covariance kernels on three structured output problems, and compare it to state-of-the-art kernel-based structured output regression methods.
RIS
TY - CPAPER
TI - A Generalized Kernel Approach to Structured Output Learning
AU - Hachem Kadri
AU - Mohammad Ghavamzadeh
AU - Philippe Preux
BT - Proceedings of the 30th International Conference on Machine Learning
DA - 2013/02/13
ED - Sanjoy Dasgupta
ED - David McAllester
ID - pmlr-v28-kadri13
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 28
IS - 1
SP - 471
EP - 479
L1 - http://proceedings.mlr.press/v28/kadri13.pdf
UR - https://proceedings.mlr.press/v28/kadri13.html
AB - We study the problem of structured output learning from a regression perspective. We first provide a general formulation of the kernel dependency estimation (KDE) approach to this problem using operator-valued kernels. Our formulation overcomes the two main limitations of the original KDE approach, namely the decoupling between outputs in the image space and the inability to use a joint feature space. We then propose a covariance-based operator-valued kernel that allows us to take into account the structure of the kernel feature space. This kernel operates on the output space and only encodes the interactions between the outputs without any reference to the input space. To address this issue, we introduce a variant of our KDE method based on the conditional covariance operator that, in addition to the correlation between the outputs, takes into account the effects of the input variables. Finally, we evaluate the performance of our KDE approach using both covariance and conditional covariance kernels on three structured output problems, and compare it to state-of-the-art kernel-based structured output regression methods.
ER -
APA
Kadri, H., Ghavamzadeh, M. & Preux, P. (2013). A Generalized Kernel Approach to Structured Output Learning. Proceedings of the 30th International Conference on Machine Learning, in Proceedings of Machine Learning Research 28(1):471-479. Available from https://proceedings.mlr.press/v28/kadri13.html.
