A Geometric Algorithm for Scalable Multiple Kernel Learning

John Moeller, Parasaran Raman, Suresh Venkatasubramanian, Avishek Saha
Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics, PMLR 33:633-642, 2014.

Abstract

We present a geometric formulation of the Multiple Kernel Learning (MKL) problem. To do so, we reinterpret the problem of learning kernel weights as searching for a kernel that maximizes the minimum (kernel) distance between two convex polytopes. This interpretation, combined with additional structural insights from our geometric formulation, allows us to reduce the MKL problem to a simple optimization routine that yields provable convergence as well as quality guarantees. As a result, our method scales efficiently to much larger data sets than most prior methods can handle. Empirical evaluation on eleven datasets shows that our method is significantly faster and even compares favorably with a uniform, unweighted combination of kernels.
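To make the geometric picture concrete, here is a minimal sketch of the objective the abstract describes, written in our own notation (the paper itself may parameterize or constrain things differently). Writing $k_\mu = \sum_{t=1}^m \mu_t k_t$ for a convex combination of the $m$ base kernels, with induced feature map $\phi_\mu$ and RKHS $\mathcal{H}_\mu$, the kernel weights are chosen to maximize the distance between the convex hulls of the two classes:

$$
\max_{\mu \in \Delta_m} \;\; \min_{\alpha \in \Delta_+,\; \beta \in \Delta_-} \;
\Big\| \sum_{i:\, y_i = +1} \alpha_i\, \phi_\mu(x_i) \;-\; \sum_{j:\, y_j = -1} \beta_j\, \phi_\mu(x_j) \Big\|_{\mathcal{H}_\mu},
$$

where $\Delta_m$, $\Delta_+$, and $\Delta_-$ denote probability simplices over the kernels, the positive examples, and the negative examples, respectively. The two inner sums range over the convex polytopes mentioned in the abstract, and the norm (hence the distance) can be evaluated entirely through $k_\mu$ without forming $\phi_\mu$ explicitly.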

Cite this Paper


BibTeX
@InProceedings{pmlr-v33-moeller14,
  title     = {{A Geometric Algorithm for Scalable Multiple Kernel Learning}},
  author    = {Moeller, John and Raman, Parasaran and Venkatasubramanian, Suresh and Saha, Avishek},
  booktitle = {Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics},
  pages     = {633--642},
  year      = {2014},
  editor    = {Kaski, Samuel and Corander, Jukka},
  volume    = {33},
  series    = {Proceedings of Machine Learning Research},
  address   = {Reykjavik, Iceland},
  month     = {22--25 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v33/moeller14.pdf},
  url       = {https://proceedings.mlr.press/v33/moeller14.html},
  abstract  = {We present a geometric formulation of the Multiple Kernel Learning (MKL) problem. To do so, we reinterpret the problem of learning kernel weights as searching for a kernel that maximizes the minimum (kernel) distance between two convex polytopes. This interpretation, combined with additional structural insights from our geometric formulation, allows us to reduce the MKL problem to a simple optimization routine that yields provable convergence as well as quality guarantees. As a result, our method scales efficiently to much larger data sets than most prior methods can handle. Empirical evaluation on eleven datasets shows that our method is significantly faster and even compares favorably with a uniform, unweighted combination of kernels.}
}
Endnote
%0 Conference Paper
%T A Geometric Algorithm for Scalable Multiple Kernel Learning
%A John Moeller
%A Parasaran Raman
%A Suresh Venkatasubramanian
%A Avishek Saha
%B Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2014
%E Samuel Kaski
%E Jukka Corander
%F pmlr-v33-moeller14
%I PMLR
%P 633--642
%U https://proceedings.mlr.press/v33/moeller14.html
%V 33
%X We present a geometric formulation of the Multiple Kernel Learning (MKL) problem. To do so, we reinterpret the problem of learning kernel weights as searching for a kernel that maximizes the minimum (kernel) distance between two convex polytopes. This interpretation, combined with additional structural insights from our geometric formulation, allows us to reduce the MKL problem to a simple optimization routine that yields provable convergence as well as quality guarantees. As a result, our method scales efficiently to much larger data sets than most prior methods can handle. Empirical evaluation on eleven datasets shows that our method is significantly faster and even compares favorably with a uniform, unweighted combination of kernels.
RIS
TY - CPAPER
TI - A Geometric Algorithm for Scalable Multiple Kernel Learning
AU - John Moeller
AU - Parasaran Raman
AU - Suresh Venkatasubramanian
AU - Avishek Saha
BT - Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics
DA - 2014/04/02
ED - Samuel Kaski
ED - Jukka Corander
ID - pmlr-v33-moeller14
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 33
SP - 633
EP - 642
L1 - http://proceedings.mlr.press/v33/moeller14.pdf
UR - https://proceedings.mlr.press/v33/moeller14.html
AB - We present a geometric formulation of the Multiple Kernel Learning (MKL) problem. To do so, we reinterpret the problem of learning kernel weights as searching for a kernel that maximizes the minimum (kernel) distance between two convex polytopes. This interpretation, combined with additional structural insights from our geometric formulation, allows us to reduce the MKL problem to a simple optimization routine that yields provable convergence as well as quality guarantees. As a result, our method scales efficiently to much larger data sets than most prior methods can handle. Empirical evaluation on eleven datasets shows that our method is significantly faster and even compares favorably with a uniform, unweighted combination of kernels.
ER -
APA
Moeller, J., Raman, P., Venkatasubramanian, S. & Saha, A. (2014). A Geometric Algorithm for Scalable Multiple Kernel Learning. Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 33:633-642. Available from https://proceedings.mlr.press/v33/moeller14.html.
