Random Fourier Features For Operator-Valued Kernels

Romain Brault, Markus Heinonen, Florence Buc
Proceedings of The 8th Asian Conference on Machine Learning, PMLR 63:110-125, 2016.

Abstract

Devoted to multi-task learning and structured output learning, operator-valued kernels provide a flexible tool to build vector-valued functions in the context of Reproducing Kernel Hilbert Spaces. To scale up these methods, we extend the celebrated Random Fourier Feature methodology to obtain an approximation of operator-valued kernels. We propose a general principle for Operator-valued Random Fourier Feature construction relying on a generalization of Bochner's theorem for translation-invariant operator-valued Mercer kernels. We prove the uniform convergence of the kernel approximation for bounded and unbounded operator random Fourier features using an appropriate Bernstein matrix concentration inequality. An experimental proof-of-concept shows the quality of the approximation and the efficiency of the corresponding linear models on example datasets.
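
The paper's general construction is not reproduced here, but the idea can be illustrated on the simplest special case: a decomposable kernel K(x, z) = k(x - z) A, where k is a scalar Gaussian kernel and A is a fixed positive semi-definite output matrix. By Bochner's theorem, k(x - z) = E_w[cos(w^T(x - z))] with w drawn from the kernel's spectral measure, so Monte-Carlo sampling of w gives a finite-dimensional feature map whose inner products approximate k, and tensoring with A lifts this to the operator-valued kernel. The sketch below is a minimal NumPy illustration under these assumptions; the function name, parameters, and the restriction to decomposable kernels are ours, not the paper's.

```python
import numpy as np

def orff_decomposable_gaussian(X, Z, A, sigma=1.0, D=500, seed=None):
    """Illustrative sketch (not the paper's code): approximate the decomposable
    operator-valued kernel K(x, z) = exp(-||x - z||^2 / (2 sigma^2)) * A,
    with A a p x p PSD matrix, via random Fourier features on the scalar factor."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Spectral measure of the Gaussian kernel: w ~ N(0, sigma^{-2} I)
    W = rng.normal(scale=1.0 / sigma, size=(D, d))
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    # Scalar random Fourier feature map: phi(x) in R^D
    phi = lambda M: np.sqrt(2.0 / D) * np.cos(M @ W.T + b)
    # Approximate scalar Gram matrix, then expand to the block (operator-valued) Gram matrix
    k_approx = phi(X) @ phi(Z).T          # n x m
    return np.kron(k_approx, A)           # (n*p) x (m*p)

# Quick check of the approximation quality: compare against the exact kernel.
X = np.random.randn(20, 3)
A = np.array([[1.0, 0.5], [0.5, 1.0]])
K_tilde = orff_decomposable_gaussian(X, X, A, sigma=1.0, D=2000, seed=0)
sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
K_exact = np.kron(np.exp(-sq_dists / 2.0), A)
print(np.max(np.abs(K_tilde - K_exact)))  # shrinks as D grows
```

Increasing the number of features D tightens the approximation, which is a rough empirical analogue of the uniform convergence result stated in the abstract.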

Cite this Paper


BibTeX
@InProceedings{pmlr-v63-Brault39,
  title     = {Random Fourier Features For Operator-Valued Kernels},
  author    = {Brault, Romain and Heinonen, Markus and Buc, Florence},
  booktitle = {Proceedings of The 8th Asian Conference on Machine Learning},
  pages     = {110--125},
  year      = {2016},
  editor    = {Durrant, Robert J. and Kim, Kee-Eung},
  volume    = {63},
  series    = {Proceedings of Machine Learning Research},
  address   = {The University of Waikato, Hamilton, New Zealand},
  month     = {16--18 Nov},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v63/Brault39.pdf},
  url       = {https://proceedings.mlr.press/v63/Brault39.html},
  abstract  = {Devoted to multi-task learning and structured output learning, operator-valued kernels provide a flexible tool to build vector-valued functions in the context of Reproducing Kernel Hilbert Spaces. To scale up these methods, we extend the celebrated Random Fourier Feature methodology to get an approximation of operator-valued kernels. We propose a general principle for Operator-valued Random Fourier Feature construction relying on a generalization of Bochner’s theorem for translation-invariant operator-valued Mercer kernels. We prove the uniform convergence of the kernel approximation for bounded and unbounded operator random Fourier features using appropriate Bernstein matrix concentration inequality. An experimental proof-of-concept shows the quality of the approximation and the efficiency of the corresponding linear models on example datasets.}
}
Endnote
%0 Conference Paper
%T Random Fourier Features For Operator-Valued Kernels
%A Romain Brault
%A Markus Heinonen
%A Florence Buc
%B Proceedings of The 8th Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2016
%E Robert J. Durrant
%E Kee-Eung Kim
%F pmlr-v63-Brault39
%I PMLR
%P 110--125
%U https://proceedings.mlr.press/v63/Brault39.html
%V 63
%X Devoted to multi-task learning and structured output learning, operator-valued kernels provide a flexible tool to build vector-valued functions in the context of Reproducing Kernel Hilbert Spaces. To scale up these methods, we extend the celebrated Random Fourier Feature methodology to get an approximation of operator-valued kernels. We propose a general principle for Operator-valued Random Fourier Feature construction relying on a generalization of Bochner’s theorem for translation-invariant operator-valued Mercer kernels. We prove the uniform convergence of the kernel approximation for bounded and unbounded operator random Fourier features using appropriate Bernstein matrix concentration inequality. An experimental proof-of-concept shows the quality of the approximation and the efficiency of the corresponding linear models on example datasets.
RIS
TY - CPAPER
TI - Random Fourier Features For Operator-Valued Kernels
AU - Romain Brault
AU - Markus Heinonen
AU - Florence Buc
BT - Proceedings of The 8th Asian Conference on Machine Learning
DA - 2016/11/20
ED - Robert J. Durrant
ED - Kee-Eung Kim
ID - pmlr-v63-Brault39
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 63
SP - 110
EP - 125
L1 - http://proceedings.mlr.press/v63/Brault39.pdf
UR - https://proceedings.mlr.press/v63/Brault39.html
AB - Devoted to multi-task learning and structured output learning, operator-valued kernels provide a flexible tool to build vector-valued functions in the context of Reproducing Kernel Hilbert Spaces. To scale up these methods, we extend the celebrated Random Fourier Feature methodology to get an approximation of operator-valued kernels. We propose a general principle for Operator-valued Random Fourier Feature construction relying on a generalization of Bochner’s theorem for translation-invariant operator-valued Mercer kernels. We prove the uniform convergence of the kernel approximation for bounded and unbounded operator random Fourier features using appropriate Bernstein matrix concentration inequality. An experimental proof-of-concept shows the quality of the approximation and the efficiency of the corresponding linear models on example datasets.
ER -
APA
Brault, R., Heinonen, M. & Buc, F. (2016). Random Fourier Features For Operator-Valued Kernels. Proceedings of The 8th Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 63:110-125. Available from https://proceedings.mlr.press/v63/Brault39.html.