Stochastic Optimization for Multiview Representation Learning using Partial Least Squares

Raman Arora, Poorya Mianjy, Teodor Marinov
Proceedings of The 33rd International Conference on Machine Learning, PMLR 48:1786-1794, 2016.

Abstract

Partial Least Squares (PLS) is a ubiquitous statistical technique for bilinear factor analysis. It is used in many data analysis, machine learning, and information retrieval applications to model the covariance structure between a pair of data matrices. In this paper, we consider PLS for representation learning in a multiview setting where we have more than one view of the data at training time. Furthermore, instead of framing PLS as a problem about a fixed given data set, we argue that PLS should be studied as a stochastic optimization problem, especially in a "big data" setting, with the goal of optimizing a population objective based on a sample. This view suggests using Stochastic Approximation (SA) approaches, such as Stochastic Gradient Descent (SGD), and enables a rigorous analysis of their benefits. In this paper, we develop SA approaches to PLS and provide iteration complexity bounds for the proposed algorithms.
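To make the stochastic-optimization view concrete, the sketch below runs a rank-1 stochastic-approximation update for PLS: it estimates the leading left/right singular vectors (u, v) of the population cross-covariance E[xy^T] from one (x, y) sample at a time, using an Oja-style normalized SGD step. This is an illustrative toy, not the paper's algorithm; the function name, step-size schedule, and iteration count are assumptions for demonstration.

```python
import numpy as np

def stochastic_pls_rank1(sample_pair, dx, dy, n_iters=5000, step=0.1, seed=0):
    """Illustrative SA sketch for rank-1 PLS (not the paper's method).

    Estimates the top singular vector pair of E[x y^T] by a stochastic
    power/Oja-style update with projection back to the unit sphere.
    """
    rng = np.random.default_rng(seed)
    u = rng.standard_normal(dx)
    u /= np.linalg.norm(u)
    v = rng.standard_normal(dy)
    v /= np.linalg.norm(v)
    for t in range(1, n_iters + 1):
        x, y = sample_pair()              # one fresh sample from each view
        eta = step / np.sqrt(t)           # decaying step size (assumed schedule)
        # x * (y @ v) and y * (x @ u) are unbiased stochastic gradients
        # of the bilinear objective u^T E[x y^T] v.
        u = u + eta * x * (y @ v)
        v = v + eta * y * (x @ u)
        u /= np.linalg.norm(u)            # keep iterates on the unit sphere
        v /= np.linalg.norm(v)
    return u, v
```

The key point this illustrates is the one the abstract makes: each update touches a single sample, so the cost per iteration is independent of the dataset size, and what matters is the number of iterations needed to reach a target accuracy on the population objective.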

Cite this Paper


BibTeX
@InProceedings{pmlr-v48-aroraa16,
  title     = {Stochastic Optimization for Multiview Representation Learning using Partial Least Squares},
  author    = {Arora, Raman and Mianjy, Poorya and Marinov, Teodor},
  booktitle = {Proceedings of The 33rd International Conference on Machine Learning},
  pages     = {1786--1794},
  year      = {2016},
  editor    = {Balcan, Maria Florina and Weinberger, Kilian Q.},
  volume    = {48},
  series    = {Proceedings of Machine Learning Research},
  address   = {New York, New York, USA},
  month     = {20--22 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v48/aroraa16.pdf},
  url       = {https://proceedings.mlr.press/v48/aroraa16.html},
  abstract  = {Partial Least Squares (PLS) is a ubiquitous statistical technique for bilinear factor analysis. It is used in many data analysis, machine learning, and information retrieval applications to model the covariance structure between a pair of data matrices. In this paper, we consider PLS for representation learning in a multiview setting where we have more than one view in data at training time. Furthermore, instead of framing PLS as a problem about a fixed given data set, we argue that PLS should be studied as a stochastic optimization problem, especially in a "big data" setting, with the goal of optimizing a population objective based on sample. This view suggests using Stochastic Approximation (SA) approaches, such as Stochastic Gradient Descent (SGD) and enables a rigorous analysis of their benefits. In this paper, we develop SA approaches to PLS and provide iteration complexity bounds for the proposed algorithms.}
}
Endnote
%0 Conference Paper
%T Stochastic Optimization for Multiview Representation Learning using Partial Least Squares
%A Raman Arora
%A Poorya Mianjy
%A Teodor Marinov
%B Proceedings of The 33rd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2016
%E Maria Florina Balcan
%E Kilian Q. Weinberger
%F pmlr-v48-aroraa16
%I PMLR
%P 1786--1794
%U https://proceedings.mlr.press/v48/aroraa16.html
%V 48
%X Partial Least Squares (PLS) is a ubiquitous statistical technique for bilinear factor analysis. It is used in many data analysis, machine learning, and information retrieval applications to model the covariance structure between a pair of data matrices. In this paper, we consider PLS for representation learning in a multiview setting where we have more than one view in data at training time. Furthermore, instead of framing PLS as a problem about a fixed given data set, we argue that PLS should be studied as a stochastic optimization problem, especially in a "big data" setting, with the goal of optimizing a population objective based on sample. This view suggests using Stochastic Approximation (SA) approaches, such as Stochastic Gradient Descent (SGD) and enables a rigorous analysis of their benefits. In this paper, we develop SA approaches to PLS and provide iteration complexity bounds for the proposed algorithms.
RIS
TY  - CPAPER
TI  - Stochastic Optimization for Multiview Representation Learning using Partial Least Squares
AU  - Raman Arora
AU  - Poorya Mianjy
AU  - Teodor Marinov
BT  - Proceedings of The 33rd International Conference on Machine Learning
DA  - 2016/06/11
ED  - Maria Florina Balcan
ED  - Kilian Q. Weinberger
ID  - pmlr-v48-aroraa16
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 48
SP  - 1786
EP  - 1794
L1  - http://proceedings.mlr.press/v48/aroraa16.pdf
UR  - https://proceedings.mlr.press/v48/aroraa16.html
AB  - Partial Least Squares (PLS) is a ubiquitous statistical technique for bilinear factor analysis. It is used in many data analysis, machine learning, and information retrieval applications to model the covariance structure between a pair of data matrices. In this paper, we consider PLS for representation learning in a multiview setting where we have more than one view in data at training time. Furthermore, instead of framing PLS as a problem about a fixed given data set, we argue that PLS should be studied as a stochastic optimization problem, especially in a "big data" setting, with the goal of optimizing a population objective based on sample. This view suggests using Stochastic Approximation (SA) approaches, such as Stochastic Gradient Descent (SGD) and enables a rigorous analysis of their benefits. In this paper, we develop SA approaches to PLS and provide iteration complexity bounds for the proposed algorithms.
ER  -
APA
Arora, R., Mianjy, P., & Marinov, T. (2016). Stochastic Optimization for Multiview Representation Learning using Partial Least Squares. Proceedings of The 33rd International Conference on Machine Learning, in Proceedings of Machine Learning Research 48:1786-1794. Available from https://proceedings.mlr.press/v48/aroraa16.html.