Recovering the Optimal Solution by Dual Random Projection

Lijun Zhang, Mehrdad Mahdavi, Rong Jin, Tianbao Yang, Shenghuo Zhu
Proceedings of the 26th Annual Conference on Learning Theory, PMLR 30:135-157, 2013.

Abstract

Random projection has been widely used in data classification. It maps high-dimensional data into a low-dimensional subspace in order to reduce the computational cost of solving the related optimization problem. While previous studies have focused on analyzing the classification performance obtained with random projection, in this work we consider the recovery problem, i.e., how to accurately recover the optimal solution to the original optimization problem in the high-dimensional space from the solution learned in the subspace spanned by random projections. We present a simple algorithm, termed Dual Random Projection, that uses the dual solution of the low-dimensional optimization problem to recover the optimal solution to the original problem. Our theoretical analysis shows that, with high probability, the proposed algorithm accurately recovers the optimal solution to the original problem, provided that the data matrix is of low rank or can be well approximated by a low-rank matrix.
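The recovery idea can be illustrated with a short sketch: for an ℓ2-regularized linear classifier, the optimal weight vector is a combination of the training examples weighted by the dual variables, so the dual solution of the projected problem can be combined with the original high-dimensional features instead of being mapped back through the projection matrix. The code below is a minimal sketch of this idea, assuming a linear SVM (scikit-learn's SVC) as the low-dimensional solver; the synthetic low-rank data, the Gaussian projection, and the hinge-loss choice are illustrative assumptions rather than the paper's exact formulation.

```python
# Minimal sketch of dual random projection recovery (illustrative, not the
# paper's exact algorithm): solve the projected problem, take its dual
# solution, and recombine it with the ORIGINAL high-dimensional data.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n, d, m = 500, 2000, 100          # samples, original dim, projected dim

# Synthetic (approximately) low-rank data with labels from a planted direction.
U = rng.standard_normal((n, 10))
V = rng.standard_normal((10, d))
X = U @ V                          # rank ~10 data matrix
y = np.sign(X @ (V.T @ rng.standard_normal(10)))

# Random projection R maps d-dimensional examples to an m-dimensional subspace.
R = rng.standard_normal((d, m)) / np.sqrt(m)
X_low = X @ R

# Solve the low-dimensional problem and keep its dual solution.
clf = SVC(kernel="linear", C=1.0).fit(X_low, y)

# Naive recovery: map the low-dimensional primal solution back through R.
w_naive = R @ clf.coef_.ravel()

# Dual recovery: combine the dual variables with the original high-dim data.
# For SVC, dual_coef_ stores y_i * alpha_i for the support vectors.
w_dual = (clf.dual_coef_ @ X[clf.support_]).ravel()

# Reference: the solution of the full high-dimensional problem.
w_star = SVC(kernel="linear", C=1.0).fit(X, y).coef_.ravel()

def rel_err(w):
    return np.linalg.norm(w - w_star) / np.linalg.norm(w_star)

print(f"naive recovery error: {rel_err(w_naive):.3f}")
print(f"dual  recovery error: {rel_err(w_dual):.3f}")
```

In this setup, because the data matrix is (nearly) low rank, the dual variables of the projected problem stay close to those of the original problem, so the dual recovery is typically far more accurate than simply mapping the low-dimensional solution back through R.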

Cite this Paper


BibTeX
@InProceedings{pmlr-v30-Zhang13a,
  title     = {Recovering the Optimal Solution by Dual Random Projection},
  author    = {Zhang, Lijun and Mahdavi, Mehrdad and Jin, Rong and Yang, Tianbao and Zhu, Shenghuo},
  booktitle = {Proceedings of the 26th Annual Conference on Learning Theory},
  pages     = {135--157},
  year      = {2013},
  editor    = {Shalev-Shwartz, Shai and Steinwart, Ingo},
  volume    = {30},
  series    = {Proceedings of Machine Learning Research},
  address   = {Princeton, NJ, USA},
  month     = {12--14 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v30/Zhang13a.pdf},
  url       = {https://proceedings.mlr.press/v30/Zhang13a.html}
}
APA
Zhang, L., Mahdavi, M., Jin, R., Yang, T. & Zhu, S. (2013). Recovering the Optimal Solution by Dual Random Projection. Proceedings of the 26th Annual Conference on Learning Theory, in Proceedings of Machine Learning Research 30:135-157. Available from https://proceedings.mlr.press/v30/Zhang13a.html.

Related Material

Download PDF: http://proceedings.mlr.press/v30/Zhang13a.pdf