Low Rank Approximation using Error Correcting Coding Matrices

Shashanka Ubaru, Arya Mazumdar, Yousef Saad
Proceedings of the 32nd International Conference on Machine Learning, PMLR 37:702-710, 2015.

Abstract

Low-rank matrix approximation is an integral component of tools such as principal component analysis (PCA), and an important instrument in applications such as web search models, text mining, and computer vision, e.g., face recognition. Recently, randomized algorithms have been proposed to construct low rank approximations of large matrices efficiently. In this paper, we show how matrices from error correcting codes can be used to find such low rank approximations. The benefits of using these code matrices are the following: (i) they are easy to generate and significantly reduce the amount of randomness required; (ii) code matrices have low coherence and a better chance of preserving the geometry of an entire subspace of vectors; (iii) unlike Fourier transforms or Hadamard matrices, which require sampling O(k log k) columns for a rank-k approximation, code matrices do not need the log factor; (iv) under certain conditions, the approximation errors can be smaller, and the singular values obtained more accurate, than those obtained using Gaussian random matrices and other structured random matrices.
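The idea behind claims (i)-(iv) is the standard randomized range finder with the Gaussian test matrix swapped for a sketching matrix whose rows are codewords of an error correcting code. Below is a minimal Python sketch, not the paper's implementation: a random binary linear code stands in for the structured algebraic codes the paper analyzes (e.g., dual BCH codes), and the helper names `code_sketch_matrix` and `randomized_low_rank` are hypothetical.

```python
import numpy as np

def code_sketch_matrix(n, ell, rng):
    """Return an n x ell sketching matrix whose rows are BPSK-mapped
    codewords of a binary linear code, scaled by 1/sqrt(ell).

    NOTE: purely illustrative; the paper samples codewords of structured
    algebraic codes, not of a random linear code as done here.
    """
    r = int(np.ceil(np.log2(n)))                        # need 2^r >= n codewords
    G = rng.integers(0, 2, size=(r, ell))               # generator matrix over GF(2)
    msgs = (np.arange(n)[:, None] >> np.arange(r)) & 1  # first n length-r messages
    C = msgs @ G % 2                                    # n codewords of length ell
    return (-1.0) ** C / np.sqrt(ell)                   # BPSK map: 0 -> +1, 1 -> -1

def randomized_low_rank(A, k, oversample=10, rng=None):
    """Rank-k approximation via a randomized range finder (Halko et al. style),
    with a code matrix replacing the usual Gaussian test matrix."""
    if rng is None:
        rng = np.random.default_rng(0)
    m, n = A.shape
    ell = k + oversample                  # no O(k log k) column count needed
    Omega = code_sketch_matrix(n, ell, rng)
    Y = A @ Omega                         # sample the range of A
    Q, _ = np.linalg.qr(Y)                # orthonormal basis for range(Y)
    B = Q.T @ A                           # small ell x n projection
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return Q @ Ub[:, :k], s[:k], Vt[:k]   # A ~= U @ diag(s) @ Vt
```

For example, `U, s, Vt = randomized_low_rank(A, k=20)` on a 500 x 300 matrix touches A only in two matrix products, and the oversampling parameter plays the role of the additive term in the paper's ℓ = k + p column count.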

Cite this Paper


BibTeX
@InProceedings{pmlr-v37-ubaru15,
  title     = {Low Rank Approximation using Error Correcting Coding Matrices},
  author    = {Ubaru, Shashanka and Mazumdar, Arya and Saad, Yousef},
  booktitle = {Proceedings of the 32nd International Conference on Machine Learning},
  pages     = {702--710},
  year      = {2015},
  editor    = {Bach, Francis and Blei, David},
  volume    = {37},
  series    = {Proceedings of Machine Learning Research},
  address   = {Lille, France},
  month     = {07--09 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v37/ubaru15.pdf},
  url       = {https://proceedings.mlr.press/v37/ubaru15.html}
}
APA
Ubaru, S., Mazumdar, A. & Saad, Y. (2015). Low Rank Approximation using Error Correcting Coding Matrices. Proceedings of the 32nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 37:702-710. Available from https://proceedings.mlr.press/v37/ubaru15.html.
