Limits on Sparse Support Recovery via Linear Sketching with Random Expander Matrices

Jonathan Scarlett, Volkan Cevher
Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, PMLR 51:149-158, 2016.

Abstract

Linear sketching is a powerful tool for the problem of sparse signal recovery, having numerous applications such as compressive sensing, data stream computing, graph sketching, and routing. Motivated by applications where the positions of the non-zero entries in a sparse vector are of primary interest, we consider the problem of support recovery from a linear sketch taking the form Y = Xβ + Z. We focus on a widely-used expander-based construction in which the columns of the measurement matrix X ∈ ℝ^{n×p} are random permutations of a sparse binary vector containing d ≪ n ones and n − d zeros. We provide a sharp characterization of the number of measurements required for an information-theoretically optimal decoder, thus permitting a precise comparison to the i.i.d. Gaussian construction. Our findings reveal both positive and negative results, showing that the performance nearly matches the Gaussian construction at moderate-to-high noise levels, while being worse by an arbitrarily large factor at low noise levels.
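The measurement model in the abstract can be sketched numerically. The following is a minimal NumPy illustration (not code from the paper): each column of X is an independent random permutation of a binary vector with d ones and n − d zeros, and the sketch is Y = Xβ + Z. The dimensions n, p, d, the sparsity k of β, and the noise level are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not from the paper):
# n measurements, ambient dimension p, d ones per column, k-sparse signal.
n, p, d, k = 50, 200, 5, 3

# Each column of X is a random permutation of a binary vector
# containing d ones and n - d zeros.
base = np.zeros(n)
base[:d] = 1.0
X = np.column_stack([rng.permutation(base) for _ in range(p)])

# A k-sparse signal beta; its support is what a decoder would recover.
beta = np.zeros(p)
support = rng.choice(p, size=k, replace=False)
beta[support] = 1.0

# Additive noise and the resulting linear sketch Y = X beta + Z.
Z = 0.1 * rng.standard_normal(n)
Y = X @ beta + Z
```

By construction every column of X has exactly d ones, which is the expander-based structure the paper analyzes in contrast to a dense i.i.d. Gaussian X.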

Cite this Paper


BibTeX
@InProceedings{pmlr-v51-scarlett16,
  title     = {Limits on Sparse Support Recovery via Linear Sketching with Random Expander Matrices},
  author    = {Scarlett, Jonathan and Cevher, Volkan},
  booktitle = {Proceedings of the 19th International Conference on Artificial Intelligence and Statistics},
  pages     = {149--158},
  year      = {2016},
  editor    = {Gretton, Arthur and Robert, Christian C.},
  volume    = {51},
  series    = {Proceedings of Machine Learning Research},
  address   = {Cadiz, Spain},
  month     = {09--11 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v51/scarlett16.pdf},
  url       = {https://proceedings.mlr.press/v51/scarlett16.html},
  abstract  = {Linear sketching is a powerful tool for the problem of sparse signal recovery, having numerous applications such as compressive sensing, data stream computing, graph sketching, and routing. Motivated by applications where the \emph{positions} of the non-zero entries in a sparse vector are of primary interest, we consider the problem of \emph{support recovery} from a linear sketch taking the form $\mathbf{Y} = \mathbf{X}\beta + \mathbf{Z}$. We focus on a widely-used expander-based construction in which the columns of the measurement matrix $\mathbf{X} \in \mathbb{R}^{n \times p}$ are random permutations of a sparse binary vector containing $d \ll n$ ones and $n - d$ zeros. We provide a sharp characterization of the number of measurements required for an information-theoretically optimal decoder, thus permitting a precise comparison to the i.i.d. Gaussian construction. Our findings reveal both positive and negative results, showing that the performance nearly matches the Gaussian construction at moderate-to-high noise levels, while being worse by an arbitrarily large factor at low noise levels.}
}
Endnote
%0 Conference Paper
%T Limits on Sparse Support Recovery via Linear Sketching with Random Expander Matrices
%A Jonathan Scarlett
%A Volkan Cevher
%B Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2016
%E Arthur Gretton
%E Christian C. Robert
%F pmlr-v51-scarlett16
%I PMLR
%P 149--158
%U https://proceedings.mlr.press/v51/scarlett16.html
%V 51
%X Linear sketching is a powerful tool for the problem of sparse signal recovery, having numerous applications such as compressive sensing, data stream computing, graph sketching, and routing. Motivated by applications where the positions of the non-zero entries in a sparse vector are of primary interest, we consider the problem of support recovery from a linear sketch taking the form Y = Xβ + Z. We focus on a widely-used expander-based construction in which the columns of the measurement matrix X ∈ ℝ^{n×p} are random permutations of a sparse binary vector containing d ≪ n ones and n − d zeros. We provide a sharp characterization of the number of measurements required for an information-theoretically optimal decoder, thus permitting a precise comparison to the i.i.d. Gaussian construction. Our findings reveal both positive and negative results, showing that the performance nearly matches the Gaussian construction at moderate-to-high noise levels, while being worse by an arbitrarily large factor at low noise levels.
RIS
TY - CPAPER
TI - Limits on Sparse Support Recovery via Linear Sketching with Random Expander Matrices
AU - Jonathan Scarlett
AU - Volkan Cevher
BT - Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
DA - 2016/05/02
ED - Arthur Gretton
ED - Christian C. Robert
ID - pmlr-v51-scarlett16
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 51
SP - 149
EP - 158
L1 - http://proceedings.mlr.press/v51/scarlett16.pdf
UR - https://proceedings.mlr.press/v51/scarlett16.html
AB - Linear sketching is a powerful tool for the problem of sparse signal recovery, having numerous applications such as compressive sensing, data stream computing, graph sketching, and routing. Motivated by applications where the positions of the non-zero entries in a sparse vector are of primary interest, we consider the problem of support recovery from a linear sketch taking the form Y = Xβ + Z. We focus on a widely-used expander-based construction in which the columns of the measurement matrix X ∈ ℝ^{n×p} are random permutations of a sparse binary vector containing d ≪ n ones and n − d zeros. We provide a sharp characterization of the number of measurements required for an information-theoretically optimal decoder, thus permitting a precise comparison to the i.i.d. Gaussian construction. Our findings reveal both positive and negative results, showing that the performance nearly matches the Gaussian construction at moderate-to-high noise levels, while being worse by an arbitrarily large factor at low noise levels.
ER -
APA
Scarlett, J. & Cevher, V. (2016). Limits on Sparse Support Recovery via Linear Sketching with Random Expander Matrices. Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 51:149-158. Available from https://proceedings.mlr.press/v51/scarlett16.html.