Minimax rates for memory-bounded sparse linear regression

Jacob Steinhardt, John Duchi
Proceedings of The 28th Conference on Learning Theory, PMLR 40:1564-1587, 2015.

Abstract

We establish a minimax lower bound of Ω(kd/(Bε)) on the sample size needed to estimate parameters in a k-sparse linear regression of dimension d under memory restrictions to B bits, where ε is the \ell_2 parameter error. When the covariance of the regressors is the identity matrix, we also provide an algorithm that uses \tilde{O}(B+k) bits and requires \tilde{O}(kd/(Bε^2)) observations to achieve error ε. Our lower bound also holds in the more general communication-bounded setting, where instead of a memory bound, at most B bits of information are allowed to be (adaptively) communicated about each sample.
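As a quick sanity check on how these rates scale, the display below plugs illustrative values into the stated bounds on the sample size n; the numbers d = 10^4, k = 10, B = 10^3 bits, and ε = 0.1 are our own example and are not taken from the paper.

\[
  n \;\gtrsim\; \frac{kd}{B\varepsilon}
    \;=\; \frac{10 \cdot 10^{4}}{10^{3} \cdot 10^{-1}}
    \;=\; 10^{3},
  \qquad
  n \;\lesssim\; \tilde{O}\!\left(\frac{kd}{B\varepsilon^{2}}\right)
    \;=\; \tilde{O}\!\left(\frac{10 \cdot 10^{4}}{10^{3} \cdot 10^{-2}}\right)
    \;=\; \tilde{O}\!\left(10^{4}\right).
\]

In both bounds the memory (or communication) budget B enters the sample complexity multiplicatively, so at a fixed error ε, halving B roughly doubles the number of observations required.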

Cite this Paper


BibTeX
@InProceedings{pmlr-v40-Steinhardt15,
  title     = {Minimax rates for memory-bounded sparse linear regression},
  author    = {Steinhardt, Jacob and Duchi, John},
  booktitle = {Proceedings of The 28th Conference on Learning Theory},
  pages     = {1564--1587},
  year      = {2015},
  editor    = {Grünwald, Peter and Hazan, Elad and Kale, Satyen},
  volume    = {40},
  series    = {Proceedings of Machine Learning Research},
  address   = {Paris, France},
  month     = {03--06 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v40/Steinhardt15.pdf},
  url       = {https://proceedings.mlr.press/v40/Steinhardt15.html},
  abstract  = {We establish a minimax lower bound of Ω(kd/(Bε)) on the sample size needed to estimate parameters in a k-sparse linear regression of dimension d under memory restrictions to B bits, where ε is the \ell_2 parameter error. When the covariance of the regressors is the identity matrix, we also provide an algorithm that uses \tilde{O}(B+k) bits and requires \tilde{O}(kd/(Bε^2)) observations to achieve error ε. Our lower bound also holds in the more general communication-bounded setting, where instead of a memory bound, at most B bits of information are allowed to be (adaptively) communicated about each sample.}
}
APA
Steinhardt, J. & Duchi, J. (2015). Minimax rates for memory-bounded sparse linear regression. Proceedings of The 28th Conference on Learning Theory, in Proceedings of Machine Learning Research 40:1564-1587. Available from https://proceedings.mlr.press/v40/Steinhardt15.html.
