
Communication-efficient Sparse Regression

Jason D. Lee, Qiang Liu, Yuekai Sun, Jonathan E. Taylor; 18(5):1−30, 2017.

Abstract

We devise a communication-efficient approach to distributed sparse regression in the high-dimensional setting. The key idea is to average debiased or desparsified lasso estimators. We show that the approach converges at the same rate as the lasso as long as the dataset is not split across too many machines, and that it consistently estimates the support under weaker conditions than the lasso. On the computational side, we propose a new parallel and computationally efficient algorithm to compute the approximate inverse covariance required in the debiasing approach when the dataset is split across samples. We further extend the approach to generalized linear models.
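To make the key idea concrete, the sketch below shows one way to average debiased lasso estimators across machines. It is a minimal illustration, not the paper's algorithm: the function names, the use of scikit-learn's Lasso for the local fits, and the assumption that an approximate inverse covariance `theta_hat` is supplied for each machine are all placeholders (the paper proposes its own parallel algorithm for computing that approximate inverse covariance).

```python
import numpy as np
from sklearn.linear_model import Lasso


def debiased_lasso(X, y, lam, theta_hat):
    """One machine's debiased (desparsified) lasso estimate.

    theta_hat is an approximate inverse covariance of the columns of X;
    here it is taken as given rather than computed as in the paper.
    """
    n = X.shape[0]
    beta_hat = Lasso(alpha=lam, fit_intercept=False).fit(X, y).coef_
    # One-step bias correction: beta + Theta X^T (y - X beta) / n
    return beta_hat + theta_hat @ X.T @ (y - X @ beta_hat) / n


def averaged_debiased_lasso(splits, lam, theta_hats):
    """Average the local debiased estimates across machines.

    splits: list of (X_k, y_k) pairs, one per machine.
    theta_hats: list of approximate inverse covariance matrices, one per machine.
    """
    local = [debiased_lasso(X, y, lam, th)
             for (X, y), th in zip(splits, theta_hats)]
    return np.mean(local, axis=0)
```

The averaged estimate requires only one round of communication (each machine ships its debiased coefficient vector), which is the sense in which the approach is communication-efficient.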

© JMLR 2017.