
An Improved Gap-Dependency Analysis of the Noisy Power Method

Maria-Florina Balcan, Simon Shaolei Du, Yining Wang, Adams Wei Yu
29th Annual Conference on Learning Theory, pp. 284–309, 2016

Abstract

We consider the noisy power method algorithm, which has wide applications in machine learning and statistics, especially those related to principal component analysis (PCA) under resource (communication, memory, or privacy) constraints. Existing analyses of the noisy power method show an unsatisfactory dependency on the "consecutive" spectral gap \((\sigma_k-\sigma_{k+1})\) of the input data matrix, which can be very small and hence limits the algorithm's applicability. In this paper, we present a new analysis of the noisy power method that achieves improved gap dependency for both sample complexity and noise tolerance bounds. More specifically, we improve the dependency on \((\sigma_k-\sigma_{k+1})\) to a dependency on \((\sigma_k-\sigma_{q+1})\), where \(q\) is an intermediate algorithm parameter that can be much larger than the target rank \(k\). Our proofs are built upon a novel characterization of proximity between two subspaces that differs from the canonical-angle characterizations analyzed in previous works. Finally, we apply our improved bounds to distributed private PCA and memory-efficient streaming PCA, and obtain bounds superior to existing results in the literature.
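For context, below is a minimal NumPy sketch of the noisy power method in the style analyzed by this line of work (iterate a perturbed matrix product, then re-orthonormalize). The function name, signature, and Gaussian noise model are illustrative assumptions, not the paper's exact setup; the paper's bounds govern how large the per-step noise may be relative to the gap \((\sigma_k-\sigma_{q+1})\).

```python
import numpy as np

def noisy_power_method(A, q, iters, noise_scale=0.0, seed=None):
    """Illustrative sketch of the noisy power method for a symmetric matrix A.

    Each iteration computes A @ X perturbed by a noise matrix G (modeling,
    e.g., privacy or communication noise) and re-orthonormalizes via QR.
    q is the intermediate parameter from the paper: using q >= k columns
    is what enables the improved (sigma_k - sigma_{q+1}) gap dependency.
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    # Random orthonormal initialization with q columns.
    X, _ = np.linalg.qr(rng.standard_normal((n, q)))
    for _ in range(iters):
        G = noise_scale * rng.standard_normal((n, q))  # per-step noise (assumed Gaussian)
        Y = A @ X + G
        X, _ = np.linalg.qr(Y)  # re-orthonormalize the iterate
    return X  # orthonormal basis approximating the top-q eigenspace
```

To recover a rank-\(k\) PCA approximation, one would keep the top \(k\) directions extracted from the returned \(q\)-dimensional basis (e.g., via a Rayleigh-Ritz step on \(X^\top A X\)).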
