
Gradient Descent with Sparsification: An Iterative Algorithm for Sparse Recovery with Restricted Isometry Property

Aug 26, 2009 · 5199 views

In this paper, we present an algorithm for finding an s-sparse vector x that minimizes the squared error ‖y − Φx‖² where Φ satisfies the restricted isometry property (RIP). Our algorithm, called GraDeS (Gradient Descent with Sparsification), starts from an arbitrary s-sparse x and iteratively updates it as x ← H_s(x + (1/γ) · Φᵀ(y − Φx)), where γ > 1 is a constant and H_s sets all but the s largest coordinates in absolute value to zero. We show that GraDeS, in a constant number of iterations, computes the correct s-sparse solution to the system y = Φx where Φ satisfies the condition that the isometry constant δ_2s < 1/3. This is the most general condition under which a near-linear-time algorithm is known. In comparison, the best condition under which any polynomial-time algorithm is known is δ_2s < √2 − 1. An important contribution of the paper is to analyze how the hard-thresholding function H_s acts with respect to the potential ‖y − Φx‖². A special case of GraDeS, corresponding to γ = 1, called Iterative Hard Thresholding (IHT), was previously shown to converge when δ_3s < 1/√32. Our Matlab implementation of GraDeS outperforms previously proposed algorithms like Subspace Pursuit, StOMP, OMP, and Lasso by an order of magnitude. Curiously, our experiments also uncovered several cases where L1-regularized regression (Lasso) fails but GraDeS finds the correct solution.
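The update rule above is simple enough to sketch directly. The following is a minimal NumPy illustration (not the authors' Matlab implementation): `hard_threshold` realizes H_s by keeping the s largest-magnitude entries, and `grades` iterates x ← H_s(x + (1/γ)·Φᵀ(y − Φx)). The choice γ = 1.5 and the fixed iteration count are illustrative assumptions, not values prescribed by the paper.

```python
import numpy as np

def hard_threshold(x, s):
    """H_s: zero out all but the s largest coordinates in absolute value."""
    out = np.zeros_like(x)
    keep = np.argsort(np.abs(x))[-s:]  # indices of the s largest |x_i|
    out[keep] = x[keep]
    return out

def grades(y, Phi, s, gamma=1.5, iters=300):
    """GraDeS: x <- H_s(x + (1/gamma) * Phi^T (y - Phi x)).

    gamma > 1 as in the paper; gamma = 1 recovers plain IHT.
    """
    x = np.zeros(Phi.shape[1])
    for _ in range(iters):
        x = hard_threshold(x + (1.0 / gamma) * (Phi.T @ (y - Phi @ x)), s)
    return x
```

With a well-conditioned sensing matrix (e.g. i.i.d. Gaussian entries scaled by 1/√m, which satisfies RIP with high probability for suitable dimensions), each iteration contracts the error geometrically, so a modest fixed iteration budget suffices in practice.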

Except where otherwise noted, content on this site is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 4.0 International license.