IMI Interdisciplinary Mathematics Institute | College of Arts and Sciences
Preprint Series by Year

## Stable recovery of sparse overcomplete representations in the presence of noise

A 2004 Preprint by D. Donoho, M. Elad, and V. Temlyakov

• 2004:06
Overcomplete representations are attracting interest in signal processing theory, particularly due to their potential to generate sparse representations of signals. However, in general, the problem of finding sparse representations must be unstable in the presence of noise. We prove the possibility of stable recovery under a combination of sufficient sparsity and favorable structure of the overcomplete system.
We consider the situation where an ideal underlying signal indeed has a sparse representation but we can observe only a noisy version. We assume that the overcomplete system has a property of mutual incoherence, and that the ideal underlying signal has a sufficiently sparse representation, according to a sparsity threshold defined using the mutual coherence of the overcomplete system. Under these conditions, we show that the optimally sparse approximation to the noisy data, to within the noise level, differs from the optimally sparse decomposition of the ideal noiseless signal by at most a constant multiple of the noise level.
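To make the quantities above concrete (this sketch is illustrative, not part of the preprint): the mutual coherence of a dictionary is the largest absolute inner product between distinct unit-normalized columns, and sparsity thresholds of the form $(1 + 1/\mu)/2$ govern uniqueness of the sparsest representation. A minimal NumPy sketch, using an identity-plus-Hadamard dictionary as an assumed example of a mutually incoherent system:

```python
import numpy as np

def mutual_coherence(D):
    """Largest |inner product| between distinct unit-norm columns of D."""
    Dn = D / np.linalg.norm(D, axis=0)
    G = np.abs(Dn.T @ Dn)
    np.fill_diagonal(G, 0.0)   # ignore each column's product with itself
    return G.max()

# Illustrative overcomplete system in R^8: the identity basis joined
# with a normalized Hadamard basis (a classic incoherent pair).
n = 8
H = np.array([[1.0]])
for _ in range(3):             # build the 8x8 Hadamard matrix by Kronecker products
    H = np.kron(H, np.array([[1.0, 1.0], [1.0, -1.0]]))
D = np.hstack([np.eye(n), H / np.sqrt(n)])

mu = mutual_coherence(D)               # for this pair, mu = 1/sqrt(8)
threshold = 0.5 * (1 + 1 / mu)         # uniqueness-type sparsity threshold
```

Any representation with fewer than `threshold` nonzeros is then the unique sparsest one for this dictionary; the preprint's stability results use thresholds of the same $1/\mu$ character.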
In general, this optimal-sparsity method requires heavy computational effort (e.g. brute-force combinatorial optimization). However, we show that stable reconstruction is also available by solving a convex quadratic programming problem. In this approach, the sparsity objective is replaced by the $l^1$ objective; one finds the approximate representation whose coefficients have the minimal $l^1$ norm while fitting the data to within the noise level. This explains certain successes of Basis Pursuit in signal processing and Lasso in statistical modeling. We also consider greedy stepwise least-squares and show that, when stopped at the point where the size of the residual equals the noise level, this has at least a local stability property. This explains certain successes of Matching Pursuit in signal processing and Stepwise Regression in statistical modeling.
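The greedy stepwise least-squares procedure with residual-based stopping can be sketched as follows (a hypothetical illustration in the style of Matching Pursuit, not the preprint's own code): at each step select the column most correlated with the current residual, refit by least squares on all selected columns, and stop once the residual norm reaches the noise level.

```python
import numpy as np

def greedy_pursuit(D, y, eps):
    """Greedy stepwise least squares: stop when the residual norm <= eps."""
    n, m = D.shape
    support, coef = [], np.zeros(0)
    r = y.copy()
    while np.linalg.norm(r) > eps and len(support) < n:
        j = int(np.argmax(np.abs(D.T @ r)))       # column most correlated with residual
        if j in support:
            break
        support.append(j)
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        r = y - D[:, support] @ coef              # residual after refitting
    x = np.zeros(m)
    x[support] = coef
    return x

# Noiseless sanity check on an identity-plus-Hadamard dictionary in R^8.
n = 8
H = np.array([[1.0]])
for _ in range(3):
    H = np.kron(H, np.array([[1.0, 1.0], [1.0, -1.0]]))
D = np.hstack([np.eye(n), H / np.sqrt(n)])

x0 = np.zeros(2 * n)
x0[1], x0[5] = 1.0, -0.7          # a 2-sparse ideal signal
x_hat = greedy_pursuit(D, D @ x0, eps=1e-10)
```

With zero noise and a tiny tolerance, the pursuit recovers the sparse representation exactly; the local stability property concerns how this degrades when `y` is a noisy observation and `eps` is set to the noise level.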
These methods can also be applied with an exaggerated noise tolerance. When done properly, the resulting sparse approximation of the noisy data will actually contain only 'correct' nonzeros, i.e. only terms also appearing in the unique sparsest representation of the ideal noiseless sparse signal.
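As a hypothetical numerical illustration of this claim (an assumed example, not from the preprint): a greedy stepwise least-squares pursuit, run with a tolerance deliberately far above the actual noise level, stops early and selects fewer atoms, but every atom it selects belongs to the support of the ideal sparse signal.

```python
import numpy as np

def greedy_pursuit(D, y, eps):
    """Greedy stepwise least squares, stopping at residual norm <= eps."""
    support, coef, r = [], np.zeros(0), y.copy()
    while np.linalg.norm(r) > eps and len(support) < D.shape[0]:
        j = int(np.argmax(np.abs(D.T @ r)))
        if j in support:
            break
        support.append(j)
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        r = y - D[:, support] @ coef
    return support, coef

# Identity-plus-Hadamard dictionary in R^8, as an assumed example.
n = 8
H = np.array([[1.0]])
for _ in range(3):
    H = np.kron(H, np.array([[1.0, 1.0], [1.0, -1.0]]))
D = np.hstack([np.eye(n), H / np.sqrt(n)])

x0 = np.zeros(2 * n)
x0[1], x0[5] = 1.0, -0.7                # true support is {1, 5}
noise = 0.01 * np.cos(np.arange(n))     # deterministic noise, norm ~ 0.02
y = D @ x0 + noise

# Exaggerated tolerance: well above the actual noise level.
support, _ = greedy_pursuit(D, y, eps=0.8)
# The pursuit stops early, and every selected atom is a 'correct' nonzero.
```

Here the pursuit halts after capturing only the dominant atom, yet commits no false selections, matching the behavior the abstract describes.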

© Interdisciplinary Mathematics Institute | The University of South Carolina Board of Trustees