dc.contributor.author: Dossal, Charles
dc.contributor.author: Fadili, Jalal
dc.contributor.author: Peyré, Gabriel
dc.contributor.author: Chabanol, Marie-Line
dc.date.accessioned: 2011-01-10T11:10:47Z
dc.date.available: 2011-01-10T11:10:47Z
dc.date.issued: 2012
dc.identifier.uri: https://basepub.dauphine.fr/handle/123456789/5410
dc.language.iso: en
dc.subject: consistency
dc.subject: sparsistency
dc.subject: L1 minimization
dc.subject: compressed sensing
dc.subject.ddc: 519
dc.title: Sharp Support Recovery from Noisy Random Measurements by L1 minimization
dc.type: Article accepted for publication or published
dc.contributor.editoruniversityother: Groupe de Recherche en Informatique, Image, Automatique et Instrumentation de Caen (GREYC); France
dc.contributor.editoruniversityother: Institut de Mathématiques de Bordeaux (IMB); France
dc.description.abstracten: In this paper, we investigate the theoretical guarantees of penalized $\ell_1$ minimization (also called Basis Pursuit Denoising or the Lasso) in terms of sparsity pattern recovery (support and sign consistency) from noisy measurements with not-necessarily-random noise, when the sensing operator belongs to the Gaussian ensemble (i.e., a random design matrix with i.i.d. Gaussian entries). More precisely, we derive sharp non-asymptotic bounds on the sparsity level and the (minimal) signal-to-noise ratio that ensure support identification, for most signals and most Gaussian sensing matrices, by solving the Lasso problem with an appropriately chosen regularization parameter. Our first purpose is to establish conditions allowing exact sparsity pattern recovery when the signal is strictly sparse. These conditions are then extended to cover the compressible (nearly sparse) case. In both results, the minimal signal-to-noise ratio plays a crucial role. Our third main result removes this assumption in the strictly sparse case, but the Lasso then recovers only part of the support. In this case we also provide a sharp $\ell_2$-consistency result on the coefficient vector. The results of the present work have several distinctive features compared to previous ones. One of them is that the leading constants involved in all the bounds are sharp and explicit. This is illustrated by numerical experiments showing that the sharp sparsity-level threshold identified by our theory, below which sparsistency of the Lasso is guaranteed, matches the empirically observed one.
dc.relation.isversionofjnlname: Applied and Computational Harmonic Analysis
dc.relation.isversionofjnlvol: 33
dc.relation.isversionofjnlissue: 1
dc.relation.isversionofjnldate: 2012
dc.relation.isversionofjnlpages: 24-43
dc.relation.isversionofdoi: http://dx.doi.org/10.1016/j.acha.2011.09.003
dc.identifier.urlsite: http://hal.archives-ouvertes.fr/hal-00553670/fr/
dc.description.sponsorshipprivate: yes
dc.relation.isversionofjnlpublisher: Elsevier
dc.subject.ddclabel: Probability and applied mathematics
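The support-recovery setting described in the abstract can be sketched numerically. The following is a minimal illustration, not the paper's own experiments: the dimensions, noise level, regularization choice, and signal magnitudes are arbitrary assumptions, and a plain ISTA (proximal gradient) loop stands in for a production Lasso solver.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, k = 200, 400, 5          # measurements, ambient dimension, sparsity (illustrative)
sigma = 0.01                   # noise level (illustrative)

# Gaussian ensemble: i.i.d. N(0, 1/n) entries, so columns have unit norm in expectation
Phi = rng.standard_normal((n, p)) / np.sqrt(n)

# strictly k-sparse signal with a strong signal-to-noise ratio
x0 = np.zeros(p)
support = np.sort(rng.choice(p, size=k, replace=False))
x0[support] = 10.0 * rng.choice([-1.0, 1.0], size=k)
y = Phi @ x0 + sigma * rng.standard_normal(n)

# penalized L1 minimization: min_x 0.5*||y - Phi x||_2^2 + lam*||x||_1,
# solved by ISTA (gradient step on the quadratic term, then soft-thresholding)
lam = 2 * sigma * np.sqrt(2 * np.log(p))   # order-of-magnitude heuristic for lam
step = 1.0 / np.linalg.norm(Phi, 2) ** 2   # 1 / Lipschitz constant of the gradient
x = np.zeros(p)
for _ in range(3000):
    g = x - step * (Phi.T @ (Phi @ x - y))
    x = np.sign(g) * np.maximum(np.abs(g) - step * lam, 0.0)

recovered = np.flatnonzero(np.abs(x) > 0.5)
print(np.array_equal(recovered, support))                       # support recovery
print(np.array_equal(np.sign(x[support]), np.sign(x0[support])))  # sign consistency
```

With the sparsity level well below threshold and a high SNR, both checks succeed; shrinking the SNR or growing `k` toward the threshold makes recovery fail, which is the regime the paper's bounds quantify.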