Model Selection with Low Complexity Priors

Date
2015
Link to a document not archived in this repository
https://arxiv.org/abs/1307.2342v2
Subject classification
Probability and applied mathematics
Subject
Sparsity; Partial smoothness; Inverse problems; Compressed sensing; Convex regularization; Model selection; Total variation
Journal name
Information and Inference
Volume
4
Issue
3
Publication date
2015
Pages
230-287
DOI
http://dx.doi.org/10.1093/imaiai/iav005
URI
https://basepub.dauphine.fr/handle/123456789/13750
Collections
  • CEREMADE : Publications
Author(s)
Vaiter, Samuel
Golbabaee, Mohammad
Fadili, Jalal
CEntre de REcherches en MAthématiques de la DEcision [CEREMADE]
Peyré, Gabriel
CEntre de REcherches en MAthématiques de la DEcision [CEREMADE]
Type
Article accepted for publication or published
Abstract (English)
Regularization plays a pivotal role when facing the challenge of solving ill-posed inverse problems, where the number of observations is smaller than the ambient dimension of the object to be estimated. A line of recent work has studied regularization models with various types of low-dimensional structures. In such settings, the general approach is to solve a regularized optimization problem, which combines a data fidelity term and some regularization penalty that promotes the assumed low-dimensional/simple structure. This paper provides a general framework to capture this low-dimensional structure through what we coin partly smooth functions relative to a linear manifold. These are convex, non-negative, closed and finite-valued functions that promote objects living on low-dimensional subspaces. This class of regularizers encompasses many popular examples such as the L1 norm, the L1-L2 norm (group sparsity), as well as several others including the Linfty norm. We also show that the set of partly smooth functions relative to a linear manifold is closed under addition and pre-composition by a linear operator, which allows us to cover mixed regularization and the so-called analysis-type priors (e.g. total variation, fused Lasso, finite-valued polyhedral gauges). Our main result presents a unified sharp analysis of exact and robust recovery of the low-dimensional subspace model associated with the object to be recovered from partial measurements. This analysis is illustrated on a number of special and previously studied cases, as well as on the performance of Linfty regularization in a compressed sensing scenario.
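To make the general approach described in the abstract concrete, the sketch below instantiates the regularized problem min_x (1/2)||y - Phi x||^2 + lambda * J(x) with J taken to be the L1 norm, one member of the class of partly smooth regularizers mentioned above, and solves it by proximal gradient descent (ISTA). This is an illustrative example only, not code from the paper; the function names, step size, regularization level and toy dimensions are assumptions chosen for clarity.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista_lasso(Phi, y, lam, n_iter=500):
    """Proximal gradient (ISTA) for min_x 0.5*||y - Phi x||^2 + lam*||x||_1.

    The L1 norm here stands in for one instance of the partly smooth
    regularizers discussed in the paper; other penalties would require
    their own proximal operators.
    """
    L = np.linalg.norm(Phi, 2) ** 2          # Lipschitz constant of the data-fidelity gradient
    x = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        grad = Phi.T @ (Phi @ x - y)         # gradient of 0.5*||y - Phi x||^2
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Toy compressed-sensing setup: recover a sparse vector from few random measurements.
rng = np.random.default_rng(0)
n, p, k = 40, 100, 5                          # measurements, ambient dimension, sparsity
Phi = rng.standard_normal((n, p)) / np.sqrt(n)
x0 = np.zeros(p)
x0[rng.choice(p, k, replace=False)] = rng.standard_normal(k)
y = Phi @ x0
x_hat = ista_lasso(Phi, y, lam=0.01)

print("nonzeros in x0:", k, "| nonzeros in x_hat:", int(np.sum(np.abs(x_hat) > 1e-3)))
print("relative error:", np.linalg.norm(x_hat - x0) / np.linalg.norm(x0))
```

In this toy setting the recovered support and error give a rough sense of the "exact and robust recovery of the low-dimensional subspace model" that the paper analyzes in much greater generality.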


This work is made available under a Creative Commons license.