The Degrees of Freedom of Partly Smooth Regularizers

Date
2016
Publisher city
Paris
Link to item file
https://arxiv.org/abs/1404.5557v4
Dewey
Probability and applied mathematics
Subject
Total variation; Degrees of freedom; Sparsity; Partial smoothness; Model selection; Group Lasso; Semi-algebraic sets; o-minimal structures
Journal issue
Annals of the Institute of Statistical Mathematics
Volume
69
Number
4
Publication date
2016
Article pages
791-832
DOI
http://dx.doi.org/10.1007/s10463-016-0563-z
URI
https://basepub.dauphine.fr/handle/123456789/13154
Collections
  • CEREMADE : Publications
Author
Vaiter, Samuel
Deledalle, Charles-Alban
Fadili, Jalal
Centre de Recherches en Mathématiques de la Décision [CEREMADE]
Peyré, Gabriel
Centre de Recherches en Mathématiques de la Décision [CEREMADE]
Dossal, Charles
Type
Article accepted for publication or published
Abstract (EN)
We study regularized regression problems in which the regularizer is a proper, lower-semicontinuous, convex function that is partly smooth relative to a Riemannian submanifold. This encompasses several popular examples, including the Lasso, the group Lasso, the max and nuclear norms, as well as their compositions with linear operators (e.g., total variation or the fused Lasso). Our main sensitivity analysis result shows that the predictor moves locally stably along the same active submanifold as the observations undergo small perturbations. This plays a pivotal role in obtaining a closed-form expression for the divergence of the predictor with respect to the observations. We also show that, for many regularizers, including polyhedral ones and the analysis group Lasso, this divergence formula holds Lebesgue almost everywhere. When the perturbation is random (with an appropriate continuous distribution), this allows us to derive an unbiased estimator of the degrees of freedom and of the prediction risk. Our results unify and go beyond those already known in the literature.
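To make the divergence formula concrete, here is a minimal sketch of its best-known special case (not the paper's general result): the Lasso with an orthogonal design, where the predictor reduces to componentwise soft-thresholding and its divergence with respect to the observations equals the size of the active set — the classical unbiased degrees-of-freedom estimator. All function names below are illustrative, and the finite-difference check is only a numerical sanity test.

```python
import numpy as np

def soft_threshold(y, lam):
    """Lasso predictor when X is the identity: componentwise soft-thresholding."""
    return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

def dof_estimate(y, lam):
    """Degrees of freedom = number of active (nonzero) predictor coordinates."""
    return int(np.sum(np.abs(y) > lam))

def divergence_fd(y, lam, eps=1e-6):
    """Central finite-difference divergence of the predictor w.r.t. y."""
    div = 0.0
    for i in range(len(y)):
        e = np.zeros_like(y)
        e[i] = eps
        div += (soft_threshold(y + e, lam)[i]
                - soft_threshold(y - e, lam)[i]) / (2.0 * eps)
    return div

rng = np.random.default_rng(0)
y = rng.normal(size=20)
lam = 0.5
# Away from the (Lebesgue-null) set where some |y_i| = lam, the
# numerical divergence matches the active-set count exactly.
print(dof_estimate(y, lam), divergence_fd(y, lam))
```

The paper's contribution is that this identity — divergence of the predictor equals a computable quantity tied to the active manifold — extends far beyond this toy case, to any partly smooth regularizer, almost everywhere.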
