Conditions for posterior contraction in the sparse normal means problem

van der Pas, S.L.; Salomond, Jean-Bernard; Schmidt-Hieber, Johannes (2016), Conditions for posterior contraction in the sparse normal means problem, Electronic Journal of Statistics, 10, 1, p. 976-1000. DOI: 10.1214/16-EJS1130

Type
Article accepted for publication or published
Link to a document not held in this repository
https://arxiv.org/abs/1510.02232v2
Date
2016
Journal name
Electronic Journal of Statistics
Volume
10
Issue
1
Publisher
Institute of Mathematical Statistics
Pages
976-1000
Publication identifier
10.1214/16-EJS1130
Author(s)
van der Pas, S.L.
Salomond, Jean-Bernard
Schmidt-Hieber, Johannes
Abstract (EN)
The first Bayesian results for the sparse normal means problem were proven for spike-and-slab priors. However, these priors are less convenient from a computational point of view. In the meantime, a large number of continuous shrinkage priors have been proposed. Many of these shrinkage priors can be written as a scale mixture of normals, which makes them particularly easy to implement. We propose general conditions on the prior on the local variance in scale mixtures of normals under which posterior contraction at the minimax rate is assured. The conditions require tails at least as heavy as Laplace, but not too heavy, and a large amount of mass around zero relative to the tails, more so as the sparsity increases. These conditions give some general guidelines for choosing a shrinkage prior for estimation under a nearly black sparsity assumption. We verify these conditions for the class of priors considered in [12], which includes the horseshoe and the normal-exponential gamma priors, and for the horseshoe+, the inverse-Gaussian prior, the normal-gamma prior, and the spike-and-slab Lasso, thus extending the number of shrinkage priors known to lead to posterior contraction at the minimax estimation rate.
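For orientation, a brief sketch of the standard setup behind the abstract (the notation p_n for the number of nonzero means and \pi for the mixing density is illustrative, not taken from this record): one observes

    \[
      X_i = \theta_i + \varepsilon_i, \qquad \varepsilon_i \stackrel{\text{iid}}{\sim} \mathcal{N}(0,1), \qquad i = 1, \dots, n,
      \qquad \#\{i : \theta_i \neq 0\} \le p_n = o(n),
    \]
    \[
      \theta_i \mid \sigma_i^2 \sim \mathcal{N}(0, \sigma_i^2), \qquad \sigma_i^2 \sim \pi,
    \]

and conditions on \pi of the kind described in the abstract ensure posterior contraction around the nearly black vector \theta at the minimax squared-\ell_2 rate, which is of the order p_n \log(n/p_n).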
Keywords
sparsity; nearly black vectors; normal means problem; horseshoe; horseshoe+; Bayesian inference; frequentist Bayes; posterior contraction; shrinkage priors
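The abstract notes that scale mixtures of normals are particularly easy to implement. As a minimal sketch (assuming NumPy; the function name sample_horseshoe and the global scale tau are illustrative choices, not taken from the paper), the horseshoe prior can be sampled by drawing a half-Cauchy local scale and then a centred normal with that scale:

    # Minimal sketch: the horseshoe prior as a scale mixture of normals,
    # theta_i | lambda_i ~ N(0, tau^2 * lambda_i^2), lambda_i ~ half-Cauchy(0, 1).
    import numpy as np

    def sample_horseshoe(n, tau=1.0, rng=None):
        rng = np.random.default_rng() if rng is None else rng
        lam = np.abs(rng.standard_cauchy(n))   # local scales, half-Cauchy(0, 1)
        return rng.normal(0.0, tau * lam)      # normal draws with random variance

    theta = sample_horseshoe(10_000, tau=0.1)
    print((np.abs(theta) < 0.01).mean())       # large fraction of near-zero draws

With a small tau most draws sit very close to zero while the Cauchy tails still allow occasional large values, which is the qualitative behaviour (heavy tails plus substantial mass near zero) that the conditions in the abstract formalise.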

Related publications

Showing items related by title and author.

  • Concentration rate and consistency of the posterior distribution for selected priors under monotonicity constraints
    Salomond, Jean-Bernard (2014) Article accepted for publication or published
  • On some aspects of the asymptotic properties of Bayesian approaches in nonparametric and semiparametric models
    Scricciolo, Catia; Salomond, Jean-Bernard; Rousseau, Judith (2014-01-30) Conference communication
  • Adaptive Bayes Test for Monotonicity
    Salomond, Jean-Bernard (2014) Conference communication
  • Sequential Quasi Monte Carlo for Dirichlet Process Mixture Models
    Arbel, Julyan; Salomond, Jean-Bernard (2016) Conference communication
  • Propriétés fréquentistes des méthodes Bayésiennes semi-paramétriques et non paramétriques
    Salomond, Jean-Bernard (2014-09) Thesis