Sample Complexity of Sinkhorn Divergences

File
1810.02733.pdf (407.4 KB)
Date
2019
Dewey
Analysis
Subject
Sinkhorn divergences
Conference name
AISTATS'19 - 22nd International Conference on Artificial Intelligence and Statistics
Conference date
04-2019
Conference city
Okinawa
Conference country
Japan
URI
https://basepub.dauphine.fr/handle/123456789/20859
Collections
  • CEREMADE : Publications
Author
Genevay, Aude
CEntre de REcherches en MAthématiques de la DEcision [CEREMADE]
Chizat, Lenaic
CEntre de REcherches en MAthématiques de la DEcision [CEREMADE]
Bach, Francis
Département d'informatique de l'École normale supérieure [DI-ENS]
Cuturi, Marco
Graduate School of Informatics [Kyoto]
Peyré, Gabriel
Département de Mathématiques et Applications - ENS Paris [DMA]
Type
Conference communication
Number of pages
11
Abstract (EN)
Optimal transport (OT) and maximum mean discrepancies (MMD) are now routinely used in machine learning to compare probability measures. We focus in this paper on Sinkhorn divergences (SDs), a regularized variant of OT distances which can interpolate, depending on the regularization strength ε, between OT (ε = 0) and MMD (ε = ∞). Although the tradeoff induced by that regularization is now well understood computationally (OT, SDs and MMD require respectively O(n³ log n), O(n²) and n² operations given a sample size n), much less is known in terms of their sample complexity, namely the gap between these quantities when evaluated using finite samples vs. their respective densities. Indeed, while the sample complexities of OT and MMD stand at two extremes, 1/n^(1/d) for OT in dimension d and 1/√n for MMD, that of SDs has only been studied empirically. In this paper, we (i) derive a bound on the approximation error made with SDs when approximating OT as a function of the regularizer ε, (ii) prove that the optimizers of regularized OT are bounded in a Sobolev (RKHS) ball independent of the two measures, and (iii) provide the first sample complexity bound for SDs, obtained by reformulating SDs as a maximization problem in an RKHS. We thus obtain a scaling in 1/√n (as in MMD), with a constant that however depends on ε, making the bridge between OT and MMD complete.
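To illustrate the O(n²)-per-iteration cost mentioned in the abstract, here is a minimal NumPy sketch (not the authors' code) of the debiased Sinkhorn divergence S_ε(α, β) = OT_ε(α, β) − ½ OT_ε(α, α) − ½ OT_ε(β, β), using log-domain Sinkhorn iterations on uniform empirical measures. The squared-Euclidean cost, the function names, and the fixed iteration count are illustrative assumptions.

```python
import numpy as np

def _lse(M, axis):
    # Numerically stable log-sum-exp along one axis.
    mx = M.max(axis=axis, keepdims=True)
    return np.squeeze(mx + np.log(np.exp(M - mx).sum(axis=axis, keepdims=True)), axis=axis)

def sinkhorn_cost(x, y, eps, n_iters=300):
    """Entropic OT cost OT_eps between uniform empirical measures on x and y."""
    n, m = len(x), len(y)
    a, b = np.full(n, 1.0 / n), np.full(m, 1.0 / m)
    # Squared Euclidean ground cost (illustrative choice); an n-by-m matrix,
    # so each Sinkhorn iteration below costs O(n*m) = O(n^2) operations.
    C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    f, g = np.zeros(n), np.zeros(m)
    for _ in range(n_iters):
        # Alternating soft-min updates of the dual potentials, in the log domain.
        f = -eps * _lse((g[None, :] - C) / eps + np.log(b)[None, :], axis=1)
        g = -eps * _lse((f[:, None] - C) / eps + np.log(a)[:, None], axis=0)
    # At convergence the dual objective reduces to <f, a> + <g, b>.
    return f @ a + g @ b

def sinkhorn_divergence(x, y, eps):
    # Debiased divergence: subtract the two self-terms so that S_eps(x, x) = 0.
    return (sinkhorn_cost(x, y, eps)
            - 0.5 * sinkhorn_cost(x, x, eps)
            - 0.5 * sinkhorn_cost(y, y, eps))
```

By construction the divergence vanishes when the two samples coincide, and increasing ε moves the estimate from an OT-like to an MMD-like quantity, mirroring the interpolation described in the abstract.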
