

Learning Generative Models with Sinkhorn Divergences

Genevay, Aude; Peyré, Gabriel; Cuturi, Marco (2018), Learning Generative Models with Sinkhorn Divergences, in Amos Storkey & Fernando Perez-Cruz (Eds.), Proceedings of Machine Learning Research, Volume 84: International Conference on Artificial Intelligence and Statistics, Proceedings of Machine Learning Research (PMLR), pp. 1608-1617

View/Open
1706.00292.pdf (1.103 MB)
Type: Conference paper
Date: 2018
Conference title: AISTATS
Conference date: 2018-04
Conference city: Lanzarote
Conference country: Spain
Book title: Proceedings of Machine Learning Research, Volume 84: International Conference on Artificial Intelligence and Statistics
Book authors: Amos Storkey, Fernando Perez-Cruz
Publisher: Proceedings of Machine Learning Research (PMLR)
Pages: 1608-1617
Author(s)
Genevay, Aude
CEntre de REcherches en MAthématiques de la DEcision [CEREMADE]
Peyré, Gabriel
CEntre de REcherches en MAthématiques de la DEcision [CEREMADE]
Cuturi, Marco
Centre de Recherche en Économie et Statistique [CREST]
Abstract (EN)
The ability to compare two degenerate probability distributions (i.e. two probability distributions supported on two distinct low-dimensional manifolds living in a much higher-dimensional space) is a crucial problem arising in the estimation of generative models for high-dimensional observations such as those arising in computer vision or natural language. It is known that optimal transport metrics can represent a cure for this problem, since they were specifically designed as an alternative to information divergences to handle such problematic scenarios. Unfortunately, training generative machines using OT raises formidable computational and statistical challenges, because of (i) the computational burden of evaluating OT losses, (ii) the instability and lack of smoothness of these losses, and (iii) the difficulty of robustly estimating these losses and their gradients in high dimension. This paper presents the first tractable computational method to train large-scale generative models using an optimal transport loss, and tackles these three issues by relying on two key ideas: (a) entropic smoothing, which turns the original OT loss into one that can be computed using Sinkhorn fixed-point iterations; (b) algorithmic (automatic) differentiation of these iterations. These two approximations result in a robust and differentiable approximation of the OT loss with streamlined GPU execution. Entropic smoothing generates a family of losses interpolating between Wasserstein (OT) and Maximum Mean Discrepancy (MMD), thus allowing one to find a sweet spot that leverages the geometry of OT and the favorable high-dimensional sample complexity of MMD, which comes with unbiased gradient estimates. The resulting computational architecture nicely complements standard deep-network generative models with a stack of extra layers implementing the loss function.
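The core mechanism the abstract describes, entropic smoothing computed via Sinkhorn fixed-point iterations, can be sketched in a few lines of NumPy. The sketch below is illustrative, not the paper's implementation: the function names, uniform sample weights, and the debiased form SD(x, y) = W(x, y) - (W(x, x) + W(y, y)) / 2 are assumptions following common presentations of Sinkhorn divergences; the paper additionally backpropagates through these same iterations with automatic differentiation, which plain NumPy does not provide.

```python
import numpy as np

def entropic_ot_cost(x, y, eps=0.1, n_iter=100):
    """Entropic OT cost <P, C> between empirical measures on samples x (n,d), y (m,d)."""
    # Pairwise squared-Euclidean ground cost.
    C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    K = np.exp(-C / eps)                    # Gibbs kernel from entropic smoothing
    a = np.full(len(x), 1.0 / len(x))       # uniform weights on x samples
    b = np.full(len(y), 1.0 / len(y))       # uniform weights on y samples
    u, v = np.ones_like(a), np.ones_like(b)
    for _ in range(n_iter):                 # Sinkhorn fixed-point iterations
        u = a / (K @ v)
        v = b / (K.T @ u)
    P = u[:, None] * K * v[None, :]         # approximate transport plan
    return (P * C).sum()

def sinkhorn_divergence(x, y, eps=0.1, n_iter=100):
    """Debiased divergence: zero when x and y are the same point cloud."""
    return (entropic_ot_cost(x, y, eps, n_iter)
            - 0.5 * (entropic_ot_cost(x, x, eps, n_iter)
                     + entropic_ot_cost(y, y, eps, n_iter)))
```

As eps grows, the plan P blurs toward the product measure and the loss behaves like an MMD; as eps shrinks toward zero it approaches the unregularized OT cost, which is the Wasserstein-to-MMD interpolation mentioned above.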

Related items

Showing items related by title and author.

  • Sample Complexity of Sinkhorn divergences
    Genevay, Aude; Chizat, Lenaic; Bach, Francis; Cuturi, Marco; Peyré, Gabriel (2019) Conference paper
  • Stochastic Optimization for Large-scale Optimal Transport
    Genevay, Aude; Cuturi, Marco; Peyré, Gabriel; Bach, Francis (2016) Conference paper
  • Interpolating between Optimal Transport and MMD using Sinkhorn Divergences
    Feydy, Jean; Séjourné, Thibault; Vialard, François-Xavier; Amari, Shun-ichi; Trouvé, Alain; Peyré, Gabriel (2019) Conference paper
  • Wasserstein barycentric coordinates: histogram regression using optimal transport
    Bonneel, Nicolas; Peyré, Gabriel; Cuturi, Marco (2016) Conference paper
  • Convolutional Wasserstein distances: efficient optimal transportation on geometric domains
    Solomon, Justin; De Goes, Fernando; Peyré, Gabriel; Cuturi, Marco; Butscher, Adrian; Nguyen, Andy; Du, Tao; Guibas, Leonidas (2015) Journal article