Algorithms that get old: the case of generative deep neural networks
Turinici, Gabriel (2022-09), Algorithms that get old: the case of generative deep neural networks, The 8th International Conference on Machine Learning, Optimization, and Data Science - LOD 2022, Siena, Italy
Type: Communication / Conference
Conference title: The 8th International Conference on Machine Learning, Optimization, and Data Science - LOD 2022
Affiliation: CEntre de REcherches en MAthématiques de la DEcision [CEREMADE]
Abstract (EN): Generative deep neural networks used in machine learning, such as Variational Auto-Encoders (VAEs) and Generative Adversarial Networks (GANs), produce new objects each time they are asked to do so, under the constraint that the new objects remain similar to a list of examples given as input. However, this behavior is unlike that of human artists, who change their style as time goes by and seldom return to the style of their initial creations. We investigate a situation where VAEs are used to sample from a probability measure described by some empirical dataset. Based on recent works on Radon-Sobolev statistical distances, we propose a numerical paradigm, to be used in conjunction with a generative algorithm, that satisfies the following two requirements: the objects created do not repeat, and they evolve to fill the entire target probability distribution.
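The standard VAE sampling behavior that the abstract contrasts with can be sketched as follows. This is a minimal illustrative toy, not the paper's paradigm: the "decoder" is a hypothetical fixed affine map standing in for a trained network, and latents are drawn from the usual N(0, I) prior, so every call produces new objects from the same fixed target distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy "decoder": a fixed affine map from a 2-D latent
# space to data space, standing in for a trained VAE decoder.
W = np.array([[1.0, 0.5], [0.0, 2.0]])
b = np.array([0.3, -0.1])

def decode(z):
    # Map latent codes to data space.
    return z @ W.T + b

def sample(n):
    # Standard VAE sampling: draw latents from the N(0, I) prior, then decode.
    z = rng.standard_normal((n, 2))
    return decode(z)

batch1 = sample(5)
batch2 = sample(5)
# New objects every time (no verbatim repeats across draws)...
assert not np.allclose(batch1, batch2)
# ...yet both batches are drawn from the same unchanging distribution,
# which is precisely the "style never evolves" behavior the paper questions.
```

The paper's proposal goes further: it asks that successive samples avoid repetition and progressively cover the target distribution, which the plain prior-sampling scheme above does not guarantee.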
Subjects / Keywords: variational auto-encoder; generative adversarial network; statistical distance; vector quantization; deep neural network; measure compression
Showing items related by title and author.
Buffa, Annalisa; Maday, Yvon; Patera, Anthony T.; Prud'Homme, Christophe; Turinici, Gabriel (2012) Article accepted for publication or published