View Item 
  • BIRD Home
  • LAMSADE (UMR CNRS 7243)
  • LAMSADE : Publications
  • View Item


Geodesically-convex optimization for averaging partially observed covariance matrices

Yger, Florian; Chevallier, S.; Barthélemy, Q.; Suvrit, S. (2020), Geodesically-convex optimization for averaging partially observed covariance matrices, Proceedings of the 12th Asian Conference on Machine Learning, PMLR 129

View/Open: yger20a.pdf (664.6 KB)
Type: Communication / Conference
Date: 2020
Conference title: Proceedings of the Asian Conference on Machine Learning (ACML)
Conference date: 2020-11
Conference city: Bangkok
Conference country: Thailand
Book title: Proceedings of the 12th Asian Conference on Machine Learning, PMLR 129
Author(s)
Yger, Florian
  Laboratoire d'analyse et modélisation de systèmes pour l'aide à la décision [LAMSADE]
Chevallier, S.
Barthélemy, Q.
Suvrit, S.
Abstract (EN)
Symmetric positive definite (SPD) matrices permeate numerous scientific disciplines, including machine learning, optimization, and signal processing. Equipped with a Riemannian geometry, the space of SPD matrices benefits from compelling properties, and its derived Riemannian mean is now the gold standard in some applications, e.g. brain-computer interfaces (BCI). This paper addresses the problem of averaging covariance matrices with missing variables. This situation often occurs with inexpensive or unreliable sensors, or when artifact-suppression techniques remove corrupted sensors, leading to rank-deficient matrices and hindering the use of Riemannian geometry in covariance-based approaches. An alternative but questionable method consists of removing the matrices with missing variables, thus reducing the training set size. We address these limitations and propose a new formulation grounded in geodesic convexity. Our approach is evaluated on generated datasets with a controlled number of missing variables and a known baseline, demonstrating the robustness of the proposed estimator. Its practical interest is assessed on real BCI datasets. Our results show that the proposed average is more robust and better suited for classification than classical data imputation methods.
Subjects / Keywords
SPD matrices; average; missing data; data imputation
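The abstract refers to the Riemannian mean of SPD matrices as the gold standard that the paper extends to partially observed matrices. As background only, here is a minimal sketch of the classical Karcher (Fréchet) mean under the affine-invariant metric for fully observed SPD matrices, computed by the standard tangent-space fixed-point iteration; this is not the paper's geodesically-convex estimator, and all function names are illustrative:

```python
import numpy as np

def _sym_funcm(A, f):
    """Apply a scalar function to a symmetric matrix via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return (V * f(w)) @ V.T

def riemannian_mean(mats, n_iter=100, tol=1e-10):
    """Karcher mean of SPD matrices under the affine-invariant metric."""
    M = np.mean(mats, axis=0)  # initialize at the arithmetic mean
    for _ in range(n_iter):
        M_sqrt = _sym_funcm(M, np.sqrt)
        M_isqrt = _sym_funcm(M, lambda w: 1.0 / np.sqrt(w))
        # average the log-maps of each matrix in the tangent space at M
        T = np.zeros_like(M)
        for C in mats:
            S = M_isqrt @ C @ M_isqrt
            S = (S + S.T) / 2.0  # symmetrize against round-off
            T += _sym_funcm(S, np.log)
        T /= len(mats)
        # map the tangent-space average back to the manifold
        M = M_sqrt @ _sym_funcm(T, np.exp) @ M_sqrt
        if np.linalg.norm(T) < tol:  # converged: tangent mean vanishes
            break
    return M
```

For two commuting matrices this reduces to the elementwise geometric mean, e.g. averaging diag(1, 2) and diag(4, 8) yields diag(2, 4); the missing-variable setting studied in the paper breaks this recipe because rank-deficient matrices have no logarithm.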

Related items

Showing items related by title and author.

  • Ensemble learning based on functional connectivity and Riemannian geometry for robust workload estimation
    Corsi, Marie-Constance; Chevallier, Sylvain; Barthélemy, Quentin; Hoxha, Isabelle; Yger, Florian. Communication / Conference
  • Riemannian Classification for SSVEP-Based BCI: Offline versus Online Implementations
    Chevallier, Sylvain; Kalunga, Emmanuel; Barthélemy, Quentin; Yger, Florian (2018). Book chapter
  • Geometry-aware principal component analysis for symmetric positive definite matrices
    Horev, Inbal; Yger, Florian; Sugiyama, Masashi (2016). Article accepted for publication or published
  • Riemannian Geometry on Connectivity for Clinical BCI
    Corsi, Marie-Constance; Yger, Florian; Chevallier, Sylvain; Noûs, Camille (2021). Communication / Conference
  • Bayesian Inference for Partially Observed Branching Processes
    Donnet, Sophie; Rousseau, Judith (2016). Article accepted for publication or published
Dauphine PSL Bibliothèque
Place du Maréchal de Lattre de Tassigny 75775 Paris Cedex 16
Phone: 01 44 05 40 94
Contact