
Geometry-aware principal component analysis for symmetric positive definite matrices

Horev, Inbal; Yger, Florian; Sugiyama, Masashi (2016), Geometry-aware principal component analysis for symmetric positive definite matrices, Machine Learning, p. 1-30. 10.1007/s10994-016-5605-5

Type
Article accepted for publication or published
Date
2016
Journal name
Machine Learning
Pages
1-30
Publication identifier (DOI)
10.1007/s10994-016-5605-5
Author(s)
Horev, Inbal
University of Tokyo
Yger, Florian
Laboratoire d'analyse et modélisation de systèmes pour l'aide à la décision [LAMSADE]
Sugiyama, Masashi
University of Tokyo
Abstract (EN)
Symmetric positive definite (SPD) matrices, in the form of covariance matrices for example, are ubiquitous in machine learning applications. However, because their size grows quadratically with the number of variables, high dimensionality can pose a difficulty when working with them. It may therefore be advantageous to apply dimensionality reduction techniques to them. Principal component analysis (PCA) is a canonical tool for dimensionality reduction, which for vector data maximizes the preserved variance. Yet the commonly used, naive extensions of PCA to matrices result in sub-optimal variance retention. Moreover, when applied to SPD matrices, they ignore the geometric structure of the space of SPD matrices, further degrading performance. In this paper we develop a new Riemannian-geometry-based formulation of PCA for SPD matrices that (1) preserves more data variance by appropriately extending PCA to matrix data, and (2) extends the standard definition from Euclidean to Riemannian geometry. We experimentally demonstrate the usefulness of our approach as pre-processing for EEG signals and for texture image classification.
Keywords
dimensionality reduction; PCA; Riemannian geometry; SPD manifold; Grassmann manifold
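As an illustration of the general idea in the abstract, here is a minimal sketch of tangent-space PCA for SPD matrices. It assumes a log-Euclidean approach (matrix logarithm at the identity, then ordinary PCA on the flattened tangent vectors), not the paper's exact Riemannian formulation; the function name log_euclidean_pca and the random test data are ours.

```python
# Minimal sketch: log-Euclidean PCA on SPD matrices (an assumption-based
# illustration, not the authors' algorithm). Each SPD matrix is mapped to
# the tangent space at the identity via the matrix logarithm, flattened,
# and ordinary PCA is run on the resulting vectors.
import numpy as np
from scipy.linalg import logm

def log_euclidean_pca(spd_matrices, n_components):
    """spd_matrices: array of shape (N, d, d) of SPD matrices.
    Returns the principal directions and the projected coordinates."""
    # Map each SPD matrix to the tangent space and flatten to a vector.
    logs = np.array([logm(S).real for S in spd_matrices])
    N, d, _ = logs.shape
    X = logs.reshape(N, d * d)
    # Centre the data and take the leading right singular vectors.
    X_centred = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X_centred, full_matrices=False)
    components = Vt[:n_components]      # principal directions
    scores = X_centred @ components.T   # low-dimensional coordinates
    return components, scores

# Usage on synthetic SPD matrices of the form A @ A.T + eps * I.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 8, 8))
spd = A @ A.transpose(0, 2, 1) + 1e-3 * np.eye(8)
comps, scores = log_euclidean_pca(spd, n_components=3)
print(scores.shape)  # (50, 3)
```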

Related publications

Related items are displayed by title and author.

  • Geometry-aware stationary subspace analysis
    Horev, Inbal; Yger, Florian; Sugiyama, Masashi (2016) Conference paper
  • Multitask Principal Component Analysis
    Yamane, Ikko; Yger, Florian; Berar, Maxime; Sugiyama, Masashi (2016) Conference paper
  • Ensemble learning based on functional connectivity and Riemannian geometry for robust workload estimation
    Corsi, Marie-Constance; Chevallier, Sylvain; Barthélemy, Quentin; Hoxha, Isabelle; Yger, Florian Conference paper
  • Riemannian Geometry on Connectivity for Clinical BCI
    Corsi, Marie-Constance; Yger, Florian; Chevallier, Sylvain; Noûs, Camille (2021) Conference paper
  • Geodesically-convex optimization for averaging partially observed covariance matrices
    Yger, Florian; Chevallier, S.; Barthélemy, Q.; Suvrit, S. (2020) Conference paper