Geometry-aware principal component analysis for symmetric positive definite matrices
hal.structure.identifier | University of Tokyo | |
dc.contributor.author | Horev, Inbal | |
hal.structure.identifier | Laboratoire d'analyse et modélisation de systèmes pour l'aide à la décision [LAMSADE] | |
dc.contributor.author | Yger, Florian (HAL ID: 17768, ORCID: 0000-0002-7182-8062) | |
hal.structure.identifier | University of Tokyo | |
dc.contributor.author | Sugiyama, Masashi | |
dc.date.accessioned | 2017-03-07T12:53:49Z | |
dc.date.available | 2017-03-07T12:53:49Z | |
dc.date.issued | 2016 | |
dc.identifier.issn | 0885-6125 | |
dc.identifier.uri | https://basepub.dauphine.fr/handle/123456789/16293 | |
dc.language.iso | en | en |
dc.subject | dimensionality reduction | en |
dc.subject | PCA | en |
dc.subject | Riemannian geometry | en |
dc.subject | SPD manifold | en |
dc.subject | Grassmann manifold | en |
dc.subject.ddc | 516; 519 | en |
dc.title | Geometry-aware principal component analysis for symmetric positive definite matrices | en |
dc.type | Article accepted for publication or published | |
dc.description.abstracten | Symmetric positive definite (SPD) matrices, in the form of covariance matrices for example, are ubiquitous in machine learning applications. However, because their size grows quadratically with the number of variables, high dimensionality can pose a difficulty when working with them. It may therefore be advantageous to apply dimensionality reduction techniques to them. Principal component analysis (PCA) is a canonical tool for dimensionality reduction, which for vector data maximizes the preserved variance. Yet the commonly used, naive extensions of PCA to matrices result in sub-optimal variance retention. Moreover, when applied to SPD matrices, they ignore the geometric structure of the space of SPD matrices, further degrading performance. In this paper we develop a new Riemannian-geometry-based formulation of PCA for SPD matrices that (1) preserves more data variance by appropriately extending PCA to matrix data, and (2) extends the standard definition from Euclidean to Riemannian geometry. We experimentally demonstrate the usefulness of our approach as pre-processing for EEG signals and for texture image classification. | en |
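As a minimal illustration of the idea the abstract describes, the sketch below contrasts a naive vectorize-then-PCA treatment of SPD matrices with a simple geometry-aware alternative: mapping each SPD matrix to a tangent space via the matrix logarithm (the log-Euclidean approach) before applying ordinary PCA. This is NOT the paper's algorithm, which optimizes a projection over the Grassmann manifold under Riemannian metrics on the SPD manifold; all names here are hypothetical and the example only shows the general workflow.

```python
# Illustrative sketch (hypothetical, not the authors' method): reduce a
# set of SPD matrices by mapping them through the matrix logarithm
# (log-Euclidean tangent space) and then applying ordinary PCA.
import numpy as np

rng = np.random.default_rng(0)

def random_spd(d):
    # Random SPD matrix: A A^T plus a ridge to ensure strict positivity.
    a = rng.standard_normal((d, d))
    return a @ a.T + d * np.eye(d)

def logm_spd(s):
    # Matrix logarithm of an SPD matrix via its eigendecomposition.
    w, v = np.linalg.eigh(s)
    return (v * np.log(w)) @ v.T

d, n, k = 6, 50, 3                 # matrix size, sample count, target dim
mats = [random_spd(d) for _ in range(n)]

# Geometry-aware step: move each SPD matrix to the tangent space at the
# identity before vectorizing, instead of flattening the raw matrices.
X = np.stack([logm_spd(s).ravel() for s in mats])
Xc = X - X.mean(axis=0)            # center the log-matrices

# Ordinary PCA on the tangent-space vectors via SVD.
_, _, vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ vt[:k].T             # n x k reduced representation

print(scores.shape)                # (50, 3)
```

The log-map respects the curved geometry of the SPD cone far better than flattening raw matrices, which is the kind of structure-awareness the paper develops rigorously with Riemannian metrics.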
dc.relation.isversionofjnlname | Machine Learning | |
dc.relation.isversionofjnldate | 2016-11 | |
dc.relation.isversionofjnlpages | 1-30 | en |
dc.relation.isversionofdoi | 10.1007/s10994-016-5605-5 | en |
dc.contributor.countryeditoruniversityother | JAPAN | |
dc.subject.ddclabel | Geometry; Probability and applied mathematics | en |
dc.relation.forthcoming | no | en |
dc.relation.forthcomingprint | no | en |
dc.description.ssrncandidate | no | en |
dc.description.halcandidate | yes | en |
dc.description.readership | research | en |
dc.description.audience | International | en |
dc.relation.Isversionofjnlpeerreviewed | yes | en |
dc.date.updated | 2017-03-07T12:41:15Z | |
hal.identifier | hal-01484571 | * |
hal.version | 1 | * |
hal.author.function | aut | |
hal.author.function | aut | |
hal.author.function | aut |
Files in this item

There are no files associated with this item.