Geometry-aware stationary subspace analysis
Horev, Inbal; Yger, Florian; Sugiyama, Masashi (2016), Geometry-aware stationary subspace analysis, in Robert J. Durrant, Kee-Eung Kim (eds.), Proceedings of The 8th Asian Conference on Machine Learning (ACML 2016), JMLR: Workshop and Conference Proceedings, pp. 430–444
Type: Communication / Conference
External document link: http://jmlr.org/proceedings/papers/v63/Horev84.html
Book title: Proceedings of The 8th Asian Conference on Machine Learning (ACML 2016)
Book author: Robert J. Durrant, Kee-Eung Kim
Abstract (EN): In many real-world applications data exhibits non-stationarity, i.e., its distribution changes over time. One approach to handling non-stationarity is to remove or minimize it before attempting to analyze the data. In the context of brain computer interface (BCI) data analysis this is sometimes achieved using stationary subspace analysis (SSA). The classic SSA method finds a matrix that projects the data onto a stationary subspace by optimizing a cost function based on a matrix divergence. In this work we present an alternative method for SSA based on a symmetrized version of this matrix divergence. We show that this frames the problem in terms of distances between symmetric positive definite (SPD) matrices, suggesting a geometric interpretation of the problem. Stemming from this geometric viewpoint, we introduce and analyze a method which utilizes the geometry of the SPD matrix manifold and the invariance properties of its metrics. Most notably we show that these invariances alleviate the need to whiten the input matrices, a common step in many SSA methods which often introduces error. We demonstrate the usefulness of our technique in experiments on both synthetic and real-world data.
Subjects / Keywords: Stationary subspace analysis; dimensionality reduction; Riemannian geometry; SPD manifold; Grassmann manifold
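The abstract's claim that metric invariances remove the need for whitening can be illustrated with a minimal sketch (not the authors' code; the function name `airm_distance` and the test matrices are illustrative). The standard affine-invariant Riemannian metric on the SPD manifold gives the distance d(A, B) = ||log(A^{-1/2} B A^{-1/2})||_F, which is unchanged under any congruence A → W A W^T, B → W B W^T by an invertible W — in particular, under a whitening transform, so whitening the input covariance matrices cannot change the distances the method optimizes:

```python
import numpy as np
from scipy.linalg import fractional_matrix_power, logm

def airm_distance(A, B):
    """Affine-invariant Riemannian distance between SPD matrices A and B:
    d(A, B) = ||log(A^{-1/2} B A^{-1/2})||_F."""
    A_inv_sqrt = fractional_matrix_power(A, -0.5)
    M = A_inv_sqrt @ B @ A_inv_sqrt          # SPD, similar to A^{-1} B
    return np.linalg.norm(np.real(logm(M)), "fro")

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 5))
A = X @ X.T + 5 * np.eye(5)                  # random SPD "covariance"
Y = rng.standard_normal((5, 5))
B = Y @ Y.T + 5 * np.eye(5)
W = rng.standard_normal((5, 5))              # any invertible map, e.g. a whitener

d1 = airm_distance(A, B)
d2 = airm_distance(W @ A @ W.T, W @ B @ W.T)
assert np.isclose(d1, d2)                    # congruence invariance holds
```

The invariance follows because d(A, B) depends only on the eigenvalues of A^{-1}B, and a congruence by W leaves those eigenvalues unchanged.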