Geometry-Aware Visualization of High-Dimensional Symmetric Positive Definite Matrices
Abstract
Symmetric Positive Definite (SPD) matrices are pervasive in machine learning, from data features (such as covariance matrices) to optimization processes. These matrices induce a Riemannian structure whose curvature plays a critical role in the success of approaches based on these geometries. Yet, for ML practitioners wanting to visualize SPD matrices, the existing (flat) Euclidean approaches hide the curvature of the manifold. To overcome this lack of expressivity in existing algorithms, we introduce Riemannian versions of two state-of-the-art techniques, namely t-SNE and Multidimensional Scaling. We are thus able to reduce a set of c × c SPD matrices to a set of 2 × 2 SPD matrices, capturing the curvature information and avoiding the distortion induced by flattening the representation in a Euclidean setup. Moreover, our approaches pave the way for more general dimensionality reduction applications that preserve the geometry of the data. We performed experiments on controlled synthetic datasets to verify that the low-dimensional representation preserves the geometric properties of both SPD Gaussians and geodesics. We also conducted experiments on various real datasets, such as video, anomaly detection, and brain signals, among others.
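To make the setting concrete, the sketch below is a minimal illustration, not the authors' implementation: it assumes the affine-invariant Riemannian metric on SPD matrices and a plain MDS-style stress, and shows how pairwise geodesic distances among c × c SPD inputs could be compared with distances among candidate 2 × 2 SPD embeddings, i.e. the kind of mismatch a Riemannian Multidimensional Scaling would minimize. All function names (`airm_distance`, `mds_stress`, `random_spd`) are hypothetical helpers introduced here for illustration.

```python
import numpy as np

def _sym_funcm(S, f):
    """Apply a scalar function f to the eigenvalues of a symmetric matrix S."""
    w, V = np.linalg.eigh(S)
    return V @ np.diag(f(w)) @ V.T

def airm_distance(A, B):
    """Affine-invariant Riemannian (geodesic) distance between SPD matrices:
    d(A, B) = || log(A^{-1/2} B A^{-1/2}) ||_F."""
    A_inv_sqrt = _sym_funcm(A, lambda w: 1.0 / np.sqrt(w))
    M = A_inv_sqrt @ B @ A_inv_sqrt
    M = (M + M.T) / 2.0  # re-symmetrize to guard against round-off
    return np.linalg.norm(_sym_funcm(M, np.log), "fro")

def random_spd(c, rng):
    """Random c x c SPD matrix, used here only to build a toy example."""
    X = rng.standard_normal((c, c))
    return X @ X.T + c * np.eye(c)

def mds_stress(high_dim_spds, low_dim_spds):
    """MDS-style stress: squared mismatch between the pairwise geodesic
    distances among the c x c inputs and among their 2 x 2 embeddings."""
    n = len(high_dim_spds)
    total = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            d_hi = airm_distance(high_dim_spds[i], high_dim_spds[j])
            d_lo = airm_distance(low_dim_spds[i], low_dim_spds[j])
            total += (d_hi - d_lo) ** 2
    return total

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    originals = [random_spd(8, rng) for _ in range(5)]    # 8 x 8 SPD inputs
    embeddings = [random_spd(2, rng) for _ in range(5)]   # candidate 2 x 2 SPD embeddings
    print("stress of a random 2 x 2 embedding:", mds_stress(originals, embeddings))
```

In a full method, the 2 × 2 embeddings would be optimized (e.g. by Riemannian gradient descent) to reduce this stress, or the pairwise distances would feed a t-SNE-style neighbor-matching objective instead; the sketch only fixes the quantities being compared.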
Domains
Computer Science [cs]