The need for efficiently comparing and representing datasets with unknown alignment spans various fields, from model analysis and comparison in machine learning to trend discovery in collections of medical datasets. We use manifold learning to compare the intrinsic geometric structures of different datasets by comparing their diffusion operators, symmetric positive-definite (SPD) matrices that relate to approximations of the continuous Laplace-Beltrami operator from discrete samples. Existing methods typically assume known data alignment and compare such operators in a pointwise manner. Instead, we exploit the Riemannian geometry of SPD matrices to compare these operators and define a new theoretically-motivated distance based on a lower bo...
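To make the idea concrete, the sketch below builds a symmetric diffusion operator (an SPD matrix) from a point cloud and compares two such operators through their spectra, which requires no correspondence between the samples. The specific kernel construction, the bandwidth heuristic, the function names, and the spectrum-based log-Euclidean-style distance are illustrative assumptions, not the exact operator or lower-bound distance proposed in the text.

```python
import numpy as np
from scipy.spatial.distance import cdist


def diffusion_operator(X, epsilon=None):
    """Symmetric diffusion operator (SPD) from samples X.

    One common construction: Gaussian affinities, a density
    normalization, then a symmetric normalization. Details vary
    across the manifold-learning literature; this is an
    illustrative choice, not the paper's exact operator.
    """
    D2 = cdist(X, X, "sqeuclidean")
    if epsilon is None:
        epsilon = np.median(D2)              # heuristic bandwidth
    K = np.exp(-D2 / epsilon)                # Gaussian kernel (positive definite)
    d = K.sum(axis=1)
    W = K / np.outer(d, d)                   # density normalization
    w = W.sum(axis=1)
    P = W / np.sqrt(np.outer(w, w))          # symmetric normalization
    return 0.5 * (P + P.T)                   # enforce exact symmetry


def spectral_log_euclidean_distance(A, B, k=20):
    """Alignment-free comparison of two SPD operators via their spectra.

    Comparing sorted log-eigenvalues is invariant to permutations of
    the samples, so no data alignment is needed. This is a simple
    stand-in for the lower-bound distance discussed above.
    """
    eva = np.sort(np.linalg.eigvalsh(A))[::-1][:k]
    evb = np.sort(np.linalg.eigvalsh(B))[::-1][:k]
    eva = np.clip(eva, 1e-12, None)          # guard against log(0)
    evb = np.clip(evb, 1e-12, None)
    return np.linalg.norm(np.log(eva) - np.log(evb))


# Usage: two unaligned samplings of the same circle should be close.
rng = np.random.default_rng(0)
t1, t2 = rng.uniform(0, 2 * np.pi, (2, 300))
X1 = np.c_[np.cos(t1), np.sin(t1)] + 0.01 * rng.standard_normal((300, 2))
X2 = np.c_[np.cos(t2), np.sin(t2)] + 0.01 * rng.standard_normal((300, 2))
d = spectral_log_euclidean_distance(diffusion_operator(X1), diffusion_operator(X2))
print(f"intrinsic-geometry distance: {d:.3f}")
```

Because only eigenvalues enter the comparison, reordering the points of either dataset leaves the distance unchanged, which is the permutation-invariance property motivating this style of operator comparison.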