Dimension reduction (DR) methods provide systematic approaches for analyzing high-dimensional data. A key requirement for DR is to incorporate global dependencies among original and embedded samples while preserving clusters in the embedding space. To achieve this, we combine the principles of optimal transport (OT) and principal component analysis (PCA). Our method seeks the best linear subspace that minimizes reconstruction error using entropic OT, which naturally encodes the neighborhood information of the samples. From an algorithmic standpoint, we propose an efficient block-majorization-minimization solver over the Stiefel manifold. Our experimental results demonstrate that our approach can effectively preserve hi...
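The abstract above describes coupling entropic OT with PCA through a block-majorization-minimization solver on the Stiefel manifold. Below is a minimal sketch of one such alternating scheme, assuming uniform sample weights, a squared-Euclidean cost, and a simple two-block update; the function names (`sinkhorn`, `ot_pca`) and hyper-parameters are illustrative assumptions, not the paper's exact solver.

```python
# Minimal sketch, assuming: uniform sample weights, squared-Euclidean cost,
# and a simple alternating (block MM-style) scheme. Names and hyper-parameters
# are illustrative, not the paper's exact algorithm.
import numpy as np

def sinkhorn(C, eps=0.1, n_iter=200):
    """Entropic OT plan between two uniform discrete measures for cost matrix C."""
    n, m = C.shape
    a, b = np.full(n, 1.0 / n), np.full(m, 1.0 / m)   # uniform marginals
    K = np.exp(-C / eps)                               # Gibbs kernel
    u, v = np.ones(n), np.ones(m)
    for _ in range(n_iter):                            # Sinkhorn scaling iterations
        u = a / (K @ v)
        v = b / (K.T @ u)
    return u[:, None] * K * v[None, :]                 # transport plan P

def ot_pca(X, k, eps=0.1, n_outer=30):
    """Alternate between (1) a Sinkhorn plan linking X to its projection X U U^T
    and (2) a closed-form update of the orthonormal basis U (d x k)."""
    n, d = X.shape
    U = np.linalg.qr(np.random.randn(d, k))[0]         # random point on the Stiefel manifold
    for _ in range(n_outer):
        X_hat = X @ U @ U.T                            # low-rank reconstruction
        C = ((X[:, None, :] - X_hat[None, :, :]) ** 2).sum(-1)
        C = C / (C.max() + 1e-12)                      # rescale cost for numerical stability
        P = sinkhorn(C, eps)
        # With P fixed, the plan-weighted reconstruction error is const - tr(U^T M U),
        # so the optimal U spans the top-k eigenvectors of the symmetric matrix M.
        A = X.T @ P.T @ X
        B = X.T @ np.diag(P.sum(axis=0)) @ X
        M = (A + A.T) - B
        eigvals, eigvecs = np.linalg.eigh(M)
        U = eigvecs[:, np.argsort(eigvals)[::-1][:k]]
    return U

if __name__ == "__main__":
    X = np.random.randn(200, 10)
    U = ot_pca(X, k=2)
    print(U.shape)                                     # (10, 2) embedding basis
```

Under these assumptions each outer iteration keeps U exactly orthonormal (eigenvectors of a symmetric matrix), which is the role the Stiefel-manifold constraint plays in the block solver sketched above.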
We address the problem of performing Principal Component Analysis over a family of probability measu...
Clustering in high-dimensional spaces is nowadays a recurrent problem in many ...
Recently there has been a lot of interest in geometrically motivated approaches to data analysis in...
We present a versatile adaptation of existing dimensionality reduction (DR) ob...
Optimal Transport (OT) defines geometrically meaningful "Wasserstein" distance...
Many approaches in machine learning rely on a weighted graph to encode the similarities between samp...
Euclidea...
We present new algorithms to compute the mean of a set of empirical probability measures under the o...
We seek a generalization of regression and principal component analysis (PCA) in a metric space wher...
High-dimensional data representation is an important problem in many different areas of science. Now...
We study the extraction of nonlinear data models in high-dimensional spaces with modified self-organ...
This article introduces...