We investigate the geometrical structure of probabilistic generative dimensionality reduction models using the tools of Riemannian geometry. We explicitly define a distribution over the natural metric given by the models. We provide the necessary algorithms to compute expected metric tensors where the distribution over mappings is given by a Gaussian process. We treat the corresponding latent variable model as a Riemannian manifold and we use the expectation of the metric under the Gaussian process prior to define interpolating paths and measure distances between latent points. We show how distances that respect the expected metric lead to more appropriate generation of new data.
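As a concrete illustration of the expected-metric construction described above, the sketch below assumes the closed form E[G] = E[J]^T E[J] + D * Sigma_J for a Gaussian process with D independent output dimensions, where the rows of the (D x q) Jacobian J of the mapping are Gaussian with shared covariance Sigma_J. The function names, inputs, and numerical values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def expected_metric(mu_J, Sigma_J):
    # Expected Riemannian metric under the GP: E[G] = E[J]^T E[J] + D * Sigma_J,
    # where J is the (D x q) Jacobian of the GP mapping at a latent point and
    # each of its D rows is Gaussian with the shared (q x q) covariance Sigma_J.
    D = mu_J.shape[0]
    return mu_J.T @ mu_J + D * Sigma_J

def discrete_curve_length(curve, metric_at):
    # Length of a discretized latent curve under a position-dependent metric:
    # sum_i sqrt(dc_i^T G(c_i) dc_i), a Riemann-sum approximation of the
    # curve length integral  \int sqrt(c'(t)^T G(c(t)) c'(t)) dt.
    length = 0.0
    for a, b in zip(curve[:-1], curve[1:]):
        mid = 0.5 * (a + b)          # evaluate the metric at the segment midpoint
        d = b - a
        length += np.sqrt(d @ metric_at(mid) @ d)
    return length

# Illustrative usage with arbitrary values (not from the paper):
mu_J = np.array([[1.0, 0.2],
                 [0.0, 1.5],
                 [0.3, 0.1]])          # D = 3 observed dims, q = 2 latent dims
Sigma_J = 0.1 * np.eye(2)              # posterior uncertainty of the Jacobian
G = expected_metric(mu_J, Sigma_J)     # (2, 2) expected metric tensor

# Straight-line path between two latent points; a constant metric is used here
# only for brevity, whereas in the model G varies with the latent location.
path = np.linspace([0.0, 0.0], [1.0, 1.0], num=20)
print(discrete_curve_length(path, lambda x: G))
```

Because the D * Sigma_J term inflates the expected metric wherever the Jacobian is uncertain, curves measured this way become longer in poorly supported regions of the latent space, which is what pulls interpolating paths toward the data.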
When considering probabilistic pattern recognition methods, especially methods based on Bayesian ana...
Using the tools of category theory and differential geometry, we extend the geometric notions conseq...
This paper develops a theory of clustering and coding which combines a geometric model wi...
Probabilistic Dimensionality Reduction methods can provide a flexible data representation and a more...
We build on recent work on the Riemannian geometry of generative networks to propose a new approac...
This thesis introduces geometric representations relevant to the analysis of datasets of random vect...
We study a probabilistic numerical method for the solution of both boundary and initial value ...
In this paper, we develop a new classification method for manifold-valued data in the framework of p...
With the possibility of interpreting data using increasingly complex models, w...
Deep generative models have de facto emerged as state of the art when it comes to density estimation...
This paper presents novel mathematical results in support of the probabilistic...
Geometry plays an important role in modern statistical learning theory, and many different aspects o...
When dealing with a parametric statistical model, a Riemannian manifold can naturally appear by endo...