The regularization functional induced by the graph Laplacian of a random neighborhood graph built on the data is adaptive in two ways: first, it adapts to an underlying manifold structure, and second, to the density of the data-generating probability measure. In this paper we identify the limit of this regularizer and show uniform convergence over the space of Hölder functions. As an intermediate step we derive upper bounds on the covering numbers of Hölder functions on compact Riemannian manifolds, which are of independent interest for the theoretical analysis of manifold-based learning methods.
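To make the object of study concrete, the following is a minimal sketch of the empirical regularization functional induced by a graph Laplacian on a random neighborhood graph. The Gaussian weighting, the bandwidth `h`, and the `1/(n^2 h^2)` normalization are assumed conventions for illustration; the paper's exact normalization and weight function may differ.

```python
import numpy as np

def graph_laplacian_regularizer(X, f, h=0.5):
    """Empirical graph-Laplacian regularization functional (illustrative form).

    X : (n, d) array of sample points drawn from the data-generating measure.
    f : (n,) array of function values at the sample points.
    h : neighborhood bandwidth (an assumed choice, not from the source).

    Returns S_n(f) = 1/(n^2 h^2) * sum_{i,j} w_ij * (f_i - f_j)^2,
    where w_ij are Gaussian neighborhood weights. As n grows and h shrinks
    suitably, such functionals converge to a weighted smoothness penalty
    on the underlying manifold, weighted by the data density.
    """
    n = len(X)
    # pairwise squared Euclidean distances between sample points
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2)
    w = np.exp(-d2 / (2.0 * h ** 2))          # Gaussian edge weights
    diff2 = (f[:, None] - f[None, :]) ** 2    # squared function differences
    return float((w * diff2).sum() / (n ** 2 * h ** 2))

# A function that varies smoothly over the sample should incur a smaller
# penalty than one that fluctuates independently of location.
rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 2))
smooth = X[:, 0]                 # linear in the first coordinate
rough = rng.normal(size=200)     # noise, uncorrelated with position
assert graph_laplacian_regularizer(X, smooth) < graph_laplacian_regularizer(X, rough)
```

The adaptivity described in the abstract shows up here implicitly: edges (large `w_ij`) concentrate where the data is dense and where the samples lie, so the penalty automatically emphasizes smoothness along the support of the measure.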