We develop a novel classifier in a kernel feature space defined by the eigenspectrum of the Laplacian data matrix. The classification cost function is derived from a distance measure between probability densities. The Laplacian data matrix is obtained from a training set, while test data is mapped into the kernel space using the Nyström routine. In that space, each test point is classified according to the angle between it and the class means of the training data. We illustrate the performance of the new classifier on synthetic and real data.
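The pipeline described above can be sketched in a few steps: build an affinity matrix on the training set, take the leading eigenpairs of a normalized Laplacian-style matrix as the kernel feature space, extend the embedding to test points with the Nyström formula, and assign each test point to the class whose training mean subtends the smallest angle. The following is a minimal illustration of that scheme, not the paper's exact method: it assumes a Gaussian affinity, the symmetrically normalized affinity matrix, and cosine similarity as the angle criterion; all function names are illustrative.

```python
import numpy as np

def gaussian_affinity(A, B, sigma=1.0):
    # Pairwise Gaussian kernel between rows of A and rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def train_embedding(X, n_components=2, sigma=1.0):
    # Leading eigenpairs of the symmetrically normalized affinity
    # D^{-1/2} K D^{-1/2}, which is closely related to the normalized Laplacian.
    K = gaussian_affinity(X, X, sigma)
    d = K.sum(axis=1)
    M = K / np.sqrt(np.outer(d, d))
    vals, vecs = np.linalg.eigh(M)          # ascending eigenvalues
    idx = np.argsort(vals)[::-1][:n_components]
    return vals[idx], vecs[:, idx]

def nystrom_embed(Z, X, vals, vecs, sigma=1.0):
    # Nystrom extension: project out-of-sample points onto the
    # training eigenvectors, scaled by the inverse eigenvalues.
    Kz = gaussian_affinity(Z, X, sigma)
    dz = Kz.sum(axis=1, keepdims=True)
    dx = gaussian_affinity(X, X, sigma).sum(axis=1)
    Mz = Kz / np.sqrt(dz * dx[None, :])
    return Mz @ vecs / vals[None, :]

def classify_by_angle(z_emb, X_emb, y):
    # Assign z to the class whose mean embedding has the largest
    # cosine similarity (i.e., smallest angle) with z.
    classes = np.unique(y)
    sims = []
    for c in classes:
        m = X_emb[y == c].mean(axis=0)
        sims.append(z_emb @ m / (np.linalg.norm(z_emb) * np.linalg.norm(m) + 1e-12))
    return classes[np.argmax(sims)]
```

As a usage sketch, embedding two well-separated Gaussian blobs and classifying nearby test points with `classify_by_angle` recovers the correct labels; in practice the kernel width `sigma` and the number of retained eigenpairs would need tuning per dataset.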