k nearest neighbors (kNN) is one of the most widely used supervised learning algorithms to classify Gaussian distributed data, but it does not achieve good results when it is applied to nonlinear manifold distributed data, especially when only a very limited number of labeled samples is available. In this paper, we propose a new graph-based kNN algorithm which can effectively handle both Gaussian distributed data and nonlinear manifold distributed data. To achieve this goal, we first propose a constrained Tired Random Walk (TRW) by constructing an R-level nearest-neighbor strengthened tree over the graph, and then compute a TRW matrix for similarity measurement purposes. After this, the nearest neighbors are identified according to the TRW matr...
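As a point of reference for the baseline this abstract improves on, a minimal sketch of plain kNN classification (majority vote under Euclidean distance, the setting in which kNN handles Gaussian-distributed clusters well) might look like the following; the function name and data are illustrative, not from the paper:

```python
from collections import Counter
import math

def knn_predict(train_X, train_y, query, k=3):
    """Classify `query` by majority vote among its k nearest
    training points under Euclidean distance."""
    dists = sorted(
        (math.dist(x, query), y) for x, y in zip(train_X, train_y)
    )
    votes = Counter(y for _, y in dists[:k])
    return votes.most_common(1)[0][0]

# Two well-separated Gaussian-like clusters: the easy case for kNN.
X = [(0.0, 0.0), (0.1, 0.2), (1.0, 1.0), (0.9, 1.1)]
y = ["a", "a", "b", "b"]
print(knn_predict(X, y, (0.05, 0.1), k=3))  # -> a
```

The abstract's point is that this Euclidean vote breaks down on manifold-shaped data, which is what the TRW similarity is meant to repair.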
Ensemble methods (EMs) have become increasingly popular in data mining because...
Propagating similarity information along the data manifold requires careful selection of local neig...
In this thesis, we develop methods for constructing an A-weighted metric (x - y)' A (x - y) that im...
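The quadratic form (x - y)' A (x - y) from the thesis abstract above is the squared Mahalanobis-style distance that metric-learning methods parameterize; a small illustrative sketch (names are hypothetical, and A must be positive semidefinite for a valid metric):

```python
def a_weighted_dist(x, y, A):
    """Squared A-weighted distance (x - y)' A (x - y),
    computed as sum_i sum_j d_i * A[i][j] * d_j."""
    d = [xi - yi for xi, yi in zip(x, y)]
    return sum(di * aij * dj
               for di, row in zip(d, A)
               for aij, dj in zip(row, d))

# With A = I this reduces to the ordinary squared Euclidean distance.
A = [[1.0, 0.0], [0.0, 1.0]]
print(a_weighted_dist([1.0, 2.0], [0.0, 0.0], A))  # -> 5.0
```

Learning A from labeled pairs is what distinguishes the metric-learning approaches listed here from vanilla kNN, which implicitly fixes A to the identity.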
Background: Traditional data classification techniques usually divide the data space into sub-spaces...
In this paper, we propose a locality-constrained and sparsity-encouraged manifold fitting approach, ...
In case of insufficient data samples in high-dimensional classification problem...
Originally motivated by computational considerations, we demonstrate how computationally efficient and...
This thesis is related to distance metric learning for kNN classification. We use the k nearest neig...
The standard kNN algorithm suffers from two major drawbacks: sensitivity to the parameter value k, i...
We study clustering algorithms based on neighborhood graphs on a random sample of data points. The q...
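The neighborhood-graph clustering studied in the abstract above can be sketched in its simplest form: connect each point to its k nearest neighbours, symmetrise, and read clusters off as connected components. The helper below is a hypothetical illustration under those assumptions, not the construction analysed in the paper:

```python
import math
from collections import defaultdict

def knn_graph_clusters(points, k=2):
    """Cluster points as connected components of a symmetric
    kNN graph, found by depth-first flood fill."""
    n = len(points)
    adj = defaultdict(set)
    for i in range(n):
        # Indices sorted by distance to point i; position 0 is i itself.
        nbrs = sorted(range(n), key=lambda j: math.dist(points[i], points[j]))
        for j in nbrs[1:k + 1]:
            adj[i].add(j)
            adj[j].add(i)  # symmetrise the graph
    labels, comp = [-1] * n, 0
    for s in range(n):
        if labels[s] != -1:
            continue
        stack = [s]
        while stack:
            v = stack.pop()
            if labels[v] == -1:
                labels[v] = comp
                stack.extend(adj[v])
        comp += 1
    return labels

pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
print(knn_graph_clusters(pts, k=2))  # -> [0, 0, 0, 1, 1, 1]
```

The theoretical question these abstracts address is how the choice of k (or the neighbourhood radius) affects whether such components recover the true clusters as the sample grows.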
Building a good graph to represent data structure is important in many computer vision and machine l...
Sample weighting and variations in neighbourhood or data-dependent distance metric definitions are t...