This paper introduces a new local asymmetric weighting scheme for the nearest neighbour classification algorithm. It is shown, both through theoretical arguments and computer experiments, that high compression rates can be achieved while outperforming the accuracy of the standard nearest neighbour classification algorithm and nearly matching the accuracy of the k-NN algorithm with k optimised for each data set. Moreover, the learning procedure, based on reinforcement, is quite robust to suboptimal choices of the reinforcement, punishment and compression parameters.
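To make the idea concrete, below is a minimal sketch of a 1-NN classifier with one local weight per stored prototype, trained by reinforcement/punishment and pruned for compression. The update rule, the class name `LocallyWeightedNN`, and the parameters `reward`, `punish`, `drop_threshold` and `epochs` are illustrative assumptions, not the exact scheme defined in the paper.

```python
import numpy as np


class LocallyWeightedNN:
    """Sketch: 1-NN with per-prototype weights learned by reinforcement.

    Prototypes that repeatedly help classification get their weight increased
    (reinforcement), misleading prototypes get it decreased (punishment), and
    low-weight prototypes are discarded (compression).  All parameter values
    here are placeholders, not the paper's recommended settings.
    """

    def __init__(self, reward=1.1, punish=0.8, drop_threshold=0.2, epochs=5):
        self.reward = reward                  # multiplicative reinforcement
        self.punish = punish                  # multiplicative punishment
        self.drop_threshold = drop_threshold  # compression threshold on weights
        self.epochs = epochs

    def fit(self, X, y):
        X, y = np.asarray(X, dtype=float), np.asarray(y)
        w = np.ones(len(X))                   # one local weight per prototype
        for _ in range(self.epochs):
            for i in range(len(X)):
                # Weighted distances from sample i to every other prototype;
                # a larger weight makes a prototype effectively "closer".
                d = np.linalg.norm(X - X[i], axis=1) / w
                d[i] = np.inf                 # leave-one-out evaluation
                j = int(np.argmin(d))
                if y[j] == y[i]:
                    w[j] *= self.reward       # the neighbour helped
                else:
                    w[j] *= self.punish       # the neighbour misled
        keep = w >= self.drop_threshold       # compression step
        self.X_, self.y_, self.w_ = X[keep], y[keep], w[keep]
        return self

    def predict(self, X):
        X = np.asarray(X, dtype=float)
        preds = []
        for x in X:
            d = np.linalg.norm(self.X_ - x, axis=1) / self.w_
            preds.append(self.y_[int(np.argmin(d))])
        return np.array(preds)
```

Because the weights enter only through a rescaled distance, prediction remains a plain nearest-neighbour lookup over the (compressed) prototype set, which is what allows the stored set to shrink without giving up the simplicity of the NN rule.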