© 2014 Elsevier Ltd. All rights reserved. Representing data as a linear combination of a set of selected known samples is of interest for various machine learning applications such as dimensionality reduction or classification. k-Nearest Neighbors (kNN) and its variants are still among the best-known and most often used techniques. Some popular richer representations are Sparse Representation (SR) based on solving an l1-regularized least squares formulation, Collaborative Representation (CR) based on l2-regularized least squares, and Locally Linear Embedding (LLE) based on an l1-constrained least squares problem. We propose a novel sparse representation, the Iterative Nearest Neighbors (INN). It combines the power of SR and LLE with the co...
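Of the representations named above, Collaborative Representation is the simplest to illustrate, since the l2-regularized least squares problem it solves has a closed form (ridge regression over the sample dictionary). A minimal sketch, assuming NumPy; the function name and the toy dictionary are illustrative, not from the cited work:

```python
import numpy as np

def collaborative_representation(X, y, lam=0.1):
    """Collaborative Representation coefficients: solve
    min_a ||y - X a||^2 + lam * ||a||^2, whose closed form is
    a = (X^T X + lam I)^{-1} X^T y.
    X has one known sample per column; y is the query vector."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Toy example: with an identity dictionary the solution is just y / (1 + lam).
X = np.eye(3)
a = collaborative_representation(X, np.array([1.0, 0.0, 0.0]), lam=0.1)
```

SR and LLE replace the l2 penalty with an l1 penalty or an l1 constraint, respectively, which requires an iterative solver rather than a single linear solve.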
Leading machine learning techniques rely on inputs in the form of pairwise similarities between obje...
For many computer vision and machine learning problems, large training sets are key for good perform...
Naive Bayes Nearest Neighbor (NBNN) has been proposed as a powerful, learning-free, non-parametric a...
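The NBNN decision rule is easy to state: for each class, sum every query descriptor's squared distance to its nearest neighbor among that class's descriptors, and predict the class with the smallest total. A minimal sketch under that rule, with hypothetical function and variable names and a brute-force nearest-neighbor search:

```python
import math

def nbnn_classify(query_descs, class_descs):
    """Naive Bayes Nearest Neighbor: argmin over classes c of
    sum_i ||d_i - NN_c(d_i)||^2, where d_i are the query's local
    descriptors and NN_c finds the nearest descriptor of class c."""
    def nn_sq_dist(d, descs):
        return min(math.dist(d, e) ** 2 for e in descs)
    return min(class_descs,
               key=lambda c: sum(nn_sq_dist(d, class_descs[c])
                                 for d in query_descs))

# Toy example: one query descriptor near class 'a' in 2-D.
pred = nbnn_classify([(0.1, 0.0)],
                     {'a': [(0.0, 0.0), (1.0, 0.0)],
                      'b': [(5.0, 5.0)]})
```

Note this is learning-free: there is no training phase beyond storing the per-class descriptor sets.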
A new method is introduced that makes use of sparse image representations to s...
In this paper, we introduce a Regression Nearest Neighbor framework for general classification tasks...
In this letter, a sparse representation-based nearest neighbor (SRNN) classifier is propose...
Conventionally, the k nearest-neighbor (kNN) classification is implemented with the use of the Eucli...
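The conventional kNN baseline these abstracts build on can be sketched in a few lines: rank training samples by Euclidean distance to the query and take a majority vote among the k closest. A minimal pure-Python sketch; names are illustrative:

```python
import math
from collections import Counter

def knn_classify(train, labels, query, k=3):
    """Plain kNN with the Euclidean metric: sort training points by
    distance to the query and vote among the k nearest labels."""
    order = sorted(range(len(train)),
                   key=lambda i: math.dist(train[i], query))
    return Counter(labels[i] for i in order[:k]).most_common(1)[0][0]

# Toy example: two 'a' points near the origin, two 'b' points far away.
pred = knn_classify([(0, 0), (0, 1), (5, 5), (6, 5)],
                    ['a', 'a', 'b', 'b'],
                    (0, 0.5), k=3)
```

The variants discussed in these abstracts replace the Euclidean metric or the hard vote with learned or representation-based alternatives.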
Timofte R., Van Gool L., "Iterative nearest neighbors", Pattern Recognition, vol. 48, no. 1, pp. 6...
We propose a new collaborative neighbor representation algorithm for face recognition based on a rev...
We consider improving the performance of k-Nearest Neighbor classifiers. A regularized kNN is propo...
Copyright © 2018, Association for the Advancement of Artificial Intelligence (www.aaai.org). All rig...