The K nearest neighbor classifier (K-NN) is widely discussed and applied in pattern recognition and machine learning; however, little work has been reported on the neighborhood classifier, a similar lazy classifier that uses local information to recognize a new test sample. In this paper, we introduce the neighborhood rough set model as a uniform framework to understand and implement neighborhood classifiers. The algorithm integrates an attribute reduction technique with classification learning. We study the influence of the three norms on attribute reduction and classification, and compare the neighborhood classifier with KNN, CART and SVM. The experimental results show that the neighborhood-based feature selection algorithm is able to delete most of the redundant a...
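To make the idea of a neighborhood classifier concrete, the following is a minimal sketch of the general scheme the abstract above describes: a test sample is assigned the majority class among the training samples that fall inside its delta-neighborhood under a chosen norm (1-norm, 2-norm, or infinity-norm). The function name, the default delta, and the fallback to the single nearest neighbor are assumptions made for illustration, not the exact algorithm of the cited paper.

```python
import numpy as np

def neighborhood_classify(X_train, y_train, x, delta=0.2, norm=2):
    """Assign x the majority class among training samples whose distance
    to x under the chosen norm (1, 2, or np.inf) is at most delta.
    Falling back to the single nearest neighbor when the neighborhood is
    empty is an illustrative assumption, not taken from the paper."""
    dists = np.linalg.norm(X_train - x, ord=norm, axis=1)
    neighbors = y_train[dists <= delta]
    if neighbors.size == 0:
        return y_train[np.argmin(dists)]
    labels, counts = np.unique(neighbors, return_counts=True)
    return labels[np.argmax(counts)]
```

Unlike K-NN, which fixes the number of voting neighbors, the neighborhood classifier fixes the radius, so the number of samples that vote adapts to the local density of the data.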
In this paper, we propose a coarse-to-fine K nearest neighbor (KNN) classifier (CFKNNC). CFKNNC diff...
In the rough-set field, the objective of attribute reduction is to regulate the variations of measur...
Nearest Neighbor Classifiers demand high computational resources, i.e., time and memory. Reduc...
Rough set theories are utilized in class-specific feature selection to improve the classification pe...
The k-nearest neighbor classifier follows a simple, yet powerful algorithm: collect the k data point...
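As a reference point for the algorithm described above, here is a minimal k-NN sketch (Euclidean distance, majority vote); the helper name knn_predict and the default k are illustrative assumptions rather than anything specified in the abstract.

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=3):
    """Collect the k training points closest to x (Euclidean distance)
    and return the most frequent label among them."""
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(dists)[:k]
    return Counter(y_train[nearest].tolist()).most_common(1)[0][0]
```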
Rough set theory has been successfully applied to many fields, such as data mining, pattern recognit...
Nearest neighbor (NN) classification relies on the assumption that class conditional probabilities a...
It is useful to measure classification complexity for understanding classification tasks, ...
Ensemble methods (EMs) have become increasingly popular in data mining because...
We consider improving the performance of k-Nearest Neighbor classifiers. A regularized kNN is propo...
The neighborhood rough set (NRS) is used to remove redundant features after identifying neighborhood...
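A hedged sketch of how an NRS-style feature selection can be organized is given below: the dependency measure (the fraction of samples whose delta-neighborhood is class-pure, i.e., lies in the positive region) and a greedy forward search are standard ingredients of neighborhood rough set reducts, but the names, the delta radius, and the stopping threshold eps here are assumptions for illustration.

```python
import numpy as np

def dependency(X, y, feats, delta=0.15):
    """Neighborhood dependency of the decision on a feature subset: the
    fraction of samples whose delta-neighborhood, measured only on those
    features, contains samples of a single class."""
    Z = X[:, feats]
    pure = 0
    for i in range(len(Z)):
        d = np.linalg.norm(Z - Z[i], axis=1)
        neigh = y[d <= delta]
        pure += int(np.all(neigh == y[i]))
    return pure / len(Z)

def forward_reduct(X, y, delta=0.15, eps=1e-3):
    """Greedy forward selection: repeatedly add the feature that most
    increases the dependency, stopping when the gain falls below eps."""
    selected, remaining = [], list(range(X.shape[1]))
    best = 0.0
    while remaining:
        gain, f = max((dependency(X, y, selected + [f], delta), f)
                      for f in remaining)
        if gain - best < eps:
            break
        selected.append(f)
        remaining.remove(f)
        best = gain
    return selected
```

The features left out by such a procedure are the redundant ones; the surviving subset is what a neighborhood classifier or any downstream learner would then be trained on.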
The k-nearest-neighbor rule is a well known pattern recognition technique with very good results in ...
k-nearest neighbors (k-NN), which is known to be a simple and efficient approach, is a non-parametri...
Due to the increase in the number of documents on the internet, data mining has become an important key par...
The k Nearest Neighbors (kNN) method is a widely used technique to solve classification or regressio...