The cost of misclassifying minority-class data in real-world applications can be very high. The problem is especially challenging when the data is also high-dimensional, because high dimensionality increases overfitting and lowers model interpretability. Feature selection has recently become a popular way to address this problem by identifying the features that best predict the minority class. This paper introduces a novel feature selection method called SYMON, which uses symmetrical uncertainty and harmony search. Unlike existing methods, SYMON uses symmetrical uncertainty to weigh features according to their dependency on the class labels, which helps identify features that are effective at retrieving the least frequent class labels. SYMON also uses harmony search to ...
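Since the abstract describes weighting features by symmetrical uncertainty with respect to the class labels, a minimal sketch of that weighting step is given below. It assumes discretised features and a NumPy-based implementation; the function names and the toy data are illustrative assumptions, not SYMON's actual code, and the harmony-search stage is not sketched here.

import numpy as np
from collections import Counter

def entropy(values):
    # Shannon entropy (in bits) of a discrete sequence.
    counts = np.array(list(Counter(values).values()), dtype=float)
    probs = counts / counts.sum()
    return -np.sum(probs * np.log2(probs))

def symmetrical_uncertainty(feature, labels):
    # SU(X, Y) = 2 * I(X; Y) / (H(X) + H(Y)), normalised to [0, 1].
    h_x = entropy(feature)
    h_y = entropy(labels)
    h_xy = entropy(list(zip(feature, labels)))   # joint entropy H(X, Y)
    mutual_info = h_x + h_y - h_xy               # I(X; Y)
    denom = h_x + h_y
    return 0.0 if denom == 0 else 2.0 * mutual_info / denom

# Example: weigh each (discretised) feature by its dependency on the class.
X = np.array([[0, 1, 1],
              [0, 0, 1],
              [1, 1, 0],
              [1, 0, 0]])
y = np.array([0, 0, 1, 1])
weights = [symmetrical_uncertainty(X[:, j], y) for j in range(X.shape[1])]
print(weights)   # the first feature tracks the class exactly, so its SU is 1.0

A subset-search stage (harmony search in SYMON's case) would then use these weights, or a subset-level evaluation built on them, to guide its search toward features that are informative for the least frequent class labels.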
Selecting the smallest possible set of genes from microarray datasets for clinical diagnosis a...
Feature selection is beneficial for improving the performance of general machine learning tasks by e...
In this work, we suggest a new feature selection technique that lets us use the wrapper ap...
In the area of data mining, feature selection is an important task for classification and dimensiona...
In high dimensional datasets, feature selection plays a significant role in dimensionality reductio...
Many search strategies have been exploited for the task of feature selection (FS), in an effort to i...
Finding an optimal subset of features that maximizes classification accuracy is still an open proble...
Many strategies have been exploited for the task of feature selection, in an effort to identify more...
Feature selection for supervised learning concerns the problem of selecting a number of important fe...
Many search strategies have been exploited in implementing feature selection, in an effort to identi...
Classifier ensembles constitute one of the main research directions in machine learning and data min...
Feature selection and classification of imbalanced data sets are two of the most interesting machine...