Abstract
Feature selection is a technique for choosing a subset of variables from multidimensional data that can improve classification accuracy across diverse datasets. In addition, selecting the best feature subset can reduce the cost of feature measurement. This work focuses on the use of wrapper feature selection. The study applies sequential forward selection (SFS), sequential backward selection (SBS), and optimized (evolutionary) selection based on the ensemble algorithms Bagging and AdaBoost, with subset evaluations performed using two classifiers: Decision Tree and Naïve Bayes. Thirteen datasets with different numbers of attributes and dimensions are obtained from the UCI Machine Learning Repository. ...
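As a concrete illustration of the wrapper approach described in the abstract, the sketch below pairs sequential forward selection with a Bagging ensemble of Decision Trees as the subset evaluator, using scikit-learn. The dataset, number of selected features, and other hyperparameters are illustrative assumptions, not the configuration used in the study.

```python
# Minimal sketch of wrapper-based sequential feature selection (assumed setup).
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import BaggingClassifier
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Illustrative UCI dataset bundled with scikit-learn; not necessarily one of
# the thirteen datasets evaluated in the paper.
X, y = load_breast_cancer(return_X_y=True)

# A Bagging ensemble of Decision Trees serves as the wrapper's evaluator.
evaluator = BaggingClassifier(DecisionTreeClassifier(), n_estimators=10, random_state=0)

# direction="forward" gives SFS; direction="backward" gives SBS.
sfs = SequentialFeatureSelector(evaluator, n_features_to_select=10,
                                direction="forward", cv=5)
sfs.fit(X, y)

# Evaluate the selected subset with cross-validated accuracy.
X_selected = sfs.transform(X)
score = cross_val_score(evaluator, X_selected, y, cv=5).mean()
print("Selected feature indices:", sfs.get_support().nonzero()[0])
print(f"CV accuracy on selected subset: {score:.3f}")
```

Swapping the evaluator for an AdaBoost ensemble or a Naïve Bayes base learner follows the same pattern: the wrapper repeatedly retrains the chosen classifier on candidate subsets and keeps the subset with the best cross-validated score.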