Feature selection, as a preprocessing step to machine learning, has been very effective in reducing dimensionality, removing irrelevant data, increasing learning accuracy, and improving result comprehensibility. Traditional feature selection methods resort to random sampling in dealing with data sets with a huge number of instances. In this paper, we introduce the concept of active feature selection, and investigate a selective sampling approach to active feature selection in a filter model setting. We present a formalism of selective sampling based on data variance, and apply it to a widely used feature selection algorithm, Relief. Further, we show how it realizes active feature selection and reduces the required number...
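For background, since the abstract refers to the Relief algorithm, the following is a minimal sketch of the standard Relief feature-weighting procedure with uniform random sampling of instances (the baseline the paper improves on), not the paper's variance-based selective-sampling variant. The function name, parameters, and use of NumPy are illustrative assumptions.

```python
import numpy as np

def relief_weights(X, y, n_samples, rng=None):
    """Standard Relief feature weighting (illustrative sketch).

    For each sampled instance, find its nearest hit (same class) and
    nearest miss (different class), then increase the weight of features
    that differ on the miss and agree on the hit.
    """
    rng = np.random.default_rng(rng)
    n, d = X.shape

    # Scale each feature to [0, 1] so per-feature differences are comparable.
    lo, hi = X.min(axis=0), X.max(axis=0)
    span = np.where(hi > lo, hi - lo, 1.0)
    Xs = (X - lo) / span

    w = np.zeros(d)
    for i in rng.choice(n, size=n_samples, replace=False):
        diffs = np.abs(Xs - Xs[i])        # per-feature distance to every instance
        dists = diffs.sum(axis=1)         # Manhattan distance
        dists[i] = np.inf                 # exclude the instance itself
        same = (y == y[i])
        hit = np.argmin(np.where(same, dists, np.inf))   # nearest same-class instance
        miss = np.argmin(np.where(~same, dists, np.inf)) # nearest other-class instance
        w += (diffs[miss] - diffs[hit]) / n_samples
    return w
```

In the selective-sampling setting described in the abstract, the uniform draw in `rng.choice` would presumably be replaced by a choice of instances guided by data variance; the details of that step are given in the body of the paper.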