We introduce a framework for feature selection based on dependence maximization between the selected features and the labels of an estimation problem, using the Hilbert-Schmidt Independence Criterion. The key idea is that good features should be highly dependent on the labels. Our approach leads to a greedy procedure for feature selection. We show that a number of existing feature selectors are special cases of this framework. Experiments on both artificial and real-world data show that our feature selector works well in practice.
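The abstract above describes scoring feature subsets by their HSIC dependence with the labels and selecting features greedily. A minimal sketch of this idea is given below, assuming an RBF kernel on both features and labels and a forward-greedy variant (the hyphenated names `rbf_kernel`, `hsic`, and `greedy_hsic_selection` are illustrative, not from the paper, which in fact uses backward elimination):

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    # Gaussian RBF kernel from pairwise squared distances.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-d2 / (2 * sigma ** 2))

def hsic(K, L):
    # Biased empirical HSIC estimate: tr(K H L H) / (n - 1)^2,
    # where H centers the kernel matrices.
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

def greedy_hsic_selection(X, y, num_features):
    # Forward-greedy selection: at each step add the feature whose
    # inclusion maximizes HSIC between selected features and labels.
    L = rbf_kernel(y.reshape(-1, 1))  # kernel on the labels
    selected, remaining = [], list(range(X.shape[1]))
    while len(selected) < num_features and remaining:
        scores = [hsic(rbf_kernel(X[:, selected + [j]]), L)
                  for j in remaining]
        best = remaining[int(np.argmax(scores))]
        selected.append(best)
        remaining.remove(best)
    return selected
```

On synthetic data where only one column carries label information, this procedure picks that column early, since its kernel is the one most dependent on the label kernel.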
In a classification problem, we would like to assign a model to the observed data using its features...
International audienceDimensionality reduction using feature extraction and selection approaches is ...
In the past decade, various sparse learning based unsupervised feature selection methods have been d...
We introduce a framework of feature selection based on dependence maximization between the selected...
We introduce a framework for filtering features that employs the Hilbert-Schmidt Independence Criter...
In many important real world applications the initial representation of the data is inconvenient, or...
The goal of supervised feature selection is to find a subset of input features that are responsib...
This paper uses a classical approach to feature selection: minimization of a cost function applied o...
The proposed feature selection method aims to find a minimum subset of the most informativ...
This letter introduces a nonlinear measure of independence between random variables for remote sensi...
Analyzing high-dimensional data stands as a great challenge in machine learning. In order to deal wi...
The paper addresses the problem of making dependency-aware feature selection feasible in pattern rec...