Feature selection in learning to rank has recently emerged as a crucial issue. Whereas several preprocessing approaches have been proposed, only a few works have focused on integrating feature selection into the learning process. In this paper, we propose a general framework for feature selection in learning to rank using support vector machines with a sparse regularization term. We investigate both classical convex regularizations, such as ℓ1 or weighted ℓ1, and nonconvex regularization terms, such as the log penalty, the minimax concave penalty, or the ℓp pseudo-norm with p < 1. Two algorithms are proposed: an accelerated proximal approach for solving the convex problems, and a reweighted ℓ1 scheme to address the nonconvex regularizations...
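As an illustration of the two algorithmic ingredients mentioned above, the following Python sketch combines a proximal-gradient solver for a (weighted) ℓ1-regularized pairwise ranking loss with a reweighted ℓ1 outer loop that majorizes a log penalty. This is a minimal sketch rather than the authors' implementation: the squared-hinge pairwise loss, the fixed step size, and the synthetic data are assumptions made only for the example.

# Minimal sketch (not the authors' code): proximal gradient for a (weighted)
# l1-regularized pairwise ranking loss, plus a reweighted-l1 outer loop that
# approximates a nonconvex log penalty. Loss, step size, and data are assumed.
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def pairwise_grad(w, D):
    """Gradient of the squared hinge on pairwise score differences:
    mean_i max(0, 1 - w'(x+_i - x-_i))^2, where D stacks the differences."""
    active = np.maximum(1.0 - D @ w, 0.0)
    return -2.0 * (D.T @ active) / D.shape[0]

def prox_gradient_l1(D, lam_weights, step=0.05, n_iter=300):
    """Proximal gradient descent for the weighted-l1-regularized ranking problem."""
    w = np.zeros(D.shape[1])
    for _ in range(n_iter):
        w = soft_threshold(w - step * pairwise_grad(w, D), step * lam_weights)
    return w

def reweighted_l1(D, lam=0.1, eps=1e-2, n_outer=5):
    """Reweighted-l1 scheme: each outer pass solves a weighted-l1 problem whose
    weights lam / (|w_j| + eps) majorize the log penalty sum_j log(|w_j| + eps)."""
    weights = np.full(D.shape[1], lam)
    w = np.zeros(D.shape[1])
    for _ in range(n_outer):
        w = prox_gradient_l1(D, weights)
        weights = lam / (np.abs(w) + eps)   # small coefficients get penalized more
    return w

if __name__ == "__main__":
    # Hypothetical data: 200 preference pairs, 30 features, only 5 informative.
    rng = np.random.default_rng(0)
    n_pairs, d = 200, 30
    w_true = np.zeros(d); w_true[:5] = 1.0
    X_pos = rng.normal(size=(n_pairs, d)) + 0.5 * w_true
    X_neg = rng.normal(size=(n_pairs, d))
    w_hat = reweighted_l1(X_pos - X_neg)
    print("selected features:", np.flatnonzero(np.abs(w_hat) > 1e-3))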
We develop an exact penalty approach for feature selection in machine learning...
Many supervised learning problems are considered difficult to solve either because of the redundant ...
Thesis: S.M., Massachusetts Institute of Technology, Sloan School of Management, Operations Research...
This paper introduces a new support vector machine (SVM) formulation to obtain sparse solutions in...
To select the most useful and the least redundant features to be used in ranki...
In this work, we propose a sparse version of the Support Vector Regression (SVR) algorithm that use...
In this paper, we propose a unified framework for improved structure estimation and feature selectio...
Sparse learning problems, known as feature selection problems or variable selection problems, are a ...
Learning to rank algorithms deal with a very large number of features to aut...
In supervised classification, data representation is usually considered at the...