In this work, we proposed a sparse version of the Support Vector Regression (SVR) algorithm that uses regularization to achieve sparsity in function estimation. To this end, we used an adaptive L0 penalty that has a ridge structure and therefore introduces no additional computational complexity into the algorithm. We also explored an alternative approach based on a similar proposal in the Support Vector Machine (SVM) literature. Through numerical studies, we demonstrated the effectiveness of both proposals. To the best of our knowledge, this is the first discussion of a sparse version of Support Vector Regression in the sense of variable selection rather than support vector selection.
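The "adaptive L0 penalty with ridge structure" idea can be illustrated, for plain linear regression rather than the paper's SVR setting, by an iteratively reweighted ridge scheme: each pass is an ordinary ridge solve whose per-coefficient weights grow as coefficients shrink, approximating an L0 penalty. This is a generic sketch of the idea, not the paper's exact algorithm; all function names and parameter values are this example's assumptions.

```python
import numpy as np

def adaptive_ridge(X, y, lam=1.0, eps=1e-8, n_iter=50):
    """Iteratively reweighted ridge regression approximating an L0 penalty.

    Each pass solves a plain ridge problem with per-coefficient weights
    lam / (beta_j**2 + eps): coefficients that shrink toward zero receive
    ever-larger penalties and are driven out of the model, while large
    coefficients are barely penalized.
    """
    beta = np.linalg.lstsq(X, y, rcond=None)[0]      # OLS warm start
    for _ in range(n_iter):
        w = lam / (beta**2 + eps)                    # adaptive per-feature weights
        beta = np.linalg.solve(X.T @ X + np.diag(w), X.T @ y)
    beta[np.abs(beta) < 1e-4] = 0.0                  # hard-threshold residual noise
    return beta

# Toy problem: only the first two of ten features carry signal.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.standard_normal(200)
beta = adaptive_ridge(X, y)
print(np.nonzero(beta)[0])   # the true features 0 and 1 should survive
```

Because each iteration is just a weighted ridge solve, the scheme keeps the closed-form, ridge-like computational cost that the abstract highlights, while still producing exact zeros after thresholding.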
A locally regularized orthogonal least squares (LROLS) algorithm is proposed for constructing parsim...
Feature selection in learning to rank has recently emerged as a crucial issue....
In this paper, we review state-of-the-art methods for feature selection in statistics with an applic...
This is an electronic version of the paper presented at the 19th European Symposium on Artificial Ne...
The real-world data nowadays is usually in high dimension. For example, one data image can be repres...
This paper introduces a new support vector machine (SVM) formulation to obtain sparse solutions in...
Suykens et al. (Neurocomputing (2002), in press) describe a weighted least-squares formulation of th...
Least squares support vector machines (LSSVMs) have been widely applied for classification and regre...
The Least Squares Support Vector Machine (LS-SVM) is a modified SVM with a ridge regression cost fun...
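Several of the snippets above concern the Least Squares Support Vector Machine, whose ridge-regression-style squared-error cost reduces training to a single linear system. A minimal sketch of that formulation for regression follows; the RBF kernel choice and the `gamma`/`sigma` parameter names are this example's assumptions, not taken from any of the cited papers.

```python
import numpy as np

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    """Train an LS-SVM regressor: the squared-error (ridge-style) cost with
    equality constraints reduces training to one linear system,

        [ 0   1^T         ] [ b     ]   [ 0 ]
        [ 1   K + I/gamma ] [ alpha ] = [ y ]

    Uses an RBF kernel; returns a predict() closure."""
    n = X.shape[0]
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-sq / (2.0 * sigma**2))
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    b, alpha = sol[0], sol[1:]

    def predict(Xq):
        sq_q = ((Xq[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        return np.exp(-sq_q / (2.0 * sigma**2)) @ alpha + b

    return predict

# Toy usage: smooth 1-D regression on a sine curve.
X = np.linspace(-3.0, 3.0, 60).reshape(-1, 1)
y = np.sin(X[:, 0])
pred = lssvm_fit(X, y)(X)
print(float(np.mean(np.abs(pred - y))))   # small training error expected
```

The trade-off mentioned across these abstracts is visible here: every training point gets a nonzero `alpha` (no support-vector sparsity), which is what the weighted and pruned LS-SVM variants they describe try to repair.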
Kernel-based methods for support vector machines (SVM) have shown highly advantageous performance in...
We study the problem of learning a sparse linear regression vector under additional conditions on th...
We propose a new binary classification and variable selection technique especially designed for h...
Recent work has focused on the problem of conducting linear regression when the number of covariates...