We present a feature selection method for solving a sparse regularization problem with a composite regularization of the $\ell_p$ norm and the $\ell_{\infty}$ norm. We use the proximal gradient method to solve this $\ell_{1,\infty}$ operator problem, where a simple but efficient algorithm is designed to minimize a relatively simple objective function involving the $\ell_2$ norm and the $\ell_\infty$ norm of a vector. The proposed method offers insight into optimizing sparsity-favoring norms, and extensive experiments are conducted to characterize the effect of varying $p$ and to compare with other approaches on real-world multi-class and multi-label datasets.
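The abstract does not spell out the algorithmic details, so the following is a minimal sketch of the general idea it describes: a proximal gradient iteration in which the smooth loss is handled by a gradient step and the non-smooth $\ell_\infty$ penalty by its proximal operator (computed via projection onto the $\ell_1$ ball by Moreau decomposition). A plain least-squares loss and a pure $\ell_\infty$ penalty are assumed here for simplicity, standing in for the paper's composite $\ell_p$/$\ell_\infty$ regularizer; all function names are hypothetical.

```python
import numpy as np

def project_l1_ball(v, radius):
    """Euclidean projection of v onto the l1 ball of the given radius (Duchi et al., 2008)."""
    u = np.abs(v)
    if u.sum() <= radius:
        return v
    s = np.sort(u)[::-1]
    cssv = np.cumsum(s)
    rho = np.nonzero(s - (cssv - radius) / (np.arange(len(u)) + 1) > 0)[0][-1]
    theta = (cssv[rho] - radius) / (rho + 1.0)
    return np.sign(v) * np.maximum(u - theta, 0.0)

def prox_linf(v, lam):
    """Proximal operator of lam * ||.||_inf via Moreau decomposition:
    prox(v) = v - projection of v onto the l1 ball of radius lam."""
    return v - project_l1_ball(v, lam)

def proximal_gradient(X, y, lam, step=None, n_iter=500):
    """Proximal gradient descent for 0.5 * ||X w - y||^2 + lam * ||w||_inf.
    Illustrative stand-in for the paper's composite l_p / l_inf penalty."""
    n, d = X.shape
    if step is None:
        step = 1.0 / np.linalg.norm(X, 2) ** 2  # 1 / Lipschitz constant of the smooth part
    w = np.zeros(d)
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y)                     # gradient step on the least-squares loss
        w = prox_linf(w - step * grad, step * lam)   # proximal step on the l_inf penalty
    return w
```

The design choice illustrated here is the same one the abstract relies on: because the penalty is non-smooth, each iteration reduces to a cheap inner problem (the proximal map), which for $\ell_\infty$-type terms can be computed exactly in $O(d \log d)$ time via the sorted $\ell_1$-ball projection.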