We derive a novel norm that corresponds to the tightest convex relaxation of sparsity combined with an ℓ2 penalty. We show that this new k-support norm provides a tighter relaxation than the elastic net and can thus be advantageous in sparse prediction problems. We also bound the looseness of the elastic net, thus shedding new light on it and providing justification for its use.
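The norm described in this abstract admits a short closed-form evaluation, which the Python sketch below illustrates, assuming the expression reported by Argyriou, Foygel and Srebro (2012): the largest entries of w are penalized quadratically (ℓ2-like) while the remaining tail is penalized through its sum (ℓ1-like), with the split point found by a small search. The function name k_support_norm and the 0-based index bookkeeping are my own and are not taken from the cited paper's code.

```python
import numpy as np

def k_support_norm(w, k):
    """Evaluate the k-support norm of w.

    Sketch of the closed-form expression reported by Argyriou, Foygel &
    Srebro (2012); the index bookkeeping here is an assumption and should
    be checked against the paper.
    """
    z = np.sort(np.abs(np.asarray(w, dtype=float)))[::-1]  # |w| sorted descending
    d = z.size
    k = min(k, d)
    for r in range(k):
        # Split into a "head" of the k-r-1 largest entries (treated in an
        # l2 fashion) and a "tail" of the rest (treated in an l1 fashion).
        tail = z[k - r - 1:].sum()
        upper = np.inf if k - r - 2 < 0 else z[k - r - 2]
        if upper > tail / (r + 1) >= z[k - r - 1]:
            head = z[:k - r - 1]
            return float(np.sqrt((head ** 2).sum() + tail ** 2 / (r + 1)))
    raise RuntimeError("no valid split found; the indexing assumption is off")

# Sanity checks against the two special cases the norm is known to recover:
# k = 1 gives the l1 norm and k = d gives the l2 norm.
w = np.array([3.0, -2.0, 1.0, 0.5])
assert np.isclose(k_support_norm(w, 1), np.abs(w).sum())
assert np.isclose(k_support_norm(w, w.size), np.linalg.norm(w))
print(k_support_norm(w, 2))
```

The assertions only exercise the endpoints (k = 1 is the ℓ1 norm, k = d is the ℓ2 norm); intermediate values of k interpolate between the two, which is the regime where the norm differs from both the lasso and ridge penalties.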
Sparse methods for supervised learning aim at finding good linear predictors from as few variables a...
In this paper, we study the support recovery guarantees of underdetermined spa...
Non-convex sparsity-inducing penalties outperform their convex counterparts, but generally sacrifice...
We study the problem of statistical estimation with a signal known to be spars...
The k-support norm is a regularizer which has been successfully applied to sparse vector prediction ...
The k-support norm has been recently introduced to perform correlated sparsity regularization. Altho...
Restricting ourselves to the regime of sparse solutions has become the new paradigm for modern stati...
Sparsity, or cardinality, as a tool for feature selection is extremely common in a vast number of cu...
This paper proposes a new robust regression interpretation of sparse penalties su...
Nowadays, the explosive increase in data scale provides an unprecedented opportunity to apply machine l...
In recent years, methods for sparse approximation have gained considerable attention and have been s...
We initiate the study of trade-offs between sparsity and the number of measurements in sparse recove...
This paper tackles a compressed sensing problem with the unknown signal showing a flexible block spa...