Owing to their statistical properties, non-convex sparse regularizers have attracted much interest for estimating a sparse linear model from high-dimensional data. Given that the solution is sparse, a working set strategy accelerates convergence by addressing the optimization problem through an iterative algorithm that increments the number of variables to optimize until the solution support is identified. While such methods have been well studied and theoretically supported for convex regularizers, this paper proposes a working set algorithm for non-convex sparse regularizers with convergence guarantees. The algorithm, named FireWorks, is based on a non-convex reformulation of a recent primal-dual approach and leverages the ...
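The working set mechanism described above can be made concrete with a short sketch. The listing below is a minimal, illustrative version for the convex Lasso case only, assuming a cyclic coordinate-descent inner solver and a simple KKT-violation rule for growing the set; the function names and the `grow` parameter are invented for illustration, and this is not the FireWorks algorithm itself, whose primal-dual reformulation for non-convex penalties is not reproduced here.

```python
import numpy as np

def lasso_subproblem(X, y, lam, n_iter=200):
    """Cyclic coordinate descent for min_w 0.5*||y - X w||^2 + lam*||w||_1."""
    n, d = X.shape
    w = np.zeros(d)
    col_sq = (X ** 2).sum(axis=0) + 1e-12   # guard against zero columns
    r = y.copy()                             # residual y - X w (w starts at 0)
    for _ in range(n_iter):
        for j in range(d):
            r += X[:, j] * w[j]              # remove coordinate j's contribution
            rho = X[:, j] @ r
            w[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
            r -= X[:, j] * w[j]              # add the updated contribution back
    return w

def working_set_lasso(X, y, lam, grow=5, max_outer=20, tol=1e-5):
    """Outer working-set loop: solve on a small support, then add the
    features whose stationarity conditions are most violated."""
    d = X.shape[1]
    ws = list(np.argsort(-np.abs(X.T @ y))[:grow])   # start from best-correlated columns
    w = np.zeros(d)
    for _ in range(max_outer):
        w = np.zeros(d)
        w[ws] = lasso_subproblem(X[:, ws], y, lam)   # restricted subproblem
        viol = np.abs(X.T @ (y - X @ w)) - lam       # per-feature optimality violation
        viol[ws] = 0.0                               # ignore features already in the set
        new = [j for j in np.argsort(-viol)[:grow] if viol[j] > tol]
        if not new:                                  # no violation left: support identified
            return w
        ws += new                                    # grow the working set and re-solve
    return w
```

Warm-starting the inner solver from the previous outer iterate is the usual refinement; it is omitted here for brevity.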
In the area of sparse recovery, much research suggests that non-convex penalties might induce better...
Non-convex sparsity-inducing penalties outperform their convex counterparts, but generally sacrifice...
Sparse representation and low-rank approximation are fundamental tools in the fields of signal processing...
The use of non-convex sparse regularization has attracted much interest when estimating a very spars...
A popular strategy for determining solutions to linear least-squares problems relies on using sparsi...
The use of convex regularizers allows for easy optimization, though they often produce biased...
We study iterative/implicit regularization for linear models, when the bias is convex but not necess...
Sparse modeling has been highly successful in many real-world applications. While a lot of interest...
Non-convex regularizers are increasingly applied to high-dimensional inference with sparsity prior...
This paper considers the problem of recovering either a low rank matrix or a sparse vector from obse...
We consider a reformulation of Reduced-Rank Regression (RRR) and Sparse Reduced...
We investigate implicit regularization schemes for gradient descent methods applied to unpenalized l...
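Several snippets above contrast convex penalties, which bias large coefficients, with non-convex ones, which avoid that bias. A standard way to see this is to compare proximal operators: soft thresholding (the prox of the convex `ℓ1` norm) versus firm thresholding, the closed-form prox of the minimax concave penalty (MCP). The choice of MCP and the value `gamma = 3.0` below are illustrative and not tied to any single paper listed here.

```python
import numpy as np

def soft_threshold(z, lam):
    """Prox of the L1 penalty: shrinks every coefficient by lam,
    which is the source of the bias on large entries."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def firm_threshold(z, lam, gamma=3.0):
    """Prox of the non-convex MCP (requires gamma > 1): same sparsity
    near zero, but leaves coefficients above gamma*lam fully unbiased."""
    z = np.asarray(z, dtype=float)
    return np.where(np.abs(z) <= gamma * lam,
                    soft_threshold(z, lam) / (1.0 - 1.0 / gamma),
                    z)

z = np.array([0.5, 1.5, 5.0])
print(soft_threshold(z, 1.0))   # [0.   0.5  4.  ] -- large entry shrunk by lam
print(firm_threshold(z, 1.0))   # [0.   0.75 5.  ] -- large entry left untouched
```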