Analysis sparsity is a common prior in inverse problems and machine learning, with special cases such as Total Variation regularization, the Edge Lasso and the Fused Lasso. We study the geometry of the solution set (a polyhedron) of analysis $\ell_1$-regularization (with an $\ell_2$ data fidelity term) when it is not reduced to a singleton, without any assumption on the analysis dictionary or the degradation operator. In contrast with most theoretical work, we do not focus on uniqueness and/or stability results, but rather describe a worst-case scenario in which the solution set can be large in terms of dimension. Leveraging a fine analysis of the sub-level sets of the regularizer itself, we draw a connection between the support of a solution and the minimal f...
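For concreteness, the optimization problem studied here can be written in the following standard form; this is a sketch under assumed notation, since the symbols are not fixed by the text above: $\Phi$ denotes the degradation operator, $D$ the analysis dictionary, and $\lambda > 0$ the regularization parameter.
\[
  \min_{x \in \mathbb{R}^n} \; \tfrac{1}{2}\, \| y - \Phi x \|_2^2 \;+\; \lambda \, \| D^\top x \|_1 ,
  \qquad \Phi \in \mathbb{R}^{m \times n}, \; D \in \mathbb{R}^{n \times p}, \; \lambda > 0 .
\]
Total Variation regularization is recovered when $D^\top$ is a finite-difference operator, and the Fused Lasso when $D^\top$ stacks finite differences with a (weighted) identity.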