As a tractable approach, regularization is frequently adopted in sparse optimization. This gives rise to regularized optimization, which minimizes the ℓ0 norm, or one of its continuous surrogates, to characterize sparsity. Among these models, ranging from continuous surrogates to the discrete ℓ0 norm, the most challenging is ℓ0-regularized optimization. There is an impressive body of work on numerical algorithms designed to overcome this challenge. However, most of the developed methods only ensure either that a (sub)sequence converges to a stationary point, from the deterministic optimization perspective, or that the distance between each iterate and any given sparse reference point is bounded by an error bound in the sense...
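To make the ℓ0-regularized model concrete, below is a minimal sketch of proximal gradient descent (iterative hard thresholding) for the least-squares instance min_x 0.5‖Ax − b‖² + λ‖x‖₀, using the standard fact that the proximal operator of the ℓ0 term is entrywise hard thresholding at level sqrt(2λt) for step size t. The function names, parameters, and synthetic data here are illustrative assumptions, not taken from the paper; such a method exemplifies the kind of algorithm that is only guaranteed to reach a stationary point.

import numpy as np

def hard_threshold(z, mu):
    # Proximal operator of mu * ||x||_0: keep entries with |z_i| > sqrt(2*mu), zero the rest.
    return np.where(np.abs(z) > np.sqrt(2.0 * mu), z, 0.0)

def iht_l0_least_squares(A, b, lam, step=None, iters=500):
    # Proximal gradient (iterative hard thresholding) for
    #   min_x 0.5 * ||A x - b||^2 + lam * ||x||_0.
    m, n = A.shape
    if step is None:
        # Step size 1/L, where L = ||A||_2^2 is the Lipschitz constant of the gradient.
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(n)
    for _ in range(iters):
        grad = A.T @ (A @ x - b)           # gradient of the smooth least-squares term
        x = hard_threshold(x - step * grad, step * lam)
    return x

# Usage sketch (hypothetical data): recover a 5-sparse signal from noisy random measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((80, 200)) / np.sqrt(80)
x_true = np.zeros(200)
x_true[rng.choice(200, 5, replace=False)] = rng.standard_normal(5)
b = A @ x_true + 0.01 * rng.standard_normal(80)
x_hat = iht_l0_least_squares(A, b, lam=1e-3)

Because the ℓ0 term is discrete, iterations of this kind can stall at local stationary points, which is precisely the limitation of existing guarantees noted above.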