A number of first-order methods have been proposed for smooth multiobjective optimization for which some form of convergence to first-order criticality has been proved. Such convergence is global in the sense of being independent of the starting point. In this paper, we analyse the rate of convergence of gradient descent for smooth unconstrained multiobjective optimization, and we do it for non-convex, convex, and strongly convex vector functions. These global rates are shown to be the same as for gradient descent in single-objective optimization and correspond to appropriate worst-case complexity bounds. In the convex cases, the rates are given for implicit scalarizations of the problem vector function.
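For concreteness, below is a minimal numerical sketch of the kind of method analysed: gradient descent for a vector function using the common steepest-descent direction of Fliege and Svaiter, where the direction at x is d = -J(x)^T lam and lam minimizes ||J(x)^T lam||^2 over the unit simplex (d vanishes exactly at Pareto-critical points). The fixed step size, the SLSQP subproblem solver, and the two-objective test problem are illustrative assumptions, not the paper's implementation.

    import numpy as np
    from scipy.optimize import minimize

    def steepest_descent_direction(jac):
        """Common steepest-descent direction (Fliege-Svaiter style):
        d = -J^T lam, with lam minimizing ||J^T lam||^2 over the simplex.
        jac: (m, n) array whose rows are the gradients of the m objectives."""
        m = jac.shape[0]
        G = jac @ jac.T  # Gram matrix of the objective gradients
        cons = ({'type': 'eq', 'fun': lambda lam: lam.sum() - 1.0},)
        res = minimize(lambda lam: lam @ G @ lam, np.full(m, 1.0 / m),
                       bounds=[(0.0, 1.0)] * m, constraints=cons)
        return -jac.T @ res.x

    def multiobjective_gradient_descent(jacobian, x0, step=1e-2,
                                        tol=1e-6, max_iter=10000):
        """Fixed-step gradient descent; stops once the common descent
        direction (approximately) vanishes, i.e. x is Pareto critical."""
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            d = steepest_descent_direction(jacobian(x))
            if np.linalg.norm(d) < tol:
                break
            x = x + step * d
        return x

    # Two strongly convex quadratics f1 = ||x - 1||^2, f2 = ||x + 1||^2;
    # the Pareto set is the segment between their minimizers.
    jacobian = lambda x: np.stack([2 * (x - 1.0), 2 * (x + 1.0)])
    print(multiobjective_gradient_descent(jacobian, np.array([3.0, -2.0])))

In this strongly convex example, the iterates approach the Pareto set at the linear rate the paper establishes; the simplex subproblem is a small quadratic program, solved here with a general-purpose solver purely for brevity.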