A new decomposition optimization algorithm, called path-following gradient-based decomposition, is proposed to solve separable convex optimization problems. Unlike the path-following Newton methods considered in the literature, this algorithm does not require any smoothness assumption on the objective function, which allows it to handle more general classes of problems arising in real applications than path-following Newton methods can. The new algorithm combines three techniques: smoothing, Lagrangian decomposition, and a path-following gradient framework. It decomposes the original problem into smaller subproblems using dual decomposition and smoothing via self-concordant barriers, updates the dual variable...
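To make the construction concrete, the following sketch writes out the separable problem and the barrier-smoothed dual function that the decomposition acts on; the notation (blocks x_i, constraint data A_i and b, barriers b_{X_i}, smoothing parameter t) is assumed here for illustration and is not taken verbatim from the abstract.

\[
\min_{x=(x_1,\dots,x_M)} \ \sum_{i=1}^{M} \phi_i(x_i)
\quad \text{s.t.} \quad \sum_{i=1}^{M} A_i x_i = b, \quad x_i \in X_i,
\]
\[
d(y;t) \;=\; \sum_{i=1}^{M} \min_{x_i} \Big\{ \phi_i(x_i) + y^{\top} A_i x_i + t\, b_{X_i}(x_i) \Big\} \;-\; y^{\top} b,
\]

where each b_{X_i} is a self-concordant barrier of the constraint set X_i and t > 0 is the smoothing parameter. The inner minimizations split across the blocks x_i, which yields the decomposition into smaller subproblems; a path-following gradient scheme then alternates gradient-type updates of the dual variable y on d(.;t) with a gradual decrease of t toward zero.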
This paper presents a unified analysis of decomposition algorithms for continuously differentiable...
In this paper we introduce a new primal-dual technique for convergence analysis of gradient schemes ...
In this paper we propose a distributed algorithm for solving large-scale separable convex ...
Convex programming has played an important role in studying a wide class of applications arising fro...
In this paper, a class of separable convex optimizat...
The paper considers the minimization of a separable convex function subject to linear ascending cons...
We consider convex optimization and variational inequality problems with a given separable...
This paper considers the convex minimization problem with lin...
We propose a novel distributed method for convex optimization problems with a certain separ...
We propose a new first-order splitting algorithm for solving jointly the prima...
The augmented Lagrangian method (ALM) is one of the most successful first-order methods for convex p...
We consider general convex large-scale optimization problems in finite dimensions. Under usu...
In this paper we propose a distributed dual gradient algorithm for minimizing linearly constrained s...