This paper examines the computational complexity certification of the fast gradient method for the solution of the dual of a parametric convex program. To this end, a lower iteration bound is derived such that for all parameters from a compact set a solution with a specified level of suboptimality will be obtained. For its practical importance, the derivation of the smallest lower iteration bound is considered. In order to determine it, we investigate both the computation of the worst case minimal Euclidean distance between an initial iterate and a Lagrange multiplier and the issue of finding the largest step size for the fast gradient method. In addition, we argue that optimal preconditioning of the dual problem cannot be proven to decr...
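The abstract's certification idea rests on an a priori iteration bound for the fast gradient method that depends on the distance from the initial iterate to an optimizer. A minimal sketch of this reasoning, using a standard textbook variant of Nesterov's method on a toy smooth problem (the certified dual/MPC variant in the paper differs in details; `Q`, `b`, and the bound constant are illustrative assumptions):

```python
import numpy as np

def fgm(grad, L, x0, iters):
    """Nesterov's fast gradient method with constant step 1/L
    (standard textbook variant, not the paper's certified scheme)."""
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(iters):
        x_next = y - grad(y) / L          # gradient step from extrapolated point
        t_next = (1 + np.sqrt(1 + 4 * t**2)) / 2
        y = x_next + (t - 1) / t_next * (x_next - x)  # momentum extrapolation
        x, t = x_next, t_next
    return x

# Toy smooth convex problem: f(x) = 0.5 x'Qx - b'x  (illustrative data)
Q = np.diag([1.0, 10.0])                  # Lipschitz constant L = 10
b = np.array([1.0, 1.0])
L = 10.0
grad = lambda x: Q @ x - b
f = lambda x: 0.5 * x @ Q @ x - b @ x
x_star = np.linalg.solve(Q, b)            # exact minimizer for reference

x0 = np.zeros(2)
d = np.linalg.norm(x0 - x_star)           # distance to a minimizer, as in the abstract
eps = 1e-3                                # target suboptimality level

# One common worst-case bound, f(x_k) - f* <= 2 L d^2 / (k+1)^2, gives a
# lower iteration count guaranteeing eps-suboptimality for all such problems:
k_min = int(np.ceil(np.sqrt(2 * L * d**2 / eps) - 1))

x_k = fgm(grad, L, x0, k_min)
gap = f(x_k) - f(x_star)                  # certified to be at most eps
```

In the parametric setting of the paper, the distance `d` must be replaced by its worst case over all parameters in the compact set, which is why computing that worst-case distance to a Lagrange multiplier is a central subproblem.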
Code available at https://github.com/AdrienTaylor/GreedyMethods. We describe a n...
We present and computationally evaluate a variant of the fast gradient method by Nesterov that is ca...
The convergence behavior of gradient methods for minimizing convex differentiable functions is one o...
We provide Frank–Wolfe (≡ Conditional Gradients) method with a convergence analysis allowing to appr...
In many machine learning problems such as the dual form of SVM, the objective function to be minimiz...
This thesis focuses on three themes related to the mathematical theory of first-order methods for co...
The application of the fast gradient method to the dual QP leads to the Dual Fast Projected...
This paper presents a new dual formulation for quadratically constrained convex programs (...
The conditions of relative smoothness and relative strong convexity were recently introduced for the...
The chapter deals with the parametric linear-convex mathematical programming (MP) pro...
In this paper we prove a new complexity bound for a variant of Accelerated Coordinate Descent Method...
In this paper, we propose an efficient approach for solving a class of large-scale convex optimizati...
Linear programming (LP) and semidefinite programming (SDP) are among the most important tools in Ope...
Linear quadratic model predictive control (MPC) with input constraints leads to an optimization prob...