We study the gradient method for generally non-convex problems under the assumption that only an additively inexact gradient is available. Both the non-convexity of the objective function and the use of an inexact gradient at each iteration can cause difficulties. For example, the trajectory of the gradient method may stray far from the starting point, and in the presence of noise such unbounded drift can carry the trajectory away from the desired exact solution. The results on the behavior of the trajectory of the gradient method are obtained under the assumption of gradient inexactness and the cond...
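As a concrete illustration of this setting, the following minimal Python sketch (our own; the helper name, step size, and bounded-noise model are illustrative assumptions, not taken from the paper) runs gradient descent with an additively inexact gradient whose error norm is at most delta, then measures how far the trajectory drifts from the starting point and from the exact minimizer.

```python
import numpy as np

def inexact_gradient_descent(grad, x0, step, delta, n_iters, rng):
    """Gradient descent where each gradient evaluation is corrupted by
    an additive error of norm delta (hypothetical helper for illustration)."""
    x = x0.copy()
    trajectory = [x.copy()]
    for _ in range(n_iters):
        noise = rng.normal(size=x.shape)
        noise *= delta / max(np.linalg.norm(noise), 1e-12)  # rescale so ||error|| == delta
        x = x - step * (grad(x) + noise)
        trajectory.append(x.copy())
    return np.array(trajectory)

# Example: f(x) = 0.5 * ||x||^2, so grad f(x) = x and the exact minimizer is 0.
rng = np.random.default_rng(0)
x0 = np.array([5.0, -3.0])
traj = inexact_gradient_descent(lambda x: x, x0, step=0.1, delta=0.5,
                                n_iters=200, rng=rng)
print("max distance from x0:", np.linalg.norm(traj - x0, axis=1).max())
print("final distance from optimum:", np.linalg.norm(traj[-1]))
```

With delta = 0, the iterates converge to the minimizer; with delta > 0 they settle into a noise-dominated neighborhood whose size grows with delta, which is the drift phenomenon the abstract describes.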
We consider the problem of optimizing the sum of a smooth convex function and ...
In this paper, we propose the gradient descent type methods to solve convex optimization problems in...
In this article, we investigate an accelerated first-order method, namely, the method of similar tri...
Nonconvex optimization with great demand of fast solvers is ubiquitous in modern machine learning. T...
We develop a new proximal-gradient method for minimizing the sum of a differentiable, possibly nonc...
We present a strikingly simple proof that two rules are sufficient to automate gradient descent: 1) ...
We consider stopping rules in conjugate gradient type iteration methods for solving linear ill‐posed...
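To illustrate why a stopping rule matters for ill-posed systems, here is a minimal Python sketch of our own: plain conjugate gradient halted by a discrepancy-style criterion (iterate until the residual norm falls to the assumed noise level). Early stopping of this kind is a standard way to regularize ill-posed problems; it is not necessarily the specific rule analyzed in the cited work.

```python
import numpy as np

def cg_with_discrepancy_stop(A, b, noise_level, max_iters=1000):
    """Plain conjugate gradient on A x = b (A symmetric positive definite),
    stopped early once ||b - A x|| <= noise_level (discrepancy principle)."""
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rs = r @ r
    for k in range(max_iters):
        if np.sqrt(rs) <= noise_level:  # stopping rule: residual at noise level
            break
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x, k

# Mildly ill-conditioned test problem with a noisy right-hand side.
rng = np.random.default_rng(0)
n = 50
U, _ = np.linalg.qr(rng.normal(size=(n, n)))
A = U @ np.diag(np.logspace(0, -6, n)) @ U.T  # eigenvalues decay to 1e-6
x_true = rng.normal(size=n)
noise = 1e-4 * rng.normal(size=n)
b = A @ x_true + noise
x_hat, iters = cg_with_discrepancy_stop(A, b, noise_level=np.linalg.norm(noise))
print(f"stopped after {iters} iterations, error = {np.linalg.norm(x_hat - x_true):.3e}")
```

Running past the discrepancy stop would let the iterates start fitting the noise (semiconvergence), which is exactly why the choice of stopping index acts as the regularization parameter here.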
We propose a new first-order method for minimizing nonconvex functions with a Lipschitz continuous g...
The convergence behavior of gradient methods for minimizing convex differentiable functions is one o...
In view of solving convex optimization problems with noisy gradient input, we analyze the asymptotic...
We suggest simple modificati...