The worst-case complexity of the steepest-descent algorithm with exact line searches for unconstrained smooth optimization is analyzed, and it is shown that the number of iterations of this algorithm which may be necessary to find an iterate at which the norm of the objective function's gradient is less than a prescribed ε is, essentially, a multiple of 1/ε², as is the case for variants of the same algorithm using inexact line searches.
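For concreteness, the following is a minimal sketch (not the paper's algorithm) of steepest descent with an exact line search, run until the gradient norm drops below a prescribed ε. The strictly convex quadratic objective, the function names, and the test data are illustrative assumptions chosen because the exact line-search step then has a closed form; the 1/ε² bound discussed above concerns general smooth objectives, not this particular instance.

import numpy as np

def steepest_descent_exact(A, b, x0, eps=1e-6, max_iter=100000):
    # Minimizes f(x) = 0.5 x'Ax - b'x (A symmetric positive definite)
    # by steepest descent with the exact line-search step along -grad.
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = A @ x - b                      # gradient of the quadratic
        if np.linalg.norm(g) < eps:        # first-order stopping test
            return x, k
        alpha = (g @ g) / (g @ (A @ g))    # exact minimizer along -g
        x = x - alpha * g
    return x, max_iter

# Usage: a small, deliberately ill-conditioned example.
A = np.diag([1.0, 100.0])
b = np.array([1.0, 1.0])
x_star, iters = steepest_descent_exact(A, b, x0=np.zeros(2), eps=1e-8)
print(iters, x_star)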
In this paper we prove that the broad class of direct-search methods of directional type, based on i...
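The snippet above is truncated, but direct-search methods of directional type typically poll a positive spanning set of directions and accept a trial point only under a sufficient decrease condition. The sketch below is a minimal illustration under that assumption; the forcing function ρ(α) = c·α², the coordinate directions, and all names are illustrative, not the cited paper's exact method.

import numpy as np

def direct_search(f, x0, alpha=1.0, c=1e-4, eps=1e-6, max_iter=10000):
    # Derivative-free poll over the directions +/- e_i; a trial point is
    # accepted only if it improves the incumbent by at least c * alpha^2.
    x = np.asarray(x0, dtype=float)
    n = x.size
    D = np.vstack([np.eye(n), -np.eye(n)])    # positive spanning set
    fx = f(x)
    for _ in range(max_iter):
        if alpha < eps:                        # step size serves as the stopping measure
            break
        for d in D:                            # poll step
            trial = x + alpha * d
            ft = f(trial)
            if ft < fx - c * alpha**2:         # sufficient decrease condition
                x, fx = trial, ft
                alpha *= 2.0                   # expand after a successful poll
                break
        else:
            alpha *= 0.5                       # contract after an unsuccessful poll
    return x, fx

# Usage on a smooth test function.
x, fx = direct_search(lambda z: (z[0] - 1)**2 + 10 * (z[1] + 2)**2, x0=[0.0, 0.0])
print(x, fx)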
Abstract: Let the least value of the function F(x), x ∈ R^n, be required, where n ≥ 2. If the gradient...
This work was also published as a Rice University thesis/dissertation: http://hdl.handle.net/1911/1...
We consider the gradient (or steepest) descent method with exact line search applied to a strongly c...
The steepest descent method has a rich history and is one of the simplest and best known methods for...
The worst-case evaluation complexity of finding an approximate first-order critical point using grad...
In this paper, we prove that the broad class of direct-search methods of directional type based on i...
In the context of the derivative-free optimization of a smooth objective function, it has been shown...
The steepest descent method is a simple gradient method for optimization. This method has a slow converg...