The steepest descent method is a simple gradient method for optimization. It converges slowly toward the optimal solution because of the zigzag pattern of its steps. Barzilai and Borwein modified the algorithm so that it performs well on high-dimensional problems. The Barzilai and Borwein results have sparked a great deal of research on the steepest descent method, including the alternate minimization gradient method and the Yuan method. Inspired by these previous works, we modify the step size of the steepest descent method. We then compare the modification against the Barzilai and Borwein method, the alternate minimization gradient method, and the Yuan method on quadratic function cases, in terms of the iterations nu...
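The contrast the abstract draws can be sketched numerically. Below is a minimal illustration, assuming the standard quadratic test problem f(x) = ½xᵀAx − bᵀx: classical steepest descent with exact line search (which zigzags on ill-conditioned problems) versus the Barzilai–Borwein method with the BB1 step size αₖ = sᵀs / sᵀy. The specific matrix, starting point, and tolerances are illustrative choices, not taken from the paper.

```python
import numpy as np

def steepest_descent(A, b, x0, tol=1e-8, max_iter=10000):
    """Classical steepest descent with exact line search on f(x) = 0.5 x^T A x - b^T x."""
    x = x0.astype(float)
    for k in range(max_iter):
        g = A @ x - b                    # gradient of the quadratic
        if np.linalg.norm(g) < tol:
            break
        alpha = (g @ g) / (g @ (A @ g))  # exact minimizing step along -g
        x = x - alpha * g
    return x, k

def barzilai_borwein(A, b, x0, tol=1e-8, max_iter=10000):
    """BB method: step size built from the previous two iterates (BB1 rule)."""
    x = x0.astype(float)
    g = A @ x - b
    alpha = 1.0                          # initial step (a common heuristic)
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        alpha = (s @ s) / (s @ y)        # BB1 step size s^T s / s^T y
        x, g = x_new, g_new
    return x, k

# Ill-conditioned 2-D quadratic: steepest descent zigzags, BB does not.
A = np.diag([1.0, 100.0])
b = np.zeros(2)
x0 = np.array([1.0, 1.0])
x_sd, k_sd = steepest_descent(A, b, x0)
x_bb, k_bb = barzilai_borwein(A, b, x0)
```

On this example the BB iteration count is far smaller than the steepest descent count, which is the behavior the comparison in the paper measures.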
The Barzilai–Borwein (BB) gradient method is favourable over the classical steepest descent method b...
The renewed interest in Steepest Descent (SD) methods following the work of Barzilai and Borwein [2]...
In recent years several proposals for the step-size selection have largely improved the gradient me...
The steepest descent method has a rich history and is one of the simplest and best known methods for...
The negative gradient direction to find local minimizers has been associated with the classical stee...
We derive two-point step sizes for the steepest-descent method by approximating the secant equation....
We propose a new gradient method for quadratic programming, named SDC, which alternates some steepes...
Motivated by the superlinear behavior of the Barzilai-Borwein (BB) method for two-dimensional quadra...
Steepest Descent is one of the pioneering methods for solving optimization problems, since it is globally ...
We propose a new adaptive and composite Barzilai–Borwein (BB) step size by integrating the advanta...