The selection of updating formulas for the H matrix and the subproblem of one-dimensional search are considered for variable metric algorithms that do not use exact linear searches in the subproblem. It is argued that the convex class of updating formulas given by Fletcher is the logical choice for such algorithms. It is shown that direct linear searches are more efficient than linear searches using directional derivatives at each point, particularly for objective functions of many variables. Features of algorithms using the convex class of formulas without exact searches are discussed. It is proven that the effects on these algorithms of scaling the variables of the objective function are identical to the effects of transforming the initial H matrix...
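The convex class of updating formulas referred to above is the family of convex combinations of the DFP and BFGS updates of the inverse-Hessian approximation H (a subset of Broyden's one-parameter family). A minimal sketch of one such update, with the mixing parameter here called `phi` (an illustrative name, not notation from the abstract), is:

```python
import numpy as np

def convex_class_update(H, s, y, phi=0.0):
    """One quasi-Newton update of the inverse-Hessian approximation H
    from the convex class: phi = 0 gives DFP, phi = 1 gives BFGS, and
    any phi in [0, 1] is a convex combination of the two.

    s = x_{k+1} - x_k (the step), y = g_{k+1} - g_k (gradient change);
    the curvature condition s^T y > 0 is assumed to hold.
    """
    sy = s @ y                  # s^T y, must be positive
    Hy = H @ y
    yHy = y @ Hy                # y^T H y
    # DFP update of H
    H_dfp = H + np.outer(s, s) / sy - np.outer(Hy, Hy) / yHy
    # Rank-one term interpolating from DFP (phi = 0) toward BFGS (phi = 1);
    # note v^T y = 0, so the secant condition H_new y = s is preserved.
    v = s / sy - Hy / yHy
    return H_dfp + phi * yHy * np.outer(v, v)
```

Every member of the family satisfies the secant condition `H_new @ y == s` and preserves symmetry, which is why the choice within the class matters only through behavior under inexact line searches, the question the abstract addresses.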
Includes bibliographical references (pages 112-113). The study is a discussion of particular methodolo...
In this paper variable metric algorithms are extended to solve general nonlinear programming proble...
Variable metric techniques are a crucial ingredient in many first order optimization algorithms. In ...
Iterative algorithms for the numerical solution of non-smooth optimization problems involving an obj...
We develop a class of methods for minimizing a nondifferentiable function which is the maximum of a...
A novel iterative algorithm for the solution of convex or non-convex optimization problems is presen...
As a first step to the realization of a new computer program to solve general nonlinear optimization...
Abstract. This is a method for determining numerically local minima of differentiable functions of s...
Abstract: Let the least value of the function F(x), x ∈ R^n, be required, where n ≥ 2. If the gradient...
A new family of numerically efficient variable metric or quasi-Newton methods for unconstrained opti...
Dixon’s theorem (Math. Programming, 2 (1972), pp. 383–387) states that all variable metric methods i...
Variable Metric Methods are "Newton-Raphson-like" algorithms for unconstrained minimizati...
We develop a new proximal–gradient method for minimizing the sum of a differentiable, possibly nonco...