We develop a framework (employing scaling functions) for the construction of multi-step quasi-Newton methods for unconstrained optimization which utilize values of the objective function. The methods are constructed via interpolants of the m + 1 most recent iterates and gradient evaluations, and possess a free parameter which introduces an additional degree of flexibility. This permits the interpolating polynomials to assimilate information (in the form of function values) which is readily available at each iteration. This information is incorporated in updating the Hessian approximation at each iteration, in an attempt to accelerate convergence. We concentrate on a specific example from the general family of methods, corresponding to a part...
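The multi-step idea described above can be illustrated with a small sketch: interpolate each coordinate of the m + 1 most recent iterates and gradients by a degree-m polynomial in a scalar parameter, and differentiate at the newest point to obtain a generalized secant pair. The parameter values `taus` and the helper name `multistep_secant_pair` are illustrative assumptions, not the authors' exact parameterization:

```python
import numpy as np

def multistep_secant_pair(X, G, taus):
    """Generalized secant pair from the m+1 most recent iterates.

    X, G: arrays of shape (m+1, n) holding iterates and gradients;
    taus: m+1 parameter values assigned to them (a free choice, e.g.
    accumulated step lengths). Each coordinate of x(tau) and g(tau)
    is interpolated by a degree-m polynomial; the derivative pair
    (r, w) = (x'(tau_m), g'(tau_m)) then replaces the usual step and
    gradient-difference vectors (s, y) in the quasi-Newton update.
    """
    m = len(taus) - 1
    r = np.empty(X.shape[1])
    w = np.empty(G.shape[1])
    for i in range(X.shape[1]):
        # Fit an exact interpolating polynomial per coordinate,
        # differentiate it, and evaluate at the newest parameter value.
        r[i] = np.polyval(np.polyder(np.polyfit(taus, X[:, i], m)), taus[-1])
        w[i] = np.polyval(np.polyder(np.polyfit(taus, G[:, i], m)), taus[-1])
    return r, w
```

With m = 1 (two points) and unit parameter spacing this reduces to the ordinary secant pair (s, y), which is one way to see the multi-step methods as a generalization of the classical ones.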
In this paper, we investigate quasi-Newton methods for solving unconstrained optimization problems. ...
Abstract: We consider multistep quasi-Newton methods for unconstrained optimization. These methods wer...
Abstract: Quasi-Newton methods update, at each iteration, the existing Hessian approximation (or its i...
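The per-iteration Hessian (or inverse-Hessian) update mentioned above can be made concrete with the standard BFGS formula, which the multi-step methods in these abstracts generalize. This is a minimal sketch, not the authors' method:

```python
import numpy as np

def bfgs_inverse_update(H, s, y):
    """One standard BFGS update of an inverse-Hessian approximation H.

    s = x_{k+1} - x_k (step), y = g_{k+1} - g_k (gradient difference).
    The updated matrix satisfies the secant equation H_new @ y == s,
    provided the curvature condition y @ s > 0 holds.
    """
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)
```

Multi-step variants keep this update shape but replace the pair (s, y) with vectors derived from several previous iterates.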
Abstract: We develop a framework employing scaling functions for the construction of multistep quasi-N...
Abstract: In previous work, the authors (1993, 1994) developed the concept of multi-step quasi-Newton ...
Previous work on so-called "fixed-point" multi-step quasi-Newton methods for unconstrained...
Over the past twelve years, multi-step quasi-Newton methods for the unconstrained optimization of a ...
Abstract: We consider multi-step quasi-Newton methods for unconstrained optimization. These methods we...
Multi-step methods derived in [1–3] have proven to be serious contenders in practice by outperformin...
Abstract: Multi-step quasi-Newton methods for optimisation (using data from more than one previous ste...
This work aims at ensuring smoothness of interpolation in both the iterate and the gradient spaces i...
Many methods for solving minimization problems are variants of Newton's method, which requires the spe...
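The Newton iteration that these methods are variants of can be sketched in a few lines; quasi-Newton methods differ only in replacing the exact Hessian call with a cheap approximation updated from step to step. The function names here are illustrative:

```python
import numpy as np

def newton_minimize(grad, hess, x0, tol=1e-10, max_iter=50):
    """Minimal Newton iteration for unconstrained minimization.

    At each iterate, solves hess(x) @ p = -grad(x) for the step p.
    Quasi-Newton methods avoid forming hess(x) by maintaining an
    approximation built from observed gradient differences.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x + np.linalg.solve(hess(x), -g)
    return x
```

On a strictly convex quadratic this converges in a single step, which is the benchmark behavior quasi-Newton approximations try to approach.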