In optimization, one of the main challenges of the widely used family of Quasi-Newton methods is to find an estimate of the Hessian matrix that is as close as possible to the true matrix. In this paper, we develop a new update formula for the estimate of the Hessian, starting from the Powell-Symmetric-Broyden (PSB) formula and adding information from the previous steps of the optimization path. This leads to a multisecant version of PSB, which we call generalised PSB (gPSB), but which, as was proven before, does not exist in general. We provide a novel interpretation of this non-existence. In addition, we provide a formula that satisfies the multisecant condition while being as close to symmetric as possible, and a second formula that is symmetric while satisfying the multisecant condition as closely as possible. Sub...
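For context, the classical single-secant PSB update that gPSB generalises can be recalled here. The notation below ($B_k$ for the Hessian estimate, $s_k = x_{k+1} - x_k$, $y_k = \nabla f(x_{k+1}) - \nabla f(x_k)$) is standard but is not fixed by the abstract itself, so it should be read as an assumption:

$$B_{k+1} = B_k + \frac{r_k s_k^\top + s_k r_k^\top}{s_k^\top s_k} - \frac{(s_k^\top r_k)\, s_k s_k^\top}{(s_k^\top s_k)^2}, \qquad r_k = y_k - B_k s_k,$$

which keeps $B_{k+1}$ symmetric and satisfies the secant condition $B_{k+1} s_k = y_k$. The multisecant condition mentioned above instead asks for $B_{k+1} S_k = Y_k$, where the columns of $S_k$ and $Y_k$ collect several previous steps and gradient differences; a symmetric $B_{k+1}$ can only satisfy it when $S_k^\top Y_k$ is itself symmetric, which is the usual reason such a symmetric multisecant update does not exist in general.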
A secant equation (quasi-Newton) plays one of the most important roles in finding an optimal solution in n...
We consider multistep quasi-Newton methods for unconstrained optimization. These methods wer...
Quasi-Newton methods update, at each iteration, the existing Hessian approximation (or its i...
Working with Quasi-Newton methods in optimization leads to one important challenge: finding an...
One of the frequently used families of methods for the resolution of non-linear optimization problem...
Quasi-Newton methods are often used in the context of non-linear optimization. In those methods, the q...
For Quasi-Newton methods, one of the most important challenges is to find an estimate of the Jacobia...
Quasi-Newton (qN) techniques approximate the Newton step by estimating the Hessian using the so-call...
This paper develops a modified quasi-Newton method for structured unconstrained optimization with pa...
The class of Least Change Secant Update Quasi-Newton (LCSU QN) methods is often used for root findin...
Based on the idea of maximum determinant positive definite matrix completion, Yamashita proposed a s...
Symmetric rank-one (SR1) is one of the competitive formulas among the quasi-Newton (QN) methods. In ...
The secant equation, which underlies all standard ‘quasi-Newton’ minimisation methods, arise...
This paper presents a modified quasi-Newton method for structured unconstrained optimization...
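As a concrete illustration of the single-secant updates discussed in the abstracts above, the sketch below implements the classical PSB formula in Python. It is a minimal sketch under the standard notation assumed earlier, not code taken from any of the cited papers.

```python
import numpy as np

def psb_update(B, s, y):
    """Classical single-secant Powell-Symmetric-Broyden (PSB) update.

    Given a symmetric estimate B, a step s and a gradient difference y,
    returns a symmetric B_new with B_new @ s == y (up to rounding).
    """
    r = y - B @ s              # residual of the secant condition
    ss = s @ s                 # s^T s, assumed nonzero
    return (B
            + (np.outer(r, s) + np.outer(s, r)) / ss
            - (s @ r) / ss**2 * np.outer(s, s))

# Small check: the updated matrix maps s to y and stays symmetric.
rng = np.random.default_rng(0)
B = np.eye(3)
s = rng.standard_normal(3)
y = rng.standard_normal(3)
B1 = psb_update(B, s, y)
assert np.allclose(B1 @ s, y)
assert np.allclose(B1, B1.T)
```

Stacking several pairs (s, y) into matrices and asking one update to reproduce all of them at once is exactly the multisecant requirement that, combined with symmetry, cannot be met in general.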