Numerical optimization algorithms often require the (symmetric) matrix of second derivatives, $\nabla^2 f(x)$, of some problem function $f:\mathbb{R}^n \rightarrow \mathbb{R}$. If the Hessian matrix is large and sparse, then estimation by finite differences can be quite attractive, since several schemes allow for estimation in $\ll n$ gradient evaluations. The purpose of this paper is to analyze, from a combinatorial point of view, a class of methods known as substitution methods. We present a concise characterization of such methods in graph-theoretic terms. Using this characterization, we develop a complexity analysis of the general problem and derive a roundoff error bound on the Hessian approximation. Moreover, the graph model immediately ...
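To make the "$\ll n$ gradient evaluations" point concrete, here is a minimal sketch. It is illustrative only, not taken from the paper, and it uses a direct, CPR-style grouping rather than the substitution methods analyzed above: for a Hessian known to be tridiagonal, the columns $\{i : i \bmod 3 = k\}$ have pairwise disjoint row supports, so three extra gradient evaluations recover the whole matrix regardless of $n$. The function names, test problem, and step size are all assumptions made for the example.

```python
import numpy as np

def estimate_tridiagonal_hessian(grad, x, h=1e-6):
    """Illustrative sketch: estimate a Hessian assumed tridiagonal from
    three extra gradient evaluations instead of n. Columns {i : i % 3 == k}
    have pairwise disjoint row supports, so each group can be perturbed
    at once (a direct, CPR-style scheme)."""
    n = x.size
    g0 = grad(x)
    H = np.zeros((n, n))
    for k in range(3):                       # one gradient call per group
        d = np.zeros(n)
        d[k::3] = 1.0                        # perturb every third coordinate
        dg = (grad(x + h * d) - g0) / h      # forward difference, approx H @ d
        for i in range(k, n, 3):             # unpack the columns in this group
            lo, hi = max(i - 1, 0), min(i + 1, n - 1)
            H[lo:hi + 1, i] = dg[lo:hi + 1]  # rows touched by column i alone
    return 0.5 * (H + H.T)                   # symmetrize the estimate

# Hypothetical test problem: f(x) = sum_i x_i^2 * x_{i+1}, whose Hessian
# is tridiagonal.
def grad(x):
    g = np.zeros_like(x)
    g[:-1] += 2.0 * x[:-1] * x[1:]   # d/dx_i of x_i^2 x_{i+1}
    g[1:] += x[:-1] ** 2             # d/dx_{i+1} of x_i^2 x_{i+1}
    return g

x = np.linspace(1.0, 2.0, 8)
print(estimate_tridiagonal_hessian(grad, x).round(4))
```

Substitution methods go further: they allow groups whose columns overlap and recover the entangled entries by back-substitution, which typically permits even fewer groups than the disjoint-support requirement used in this sketch.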
Matrix partitioning problems that arise in the efficient estimation of sparse Jacobians and ...
Modern methods for numerical optimization calculate (or approximate) the matrix of second derivatives ...
The background of this thesis is algorithmic differentiation (AD) of, in practice, very computationally ...
Large-scale optimization problems often require an approximation to the Hessian matrix. If the Hess...
The necessity of computing large sparse Hessian matrices gave birth to many methods for their ef...
We consider the problem of approximating the Hessian matrix of a smooth non-linear function using a ...
We revisit the role of graph coloring in modeling problems that arise in efficient estimation of la...
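To illustrate the graph-coloring connection (a hedged sketch, with all names invented here rather than drawn from the abstracts above): grouping columns so that no two columns in a group share a nonzero row is precisely a coloring problem on the column intersection graph, and even a greedy pass yields the expected three colors for a tridiagonal pattern. Symmetry-exploiting formulations (star and acyclic colorings) can use fewer colors still; the greedy version only shows the modeling step.

```python
import numpy as np

def greedy_structurally_orthogonal_coloring(pattern):
    """Greedily partition the columns of a boolean n x n sparsity pattern
    into structurally orthogonal groups: columns that share a nonzero row
    never receive the same color. One gradient (or function) evaluation
    per color then suffices for a direct estimation scheme."""
    n = pattern.shape[1]
    color = np.full(n, -1)
    for j in range(n):
        # Colors already used by earlier columns that conflict with column j.
        forbidden = {color[k] for k in range(j)
                     if np.any(pattern[:, j] & pattern[:, k])}
        c = 0
        while c in forbidden:
            c += 1
        color[j] = c
    return color

# Tridiagonal pattern: greedy needs 3 colors regardless of n.
n = 10
pattern = (np.eye(n, dtype=bool)
           | np.eye(n, k=1, dtype=bool)
           | np.eye(n, k=-1, dtype=bool))
print(greedy_structurally_orthogonal_coloring(pattern))  # [0 1 2 0 1 2 ...]
```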
The solution of a nonlinear optimization problem often requires an estimate of the Hessian matrix f...
Sparse Hessian matrices occur often in statistics, and their fast and accurate estimation can improv...
The computation of a sparse Hessian matrix H using automatic differentiation (AD) can be made effici...
There are several benefits of taking the Hessian of the objective function into account when designi...
We consider variants of trust-region and adaptive cubic regularization methods for non-convex optimi...
Given a mapping with a sparse Jacobian matrix, the problem of minimizing the number of function eval...
Large-scale optimization algorithms frequently require sparse Hessian matrices that are not readil...
This research concerns the algorithmic study of Hessian approximation in the context of multilevel n...