We discuss the role of automatic differentiation tools in optimization software. We emphasize issues that are important to large-scale optimization and that have proved useful in the installation of nonlinear solvers in the NEOS Server. Our discussion centers on the computation of the gradient and Hessian matrix for partially separable functions and shows that both can be computed with guaranteed bounds on time and memory requirements.
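To make the partially separable setting concrete, here is a minimal sketch, not taken from the paper, assuming JAX as the automatic differentiation tool; the objective and variable names below are illustrative only. Each element function of the chained Rosenbrock-style sum depends on just two adjacent variables, which is what makes the function partially separable and its Hessian sparse (tridiagonal here), and automatic differentiation delivers both derivatives without hand coding.

import jax
import jax.numpy as jnp

def f(x):
    # Partially separable objective: each term depends only on (x[i], x[i+1]).
    return jnp.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

x = jnp.arange(1.0, 6.0)      # a small test point in R^5
g = jax.grad(f)(x)            # gradient via reverse-mode AD
H = jax.hessian(f)(x)         # Hessian; tridiagonal for this structure
print(g)
print(H)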
In comparison to symbolic differentiation and numerical differencing, the chain-rule-based technique...
Simulation of many physical phenomena requires the numerical solution of non-linear partia...
The solution of a nonlinear optimization problem often requires an estimate of the Hessian matrix f...
Modern methods for numerical optimization calculate (or approximate) the matrix of second derivative...
The accurate and efficient computation of gradients for partially separable functions is central to ...
Many optimization methods are available at the present time. The software that implements a particul...
Multidisciplinary Design Optimization (MDO) by means of formal sensitivity analysis requires that ea...
We introduce the basic notions of automatic differentiation, describe some extensions which ...
Our work under this support broadly falls into five categories: automatic differentiation, sparsity,...
This paper describes an investigation into the performance of three Ada packages for automatic diffe...
Automatic differentiation (AD) is a technique that augments computer codes with statements for the...
Differentiation is one of the fundamental problems in numerical mathematics. The solution of many op...
Engineering optimization problems may be formulated as nonlinear programs (NLP), defined by an objec...