We discuss the role of automatic differentiation tools in optimization software. We emphasize issues that are important to large-scale optimization and that have proved useful in the installation of nonlinear solvers in the NEOS Server. Our discussion centers on the computation of the gradient and Hessian matrix for partially separable functions and shows that the gradient and Hessian matrix can be computed with guaranteed bounds on time and memory requirements.
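As a concrete illustration of the partially separable structure mentioned above, the sketch below differentiates a function of the form f(x) = f_1(x) + f_2(x) + f_3(x), where each element function depends on only a few components of x. This is a minimal, self-contained example assuming a hand-rolled forward-mode AD based on dual numbers; the element functions and the names Dual, elements, and gradient are hypothetical and are not taken from the paper or from any NEOS solver. It demonstrates the property the abstract appeals to: each element gradient costs time and memory proportional to the size of that element, and the full gradient is assembled by scattering element gradients into their positions.

# Minimal forward-mode AD sketch (illustrative, not from the paper or any NEOS solver).
# f(x) = f_1(x0, x1) + f_2(x1, x2) + f_3(x2, x3) is partially separable:
# each element touches only two variables, so each element gradient is cheap,
# and the full gradient is assembled by scattering element gradients.

import math

class Dual:
    """A value paired with its derivative with respect to one chosen input."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def sin(d):
    # Chain rule for sin: (sin u)' = cos(u) * u'.
    return Dual(math.sin(d.val), math.cos(d.val) * d.dot)

# Hypothetical element functions, each with the indices of the variables it uses.
elements = [
    (lambda v: v[0] * v[1],        [0, 1]),   # f_1(x0, x1) = x0 * x1
    (lambda v: sin(v[0]) + v[1],   [1, 2]),   # f_2(x1, x2) = sin(x1) + x2
    (lambda v: v[0] * v[0] * v[1], [2, 3]),   # f_3(x2, x3) = x2^2 * x3
]

def gradient(x):
    """Gradient of sum_i f_i, accumulated element by element."""
    g = [0.0] * len(x)
    for f_i, idx in elements:
        for j in idx:
            # One forward pass per variable of the element; the cost is
            # proportional to the element size, not to len(x).
            args = [Dual(x[k], 1.0 if k == j else 0.0) for k in idx]
            g[j] += f_i(args).dot
    return g

print(gradient([1.0, 2.0, 0.5, 3.0]))
# [2.0, 1 + cos(2), 4.0, 0.25]

The same element-by-element pattern extends to second derivatives: each element contributes a small dense block indexed by its variable set, so the Hessian matrix of a partially separable function is sparse and can be accumulated with the same bounded cost per element.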