The adjoint mode of Algorithmic Differentiation (AD) is particularly attractive for computing gradients. However, this mode needs to use the intermediate values of the original simulation in reverse order, at a cost that increases with the length of the simulation. AD research looks for strategies to reduce this cost, for instance by taking advantage of the structure of the given program. In this work, we consider, on the one hand, the frequent case of Fixed-Point loops, for which several authors have proposed adapted adjoint strategies. Among these strategies, we select the one introduced by B. Christianson. We specify the selected method further and describe the way we implemented it inside the AD tool Tapenade. Experiments on a medium-size...
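As a companion to this abstract, the sketch below illustrates the kind of fixed-point adjoint strategy it refers to, in the spirit of Christianson's two-phase approach: the gradient is recovered by a second fixed-point iteration on the adjoint variable, using only the converged state. All names here (F, dF_dx, dF_dp, J, fixed_point, the quadratic objective) are invented for the illustration and are not taken from the paper.

```python
import math

# Hypothetical sketch of a fixed-point adjoint strategy in the spirit of
# Christianson's two-phase approach.  All names are invented for this example.

def F(x, p):
    # contractive iteration map: the loop converges to x* with x* = F(x*, p)
    return 0.5 * math.cos(x) + p

def dF_dx(x, p):
    return -0.5 * math.sin(x)

def dF_dp(x, p):
    return 1.0

def J(x):
    # objective evaluated on the converged state
    return x * x

def fixed_point(p, tol=1e-12, max_it=200):
    x = 0.0
    for _ in range(max_it):
        x_new = F(x, p)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

def gradient(p, tol=1e-12, max_it=200):
    # Phase 1: run the original loop to convergence; only x* is kept.
    x_star = fixed_point(p, tol, max_it)
    # Phase 2: second fixed-point iteration on the adjoint variable,
    # xbar = dJ/dx(x*) + dF/dx(x*)^T * xbar, using only the converged state.
    seed = 2.0 * x_star              # dJ/dx for J(x) = x^2
    xbar = 0.0
    for _ in range(max_it):
        xbar_new = seed + dF_dx(x_star, p) * xbar
        if abs(xbar_new - xbar) < tol:
            xbar = xbar_new
            break
        xbar = xbar_new
    # gradient with respect to the loop parameter: dJ/dp = dF/dp(x*)^T * xbar
    return dF_dp(x_star, p) * xbar

if __name__ == "__main__":
    p = 0.3
    eps = 1e-6
    fd = (J(fixed_point(p + eps)) - J(fixed_point(p - eps))) / (2.0 * eps)
    print(gradient(p), fd)   # the two values should agree closely
```

The point of the design is that only the converged state x* is needed by the reverse sweep, so the intermediate iterates of the original loop never have to be stored.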
Simulations are used in science and industry to predict the performance of technical systems. Ad...
This paper presents a new functionality of the Automatic Differentiation (AD) Tool Tapenade....
The context of this work is Automatic Differentiation (AD). Fundamentally, AD transforms a program t...
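The sentence above is truncated, but the transformation it refers to can be illustrated with a toy example: tangent (forward) mode AD augments every statement of the original program with a statement propagating a directional derivative. The function names below (f, f_d) are chosen for this sketch only, and the Python form merely mimics what a source-transformation tool would emit for a Fortran or C routine.

```python
import math

# Original program: one routine computing y from x.
def f(x):
    a = math.sin(x)
    y = x * a
    return y

# Tangent-mode (forward) differentiated version: each statement is augmented
# with a statement propagating the directional derivative xd alongside x.
def f_d(x, xd):
    a = math.sin(x)
    ad = math.cos(x) * xd
    y = x * a
    yd = xd * a + x * ad
    return y, yd

if __name__ == "__main__":
    x, xd = 1.3, 1.0
    y, yd = f_d(x, xd)
    eps = 1e-7
    # check the propagated derivative against a finite difference
    print(yd, (f(x + eps) - f(x - eps)) / (2.0 * eps))
```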
A computational fluid dynamics code is differentiated using algorithmic differ...
Adjoint algorithms, and in particular those obtained through the adjoint mode of Automatic Different...
A computational fluid dynamics code relying on a high-order spatial discretiza...
This paper presents a new functionality of the Automatic Differentiation (AD) Tool Tapenade. Tapenad...
Adjoint methods are the choice approach to obtain gradients of large simulation codes. Aut...
Efficient Algorithmic Differentiation of Fixed-Point loops requires a specific strategy to avoid exp...
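To make the truncated claim above concrete, here is a compact statement of the fixed-point adjoint relation that such specific strategies rely on; the notation (x, p, F, J) is assumed here and not taken from the abstract. If the loop converges to x* with x* = F(x*, p) and the objective is J(x*, p), then the adjoint state and the gradient satisfy

\begin{align*}
x^* &= F(x^*, p),\\
\bar{x} &= \Big(\frac{\partial J}{\partial x}\Big)^{\!T} + \Big(\frac{\partial F}{\partial x}(x^*, p)\Big)^{\!T}\bar{x},\\
\Big(\frac{dJ}{dp}\Big)^{\!T} &= \Big(\frac{\partial J}{\partial p}\Big)^{\!T} + \Big(\frac{\partial F}{\partial p}(x^*, p)\Big)^{\!T}\bar{x},
\end{align*}

so the adjoint can itself be obtained by a fixed-point iteration that needs only the converged state, rather than the whole history of iterates.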
This dissertation is concerned with algorithmic differentiation (AD), which is a method for algorith...
An essential performance and correctness factor in numerical simulation and optimization is ...
Adjoint algorithmic differentiation by operator and function overloading is based on the interpretat...
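The truncated sentence above describes the operator-overloading flavour of adjoint AD. The following self-contained Python sketch, with an invented Var class and tape rather than any particular library's API, shows the idea: overloaded operators record what they did on a tape, which is then interpreted in reverse order to propagate adjoints.

```python
import math

# Minimal sketch of adjoint AD by operator overloading; all names are invented.
class Var:
    def __init__(self, value, tape):
        self.value = value
        self.adj = 0.0
        self.tape = tape

    def __mul__(self, other):
        out = Var(self.value * other.value, self.tape)
        def backward():
            # push the output adjoint back to both inputs
            self.adj += other.value * out.adj
            other.adj += self.value * out.adj
        self.tape.append(backward)
        return out

    def __add__(self, other):
        out = Var(self.value + other.value, self.tape)
        def backward():
            self.adj += out.adj
            other.adj += out.adj
        self.tape.append(backward)
        return out

def sin(x):
    out = Var(math.sin(x.value), x.tape)
    def backward():
        x.adj += math.cos(x.value) * out.adj
    x.tape.append(backward)
    return out

def gradient(func, value):
    tape = []
    x = Var(value, tape)
    y = func(x)
    y.adj = 1.0
    # interpret the recorded operations in reverse order
    for backward in reversed(tape):
        backward()
    return x.adj

if __name__ == "__main__":
    # y = x*x + sin(x), so dy/dx = 2x + cos(x)
    g = gradient(lambda x: x * x + sin(x), 1.3)
    print(g, 2 * 1.3 + math.cos(1.3))
```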