In this work, we study the problem of checkpointing strategies for adjoint computation on synchronous hierarchical platforms. Specifically, we consider computational platforms with several levels of storage with different writing and reading costs. When reversing a large adjoint chain, choosing which data to checkpoint and where is a critical decision for the overall performance of the computation. We introduce H-Revolve, an optimal algorithm for this problem. We make it available in a public Python library along with the implementation of several state-of-the-art algorithms for the variant of the problem with two levels of storage. We provide a detailed description of how one can use this library in an adjoint computation software in the fiel...
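The trade-off described above can be illustrated with a minimal sketch of checkpoint-based adjoint reversal. This is not the H-Revolve library API; the functions `forward`, `backward`, and `reverse_with_checkpoints` are hypothetical names, and the forward step is a toy affine update chosen only so the result is easy to check. The idea is the standard one: store every c-th forward state, then sweep backward, recomputing each forward segment from the nearest stored checkpoint.

```python
# Hypothetical sketch of checkpoint-based adjoint reversal (illustration
# only, not the H-Revolve API). Stores every `every`-th forward state,
# then recomputes intermediate states from checkpoints during reversal.

def forward(state, step):
    """Toy forward step: an affine update (assumption for illustration)."""
    return 2 * state + step

def backward(state, adjoint):
    """Toy adjoint step: derivative of `forward` w.r.t. its state is 2."""
    return 2 * adjoint

def reverse_with_checkpoints(x0, n, every=4):
    # Forward sweep: keep only every `every`-th state as a checkpoint.
    checkpoints = {}
    state = x0
    for i in range(n):
        if i % every == 0:
            checkpoints[i] = state
        state = forward(state, i)
    # Backward sweep: recompute from the nearest stored checkpoint.
    adjoint = 1.0
    for i in reversed(range(n)):
        base = (i // every) * every
        s = checkpoints[base]
        for j in range(base, i):
            s = forward(s, j)
        adjoint = backward(s, adjoint)
    return adjoint

print(reverse_with_checkpoints(1.0, 8))  # chain of 8 steps, each with
                                         # local derivative 2 -> 256.0
```

A hierarchical scheme such as H-Revolve refines this by also choosing, for each checkpoint, which storage level to place it on, weighing read/write cost against recomputation cost; the uniform spacing used here is the simplest possible policy.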
In this paper, we design and analyze strategies to replicate the execution of ...
Classical reverse-mode automatic differentiation (AD) imposes only a small constant-factor overhead ...
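The constant-factor overhead of reverse-mode AD mentioned above comes from recording the forward computation and replaying it backward. A minimal tape-based sketch (illustrative only, not tied to any of the AD tools cited here; the class `Var` and function `backprop` are hypothetical names):

```python
# Minimal reverse-mode AD sketch: record each operation and its local
# partial derivatives during the forward pass, then propagate adjoints
# backward through the recorded graph. Toy code, not a production tool.

class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # list of (parent Var, local partial)
        self.grad = 0.0

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

def backprop(out):
    """Propagate adjoints from `out` back to the inputs."""
    out.grad = 1.0
    stack = [out]
    while stack:
        v = stack.pop()
        for parent, partial in v.parents:
            parent.grad += partial * v.grad
            stack.append(parent)

x, y = Var(3.0), Var(2.0)
z = x * y + x          # z = x*y + x
backprop(z)
print(x.grad, y.grad)  # dz/dx = y + 1 = 3.0, dz/dy = x = 3.0
```

Note that every intermediate `Var` must be kept alive until the backward pass, which is precisely the memory pressure that checkpointing strategies are designed to relieve.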
The adjoint mode of Algorithmic Differentiation (AD) is particularly attractive for computing gradie...
We reexamine the work of Aupy et al. on optimal algorithms for hierarchical ad...
We reexamine the work of Stumm and Walther on multistage algorithms for adjoin...
Adjoints are an important computational tool for large-scale sensitivity evaluation, uncerta...
Adjoint equations of differential equations have seen widespread applications in optimization, inver...
This paper presents a new functionality of the Automatic Differentiation (AD) Tool Tapenade....
For adjoint calculations, debugging, and similar purposes one may need to reverse the execution of...
Adjoint equations of differential equations have seen widespread applications in optimization, inve...
In this talk Dr Pallez will discuss the impact of memory in the computation of...
Inversion and PDE-constrained optimization problems often rely on solving the adjoint problem to cal...
This paper presents a new functionality of the Automatic Differentiation (AD) Tool Tapenade. Tapenad...
Checkpointing is a classical technique to mitigate the overhead of adjoint Algorithmic Differenti...
For adjoint calculations, parameter estimation, and similar purposes one may need to reverse the exe...
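The reversal problem described in several of these abstracts has a classical closed-form answer in the two-level setting, due to Griewank and Walther's revolve algorithm: with s checkpoint slots and at most r forward recomputations of any step, a chain of length up to binomial(s + r, s) can be reversed. A short sketch (the function name `max_chain_length` is illustrative):

```python
# Classical bound behind binomial (revolve-style) checkpointing:
# s snapshot slots and at most r recomputations per step suffice to
# reverse a chain of up to C(s + r, s) steps (Griewank & Walther).
from math import comb

def max_chain_length(s, r):
    return comb(s + r, s)

print(max_chain_length(3, 3))  # 3 snapshots, 3 repeats -> 20 steps
```

Hierarchical variants such as H-Revolve generalize this bound by accounting for distinct read/write costs at each storage level rather than a single uniform checkpoint cost.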