In this paper, gradient and Hamiltonian dynamics are investigated in both discrete-time and sampled-data contexts. First, the discrete gradient function is employed to define discrete gradient and Hamiltonian dynamics. On this basis, it is shown that representations of these forms can be recovered when computing the sampled-data equivalent models of continuous-time gradient and Hamiltonian dynamics.
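As a concrete illustration of the first step, below is a minimal sketch of a discrete-gradient integrator for a planar Hamiltonian system, using the Gonzalez (midpoint) discrete gradient; the pendulum Hamiltonian, step size, and helper names are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch: discrete-gradient (energy-preserving) integrator for a
# Hamiltonian system x' = J * grad H(x), using the Gonzalez midpoint
# discrete gradient.  The Hamiltonian, step size, and names are illustrative.
import numpy as np
from scipy.optimize import fsolve

J = np.array([[0.0, 1.0], [-1.0, 0.0]])           # canonical symplectic matrix

def H(x):                                          # example: pendulum Hamiltonian
    q, p = x
    return 0.5 * p**2 + (1.0 - np.cos(q))

def grad_H(x):
    q, p = x
    return np.array([np.sin(q), p])

def discrete_gradient(x0, x1):
    """Gonzalez midpoint discrete gradient: satisfies dg . (x1 - x0) = H(x1) - H(x0)."""
    d = x1 - x0
    g = grad_H(0.5 * (x0 + x1))
    if np.dot(d, d) < 1e-14:                       # coincident points: ordinary gradient
        return g
    return g + (H(x1) - H(x0) - np.dot(g, d)) / np.dot(d, d) * d

def step(x0, h):
    """One implicit step of the discrete Hamiltonian dynamics
    (x1 - x0)/h = J * discrete_gradient(x0, x1)."""
    residual = lambda x1: x1 - x0 - h * J @ discrete_gradient(x0, x1)
    return fsolve(residual, x0 + h * J @ grad_H(x0))

x, h = np.array([1.0, 0.0]), 0.1
for _ in range(100):
    x = step(x, h)
print(H(np.array([1.0, 0.0])), H(x))               # energy conserved up to solver tolerance
```

Since the discrete gradient satisfies the mean-value property and J is skew-symmetric, H(x1) - H(x0) = h * dg . (J dg) = 0 at every step, which is exactly the energy-preservation argument that the discrete Hamiltonian form makes transparent.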
In this paper, the differential/difference representation (DDR) of an input-affine dynamics under sa...
Hamiltonian Monte Carlo (HMC) sampling methods provide a mechanism for defining distant proposals w...
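For orientation, here is a minimal sketch of an HMC step with a leapfrog integrator; the target density, step size, and trajectory length are illustrative assumptions, not taken from the cited work.

```python
# Minimal sketch of Hamiltonian Monte Carlo with a leapfrog integrator,
# assuming a differentiable log-density `log_p` and its gradient `grad_log_p`
# (both names are illustrative).
import numpy as np

def hmc_step(x, log_p, grad_log_p, step_size=0.1, n_leapfrog=20, rng=np.random.default_rng()):
    p = rng.standard_normal(x.shape)               # resample auxiliary momentum
    x_new, p_new = x.copy(), p.copy()

    # Leapfrog integration of H(x, p) = -log_p(x) + |p|^2 / 2
    p_new += 0.5 * step_size * grad_log_p(x_new)
    for _ in range(n_leapfrog - 1):
        x_new += step_size * p_new
        p_new += step_size * grad_log_p(x_new)
    x_new += step_size * p_new
    p_new += 0.5 * step_size * grad_log_p(x_new)

    # Metropolis acceptance keeps the target distribution invariant
    h_old = -log_p(x) + 0.5 * np.dot(p, p)
    h_new = -log_p(x_new) + 0.5 * np.dot(p_new, p_new)
    return x_new if np.log(rng.uniform()) < h_old - h_new else x

# Example: sample from a standard 2-D Gaussian
log_p = lambda x: -0.5 * np.dot(x, x)
grad_log_p = lambda x: -x
x, samples = np.zeros(2), []
for _ in range(1000):
    x = hmc_step(x, log_p, grad_log_p)
    samples.append(x)
```

The long leapfrog trajectory is what produces the distant, yet high-acceptance, proposals that distinguish HMC from random-walk Metropolis.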
In a recent paper by Bourdin and Trélat, a version of the Pontryagin maximum p...
This paper investigates the transformation of Hamiltonian structures under sampling. It is shown tha...
Continuous relaxations play an important role in discrete optimization, but have not seen much use i...
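For orientation only, the basic idea of a continuous relaxation in discrete optimization (a generic formulation, not the specific construction of the cited work) is to replace the integrality constraint by its convex hull:
\[
\min_{x \in \{0,1\}^n} c^\top x \ \ \text{s.t.}\ Ax \le b
\quad\longrightarrow\quad
\min_{x \in [0,1]^n} c^\top x \ \ \text{s.t.}\ Ax \le b,
\]
whose optimal value lower-bounds the discrete optimum and whose continuous feasible set admits gradient-based methods.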
The process of machine learning can be considered in two stages: model selection and parameter estim...
Discrete gradient methods are integrators designed to preserve invariants of ordinary differenti...
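For reference, the defining properties of a discrete gradient and the resulting invariant-preserving scheme, in the standard notation of the discrete-gradient literature (not necessarily the exact formulation of the cited work), are
\[
\overline{\nabla} f(x, x') \cdot (x' - x) = f(x') - f(x), \qquad \overline{\nabla} f(x, x) = \nabla f(x),
\]
so that, for a system \(\dot{x} = S(x)\,\nabla f(x)\) with \(S\) skew-symmetric, the scheme
\[
\frac{x_{k+1} - x_k}{h} = \overline{S}(x_k, x_{k+1})\, \overline{\nabla} f(x_k, x_{k+1})
\]
preserves \(f\) exactly, since \(f(x_{k+1}) - f(x_k) = h\, \overline{\nabla} f^\top \overline{S}\, \overline{\nabla} f = 0\).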
Stochastic gradient descent is an optimisation method that combines classical gradient des...
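A minimal sketch of stochastic gradient descent on a least-squares problem is given below; the model, data, batch size, and learning rate are illustrative choices, not taken from the cited work.

```python
# Minimal sketch: SGD on a randomly generated least-squares problem.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((1000, 20))                # design matrix
x_true = rng.standard_normal(20)
b = A @ x_true + 0.01 * rng.standard_normal(1000)  # noisy observations

x, lr, batch = np.zeros(20), 0.01, 32
for step in range(2000):
    idx = rng.integers(0, A.shape[0], size=batch)  # random mini-batch
    grad = A[idx].T @ (A[idx] @ x - b[idx]) / batch
    x -= lr * grad                                 # gradient step on the subsampled loss
print(np.linalg.norm(x - x_true))                  # error shrinks as iterations accumulate
```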
In this work, we revisit the theoretical properties of Hamiltonian stochastic differential equations...
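One common instance of a Hamiltonian stochastic differential equation is the kinetic (underdamped) Langevin dynamics, written here for orientation and not necessarily in the exact form studied in the cited work:
\[
\mathrm{d}q = p\,\mathrm{d}t, \qquad
\mathrm{d}p = -\nabla U(q)\,\mathrm{d}t - \gamma p\,\mathrm{d}t + \sqrt{2\gamma}\,\mathrm{d}W_t,
\]
whose invariant distribution is proportional to \(\exp\!\big(-U(q) - \tfrac{1}{2}\lvert p\rvert^2\big)\).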
Difference equations for Hamiltonian systems are derived from a discrete variational principle. The ...
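In the standard notation of discrete mechanics (a discrete Lagrangian \(L_d\) approximating the action over one step; given here as background, not necessarily the precise construction of the cited work), the discrete variational principle and the resulting difference equations read
\[
S_d = \sum_{k=0}^{N-1} L_d(q_k, q_{k+1}), \qquad
D_2 L_d(q_{k-1}, q_k) + D_1 L_d(q_k, q_{k+1}) = 0,
\]
with the discrete Legendre transforms
\[
p_k = -D_1 L_d(q_k, q_{k+1}), \qquad p_{k+1} = D_2 L_d(q_k, q_{k+1})
\]
giving the position-momentum form of the difference equations for the Hamiltonian system.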