Backpropagating gradients through random variables is at the heart of numerous machine learning applications. In this paper, we present a general framework for deriving stochastic backpropagation rules for any continuous distribution. Our approach exploits the link between the characteristic function and the Fourier transform to transport the derivatives from the parameters of the distribution to the random variable. Our method generalizes previously known estimators, and results in new estimators for the gamma, beta, Dirichlet and Laplace distributions. Furthermore, we show that the classical deterministic backpropagation rule in neural networks is a special case of stochastic backpropagation with Dirac distributions.
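The best-known special case that this framework generalizes is the Gaussian pathwise (reparameterization) estimator. As a point of reference, here is a minimal sketch of that estimator; the function names and the test integrand f(x) = x^2 are illustrative choices, not taken from the paper.

```python
import numpy as np

# Minimal sketch of the Gaussian special case (not the paper's general
# framework). For x ~ N(mu, sigma^2), write x = mu + sigma * eps with
# eps ~ N(0, 1); the gradient of E[f(x)] w.r.t. (mu, sigma) then flows
# through the sample itself.

rng = np.random.default_rng(0)

def pathwise_grads(f_prime, mu, sigma, n_samples=100_000):
    eps = rng.standard_normal(n_samples)
    x = mu + sigma * eps           # reparameterized samples
    g = f_prime(x)                 # df/dx evaluated at the samples
    # Chain rule: dx/dmu = 1 and dx/dsigma = eps.
    return g.mean(), (g * eps).mean()

# E[x^2] = mu^2 + sigma^2, so the exact gradients are 2*mu and 2*sigma.
d_mu, d_sigma = pathwise_grads(lambda x: 2.0 * x, mu=1.5, sigma=0.7)
print(d_mu, d_sigma)               # ~3.0 and ~1.4
```

Writing x = mu + sigma * eps moves the randomness into the parameter-free variable eps, which is the same transport of derivatives from the parameters to the random variable that the abstract extends beyond the Gaussian case.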
We marry ideas from deep neural networks and approximate Bayesian inference to derive a generalised ...
Digital backpropagation gained popularity due to its ability to combat deterministic nonlinear effects...
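The snippet above is truncated, but digital backpropagation (DBP) itself is standard: the received optical field is propagated through a virtual fiber whose dispersion and nonlinearity coefficients have flipped signs, typically with the split-step Fourier method. A minimal sketch under simplified assumptions (scalar NLSE, lossless single span, made-up parameter values) follows.

```python
import numpy as np

# Illustrative sketch, not from the cited paper. Assumptions: scalar NLSE,
# lossless single span, hypothetical parameters. Digital backpropagation
# undoes deterministic fiber nonlinearity by propagating the received field
# through a virtual fiber with negated dispersion (beta2) and nonlinearity
# (gamma), using the split-step Fourier method.

def ssfm(field, n_steps, dz, beta2, gamma, dt):
    w = 2 * np.pi * np.fft.fftfreq(field.size, d=dt)    # angular frequencies
    half_disp = np.exp(0.5j * beta2 * w**2 * (dz / 2))  # half-step dispersion
    for _ in range(n_steps):
        field = np.fft.ifft(half_disp * np.fft.fft(field))
        field = field * np.exp(1j * gamma * np.abs(field)**2 * dz)  # Kerr phase
        field = np.fft.ifft(half_disp * np.fft.fft(field))
    return field

dt, dz, steps = 1e-12, 1.0e3, 100                  # 1 ps grid, 1 km steps
t = np.arange(-512, 512) * dt
tx = np.exp(-t**2 / (2 * (20e-12)**2)) + 0j        # 20 ps Gaussian pulse
rx = ssfm(tx, steps, dz, beta2=-21e-27, gamma=1.3e-3, dt=dt)    # forward fiber
dbp = ssfm(rx, steps, dz, beta2=+21e-27, gamma=-1.3e-3, dt=dt)  # backpropagation
print(np.max(np.abs(dbp - tx)))                    # ~1e-15: nonlinearity undone
```

Because every split-step operation is invertible, the backward pass with negated beta2 and gamma undoes the forward numerics exactly here; in practice, noise and imperfect channel knowledge limit how much nonlinearity DBP can remove, which is why it combats only the deterministic effects the abstract mentions.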
Deep Gaussian processes (DGPs) are multi-layer hierarchical generalisations of Gaussian processes (GPs)...
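The definition being quoted is compositional: a DGP feeds the output of one GP-distributed function into the next. A small sketch of drawing one sample from a two-layer DGP prior (RBF kernels and all numerical choices are illustrative assumptions) makes the hierarchy concrete.

```python
import numpy as np

# Illustrative sketch, not from the cited paper: one draw from a two-layer
# deep GP prior f2(f1(x)). Each layer is a zero-mean GP with an RBF kernel;
# the sampled values of layer 1 become the inputs of layer 2.

rng = np.random.default_rng(0)

def rbf(a, b, lengthscale=1.0):
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

x = np.linspace(-3.0, 3.0, 50)
jitter = 1e-6 * np.eye(x.size)                      # numerical stabilizer

K1 = rbf(x, x) + jitter
f1 = rng.multivariate_normal(np.zeros(x.size), K1)  # layer-1 function values

K2 = rbf(f1, f1) + jitter                           # kernel on layer-1 outputs
f2 = rng.multivariate_normal(np.zeros(x.size), K2)  # the DGP sample at x
```

Stacking draws this way produces function priors a single GP cannot express, which is the sense in which DGPs generalise GPs hierarchically.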
We investigate a new approach to compute the gradients of artificial neural networks (ANNs), based on...
This paper proposes a backpropagation-based feedforward neural network for learning probability distributions...
We introduce a novel training principle for probabilistic models that is an alternative to maximum likelihood...
Despite remarkable progress in deep learning, its hardware implementation beyond deep learning ac...
Recently, we proposed to transform the outputs of each hidden neuron in a multi-layer perceptron...
The paper studies a stochastic extension of continuous recurrent neural networks and analyzes gradient...