Motivated by the goal of enabling energy-efficient and/or lower-cost hardware implementations of deep neural networks, we describe a method for modifying the standard backpropagation algorithm that reduces memory usage during training by up to a factor of 32 compared with standard single-precision floating-point implementations. The method is inspired by recent work on feedback alignment in the context of seeking neurobiological correlates of backpropagation-based learning; similar to feedback alignment, we also calculate gradients imprecisely. Specifically, our method introduces stochastic binarization of hidden-unit activations for use in the backward pass, after they are no longer used in the forward pass. We show that w...
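The memory saving comes from replacing each stored 32-bit activation with a single stochastically sampled bit once the forward pass no longer needs it; the backward pass then computes weight gradients from the one-bit reconstruction. Below is a minimal NumPy sketch of that idea; the two-layer ReLU network, the clipping scale h_max, and all function names are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_binarize(h, h_max):
    """Sample one bit per activation so that E[bit * h_max] ~= h
    for h in [0, h_max]; only the bits are kept for the backward pass."""
    p = np.clip(h / h_max, 0.0, 1.0)       # probability of sampling a 1
    return rng.random(h.shape) < p         # boolean array: 1 bit per unit

def forward(x, W1, W2, h_max=4.0):
    h1 = np.maximum(W1 @ x, 0.0)           # full-precision ReLU activation
    y = W2 @ h1
    # h1 is no longer needed once y is computed, so store only its
    # stochastic one-bit version plus the ReLU mask (also one bit).
    return y, h1 > 0.0, stochastic_binarize(h1, h_max)

def backward(x, dy, relu_mask, h1_bits, W2, h_max=4.0):
    h1_approx = h1_bits * h_max            # unbiased low-precision stand-in
    dW2 = np.outer(dy, h1_approx)          # weight gradient uses binarized h1
    da1 = (W2.T @ dy) * relu_mask          # error still flows through exact weights
    dW1 = np.outer(da1, x)
    return dW1, dW2
```

Storing one bit rather than a 32-bit float per hidden activation is where the up-to-32x memory reduction would come from.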
Artificial neural networks (ANNs) trained using backpropagation are powerful learning architectures ...
The state-of-the-art machine learning approach to training deep neural networks, backpropagation, is...
An ongoing challenge in neuromorphic computing is to devise general and computationally efficient mo...
Despite remarkable progress in deep learning, its hardware implementation beyond deep learning ac...
Training deep neural networks on large-scale datasets requires significant hardware resources whose ...
The family of feedback alignment (FA) algorithms aims to provide a more biologically motivated alter...
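FA's core modification is to propagate the output error through a fixed random matrix instead of the transpose of the forward weights, removing the need for symmetric feedback pathways. A minimal NumPy sketch under that reading follows; the layer sizes, tanh nonlinearity, and learning rate are illustrative choices, not taken from any specific FA paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_hid, n_out = 8, 16, 4
W1 = rng.standard_normal((n_hid, n_in)) * 0.1
W2 = rng.standard_normal((n_out, n_hid)) * 0.1
B = rng.standard_normal((n_hid, n_out)) * 0.1   # fixed random feedback, never trained

def fa_step(W1, W2, x, target, lr=0.01):
    h = np.tanh(W1 @ x)
    y = W2 @ h
    e = y - target                     # output error
    dh = (B @ e) * (1.0 - h ** 2)      # B replaces the W2.T used by exact backprop
    return W1 - lr * np.outer(dh, x), W2 - lr * np.outer(e, h)

x, target = rng.standard_normal(n_in), rng.standard_normal(n_out)
W1, W2 = fa_step(W1, W2, x, target)
```

Because B is fixed, the forward weights adapt so that the random feedback directions become useful, which is the "alignment" the family is named for.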
Significant success has been reported recently using deep neural networks for classification. Such ...
Recent advances in deep neural networks (DNNs) owe their success to training algorithms that use bac...
We present a method to train self-binarizing neural networks, that is, networks that evolve their we...
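One common way to realize self-binarization, and a plausible reading of this abstract, is to pass latent weights through a scaled tanh whose sharpness is annealed during training, so the effective weights drift toward ±1 without a hard quantization step. The sketch below shows only that generic annealing idea; the schedule and names are assumptions, not the paper's method.

```python
import numpy as np

def soft_binarize(w, beta):
    """Continuous relaxation of sign(w): as beta grows over
    training, tanh(beta * w) saturates toward -1 / +1."""
    return np.tanh(beta * w)

w_latent = np.random.default_rng(2).standard_normal(5)
for epoch, beta in enumerate([1.0, 4.0, 16.0, 64.0]):
    w_eff = soft_binarize(w_latent, beta)   # weights used in the forward pass
    # ... update w_latent with gradients flowing through the tanh ...
    print(epoch, np.round(w_eff, 2))        # values harden toward ±1
```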
The backpropagation (BP) algorithm is often thought to be biologically implausible in the brain. One...
Error backpropagation is a highly effective mechanism for learning high-quality hierarchical feature...
Random backpropagation (RBP) is a variant of the backpropagation algorithm for training neural netwo...
With the adoption of smart systems, artificial neural networks (ANNs) have bec...