While end-to-end training of Deep Neural Networks (DNNs) yields state-of-the-art performance in an increasing array of applications, it does not provide insight into, or control over, the features being extracted. We report here on a promising neuro-inspired approach to DNNs with sparser and stronger activations. We use standard stochastic gradient training, supplementing the end-to-end discriminative cost function with layer-wise costs promoting Hebbian ("fire together, wire together") updates for highly active neurons, and anti-Hebbian updates for the remaining neurons. Instead of batch norm, we use divisive normalization of activations (suppressing weak outputs using strong outputs), along with implicit $\ell_2$ normalization of neuron...
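To make the two mechanisms named in the abstract concrete, below is a minimal PyTorch-style sketch. It is not the authors' implementation: the channel-wise pooling neighborhood in divisive normalization, the quadratic form of the layer-wise cost, and all names (`DivisiveNorm`, `hebbian_anti_hebbian_cost`, `sigma`, `k`, `lam`, `layer_responses`) are assumptions made for illustration.

```python
import torch
import torch.nn as nn

class DivisiveNorm(nn.Module):
    """Divisive normalization: suppress weak activations using strong ones.
    Assumed form: divide each activation by pooled activity over channels."""
    def __init__(self, sigma: float = 1e-3):
        super().__init__()
        self.sigma = sigma  # small floor term preventing division by zero

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, ...) feature map; pool activity across dim=1.
        pooled = x.pow(2).mean(dim=1, keepdim=True).sqrt()
        return x / (self.sigma + pooled)

def hebbian_anti_hebbian_cost(y: torch.Tensor, k: int) -> torch.Tensor:
    """Layer-wise cost whose gradient-descent step is Hebbian for the k most
    active units and anti-Hebbian for the rest (assumed quadratic form).
    y: (batch, n_units) linear responses y_i = w_i . x of one layer.
    With J = -0.5 * mean(s_i * y_i^2), dJ/dw_i is proportional to -s_i*y_i*x,
    so the descent update dw_i ~ s_i * y_i * x is Hebbian when s_i = +1."""
    idx = y.abs().topk(k, dim=1).indices  # the k most active units per sample
    s = -torch.ones_like(y)               # anti-Hebbian sign by default
    s.scatter_(1, idx, 1.0)               # Hebbian sign for the top-k units
    return -0.5 * (s * y.pow(2)).mean()   # s is constant w.r.t. autograd

# Hypothetical usage: add the layer-wise terms to the end-to-end task loss.
# loss = F.cross_entropy(logits, labels) + lam * sum(
#     hebbian_anti_hebbian_cost(y, k=16) for y in layer_responses)
```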
Deep convolutional sparse coding (D-CSC) is a framework reminiscent of deep convolutional neural net...
Hardware accelerators for neural network inference can exploit common data properties for performanc...
Sparse representation plays a critical role in vision problems, including generation and understandi...
Recently, sparse training methods have started to be established as a de facto approach for training...
Deep neural networks (DNN) are the state-of-the-art machine learning models outperforming traditiona...
Sparse neural networks attract increasing interest as they exhibit comparable performance to their d...
Brain-inspired event-driven processors execute deep neural networks (DNNs) in a sparsity-aware manne...
Deep Neural Networks (DNNs) have greatly advanced several domains of machine learning including imag...
Sparsifying deep neural networks is of paramount interest in many areas, espec...
DNNs have been finding a growing number of applications including image classification, speech recog...
Through the success of deep learning in various domains, artificial neural networks are cur...
Deep neural nets (DNNs) compression is crucial for adaptation to mobile devices. Though many success...
Artificial neural networks (ANNs) have emerged as hot topics in the research community. Despite the ...
Inspired by the physiology of neuronal systems in the brain, artificial neural networks have become ...