We propose a semi-supervised learning strategy for deep Convolutional Neural Networks (CNNs) in which an unsupervised pre-training stage, performed using biologically inspired Hebbian learning algorithms, is followed by supervised end-to-end backprop fine-tuning. We explore two Hebbian learning rules for the unsupervised pre-training stage: soft-Winner-Takes-All (soft-WTA) and nonlinear Hebbian Principal Component Analysis (HPCA). Our approach is applied in sample-efficiency scenarios, where the amount of available labeled training samples is very limited and unsupervised pre-training is therefore beneficial. We performed experiments on the CIFAR10, CIFAR100, and Tiny ImageNet datasets. Our results show that Hebbian outperforms Variational A...
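The abstract above names a soft-Winner-Takes-All Hebbian rule as one of the two pre-training strategies. As a rough illustration of what such a rule looks like, the following is a minimal NumPy sketch of one common soft-WTA competitive Hebbian update, where each unit's response is a softmax over similarities and every unit moves toward the input in proportion to its soft win. This is an illustrative sketch of the general technique, not the paper's exact formulation; the function name, temperature parameter, and update form are assumptions.

```python
import numpy as np

def soft_wta_hebbian_step(W, x, lr=0.1, temperature=1.0):
    """One soft-WTA Hebbian update (illustrative sketch, not the paper's exact rule).

    W: (n_units, n_inputs) weight matrix, one row per unit
    x: (n_inputs,) input vector
    Returns the updated weights and the soft-competition activations.
    """
    s = W @ x                                  # similarity of each unit to the input
    y = np.exp(s / temperature)
    y /= y.sum()                               # soft competition via softmax
    # Hebbian "move toward the input" update, gated by each unit's soft win
    W = W + lr * y[:, None] * (x[None, :] - W)
    return W, y
```

Repeated application of such a step pulls the winning unit's weight vector toward the inputs it responds to, so the rows of `W` come to act as learned feature detectors without any label information.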
Unsupervised learning permits the development of algorithms that are able to adapt to a variety of d...
PyTorch implementation of Hebbian learning algorithms to train deep convolutional neural networks. ...
Current commonly used image recognition convolutional neural networks share some similarities with t...
In this paper, we investigate Hebbian learning strategies applied to Convolutional Neural Network (C...
We propose to address the issue of sample efficiency, in Deep Convolutional Neural Networks (DCNN), ...
We explore competitive Hebbian learning strategies to train feature detectors in Convolutional Neura...
The concept of Hebbian learning refers to a family of learning rules, inspired by biology, according...
Learning algorithms for Deep Neural Networks are typically based on supervised end-to-end Stochastic...
Hebbian plasticity in winner-take-all (WTA) networks is highly attractive for neuromorphic on-chip l...
One of the major paradigms for unsupervised learning in Artificial Neural Networks is Hebbian learni...
We investigate the properties of feedforward neural networks trained with Hebbian learning algorit...
Multi-layer models of sparse coding (deep dictionary learning) and dimensionality reduction (PCANet)...
In the past few years, Deep Neural Network (DNN) architectures have achieved outstanding results in ...