Deep Neural Networks (DNNs) have two key deficiencies: their dependence on high-precision computing, and their inability to perform sequential learning, that is, when a DNN is trained on a first task and then trained on a second task, it forgets the first. This phenomenon of forgetting previous tasks is referred to as catastrophic forgetting. The mammalian brain, by contrast, outperforms DNNs in energy efficiency and in its ability to learn sequentially without catastrophic forgetting. Here, we use bio-inspired Spike Timing Dependent Plasticity (STDP) in the feature-extraction layers of the network, with instantaneous neurons, to extract meaningful features. In the classification sections of the network we use a...
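The STDP rule mentioned above can be illustrated with a minimal pairwise sketch: a synapse is potentiated when the presynaptic spike precedes the postsynaptic spike, and depressed otherwise, with exponentially decaying magnitude. The parameter values and clipping bounds below are illustrative assumptions, not the abstract's actual settings.

```python
import numpy as np

def stdp_update(w, t_pre, t_post, a_plus=0.01, a_minus=0.012,
                tau_plus=20.0, tau_minus=20.0, w_min=0.0, w_max=1.0):
    """Pairwise STDP weight update (illustrative parameters).

    w       : current synaptic weight
    t_pre   : presynaptic spike time (ms)
    t_post  : postsynaptic spike time (ms)
    """
    dt = t_post - t_pre
    if dt >= 0:
        # Pre fires before post: potentiation (LTP)
        dw = a_plus * np.exp(-dt / tau_plus)
    else:
        # Post fires before pre: depression (LTD)
        dw = -a_minus * np.exp(dt / tau_minus)
    # Keep the weight inside its allowed range
    return float(np.clip(w + dw, w_min, w_max))
```

Repeatedly applying this update to synapses driven by input spikes lets features that consistently precede a neuron's firing strengthen their connections, which is the sense in which the feature-extraction layers learn without gradient-based supervision.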
We present a model of spike-driven synaptic plasticity inspired by experimental observations and mot...
Biological neurons communicate primarily via a spiking process. Recurrently connected spiking neural...
Deep neural networks are used in many state-of-the-art systems for machine perception. Once a networ...
Continual learning is the ability to acquire a new task or knowledge without losing any previously c...
The ability to learn tasks in a sequential fashion is crucial to the development of artificial intel...
Spiking neural networks are biologically plausible counterparts of artificial neural networks. Artif...
Overcoming Catastrophic Forgetting in neural networks is crucial to solving continuous learning prob...
Artificial neural networks overwrite previously learned tasks when trained sequentially, a phenomeno...
Humans can learn several tasks in succession with minimal mutual interference but perform more poorl...