Humans can learn several tasks in succession with minimal mutual interference but perform more poorly when trained on multiple tasks at once. The opposite is true for standard deep neural networks. Here, we propose novel computational constraints for artificial neural networks, inspired by earlier work on gating in the primate prefrontal cortex, that capture the cost of interleaved training and allow the network to learn two tasks in sequence without forgetting. We augment standard stochastic gradient descent with two algorithmic motifs, so-called “sluggish” task units and a Hebbian training step that strengthens connections between task units and hidden units that encode task-relevant information. We found that the “sluggish” units introdu...
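The two algorithmic motifs named in this abstract can be illustrated with a minimal sketch. This is a hypothetical illustration, not the authors' implementation: the smoothing rate `alpha`, the learning rate `lr`, and the function names are all assumptions. "Sluggish" task units are modeled as an exponential moving average of the one-hot task cue, so the task signal bleeds across trial boundaries, and the Hebbian step is an outer-product update between task-unit and hidden-unit activities.

```python
import numpy as np

def sluggish_task_units(task_cues, alpha=0.1):
    """Temporally smooth one-hot task cues with an exponential moving
    average, so the task signal carries over between trials.
    `alpha` (hypothetical) controls how quickly the units update."""
    smoothed = np.zeros_like(task_cues, dtype=float)
    state = np.zeros(task_cues.shape[1])
    for t, cue in enumerate(task_cues):
        state = (1.0 - alpha) * state + alpha * cue
        smoothed[t] = state
    return smoothed

def hebbian_step(w, task_act, hidden_act, lr=0.01):
    """Strengthen connections between active task units and hidden units
    that carry task-relevant activity (outer-product Hebbian update)."""
    return w + lr * np.outer(task_act, hidden_act)
```

In blocked training the smoothed cue converges to the current task's one-hot vector, so the Hebbian step mainly reinforces that task's gating weights; under interleaved training the sluggish state mixes both cues, which is what produces the interleaving cost described above.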
While humans can learn to perform many specific and highly specialized behaviors, perhaps what is mos...
Abstract. Working memory is a key component of intelligence that the brain implements as persistent ...
Biological agents do not have infinite resources to learn new things. For this reason, a central asp...
Humans can learn to perform multiple tasks in succession over the lifespan ("continual" learning), w...
How do humans and other animals learn new tasks? A wave of brain recording studies has investigated ...
The ability to sequentially learn multiple tasks without forgetting is a key skill of biological bra...
Neural networks struggle in continual learning settings from catastrophic forgetting: when trials ar...
Depending on environmental demands, humans performing in a given task are able to exploit multiple c...
Neural networks are very powerful computational models, capable of outperforming humans on a variety...
Memories are stored and recalled throughout the lifetime of an animal, but many models of memory, in...
Deep Neural Networks (DNNs) have two key deficiencies, their dependence on high precision computing ...