In neural networks, newly learned patterns can radically interfere with previously stored ones. This drawback is called catastrophic forgetting, or catastrophic interference. In this paper, we propose a biologically inspired neural network model that overcomes this problem. The proposed model consists of two distinct networks: one is a Hopfield-type chaotic associative memory and the other is a multilayer neural network. We consider that these networks correspond to the hippocampus and the neocortex of the brain, respectively. Incoming information is first stored in the hippocampal network with a fast learning algorithm. Then the stored information is recalled by chaotic behavior of each neuron in t...
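The fast "hippocampal" half of the scheme above can be sketched as a small Hopfield network: patterns are stored in one shot by a Hebbian rule, and later recalled from noisy cues so they can be replayed to a slowly trained "neocortical" network. This is an illustrative toy under assumed names and sizes, not the paper's implementation, and it uses plain asynchronous Hopfield dynamics rather than the chaotic neuron model described in the abstract.

```python
import numpy as np

class HopfieldStore:
    """Fast 'hippocampal' store: one-shot Hebbian learning of +/-1 patterns."""

    def __init__(self, n):
        self.n = n
        self.W = np.zeros((n, n))

    def store(self, pattern):
        # Hebbian outer-product rule; zero diagonal (no self-connections).
        p = np.asarray(pattern, dtype=float)
        self.W += np.outer(p, p) / self.n
        np.fill_diagonal(self.W, 0.0)

    def recall(self, cue, sweeps=5):
        # Asynchronous updates: energy never increases, so the state
        # settles into a stored attractor near the cue.
        s = np.asarray(cue, dtype=float).copy()
        for _ in range(sweeps):
            for i in range(self.n):
                s[i] = 1.0 if self.W[i] @ s >= 0 else -1.0
        return s

rng = np.random.default_rng(0)
n = 64
patterns = [rng.choice([-1.0, 1.0], size=n) for _ in range(3)]

hippo = HopfieldStore(n)
for p in patterns:
    hippo.store(p)  # fast, one-shot storage

# Recall from a corrupted cue: flip ~10% of the bits of the first pattern.
cue = patterns[0].copy()
flip = rng.choice(n, size=6, replace=False)
cue[flip] *= -1
recalled = hippo.recall(cue)
```

In the dual-network picture, patterns recalled this way would then be interleaved with new data when training the slow multilayer network, so that consolidation does not overwrite earlier memories.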
We explore a dual-network architecture with self-refreshing memory (An...
Abstract—In neural networks, when new patterns are learned by a network, they radically interfere wit...
Gradient descent learning procedures are most often...
In recent years, the possible applications of artificial intelligence (AI) and deep learning have in...
This paper proposes a new dynamical memory system based on chaotic neural networks, and its learning...
This thesis describes a connectionist approach to learning and long-term memory consolidation, inspi...
For the last twenty years, several assumptions have been expressed in the fields of information proc...