Recent results on the memory storage capacity of the outer-product algorithm indicate that the algorithm stores of the order of n/log n memories in a network of n fully interconnected linear threshold elements, when it is required that each memory be exactly recovered from a probe which is close enough to it. In this paper a rigorous analysis is presented of generalizations of the outer-product algorithm to higher-order networks of densely interconnected polynomial threshold units of degree d. Precise notions of memory storage capacity are formulated, and it is demonstrated that both static and dynamic storage capacities of all variants of the outer-product algorithm of degree d are of the order of n^d/log n.
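The degree-1 case of the outer-product algorithm discussed above is the classical Hebbian rule for Hopfield-type networks: the weight matrix is the sum of outer products of the stored ±1 patterns, and recall iterates a sign-threshold update from a probe. A minimal sketch follows, assuming ±1 pattern vectors; the function names are illustrative, not from the paper.

```python
import numpy as np

def outer_product_weights(patterns):
    """Outer-product (Hebbian) weights for ±1 patterns of shape (m, n)."""
    m, n = patterns.shape
    W = patterns.T @ patterns / n   # sum of outer products, scaled by n
    np.fill_diagonal(W, 0.0)        # no self-connections
    return W

def recall(W, probe, max_steps=20):
    """Iterate the sign-threshold dynamics until a fixed point (or step cap)."""
    s = probe.copy()
    for _ in range(max_steps):
        s_new = np.sign(W @ s)
        s_new[s_new == 0] = 1       # break ties toward +1
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s
```

With a single stored pattern and a probe within a small Hamming distance of it, one synchronous update already recovers the memory exactly, illustrating the "exact recovery from a close-enough probe" requirement in the abstract.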
A fundamental problem in neuroscience is understanding how working memory-the ability to store infor...
The information capacity of Kanerva's Sparse Distributed Memory (SDM) and Hopfield-type neural netwo...
Recurrent neural networks have been shown to be able to store memory patterns as fixed point attract...
The focus of the paper is the estimation of the maximum number of states that can be made st...
A model of associative memory incorporating global linearity and pointwise nonlinearities in a state s...
General high order neural networks [LD…] (models which are multinomial as opposed to linear ...
This paper deals with a neural network model in which each neuron performs a threshold logic...
The outer-product method for programming the Hopfield model is discussed. The method can result in m...
A generalized associative memory model with potentially high capacity is presented. A memory of this...
Determining the memory capacity of two layer neural networks with $m$ hidden neurons and input dimen...
Understanding the theoretical foundations of how memories are encoded and retrieved in neural popula...
Local learning neural networks have long been limited by their inability to store correlated pattern...
Sequence memory is an essential attribute of natural and artificial intelligence that enables agents...
The neural network is a powerful computing framework that has been exploited by biological evolution...