We derive the Gardner storage capacity for associative networks of threshold-linear units, and show that with Hebbian learning they can operate closer to this Gardner bound than binary networks, and even surpass it. This is largely achieved through a sparsification of the retrieved patterns, which we analyze for theoretical and empirical distributions of activity. Since reaching the optimal capacity via nonlocal learning rules such as backpropagation requires slow and neurally implausible training procedures, our results indicate that one-shot, self-organized Hebbian learning can be just as efficient.
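For context, the standard ingredients behind such an analysis can be sketched as follows; the notation is generic and chosen here for illustration, not necessarily the paper's own. Each unit responds through a threshold-linear transfer function, and the weights are set in one shot by a Hebbian covariance rule over the stored patterns:

\[
  v_i = g\,[h_i - \theta]_+, \qquad
  h_i = \sum_{j \neq i} J_{ij}\, v_j, \qquad
  J_{ij} = \frac{1}{N a (1-a)} \sum_{\mu=1}^{p} \left(\eta_i^{\mu} - a\right)\left(\eta_j^{\mu} - a\right),
\]

where $[x]_+ = \max(x, 0)$, $g$ and $\theta$ are the gain and threshold of the transfer function, $a$ is the mean activity (sparseness) of the $p$ stored patterns $\eta^{\mu}$, and the storage load $\alpha = p/N$ is the quantity compared against the Gardner bound.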