We consider the properties of “Potts” neural networks in which each neuron can be in $Q$ different states. For a “Potts perceptron” with $N$ $Q$-state input neurons and one $Q'$-state output neuron, we compute the maximal storage capacity for unbiased patterns. In the large-$N$ limit the maximal number of patterns that can be stored is found to be proportional to $N(Q-1)f(Q')$, where $f(Q')$ is of order 1.
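The scaling stated in this abstract can be sketched as a one-line helper. This is a minimal illustration only: the proportionality constant and the $O(1)$ function $f(Q')$ are placeholder assumptions, not values given in the abstract.

```python
def potts_perceptron_capacity(N, Q, c=1.0, f_Qprime=1.0):
    """Illustrative scaling of the maximal number of storable patterns
    for a Potts perceptron: P_max ~ c * N * (Q - 1) * f(Q').

    c and f_Qprime stand in for the O(1) factors; their actual values
    are not specified here and are assumed for illustration.
    """
    return c * N * (Q - 1) * f_Qprime
```

For $Q = 2$ (binary neurons) the expression reduces to the familiar perceptron scaling $P_\max \propto N$, and raising the number of input states $Q$ increases the capacity linearly in $Q-1$.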
We analyze the storage capacity of the Hopfield model with spatially correlated patterns $\xi_i$ (i.e....
We propose to measure the memory capacity of a state machine by the number of discernible states, w...
This paper presents a further theoretical analysis on the asymptotic memory capacity of the generali...
We define a Potts version of neural networks with q states. We give upper and lower bounds for the s...
The focus of the paper is the estimation of the maximum number of states that can be made st...
Quantum neural networks form one pillar of the emergent field of quantum machine learning. Here quan...
We study a fully-connected parity machine with K hidden units for continuous weights. The geometrica...
The optimal storage properties of three different neural network models are st...
We study the number p of unbiased random patterns which can be stored in a neural network of N neuro...
In standard attractor neural network models, specific patterns of activity are stored in the synapti...
The study of neural networks by physicists started as an extension of the theory of spin glasses. Fo...
For realistic neural network applications the storage and recognition of gray-tone patterns, i.e., p...
We introduce and analyze a minimal network model of semantic memory in the human brain. The model is...
Recurrent neural networks have been shown to be able to store memory patterns as fixed point attract...
We analyze the storage capacity of a variant of the Hopfield model with semantically correlated patt...