We study a fully-connected parity machine with K hidden units and continuous weights. The geometrical structure of the weight space of this model is analysed in terms of the volumes associated with the internal representations of the training set. By examining the asymptotic behaviour of the order parameters in the large-K limit, we find the maximum number of patterns per input unit that can be stored, the storage capacity $\alpha_c$, to be $K \ln K/\ln 2$ up to leading order, which saturates the mathematical bound given by Mitchison and Durbin. Unlike the committee machine, the storage capacity per weight remains unchanged compared with the corresponding tree-like architecture.
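For readers unfamiliar with the architecture, here is a minimal sketch (not the authors' code) of how a fully-connected parity machine computes its output: each of the K hidden units is a continuous-weight perceptron over the full N-component input, and the machine outputs the parity, i.e. the product, of the K hidden sign outputs. The function names and the NumPy setup below are illustrative assumptions.

    import numpy as np

    def parity_machine_output(W, x):
        # W: (K, N) array of continuous hidden-unit weights
        #    (fully connected: every hidden unit sees all N inputs).
        # x: (N,) input pattern with +/-1 components.
        hidden = np.sign(W @ x)   # internal representation: K binary signs
        return np.prod(hidden)    # output = parity of the hidden layer

    def alpha_c_leading_order(K):
        # Leading-order storage capacity quoted in the abstract:
        # alpha_c ~ K ln K / ln 2 patterns per input unit, valid for large K.
        return K * np.log(K) / np.log(2)

For instance, alpha_c_leading_order(3) evaluates to about 4.76 patterns per input unit, though the formula is only asymptotic and should not be read off at small K.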
In standard attractor neural network models, specific patterns of activity are stored in the synapti...
A long-standing open problem in the theory of neural networks is the development of quantitative met...
We prove that any algorithm for learning parities requires either a memory of quadratic size or an e...
We study the storage capacity of a fully connected committee machine with a large number K of hidden...
An algorithm for the training of a special multilayered feed-forward neural network is presented. Th...
We consider the properties of “Potts” neural networks where each neuron can be in $Q$ different stat...
The problem of computing the storage capacity of a feed-forward network, with L hidden layers, N inp...
We define a Potts version of neural networks with q states. We give upper and lower bounds for the s...
The PRAM model of parallel computation is examined with respect to wordsize, the number of b...
An algorithm for the training of multilayered feedforward neural networks is presented. The strategy...
We study the number p of unbiased random patterns which can be stored in a neural network of N neuro...
The focus of the paper is the estimation of the maximum number of states that can be made st...
We present a model of long-term memory: learning within irreversible bounds. The best bound values ...
We study generalizations of the Hopfield model for associative memory which contain interactions of ...