We obtained an analytical expression for the computational complexity of many-layered committee machines with a finite number of hidden layers (L < 8), using the generalization complexity measure introduced by Franco et al (2006 IEEE Trans. Neural Netw. 17 578). Although our result is valid only in the large-size limit and for an ultrametric overlap synaptic matrix, it provides a useful tool for inferring the architecture a network must have in order to reproduce an arbitrary realizable Boolean function.
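As a rough illustration (not the paper's own derivation), the generalization complexity measure of Franco et al is built from terms that count how often a Boolean function's output changes between nearby inputs; its first-order term is the fraction of input pairs at Hamming distance 1 on which the function disagrees. A minimal sketch of that first-order term, assuming this standard definition, with the function given as a Python callable over bit tuples:

```python
from itertools import product

def generalization_complexity_term1(f, n):
    """First-order term of the generalization complexity measure:
    the fraction of (input, Hamming-distance-1 neighbor) pairs on
    which the Boolean function f : {0,1}^n -> {0,1} disagrees."""
    disagree = 0
    total = 0
    for x in product((0, 1), repeat=n):
        fx = f(x)
        for i in range(n):
            y = list(x)
            y[i] ^= 1          # flip one bit to get a neighbor
            total += 1
            if f(tuple(y)) != fx:
                disagree += 1
    return disagree / total

# Parity flips on every single-bit change, so its first-order term is 1:
parity = lambda x: sum(x) % 2
print(generalization_complexity_term1(parity, 4))  # → 1.0
```

A constant function scores 0 and parity scores 1 under this term, matching the intuition that high-complexity functions (which the abstract relates to deeper architectures) are those most sensitive to their inputs.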
The performance of an Artificial Neural Network (ANN) strongly depends on its hidden layer architect...
We consider the sample complexity of concept learning when we classify by using a fixed Boolean func...
This paper addresses the relationship between the number of hidden layer nodes in a neural network, ...
The problem of learning by examples in ultrametric committee machines (UCMs) is studied within the f...
The problem of computing the storage capacity of a feed-forward network, with L hidden layers, N inp...
We study the number of hidden layers required by a multilayer neural network with threshold units to...
Heuristic tools from statistical physics have been used in the past to locate the phase transitions ...
What is the smallest multilayer perceptron able to compute arbitrary and random functions? P...
The generalization ability of different sizes architectures with one and two hidden layers...
The computational power of neural networks depends on properties of the real numbers used as weights...
We study the space of functions computed by random-layered machines, including deep neural networks ...
In this paper we investigate multi-layer perceptron networks in the task domain of Boolean functions...
We develop a new technique of proving lower bounds for the randomized communication complexity of b...
A long standing open problem in the theory of neural networks is the development of quantitative met...