Multilayer perceptrons can compute arbitrary dichotomies of a set of N points of [0, 1]^d. The minimal size of such networks was studied by Baum (1988, J. Complexity 4, 193–215) using the parameter N. In this paper, we show that this question can be addressed using another parameter, the minimum distance δ between the two classes. We derive related upper and lower bounds on the size of nets capable of computing arbitrary dichotomies.
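To make the parameter concrete: δ is the smallest Euclidean distance between any point of one class and any point of the other. A minimal Python sketch with illustrative data (not from the paper):

```python
import numpy as np

def min_class_distance(X0, X1):
    """Minimum Euclidean distance delta between two classes of points.

    X0: array of shape (n0, d), X1: array of shape (n1, d),
    both with points in [0, 1]^d.
    """
    # All pairwise differences, shape (n0, n1, d), then their norms.
    dists = np.linalg.norm(X0[:, None, :] - X1[None, :, :], axis=-1)
    return dists.min()

rng = np.random.default_rng(0)
X0 = rng.random((20, 3))  # class 0: 20 random points in [0, 1]^3
X1 = rng.random((30, 3))  # class 1: 30 random points in [0, 1]^3
print(f"delta = {min_class_distance(X0, X1):.4f}")
```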
It has been shown that, when used for pattern recognition with supervised lear...
We study the sample complexity of learning neural networks by providing new bounds on their Rademacher complexity...
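For readers unfamiliar with the quantity: the empirical Rademacher complexity of a hypothesis class measures how well the class can correlate with random signs on a fixed sample. A Monte Carlo sketch for a finite class given as prediction vectors (an illustrative assumption; the cited bounds concern norm-constrained networks):

```python
import numpy as np

def empirical_rademacher(preds, n_draws=2000, seed=0):
    """Monte Carlo estimate of empirical Rademacher complexity.

    preds: array of shape (num_hypotheses, n); row k holds one
    hypothesis's predictions f_k(x_1), ..., f_k(x_n) on a fixed sample.
    """
    rng = np.random.default_rng(seed)
    _, n = preds.shape
    total = 0.0
    for _ in range(n_draws):
        sigma = rng.choice([-1.0, 1.0], size=n)  # Rademacher signs
        total += np.max(preds @ sigma) / n       # sup over the class
    return total / n_draws

# Example: 50 random sign-valued hypotheses on a sample of size 100.
rng = np.random.default_rng(1)
preds = rng.choice([-1.0, 1.0], size=(50, 100))
print(f"estimated Rademacher complexity: {empirical_rademacher(preds):.3f}")
```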
A general relationship is developed between the VC-dimension and the statistical lower epsilon-capacity...
What is the smallest multilayer perceptron able to compute arbitrary and random functions? P...
We investigate the network complexity of multilayered perceptrons for solving exactly a given problem...
It is known that any dichotomy of {−1, 1}^n can be learned (separated) with a higher-order neuron (po...
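A higher-order neuron thresholds a polynomial in the inputs rather than a weighted sum. A minimal sketch (illustrative only, not the construction from the cited work) in which a single degree-2 monomial separates the XOR dichotomy of {−1, 1}^2:

```python
import itertools
import numpy as np

def higher_order_neuron(x, weights):
    """Sign of a polynomial in the inputs: one weight per monomial.

    weights maps a tuple of input indices (a monomial) to a coefficient;
    the empty tuple plays the role of a bias term.
    """
    s = 0.0
    for idx, w in weights.items():
        s += w * np.prod([x[i] for i in idx])
    return 1 if s >= 0 else -1

# XOR-like dichotomy of {-1, 1}^2, separated by the monomial x0 * x1.
weights = {(0, 1): 1.0}
for x in itertools.product([-1, 1], repeat=2):
    print(x, higher_order_neuron(x, weights))
```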
This paper relies on the entropy of a data set (i.e., its number of bits) to prove tight bounds on the s...
Ellerbrock TM. Multilayer neural networks: learnability, network generation, and network simplification...
We present results on the number of linear regions of the functions that can be represented by artificial neural networks...
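As a concrete illustration (a sketch under assumed toy parameters, unrelated to the cited results): for a one-hidden-layer ReLU network on a one-dimensional input, each distinct pattern of active units along the line corresponds to one linear piece, so the regions can be enumerated by scanning:

```python
import numpy as np

rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(8, 1)), rng.normal(size=8)  # 8 ReLU units, 1-D input
w2 = rng.normal(size=8)                                # output weights (unused for counting)

def activation_pattern(x):
    # Which ReLU units are active at x determines the current linear piece.
    return tuple((W1[:, 0] * x + b1 > 0).astype(int))

xs = np.linspace(-10.0, 10.0, 10_000)
patterns = {activation_pattern(x) for x in xs}
print(f"distinct linear regions found on [-10, 10]: {len(patterns)}")
```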
A product unit is a formal neuron that multiplies its input values instead of summing them. Further...
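A minimal sketch of a product unit in the usual weighted-power form, output = ∏_i x_i^{w_i} (this specific form is an assumption here, taken from the standard product-unit formulation rather than from the truncated abstract):

```python
import numpy as np

def product_unit(x, w):
    """Product unit: multiplies weighted powers of its inputs
    instead of summing them: output = prod_i x_i ** w_i."""
    return np.prod(np.power(x, w))

x = np.array([2.0, 3.0, 0.5])
w = np.array([1.0, 2.0, -1.0])
print(product_unit(x, w))  # 2**1 * 3**2 * 0.5**-1 = 36.0
```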
We consider the problem of learning in multilayer feed-forward networks of linear threshold units. W...
An important issue in neural network research is how to choose the number of nodes and layers such a...