A product unit is a formal neuron that multiplies its input values instead of summing them. Furthermore, its weights act as exponents instead of as factors. We investigate the complexity of learning for networks containing product units. We establish bounds on the Vapnik-Chervonenkis (VC) dimension that can be used to assess the generalization capabilities of these networks. In particular, we show that the VC dimension for these networks is not larger than the best known bound for sigmoidal networks. For higher-order networks we derive upper bounds that are independent of the degree of these networks. We also contrast these results with lower bounds.
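For concreteness (the notation below is ours, not taken from the abstract), a product unit with inputs $x_1, \dots, x_n$ and real-valued weights $w_1, \dots, w_n$ computes
\[
  y_{\mathrm{prod}}(x_1, \dots, x_n) \;=\; \prod_{i=1}^{n} x_i^{\,w_i},
\]
whereas a conventional summing unit computes
\[
  y_{\mathrm{sum}}(x_1, \dots, x_n) \;=\; \sigma\!\Bigl(\sum_{i=1}^{n} w_i x_i + w_0\Bigr)
\]
for some activation function $\sigma$; in the product unit the weights appear as exponents rather than as multiplicative factors.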