Abstract. This paper shows that neural networks which use continuous activation functions have VC dimension at least as large as the square of the number of weights w. This result settles a long-standing open question, namely whether the well-known O(w log w) bound, known for hard-threshold nets, also holds for more general sigmoidal nets. Implications for the number of samples needed for valid generalization are discussed.
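The abstract above notes implications for the number of samples needed for valid generalization. As a minimal sketch (the helper name and the constants below are illustrative assumptions, not taken from the paper), a standard PAC-style sample-size upper bound can be combined with the abstract's observation that the VC dimension grows at least like w², so sample requirements grow at least quadratically in the number of weights:

```python
import math

def pac_sample_bound(vc_dim: int, epsilon: float, delta: float) -> int:
    """Textbook-style PAC upper bound on the sample size sufficient for
    error epsilon with confidence 1 - delta; constants are illustrative,
    not tight."""
    return math.ceil((4 / epsilon) * (vc_dim * math.log2(12 / epsilon)
                                      + math.log2(2 / delta)))

# If the VC dimension scales like w**2 (the paper's lower bound for
# continuous-activation nets), the bound grows quadratically in w:
w = 100
d = w ** 2  # VC dimension at least w^2 per the abstract
print(pac_sample_bound(d, epsilon=0.1, delta=0.05))
```

Note this only illustrates the direction of the dependence: a larger VC dimension pushes the sufficient sample size up proportionally.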
Wolfgang Maass*, Institute for Theoretical Computer Science, Technische Universitaet Graz, Klosterwie...
Abstract. It is shown that high-order feedforward neural nets of constant depth with piecewise-polyn...
We calculate lower bounds on the size of sigmoidal neural networks that approximate continuous funct...
Techniques from differential topology are used to give polynomial bounds for the VC-dimension of sig...
The Vapnik-Chervonenkis dimension VC-dimension(N) of a neural net N with n input nodes is defined as...
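The definition referenced above (truncated here) is the standard one: the VC-dimension is the size of the largest input set the net can shatter, i.e. label in all possible ways over choices of its parameters. A minimal brute-force sketch of the shattering test, using one-dimensional threshold functions as a toy hypothesis class (the function names are illustrative, not from the papers listed here):

```python
from itertools import product

def shatters(points, hypotheses):
    """Return True if the hypothesis class (a list of boolean-valued
    functions) realizes every possible labeling of the point set."""
    needed = set(product([False, True], repeat=len(points)))
    achieved = {tuple(h(x) for x in points) for h in hypotheses}
    return needed <= achieved

# One-dimensional threshold functions x -> (x >= t) have VC-dimension 1:
thresholds = [lambda x, t=t: x >= t for t in [-1.5, -0.5, 0.5, 1.5]]
print(shatters([0.0], thresholds))        # True: a single point is shattered
print(shatters([0.0, 1.0], thresholds))   # False: labeling (True, False) is unreachable
```

For a neural net, the hypothesis class is the set of input-output maps obtained by varying the weights, and the VC-dimension results quoted in these abstracts bound how large a shattered set can be.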
Most of the work on the Vapnik-Chervonenkis dimension of neural networks has been focused on feedfor...
It has been known for quite a while that the Vapnik-Chervonenkis dimension (VC-dimension) of a feedf...
This paper reviews some of the recent results in applying the theory of Probably Approximately Corre...
W²h² is an asymptotic upper bound for the VC-dimension of a large class of neural networks ...
The Vapnik-Chervonenkis dimension has proven to be of great use in the theoretical study of generali...
We consider the VC-dimension of the set of neural networks of depth s with w adjustable paramet...