The authors present a class of efficient algorithms for PAC learning continuous functions and regressions that are approximated by feedforward networks. The algorithms apply to networks in which the unknown weights are located only in the output layer, and are obtained by utilizing the potential function methods of Aizerman et al. Conditions relating the sample sizes to the error bounds are derived using martingale-type inequalities. For concreteness, the discussion is presented in terms of neural networks, but the results apply to general feedforward networks, in particular to wavelet networks. The algorithms can be directly adapted to concept learning problems.
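To make the setting concrete, the potential-function approach of Aizerman et al. builds the estimate iteratively: after observing a sample (x_n, y_n), the current estimate f_n is corrected by a kernel ("potential function") centered at x_n, scaled by the prediction error. The sketch below is a minimal illustration of this update rule, not the authors' exact algorithm; the Gaussian kernel and the 1/(n+1) gain schedule are assumptions chosen for simplicity.

```python
import math

def gaussian_kernel(x, z, width=1.0):
    """Potential function K(x, z); the Gaussian choice is an assumption."""
    return math.exp(-((x - z) ** 2) / (2.0 * width ** 2))

def potential_function_regression(samples, gains=None, kernel=gaussian_kernel):
    """Iteratively build f_{n+1}(x) = f_n(x) + gamma_n * (y_n - f_n(x_n)) * K(x_n, x).

    `samples` is a sequence of (x_n, y_n) pairs; `gains` (if given) supplies
    the step sizes gamma_n.  Returns the learned estimate as a closure.
    """
    centers = []   # observed points x_n
    coeffs = []    # accumulated corrections gamma_n * (y_n - f_n(x_n))

    def f(x):
        return sum(c * kernel(z, x) for c, z in zip(coeffs, centers))

    for n, (x_n, y_n) in enumerate(samples):
        gamma = gains[n] if gains is not None else 1.0 / (n + 1)
        coeffs.append(gamma * (y_n - f(x_n)))  # correct by the current error
        centers.append(x_n)

    return f
```

Because only the output-layer coefficients are updated, each step is a linear correction in the span of the kernels, which is what makes the algorithms efficient and amenable to the martingale-type sample-size analysis mentioned above.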
Abstract: We describe a generalization of the PAC learning model that is based on statistical decision...
Function approximation is a very important task in environments where computation has to be based on...
We consider learning on multilayer neural nets with piecewise polynomial activation functions and a...
We discuss two classes of convergent algorithms for learning continuous functions (and also regressi...
The problem of function estimation using feedforward neural networks based on an independently and id...
This paper reviews some of the recent results in applying the theory of Probably Approximately Corre...
Abstract: This paper applies the theory of Probably Approximately Correct (PAC) learning to multiple o...
This paper applies the theory of Probably Approximately Correct (PAC) learning to multiple output fe...
Regression or function classes of Euclidean type with compact support and certain smoothness propert...
A PAC teaching model - under helpful distributions - is proposed which introduces the classical ideas...
This paper discusses within the framework of computational learning theory the current state of know...
This paper deals with learnability of concept classes defined by neural networks, showing the hardne...