We provide some theoretical results on the sample complexity of PAC learning when the hypotheses are given by subsymbolical devices such as neural networks. In this framework we give new foundations to the notion of degrees of freedom of a statistic and relate it to the complexity of a concept class. Thus, for a given concept class and a given sample size, we discuss the efficiency of subsymbolical learning algorithms in terms of the degrees of freedom of the computed statistic. In this setting we appraise the sample complexity overhead coming from relying on approximate hypotheses, and we exhibit an increase in the degrees of freedom yielded by embedding available formal knowledge into the algorithm. For a known sample distribution, these quantities are r...
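For context only, and not as the paper's own result: the classical PAC bounds give the flavour of the sample complexity quantities the abstract refers to, with all symbols below being the standard ones rather than the paper's notation. A consistent learner over a finite hypothesis class $H$ outputs a hypothesis with error at most $\varepsilon$, with probability at least $1-\delta$, once the sample size satisfies
\[
m \;\ge\; \frac{1}{\varepsilon}\left(\ln |H| + \ln \frac{1}{\delta}\right),
\]
and for an infinite class of VC dimension $d$ the realizable-case requirement becomes
\[
m \;=\; O\!\left(\frac{1}{\varepsilon}\left(d \ln \frac{1}{\varepsilon} + \ln \frac{1}{\delta}\right)\right).
\]
The degrees of freedom of the computed statistic discussed above play a role analogous to $d$ in these expressions.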