Abstract. Learning from data with generalization capability is studied in the framework of minimization of regularized empirical error functionals over nested families of hypothesis sets of increasing model complexity. For Tikhonov regularization with kernel stabilizers, minimization over restricted hypothesis sets that, for a fixed integer n, contain only linear combinations of n-tuples of kernel functions is investigated. Upper bounds are derived on the rate of convergence of suboptimal solutions from such sets to the optimal solution achievable without restrictions on model complexity. The bounds are of the form 1/n multiplied by a term that depends on the size of the sample of empirical data, the vector of output data, the Gram matrix...
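The functional and restricted hypothesis sets the abstract refers to can be sketched in standard kernel-method notation; the symbols below (sample size m, data (x_i, y_i), kernel K, regularization parameter gamma) are assumptions filled in from the usual Tikhonov setup, not recovered from the truncated text:

```latex
% Tikhonov-regularized empirical error functional over the RKHS induced by K
% (a sketch under assumed standard notation: m = sample size, gamma > 0,
% \|.\|_K = norm of the reproducing-kernel Hilbert space of K).
\[
  \mathcal{E}_{\gamma}(f) \;=\; \frac{1}{m}\sum_{i=1}^{m}\bigl(f(x_i)-y_i\bigr)^{2}
  \;+\; \gamma\,\|f\|_{K}^{2}
\]
% Restricted hypothesis set: linear combinations of n kernel functions.
\[
  \operatorname{span}_{n}(G_K) \;=\;
  \Bigl\{\,\sum_{j=1}^{n} c_j\,K(\cdot,t_j) \;:\; c_j\in\mathbb{R},\ t_j\in X \Bigr\}
\]
% The bounds in the abstract compare minimization over span_n(G_K) with
% unrestricted minimization, at a rate of the form (1/n) times a factor
% depending on m, the output vector y, the Gram matrix of K, and gamma.
```

For concreteness, a minimal numpy sketch of computing such a suboptimal solution for a fixed set of n centers; the Gaussian kernel and all function names here are illustrative choices, not taken from the paper:

```python
import numpy as np

def gaussian_kernel(a, b, width=1.0):
    # K(x, t) = exp(-||x - t||^2 / (2 * width^2)); a: (p, d), b: (q, d)
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_restricted(X, y, centers, gamma):
    # Minimize (1/m) * sum_i (f(x_i) - y_i)^2 + gamma * ||f||_K^2
    # over f = sum_j c_j K(., t_j), with the n centers t_j held fixed.
    m = len(y)
    K_mn = gaussian_kernel(X, centers)        # kernel sections at the sample
    K_nn = gaussian_kernel(centers, centers)  # Gram matrix of the centers
    # Setting the gradient to zero gives the linear system
    # (K_mn^T K_mn / m + gamma * K_nn) c = K_mn^T y / m.
    A = K_mn.T @ K_mn / m + gamma * K_nn
    b = K_mn.T @ y / m
    return np.linalg.solve(A, b)

# Usage: X is an (m, d) input sample, y an (m,) output vector, and
# centers an (n, d) array of chosen points; the returned coefficients
# define a suboptimal solution from span_n(G_K).
```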
Related abstracts:
This thesis studies the problem of supervised learning using a family of machines, namely kernel learning...
We consider learning algorithms induced by regularization methods in the regression setting. We show...
Intuitively, we expect that averaging, or bagging, different regressors with low correlation...
Under mild assumptions on the kernel, we obtain the best known error rates in a regularized learning...
Learning from data under constraints on model complexity is studied in terms of rates of approximate...
We consider the regression problem by learning with a regularization scheme in a data dependent...
Various regularization techniques are investigated in supervised learning from data. Theoretical...
Many learning algorithms use hypothesis spaces which are trained from samples, but little...
We develop some new error bounds for learning algorithms induced by regularization methods in the regression...
We consider the problem of determining a model for a given system on the basis of experimental...
We consider a coefficient-based regularized regression in a data dependent hypothesis space...
We define notions of stability for learning algorithms and show how to use these notions to derive generalization...
In this work we study performances of different machine learning models by focusing on regularization...
This dissertation is about learning representations of functions while restricting complexity...