CBCL-252. This paper presents an approach to model selection for regularized least-squares on reproducing kernel Hilbert spaces in the semi-supervised setting. The role of the effective dimension was recently shown to be crucial in defining a rule for the choice of the regularization parameter that attains asymptotically optimal performance in the minimax sense. The main goal of the present paper is to show how the effective dimension can be replaced by an empirical counterpart while preserving optimality. The empirical effective dimension can be computed from independent unlabelled samples, which makes the approach particularly appealing in the semi-supervised setting.
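For concreteness, below is a minimal sketch of how an empirical effective dimension can be computed from a kernel matrix on unlabelled samples. It uses the trace formula standard in this literature, N̂(λ) = Tr[(K/m)((K/m) + λI)⁻¹], where K is the kernel matrix on m unlabelled points. The kernel choice, the bandwidth gamma, the data, and the selection heuristic mentioned in the comments are illustrative assumptions, not the paper's prescription.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix; gamma is an illustrative bandwidth choice.
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq_dists)

def empirical_effective_dimension(K, lam):
    # N_hat(lam) = Tr[(K/m)((K/m) + lam*I)^{-1}],
    # computed via the eigenvalues of the PSD matrix K/m.
    m = K.shape[0]
    eigvals = np.linalg.eigvalsh(K / m)
    eigvals = np.clip(eigvals, 0.0, None)  # guard against tiny negative round-off
    return float(np.sum(eigvals / (eigvals + lam)))

# Usage sketch on synthetic unlabelled data (m = 200 samples in R^3).
# A parameter-choice rule would compare N_hat(lam) against a bias term
# across a grid of lam values; the actual balancing rule is in the paper.
rng = np.random.default_rng(0)
X_unlabelled = rng.standard_normal((200, 3))
K = rbf_kernel(X_unlabelled, X_unlabelled)
for lam in [1e-1, 1e-2, 1e-3]:
    print(f"lambda = {lam:.0e}  N_hat = {empirical_effective_dimension(K, lam):.2f}")
```

Note that only the kernel matrix on unlabelled points enters the computation, which is what makes the quantity available before any labels are seen.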
Learning from data under constraints on model complexity is studied in terms of rates of approximate...
Regularization is used to find a solution that both fits the data and is suffi...
In this work we are interested in the problems of supervised learning and variable selection when th...
We develop a theoretical analysis of generalization performances of regularized least-squares on rep...
We consider a learning algorithm generated by a regularization scheme with a concave regularizer for...
A standard assumption in theoretical study of learning algorithms for regression is uniform ...
We investigate the problem of model selection for learning algorithms depending on a continuous para...
We provide sample complexity of the problem of learning halfspaces with monotonic noise, u...
The note proposes an efficient nonlinear identification algorithm by combining a locally regularized...
The paper proposes to combine a locally regularized orthogonal least squares (LROLS) model selection...
Various regularization techniques are investigated in supervised learning from data. Theoretical fea...